Why the Online Safety Bill must include provisions for data access

By: Sasha Havlicek, CEO, Institute for Strategic Dialogue

17 July 2023


The UK’s efforts to keep the British public safe from online harms will be stymied if the Online Safety Bill fails to mandate platform data access for independent researchers.

Cross-party Peers in the House of Lords are making a final attempt to strengthen the Online Safety Bill so that researchers get access to the social media data needed for proper independent scrutiny of the Bill’s enforcement. Without powers to mandate such access, the British public will be less safe than their European counterparts, and UK social media users could be kept in the dark about the scale and impact of online harms. 

Social media platforms have enjoyed unprecedented rises in profits, reach and power over the past decade, yet scarcely a day goes by without news of a catastrophe whose origins can be traced online. From genocide in Myanmar to white supremacist violence in Buffalo, New York, real-world violence is increasingly inseparable from online radicalisation processes. In the UK in recent months, a foiled plot by an Islamist extremist radicalised during the pandemic and a far-right attack on a migrant centre in Dover both show the violent outcomes of individuals becoming immersed in hateful and conspiratorial online ecosystems. 

ISD’s own research has shown how everything from terrorist propaganda (on TikTok, Facebook, and elsewhere) to information operations linked to hostile state actors (including China and Russia) has continued to proliferate on social media. Even with limited access to data, the research community has uncovered everything from networks of child abusers to Russian troll armies that, without those efforts, would have remained free to operate online. The emergence of generative AI will only increase these threats. 

Recent testimony from whistleblowers shows that social media companies know a great deal about how their products cause real-world harm to children and other vulnerable groups, but have chosen neither to share that information with the public nor to make the systemic changes necessary to prevent those harms. Meanwhile, many platforms are making data access even harder for independent researchers. Recent changes by Twitter have made access prohibitively expensive for independent researchers. Meta reportedly plans to shut down the main tool researchers use to access data from Facebook and Instagram, and has selectively removed access from researchers with whom it disagrees. TikTok only recently started offering very limited and flawed access to selected US researchers. Currently, companies have few incentives to voluntarily open themselves up to external scrutiny. 

The Online Safety Bill offers a rare opportunity to change this, but it will be squandered unless the Government fully backs a series of key amendments from Lord Bethell, Lord Clement-Jones and Baroness Neville-Jones that would give the Secretary of State and Ofcom additional powers to force social media companies to be more transparent, while protecting user privacy and trade secrets. As it stands, after a limited Government concession, the Bill would give Ofcom 18 months to decide whether to ‘recommend’ greater data access from tech companies, with no powers to force them to provide it.

Providing access to independent researchers is clearly in Ofcom’s interest, as its Chief Executive told the Online Safety Bill Joint Committee in 2021. Data access also has full support from the academic community and civil society, who would help the regulator scrutinise platforms and identify emerging risks more efficiently as the online ecosystem continues to evolve at pace, at no extra cost to the taxpayer.  

These researchers would also help ensure that the regulation is implemented effectively and proportionately, holding Ofcom to account and protecting freedom of expression online. Without these vital amendments, the Bill as a whole is less enforceable, and UK citizens will be left more vulnerable to digital threats to their safety and national security than their European counterparts. The EU’s equivalent legislation, the Digital Services Act, will give researchers and civil society access to data sooner; UK-based organisations like mine need a similar mechanism for data access to study harms and threats to UK users, or we will all be operating in the dark.  

Demanding such transparency is not controversial: other key utilities on which we rely are also independently scrutinised by approved researchers. In the oil and gas industry, for example, environmental scientists can independently test water samples in surrounding areas for pollution or contamination. At present, we can’t do this for social media and our information environment.  

The status quo gives social media companies monopoly power over the data that illustrates the consequences of their profit-driven approach, and the ability to shutter access on a whim. They can see the data evidencing the misery their products cause, but they don’t share it. It’s time we lifted the lid and rectified this power imbalance. This must be done in months, not years; the problem is too urgent not to act now. 
