
Why the Draft Online Safety Bill is cause for concern

By Aysem Diker Vanberg 

On 12 May 2021, the Department for Digital, Culture, Media and Sport published the Draft Online Safety Bill. The Joint Committee on the Draft Online Safety Bill has initiated an inquiry and called for evidence to scrutinise it. Dr Aysem Diker Vanberg (Lecturer in Law at Goldsmiths), Dr Kim Barker, Dr Guido Noto La Diega and Dr Ruth Flaherty have given written evidence on the Draft Bill on behalf of BILETA (the British and Irish Law Education Technology Association).

In our submission, we raised concerns over the proposed Draft Online Safety Bill (OSB), both in terms of its substantive aims and its likely practical implications.

Scope is too broad 

First, the draft Bill is very broad in scope, and it is likely to have dire consequences for free speech. Section 46(3) of the Draft Bill defines legal but harmful content as “content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities.”

This definition is very vague and creates the danger of censoring a vast amount of content that is neither illegal nor harmful. Given the hefty fines and other liabilities introduced by the Bill, platforms and service providers are likely to take down content overzealously, protecting their own interests without fully investigating whether the content is in fact harmful. This will have serious consequences for freedom of expression and media plurality, encouraging the silencing of controversial and minority (opposition) opinions that are much needed in society.

Unprecedented powers to private companies 

Second, the draft Bill gives unprecedented powers to private companies to moderate and censor content, which is neither desirable nor sustainable in the long run.

OFCOM ill-suited to regulate 

Third, the choice of OFCOM as the likely regulatory body for the proposed regime is concerning. OFCOM lacks both the staffing and the requisite expertise to competently undertake the required regulatory role in the context of online safety.

In light of the above, the provisions of the Draft Bill need to be amended to avoid ambiguity and over-censoring. Furthermore, given the need for a bespoke and nuanced approach to regulating online content, there is a need for a new regulatory body.

Call for papers: Technology and Human Rights symposium

Call for papers: Technology and Human Rights in the wake of the Facebook-Cambridge Analytica and GCHQ-NSA scandals

Goldsmiths, University of London

Organiser: Law at Goldsmiths (with the Knowing Our Rights research project and the New Europeans civil rights organisation)

Monday, 5 November 2018 (09.30 – 16.30)

Digital technology touches on and empowers every aspect of our lives, whether as consumers, users of public services, citizens or voters.

Yet the collection and exchange of personal information can interfere with our right to privacy, and the abuse of technology can distort our democracies and lead to serious breaches of human rights.

The informational privacy mega-scandals of recent years, such as the revelation (by Edward Snowden) of the post-9/11 global mass surveillance systems operated by security services, or the Facebook-Cambridge Analytica data harvesting scandal (which came to the surface through the work of Carole Cadwalladr and whistle-blower Christopher Wylie), point, worryingly, to the rise of a new Panopticism that threatens to suppress private existence in the interest of security, financial gain and control of political power.

In this one-day international, interdisciplinary symposium, we will ask:

  • How can the law be used to strike the right balance between freedom and security in the age of the internet?
  • What more can be done to make sure big technology firms respect the rule of law?
  • How can government surveillance best be monitored and held to account so that law enforcement agencies retain public confidence?
  • Can technology be used to promote democracy and development while respecting human rights?
  • Should the government be allowed to operate systems of bulk interception of private communications, intelligence sharing with foreign governments and acquisition of communications data from communication service providers?
  • Does the European Court of Human Rights judgment in Big Brother Watch and Others v the United Kingdom strike the right balance?
  • How can the sociology of security and surveillance inform the debate on law, technology and human rights?
  • Who is best positioned to provide the legal solutions required? What are the limitations facing individual countries?
  • Has the European Union been successful in dealing with these challenges, and what are the next steps in the process of regulating the internet, securing privacy rights and enhancing data protection?
  • What role can international organisations such as the Council of Europe play?
  • What extra challenges for human rights are posed by the rise of Artificial Intelligence and machine learning?

We will discuss these issues with international experts and practitioners from the fields of politics, law and the social sciences, in order to crystallise the key questions and issues facing our society as we are confronted with modern technology that is increasingly invading our private lives.

Submitted papers may relate to any of the issues raised, and questions asked, above.

Deadline for the submission of abstracts (max 200 words): Friday, 12 October 2018

Papers should be submitted to