By Eleanor Ludlam, Astrid Hardy and Alexander Dimitrov


Published 13 December 2022

Overview

Having survived the resignation of two Prime Ministers, the government’s proposed flagship internet safety legislation, the Online Safety Bill (“OSB”) is back on track. As predicted in our previous article, the latest draft of the OSB contains a number of significant changes from its predecessor, notably a relaxation of the rules regarding “legal but harmful” speech, as well as some extended safeguarding protections for children. The UK government also claims that the latest draft of the OSB boosts transparency and accountability of online services with its new enforcement powers.

We have summarised the key (new) amendments to the OSB as it returned to Parliament below.

  1. Protection of Children (Sections 8, 9, 10 and 11)

DCMS recently confirmed that “the OSB’s key objective, above everything else, is the safety of young people online”. The OSB has included additional amendments which go further in strengthening the existing protections for children as follows:

  1. Age verification and restricting underage access. Online services will be required to set out in their terms of use how they enforce their minimum age policies, in order to stop children circumventing authentication methods, and they will have to publish details of any action Ofcom has taken against them. This will be tabled as an amendment in the Commons.
  2. Publishing child risk assessments. Online services must publish more information about the risks their services pose to children. The OSB will require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to empower users and parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety. This will be tabled as an amendment in the House of Lords.
  3. Children’s Commissioner named as a statutory consultee for Ofcom. This will assist in the development of codes of practice, ensuring that measures relating to children are robust and reflect any concerns raised by parents. This will be tabled as an amendment in the House of Lords.
  2. Protection of Adults

The current draft of the OSB introduces the so-called “Triple Shield” – i.e. three fundamental principles, aimed at protecting and empowering adult users. Those layers of protection can be summarised as follows:

  1. Stage 1: Removal of “illegal” content. The starting point rests on the fundamental principle that speech that is illegal on the street should also be illegal online. The OSB includes a number of criminal offences relating to illegal speech, and platforms must proactively prevent users from encountering this content and remove illegal content when they become aware of it (Sections 8 and 9).
  2. Stage 2: Enforcement of terms of service relating to “legal” content. The OSB mandates that legal content which a platform prohibits in its own terms of service should be removed, while legal content that a platform allows in its terms of service should not be removed (Sections 12 and 13).
  3. Stage 3: User empowerment relating to “legal” content. Under the third “Shield”, platforms must ensure that users have the option to decide what content is suggested to them, rather than leaving that choice to the platforms’ algorithms. This third pillar empowers adult users to choose whether or not to engage with legal forms of abuse and hatred if the platform they are using allows such content. The “Third Shield” therefore places a duty on platforms to provide their users with the functionality to genuinely control their exposure to unsolicited content that falls into this category (Section 14).

The government has announced plans that in future drafts of the OSB (whether at the Committee Stage in the Commons or upon passage to the Lords) the provisions giving effect to this duty to allow genuine control will specifically include legal content related to suicide, self-harm and eating disorders, and content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender reassignment, or sexual orientation. According to the government, this targeted approach reflects areas where vulnerable adult users would benefit from having greater choice over how they interact with these kinds of content.

  3. Legal Free Speech

One of the most significant amendments is the withdrawal of the “legal but harmful” concept. The proposed amendment responds to concerns about the over-removal of legitimate legal content, and seeks to strike a balance between censorship of legal free speech and the removal of content that is illegal.

The new provision requires online services to remove content that is illegal (or prohibited by their terms of use), but the OSB will no longer cover content which is legal but which may be harmful to adults or children. This is quite the volte-face on arguably one of the most controversial aspects of the OSB. This update has been welcomed by those subject to the rules, as it removes the uncertainty as to what could be considered “harmful”. It has also been welcomed by charities and campaigners for victims, for whom support and discussion groups are a vital resource in recovery.

DCMS has also confirmed that the new amendment will retain protections for victims of abusive communications, including victims of domestic abuse, and the UK government continues to progress new offences for false and threatening communications.

  4. Accountability and Further Measures

Enforcement. The newest draft of the OSB includes specific obligations on platforms to publish details of enforcement action taken by Ofcom, further strengthening transparency for users (Section 133).

Self-harm. While no amendments directly relating to this have been tabled for the Committee Stage in the Commons, the government has stated that it intends to include a specific offence of sending a communication that encourages serious self-harm. The UK government said the changes had been influenced by the case of Molly Russell. Content that encourages self-harm will be illegal, bringing it in line with communications that encourage suicide, which are already illegal. The amendment is set to bring stricter controls over the removal of self-harm content, with any person found to have posted such content facing prosecution. The organisations and platforms hosting such content would also face fines. This amendment will be tabled in the House of Lords.

New criminal offences. A new amendment has been included in respect of ‘assisting or encouraging self-harm’ and ‘controlling or coercive behaviour’. These criminal offences will be listed as types of illegal content that platforms must take steps to prevent users from encountering.

Violence against women and girls. Again, while no amendments relating specifically to the protection of women and girls have been included in the current version of the OSB, the government has stated its intention to introduce additional provisions in the House of Lords to further protect these vulnerable groups of users. These are expected to take the form of listing the offence of controlling or coercive behaviour as a priority offence, thereby mandating that platforms take proactive steps to tackle such content. Further drafts may also name the Victims’ Commissioner and the Domestic Abuse Commissioner as statutory consultees for any codes of practice adopted under the OSB.

Pornographic deepfakes and ‘downblousing’. One of the planned amendments to the OSB targets explicit images or videos that have been manipulated to look like someone without their consent. The amendments would make the sharing of pornographic deepfakes a crime, and also address the sharing of intimate images and ‘downblousing’ (i.e. photos taken through hidden cameras down a woman’s top without consent). The sharing of explicit images taken without someone’s consent will now be criminalised.

Epilepsy trolling. Finally, the OSB now includes specific offences of electronically sending or showing flashing images to individuals whom the sender knows or suspects to have epilepsy. These changes are intended to cover so-called “epilepsy trolling” – the malicious practice of sending flashing images with the deliberate intent of triggering a seizure in people with epilepsy (Section 160).

According to a statement by the Minister for Media, Data, and Digital Infrastructure, Paul Scully MP, the OSB is unlikely to progress rapidly through Parliament, as it will instead be sent back to the Committee Stage in the Commons for deeper scrutiny. In the most optimistic scenario, the OSB will make its way to the House of Lords before the end of January. The legislative deadline for the passage of the OSB is April 2023. If the government fails to meet this deadline, the OSB will need to be scrapped and the entire legislative process will have to start over.

We will keep a close watch on developments as the OSB returns to the House of Commons and through its passage in the Lords. While the OSB is still intended to come into force by May 2023, more delays and turns could still lie ahead. We will keep you updated on the progress of what is undoubtedly one of the most significant legislative initiatives to affect the digital technology sector in the past decade.
