Introduction
New measures
Purpose of amendments


Introduction

The government says that its amendments to the Online Safety Bill "go further than before to shield children", while "protect[ing] free speech online". Social media firms will therefore still need to protect children and remove content that is illegal or prohibited, but the Bill will no longer define specific types of legal content that they must address, and any incentives for platforms to "over-remove" legal online content will be eliminated.

The government says that the changes remove any influence future governments could have on what private companies do about legal speech on their sites and ensure that there is no risk that companies are motivated to take down legitimate posts to avoid sanctions.

New measures

The government says that new measures will also be added to make social media platforms more transparent and accountable to their users. Platforms will be legally required to provide a "triple shield" for consumers, under which they must:

  • remove illegal content;
  • take down material in breach of their own terms of service; and
  • provide adults with greater choice over the content they see and engage with.

Duties relating to "legal but harmful" content accessed by adults will therefore be removed from the legislation and replaced with the "triple shield". The Bill will instead give adults greater control over online posts they may not wish to see on platforms. If users are likely to encounter certain types of content – such as the glorification of eating disorders, racism, antisemitism or misogyny not meeting the criminal threshold – internet companies will have to offer adults tools to help them avoid such content. These tools could include human moderation, blocking content flagged by other users, or sensitivity and warning screens.

The Bill will also now explicitly prohibit social media platforms from removing or restricting user-generated content, or from suspending or banning users, where this does not breach their terms of service or the law. In addition, platforms will need to have clear, easy-to-understand and consistently enforced terms of service.

Firms will also have to publish more information about the risks their platforms pose to children, so that users can see what dangers a site really holds. They will also be required to show how they enforce their user age limits – to stop children circumventing authentication methods – and to publish details of when the regulator Ofcom has taken enforcement action against them.

Purpose of amendments

The government says that the changes refocus the Bill on its original aims to:

  • protect children and tackle criminal activity online while preserving free speech;
  • ensure that tech firms are accountable to their users; and
  • empower adults to make more informed choices about the platforms they use.

The changes follow confirmation that the Bill will include measures making significant changes to the United Kingdom's criminal law to increase protections for vulnerable people online, by criminalising the sharing of people's intimate images without their consent.

The criminal offence of controlling or coercive behaviour will also be added to the list of priority offences in the Bill. This means platforms will have to take proactive steps, such as putting in measures to allow users to manage who can interact with them or their content, instead of only responding when this illegal content is flagged to them through complaints.

In addition, the Victims' Commissioner, the Domestic Abuse Commissioner and the Children's Commissioner will be added as statutory consultees in the Bill, meaning that Ofcom must consult each of them when drafting the codes of practice that tech firms must follow to comply with the Bill.

For further information on this topic please contact Matthew Dando at Wiggin by telephone (+44 20 7612 9612) or email ([email protected]). The Wiggin website can be accessed at www.wiggin.co.uk.