What’s next for the UK’s Online Safety Act and can it solve the misinformation problem?

Against a backdrop of riots and disorder in summer 2024, some have raised concerns that the UK’s Online Safety Act does not go far enough in tackling misinformation that can fuel such disorder. Although the Act has been passed, the regime it establishes is undergoing phased implementation by Ofcom.

Ofcom is under pressure to react to current events promptly and forcefully, but it is required to consult on draft regulatory codes and guidance before most of the Act’s duties can bite and its associated enforcement powers can kick in.

However, even once fully implemented, the new regime cannot provide a complete answer to online hate speech and misinformation, and there are important reasons for the constraints built into the Act.

The Online Safety Act (“OSA”), the legislation that establishes a landmark regulatory regime for online content in the UK, is never far from the headlines, but it has been a particularly prominent feature of news cycles in recent weeks, in the wake of riots and disorder across the country. Questions have been raised about whether the OSA needs changes, with the Mayor of London, Sadiq Khan, calling it “not fit for purpose”.

When will the regime come into force?

The OSA was passed in October 2023, but very few of the obligations it imposes on online platforms and search engines are currently in force, pending Ofcom’s phased implementation of the full regime. The OSA has been a long time coming: the government White Paper that first proposed an online safety regulatory regime was published in April 2019 and, five years later, the regime is still being implemented and is unlikely to be fully in place until 2026. However, that timetable is at least partly due to the very difficult questions posed by this kind of regulation, which require considered debate and careful application. To name a few:

  1. How should the balance be struck between rights to freedom of expression and protection from harm?
  2. How can tech companies identify and remove illegal and harmful content on services of huge scale without surveilling users’ activity in a way that excessively intrudes on their privacy?
  3. How can children be effectively protected online without locking them out of access to the online services they rely on?

These questions persist in the task ahead for Ofcom, which must set out in regulatory codes and guidance how services can comply with the OSA’s duties in practice. Ofcom is under pressure to react to current events but is necessarily constrained by its responsibilities under the OSA: it must consult on draft versions of the codes and guidance before they are finalised. That consultation has been under way since the end of last year, and the first set of the OSA’s duties is unlikely to apply before early 2025.

What can the OSA do about misinformation and hate speech?

Public and press commentary since the riots has often treated regulation of online platforms, and of social media in particular, as a panacea for the ills of the internet, or even of modern society more broadly. For example, there have been suggestions that, had the OSA been fully in force, much of the criminal activity and hateful speech seen online in recent weeks could have been prevented, or regulatory action could have been taken for failures to prevent it.

The OSA will establish an extensive regulatory regime. In some respects, it will go much further than its equivalents in other jurisdictions: for example, it imposes more wide-ranging and onerous obligations in respect of the protection of children online than its EU counterpart, the Digital Services Act. However, Parliament made deliberate policy choices to limit the OSA in a number of ways, and it will inevitably fall short of the expectations of those who see it as the complete answer:

  1. Systemic duties: The OSA does not regulate individual pieces of content shared by specific users; it is the person posting illegal content, rather than the service on which it is posted, who is liable for it. The OSA instead requires services to use proportionate systems and processes to secure certain objectives, such as preventing users from accessing the most serious illegal content and mitigating the risks of harm identified in a service’s risk assessment. Where services fail to do so, they could be sanctioned for breaching their duties under the OSA. This means Ofcom will not, for example, investigate and take action in respect of a single post on a social media service, but will instead target systemic failures. That approach reflects the impossibility of requiring platforms to police every user’s content, or of requiring Ofcom to adjudicate on every content moderation decision.
  2. Focus on illegal, not ‘lawful but awful’, content: With the exception of content that is harmful to children (users under 18), the OSA covers only illegal content, meaning content that amounts to a criminal offence (including hate crimes). Much of the hateful content and misinformation shared online that could encourage or inflame riots would therefore fall outside the scope of the regime, unless it breached the relevant service’s terms and the service failed to act on it. When the legislation was going through Parliament, an earlier version proposed duties in respect of content that is ‘legal but harmful to adults’ (which could have covered abuse or misinformation that does not meet the criminal threshold but could nonetheless cause significant harm to adults in the UK). These clauses were dropped by the previous government over concerns that they excessively impinged on free speech rights.
  3. The free speech balancing act: All services regulated by the OSA have positive duties to ‘have particular regard’ to the importance of protecting free speech and user privacy when moderating content. The largest and riskiest services (to be designated as ‘Category 1’ services by Ofcom by the end of this year) will eventually have additional duties to protect (broadly defined) ‘journalistic content’ and ‘content of democratic importance’. Platforms will face an obvious tension between the risk of failing to tackle hateful content and misinformation that breaches their terms, and the risk of being accused by users of suppressing citizen journalism or legitimate political dialogue, in breach of the OSA’s free speech duties.

What could be next?

With the new government having been elected on a manifesto that included a promise to “build on the Online Safety Act, bringing forward provisions as quickly as possible”, there is a possibility of forthcoming proposals to amend the OSA. However, making significant changes (for example, to reintroduce the regulation of content that is legal but harmful to adults) would require primary legislation, which appears unlikely given the ambitious legislative agenda set out in the King’s Speech and the consequent pressures on Parliamentary time. It would also risk slowing down Ofcom’s implementation of the regime, for example by requiring it to re-consult on revised regulatory codes and guidance. The Prime Minister’s office appears to have confirmed that the OSA is not under “active review” and that the focus is on implementing it “quickly and effectively” rather than changing it.

The new government could rely on existing powers under the OSA to make secondary legislation to expand the categories of ‘priority illegal content’, which are subject to the strictest obligations imposed on platforms. However, this would not provide a solution for those who want the OSA to address harmful content that does not reach the criminal standard.

The OSA is likely to remain in the public spotlight for some time to come. For those focused on shaping the regime, the priority should be constructive engagement with Ofcom, including through its consultations, to support its mammoth task of introducing an effective and workable regime as quickly as possible.

Authored by Telha Arshad and Charles Brasted.
