Tech & Sourcing @ Morgan Lewis

TECHNOLOGY TRANSACTIONS, OUTSOURCING, AND COMMERCIAL CONTRACTS NEWS FOR LAWYERS AND SOURCING PROFESSIONALS

Navigating the UK’s Online Safety Act: Implications for Global Digital Services

The United Kingdom’s Online Safety Act (OSA or the Act), which received Royal Assent in October 2023, establishes a new statutory framework to address harmful online content, protect children, and promote accountability among digital service providers. For counsel advising platforms, publishers, and other organizations operating in the digital space, the OSA introduces a complex set of compliance considerations, particularly given its extraterritorial scope, risk-based obligations, and substantial enforcement powers.

The OSA commenced phased implementation in October 2023, with key provisions, including those relating to illegal content and child safety, taking effect from early 2024. Full implementation, including obligations for categorized services and transparency reporting, is expected through 2025 and beyond, in line with the Office of Communications’ (Ofcom’s) staged regulatory roadmap.

Overview: A New Regulatory Framework for Online Content

The OSA imposes statutory duties of care on a wide range of online services that host user-generated content or facilitate user interaction, including social media platforms, messaging services, video-sharing websites, and search engines. Its primary objectives are to reduce the prevalence of illegal content, safeguard children from harmful material, and enhance transparency in content moderation practices.

The Act grants Ofcom, the United Kingdom’s communications regulator, significant enforcement powers, including the ability to conduct audits, compel information, and impose civil penalties of up to £18 million or 10% of global annual turnover, whichever is greater; for a provider with £1 billion in annual global turnover, for example, the maximum penalty would be £100 million. In the most serious cases, criminal sanctions may also apply to senior managers.

A Broad Jurisdictional Reach

A defining feature of the OSA is its extraterritorial application. A service need not be established in the UK to fall within its scope; it is covered if it has “links with” the UK, such as having a significant number of UK users or treating the UK as a target market.

This has notable implications for multinational companies, particularly those headquartered outside the UK but offering services accessible to UK-based users. The absence of a physical presence or local incorporation does not shield an organization from compliance obligations. The reach of the OSA is intentionally wide and designed to ensure the UK’s regulatory goals are not undermined by geographic boundaries.

This broad reach sits alongside that of the European Union’s Digital Services Act (DSA), which likewise seeks to prevent the dissemination of illegal content but imposes fewer proactive obligations (except on very large online platforms). Companies operating globally and across Europe will need to align their compliance programs across the United Kingdom and the European Union.

Risk-Based Obligations and Tiered Duties

The OSA adopts a risk-based regulatory model, with the scope and nature of obligations varying according to the type of service and the level of risk it presents. Higher-risk services, particularly those likely to be accessed by children or where harmful content is prevalent, face more stringent requirements.

Key obligations for in-scope services include:

  • Prompt identification and removal of illegal content
  • Transparent enforcement of terms of service and content moderation policies
  • Implementation of systems and processes to mitigate risk, including user reporting mechanisms and content filtering tools
  • Preparation and maintenance of written risk assessments, particularly in relation to children’s safety online

Services likely to be accessed by children are also subject to additional protections, including age-appropriate design requirements and heightened duties regarding harmful content. Compared with the EU’s DSA, which provides exemptions for mere conduit, caching, and smaller hosting services, the OSA brings a broader range of services within scope.

Encryption and Privacy Concerns

One of the more contentious elements of the OSA relates to encrypted messaging services. The Act empowers Ofcom to require providers to use “accredited technology” to identify illegal content on end-to-end encrypted platforms. Industry stakeholders have cautioned that such measures may conflict with privacy and data protection obligations and could compromise encryption security.

While the government maintains that these provisions are both necessary and proportionate, the implementation details remain subject to significant debate.

Related Trends in Online Regulation

The OSA’s focus on online harm and platform accountability sits alongside other UK regulatory initiatives aimed at increasing trust and transparency in the digital environment. In March 2024, the Competition and Markets Authority (CMA) announced enforcement action against businesses suspected of failing to prevent fake and misleading online reviews and has begun taking preliminary steps.

This work, part of the CMA’s broader consumer protection remit, underscores that UK regulators are increasingly coordinated in addressing digital harms, whether through Ofcom’s content regulation or the CMA’s market integrity measures. A similar trend can be seen in the European Union, where mass-market online retailers based outside the bloc, in particular, have come under increased scrutiny over allegedly false or misleading product information and alleged sales of counterfeit products.

For compliance functions, this trend reinforces the need for an integrated approach to online safety, consumer protection, and advertising standards.

Lessons for Counsel

Organizations within the OSA’s scope should do the following:

  • Conduct detailed risk assessments and document mitigation measures, including evaluating the effectiveness of moderation tools and reporting mechanisms
  • Review and update terms of service, content moderation policies, and user safeguarding processes to ensure transparency and accessibility
  • Train relevant employees on the organization’s safety policies and OSA requirements
  • Establish clear lines of accountability and maintain readiness for regulatory engagement with Ofcom
  • Consider appointing a UK-based representative, where appropriate
  • Examine exposure to the EU’s DSA and adopt an integrated approach

Conclusion: A Global Regulatory Shift

The OSA is part of a wider international movement toward greater regulation of digital services. Similar legislation has been introduced or proposed in the European Union, Australia, and the United States, creating overlapping regimes with differing approaches to platform accountability, content moderation, and user protection.

Organizations that understand and plan for these requirements will be better positioned to navigate this evolving regulatory landscape. Proactive engagement with regulators, prioritization of risk assessments, and alignment of compliance strategies across jurisdictions will be key to ensuring both compliance and user trust.

If you would like to learn more about how these developments may impact your organization, please contact Tech & Sourcing @ Morgan Lewis or antitrust and competition colleague Christina Renner.