Europe's Premier Trust & Safety Summit
Empowering Leaders to Tackle Operational Challenges, Navigate Regulatory Landscapes, and Drive Cross-Functional Solutions

25 - 26 March 2025 | Hilton London Syon Park, UK

Proportionality in the Online Safety Act: What does it mean for me?

The long-awaited Online Safety Act passed into law in October 2023, and its impact will be felt across the UK's business landscape. Ofcom, the regulatory body responsible for enforcing the Act, is now drafting guidance on how businesses should prepare for it and on which types of businesses it will affect.

At the moment, Ofcom is in its consultation period, having published the first of its four major consultations on 9 November 2023.

Proportionality is a crucial factor that underpins the Online Safety Act. But what does it mean, and how does it affect businesses and the online services they offer? Before we delve into the concept of proportionality, let's review the key points of the Online Safety Act to provide a better context. 

Online Safety Act Key Points

Purpose: The Online Safety Act sets out to minimise the risk of online spaces being misused and of users being exposed to content that is harmful or illegal.

How will it be managed: The Online Safety Act puts the onus on organisations to demonstrate that they have effective processes and safeguards in place to protect their users and remove any content that has been flagged as inappropriate or illegal.

When will the laws take effect? Most of the rules haven't come into force yet – Ofcom is taking a phased approach, and it expects the first new duties to take effect at the end of 2024.

Who is the Online Safety Act aimed at?

  • Providers of user-to-user services: This includes organisations such as Facebook, X (formerly Twitter) and Instagram, as well as smaller services where users can view content created by other users.
  • Providers of search services: This includes platforms such as Google that offer a search engine users can use to find content.
  • Providers of pornography services: This includes services that publish or display adult content.

What are the penalties? Violations of these rules carry steep punishments, including fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.
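
As a quick illustration of how that penalty ceiling works, here is a minimal sketch (the function name and the example revenue figure are ours, not Ofcom's):

```python
def max_osa_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    """Upper bound on an Online Safety Act fine: the greater of
    £18 million or 10% of qualifying worldwide revenue."""
    FIXED_MAXIMUM_GBP = 18_000_000
    return max(FIXED_MAXIMUM_GBP, 0.10 * qualifying_worldwide_revenue_gbp)

# A provider with £500m in qualifying worldwide revenue faces a ceiling
# of £50m, since 10% of revenue exceeds the £18m figure.
print(f"£{max_osa_fine(500_000_000):,.0f}")  # £50,000,000
```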

What are the illegal harms set out by Ofcom?

  • Terrorism offences
  • Child sexual exploitation and abuse
  • Encouraging or assisting suicide
  • Hate offences
  • Harassment
  • Controlling or coercive behaviour
  • Drugs and psychoactive substances offences
  • Firearms and other weapons offences
  • Unlawful immigration and human trafficking
  • Sexual exploitation of adults
  • Extreme pornography
  • Intimate image abuse
  • Proceeds of crime offences
  • Fraud and financial services offences
  • Foreign interference offences

What is proportionality in the Online Safety Act?

Fundamentally, proportionality involves assessing the risks associated with a company's online services, and then adopting appropriate measures to address those risks. The suitability of any given approach should be proportional to the size, complexity, and scope of the company and its services, the level of risk, as well as the resources required to implement it effectively.

The Online Safety Act at this stage is somewhat broad, which can be both advantageous and disadvantageous. Although this leaves the criteria of proportionality open to interpretation, it also ensures that service providers will not be encumbered by undue restrictions while the final draft codes and consultation documents are being developed.

According to Ofcom, more than 100,000 digital service providers come under the scope of the Online Safety Act. While major online corporations like Facebook, Google, WhatsApp and X, along with adult websites, will bear the brunt of the Act, many smaller companies will also be affected. However, some of these companies may lack the resources needed to fully comply with the new regulations.

Because of this, the concept of proportionality has become a central point of discussion and is referenced numerous times in Ofcom's documentation. At its core, the proportionality guidance outlines how service providers will be categorised into six groups based on two factors: the size of the service and the level of risk involved (a short illustrative sketch follows the lists below).

For size, the categories are:

  • Small service: fewer than 7 million monthly UK users
  • Large service: 7 million or more monthly UK users

For risk, the categories are: 

  • Low risk: assessed as low risk for all kinds of illegal harm.
  • Specific risk: assessed as medium or high risk for a certain type of harm.
  • Multi-risk: exposed to significant risks for at least two kinds of illegal harms. For these services, additional measures are implemented to address illegal harms in general, rather than specific risks.
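
The grouping itself is mechanical: two size bands crossed with three risk levels yield the six groups. Here is a minimal sketch of that classification, assuming the 7 million monthly UK user threshold and the risk definitions above (the class and function names are ours, purely illustrative):

```python
from dataclasses import dataclass

LARGE_SERVICE_THRESHOLD = 7_000_000  # monthly UK users, per the consultation

@dataclass
class Service:
    monthly_uk_users: int
    # Risk rating per kind of illegal harm, e.g. {"fraud": "high", "terrorism": "low"}
    harm_risk_ratings: dict[str, str]

def size_band(service: Service) -> str:
    """'large' at or above 7 million monthly UK users, otherwise 'small'."""
    return "large" if service.monthly_uk_users >= LARGE_SERVICE_THRESHOLD else "small"

def risk_band(service: Service) -> str:
    """Low risk: low for every harm; specific risk: medium/high for exactly
    one kind of harm; multi-risk: medium/high for two or more kinds."""
    elevated = [h for h, r in service.harm_risk_ratings.items() if r in ("medium", "high")]
    if not elevated:
        return "low risk"
    return "multi-risk" if len(elevated) >= 2 else "specific risk"

def proportionality_group(service: Service) -> str:
    """One of the six groups: size band crossed with risk band."""
    return f"{size_band(service)} / {risk_band(service)}"

# Example: a small forum rated medium risk for harassment only
forum = Service(monthly_uk_users=250_000,
                harm_risk_ratings={"harassment": "medium", "fraud": "low"})
print(proportionality_group(forum))  # small / specific risk
```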

It's worth noting that while proportionality is a crucial aspect of the Online Safety Act, Ofcom does not expect service providers to eradicate all instances of harmful content. Instead, service providers must demonstrate their commitment to implementing proportional and reasonable safety measures.

Proportionality: what measures do services need to put in place?

The consultation guidance provides an overview of the proposed measures for services, which will differ based on the size of the service and the level of risk, as listed above. Currently, there are nine categories of measures that companies may have to implement, consisting of 34 individual measures. The measure categories are listed below:

  • Governance & Accountability
  • Content Moderation
  • Automated Content Moderation
  • Reporting and Complaints
  • Terms of Service
  • Default Settings and Support for Child Users
  • Recommender Systems
  • Enhanced User Control
  • User Access

Generally, service providers who fall under the Online Safety Act's jurisdiction are required to establish protocols that enable users to report illegal or harmful content, particularly if the service lacks age verification processes.

Improving the user experience for reporting incidents is also central to the Online Safety Act. Users should be able to file specific complaints about content that violates the Act, and they are entitled to bring a claim for breach of contract if their content is removed or restricted on the site.

It is imperative that this complaint procedure is user-friendly and accessible, even for children. Unfortunately, the Online Safety Act is somewhat vague and complicated, which poses a challenge for service providers seeking to comply with it.
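
To make those reporting and complaints duties concrete, here is a minimal sketch of what a complaint-intake record might capture. Every field name here is our illustration, not something prescribed by the Act or by Ofcom, and the ten-day timeframe is an arbitrary placeholder:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ComplaintType(Enum):
    # Illustrative categories only; the Act defines "relevant complaints" more broadly.
    ILLEGAL_CONTENT = "illegal content"
    WRONGFUL_TAKEDOWN = "content wrongly removed or restricted"
    OTHER = "other"

@dataclass
class Complaint:
    reporter_id: str            # could be anonymous, e.g. for child users
    content_reference: str      # URL or internal ID of the reported content
    complaint_type: ComplaintType
    description: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    # Indicative timeframe communicated back to the complainant; 10 days is
    # a placeholder, not a figure taken from Ofcom's consultation.
    indicative_resolution_days: int = 10
```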

To understand how your organisation might be impacted by the upcoming changes, we recommend reading the entire summary. However, here are a few key measures that appear throughout and have the broadest impact on services.

Measures that affect services of all sizes and risk levels

  • A named person who is accountable to the most senior governance body for compliance with illegal content safety duties and with reporting and complaints duties.
  • Content moderation systems or processes in place to take down illegal content swiftly.
  • Complaints processes in place that enable UK users and affected persons to make each type of relevant complaint, in a way that ensures appropriate action is taken.
  • Complaints system & processes that are easy to find, easy to access and easy to use.
  • Ability to take appropriate action along with indicative timeframes for considering complaints.
  • Terms of service that provide guidance on how individuals are protected from illegal content.
  • Proactive removal of accounts where there are reasonable grounds to infer they are run by or on behalf of a terrorist group or organisation proscribed by the UK Government.

Measures that apply only to small multi-risk services and to all large services

  • Written statements of responsibilities for senior members of staff who make decisions related to the management of online safety risks.
  • Evidence that new kinds of illegal content on a service, or increases in particular kinds of illegal content, are tracked and reported to the most senior governance body.
  • A published Code of Conduct or set of principles, provided to all staff, that sets standards and expectations for employees around protecting users from risks of illegal harm.
  • Internal content moderation policies set with regard to the findings of the risk assessment and any evidence of emerging harms on the service.

Measures that apply only to large services

  • Boards or overall governance bodies to carry out an annual review and record how the service has assessed risk management activities in relation to illegal harms.

Challenges of proportionality

According to forthcoming documentation from Ofcom, service providers impacted by the Online Safety Act will soon be required to notify the regulator. Non-compliance carries significant consequences: Ofcom can audit and investigate service providers and impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

As a service provider, it's crucial to maintain a balance between compliance procedures and safety measures while preserving freedom of expression on the services you provide. The Online Safety Act requires Ofcom to provide more detailed information on the thresholds of proportionality. These guidelines, along with additional codes of practice, will help service providers better understand their compliance and governance obligations and implement any necessary changes more efficiently.

Preparing for the Online Safety Act: Why You Should Attend the Trust & Safety Summit UK

While the first phase of the Online Safety Act is not expected to come into force until late 2024, it's crucial to take action now. The legislation's broad scope means it applies to practically all organisations with an online presence where users can post content. To stay ahead of the game, consider registering for our upcoming event, the Trust & Safety Summit UK.

As Ofcom sheds more light on how to comply with the Online Safety Act and publishes more consultation documents, ensuring user safety and maintaining trust is more important than ever. Our summit is designed to equip industry professionals with actionable insights, strategies, and tools to navigate this rapidly evolving landscape and tackle emerging regulations head-on.

Join us for engaging discussions with a range of online safety experts on topics such as:

  • Regulatory Preparedness
  • Advanced Moderation
  • Restoring Trust
  • Age Verification & Assurance
  • Child Safety & Protection
  • Operational Trust & Safety
  • Modern Online Harms
  • Trust & Safety At Scale

Join us for the definitive event on restoring trust and safety across online content, communities, and spaces. Register now to gain access to actionable strategies: Trust and safety summit UK
