UK Online Safety Act 2023
Trying to track the staggered implementation of the UK Online Safety Act 2023 (“OSA”)? You are in safe hands here. See the latest updates and key dates below.
24 April 2025 – Ofcom finalises child safety measures for sites and apps to introduce by July
Ofcom has finalised more than 40 practical measures for service providers to meet their duties under the Online Safety Act to protect children online. These measures include robust age checks, effective content moderation systems, and safer algorithm configurations, amongst others. They are designed to build on and complement the specific rules and requirements that Ofcom has already put in place to prevent children from encountering online pornography and to protect users from illegal online harms. Providers of services likely to be accessed by UK children must complete and record their children’s risk assessments by 24 July 2025 and implement safety measures to mitigate these risks by 25 July 2025.
9 April 2025 – Ofcom investigates online suicide forum
Ofcom launched its first investigation into an unnamed provider of an online suicide forum. The investigation will determine whether the service has failed to comply with its OSA duties, including: (a) putting appropriate safety measures in place to protect users from illegal content; (b) completing an illegal harms risk assessment and complying with the corresponding record-keeping duties; and (c) accurately responding to a statutory information request. The investigation follows several attempts by Ofcom to engage with the provider and an unsatisfactory response to a statutory information request. Ofcom has maintained that it will not hesitate to take enforcement action if the provider is found to be in breach of its OSA obligations (including imposing fines of up to £18m or 10% of qualifying worldwide revenue, whichever is greater). Ofcom will publish a report from the investigation once it is concluded.
27 March 2025 – Ofcom issues £1.05 million fine to provider of OnlyFans
Ofcom has fined the provider of OnlyFans, Fenix International Limited, £1.05 million for failing to accurately respond to statutory information requests about age assurance measures on the platform. On two occasions, Ofcom sought information from Fenix on the platform’s implementation of age checks and the effectiveness of its third-party facial age estimation technology. Ofcom expects platform providers to have robust checks in place to ensure information is thoroughly reviewed and cross-checked before being submitted. In response to Ofcom’s requests, Fenix reported that the ‘challenge age’ for its facial age estimation technology was set at 23 years old when it was in fact set at 20 years old; it took the company over 16 months to discover this mistake, and a further two weeks to then report the error to Ofcom. Ofcom found that Fenix contravened its duty to provide accurate information to Ofcom. The final fine included a 30% reduction because Fenix accepted the findings and agreed to settle the case.
26 February 2025 – Ofcom finalises guidance on the exercise of its information gathering powers
Under the Online Safety Act, Ofcom has legal powers to gather information held by regulated companies and third parties to provide robust evidence for the purpose of investigations. Ofcom has now published its Guidance on the scope of these powers, the duties of services and other people to comply, and the potential consequences of non-compliance. The Guidance outlines the specific powers Ofcom has, including the power to issue information notices, the power to appoint a skilled person to provide a report, and the powers of audit, entry, and inspection. Ofcom has also provided examples of when these powers may be exercised, instructions on how to comply, and explanations of how datasets may be used during this process. Since the publication of the draft Guidance, Ofcom has mostly made changes to provide further details about the general mechanics of the powers, as requested by stakeholders in response to the open consultation. Ofcom has warned that non-compliance with these powers will result in enforcement action, namely the imposition of a financial penalty or a requirement to take specific steps to come into compliance.
25 February 2025 – Ofcom publishes guidance on women and girls' online safety
Ofcom has published its guidance on measures for service providers to improve the online safety of women and girls. Ofcom has identified four key harms to focus on, namely online misogyny, pile-ons and online harassment, online domestic abuse, and intimate image abuse. In tackling these four harms, Ofcom has set out nine high-level actions for services to take to protect women and girls online, including conducting risk assessments, setting safer defaults, and taking appropriate action when online gender-based harms occur. Within each of these actions, Ofcom has additionally set out a number of examples of good industry practice to illustrate specific changes that can be implemented, including enforcement action taken against repeat violators, adding watermarks and metadata, and adding fact-checking and labelling. Ofcom has also opened a consultation on the draft guidance and is accepting responses until 23 May 2025.
24 January 2025 – Ofcom’s annual report on notices to deal with terrorism content and/or CSEA content
Ofcom has published its first annual report on notices to deal with terrorism content and/or CSEA content, largely drawing on its power to exercise Technology Notice functions under Chapter 5 of Part 7 of the Online Safety Act. Ofcom must first advise the Secretary of State about minimum standards of accuracy in the detection of such content before the finalised standards can be published. As a result, Ofcom has not yet been able to assess whether any technology meets such minimum standards of accuracy. The report also outlined Ofcom’s preparatory work over the last year in relation to its Technology Notice functions, particularly the launch of its consultation on Technology Notices. Following a multi-stakeholder workshop and the commissioning of external consultancy research, the consultation covers Ofcom’s policy proposals for advice to the Secretary of State on setting the minimum standards of accuracy, and draft guidance for Part 3 service providers. The deadline to respond to the consultation is 10 March 2025.
23 January 2025 – Ofcom fines video-sharing platform MintStars £7,000
Ofcom has issued a £7,000 fine to video-sharing platform (“VSP”), MintStars Ltd, for failing to adequately protect children from accessing online pornography. An investigation into the platform revealed that MintStars did not have sufficiently robust measures in place or implement its measures effectively. Ofcom found that pornographic content could be accessed by any person who visited the site, both through ‘preview’ clips and by subscribing to creators’ content. The ‘self-declaration’ by users that they were over 18 was found to be an inappropriate form of age verification and represented a serious breach of the platform’s duties under the VSP regime. Ofcom’s investigation concluded with the imposition of a £7,000 fine, reduced by 30% to reflect MintStars’ willingness to admit full liability for the breach, and the platform’s small size and financial position. Ofcom has repeatedly warned of its willingness to take action against non-compliant providers, particularly in the launch of its new enforcement programme to tackle services that display or publish pornographic content.
21 January 2025 – Ofcom launches digital safety toolkit for online services
In an effort to ensure wide-scale compliance, Ofcom has launched a digital toolkit to help online services comply with the Online Safety Act. The toolkit may be useful to any organisation that falls within the scope of the Act, but it has been designed to assist small and medium-sized providers of user-to-user services and search services. Upon answering a handful of questions, the tool will provide tailored compliance recommendations to the provider, including an overview of the required safety measures and how risks of illegal harms could arise on the service. The tool has been launched to help businesses better understand their responsibilities to complete an illegal content risk assessment and comply with safety duties, record-keeping, and reviewing. Ofcom has, however, reiterated that the responsibility to comply still lies with the service providers.
16 January 2025 – Ofcom publishes Children’s Access Assessment Guidance
All services in scope of the OSA are required to carry out children’s access assessments to determine whether their service, or a part of their service, is likely to be accessed by children under the age of 18. The guidance sets out the step-by-step process and the recommended factors and evidence to consider at each stage. Children’s access assessments must be completed by 16 April 2025. The statement also confirms that where a service provider fails to complete an assessment, the service will be treated as likely to be accessed by children and may face enforcement action by Ofcom, including significant financial penalties and remedial action.
16 January 2025 – Ofcom publishes guidance on age checks to protect children online
Ofcom has published its industry guidance on highly effective age assurance to protect children from encountering harmful content and pornographic material online. The guidance covers Part 5 services (which publish their own pornographic content) and the user-to-user aspects of Part 3 services.
The final guidance confirms the criteria for ‘highly effective’, being technically accurate, reliable, and fair. The guidance also provides some examples of methods that could be considered HEAA, including credit card checks, digital identity services, email-based age estimation, photo ID matching, and facial age estimation, and those that will not suffice, such as self-declaration, debit cards (which in the UK do not require a user to be over the age of 18), and general contractual restrictions.
Part 5 services that publish their own pornographic content must act immediately to introduce effective age checks, but Part 3 services must introduce highly effective age assurance methods by July 2025.
16 January 2025 – Ofcom publishes Age Assurance and Children’s Access Statement
Ofcom has published an overview of its confirmed approach to children’s access assessments and highly effective age assurance (HEAA). The statement explains Ofcom’s duties and sets out the supporting information it must provide from a statutory perspective, including the evidence relied on and the impact assessment carried out in respect of the proposals. It also summarises key themes in responses to the previous consultation and explains how those comments have been addressed in the final guidance.
16 December 2024 – Ofcom publishes a consultation on Technology Notice powers
Ofcom has published its consultation on the framework underpinning its Online Safety Technology Notice powers to tackle terrorism and child sexual exploitation and abuse (CSEA) content. The consultation outlines Ofcom’s policy proposals for minimum standards of accuracy for accredited technologies. The proposed approach would require applicants to complete an accreditation application form and an audit-based assessment before being pooled into categories of technologies and independently performance tested. It also outlines Ofcom’s draft Technology Notice Guidance for service providers. The guidance is expected to cover eight sections, including: a summary of the legal framework; Ofcom’s approach to the assessment; what might prompt an exercise of the powers; what to expect when Ofcom is considering issuing a Notice; the stages of Ofcom’s process to decide whether to issue a Notice; next steps following the issuing of a Notice; and the disclosure of information about the exercise of Ofcom’s functions. Responses to the consultation will close on 10 March 2025.
16 December 2024 – Ofcom publishes an update on next steps as regulations come into force
Ofcom has published its first-edition codes of practice on tackling illegal harms under the Online Safety Act. The publication of these codes means that providers now have until 16 March 2025 to complete their illegal harms risk assessment and must start implementing safety measures to mitigate these risks from 17 March 2025. Ahead of these deadlines, Ofcom has reiterated the extent of its enforcement powers. Namely, Ofcom has the power to fine companies up to £18m or 10% of their qualifying world revenue (QWR), as well as the power to apply for a court order to block a site in the UK. Ofcom has also outlined the most important changes delivered by the codes and guidance. These changes include senior accountability for safety; better moderation, easier reporting, and built-in safety tests; and protecting children from sexual abuse and exploitation online.
16 December 2024 – Statement by the Secretary of State approving the categorisation thresholds
The Government has approved Ofcom’s draft Regulations setting out the threshold conditions for Category 1, 2A and 2B services under the Act. The threshold conditions are:
- Category 1: (i) services with content recommender systems and a user base of 34 million UK users; or (ii) services with content recommender systems, the ability for users to forward or re-share existing content on the service, and a user base of 7 million UK users;
- Category 2A: non-vertical search services with more than 7 million UK users; and
- Category 2B: user-to-user services that allow users to send direct messages and have more than 3 million UK users.
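For readers mapping a service against the approved thresholds, the conditions above can be read as a simple decision rule. The following is an illustrative sketch only: the function name and boolean flags are our own shorthand, the statutory tests in the draft Regulations are considerably more detailed, and the user-count conditions are treated here as simple "more than" comparisons.

```python
def categorise(uk_users: int, has_recommender: bool,
               allows_resharing: bool, is_search: bool,
               is_vertical_search: bool = False,
               allows_direct_messages: bool = False) -> list[str]:
    """Illustrative sketch of the approved OSA categorisation
    thresholds. A service can fall into more than one category;
    the statutory definitions are more involved than these flags."""
    categories = []
    # Category 1: recommender system + 34m UK users, or
    # recommender system + resharing + 7m UK users
    if has_recommender and uk_users > 34_000_000:
        categories.append("Category 1")
    elif has_recommender and allows_resharing and uk_users > 7_000_000:
        categories.append("Category 1")
    # Category 2A: non-vertical search services with over 7m UK users
    if is_search and not is_vertical_search and uk_users > 7_000_000:
        categories.append("Category 2A")
    # Category 2B: direct messaging with over 3m UK users
    if allows_direct_messages and uk_users > 3_000_000:
        categories.append("Category 2B")
    return categories
```

Note that the categories are not mutually exclusive: a large user-to-user service with both a recommender system and direct messaging could meet the Category 1 and Category 2B conditions at the same time.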
20 November 2024 – Government publishes a draft Statement of Strategic Priorities covering online safety policy priorities
The government has published a draft Statement setting out its strategic priorities for online safety, marking the first time this power has been exercised by the government. The purpose of the statement is to set out priorities for Ofcom to consider when exercising its regulatory functions and to guide the government’s implementation of the Online Safety Act. Ofcom will also have to report back to the Secretary of State on what action it has taken to ensure the priorities are being upheld. The five priorities are: (1) safety by design, (2) transparency and accountability, (3) agile regulation, (4) inclusivity and resilience, and (5) technology and innovation.
15 November 2024 – Ofcom reiterates important dates for Online Safety Act compliance
With the first new duties taking effect towards the end of 2024, Ofcom has created a table to explain the important milestones for compliance. The table includes compulsory duties and optional deadlines to respond to consultations, outlining both the start date and the completion date for each milestone. The keyword search feature also allows users to find deadlines specific to their service type. As the earliest start date is December 2024, online service providers are encouraged to make use of the table and the guidance links provided to ensure their compliance with the new duties.
8 November 2024 – Ofcom publishes an open letter to online service providers regarding Generative AI and chatbots
Ofcom has published an open letter to UK online service providers to clarify how the Online Safety Act will apply to Generative AI and chatbots. The letter comes in the wake of the recent news surrounding the misuse of AI, including the use of a Generative AI chatbot platform to create ‘virtual clones’ of real people and deceased children. Ofcom have reiterated what is regulated under the OSA and how it applies to Generative AI chatbot tools and platforms. The letter strongly urges online service providers to prepare now to comply with the relevant duties using the draft Codes of Practice, with the first duties beginning to take effect from December 2024.
24 October 2024 – Ofcom publishes a consultation on a fees and penalties regime
Ofcom has published the first consultation on a new fees and penalties regime to respond to breaches of the Online Safety Act. Fees will be calculated based on providers’ qualifying worldwide revenue (“QWR”) which will then be used to calculate the maximum penalty cap that can be imposed on providers. Ofcom has now drafted secondary legislation to define QWR for the purposes of these calculations. The consultation also proposes a different approach when group undertakings are found jointly and severally liable, with the maximum penalty cap being the greater of £18 million or 10% of the QWR of the provider and every group undertaking related to the provider at the time. Responses to the consultation will close on 9 January 2025.
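The proposed penalty cap described above reduces to a simple maximum. As a minimal sketch under stated assumptions (the function names are our own, the statutory definition of QWR in the draft secondary legislation is more involved, and figures are in pounds sterling):

```python
def max_penalty_cap(qwr_gbp: float) -> float:
    """Illustrative sketch of the OSA maximum penalty cap:
    the greater of £18 million or 10% of the provider's
    qualifying worldwide revenue (QWR)."""
    return max(18_000_000, 0.10 * qwr_gbp)

def group_penalty_cap(provider_qwr: float, group_qwrs: list[float]) -> float:
    """Sketch of the proposed approach where group undertakings are
    jointly and severally liable: the cap is the greater of £18m or
    10% of the combined QWR of the provider and every related
    group undertaking."""
    return max(18_000_000, 0.10 * (provider_qwr + sum(group_qwrs)))
```

For example, a provider with £100m QWR would face a cap of £18m (since 10% of £100m is only £10m), while adding a related undertaking with £200m QWR under the joint-liability proposal would raise the cap to £30m.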
17 October 2024 – Ofcom publishes a progress update on the roadmap to regulation
Ofcom has provided an update on their progress since the Online Safety Act became law. The update outlines what Ofcom have done to implement the new rules, including publishing their media literacy strategy and publishing proposals for how services should approach content that is harmful to children. The progress report also outlines Ofcom’s intended next steps, focusing on illegal harms (Phase 1), child safety, pornography and the protection of women and girls (Phase 2), and categorisation and additional duties for categorised services (Phase 3).
The roadmap has been updated since the last progress report in October 2023. The implementation plan for Phase 1 remains largely unchanged, except for two additional consultations now proposed: one in December 2024 on minimum standards of accuracy and accreditation for terrorism notices, and another in Q2 2025 on additional measures.
Phase 2 has been reprioritised, with Ofcom accelerating certain implementation steps:
- The consultation on protecting women and girls has been advanced from Q2 2025 to February 2025.
- The timeline for completing children’s access assessments has been moved up from Q2 2025 to Q1 2025, with children’s risk assessments shifted forward from Q3 2025 to Q2 2025.
- The enforcement of the protection of children Codes is now expected in July 2025, earlier than the previous Q3-Q4 2025 timeframe.
These changes have led to Phase 3 being deferred to 2026, instead of 2025 as initially planned. Additionally, the updated roadmap outlines plans for Ofcom to issue advice on fee thresholds in April 2025 and release final guidance on super-complaints in Q4 2025.
13 September 2024 – Government makes intimate image abuse a priority offence
The Government announced that the act of sharing intimate images without consent will be classified as a priority offence, which is the most serious type of offence under the Online Safety Act.
7 August 2024 – Ofcom publishes an open letter to online service providers
Ofcom has published an open letter urging UK online service providers to take immediate steps to prevent their platforms from being used to incite hatred, violence, or other illegal activities. The letter reiterates the need for proactive measures ahead of the Online Safety Act’s full implementation, noting that once the final codes of practice are released later this year, platforms will have three months to assess and mitigate risks, and must swiftly address illegal content.
2 August 2024 – Consultation opened: torture and animal cruelty and update on timing for illegal content assessments
Ofcom is consulting on updating its draft illegal harms codes and guidance under the Online Safety Act to include animal cruelty and human torture as content that platforms must address. This follows the November 2023 consultation, where animal cruelty was recognised as a late addition to the Act. The regulatory documents will be published alongside the Illegal Harms Statement in December 2024.
Following the publication of the documents, providers will have three months to conduct their illegal content risk assessment. This assessment will incorporate proposals from the torture and animal harms consultation, the illegal content discussed in the November 2023 consultation, and the findings from the May 2024 protection of children consultation.
The consultation period will run until 13 September 2024.
26 July 2024 – Consultation opened: draft transparency reporting guidance and online safety information guidance
Ofcom is consulting on draft statutory transparency reporting guidance covering the process that it will adopt for deciding what providers must include in their transparency reports. The guidance will also include how information from these reports will be used to inform Ofcom’s transparency report.
Additionally, Ofcom is seeking input on its proposed online safety information guidance. The proposal outlines when and how Ofcom may exercise its powers, designed to be adaptable to account for the specific circumstances in which these powers might be applied.
The consultation period will run until 4 October 2024.
17 July 2024 – Consultation closed: protecting children from harms online
Ofcom will consider all the responses and publish a regulatory statement and conclusions.
8 May 2024 – Consultation opened: protecting children from harms online
Ofcom published the second major consultation focusing on the proposal for how user-to-user services and search services should protect children from harmful content. The proposal includes draft Children’s Safety Code, draft Children’s Risk Assessment Guidance, and draft Children’s Access Assessment. The consultation will end on 17 July 2024.
19 April 2024 – Ofcom publishes principles for media literacy by design
Following a consultation on what good media literacy ‘by design’ looks like for social media, search, video-sharing and gaming services, Ofcom suggested 14 common principles categorised under the following headings:
- Proactivity, priority, transparency and accountability;
- User-centric design and timely interventions; and
- Monitoring and evaluating.
The principles align with what is expected for services regulated under the OSA.
1 April 2024 – Additional provisions from the OSA coming into force
Sections 101 (information in connection with an investigation into the death of a child) and 102 (information notices) came into force under the Online Safety Act 2023 (Commencement No. 2) Regulations 2023.
25 March 2024 – Consultation opened: additional duties applicable to online sites and apps
The regulator started the third phase of online safety regulation regarding extra duties for categorised services (i.e. those falling into Category 1, 2A or 2B). These extra obligations include empowering users with content control tools, safeguarding journalistic content, combating fraudulent advertising, and issuing transparency reports. To formulate codes of practice and guidance, Ofcom invites stakeholders to provide evidence, with a formal consultation scheduled for 2025 to incorporate feedback and finalise the regulatory framework.
5 March 2024 – Consultation closed: guidance for service providers publishing pornographic content
The regulator closed the second of its four major consultations. According to Ofcom’s timeline, final guidance and Parliamentary approval are expected in 2025. Ofcom also published additional materials, including draft guidance, covering online pornography regulation.
29 February 2024 - Ofcom advises the government on categorisation of services
Following research and industry consultation, Ofcom submitted its advice to the Secretary of State on the categorisation of services. In its recommendation, Ofcom sets out thresholds for each service category:
- Category 1 - thresholds should target services meeting either of two conditions:
  a. Condition 1: using a content recommender system, with over 34 million UK users.
  b. Condition 2: allowing content forwarding or resharing, using a content recommender system, with over 7 million UK users.
- Category 2A - thresholds should target search services (excluding vertical search services) with over 7 million UK users.
- Category 2B - thresholds should target services allowing direct messaging, with over 3 million UK users.
23 February 2024 – Consultation period closed: protecting people from illegal harms
Ofcom will consider all the responses and plans to publish a regulatory statement and conclusions around the end of 2024. The code of practice is then expected to be submitted for Parliamentary approval.
21 February 2024 – House of Commons Committee Report: Preparedness for online safety regulation
The Committee of Public Accounts published recommendations to the government, including the importance of meeting implementation deadlines for the OSA and for Ofcom to work on ensuring they are well-placed to monitor and enforce the OSA.
31 January 2024 – Additional provisions from the OSA coming into force
Part 10 of the OSA, which deals with communications offences, came into force through the Online Safety Act 2023 (Commencement No. 3) Regulations 2024.
29 January 2024 – Overseas regulators published
The Online Safety (List of Overseas Regulators) Regulations 2024 specify overseas regulators for the purposes of section 114, covering cooperation and disclosure of information among them and Ofcom. The list includes the European Commission and regulators from Australia, Ireland, Netherlands, Germany and France.
10 January 2024 – The Online Safety Act 2023 (Commencement No. 3) Regulations 2024
The Regulations were passed and will bring into force Part 10 of the OSA on 31 January 2024. Part 10 creates various new offences, such as sending flashing images, sending false communications, encouraging or assisting serious self-harm, and making threatening communications. Part 10 also establishes extra-territorial application for these offences and liability of corporate officers.
10 January 2024 – Additional provisions from the OSA coming into force
Many provisions of the OSA came into force through the Online Safety Act 2023 (Commencement No. 2) Regulations 2023, including the remaining provisions of section 72 (further duties about terms of service), which requires user-to-user services to include clear and accessible provisions in their terms of service informing users of their right to bring a claim for breach of contract.
19 December 2023 – The Online Safety Act 2023 (Commencement No. 2) Regulations 2023
The Regulations were passed and will bring into force many provisions of the OSA on 10 January 2024 and 1 April 2024.
5 December 2023 – Consultation opened: guidance for service providers publishing pornographic content
The second consultation focuses on draft guidance to assist providers of online services that publish or display regulated provider pornographic content in complying with their age assurance and record-keeping obligations under the OSA.
21 November 2023 – The Online Safety Act 2023 (Commencement No. 1) Regulations 2023
The Regulations establish that subsections (2) and (7) of section 114 of the OSA come into force on 22 November 2023. These subsections deal with cooperation with overseas regulators.
9 November 2023 – Consultation opened: protecting people from illegal harms online
As the first of four major consultations that Ofcom will publish for the OSA, the consultation focuses on proposals for how internet services that enable the sharing of user-generated content and search services should approach their new duties relating to illegal content.
26 October 2023 – The Online Safety Act 2023 received royal assent and became law in the UK
The law applies to online service providers that permit users to exchange user-generated content and to search engines. Find out more in our write-up published when the Act was introduced.