What does the White Paper propose?
The White Paper proposes a new statutory duty of care to deal with online harms. The new duty is to be governed by a new regulatory framework and overseen by an independent regulator.
Who will be regulated?
Although the full range of companies to whom the rules will apply has not yet been defined, it is likely to be wide. The White Paper says that the new framework is intended to apply to companies that allow users to share or discover user-generated content or to interact with each other online, including social media platforms, file hosting sites, public discussion forums, messaging services and search engines, SMEs, start-ups and other organisations such as charities. The White Paper acknowledges that this encapsulates a broad range of business types, and therefore that a “risk based and proportionate approach” is to be taken. The initial focus will be on the companies which the government believes pose the biggest risk of harm to users, either because of the scale of the platforms or because of known issues with serious harms. It appears that the intention is that the duty of care will reflect the diversity of organisations in scope, their capacities, and what is technically possible in terms of proactive measures. The government says that it will minimise excessive burdens according to the size and resources of organisations, but that all companies must take reasonable and proportionate action to tackle harms on their services.
The Regulatory Framework
An independent regulator will regulate the sector, and codes of practice will set out how the statutory duty should be fulfilled. The regulator is intended to be industry-funded, potentially via fees, charges and/or a levy. The government will reserve the power to direct the regulator in respect of codes of practice relating to terrorist activity, as well as child sexual exploitation and abuse (CSEA). The regulator will be expected to work with law enforcement in respect of codes relating to illegal activities. Deviations from the codes of practice may be permitted provided that it is possible to explain and justify to the regulator how any alternative approach will effectively deliver the same or greater level of impact.
It is anticipated that the regulator will require particularly robust action to tackle terrorist content and CSEA online.
Key measures required to comply with the duty of care are likely to include:
- Regulated bodies should have terms and conditions which are clear and accessible, including to children and other vulnerable users; their effectiveness will be enforced by the regulator.
- Annual transparency reports should be submitted by regulated entities to the regulator, outlining the prevalence of harmful content on their platforms and the countermeasures being taken to address it. The reports are likely to be required to include evidence of enforcement of terms and conditions and will be published online so that users can make informed decisions about internet use.
- Easy-to-access user complaints functions should be provided which the regulator will oversee via an independent review mechanism. Designated bodies may be able to make ‘super-complaints’ to the regulator to defend the needs of users.
It is also anticipated that the regulator will have the power to require additional information about the impact of algorithms in selecting content, for children in particular, and to provide the means for testing the operation of those algorithms.
The White Paper says that the government intends to ensure that the internet remains free and open and that it intends to protect freedom of expression online. The new regulator is expressly not intended to be responsible for policing truth or accuracy. The government also says that it recognises the importance of privacy and seeks feedback on how regulation should apply to private communications.
How will the new duty be enforced?
Enforcement powers of the new regulator may include:
- requiring additional information from regulated bodies about alleged breaches;
- serving notices on companies alleged to have breached standards, setting a timeframe for resolution;
- publishing notices of breach;
- requiring third party companies to cease providing services to companies in breach;
- requiring ISPs to block services in breach;
- imposing personal liability on senior management, including fines and possible criminal liability.
Early Action
The White Paper encourages companies to take early action, ahead of the implementation of the regulatory framework, and sets out a detailed list in section 7 of what can be expected in the codes of practice. These include taking reasonable steps to:
- enforce terms and conditions effectively and consistently;
- take prompt, transparent, and effective action following user reporting;
- support law enforcement investigations;
- direct users who have suffered or may suffer harm to appropriate support;
- prevent known terrorist/CSEA/seriously violent content being made available to users, and proactively identify and act upon CSEA activity (e.g. grooming), including alongside/within live streams and proactive reports to law enforcement;
- provide effective systems for identifying and safeguarding child users and allow child users or parents and carers to report and remove images which leave them vulnerable to abuse;
- implement effective procedures to deal with interference with legal proceedings, serious violence, hate crime, cyberbullying, harassment and self-harm and suicide-related content;
- promote authoritative news sources and diverse content, improve the transparency of political advertising and have clear policies and reporting systems in place to tackle disinformation;
- protect public figures facing abuse, providing a mechanism for such users to report others and ensuring such harms are dealt with rapidly;
- adhere to DCMS’ existing Social Media Code of Practice (published 2017).
Next Steps – ongoing consultation
The consultation, which is due to close on 1 July 2019, can be responded to at dcms.eu.qualtrics.com.
Client Alert 2019-090