The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and to tackle harm caused by content or activity on their services. The initiative, led by Jeremy Wright MP, Secretary of State for Digital, Culture, Media and Sport, and Sajid Javid MP, Home Secretary, aims to preserve a dynamic and innovative internet while keeping its users safe from serious harm.
Compliance with this duty of care will be overseen and enforced by an independent regulator.
“The internet is a part of our lives – nearly 90% of adults in the UK are online and 99% of 12-15 year olds. In many ways it is a powerful force for good. It can forge connections, share knowledge and spread opportunity across the world. But it can also be used to promote terrorism, undermine civil discourse, spread disinformation, and abuse or bully.” – Jeremy Wright MP
Department for Digital, Culture, Media & Sport
Companies affected will need to be able to show that they are fulfilling their duty of care. Relevant terms and conditions will be required to be sufficiently clear and accessible, including to children and other vulnerable users. The regulator will assess how effectively these terms are enforced as part of any regulatory action.
The regulator will have a suite of powers to take effective enforcement action against companies that have breached their statutory duty of care. This may include the powers to issue substantial fines and to impose liability on individual members of senior management.
Reflecting the threat to national security and to the physical safety of children, the government proposes to take the power to direct the regulator in relation to codes of practice on terrorist activity or child sexual exploitation and abuse (CSEA) online, and these codes must be signed off by the Home Secretary.
For codes of practice relating to illegal harms, including incitement of violence and the sale of illegal goods and services such as weapons, there will be a clear expectation that the regulator will work with law enforcement to ensure the codes adequately keep pace with the threat.
Developing a culture of transparency, trust and accountability will be a critical element of the new regulatory framework. The regulator will have the power to require annual transparency reports from companies in scope, outlining the prevalence of harmful content on their platforms and what countermeasures they are taking to address it. These reports will be published online by the regulator, so that users and parents can make informed decisions about internet use. The regulator will also have powers to require additional information, including about the impact of algorithms in selecting content for users, and to ensure that companies proactively report on both emerging and known harms.
For the most serious online offences, such as CSEA and terrorism, the new regulatory framework will expect companies to go much further and to demonstrate the steps they are taking to combat the dissemination of associated content and illegal behaviours.
Read the Online Harms White Paper here.