The full government response to the Online Harms White Paper consultation has been published, setting out how a proposed legal duty of care on online companies will work in practice and giving them new responsibilities towards their users.
The safety of children is at the heart of the measures. The Secretary of State for Digital, Culture, Media and Sport, Oliver Dowden, told parliament the legislation represented "decisive action" to protect both children and adults online. Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography.
Social media sites, websites, apps and other services that host user-generated content or allow people to talk to others online will be required to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content. The government is also progressing work with the Law Commission on whether the promotion of self-harm should be made illegal.
James said, "Many constituents have raised concerns that tech companies are not properly protecting young people from harmful content. These new proposals will protect children and help ensure that everyone can take advantage of the benefits of going online."
In addition, the most popular social media sites with the largest audiences and high-risk features, such as TikTok and Facebook, will need to go further by setting and enforcing clear terms and conditions that explicitly state how they will handle content that is legal but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice.
It has been confirmed that Ofcom will be the regulator with the power to fine companies failing in their duty of care up to £18 million or 10% of annual global turnover, whichever is higher. Ofcom will have the power to block non-compliant services from being accessed in the UK.
The legislation also includes proposals to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously, for example if they do not respond fully, accurately and in a timely manner to information requests from Ofcom.
The government plans to bring the laws forward in an Online Safety Bill next year and set the global standard for proportionate yet effective regulation. The legislation will include safeguards for freedom of expression and pluralism online - protecting people’s rights to participate in society and engage in robust debate.
The new regulations will apply to any company in the world hosting user-generated content online accessible by people in the UK or enabling them to privately or publicly interact with others online.