Landmark moment for child safety online as government takes major step towards delivery of an Online Harms Bill
Date published: 23 December 2020
The government’s final decisions on new laws to make the UK a safer place to be online have been announced by the Digital Secretary Oliver Dowden and Home Secretary Priti Patel.
The full government response to the Online Harms White Paper consultation sets out how the proposed legal duty of care on online companies will work in practice and gives them new responsibilities towards their users.
Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse material, terrorist material and content promoting suicide. The government is also working with the Law Commission on whether the promotion of self-harm should be made illegal.
The government says tech platforms will need to do ‘far more’ to protect children from being exposed to harmful content or activity (such as grooming, bullying and pornography) so future generations can enjoy the internet with better protections in place to reduce the risk of harm.
The most popular social media sites, with the largest audiences and high-risk features, will need to go further by setting and enforcing clear terms and conditions which explicitly state how they will handle content which is legal, but could cause significant physical or psychological harm to adults. This includes dangerous disinformation and misinformation.
Ofcom is now confirmed as the regulator with the power to fine companies failing in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher. It will have the power to block non-compliant services from being accessed in the UK.
The legislation includes provisions to impose criminal sanctions on senior managers.
Digital Secretary Oliver Dowden said: “Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation. We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.
“This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”
Home Secretary Priti Patel said: “We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms. Tech companies must put public safety first or face the consequences.”
However, the NSPCC, which has been campaigning for years to make the online world a safer place for children, says the proposals fall short of ensuring criminal sanctions against named directors whose companies fail to uphold their Duty of Care.
The NSPCC has been campaigning for regulation as part of its Wild West Web campaign since 2017, including measures for a legal Duty of Care to keep children safe.
Peter Wanless, NSPCC CEO said: “This is a landmark moment – the NSPCC has long called for an enforceable legal Duty of Care on tech companies and today is a major step towards legislation that can make that a reality. For too long children have been exposed to disgraceful abuse and harm online.
“Child protection and children’s voices must remain front and centre of regulatory requirements. We set out six tests for robust regulation – including action to tackle both online sexual abuse and harmful content and a regulator with the power to investigate and hold tech firms to account with criminal and financial sanctions.
“We will now be closely scrutinising the proposals against those tests. Above all, legislation must ensure Ofcom has the power and resources to enforce the duty of care and be able to identify and then take appropriate action against tech firms that fail.”
A draft Bill is expected to be scrutinised by MPs and peers before it passes into law.
The new regulations will apply to any company in the world that hosts user-generated content online accessible to people in the UK, or that enables them to interact privately or publicly with others online.
It includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.
Companies will have different responsibilities for different categories of content and activity, under an approach focused on the sites, apps and platforms where the risk of harm is greatest.
All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess the likelihood of children accessing their services and, if so, provide additional protections for them.
The government will make clear in the legislation the harmful content and activity that the regulations will cover and Ofcom will set out how companies can fulfil their duty of care in codes of practice.
A small group of companies with the largest online presences and high-risk features will be in Category 1, and will need to assess the risk of legal content or activity on their services with “a reasonably foreseeable risk of causing significant physical or psychological harm to adults”. They will then need to make clear what type of “legal but harmful” content is acceptable on their platforms in their terms and conditions and enforce this transparently and consistently.
All companies will need mechanisms that let people easily report harmful content or activity and appeal the takedown of content. Category 1 companies will also be required to publish transparency reports about the steps they are taking to tackle online harms.
The legislation will include safeguards for freedom of expression and pluralism online - protecting people’s rights to participate in society and engage in robust debate.
Online journalism from news publishers’ websites will be exempt, as will reader comments on such sites. Specific measures will be included in the legislation to make sure journalistic content is still protected when it is reshared on social media platforms.