ICO publishes Code of Practice to protect children’s privacy online

Date published: 23 January 2020


The Information Commissioner’s Office has this week published its final Age Appropriate Design Code – a set of 15 standards that online services should meet to protect children’s privacy.

The code sets out the standards expected of those responsible for designing, developing or providing online services such as apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services that are likely to be accessed by children and that process their data.

The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app or game, or visit a website.

That means privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is should also be switched off by default. Data collection and sharing should be minimised, and profiling that allows children to be served targeted content should be switched off by default too.

Elizabeth Denham, Information Commissioner, said: “Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off and even how they are feeling.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”

The code says that the best interests of the child should be a primary consideration when designing and developing online services. And it gives practical guidance on data protection safeguards that ensure online services are appropriate for use by children.

Ms Denham said: “One in five internet users in the UK is a child, but they are using an internet that was not designed for them.

“There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.

“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind.”

The standards of the code are rooted in the General Data Protection Regulation (GDPR), and the code itself was introduced under the Data Protection Act 2018. The ICO submitted the code to the Secretary of State in November and it must complete a statutory process before it is laid before Parliament for approval. After that, organisations will have 12 months to update their practices before the code comes into full effect. The ICO expects this to be by autumn 2021.

This version of the code is the result of wide-ranging consultation and engagement.

The ICO received 450 responses to its initial consultation in April 2019 and followed up with dozens of meetings with individual organisations, trade bodies, industry and sector representatives, and campaigners.

As a result, and in addition to the code itself, the ICO is preparing a significant package of support for organisations.

The code is the first of its kind, but it reflects the global direction of travel, with similar reform being considered in the USA and Europe and, internationally, by the Organisation for Economic Co-operation and Development (OECD).

Ms Denham added: “The ICO’s Code of Practice is the first concrete step towards protecting children online. But it’s just part of the solution. We will continue to work with others here in the UK and around the world to ensure that our code complements other measures being developed to address online harms.”

Responding to the move, Andy Burrows, NSPCC head of child safety online policy, said: “This transformative code will force high-risk social networks to finally take online harm seriously and they will suffer tough consequences if they fail to do so.

“For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.

“It is now key that these measures are enforced in a proportionate and targeted way.

“This code must go hand in hand with a legally enforceable duty of care and an independent regulator to make the UK a world leader in keeping children safe online.”

The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.

The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the General Data Protection Regulation (GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR) and Privacy and Electronic Communications Regulations 2003 (PECR).
