New ‘transformational’ code to protect children’s privacy online

Image caption Children playing an online game

Social media sites, online games and streaming services used by children will have to abide by a new privacy code set by the UK’s data watchdog.

Elizabeth Denham, the information commissioner, said future generations will be “astonished to think that we ever didn’t protect kids online”.

She said the new Age Appropriate Design Code will be “transformational”.

The father of Molly Russell, 14, who killed herself after viewing graphic content online, welcomed the standards.

The Information Commissioner’s Office – the UK’s data privacy regulator – published the new code of conduct on Wednesday, after a draft was first published last April.

It hopes the changes will come into force by autumn 2021, once Parliament approves it, with large fines for breaches.

Image caption The father of Molly Russell, who took her own life aged 14, welcomed the code

The code includes a list of 15 standards that companies behind online services are expected to comply with to protect children’s privacy.

Examples of online services covered by the code include internet-connected toys, apps, social media platforms, online games, educational websites and streaming services.

Firms that design, develop or run such products must provide a “baseline” of data protection for children, the code says.

The standards also include:

  • Location settings that would allow a child’s location to be shared should be switched off by default
  • Privacy settings should be set to high by default, and nudge techniques should not be used to encourage children to weaken their settings

“I believe that it will be transformational,” Ms Denham told the Press Association.

“I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online. I think it will be as ordinary as keeping children safe by putting on a seat belt.”

Image caption Children “are using an internet that was not designed for them,” says Ms Denham

Ms Denham said the move was widely supported by firms, though she added that the gaming industry and some other tech companies had expressed concern about their business models.

She added: “We have an existing law, GDPR, that requires special treatment of children and I think these 15 standards will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media.”

The new standards follow concerns over young people suffering from grooming by predators, data misuse, problem gambling and access to damaging content which could affect their mental health.

Ian Russell believes his daughter Molly’s use of Instagram was a factor in her suicide aged 14 in 2017.

After she died, her family found graphic posts about suicide and self-harm on her account.

Media caption Molly Russell’s father Ian travels to the United States and meets other parents bereaved by suicide

The response following her death led to Instagram pledging to remove images, drawings and even cartoons showing methods of self-harm or suicide.

Welcoming the code, Mr Russell said: “Although small steps have been taken by some social media platforms, there seems little significant investment and a lack of commitment to a meaningful change, both essential steps required to create a safer world wide web.

“The Age Appropriate Design Code demonstrates how the technology companies might have responded effectively and immediately.”

Andy Burrows, the NSPCC’s head of child safety online policy, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.

He said: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.

“It is now key that these measures are enforced in a proportionate and targeted way.”

Facebook said it welcomed “the considerations raised”, adding: “The safety of young people is central to our decision-making, and we’ve spent over a decade introducing new features and tools to help everyone have a positive and safe experience on our platforms, including recent updates such as increased Direct Message privacy settings on Instagram.

“We are actively working on developing more features in this space and are committed to working with governments and the tech industry on appropriate solutions around topics such as preventing underage use of our platforms.”

Source: bbc.co.uk