Ofcom has publicly warned major tech firms to tame their “toxic algorithms” in order to comply with the UK’s online safety laws protecting children.

The UK communications regulator announced 40 practical steps that tech firms must take to keep children safer, including the introduction of robust age-checks to prevent children from seeing harmful content relating to suicide, self-harm and pornography.

Tech firms have also been warned that harmful material must be filtered out or downranked in recommended content, which will likely require changes to algorithms, implementation of age verification, and improvements in content moderation.
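Ofcom’s announcement describes outcomes rather than implementations, but a filter-or-downrank step in a recommender’s ranking stage might look roughly like the minimal sketch below. The Item structure, the harm labels and the downrank_factor value are all illustrative assumptions, not anything Ofcom has specified.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    relevance: float                      # base score from the recommender model
    harm_labels: set[str] = field(default_factory=set)  # from upstream classifiers

# Categories to filter out of a child's feed entirely rather than downrank.
FILTER_OUT = {"suicide", "self_harm", "eating_disorder", "pornography"}

def rank_for_child(candidates: list[Item], downrank_factor: float = 0.1) -> list[Item]:
    """Filter out the most harmful content and downrank other harmful material."""
    scored: list[tuple[float, Item]] = []
    for item in candidates:
        if item.harm_labels & FILTER_OUT:
            continue                      # never surfaces in the child's feed
        score = item.relevance
        if item.harm_labels:              # e.g. violent or abusive material
            score *= downrank_factor      # suppressed, not removed
        scored.append((score, item))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [item for _, item in scored]
```

A real system would combine many classifier signals and apply such a policy across the whole pipeline, but the shape of the change, dropping the worst content and suppressing the rest, is what the regulator is asking for.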

Children’s Safety Codes of Practice

In its draft Children’s Safety Codes of Practice, Ofcom sets out strict new duties on services that can be accessed by children, including popular social media sites, apps and search engines.

The onus is now on tech firms to first assess the risk their service poses to children and then implement safety measures to mitigate those risks.

Ofcom is seeking responses to its consultation, which must be submitted by 17 July, and the regulator expects to publish its final Children’s Safety Codes of Practice within a year. Tech firms will then have three months to conduct their children’s risk assessments.

Once approved by Parliament, the Codes will come into effect and Ofcom can begin enforcing the regime. Tech firms that do not comply are being warned of possible enforcement action, including sizeable fines.

These duties stem from the Online Safety Act itself, which applies to any service that can be accessed by children, from popular social media sites and apps to search engines.

Measures that tech firms must follow include preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography.

Tech firms must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, online bullying, and content promoting dangerous challenges.

Tech firm responsibility

“We want children to enjoy life online,” said Dame Melanie Dawes, Ofcom chief executive. “But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”

“In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms,” said Dawes. “They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.”

“When we passed the Online Safety Act last year we went further than almost any other country in our bid to make the UK the safest place to be a child online,” added technology secretary Michelle Donelan. “That task is a complex journey but one we are committed to, and our groundbreaking laws will hold tech companies to account in a way they have never before experienced.”

Specific steps

Specifically, Ofcom’s Codes call on tech firms to undertake the following:

  1. Carry out robust age-checks to stop children accessing harmful content. In some cases this could mean preventing children from accessing the entire site or app; in others it might mean age-restricting parts of a site or app for adults-only access, or restricting children’s access to identified harmful content (a minimal sketch of such a gate follows this list).
  2. Ensure that algorithms which recommend content do not operate in a way that harms children. Ofcom said that algorithms which provide personalised recommendations to users are children’s main pathway to harm online. Left unchecked, they risk serving up large volumes of unsolicited, dangerous content to children in their personalised news feeds or ‘For You’ pages.
  3. Introduce better moderation of content harmful to children. Ofcom said evidence shows that content harmful to children is available on many services at scale, which suggests services’ current efforts to moderate it are insufficient. Ofcom’s code states that all user-to-user services must have content moderation systems and processes that ensure swift action is taken against content harmful to children. Search engines are expected to take similar action.
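The first and third measures interact: a service deciding whether to serve a piece of identified harmful content needs to know whether the viewer has passed an age check. The sketch below illustrates one way such a gate could combine the two, assuming hypothetical harm labels from upstream classifiers and a safety-first default that treats unverified users as children; none of the names or categories here come from Ofcom.

```python
from enum import Enum, auto

class AgeStatus(Enum):
    VERIFIED_ADULT = auto()   # passed a robust age check
    UNVERIFIED = auto()       # no check passed; treated as a child by default
    CHILD = auto()            # known or self-declared to be a child

# Illustrative stand-in for the most harmful categories, which children
# must be prevented from encountering altogether.
PRIMARY_PRIORITY = {"suicide", "self_harm", "eating_disorder", "pornography"}

def can_view(viewer: AgeStatus, harm_labels: set[str]) -> bool:
    """Gate identified harmful content behind age assurance."""
    if harm_labels & PRIMARY_PRIORITY:
        # Only viewers who have positively proven adulthood get through.
        return viewer is AgeStatus.VERIFIED_ADULT
    return True
```

In this toy model the burden of proof sits with the viewer rather than the content: anyone who has not completed an age check is handled as if they were a child, which is the safety-first posture the Codes push services towards.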

Ofcom is calling its draft Children’s Safety Codes of Practice a “reset for children’s safety”, and expects them to make a significant difference to children’s online experiences.

Last week Ofcom announced it was opening an investigation into whether OnlyFans is doing enough to prevent children accessing pornography on its site.
