Ofcom has publicly warned major tech firms to improve their “toxic algorithms” in order to comply with the UK’s online protection laws for children.
The UK communications regulator announced 40 practical steps that tech firms must take to keep children safer, including the introduction of robust age-checks to prevent children from seeing harmful content relating to suicide, self-harm and pornography.
Tech firms have also been warned that harmful material must be filtered out or downranked in recommended content, which will likely require changes to algorithms, implementation of age verification, and improvements in content moderation.
In its draft Children’s Safety Codes of Practice, Ofcom sets out strict new duties on services that can be accessed by children, including popular social media sites, apps and search engines.
The onus is now on tech firms to first assess the risk their service poses to children and then implement safety measures to mitigate those risks.
Ofcom is seeking responses to its consultation, which must be submitted by 17 July, and the regulator expects to publish its final Children’s Safety Codes of Practice within a year. Tech firms will then have three months to conduct their children’s risk assessments.
Once approved by Parliament, the Codes will come into effect and Ofcom can begin enforcing the regime. Tech firms that fail to comply have been warned of possible enforcement action, including sizeable fines.
These duties stem from the Online Safety Act, passed last year, which made Ofcom responsible for enforcing the new online protections for children.
Measures that tech firms must implement include preventing children from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography.
Tech firms must also minimise children’s exposure to other serious harms, including violent, hateful or abusive material, online bullying, and content promoting dangerous challenges.
“We want children to enjoy life online,” said Dame Melanie Dawes, Ofcom chief executive. “But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”
“In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms,” said Dawes. “They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.”
“When we passed the Online Safety Act last year we went further than almost any other country in our bid to make the UK the safest place to be a child online,” added Michelle Donelan, technology secretary. “That task is a complex journey but one we are committed to, and our groundbreaking laws will hold tech companies to account in a way they have never before experienced.”
Specifically, Ofcom’s Codes call for tech firms to introduce robust age-checks, tame the algorithms that recommend content to children, and strengthen their content moderation.
Ofcom is calling its draft Children’s Safety Codes of Practice a “reset for children’s safety”, which it expects will make a significant difference to children’s online experiences.
Last week Ofcom announced it was opening an investigation into whether OnlyFans is doing enough to prevent children accessing pornography on its site.