Ofcom Probes TikTok Over ‘Inaccurate’ Child Protection Data

UK media regulator Ofcom has said it is investigating ByteDance-owned TikTok for providing what it called inaccurate information about its parental controls.

Ofcom said TikTok may have misled it with regard to its parental controls, called Family Pairing, and that it had “reasonable grounds” to believe the social media platform may have breached its legal responsibilities.

The investigation looks into whether TikTok failed to comply with a legal information request, Ofcom said.

The information formed part of an Ofcom report published last Thursday into the methods used by video-sharing services to stop children viewing harmful content.


Parental controls

The issue falls under the regulator’s remit after the Online Safety Act became law in October.

TikTok said the inaccurate information was provided due to a technical glitch.

It said it informed Ofcom of the issue several weeks ago and that this triggered the current investigation, adding that it is working to rectify the issue and plans to supply accurate data as soon as possible.

Family Pairing allows parents to link their accounts to that of a child, giving them control over screen time, direct messages, content filtering and other parameters.

Child safety

TikTok said Ofcom’s probe related to information provided about the feature, introduced in April 2020, and not the feature itself.

Snap and Twitch were also asked for information about how they complied with legal requirements to protect children from seeing harmful material online.

The three services all require users to be 13 or over, but Ofcom research estimates that one-fifth of children aged 8 to 17 have accounts identifying them as over-18.

Unlike adult content site OnlyFans, which has multiple levels of age checks in place, including facial age estimation and ID checks, the three services rely on users to declare their own ages.

Age checks

TikTok said it uses additional technologies to spot underage users through keyword detection, while Twitch uses language analysis tools and Snap responds to user reports.

TikTok said accounts removed for being underage amounted to more than 1 percent of its monthly active user base in the 12 months to March 2023, while Twitch said it had removed 0.03 percent of its accounts and Snap up to 1,000 accounts for the same reason over the same period.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
