Ofcom Probes TikTok Over ‘Inaccurate’ Child Protection Data

UK media regulator Ofcom has said it is investigating ByteDance-owned TikTok for providing what it called inaccurate information about its parental controls.

Ofcom said TikTok may have misled it with regard to its parental controls, called Family Pairing, and had “reasonable grounds” for believing the social media platform may have breached its legal responsibilities.

The investigation looks into whether TikTok failed to comply with a legal information request, Ofcom said.

The information formed part of an Ofcom report published last Thursday into the methods used by video-sharing services to stop children viewing harmful content.


Parental controls

The issue falls under the regulator’s remit after the Online Safety Act became law in October.

TikTok said inaccurate information was provided due to a technical glitch.

It said it informed Ofcom of the issue several weeks ago and that this triggered the current investigation, adding that it is working to rectify the issue and plans to supply accurate data as soon as possible.

Family Pairing allows parents to link their accounts to that of a child, giving them control over screen time, direct messages, content filtering and other parameters.

Child safety

TikTok said Ofcom’s probe related to information provided about the feature, introduced in April 2020, and not the feature itself.

Snap and Twitch were also asked for information about how they complied with legal requirements to protect children from seeing harmful material online.

The three services all require users to be 13 or over, but Ofcom research estimates that one-fifth of children aged 8 to 17 have accounts identifying them as over-18.

Unlike adult content site OnlyFans, which has multiple levels of age checks in place, including facial age estimation and ID checks, the three services rely on users to declare their own ages.

Age checks

TikTok said it uses additional technologies to spot underage users through keyword detection, while Twitch uses language analysis tools and Snap responds to user reports.

TikTok said accounts removed for being underage amounted to more than 1 percent of its monthly active user base in the 12 months to March 2023, while Twitch said it had removed 0.03 percent of accounts and Snap up to 1,000 accounts for being underage over the same period.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
