Ofcom Probes TikTok Over ‘Inaccurate’ Child Protection Data

TikTok app displayed on a smartphone

Ofcom says TikTok may have provided inaccurate data about parental controls, as regulator begins enforcing child online safety rules

UK media regulator Ofcom has said it is investigating ByteDance-owned TikTok for providing what it called inaccurate information about its parental controls.

Ofcom said TikTok may have misled it over its parental controls feature, called Family Pairing, and that it had “reasonable grounds” to believe the social media platform may have breached its legal responsibilities.

The investigation looks into whether TikTok failed to comply with a legal information request, Ofcom said.

The information formed part of an Ofcom report published last Thursday into the methods used by video-sharing services to stop children viewing harmful content.


Parental controls

The issue falls under the regulator’s remit after the Online Safety Act became law in October 2023.

TikTok said inaccurate information was provided due to a technical glitch.

It said it informed Ofcom of the issue several weeks ago, triggering the current investigation, and added that it is working to rectify the problem and plans to supply accurate data as soon as possible.

Family Pairing allows parents to link their accounts to that of a child, giving them control over screen time, direct messages, content filtering and other parameters.

Child safety

TikTok said Ofcom’s probe related to information provided about the feature, introduced in April 2020, and not the feature itself.

Snap and Twitch were also asked for information about how they complied with legal requirements to protect children from seeing harmful material online.

The three services all require users to be 13 or over, but Ofcom research estimates that one-fifth of children aged 8 to 17 have accounts that identify them as being over 18.

Unlike adult content site OnlyFans, which has multiple levels of age checks in place, including facial age estimation and ID checks, the three services rely on users to declare their own ages.

Age checks

TikTok said it uses additional technologies to spot underage users through keyword detection, while Twitch uses language analysis tools and Snap responds to user reports.

TikTok said the underage accounts it removed in the 12 months to March 2023 amounted to more than 1 percent of its monthly active user base, while Twitch said it removed 0.03 percent of accounts and Snap up to 1,000 accounts for being underage over the same period.