Australian Senate Grills Lobbyist Over Social Media Failures

An advocate for social media platforms told an Australian Senate committee that a law banning users under 16 from social media platforms should be delayed until next year rather than rushed through Parliament this week.

Sunita Bose, managing director of Digital Industry Group, a lobbying group that represents Facebook, Instagram, TikTok and X, answered questions at a Senate committee hearing into the pioneering legislation, which was introduced to Parliament last week.

Bose said Parliament should wait until a government-commissioned evaluation of age assurance technologies is completed in June of next year, saying that otherwise the bill would be passed “without knowing how it will work”.

The proposed law, which is likely to be passed by Parliament with support of the major parties, would not take effect until a year after it becomes law, allowing platforms time to devise technical solutions for blocking under-age users while protecting users’ privacy.


Fines

The legislation would impose fines of up to 50 million Australian dollars ($33m, £26m) on platforms for systemic failures to prevent children from holding accounts.

Communications Minister Michelle Rowland told Parliament earlier that social media in its current form “is not a safe product” for children and that the law would support parents wanting to keep their children away from it.

“Access to social media does not have to be the defining feature of growing up,” she said.

Some senators showed hostility to Bose’s assertions that the law could isolate children and compromise their safety by driving them from mainstream social media platforms to “darker, less safe online spaces”.

“That’s an outrageous statement. You’re trying to protect the big tech giants,” said senator Sarah Henderson.

Heated debate

Asked why platforms do not use algorithms to stop sending harmful material that promotes suicide and eating disorders to young users, Bose said platforms already have algorithms that filter out nudity, and that “continued investment” would see the algorithms do a “better job” of addressing harmful content.

In response to a question by senator Dave Sharma, Bose said she did not know how much advertising revenue platforms made from Australian children and was not familiar with a Harvard study that found Facebook, Instagram, Snapchat, TikTok, YouTube and X made $11bn in advertising revenues from US users under 18 in 2022.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
