Government Considers Revisiting Online Safety Act

Ofcom has urged social media companies to take action against posts inciting violence, as the government said it may revisit the Online Safety Act, which is due to come into full force next year.

The media regulator urged platforms to address content that depicts “hatred and disorder” and promotes violence and misinformation.

Existing powers enable Ofcom to suspend or restrict video-sharing platforms such as YouTube or TikTok that fail to protect the public from “harmful material”. The regulator is expected to gain broader powers over social media platforms in general under the Online Safety Act next year.

“There is no need to wait to make your sites and apps safer for users,” said Ofcom safety director Gill Whitehead.


Online misinformation

Policing minister Dame Diana Johnson said tech firms “have an obligation now” to “deal with” material that incites violence.

Speaking on the Today programme on BBC Radio 4, Johnson said the government was considering revisiting the upcoming legislation.

“The events of the last few days have meant that we need to look very carefully at what more we can do,” she said.

She said a possible plan to ban convicted rioters from football matches was “being looked at”.

The government said last week that social media platforms “clearly need to do far more” after a list supposedly containing the names and addresses of immigration lawyers was spread online, apparently originating on Telegram.

Azzurra Moores of fact-checking organisation Full Fact said online misinformation was a “clear and present danger spilling across into unrest on UK streets in real-time” and urged Ofcom and the government to take “bolder, stronger action”.

‘Not fit for purpose’

London mayor Sadiq Khan said last week that the Online Safety Act is “not fit for purpose”, given how rapidly misinformation spreads on social media, and urged ministers to act “very, very quickly” to review it.

Platforms X and Telegram have been notably slow to act on harmful material, with X owner Elon Musk repeatedly amplifying posts containing misinformation to his 193 million followers on the service and criticising the government for cracking down on hate speech.

Telegram said its moderators were “actively monitoring the situation and are removing channels and posts containing calls to violence”, adding that “calls to violence” are forbidden in its terms of service.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
