Government Considers Revisiting Online Safety Act
Policing minister Dame Diana Johnson says the government is considering revising the Online Safety Act after online posts fanned anti-immigrant violence
Ofcom has urged social media companies to take action against posts inciting violence, as the government said it may revisit the Online Safety Act, which is due to come into full force next year.
The media regulator urged platforms to address content that depicts “hatred and disorder” and promotes violence and misinformation.
Existing powers enable Ofcom to suspend or restrict video-sharing platforms such as YouTube or TikTok that fail to protect the public from “harmful material”. The regulator is expected to gain broader powers over social media platforms in general when the Online Safety Act comes into full force next year.
“There is no need to wait to make your sites and apps safer for users,” said Ofcom safety director Gill Whitehead.
Online misinformation
Policing minister Dame Diana Johnson said tech firms “have an obligation now” to “deal with” material that incites violence.
Speaking on the Today programme on BBC Radio 4, Johnson said the government was considering revisiting the upcoming legislation.
“The events of the last few days have meant that we need to look very carefully at what more we can do,” she said.
She said a possible plan to ban convicted rioters from football matches was “being looked at”.
The government said last week that social media platforms “clearly need to do far more” after a list supposedly containing the names and addresses of immigration lawyers spread online, apparently originating on Telegram.
Azzurra Moores of fact-checking organisation Full Fact said online misinformation was a “clear and present danger spilling across into unrest on UK streets in real-time” and urged Ofcom and the government to take “bolder, stronger action”.
‘Not fit for purpose’
London mayor Sadiq Khan said last week that the Online Safety Act was “not fit for purpose” given how rapidly misinformation spreads on social media, and urged ministers to act “very, very quickly” to review it.
Platforms X and Telegram have been notably slow to act on harmful material, with X owner Elon Musk repeatedly amplifying posts containing misinformation to his 193 million followers on the service and criticising the government for cracking down on hate speech.
Telegram said its moderators were “actively monitoring the situation and are removing channels and posts containing calls to violence”, adding that such calls are forbidden under its terms of service.