Ofcom, the UK regulator in charge of enforcing the UK’s controversial Online Safety Act, has noted the link between social media posts and the riots in late July and early August that followed a mass stabbing in Southport, in which three girls were murdered.

In an open letter to the government, Ofcom chief executive Melanie Dawes wrote that the regulator “spoke directly and in detail with many of the largest social media and messaging platforms from early August, to find out their response to the riots.”

The letter added that when Ofcom’s Online Safety Act powers come into force early next year, “we will be able to consider whether to open an investigation into any future incidents of this kind, and take enforcement action if we have evidence that a breach of the rules may have occurred.”


Posts contributed

Ofcom’s Dawes also wrote that “these events (the riots) have clearly highlighted questions tech firms will need to address as the duties come into force.”

“While some told us they took action to limit the spread of illegal content, we have seen evidence that it nonetheless proliferated, and appears to have contributed to the significant violent disorder which followed the tragic murders in Southport,” Dawes wrote.

“Of the numerous convictions which have followed, some have been found guilty of online posts threatening death or serious harm, stirring up racial hatred, or sending false information with intent to cause harm.”

It comes after Ofcom earlier this month issued a stark warning to big tech platforms about their content moderation practices, with Dawes saying the regulator would take “strong action” against tech companies that break the new rules.

But after the summer riots, Dawes said that Ofcom’s assessment, based on information provided by tech firms and other stakeholders, was as follows:

Illegal content and disinformation spread widely and quickly online following the attack. Some posts were made with “malicious intent and seeking to influence public opinion and reaction”, and “some platforms were used to spread hatred, provoke violence targeting racial and religious groups, and encourage others to attack and set fire to mosques and asylum accommodation.”

Dawes noted that some accounts (including some with over 100,000 followers) falsely stated that the attacker was a Muslim asylum seeker and shared unverified claims about his political and ideological views. “Posts about the Southport incident and subsequent events from high-profile accounts reached millions of users, demonstrating the role that virality and algorithmic recommendations can play in driving divisive narratives in a crisis period.”

Dawes also stated that “there was a clear connection between online activity and violent disorder seen on UK streets.”

She added that “most online services took rapid action in response to the situation, but responses were uneven.”

Dawes said she was “confident that, had the draft Codes been in force at the time, they would have provided a firm basis for urgent engagement with services on the steps they were taking to protect UK users from harm.”

Elon Musk

It is not clear whether Dawes was referring to Elon Musk when she noted that posts about the Southport incident “from high-profile accounts reached millions of users”, and highlighted the role of “virality and algorithmic recommendations” in driving divisive narratives during a crisis.

Elon Musk had clashed directly with the British government over free speech, after social media posts making false claims fuelled the riots and violence over the summer.

Musk had accused Prime Minister Sir Keir Starmer’s Labour government of censorship and recently falsely claimed it was “releasing convicted paedophiles in order to imprison people for social media posts” – a reference to a policy of releasing some offenders early to ease prison overcrowding.

Elon Musk’s X (as well as Telegram) had been blamed for enabling the spread of misinformation and for being notably slow to act on harmful material.

Indeed, Musk repeatedly amplified posts containing misinformation to his 193 million followers on the platform, and criticised the British government for cracking down on hate speech.

In August Ofcom had urged social media companies to take action against posts inciting violence, and the government warned it may revisit the Online Safety Act.

The media regulator had previously urged platforms to address content that depicted “hatred and disorder” and promoted violence and misinformation.

Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long-standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
