EU Sets Deadline For Meta, TikTok To Detail Efforts To Curb Misinformation
EU demands Meta, TikTok detail efforts to curb illegal content, disinformation, amid fears of Hamas broadcasting executions
The European Union has begun flexing the regulatory muscle it gained when the Digital Services Act (DSA) came into force in August.
The European Commission (the executive branch of the EU) announced that it has formally demanded that Meta and TikTok detail their efforts to curb illegal content and disinformation during the Israel-Hamas war.
It comes after the EU last week sent separate open letters to Elon Musk’s X (aka Twitter), Meta, TikTok, and YouTube, warning the platforms about the amount of disinformation about the conflict circulating on their services.
European demand
The European Commission said it is “requesting Meta to provide more information on the measures it has taken to comply with obligations related to risk assessments and mitigation measures to protect the integrity of elections and following the terrorist attacks across Israel by Hamas, in particular with regard to the dissemination and amplification of illegal content and disinformation.”
And it has given Meta a deadline, saying it “must provide the requested information to the Commission by 25 October 2023.”
Meta also has until 8 November 2023 to respond on the protection of the integrity of elections.
“Based on the assessment of Meta’s replies, the Commission will assess next steps,” it said. “This could entail the formal opening of proceedings pursuant to Article 66 of the DSA.”
It has also sent the same request to Chinese-owned TikTok, asking it to explain the measures it has taken to reduce the risk of spreading and amplifying terrorist and violent content, hate speech and disinformation.
Under the DSA, Meta and TikTok could face a fine of up to 6 percent of annual revenues, or a suspension of their services in the EU.
Execution broadcasts
The European Union’s DSA is being tested at the moment by the sheer volume of misinformation and other fake content flooding social media platforms as a result of the Israel-Hamas war.
Meta, it seems, has chosen a bad week to announce that it will soon launch broadcast channels for Facebook and Messenger.
Thierry Breton, the European commissioner responsible for the Digital Services Act (who last week was the public face of the EU’s warnings to big-name social media firms about the level of misinformation surrounding the Hamas terrorist attacks), has raised the prospect that these tech platforms could be used to live-broadcast executions of Israeli citizens captured during the Hamas terrorist attack on Israel.
“Events in the Middle East triggered by the Hamas terrorist attacks in Israel have raised the stakes even higher,” Breton said in a speech Wednesday. “The widespread dissemination of illegal content and disinformation linked to these events carries a clear risk of stigmatising certain communities and destabilising our democratic structures, not to mention exposing our children to violent content.”
“In our exchanges with the platforms, we have specifically asked them to prepare for the risk of live broadcasts of executions by Hamas – an imminent risk from which we must protect our citizens – and we are seeking assurances that the platforms are well prepared for such possibilities,” Breton warned.
We will not let terror and #disinformation divide us or undermine our democracy 🇪🇺
My intervention at the European Parliament Plenary on fighting disinformation and dissemination of illegal content in the context of the #DSA and in times of conflict ⤵️https://t.co/iBdSrvZTiS pic.twitter.com/Ddhgs4Nlzv
— Thierry Breton (@ThierryBreton) October 18, 2023
Special operations centre
Meta told the Associated Press it has a “well-established process for identifying and mitigating risks during a crisis while also protecting expression.”
After Hamas militants attacked Israeli communities, “we quickly established a special operations center staffed with experts, including fluent Hebrew and Arabic speakers, to closely monitor and respond to this rapidly evolving situation,” the company reportedly said.
Meta said it has teams working around the clock to keep its platforms safe, take action on content that violates its policies or local law, and coordinate with third-party fact checkers in the region to limit the spread of misinformation.
TikTok didn’t respond to a request for comment.