Meta Platforms is facing a formal investigation by European authorities for possible violations of the Digital Services Act (DSA), linked to the protection of children.
The European Commission announced on Thursday that it has “opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.”
Meta is not alone. In February this year the European Commission opened a formal probe into whether TikTok broke Digital Services Act rules related to the protection of minors.
Meanwhile the UK regulator last week warned social media companies to “tame their toxic algorithms” in order to comply with the UK’s online protection laws for children.
Meta has over the years faced many allegations on both sides of the Atlantic that its services and platforms are harmful to children, given the amount of time children spend online.
Last November Meta, speaking before American lawmakers, called for federal legislation that would require app stores to obtain parental approval before under-16s can download apps.
The US Senate began investigating Meta’s actions to protect children after many US states sued the firm in October 2023, alleging that its Instagram and Facebook platforms are harming children’s mental health.
Now the European Commission has stated that it is concerned “that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’. In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.”
The opening of a formal investigation comes after a preliminary analysis of the risk assessment report sent by Meta in September 2023, coupled with Meta’s replies to the Commission’s formal requests for information, as well as the Commission’s own analysis.
“Today we are taking another step to ensure safety for young online users,” said Margrethe Vestager, Executive VP for a Europe Fit for the Digital Age. “With the Digital Services Act we established rules that can protect minors when they interact online.”
“We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate and will now carry on an in-depth investigation,” said Vestager. “We want to protect young people’s mental and physical health.”
The investigation will address the potentially addictive design of the platforms, Meta’s age-verification methods, and the privacy afforded to minors in its recommender systems.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” said Thierry Breton, Commissioner for Internal Market.
“We will now investigate in-depth the potential addictive and ‘rabbit hole’ effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems,” said Breton. “We are sparing no effort to protect our children.”
The Commission will now carry out an in-depth investigation as a matter of priority and will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections.
In April 2023, 19 platforms (including Facebook and Instagram) were designated as Very Large Online Platforms (VLOPs) under the EU’s Digital Services Act, as each has more than 45 million monthly active users in the EU.
As VLOPs, Facebook and Instagram had to start complying with a series of obligations set out in the DSA four months after their designation, i.e. by the end of August 2023.
Since 17 February 2024, the Digital Services Act has applied to all online intermediaries in the EU.
Meta has faced a number of whistleblowers who have publicly expressed concern at the impact of the platforms on children’s health.
Last November a former high-level Meta employee, Arturo Bejar, a former engineering director at Instagram, testified before the US Senate about the harm Instagram can do to children, including his own daughter.
Bejar alleged that Meta’s leaders had ignored his concerns when he raised them internally.
In February the CEOs of Meta, TikTok, X and other platforms appeared before the Senate Judiciary Committee to discuss online protections for children.
At the hearing, Mark Zuckerberg personally apologised to parents who held up pictures of their children who they said had been harmed by social media.
Bejar’s allegations mirror those of the famous whistleblower Frances Haugen, who in 2021 leaked internal documents to the US government showing how company executives ignored warnings about the detrimental effects of social media use on teen girls.
She told the Senate in October 2021 that Facebook puts “astronomical profits before people”. She has also claimed that Facebook knows it is harming people, and that the social network weakens democracy.
At the time Mark Zuckerberg hit back, arguing that many of Haugen’s claims did not make sense.