Meta Platforms is facing a formal investigation by European authorities for possible violations of the Digital Services Act (DSA), linked to the protection of children.
The European Commission announced on Thursday that it has “opened formal proceedings to assess whether Meta, the provider of Facebook and Instagram, may have breached the Digital Services Act (DSA) in areas linked to the protection of minors.”
The US Senate has also been investigating Meta's actions to protect children, after dozens of US states sued the firm in October 2023, alleging its Instagram and Facebook platforms are harming children's mental health.
Now the European Commission has stated that it is concerned “that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’. In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta.”
The opening of a formal investigation comes after a preliminary analysis of the risk assessment report sent by Meta in September 2023, coupled with Meta’s replies to the Commission’s formal requests for information, as well as the Commission’s own analysis.
“Today we are taking another step to ensure safety for young online users,” said Margrethe Vestager, Executive VP for a Europe Fit for the Digital Age. “With the Digital Services Act we established rules that can protect minors when they interact online.”
“We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate and will now carry on an in-depth investigation,” said Vestager. “We want to protect young people’s mental and physical health.”
Investigation focus
The investigation will address the following areas:
– Meta’s compliance with DSA obligations on the assessment and mitigation of risks caused by the design of Facebook’s and Instagram’s online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour, and/or reinforce so-called ‘rabbit hole’ effects. Such an assessment is required to counter potential risks to the exercise of the fundamental right to the physical and mental well-being of children, as well as to the respect of their rights.
– Meta’s compliance with DSA requirements in relation to mitigation measures to prevent access by minors to inappropriate content, notably the age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
– Meta’s compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of its recommender systems.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” said Thierry Breton, Commissioner for Internal Market.
“We will now investigate in-depth the potential addictive and “rabbit hole” effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems,” said Breton. “We are sparing no effort to protect our children.”
The Commission will now carry out an in-depth investigation as a matter of priority and will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections.
As Very Large Online Platforms (VLOPs), Facebook and Instagram had to begin complying with a series of obligations set out in the DSA four months after their designation, i.e. by the end of August 2023.
Since 17 February 2024, the Digital Services Act has applied to all online intermediaries in the EU.
Facebook whistleblowers
Meta has faced a number of whistleblowers who have publicly expressed concern at the impact of the platforms on children’s health.
Last November, Arturo Bejar, a former engineering director at Instagram, testified before the US Senate about the harm Instagram can do to children, including to his own daughter.
Bejar alleged that Meta’s leaders had ignored his concerns when he raised them internally.
In February the CEOs of Meta, TikTok, X and other platforms appeared before the Senate Judiciary Committee to discuss online protections for children.
Another whistleblower, former Facebook employee Frances Haugen, told the Senate in October 2021 that Facebook allegedly puts “astronomical profits before people”. She has also claimed that Facebook knows it is harming people, and said the social network weakens democracy.