Court Narrows Injunction On California Social Media Law


US appeals court throws out much of injunction that blocks California law aimed at protecting children from harmful effects of social media

A US appeals court has thrown out much of an injunction that blocked California from bringing into force a law meant to protect children online, in a significant development for regulators’ efforts to rein in the effects of social media.

The Ninth US Circuit Court of Appeals said industry group NetChoice was likely to be successful in showing that the 2022 law, the Age-Appropriate Design Code Act, violated the First Amendment protection of free speech by compelling online businesses to assess and mitigate potential harms to children.

But the court vacated the broader injunction and remanded the case to the district court for further consideration, saying the lower court had failed to assess whether other provisions of the law could remain valid.

The court found it was “too early to determine whether the unconstitutional provisions of the CAADCA were likely severable from its valid remainder”.


‘Censorship’

The district court must now re-evaluate the law’s remaining provisions under the guidance of the Ninth Circuit opinion.

NetChoice, whose members include large tech companies such as Amazon, Google and Facebook parent Meta, called the decision a “victory for free expression, online security and Californian families”.

“The court recognized that California’s government cannot commandeer private businesses to censor lawful content online or to restrict access to it,” said Chris Marchese, director of the NetChoice Litigation Center, in a statement.

Proponents of the law also welcomed the decision.

“The Ninth Circuit made it clear that the district court did not properly analyse the safety-by-design and privacy-by-default protections at the heart of this bill, and the Court’s reasoning offers hope for policies that address harmful and addictive features such as dark patterns, infinite scrolls and streaks head-on,” said Meetali Jain, executive director of the Tech Justice Law Project.

Regulators worldwide are looking for ways to mitigate the effects of social media, particularly on young people, as well as its potential for harm, such as the spread of misinformation.

Social media controls

The EU’s new Digital Services Act (DSA) brings in stricter measures for the biggest online companies, while the UK’s Online Safety Act is expected to impose similar requirements when it comes fully into force next year.

This month TikTok agreed to “permanently withdraw” a rewards programme from the EU after the bloc raised concerns about the feature’s potentially “addictive effect” in the first resolution of a case under the DSA, which went into effect in February.

TikTok made the commitment without conceding it had broken the DSA, officials said.

Digital affairs commissioner Margrethe Vestager said the “legally binding” withdrawal “sends a clear message to the entire social media industry”.