(Bloomberg) -- New York City joined a growing chorus of states, cities and school districts suing social media companies over claims that their platforms are fueling a mental-health crisis among teens, alleging that Meta Platforms and others exploit children and adolescents.

The most populous US city filed a lawsuit Wednesday in California state court in Los Angeles against Meta, its Facebook and Instagram platforms; TikTok Inc. and its parent company, ByteDance Ltd.; Google LLC and its YouTube platform; and Snapchat owner Snap Inc.

Social media companies face mounting legal risks from claims that they use algorithms to get children and teenagers addicted to their platforms. Meta was sued by the attorneys general of more than 30 states over similar claims in October. A month later, a judge in Oakland, California, ordered Meta, Google, TikTok and Snap to face hundreds of suits blaming them for hooking young people.

Hundreds of school districts also have sued to force the companies to change their behavior and pay the cost of addressing social media addiction. New York said it spends more than $100 million on mental health programs and services for youth every year. The city’s health commissioner last month called unchecked access to social media a “public health hazard.”

“Our city is built on innovation and technology, but many social media platforms end up endangering our children’s mental health, promoting addiction, and encouraging unsafe behavior,” New York Mayor Eric Adams said in a statement.

‘Maximizing Engagement’

Like many other complaints across the US, the New York City lawsuit alleges the companies borrowed from behavioral and neurobiological tactics used by the casino and tobacco industries to design features aimed at “maximizing youth engagement to drive advertising revenue,” and target children and adolescents who are “particularly vulnerable to the addictive effects of those features.”

José Castañeda, a Google spokesperson, disputed the city’s claims.

“Providing young people with a safer, healthier experience online has always been core to our work,” he said in an email. “In collaboration with youth, mental health and parenting experts, we’ve built services and policies to give young people age-appropriate experiences, and parents robust controls.”

Meta said it wants teens to have “safe, age-appropriate experiences online,” citing more than 30 tools and features to support them and parents.

“We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online,” a spokesperson said in a statement. 

A TikTok spokesperson said the company has “industry-leading safeguards to support teens’ well-being, including age-restricted features, parental controls, an automatic 60-minute time limit for users under 18, and more. We regularly partner with experts to understand emerging best practices, and will continue to work to keep our community safe by tackling industry-wide challenges.”

A spokesperson for Snap, in a statement, said the platform was “intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends. Snapchat opens directly to a camera – rather than a feed of content that encourages passive scrolling – and has no traditional public likes or comments. While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy and prepared as they face the many challenges of adolescence.”

The case is City of New York v. Meta Platforms Inc., 24STCV03643, Superior Court of the State of California, County of Los Angeles.

--With assistance from Aisha Counts.

(Updates with comment from Google.)

©2024 Bloomberg L.P.