Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time the company has been held accountable in court for endangering the safety of children. That was a landmark ruling in itself, but the next day Meta lost again when a Los Angeles jury found that the company had intentionally designed its apps to be addictive to children and teens, thereby endangering the mental health of the plaintiff, a 20-year-old man known as KGM.
These precedents open the floodgates to a wave of lawsuits over Meta's intentional targeting of teenage users, despite the company knowing that its apps could have a negative psychological impact on teenagers. Thousands of cases like KGM's are pending, and more than 40 state attorneys general have filed lawsuits similar to New Mexico's against Meta.
Social media platforms are legally shielded from liability for what users post on them, but user content was not the subject of these lawsuits. At issue were the design features themselves, such as endless scrolling and round-the-clock notifications.
“They took the model that was used years ago against the tobacco industry and instead of focusing on things like content, they focused on these addictive features, how the platforms are designed, and design issues that are different from content that has First Amendment claims,” Alison Fitzpatrick, digital media attorney and partner at Davis+Gilbert, told TechCrunch. “At least in these two cases, it turned out to be a winning argument.”
After a six-week trial, the jury in the New Mexico case found Meta liable for violating the state's unfair practices laws and ordered it to pay up to $5,000 per violation, for a total of $375 million. The Los Angeles jury found Meta 70% responsible and YouTube 30% responsible for plaintiff KGM's suffering, with the two companies facing a combined $6 million in damages. (Snap and TikTok settled before trial.)
“That’s nothing to a company like Meta,” Fitzpatrick said. “But if you multiply that $6 million by all the lawsuits that have been filed, it becomes a huge number.”
“We respectfully disagree with these decisions and intend to appeal,” a Meta spokesperson told TechCrunch. “Attributing something as complex as teen mental health to a single cause risks leaving unresolved many of the broader issues facing today’s teens and overlooking the fact that many teens rely on digital communities to find connection and a sense of belonging.”
During the course of the lawsuit, new internal Meta documents were uncovered that revealed a pattern of inaction regarding the known negative effects of Meta’s platform on minors, as well as intensive attempts to increase the amount of time teens spend on the app, both at school and through “Finstas” (“fake Instagram” accounts that teens create specifically to hide from parents and teachers).
One document is a report on a 2019 study in which Meta conducted 24 one-on-one, in-person interviews with users flagged for problematic use of the product, a designation that applies to an estimated 12.5% of users.
“The best external research shows that Facebook’s impact on people’s well-being is negative,” the report says.
The documents reference statements from Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri about prioritizing engagement with teenagers. Zuckerberg even commented that for Facebook Live to be successful with teens, “we’re going to have to do a good job of not notifying parents and teachers.”
In other documents, Meta employees talk glibly about the company’s goal of increasing user retention among teens.
“I found that one thing we needed to optimize for was the sneaky glance at the phone during chemistry class :),” one employee wrote in an email to Meta CPO Chris Cox.
“Nobody wakes up in the morning and wants to maximize the number of times they open Instagram that day,” Meta VP of Product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product team is trying to do.”
A Meta spokesperson told TechCrunch that while many of the newly released documents are nearly a decade old, the company is listening to parents, experts and law enforcement on how the platform can be improved.
“We do not optimize for the amount of time teens spend today,” the spokesperson said, pointing to Instagram’s Teen Accounts, introduced in 2024, which build in safety features for teen users. These protections include making teen accounts private by default and allowing only people a teen follows to tag or mention their account in posts. Instagram also sends time-limit reminders prompting teens to exit the app after 60 minutes; for users under 16, that reminder can only be changed with parental permission.
None of this comes as a surprise to Kelly Stonelake, who worked at Meta from 2009 to 2024 as a director of product marketing. (Stonelake is currently suing Meta for alleged gender-based discrimination and harassment.)
“The unsealed trove of evidence proves exactly what I experienced firsthand,” she told TechCrunch.
At Meta, Stonelake led the go-to-market strategy for Horizon Worlds, the company's VR social app aimed at teenagers. She says she raised concerns about the lack of effective content moderation tools in the metaverse, but that her objections were not taken seriously.
The U.S. government has taken a keen interest in children's online safety, especially since Meta whistleblower Frances Haugen leaked damning internal documents in 2021 showing that Meta knew Instagram was harming teenage girls.
Congress has proposed a number of bills aimed at addressing children’s online safety, but some privacy activists say many of these efforts would do more to monitor adults and censor speech than to protect minors.
Fight for the Future director Evan Greer said in a statement: “There is no world in which passing censorship and ‘age verification’ laws disguised as child safety does not lead to massive online censorship of content and speech that President Trump doesn’t like.”
Stonelake previously lobbied on Capitol Hill for the Kids Online Safety Act, the most prominent of these legislative efforts, which has attracted support from companies like Microsoft, Snap, X, and Apple. However, she became critical of the bill as it evolved.
“I urge everyone to vote ‘no’ on the current version,” she said, citing the bill’s preemption provisions that would override state regulation of tech companies. “The latest version includes language that would close the courthouse doors to school districts, families, and states, which is outrageous.”
That language could preempt, for example, the very lawsuit New Mexico brought against Meta.
“What we need is a seat at the table with solutions, rather than what people are doing now, which is telling different stories on each side of the aisle to stoke outrage and alarm,” Stonelake said. “Actual solutions are complex and nuanced and require weighing multiple priorities.”