Plaintiffs' lawyers in the case, which centers on whether social media apps like Instagram are addictive and harmful, wanted to know why it took Meta so long to roll out basic safety tools, such as a nudity filter for private messages sent to teens. Meta introduced the ability to automatically blur explicit images in Instagram DMs in April 2024, yet the company reportedly recognized the problem nearly six years earlier.
In a newly released deposition in a federal lawsuit, Instagram chief Adam Mosseri, asked about an August 2018 email chain with Guy Rosen, Meta's vice president and chief information security officer, said "terrible" things could happen through Instagram's private messages (also known as DMs). Those things could include pictures of penises, the plaintiffs' attorney said, and Mosseri agreed.
But Meta executives pushed back on questions suggesting that, beyond removing CSAM (child sexual abuse material), the company should have notified parents that its messaging system was not being monitored.
"I think it's clear that you can message problematic content on any messaging app, whether it's Instagram or something else," Mosseri said, adding that the company is trying to balance users' privacy concerns against its safety concerns.
The testimony also surfaced new statistics about harmful experiences on Instagram: 19.2% of survey respondents between the ages of 13 and 15 said they had seen nude or sexual images on Instagram that they did not want to see, and 8.4% of 13- to 15-year-olds said they had seen someone harm themselves or attempt to harm themselves on Instagram within the past seven days.
The nudity filter is just one of several updates Instagram has added in recent years to protect teens, but the plaintiffs' lawyers were more concerned with the delay than with whether the app is now safer for teens.
Mosseri was also asked about other topics, including a 2017 email in which a Facebook intern said he wanted to find "addicted" Facebook users and see if there was a way to help them.
The 2018 email chain was presented as evidence of Meta's early awareness of the risks to minors; it nonetheless took until 2024 for the company to release a product addressing sexual images sent to teens. That includes images sent by adults engaged in grooming, the process of building a trusting relationship over time so that an adult can manipulate or sexually exploit a minor.
Asked for comment, Meta spokeswoman Lisa Crenshaw pointed to other ways the company has worked over the years to keep teens safe. "We've conducted rigorous research, and we're using these insights to make meaningful changes, including introducing Teen Accounts with built-in protections and giving parents the tools to manage their teens' experiences," she said. "We're proud of the progress we've made, and we are always striving to do better."
Mosseri's deposition comes in one of several ongoing lawsuits that seek to hold Big Tech companies accountable for harming teenagers. In this particular suit, filed in the U.S. District Court for the Northern District of California, plaintiffs argue that social media platforms are defectively designed to maximize addictive screen time among teenagers. Defendants include Meta, Snap, TikTok, and YouTube (Google).
Similar lawsuits are ongoing in Los Angeles County Superior Court and in New Mexico.
Lawyers in each case hope to prove that Big Tech companies prioritized user growth and engagement over the potential harm to their youngest users.
The cases come as a growing number of U.S. states and countries abroad enact restrictions on teenagers' use of social media.
Updated with Meta's comment after publication.
