Instagram chief pushed on long delay in launching teen safety features like nudity filter, court filing reveals

Source: TechCrunch
The plaintiffs in the lawsuit focused on whether social media apps like Instagram are addictive and harmful, and on why it took Meta so long to roll out basic safety tools, such as a nudity filter for private messages sent to teens. In April 2024, Meta introduced a feature that automatically blurs explicit photos in Instagram direct messages — something the company reportedly understood was a problem about six years earlier.
In a newly unsealed filing in a federal lawsuit, Instagram chief Adam Mosseri was asked about an August 2018 email chain with Meta VP and chief information security officer Guy Rosen, in which he stated that “terrible” things could happen via Instagram’s private messages, also known as DMs. The plaintiffs’ attorney said those egregious things could include pictures of penises, and Mosseri agreed.
However, the Meta exec pushed back on a line of questioning that suggested the company should have informed parents that its messaging system was not being monitored, beyond removing CSAM (child sexual abuse material).
“I think it’s pretty clear that you can send problematic content in any messaging app, whether it’s Instagram or whatever,” Mosseri said. He said the company tried to balance people’s privacy concerns with its own safety interests.
The testimony also revealed new statistics about harmful activity on Instagram: 19.2% of survey participants between the ages of 13 and 15 said they had seen nudity or sexual images on Instagram that they did not want to see. Additionally, 8.4% of 13- to 15-year-olds said that, within the past seven days of using the app, they had seen someone hurt themselves or threaten to do so on Instagram.
While the nudity filter is just one of many updates added to Instagram in recent years to protect teens, the plaintiffs’ attorneys were more concerned with Meta’s delay in acting than with whether the app is safer for teens now.
Mosseri was also questioned about other topics, including a 2017 email from a Facebook intern who said he wanted to find “addicted” Facebook users and see if there were ways to help them.
The 2018 email chain was supposed to serve as an example that Meta was aware of the risks to minors, but it took until 2024 for the company to release a product that addresses the problem of sexual images sent to teens. This includes those photos sent by adults who may have engaged in grooming, a process in which an adult builds trust with a minor over time to manipulate or sexually exploit them.
When reached for comment, Meta spokesperson Lisa Crenshaw pointed to other ways the company has worked to keep teens safe over the years, noting, “For more than a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes — like offering teen accounts with built-in protections and providing parents with the tools to manage their teens’ experiences. We’re proud of the progress we’ve made. We’re always working to do better,” she said.
Mosseri’s deposition took place during one of what are now several lawsuits aimed at holding big tech companies liable for harming teens. This particular case, taking place in the U.S. District Court for the Northern District of California, involves plaintiffs who claim that social media platforms are flawed because they are designed to maximize screen time, which encourages addictive behavior in teens. The defendants include Meta, Snap, TikTok, and YouTube (Google).
Similar lawsuits are also underway in Los Angeles County Superior Court and in New Mexico.
Lawyers in these cases hope to prove that big tech companies have prioritized the need for user growth and increased engagement over potential harms affecting their younger users.
The timing of these trials comes amid a growing number of laws restricting teenagers’ use of social media, both in several US states and abroad.
Updated after posting with Meta’s comment.
Posted on: February 25, 2026
