Meta lost a lawsuit against the state of New Mexico last week, the first time the judicial system has held the company liable for endangering the safety of children. This was a landmark decision in itself – but the next day, Meta lost another case when a Los Angeles jury found that the company intentionally designed its apps to be addictive to children and teens, thus endangering the mental health of the plaintiff, a twenty-year-old known as KGM.
These precedents open the floodgates to a wave of lawsuits over Meta’s intentional targeting of teenage users despite knowing that its apps could harm teens’ mental health. Thousands of cases like KGM’s remain pending, while 40 state attorneys general have filed lawsuits against Meta similar to the New Mexico case.
Although social media platforms are legally shielded from liability for what users post on them, it was not the content on these platforms that was on trial this time. It was design features like endless scrolling and around-the-clock notifications.
“They took the model that had been used against the tobacco industry for many years, and instead of focusing on things like content, they focused on these addictive features — how the platform was designed, and the issues with design, which is different from content, where you have this First Amendment argument,” Alison Fitzpatrick, a digital media attorney and partner at Davis+Gilbert, told TechCrunch. “It turned out to be, at least in these two cases, a winning argument.”
The jury in the New Mexico case, after a six-week trial, found Meta liable for violating the state’s unfair practices law and ordered the company to pay the maximum $5,000 per violation, for a total of $375 million. The Los Angeles jury, which found Meta 70% liable and YouTube 30% liable for plaintiff KGM’s suffering, awarded a combined $6 million in damages against the two companies. (Snap and TikTok settled the case before trial.)
“This is nothing to a company like Meta,” Fitzpatrick said. “But when you take the $6 million and multiply it by all the cases against them, it becomes a huge number.”
“We respectfully disagree with these rulings and will appeal them,” a Meta spokesperson told TechCrunch. “Reducing something as complex as teen mental health to a single cause risks leaving unaddressed many of the broader issues teens face today and ignores the fact that many teens rely on digital communities to connect and find belonging.”
Over the course of the litigation, new internal documents from Meta were uncovered, displaying a pattern of inaction regarding the known negative impact of its platforms on minors, as well as a focused attempt to boost the time teens spend on its apps, even during school or via “finstas,” which are “fake Instagram” accounts teens create specifically to hide from parents or teachers.
One document showed a report including the results of a 2019 study, in which Meta conducted 24 one-on-one interviews with people whose use of the product was flagged as problematic — a classification that applied to an estimated 12.5% of users.
“The best external research suggests that Facebook’s impact on people’s well-being is negative,” the report says.
Multiple documents referenced statements by Meta CEO Mark Zuckerberg and Instagram chief Adam Mosseri about prioritizing the time teens spend on the apps. Zuckerberg even commented that for Facebook Live to be successful with teens, he “thinks we’ll need to get pretty good at not notifying parents/teachers.”
In other documents, Meta employees spoke candidly about the company’s goals of increasing the retention rate of teenage users.
“We’ve learned that one of the things we need to improve on is peeking at your phone during chemistry :),” one employee wrote in an email to Meta CPO Chris Cox.
“No one wakes up thinking they want to maximize the number of times they open Instagram that day,” Meta VP of product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product teams are trying to do.”
A Meta spokesperson told TechCrunch that many of the newly released documents date back nearly a decade, but the company is listening to parents, experts, and law enforcement about how to improve the platform.
“We are not optimizing for the time teens spend on our apps today,” the spokesperson said, citing Instagram’s teen accounts, introduced in 2024, which offer built-in safety features for teen users. These protections include making teen accounts private by default and allowing only people who follow them to tag or mention them in posts. Instagram also sends time-limit reminders telling teens to leave the app after 60 minutes, a setting that users under 16 can change only with parental permission.
For Kelly Stonelake, a former Meta director of product marketing who worked at the company from 2009 to 2024, these revelations come as no surprise. (Stonelake is currently suing Meta for gender-based discrimination and harassment.)
“The mountain of unsealed evidence really shows what I went through firsthand,” she told TechCrunch.
At Meta, Stonelake led “go-to-market” strategies for the social VR app Horizon Worlds as it launched to teens. She says she raised concerns about the lack of effective content moderation tools in the metaverse, but that her objections were not taken seriously.
The US government has taken a keen interest in the issue of children’s online safety, especially after Meta whistleblower Frances Haugen leaked damning internal documents in 2021 that showed Meta knew Instagram was harming teenage girls.
While Congress has proposed several bills aimed at addressing children’s online safety, many of these efforts would do more to monitor adults and police speech than protect minors, some privacy activists say.
“There is no world where passing a censorship or ‘age verification’ law, under the guise of child safety, would not lead to widespread online censorship of content and speech that Trump doesn’t like,” Evan Greer, director of Fight for the Future, said in a statement.
Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act, which had the most momentum of any of these legislative efforts, garnering support from companies like Microsoft, Snap, X, and Apple. But as the bill developed and changed, she became critical of it.
“I urge a ‘no’ vote on the current version,” she said, citing preemption provisions in the bill that would override state regulations on technology companies. “There is language in the latest version that would close the courthouse doors to school districts, to bereaved families, to states — and that’s bizarre.”
This language could, for example, preempt the case New Mexico brought against Meta.
“We need people to come to the table with solutions, instead of what they’re doing now, which is telling a different story to each side of the aisle to stoke anger and dismay,” Stonelake said. “The actual solution must be complex, nuanced, and take into account multiple priorities.”
