From WIRED:
Both Google and Meta deny the allegations in the complaint. “Providing a safer and healthier experience for young people has always been at the core of our work,” Jose Castañeda, a Google spokesperson, said in a statement. “Collaborating with youth, mental health and parenting experts, we have built services and policies to provide youth with age-appropriate experiences, and provide parents with strong controls.”
“For more than a decade, we have listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” Meta spokeswoman Stephanie Ottway said in a statement. “We’re using these insights to make meaningful changes — like offering teen accounts with built-in protections and providing parents with the tools to manage their teens’ experiences.”
Bellwether case
KGM started watching YouTube at age 6, had an Instagram account when she was 11, got Snapchat at 13 and TikTok a year later — each app allegedly adding to a “spiral of anxiety and depression fueled by low self-esteem and body dysmorphia,” according to her attorney, Joseph VanZandt. She and her mother, Karen, filed a lawsuit against Meta, Google’s YouTube, Snap, and TikTok, claiming that features such as “autoplay” and “infinite scrolling” contributed to her social media addiction and that social media use contributed to her anxiety and depression, making her feel insecure about herself. (Snap and TikTok settled the case with KGM before trial. Terms were not disclosed.)
KGM’s mother testified last year that she did not realize the harm these platforms could do to her daughter, and that she would not have given her a phone had she known. Bergman says KGM’s lawsuit was chosen as the lead case because it “represents many other young women who have suffered serious mental health harm, illness, and emotional distress as a result of social media.”
“The goal of lawyers who bring these cases is not just to win and get compensation for their individual clients,” says Benjamin Zipursky, a law professor at Fordham University School of Law. “They are aiming for a series of wins in these so-called bellwether trials. Then they will try to pressure the companies into a collective settlement in which they pay potentially billions of dollars and also agree to change their practices.”
The KGM case is the first of 22 bellwether trials to be held in Los Angeles Superior Court, although that number may change. A verdict for the plaintiff could give the roughly 1,600 remaining litigants significant leverage and could potentially force tech companies to adopt new safeguards. The trial also promises to raise broader awareness of social media business models and practices. “If the public has a very negative reaction to what is shown, or what the jury finds, that could impact legislation at the state or federal level,” Zipursky adds.
Bergman, who has spent 25 years representing asbestos victims, says this trial feels like history repeating itself. “When Frances Haugen testified before Congress and revealed for the first time what social media companies knew their platforms were doing to target vulnerable young people, I realized this was asbestos again,” Bergman says.
Battle lines
Drawing parallels to product liability cases against Big Tobacco and the auto industry, the plaintiffs’ main argument is that the big tech companies designed their social media platforms negligently, meaning they did not take reasonable steps to avoid causing harm. “Specifically, plaintiffs argue that design features such as infinite scrolling and autoplay caused certain injuries to minors, including eating disorders, self-harm, and suicide,” says Mary Anne Franks, a law professor at George Washington University.
The technology companies, for their part, are likely to focus on causation and defenses related to free speech. “Defendants will argue that it was third-party content that caused plaintiffs’ injury, not access to that content provided by the platforms,” Franks says. She says the companies are also likely to argue that “to the extent corporate decision-making about content moderation is involved, the decision-making process is protected by the First Amendment,” citing the 2024 U.S. Supreme Court ruling in Moody v. NetChoice.
**Social media addiction case puts big tech companies on trial**
