Elon Musk’s xAI faces lawsuit over child sexual abuse imagery of minors Grok allegedly undressed

**Source**: TechCrunch

📂 **Category**: AI, CSAM, Grok, lawsuits, xAI

Elon Musk’s xAI company should be held accountable for allowing its artificial intelligence models to produce sexually offensive images of identifiable minors, three anonymous plaintiffs argued in a lawsuit filed Monday in federal court in California.

The three plaintiffs want to file a class action lawsuit representing anyone who had real photos of themselves as minors turned into sexual content by Grok. They claim that xAI did not take basic precautions used by other frontier labs to prevent their image models from producing pornographic material depicting real people and minors.

The case, Jane Doe 1, Jane Doe 2 (a Minor), and Jane Doe 3 (a Minor) v. X.AI Corp. and X.AI LLC, was filed in the U.S. District Court for the Northern District of California.

Other deep learning image generators employ a range of techniques to prevent the creation of child sexual abuse material from ordinary photos. The lawsuit alleges that xAI did not adopt these industry standards.

It is worth noting that if a model allows the creation of nude or erotic content from real images, it is almost impossible to prevent it from producing sexual content featuring children. The suit cites Musk’s public promotion of Grok’s ability to produce sexual images and to depict real people in skimpy outfits.

The company did not respond to a request for comment from TechCrunch.

One of the plaintiffs, Jane Doe 1, had photos from her high school homecoming and yearbook altered by Grok to depict her nude. An anonymous tipster who contacted her on Instagram told her the photos were circulating online, and sent her a link to a Discord server showing sexual images of her and two other minors she knew from school.

The second plaintiff, Jane Doe 2, was informed by criminal investigators of altered sexual images of herself that were created by a third-party mobile application built on Grok models. The third, Jane Doe 3, was also notified by criminal investigators, who discovered an edited pornographic image of her on the phone of an arrested person. Plaintiffs’ lawyers argue that because third-party use still relies on xAI’s code and servers, the company should be held liable.

The three plaintiffs, two of whom are still minors, say they are suffering severe distress because of the circulation of these images and what it could mean for their reputations and social lives. They are seeking civil penalties under a set of laws aimed at protecting exploited children and preventing corporate negligence.


🕒 **Posted on**: March 17, 2026
