Doe v. X.AI Grok CSAM Class Action (2026)

From AI Law Wiki

March 16, 2026 — Three minor plaintiffs filed a class action lawsuit against X.AI Corp. and X.AI LLC in the U.S. District Court for the Northern District of California (San Jose Division), alleging that xAI's Grok AI model generated child sexual abuse material (CSAM) deepfakes using real photographs of minors.[1][2]

Case Details

The case, Doe 1 v. X.AI Corp., Case No. 5:26-cv-02246, was filed on March 16, 2026.[2] The 44-page complaint alleges that Grok's "Spicy Mode" allowed users to generate photorealistic nude and sexually explicit images of real minors by uploading their photographs, and that xAI profited from this capability by restricting it to paid subscribers.[1][2]

Legal Claims

The complaint asserts claims under:

  • Masha's Law (18 U.S.C. § 2255) — civil remedy for child sexual exploitation victims[1]
  • Trafficking Victims Protection Act (18 U.S.C. § 1595) — civil remedy for trafficking victims[1]
  • California state law — including claims for negligence and products liability[2]

The plaintiffs seek compensatory damages, punitive damages, and injunctive relief on behalf of a nationwide class of all U.S. persons whose images as minors were altered by Grok into sexualized images or videos.[1]

Background

The lawsuit follows a California Attorney General investigation launched on January 14, 2026, by AG Rob Bonta, probing xAI's role in producing nonconsensual deepfake intimate images of women, girls, and children via Grok's "Spicy Mode."[3] The AG's investigation cited reports, dating from late December 2025, of over 20,000 generated sexualized images, including apparent child depictions.[3]

Reports indicated that Grok generated approximately 3 million sexualized images and 23,000 apparent child depictions in late 2025 and early 2026; only after the practice was publicly exposed did xAI restrict explicit content generation to paid "Spicy Mode" subscribers.[1][4]

One perpetrator who used Grok to generate CSAM of real minors was reportedly arrested, and the material was reported to the National Center for Missing & Exploited Children (NCMEC).[1]

Current Status

As of April 2026, the case remains at the initial pleading stage. The complaint (Docket No. 1) and a motion to relate the case to a prior matter (Docket No. 3) were filed on March 16, 2026.[2] No judge has yet been assigned; if the motion to relate is granted, the case would go to the judge presiding over the prior matter.[2] No answer, motion to dismiss, or other responsive filing by xAI has been reported.

Significance

This case represents one of the first class action lawsuits alleging that an AI model directly generated child sexual abuse material from real minors' images. It raises novel questions about AI company liability for outputs that violate federal child exploitation laws, the adequacy of safety guardrails in commercial AI products, and the scope of civil remedies available to victims of AI-generated CSAM under Masha's Law and related statutes.

Related Developments

References