Kistler v. Eightfold AI Inc.
Kistler v. Eightfold AI Inc. is a class action lawsuit filed on January 20, 2026, in the Superior Court of California, County of Contra Costa, alleging that an AI-powered hiring platform's candidate scoring process violates consumer reporting laws by functioning as an unregistered consumer reporting agency.[1]
Parties
The named plaintiffs are Erin Kistler and Sruti Bhaumik, both California residents with STEM backgrounds who applied for jobs through Eightfold's platform in 2025, were never interviewed, and never advanced beyond the automated screening stage.[2]
The defendant, Eightfold AI Inc., provides an AI-powered talent intelligence platform used by major employers including Microsoft, PayPal, Morgan Stanley, Starbucks, Chevron, and Bayer.[2]
Claims
The complaint alleges violations of three statutory frameworks:
1. Fair Credit Reporting Act (FCRA) — Eightfold allegedly violated 15 U.S.C. § 1681 et seq. by furnishing consumer reports for employment purposes without obtaining required certifications from employers, providing proper notification, disclosing the process to applicants, obtaining consent, or offering dispute procedures.[3]
2. California Investigative Consumer Reporting Agencies Act (ICRAA) — The company allegedly failed to satisfy consent and certification requirements, ensure reports were used only for permissible purposes, and provide notice and dispute procedures.[1]
3. California consumer protection law — claims alleging unfair and deceptive business conduct.[1]
AI Data Practices
The lawsuit alleges Eightfold's AI platform:[4]
- Assembled detailed dossiers on applicants using data far beyond what they provided, including social media profiles (LinkedIn, GitHub, Stack Overflow), location data, internet and device tracking data, and online cookies
- Analyzed "more than 1.5 billion global data points" including profiles of over 1 billion workers
- Generated inferences about applicants' "preferences, characteristics, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes"
- Created "Match Scores" ranking candidates on a 0-5 scale based on predicted "likelihood of success" for the role
- Filtered out lower-ranked candidates before any human review of their applications
Legal Theory
The plaintiffs argue these practices constitute "consumer reports" under the FCRA definition—Eightfold contracts with employers for compensation, assembles candidate data from multiple sources, evaluates it using proprietary AI, and furnishes reports to employer-clients. Critically, the theory does not require proving the algorithm was biased; plaintiffs need only demonstrate that Eightfold compiled consumer reports without following mandatory procedures.[5]
Significance
The case is among the first to test whether AI-powered hiring tools constitute "consumer reporting agencies" under the FCRA, with significant implications for employers and HR technology providers. A ruling for the plaintiffs could effectively require AI hiring platforms that score candidates in similar ways to comply with the FCRA's consent, disclosure, and dispute resolution requirements.[1][2]
The complaint seeks nationwide class-wide relief, statutory damages, and punitive damages.[4]
References
1. Fox Rothschild, "When AI Meets the FCRA: What the Eightfold Class Action Means for Employers and HR Technology Providers," April 2026.
2. Jones Walker, "AI Hiring Under Fire: What the Eightfold Lawsuit Means for Every Employer Using AI," 2026.
3. Inside Tech Law, "Class Action Questions Whether Using AI to Score Job Applicants Violates the FCRA," March 2026.
4. Fisher Phillips, "Job Applicants Sue AI Screening Company for FCRA Violations," 2026.
5. Ogletree, "Groundbreaking Lawsuit Tests Whether AI Hiring Tools Trigger FCRA Compliance," 2026.