A group of job applicants has filed a lawsuit claiming that AI screening tools used in hiring should be subject to the same disclosure requirements as credit reporting agencies. Just as credit bureaus must give consumers the ability to dispute inaccurate information, the lawsuit argues, job seekers should have the right to see and correct the information AI screening systems compile about them.

The New York Times reports that a lawsuit filed in California takes aim at the growing use of AI in employment screening, arguing that AI hiring systems should be governed by the same consumer protection laws that regulate credit agencies. The case, filed in Contra Costa County Superior Court, is an early example of what legal experts anticipate will be a growing number of challenges to AI-driven hiring practices.

The lawsuit targets Eightfold AI, a Santa Clara-based company that provides screening technology to employers. According to the complaint, Eightfold has assembled a massive database containing information on more than one billion workers worldwide, including over one million job titles and one million skills compiled from sources like LinkedIn. The company’s software evaluates job candidates by comparing their qualifications against employer requirements and assigns them scores on a one-to-five scale.

Erin Kistler, one of the plaintiffs in the case, holds a computer science degree and has decades of experience in the technology sector. Despite her qualifications, she has had little success in her job search over the past year: according to her detailed records, only 0.3 percent of the thousands of applications she submitted resulted in follow-up contact or interviews. Several of those applications were processed through Eightfold’s system. Kistler expressed frustration with the lack of transparency, saying she believes she deserves to know what information is being collected about her and shared with potential employers.

The legal argument centers on the Fair Credit Reporting Act, legislation Congress passed in 1970 as credit agencies began using computer databases to compile consumer information and generate numerical scores. The law requires reporting agencies to disclose information to consumers and provide mechanisms for disputing inaccuracies. While the act is commonly associated with credit reporting, its scope extends beyond financial services. The statute defines a consumer report broadly as any collection of information about personal characteristics used to determine eligibility for various purposes, specifically including employment.

Job seekers describe the AI screening process as an algorithmic gatekeeper that can prevent candidates from reaching human hiring managers while providing no feedback about scores or the methodology used to generate ratings. Without access to this information, candidates claim they have no ability to identify or correct potential errors in their evaluations.

This case represents one approach in what legal professionals expect will be numerous challenges to AI employment tools. Other lawsuits have focused on alleged violations of anti-discrimination laws. A notable 2023 case against Workday alleges that the company’s screening system illegally discriminates against older job seekers and individuals with disabilities.

Judge Rita F. Lin denied Workday’s motion to dismiss that case, finding that the plaintiffs presented plausible evidence suggesting the company’s algorithmic tools disproportionately reject applicants based on factors unrelated to qualifications. Her decision cited evidence including a rejection notice one applicant received at 1:50 a.m., less than an hour after submitting an application. In May, the judge preliminarily certified the case as a collective action potentially encompassing millions of rejected job applicants. Workday maintains that the allegations are false and states that its AI recruiting tools are not trained to identify or use protected characteristics such as race, age, or disability.

The regulatory landscape surrounding these issues has shifted under different administrations. In 2024, the Consumer Financial Protection Bureau issued guidance stating that dossiers and scores created for hiring purposes fall under Fair Credit Reporting Act jurisdiction and that vendors creating them legally qualify as consumer reporting agencies. Such guidance documents signal to companies how regulators intend to enforce their oversight authority.

Jenny Yang, a former Equal Employment Opportunity Commission chair appointed during the Obama administration who now represents the plaintiffs, noted that the commission began examining algorithmic hiring systems over a decade ago, recognizing that these systems were fundamentally transforming hiring processes, with applicants receiving overnight rejections without explanation.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.