Expanding Lawsuit Against Workday’s AI Job Screening
A federal judge has opened the door to more plaintiffs joining a lawsuit against Workday, a software company that provides AI-powered job screening tools. The lawsuit alleges that Workday’s technology systematically rejects job applications from candidates over 40 years old.
The case, initially filed in 2023 by a single plaintiff, claims that Workday’s AI screening tool rejected over 100 of his job applications due to his age, race, and disabilities. Judge Rita Lin ruled that the case can now proceed as a collective action, allowing other individuals with similar claims to join.
The plaintiffs argue that Workday’s algorithm rejected their candidacies outright because of their age. They claim the AI tool turned down hundreds of their applications without human review, often sending emails stating that they did not meet job qualifications despite their relevant experience.
Workday denies the allegations, stating that its AI technology evaluates applications against criteria set by its clients and does not discriminate against applicants. Tech experts warn, however, that AI systems can develop biases over time, even when no bias is programmed in at the outset.
The lawsuit highlights the potential dangers of using AI in hiring processes. As the American Civil Liberties Union warned, “AI tools are trained with a large amount of data and make predictions about future outcomes based on correlations and patterns in that data.” These tools can exacerbate existing discrimination in the workplace based on race, sex, disability, and other protected characteristics.
The outcome of this case will help set a precedent for the legal risks companies face when using AI in their hiring processes. Business owners who have adopted AI for hiring and other personnel work should take note of these risks and consider the implications for their own practices.