A Black man is suing Workday Inc., claiming its artificial intelligence disqualifies applicants who are Black, disabled, or over the age of 40 at a disproportionate rate.
The suit, filed in the US District Court for the Northern District of California, claims that as a result of the discriminatory screening process, Derek Mobley, 40, has not been hired for any of the jobs he has applied for.
Mobley has allegedly applied to between 80 and 100 positions since 2018, all of which used Workday as a screening tool. Despite holding a bachelor’s degree in finance from Morehouse College and an associate’s degree in network systems administration from ITT Technical Institute, he was denied employment every time.
The lawsuit also says that Workday allows the pre-selection of applicants outside of protected categories. The tools allegedly rely on algorithms and inputs created by humans who often have conscious and unconscious motivations to discriminate.
According to Bloomberg Law, Mobley’s suit seeks to represent anyone in the protected classes who has been discriminated against through Workday’s AI screening process. The suit asks the court to reform “Workday’s screening products, policies, practices and procedures so that the Representative Plaintiff and the class members will be able to compete fairly in the future for jobs and enjoy terms and conditions of employment traditionally afforded similarly situated employees outside of the protected categories.”
Workday, however, has said the lawsuit is without merit and that the company is “committed to trustworthy AI.”
“We engage in a risk-based review process throughout our product lifecycle to help mitigate any unintended consequences, as well as extensive legal reviews to help ensure compliance with regulations,” said a spokesperson for the company.
The company also said it acts “responsibly and transparently in the design and delivery” of its AI solutions.
Concern about AI bias is growing.
Due to the tech industry’s lack of representation, AI isn’t being taught how to understand marginalized groups, and bias is often baked into the outcomes the AI is asked to predict. According to the ACLU, training data is often discriminatory or unrepresentative of people of color, women, and other marginalized groups, and that bias can rear its head throughout an AI system’s design, development, implementation, and use.
Should you be experiencing a similar situation when applying for employment in Massachusetts, contact the Law Offices of Renee Lazar at 978-844-4095 to schedule a FREE confidential case evaluation.