AI hiring bias lawsuit heats up: Here are 5 ways you can use AI hiring tools, but avoid legal trouble
Employment trends expert discusses the importance of the Workday lawsuit and how it could impact companies that use AI hiring tools
Current estimates show that around 50% of companies now use AI tools for recruitment and hiring.
However, this wild west of AI usage is now coming under legal scrutiny. Earlier this month, the plaintiff in a groundbreaking AI bias lawsuit (Mobley v. Workday, Inc.) asked the court to certify a nationwide class of potential plaintiffs.
“Derek Mobley’s case against Workday was already setting legal precedent, as it was the first of its kind to bring charges of bias related to a company’s use of AI hiring tools,” says Rob Wilson, President of Employco USA, an employment solutions firm. “His legal team now believes they have enough plaintiffs to expand the lawsuit nationwide, which means more states could be impacted by the decision in this case.”
Wilson says that the EEOC is backing Mobley’s lawsuit.
“In a brief filed in the Mobley v. Workday class-action case, the EEOC argued that if the claims in the case are true, the plaintiff has valid reasons to hold the AI vendor accountable for any discrimination caused by its technology,” says Wilson.
In other words, Wilson says that the EEOC wants to hold companies legally responsible if their AI hiring software has any potential bias built in.
“It’s not enough to say ‘I didn’t realize our algorithm was biased.’ If your company’s AI tools are biased, even if you didn’t intend for that result, then you are as responsible as if you asked the questions yourself,” says Wilson.
It’s crucial to ensure that AI tools are used in a fair and non-discriminatory way to avoid legal risks.
Wilson offers the following tips for companies that want to use AI for hiring without facing legal trouble:
- Regularly Audit and Test for Bias: “Regularly review the algorithms to ensure they don’t inadvertently favor or discriminate against certain groups,” says Wilson. “Conduct fairness audits, run tests with diverse data sets, and adjust as needed to address any biased outcomes.”
- Use Transparent and Explainable AI: “Choose AI tools that provide transparency into hiring decisions, and ensure that hiring managers can understand how the AI is scoring or ranking candidates.”
- Implement Human Oversight: “Always include human oversight in the hiring process. Even if AI tools are used for screening resumes, shortlisting candidates, or conducting initial interviews, ensure that final hiring decisions are made by humans. AI should assist, not replace, human judgment.”
- Maintain Diverse Training Data: “Use a wide variety of training data to help the AI understand and fairly evaluate candidates from different backgrounds. Make sure the training data includes a diverse range of demographic groups to prevent the AI from inheriting biases from historical data.”
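The fairness audits Wilson recommends can start with a very simple screen. One widely used first check in EEOC-style adverse-impact analysis (not something Wilson specifies, and the group names and counts below are hypothetical) is the "four-fifths rule": flag any group whose selection rate falls below 80% of the highest group's rate. A minimal Python sketch:

```python
# Minimal sketch of a "four-fifths rule" adverse-impact check on the
# outcomes of an AI screening tool. All group names and counts here
# are hypothetical, for illustration only.

def selection_rate(selected, applicants):
    """Fraction of applicants the tool advanced to the next stage."""
    return selected / applicants

def four_fifths_check(outcomes):
    """outcomes maps group -> (selected, applicants).

    Returns group -> (rate, passes), where passes is False if the
    group's selection rate is below 80% of the best group's rate --
    a common first screen for potential adverse impact.
    """
    rates = {g: selection_rate(s, a) for g, (s, a) in outcomes.items()}
    best = max(rates.values())
    return {g: (r, r / best >= 0.8) for g, r in rates.items()}

# Hypothetical screening results from an AI resume filter:
results = four_fifths_check({
    "group_a": (50, 100),  # 50% advanced
    "group_b": (30, 100),  # 30% advanced -> ratio 0.6, flagged
})
```

A check like this is only a starting point; a flagged ratio calls for a closer look at the tool and its training data, not an automatic conclusion of bias.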
Most importantly, says Wilson, you should consider allowing candidates to opt out of AI-based evaluations if they feel uncomfortable with the process.
“Some people may prefer a more traditional interview or assessment approach. Offering this option can help reduce the risk of claims that candidates were unfairly subjected to automated decisions,” says Wilson.
To interview Rob Wilson on this topic or any other employment topics, please contact me.
Media Contact:
Bridget Sharkey
Prime Media Management
773.459.6506
bsharkey@primemediamanagement.com