Employers, Be Wary of Built-in Bias from AI Vendors

Gregg L. Hill and Caleb J. Holloway
Robinson Bradshaw Publication
July 29, 2024

Employers beware: you could be liable if applicant screening software supplied by a third-party vendor results in unlawful discrimination, even when the discrimination is unintentional and occurs without your knowledge. Employers who turn to artificial intelligence programs to streamline hiring and recruitment processes must exercise caution to avoid potential liability.

Workday Class Action

In February 2023, Derek Mobley filed a class-action lawsuit against Workday Inc., asserting claims for race, age and disability discrimination in violation of Title VII of the Civil Rights Act of 1964 and other federal laws.[i] Workday is perhaps best known for its human resources software products and services for businesses. Among other things, Workday uses AI-powered hiring software to screen job applicants for employers. The complaint alleges that Mobley, described in court documents as a Black man over 40 years old who has been diagnosed with anxiety and depression, was subjected to racism, ageism and ableism through more than 100 job applications he submitted to employers that use Workday’s hiring software. He claims that these applications included assessments and personality tests that likely revealed his protected characteristics and screened him out because of them. Mobley’s suit is the first proposed class action, potentially including hundreds of thousands of people, to challenge the use of AI employment screening software.[ii] It will set an important precedent on the legal implications of using AI to automate hiring and other employment functions.

EEOC Oversight

Beyond reputational harm from discriminatory practices, employers using AI screening technology could also face significant financial consequences. These include penalties for violations of anti-discrimination laws enforced by the U.S. Equal Employment Opportunity Commission. Employers may also incur increased insurance costs and operational costs associated with auditing, overhauling or replacing a discriminatory AI system. Lastly, employers could end up paying compensatory and punitive damages to affected applicants. In EEOC v. iTutorGroup, Inc., et al. (Civil Action No. 1:22-cv-02565), iTutorGroup's application software was programmed to automatically reject female applicants aged 55 or older and male applicants aged 60 or older, thereby rejecting more than 200 qualified applicants because of their age.[iii] On Aug. 9, 2023, the case settled after iTutorGroup agreed to pay $365,000, to be distributed as back pay and compensatory damages among the applicants who were allegedly unlawfully rejected.[iv]

The EEOC has warned employers that they can be held legally liable if they fail to prevent screening software from having a discriminatory impact. Under EEOC regulations, companies are responsible for their hiring decisions, including decisions based on the AI tools they use.[v] Even if a company does not intend to discriminate and does not know why an algorithm selected one candidate over another, it could still be liable for discriminatory decisions. To guide employers, the EEOC launched an initiative in 2021 to ensure AI-assisted software complies with EEO laws and released a related technical assistance document in May 2023.[vi] The EEOC has repeatedly warned employers that it is watching for AI bias in employment.[vii] But even with these warnings, recruiting officers are not always aware of how vendors use AI in their programs, and growing legal requirements mean that employers need to develop a deeper understanding of the technologies they use and know what questions to ask.

Steps for Employers to Protect Themselves

Oversight and review of AI employment screening programs are crucial to prevent inadvertent discriminatory practices that could expose employers to liability from applicants who claim the program discriminates based on protected characteristics. That oversight must include regularly assessing AI-driven hiring processes to identify and address potential biases.

Employers can adopt proactive measures to mitigate these risks, including at least the following:

- Ask vendors how their AI tools score, rank or screen applicants, what data the tools rely on, and whether the tools have been tested for bias.
- Regularly audit AI-driven hiring processes for disparate impact on applicants with protected characteristics (a simplified sketch of such an audit follows this list).
- Confirm that any AI screening tool complies with EEO laws and the EEOC's technical assistance guidance.
- Keep a human decision-maker involved in screening outcomes, and train recruiting staff on how the tools work and what questions to ask vendors.
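To illustrate the audit point, the EEOC's Uniform Guidelines on Employee Selection Procedures (29 C.F.R. Part 1607, cited in note [vi]) describe a "four-fifths rule" as a rough screen for adverse impact: if one group's selection rate is less than 80% of the highest group's rate, the screening step deserves closer scrutiny. The Python sketch below is a minimal, hypothetical example of that check, assuming an employer can export applicant-level outcomes from its screening tool; the record layout and field names ("group", "advanced") are illustrative assumptions, not any vendor's actual format.

```python
# Minimal, hypothetical sketch of a periodic bias audit using the EEOC's
# "four-fifths rule" (29 C.F.R. Part 1607). Field names and data layout are
# illustrative assumptions, not any vendor's actual export format.
from collections import defaultdict

def selection_rates(applicants):
    """Share of applicants in each group who advanced past the AI screen."""
    totals, advanced = defaultdict(int), defaultdict(int)
    for a in applicants:
        totals[a["group"]] += 1
        if a["advanced"]:
            advanced[a["group"]] += 1
    return {g: advanced[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Each group's selection rate divided by the highest group's rate.

    A ratio below 0.80 is commonly treated as preliminary evidence of
    adverse impact and a signal that the screening step needs closer review.
    """
    benchmark = max(rates.values())
    if benchmark == 0:
        return {g: 0.0 for g in rates}
    return {g: rate / benchmark for g, rate in rates.items()}

if __name__ == "__main__":
    # Toy data: one record per applicant screened by the tool.
    screened = [
        {"group": "A", "advanced": True},
        {"group": "A", "advanced": True},
        {"group": "A", "advanced": False},
        {"group": "B", "advanced": True},
        {"group": "B", "advanced": False},
        {"group": "B", "advanced": False},
    ]
    rates = selection_rates(screened)
    for group, ratio in adverse_impact_ratios(rates).items():
        flag = "review" if ratio < 0.80 else "ok"
        print(f"Group {group}: selection rate {rates[group]:.2f}, "
              f"impact ratio {ratio:.2f} ({flag})")
```

A ratio below 0.80 is a prompt for a deeper review with counsel and the vendor, not a conclusion that the tool is unlawful; conversely, passing this check does not by itself establish compliance.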

In summary, while AI can enhance efficiency in candidate selection, it also carries risks of perpetuating biases and discrimination if not properly managed. These risks can expose employers to lawsuits under anti-discrimination laws if the AI tools (inadvertently or intentionally) screen out applicants based on protected characteristics such as race, sex, age and disability. To mitigate these risks, employers should ensure that their AI systems are transparent, regularly audited for bias, and compliant with legal standards. Failure to do so could result in significant legal and financial consequences, undermining the benefits that AI technology promises to bring to the hiring process.

Robinson Bradshaw’s Employment & Labor Practice Group will closely monitor and report on the latest developments regarding the Workday lawsuit and government standards and regulations addressing AI tools, such as the screening program challenged by Mobley. For assistance in evaluating whether to use an AI screening program and counsel on best practices to avoid liability, please contact a member of our team.


[i] Mobley v. Workday, Inc., U.S. District Court for the Northern District of California, No. 3:23-cv-00770.

[ii] Daniel Wiessner, Workday must face novel bias lawsuit over AI screening software, Reuters (2024), https://www.reuters.com/legal/litigation/workday-must-face-novel-bias-lawsuit-over-ai-screening-software-2024-07-15/ (last visited July 24, 2024).

[iii] EEOC v. iTutorGroup, Inc., et al., Civil Action No. 1:22-cv-02565.

[iv] EEOC v. iTutorGroup, Inc., et al., Civil Action No. 1:22-cv-02565.

[v] Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act and the Americans with Disabilities Act.

[vi] Title VII; 29 C.F.R. Part 1607 (Uniform Guidelines on Employee Selection Procedures).

[vii] Patrick Thibodeau, Workday’s AI lawsuit defense puts responsibility on users, Tripp Scott Attorneys at Law (2024), https://www.trippscott.com/insights/workdays-ai-lawsuit-defense-puts-responsibility-on-users (last visited July 22, 2024).
