Organizations are responsible for ensuring their hiring and performance management tools do not discriminate against individuals with disabilities, federal civil rights officials warn.
The growing use of artificial intelligence and other software tools for hiring, performance monitoring and pay determination in the workplace is compounding discrimination against people with disabilities, federal civil rights officials say.
Artificial intelligence can be deployed to target job ads to certain potential applicants, hold online job interviews, assess the skills of job applicants and even decide if an applicant meets job requirements. But the technology can discriminate against applicants and employees with disabilities.
On Thursday, the Equal Employment Opportunity Commission and the Department of Justice put employers on alert that they are responsible for ensuring AI tools are not used in ways that discriminate and for informing employees of their rights, agency officials told reporters.
"We are sounding an alarm regarding the dangers tied to blind reliance on AI and other technologies that we are seeing increasingly used by employers," said Kristen Clarke, assistant attorney general for civil rights at DOJ.
"No doubt the use of AI is compounding the longstanding discrimination that job seekers with disabilities face. This is the first step in what we hope will be robust, comprehensive action to eliminate the barriers that are locking too many qualified and eligible people with disabilities out of our job market," she continued.
A person with a stutter, for example, won't be fairly rated by software that analyzes speech patterns to make assessments about problem solving or other characteristics, Charlotte Burrows, chair of the Equal Employment Opportunity Commission, told reporters.
The Justice Department enforces these laws for state and local governments – it's a "particular area of focus" and a "top priority," said Clarke – and the EEOC has jurisdiction over private employers and the federal government itself.
The EEOC guidance offers advice to employers on how to vet tools.
There are three principal ways that employers run afoul of civil rights laws: by not providing the accommodations needed to ensure that everyone is fairly and accurately assessed by a tool or algorithm; by screening out people with disabilities in the hiring process, even if they can do the job; and by using these tools to ask medical questions or make inquiries about someone's disability. When employers do want to use algorithmic decision-making tools, they need to make sure that these tools measure only the skills actually needed for the job, and do so directly, as opposed to measuring characteristics that might be correlated with those skills.
Vendors that claim their tools are "bias free," for example, are often talking about discrimination based on race, gender, religion and national origin and not referring to disability.
"This is an area where I think particularly because it is developing so rapidly and it's so vitally important, where it's important that the federal government lead," Burrows said. "There's enormous potential to streamline things, to be helpful to everyone, but we cannot let these tools become a high-tech pathway to discrimination."