(Bloomberg) -- The American Civil Liberties Union alleged in a complaint to regulators that a large consulting firm is selling AI-powered hiring tools that discriminate against job candidates on the basis of disability and race, despite marketing these services to businesses as “bias free.” 

Aon Consulting, Inc., a firm that works with Fortune 500 companies and sells a mix of applicant screening software, has made false or misleading claims that its tools are “fair,” free of bias and can “increase diversity,” the ACLU alleged in a complaint to the US Federal Trade Commission on Wednesday, a copy of which was reviewed by Bloomberg. 

In its complaint, the ACLU said Aon’s algorithmically driven personality test, ADEPT-15, relies on questions that adversely impact autistic and neurodivergent people, as well as people with mental health disabilities. Aon also offers an AI-infused video interviewing system and a gamified cognitive assessment service that are likely to discriminate based on race and disability, according to the complaint.

The ACLU is calling on the FTC to open an investigation into Aon’s practices, issue an injunction and provide other necessary relief to affected parties.

“We are committed to building solutions that enable our clients to make inclusive hiring decisions,” a company spokesperson said in a statement to Bloomberg. “The design and implementation of our assessment solutions — which clients use in addition to other screenings and reviews — follow industry best practices as well as EEOC, legal and professional guidelines. We are monitoring new and proposed laws and regulatory guidance to ensure that our solutions remain in compliance.”

The spokesperson also pushed back on the ACLU’s characterization of its personality test, saying Aon designed the system to avoid measuring clinical traits of autistic and neurodivergent individuals, consistent with the requirements of the Americans with Disabilities Act.

A growing body of research and reporting, including an analysis by Bloomberg News, has found that artificial intelligence tools are prone to bias and can have unintended and far-reaching consequences in HR and hiring. But the tech is being actively marketed in these sectors as a solution to streamline recruiting. In an executive order last year, President Biden highlighted the risk of AI worsening hiring discrimination. To date, however, there has been little federal action on this issue.

Aon isn’t the only company that provides AI-powered applicant screening software to firms. The industry for automated applicant tracking systems has been around for over a decade, and includes other players such as human resource consulting firms Towers Watson & Co. and Mercer Human Resource Consulting LLC. But the ACLU, by analyzing publicly available technical documentation about Aon’s systems and other materials, was able to zero in on what it said were bias problems embedded in Aon’s software.

The ACLU said the filing is part of its broader push to take on discrimination issues from AI hiring software and other automated technologies. “It’s a core priority,” said Olga Akselrod, a senior staff attorney in the racial justice program at the ACLU, in an interview. “The ACLU and other civil rights groups are carefully watching this space and are prepared to take action.”

A months-long investigation

The ACLU said that Aon first came to its attention when the group agreed to represent an autistic, biracial job seeker who encountered some of Aon’s assessment services during an application process. The ACLU began an investigation into the company’s practices around September.

In one example, the ACLU said it found that part of Aon’s ADEPT-15 personality test asked candidates to indicate, on a spectrum, which of two statements about their personality they agree with more. Some of the statements in the test have significant overlap with statements in screening tools commonly used by clinicians to aid in the identification of autistic traits and to support a diagnosis, the ACLU said. (Aon pushed back on this characterization.)

“You can’t discount the impact of being asked these questions that are so closely related to a disability you have and just feeling like the employer and the vendor are targeting you,” said Brian Dimmick, a senior staff attorney in the ACLU’s disability rights program. “That reinforces a lot of the discrimination that people with autism and other mental health disabilities already face every day.”

In another example, the ACLU found that documentation for gridChallenge, Aon’s gamified cognitive assessment tool, showed applicants who were Asian, Black, Hispanic or Latino, or of two or more ethnicities, all scored lower than White assessment-takers on average. The largest disparity in average scores was between White and Black applicants who took the assessment, the ACLU said.

“People with disabilities and people of color already experience significant barriers in the job market and in the workplace,” Maria Town, president and chief executive officer of the American Association of People with Disabilities, said in a statement. “Aon’s employment tools exacerbate this discrimination under the deceptive guise of reducing bias.”

Once the ACLU uncovered these findings, it filed charges with the Equal Employment Opportunity Commission, the federal agency that enforces workplace anti-discrimination laws. At some point after that filing, Aon made some of its technical documentation inaccessible to the public, Akselrod said.

“Screened out”

Among Aon’s global clients are Procter & Gamble, Deloitte and some parts of Amazon. The ACLU noted Aon’s marketing materials state that annually, the company administers “more than 30 million assessments in over 40 languages across 90 countries.” Though the consulting firm provides an array of human resource services to employers around the world, the ACLU said it was unclear which of Aon’s clients use the particular services that the organization identified in its investigation.

Companies that rely on Aon for hiring software risk missing out not only on highly qualified candidates, but also on people who could bring diversity and a distinctive perspective to the workforce that would otherwise be absent, Dimmick said. Applicants, meanwhile, are missing out on vital opportunities — and may not even know it.

“People are not getting jobs that they are qualified for, and in many cases aren’t even having their qualifications judged by humans — they’re just screened out by these tools,” Dimmick said. “They may never know why they were screened out.”

Larkin Taylor-Parker, the legal director of the Autistic Self Advocacy Network, said she has typically seen AI-powered assessment tools show up in entry-level roles.

“What this ends up doing is keeping members of our community from even getting a start in life, and from doing even the most menial kinds of jobs you can imagine,” she said. “That has a very real economic impact on disabled people in this country.”

©2024 Bloomberg L.P.