Justice Department warns misuse of algorithmic recruitment tools could violate accessibility laws

AI tools for the hiring process have become a hot category, but the Justice Department warns that careless use of them may violate US laws protecting equal access for people with disabilities. If your company uses algorithmic ranking, face tracking, or other high-tech methods to sort and evaluate applicants, you may want to take a closer look at what those tools are doing.

The department and the Equal Employment Opportunity Commission, which monitors and advises on industry trends and practices relating to matters like these, have published guidance on how companies can safely use algorithm-based tools without the risk of systematically excluding people with disabilities.

“New technologies should not become new ways to discriminate. If employers are aware of how artificial intelligence and other technologies can discriminate against people with disabilities, they can take steps to prevent it,” said EEOC Chair Charlotte A. Burrows in the press release announcing the guidance.

The general thrust of the guidance is to think carefully (and to solicit the opinions of affected groups) about whether these filters, tests, metrics, and so on actually measure qualities or quantities relevant to performing the job. They offer several examples:

  • A candidate with a visual impairment must complete a test or task with a visual component, such as a game, in order to advance in the interview process. If the job itself has no visual component, this unfairly excludes blind candidates.
  • A chatbot screener asks poorly worded or conceived questions, such as whether the applicant can stand for hours on end, and disqualifies candidates who answer “no.” A person in a wheelchair could certainly do many jobs that others do standing, just while sitting.
  • An AI resume-analysis service downgrades an application because of a gap in employment, but the gap may be due to a disability or a medical condition and shouldn’t be penalized.
  • An automated voice-screening process requires applicants to answer questions or talk through test problems out loud. This obviously excludes deaf and hard-of-hearing applicants, as well as people with speech impairments. Unless the job involves a great deal of speaking, this is inappropriate.
  • A facial-recognition algorithm evaluates a candidate’s emotions during a video interview. If the person is neurodivergent, or has facial paralysis due to a stroke, their scores will be outliers.

This does not mean that any of these tools or methods is inherently wrong or discriminatory in a way that breaks the law. But companies that use them must be aware of their limitations and offer reasonable accommodations when an algorithm, machine learning model, or other automated process is unsuitable for a given candidate.

Part of this is having accessible alternatives available; another part is being transparent about the hiring process and announcing in advance which skills will be tested and how. People with disabilities are the best judges of what their needs are and which accommodations, if any, to request.

If a company does not or cannot provide reasonable accommodations for these processes – and yes, that includes processes built and managed by third parties – it may be held liable for that failure.

As usual, the sooner this is taken into account, the better: if your company hasn’t consulted an accessibility expert on matters like recruitment, website and app access, and internal tools and policies, now is the time.

In the meantime, you can read the full DOJ guidance here, a short version intended for workers who feel they may have been discriminated against here, and, for some reason, a truncated version of the guidance here.


Credit: techcrunch.com
