Federal agencies warn employers against discriminatory AI hiring practices


As companies increasingly involve AI in their hiring processes, lawyers and researchers continue to sound the alarm. Algorithms have been found to automatically assign job candidates different scores based on arbitrary criteria, such as wearing glasses or a headscarf or having a bookshelf in the background. Hiring algorithms can penalize applicants for having a Black-sounding name, mentioning a women's college, or even submitting a résumé in certain file types. They can disadvantage people who stutter or have physical disabilities that limit their ability to interact with a keyboard.


Until now, all of this has gone largely unaddressed. But the US Department of Justice and the Equal Employment Opportunity Commission have now issued guidance on what businesses and government agencies must do to ensure their use of AI in hiring complies with the Americans with Disabilities Act.


“We cannot allow these tools to become a high-tech path to discrimination,” EEOC Chair Charlotte Burrows said at a briefing with reporters on Thursday. The EEOC requires employers to disclose to job seekers not only when algorithmic tools are used to evaluate them, but also what traits those algorithms assess.

“Today we are sounding the alarm about the dangers associated with blind reliance on AI and other technologies that we see employers using more and more,” Assistant Attorney General for Civil Rights Kristen Clarke told reporters at the same press conference. “Today we are making it clear that we must do more to remove the barriers faced by people with disabilities, and there is no doubt that the use of AI exacerbates the long-standing discrimination faced by job seekers with disabilities.”


The Federal Trade Commission issued general guidance on how businesses can use algorithms in 2020 and again in 2021, and the White House is working on an AI Bill of Rights, but this new guidance shows how the two agencies will respond to violations of federal civil rights law involving the use of algorithms. It also poses a real enforcement threat: the Department of Justice can bring lawsuits against businesses, and the EEOC receives discrimination complaints from job seekers and employees, which can lead to fines or lawsuits.

According to the US Bureau of Labor Statistics, the unemployment rate for people with disabilities is twice the national average. People with mental health-related disabilities also face high unemployment rates, and Burrows says employers should take steps to test the software they use to ensure it does not shut people with disabilities out of the job market.

The measures endorsed by the EEOC and the Justice Department on Thursday were previously proposed in a 2020 Center for Democracy and Technology report on how hiring software can discriminate against people with disabilities. These include removing automated screening that filters out people with disabilities and providing “reasonable accommodations” for people who may otherwise have difficulty using the software or hardware involved in the recruitment process. The CDT report also calls for auditing hiring algorithms both before and after they go live, a step not included in the EEOC guidance, and describes bias against hiring people with disabilities online as an “invisible injustice.”

As a teenager, Lydia X. Z. Brown thought filling out personality tests when applying for jobs seemed like a fun or strange game. They can’t prove it, but they now suspect they faced job discrimination at a mall near where they grew up in Massachusetts. A co-author of the CDT’s 2020 report on employment discrimination, Brown called Thursday’s guidance a big win after years of advocacy for people with disabilities like themselves.

“It was very exciting to see this, and I hope it also leads to strong enforcement action,” they say, adding that they hope future guidance will recognize the influential role intersectionality plays in how people with disabilities of different classes, genders, or races may experience discrimination in different ways. Similar criticism has been leveled at a New York law requiring tests to detect racial and gender bias in hiring algorithms.

According to Ben Winters of the Electronic Privacy Information Center, the biggest benefit of these documents is that they tell companies the Department of Justice and the EEOC are paying attention, and they articulate the kinds of responsibilities companies bear, including liability for discrimination caused by a third-party software vendor.

“This puts employers on notice that the agencies expect them to hold the vendors they use to higher standards,” says Winters.

The joint action by the Department of Justice and the EEOC is the first by the two agencies tasked with protecting the public, and it may signal a broader willingness to prosecute cases of discrimination caused by automation. Despite recent signs of increased regulatory activity, the US Congress has failed to pass legislation requiring testing of, or limits on, artificial intelligence systems that make important decisions about people’s lives in areas such as hiring, education, financial lending, and healthcare.


Credit: www.wired.com
