
AI is taking over job hiring, but can it be racist?

By Avi Asher-Schapiro

Since graduating from a U.S. university four years ago, Kevin Carballo has lost count of the number of times he has applied for a job only to receive a swift, automated rejection email - sometimes just hours after applying.

Like many job seekers around the world, Carballo's applications are increasingly being screened by algorithms built to automatically flag attractive applicants to hiring managers.

"There's no way to apply for a job these days without being analyzed by some sort of automated system," said Carballo, 27, who is Latino and the first member of his family to go to university.

"It feels like shooting in the dark while being blindfolded there's just no way for me to tell my full story when a machine is assessing me," Carballo, who hoped to get work experience at a law firm before applying to law school, told the Thomson Reuters Foundation by phone.

From Artificial Intelligence (AI) programs that assess an applicant's facial expressions during a video interview to resume-screening platforms that predict job performance, such tools now make up an AI recruitment industry valued at more than $500 million.

"They are proliferating, they are fast, they are relatively cheap - they are everywhere," said Alex Enger, a fellow at the Brookings Institute, who studies AI in hiring.

"But at this point there's very little incentive to build these tools in a way that's not biased," he added, saying the cost and time involved in thoroughly testing a system for bias was likely to be prohibitive without regulations requiring it.

For Carballo, racial bias is his topmost concern.

"I worry these algorithms aren't designed by people like me, and they aren't designed to pick people like me," he said, adding that he has undergone a plethora of different AI assessments - from video analytics to custom logic games.

The risk of discrimination is also a central issue for lawmakers around the world as they weigh how to regulate the use of AI technology, particularly in the labor market.

While the EU is set to impose rules on the use of AI in hiring, U.S. lawmakers are considering federal laws to address algorithmic bias. Last year, legislators in New York City proposed a law specifically to regulate AI in hiring.

"We're approaching an inflection point," Enger said.

According to the most recent survey by human resources (HR) consulting firm Mercer, more than 55% of HR managers in the United States use predictive algorithms to help them make hiring choices.

AI is being introduced at every stage of the hiring pipeline, from the job adverts that potential applicants see to the analysis and assessment of their applications and resumes.

The COVID-19 pandemic has sped up the adoption of such tools. HireVue, an AI hiring firm that builds tools to analyze and score the answers job applicants give in video interviews, reported a 46% surge in usage this year compared to last.

The rise in AI could represent a real opportunity to root out prejudice in the hiring process, said Manish Raghavan, a computer scientist at Cornell University who studies bias in hiring algorithms.

"No one is going to tell you that traditional hiring was equitable," he said. "And with AI systems we can test them in ways we could never test or audit people's own biases."

Subjecting all candidates to the same interview, judged by the same algorithm, eliminates the subjectivity and bias of people in hiring, said Kevin Parker, chief executive of HireVue.

"We can measure how men and women score, and compare how people of color score against white candidates," he said. "We really try to fine-tune the algorithm to eliminate anything that can cause adverse impact, and come to very close parity."

But the problem, Raghavan said, is that when you build a machine learning algorithm, bias can creep into it in many ways that are difficult to detect.

Engler echoed that view.

"Natural language processing systems have been shown to associate white names as being more qualified. Resume screening systems have been shown to weed out all applicants who went to a women's college," he said.

"It's a minefield," he added.

For job seekers like Carballo, who belong to ethnic minorities and come from disadvantaged backgrounds, automated tools can easily reinforce patterns of discrimination, Raghavan said.

In 2017, Amazon stopped using an AI resume screener after discovering it penalized resumes that included the word "women", automatically downgrading graduates of all-women's colleges.

Because applicants often have no way of understanding how they were scored, they are left wondering if bias crept in, Carballo said.

"I'm a first generation college student, I'm Latino, and I didn't go to a top university - and every time I get a rejection, I wonder if the system was designed to weed someone like me out."

AUDIT, REGULATE OR BAN?

Industry is eager to be perceived as fighting bias, Raghavan said, citing his own research showing that 12 of the 15 largest firms have announced some efforts to tackle discrimination.

But Engler said there was currently little incentive for companies to invest significant resources in detecting and rooting out bias, as regulators are not yet cracking down.

That could start to change, however, as policymakers begin to take a look at the industry.

Regulatory proposals being considered by the European parliament would designate AI used in hiring as "high-risk", meaning any companies selling such systems would have to be included in a public database.

It would also impose requirements on firms selling such tools in the EU, such as ensuring datasets are "relevant, representative, free of errors and complete", according to Daniel Leufer, an analyst at digital rights group Access Now.

Leufer said the draft regulations do not go far enough, calling for a blanket ban on certain AI tools in hiring, including any that use biometric information such as facial movements or voice tone.

"The length of my nose; how I speak, the way I move my mouth; we should not allow people to make inferences about someone's job performance from these kinds of inputs," he said.

In New York City, the city council is considering a law that would regulate the AI hiring industry, and compel companies to do their own audits for bias, but critics fear it will not be sufficient to rein in discrimination.

"One flawed algorithm can impact hundreds of millions of people," said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project (STOP), who wants a freeze on AI in hiring pending further bias investigations.

STOP and 11 other digital and civil rights groups sent a letter to New York City Council late last year, asking for stronger protections, including allowing applicants who were discriminated against to file lawsuits.

"We need to press pause until we are able to come up with effective regulatory structures to block AI bias and discrimination," Cahn said.

In April, after working a string of short-term temporary jobs over the past year, Carballo finally got a full-time job at a law firm. The hiring manager interviewed him without the use of an AI screener.

"I think that made a difference - I wasn't just a guy from a rough neighborhood, with a Spanish last name," he said. "I was able to make an impression."

© Thomson Reuters Foundation

©2021 GPlusMedia Inc.

20 Comments

It doesn't even work properly in the first place.

1 ( +3 / -2 )

No, but the programmers can be. The people paying them to actually create the algorithm.

2 ( +4 / -2 )

Hm, does it mean that AI proves that bias and discrimination is an optimal approach in people selection? ;)

2 ( +4 / -2 )

It still has problems differentiating between buraku people and apes.

That's why it was pulled offline in the first place.

-6 ( +1 / -7 )

This all gets tiring.

In Canada we have all these groups going on about discrimination, white privilege, etc. But one group is 90% silent, even though as a group they are 15% of the population and lead many of the major corporations, government bodies and so on: Asians. Instead of always blaming someone or something, they get educated and push forward, and they are now such a powerhouse that they have become a target of BLM and the like, who claim that Asians and whites are racist.

-2 ( +3 / -5 )

You make significant points @Numan 9:03am:

- Q:“..taking over job hiring, but can A.I. be racist?”

- A: “No, but the programmers can be. The people paying them to actually create the algorithm.

Therefore, do you mean the answer is possibly: Yes, it can be programmed as such?

0 ( +2 / -2 )

You also had some interesting ideas last week @Antiquesavings 9:17am about A.I., allegedly, in use within this comments section for example. Can you elaborate since it would be 'on topic' here, today?

-2 ( +1 / -3 )

Bias is likely. Good machine learning tries to mimic human responses, so it will mimic their bias too. That means you don't really want 'AI', you want 'robotic' - Programmed behaviours that actively erase human bias. But even if you intervene and directly program the tech, rather than 'teaching' it from real world examples, it won't be up to the job. Tech cannot contextualise the data it takes in, the way an experienced human can.

I would not want any business of mine to rely on this sort of tech. HR is important and I would want the people doing the hiring to be really good at it. New hires are your firm's future. You'd have to be insane to rely on some third-party bunch of algorithms over a good interviewer that you trust. All these companies are doing is shifting responsibility for bad decisions onto technology.

3 ( +5 / -2 )

Why does everything have to be tied into "racism"???

-1 ( +3 / -4 )

I would not want any business of mine to rely on this sort of tech. HR is important and I would want the people doing the hiring

If you own a large corporation, good luck with that - you will need an HR department equal to all the others put together.

Two years ago I put out an ad looking for an apprentice to learn and work in a Japanese artisanal craft.

I put this ad on a small job site and reviewed over 5,000 applications from around the world: Japan, the USA, Russia, China, Australia, Canada, the UK, nearly every country in the EU - and the list goes on.

I had no possibility of reading them all, and that was for a training job that paid nearly nothing (an old-style Japanese apprenticeship like the one I did - living with us, food and lodging included).

I dropped the whole idea in the end.

But if I got over 5,000 applications for that, I cannot even begin to guess how many some major corporation, or a really well-paying job, must get.

My wife's company put out the word (no online ad, only headhunters) for an executive position in Japan and got several thousand CVs sent to them.

The days of the local paper ad are long gone. Any job now will end up on the web and generate thousands if not tens of thousands of applicants, so some form of automated filtering is needed, even if it just cuts out those without the required university degree or language skills.

0 ( +3 / -3 )

Terminator Mk 1

-1 ( +0 / -1 )

Because, even the word: “Racism” can be used as ‘a tool’ to easily manipulate ‘weak-minded persons’…

- @Boku Dayo 9:49am: “Why does everything have to be tied into "racism"???” -

… and, to further ‘control the distribution of limited resources within a given population’.

Even before the advent of A.I., royalty, empires, dictators, politicians, governments, religions, corporations, educational institutions, media, etc. have all employed "racism", in some form, to work to their advantage and to the disadvantage of the people they want to subjugate.

0 ( +2 / -2 )

Can you sue an algorithm?

-3 ( +1 / -4 )

Ultimately it does not matter how "colorblind" they make these algorithms, because the hiring decision still rests with real-life people. And we all know there will always be personal biases in the hiring process, whether it be discrimination based on race, sex, age, or other factors that are technically protected under the law but not very enforceable (and employers know it). The computer algorithm may pick the ideal candidate based on its programs, but the hiring manager can take one look at the person's appearance, not hire the person, and use the safe "thank you for applying but we have decided to move on" excuse.

0 ( +1 / -1 )

HR AI selecting AI created resume for a job developing HR AI...

0 ( +1 / -1 )

"and I didn't go to a top university"

Well, there you go. It isn't your race or your first-generation college student status. Latino race is a plus now. First-generation college status isn't going to show up in the AI filters. But they certainly program the algorithms to take into account the college the person attended.

I read an article last year which indicated that hiring managers, when facing thousands of nearly identical resumes, will sort by college - assuming the student who managed to get into an elite school is more likely to have greater intelligence, drive and motivation than students who didn't get into such schools. In fact, it is similar to Japan. Everybody wants the student who attended Tokyo, Kyoto, Keio or Waseda. They don't want the students who went to schools no one has ever heard of.

-1 ( +0 / -1 )

Baseball does it BIG TIME! In the old days a scout would go out and assess an athlete's natural ability to see if he had the five tools needed to play professional baseball, and that determined where you would end up in the draft. Nowadays the scout goes out, looks at the athlete and sends in a report on what he has seen - natural ability to play the position, height, weight, even the size of his parents. From there the data is given to the analysts. Based on how many hits the athlete had in high school or college and the position he is being drafted for, the bean-counting analysts with their computer stats come out with projections of what this potential athlete will produce. Not only do they do it for drafting, they do it for all the players in the big leagues - they know what pitch a batter handles best, the best location to throw, and where he actually hits the ball. It's crazy!!!

-1 ( +0 / -1 )

Are they racist? Or just logical?

-3 ( +0 / -3 )

HR AI selecting AI created resume for a job developing HR AI...

Today resumes are screened by automated devices that basically do a key word search. I knew some 15 years ago (when I last did a serious job search) to ensure I used as many of the words from the job description as I could in my resume. It wasn't so much a matter of your particular skills and experiences but making sure you couched these qualities of yours in the words the search engine would hit on. I'm sure in short order savvy job seekers will have figured out how to game AI to get one's resume in front of a human who can set up an interview.
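A minimal sketch of that kind of keyword screen (hypothetical, not any particular vendor's system): score a resume by how much of the job description's vocabulary it reuses.

    import re

    def keyword_score(resume_text, job_description):
        # Toy keyword screen: share of distinct job-description words
        # (4+ letters, to skip filler words) that also appear in the resume.
        words = lambda text: set(re.findall(r"[a-z]{4,}", text.lower()))
        wanted = words(job_description)
        return len(wanted & words(resume_text)) / len(wanted) if wanted else 0.0

A resume that echoes the posting's own wording scores higher, which is exactly the gaming strategy described above.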

0 ( +1 / -1 )

Enough.

0 ( +0 / -0 )
