
Researchers say Amazon face-detection technology shows bias

By TALI ARBEL

The requested article has expired and is no longer available. Any related articles and user comments are shown below.


10 Comments

That's not discrimination. That's an error or a malfunction.

6 ( +6 / -0 )

That's not discrimination. That's an error or a malfunction.

I came here to say the same thing. A computer program making a mistake is somehow racist... Can we for once not play that card?

4 ( +5 / -1 )

As a darker-skinned person, I completely understand the concerns, as we are often mistaken for others or wrongly accused of crimes. However, the world today is very sensitive, and it is becoming an issue. People can no longer say or do anything without the constant fear of being called racist, bigoted, sexist, homophobic or worse; furthermore, these labels are being used as a tool to condemn people and restrict their rights. This is a serious problem.

4 ( +4 / -0 )

These are machine learning tools. When the only people these programs are learning from are white dudes, then this is what you get.

It's an excellent example of how the failure to diversify the workforce continues to perpetuate racial and gender biases that result in social inequalities. Even automation fails when the people doing the automation are implicitly biased in favor of themselves.

-6 ( +0 / -6 )
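For anyone unfamiliar with the mechanism being argued over here: the claim is not that engineers deliberately program the software to favor white faces, but that a model trained on an unrepresentative dataset tends to be less accurate on under-represented groups. Below is a minimal, purely illustrative sketch with synthetic data (nothing in it is Amazon or Rekognition code; the groups, features, and numbers are all made up) showing how a skewed training set can produce unequal per-group error rates.

```python
# Purely illustrative: synthetic data, not Amazon/Rekognition code.
# A model fit mostly on one group ("A") is evaluated on both groups;
# the under-represented group ("B") typically gets a worse error rate.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Five features; `shift` moves this group's feature distribution.
    X = rng.normal(shift, 1.0, size=(n, 5))
    y = (X.sum(axis=1) + rng.normal(0.0, 1.0, n) > 5 * shift).astype(int)
    return X, y

# Skewed training set: 95% group A, 5% group B.
Xa, ya = make_group(1900, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced evaluation: same test size for each group.
for name, shift in [("A", 0.0), ("B", 1.5)]:
    Xt, yt = make_group(1000, shift)
    print(f"group {name}: error rate {1 - model.score(Xt, yt):.1%}")
```

No one has to intend the skew for it to show up; it falls out of whatever data the model was fit on, which is why "it's just an error" and "the errors are biased" can both be true at once.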

 When the only people these programs are learning from are white dudes, then this is what you get.

@CrazyJoe thinks Black people don't know how to program computers!!!!

5 ( +5 / -0 )

This isn't AI. At all. Isn't even close. Doesn't even vaguely resemble it. 

DNA is kind of accurate - but has an almost 100% conviction rate.

Now, "face recognition" is fancy but mostly wrong - and I'm sure it will be used to get a 100% conviction rate.

Police don't need to do investigations anymore, because they've invented the modern-day version of "if she drowns, she's not a witch".

-4 ( +0 / -4 )

Now, "face recognition" is fancy but mostly wrong - and I'm sure it will be used to get a 100% conviction rate.

Facial recognition is not used to secure convictions. You'll never get a situation where a jury delegates to software the job of matching CCTV images to the suspect standing before them.

2 ( +2 / -0 )

These are machine learning tools. When the only people these programs are learning from are white dudes, then this is what you get.

It's crap like this that causes so many people to see the racist bogeyman around every corner. Yes, I'm sure the "white dudes" are all programming it to only see white people.

This isn't AI. At all. Isn't even close. Doesn't even vaguely resemble it.

Now, "face recognition" is fancy but mostly wrong

"Matt Wood, general manager of artificial intelligence with Amazon's cloud-computing unit, said the study uses a "facial analysis" and not "facial recognition" technology."

Did you even read the article?

0 ( +0 / -0 )

These are machine learning tools. When the only people these programs are learning from are white dudes, then this is what you get.

Please learn how the technology actually works before making ignorant claims that support your political agenda. Same goes for the article author.

Artificial intelligence can mimic the biases of its human creators as it makes its way into everyday life.

Stop parroting that line (it was also used in the now-deleted "gender gap" article on here), because people who read it and are unfamiliar with the field just assume it's true and keep parroting it themselves (case in point: CrazyJoe here, and Reckless in another thread). All you're doing is perpetuating ignorance, the exact opposite of what a journalist is supposed to do.

1 ( +1 / -0 )

That's not discrimination. That's an error or a malfunction.

If the software is working as it was designed to, I don't think we can call it a malfunction. If there is a pattern to the errors, I think we can say it is biased. If that bias works against minorities, I think we can say there is a danger of unfair discrimination.

The Rekognition software has been criticized a lot. The article below is one example.

https://www.theregister.co.uk/2018/07/26/amazon_face_recogition_sucks/

Interestingly, an update to the article says that Amazon now recommends setting a confidence level of 99% rather than the previously recommended 85%. I suspect that will just make any bias more difficult to detect rather than remove it.

0 ( +0 / -0 )
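On the 85% vs. 99% point: here is a minimal sketch of what that recommendation actually changes, assuming Python's boto3 Rekognition client (compare_faces and its SimilarityThreshold parameter are the real API; the file names are hypothetical). Raising the threshold only filters which matches get reported; it does not retrain or alter the underlying model, which is consistent with the suspicion above that it hides bias rather than removes it.

```python
# Sketch using boto3's Rekognition client. compare_faces and
# SimilarityThreshold are the real API; the file names are hypothetical.
import boto3

client = boto3.client("rekognition")

def match_faces(source_path, target_path, threshold=99.0):
    # Returns matches at or above `threshold` percent similarity.
    # Amazon's updated guidance is 99; the earlier guidance was 85.
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = client.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=threshold,
        )
    return [(m["Similarity"], m["Face"]["BoundingBox"])
            for m in response["FaceMatches"]]

# The model's raw similarity scores are identical at either setting;
# only the reporting cutoff changes.
# matches_85 = match_faces("probe.jpg", "gallery.jpg", threshold=85.0)
# matches_99 = match_faces("probe.jpg", "gallery.jpg", threshold=99.0)
```

Any match the model scores below the cutoff is simply dropped from FaceMatches, so a higher threshold trades false matches for missed ones without touching the model's per-group accuracy.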
