Researchers say Amazon face-detection technology shows bias
By TALI ARBEL, NEW YORK
© Copyright 2019 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
10 Comments
Bugle Boy of Company B
That's not discrimination. That's an error, or malfunction.
extanker
I came here to say the same thing. A computer program making a mistake is somehow racist... Can we for once not play that card?
JJ Jetplane
As a darker-skinned person, I completely understand the concerns, as we are often confused for others or incorrectly accused of being the culprits of crimes. However, the world today is very sensitive and it is becoming an issue. People are no longer allowed to say or do things without the constant fear of being called racist, bigoted, sexist, homophobic or worse; furthermore, these accusations are being used as a tool to condemn people and restrict their rights. This is a serious problem.
CrazyJoe
These are machine learning tools. When the only people these programs are learning from are white dudes, then this is what you get.
It's an excellent example of how the failure to diversify the workforce continues to perpetuate racial and gender biases that result in social inequalities. Even automation fails when the people doing the automation are implicitly biased in favor of themselves.
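The training-data effect described above can be sketched with a toy example. Everything below is synthetic and hypothetical: a trivial nearest-centroid classifier (not Amazon's algorithm) is trained almost entirely on feature vectors resembling one group, and its error rate on the underrepresented group comes out higher even though the code itself contains no group-specific logic.

```python
# Toy sketch (synthetic data, hypothetical numbers): a classifier trained
# only on examples from group A performs worse on group B, whose features
# are systematically shifted. This is NOT Amazon's algorithm.

def nearest_centroid_train(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for x, label in samples:
        acc = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def predict(centroids, x):
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

# Training set drawn only from group A.
train = [([0.0, 0.0], "match"), ([0.1, 0.0], "match"),
         ([1.0, 1.0], "no_match"), ([0.9, 1.0], "no_match")]
centroids = nearest_centroid_train(train)

# Group A test points resemble the training data; group B's are shifted.
group_a = [([0.05, 0.0], "match"), ([0.95, 1.0], "no_match")]
group_b = [([0.65, 0.6], "match"), ([1.55, 1.6], "no_match")]

def error_rate(tests):
    wrong = sum(predict(centroids, x) != y for x, y in tests)
    return wrong / len(tests)

print(error_rate(group_a))  # 0.0 on the well-represented group
print(error_rate(group_b))  # 0.5 on the group absent from training
```

The bias here comes entirely from what the model was shown, not from any explicit rule, which is the point being made about workforce and dataset diversity.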
madmanmunt
@CrazyJoe thinks Black people don't know how to program computers!!!!
CrazyJoe
This isn't AI. At all. Isn't even close. Doesn't even vaguely resemble it.
DNA is kind of accurate - but has an almost 100% conviction rate.
Now, "face recognition" is fancy but mostly wrong - and I'm sure it will be used to get a 100% conviction rate.
Police don't need to do investigations anymore because they've invented the modern-day versions of "if she drowns, she's not a witch".
Ah_so
Facial recognition is not used to secure convictions. You'll never get a situation where a jury delegates the job of matching CCTV images to the suspect standing before them.
extanker
It's crap like this that causes so many people to see the racist bogeyman around every corner. Yes, I'm sure the "white dudes" are all programming it to only see white people.
"Matt Wood, general manager of artificial intelligence with Amazon's cloud-computing unit, said the study uses "facial analysis" and not "facial recognition" technology."
Did you even read the article?
Otacon512
Please learn how the technology actually works before making ignorant claims that support your political agenda. Same goes for the article author.
Stop parroting that line (it was also used in the now-deleted "gender gap" article on here), because people who read it and are unfamiliar with the field just assume it's true and keep parroting it themselves (case in point: CrazyJoe, and Reckless in another thread). All you're doing is perpetuating ignorance, the exact opposite of what a journalist is supposed to do.
albaleo
If the software is working as it was designed to, I don't think we can call it a malfunction. If there is a pattern to the errors, I think we can say it is biased. If that bias works against minorities, I think we can say there is a danger of unfair discrimination.
The Rekognition software has been criticized a lot. The article below is one example.
https://www.theregister.co.uk/2018/07/26/amazon_face_recogition_sucks/
Interestingly, an update to the article says that Amazon now recommends setting a confidence level of 99% rather than the previously recommended 85%. I suspect that will just make any bias more difficult to detect rather than remove it.
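That suspicion can be illustrated with a small sketch. The match results below are invented for illustration (they are not Rekognition output): at an 85% threshold the disparity between groups is visible, while at 99% so few matches survive that the remaining handful look equally accurate, hiding the disparity rather than fixing it.

```python
# Hypothetical match results: (confidence %, group, is_correct).
# All numbers are invented for illustration, not real Rekognition output.
matches = [
    (99.5, "A", True), (99.2, "A", True), (92.0, "A", True), (88.0, "A", False),
    (99.1, "B", True), (93.0, "B", False), (90.0, "B", False), (86.0, "B", False),
]

def per_group_false_match_rate(results, threshold):
    """False-match rate per group among matches at or above the threshold."""
    rates = {}
    for group in {g for _, g, _ in results}:
        kept = [ok for conf, g, ok in results if g == group and conf >= threshold]
        rates[group] = (sum(not ok for ok in kept) / len(kept)) if kept else None
    return rates

print(per_group_false_match_rate(matches, 85.0))
# At 85%: group B's false-match rate (0.75) is far above group A's (0.25).
print(per_group_false_match_rate(matches, 99.0))
# At 99%: nearly everything is filtered out, and the few surviving matches
# for both groups look perfect - the disparity is hidden, not removed.
```

Raising the threshold shrinks the sample of reported matches for everyone, so measured error rates converge toward zero even if the underlying model remains biased.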