UK -- facial recognition is racist
felix
A BBC article on facial recognition in the United Kingdom and the problem of false positives affecting racialized people.
Several UK police forces have been trialling controversial new facial recognition technology, including automated systems which attempt to identify the faces of people in real time as they pass a camera.
Documents from the police, Home Office and university researchers show that police are aware that ethnicity can have an impact on such systems, but have failed on several occasions to test this.
Similar studies in the United States:
The battle over the technology intensified last year after two researchers published a study showing bias in some of the most popular facial surveillance systems. Called Gender Shades, the study reported that systems from IBM and Microsoft were much better at identifying the gender of white men’s faces than they were at identifying the gender of darker-skinned or female faces.
Another study this year reported similar problems with Amazon's technology, called Rekognition. Microsoft and IBM have since said they improved their systems, while Amazon has said it updated its system since the researchers tested it and has found no differences in accuracy.
Warning that African-Americans, women and others could easily be incorrectly identified as suspects and wrongly arrested, the American Civil Liberties Union and other nonprofit groups last year called on Amazon to stop selling its technology to law enforcement.
(source: New York Times)