A new report by the American Civil Liberties Union (ACLU) has highlighted the weaknesses of facial surveillance technology after the tool it tested incorrectly matched 28 politicians with the faces of people arrested for crimes. At a time when facial recognition is becoming more commonplace and is being used by police departments and intelligence agencies around the world, the result raises concerns about how the technology could produce false matches and implicate innocent people.
The ACLU tested Amazon's facial recognition tool Rekognition, whose customers include ARMED, a company offering real-time tracking of individuals in video streams to recognise persons of interest; Butterfleye, a business and home security company; and the Washington County Sheriff's Office.
The ACLU ran the tool against the faces of members of the US Congress, matching them against a database of mugshots. It incorrectly matched 28 members, identifying them as people who had been arrested for crimes.
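To make the false-match risk concrete, here is a minimal sketch of how threshold-based matching works. It does not use Amazon's API or the ACLU's methodology; all names, scores, and the threshold values are hypothetical. Face-recognition services typically return a similarity or confidence score for each candidate in a database, and the caller's chosen threshold decides what counts as a "match" — a permissive threshold lets weak matches through as hits.

```python
def find_matches(probe_scores, threshold):
    """Return candidate IDs whose similarity score meets the threshold.

    probe_scores: dict mapping candidate ID -> similarity score (0-100),
    as a stand-in for scores a face-recognition service might return.
    """
    return [cid for cid, score in probe_scores.items() if score >= threshold]

# Hypothetical scores for one probe photo against a mugshot database.
scores = {"mugshot_17": 96.4, "mugshot_02": 83.1, "mugshot_55": 61.0}

print(find_matches(scores, threshold=80))  # permissive: two "matches"
print(find_matches(scores, threshold=95))  # stricter: one match
```

The point of the sketch: the same underlying scores yield different numbers of "matches" depending entirely on the threshold the operator picks, which is one reason a system can flag innocent people when configured permissively.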
The report also highlights another problem with AI-based systems: the data used to train them shapes the results they return. In this case, the mistakes in Rekognition's results were disproportionately of people of colour; white faces were recognised with greater accuracy. This is something technology companies must keep in mind as they serve clients around the world.
The ACLU noted its concerns with the system, writing:
If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a "match" indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.
An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.