Amazon’s facial recognition system, Rekognition, has received a new update that improves its emotion detection. Previously, it could detect happiness, sadness, anger, surprise, disgust, calm and confusion; the update adds fear to that list.
In addition to fear, the update also improves the accuracy of Rekognition’s age range estimation.
According to Amazon, Rekognition is capable of analyzing hundreds of people in just one photo thanks to its database, which includes tens of millions of faces. However, the system has had its share of controversies: back in July the Orlando Police Department stopped using it because, as the city’s Chief Administrative Office said in a memo: “the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing.”
Back then, the American Civil Liberties Union (ACLU) congratulated Orlando on the decision, saying the city was “finally figuring out what we long warned – Amazon’s surveillance technology doesn’t work and is a threat to our privacy and civil liberties. This failed pilot program demonstrates precisely why surveillance decisions should be made by the public through their elected leaders, and not by corporations secretly lobbying police officials to deploy dangerous systems against the public.”
The ACLU is one of the technology’s most prominent critics; it has called it “primed for abuse in the hands of governments” and warned that it could become a serious threat to communities, particularly people of color and immigrants.
The ACLU also used the software to scan photos of members of Congress and found that 28 of them were falsely matched to criminal mugshots. A disproportionate number of those misidentified were people of color, which did nothing to ease concerns about racial bias in the technology.
Amazon disputed the findings, arguing that the ACLU ran its test at the system’s default 80% confidence threshold rather than the 95% threshold recommended for law enforcement use, and that the results are meant to narrow down candidates, not deliver a definitive identification.
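To see why the threshold matters, here is a toy sketch (plain Python, not the actual Rekognition API, and the similarity scores are invented for illustration) of how raising the confidence threshold from 80% to 95% shrinks the set of reported matches:

```python
# Hypothetical similarity scores (percent) for candidate face matches.
# These numbers are made up purely to illustrate threshold filtering.
candidate_scores = [81.2, 84.5, 90.1, 96.3, 97.8]

def matches_above(scores, threshold):
    """Return only the candidate scores at or above the given threshold."""
    return [s for s in scores if s >= threshold]

# At the default 80% threshold, all five candidates count as matches.
print(matches_above(candidate_scores, 80.0))
# At the recommended 95% threshold, only the two strongest remain.
print(matches_above(candidate_scores, 95.0))
```

A lower threshold casts a wider net and therefore produces more false positives, which is the crux of Amazon’s objection to how the ACLU test was configured.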