Civil rights organizations are pressuring Amazon to stop selling its artificial intelligence (AI) software to government agencies for fear that the software will infringe on privacy rights. Rekognition, Amazon’s AI technology, can identify objects, people, scenes, and other details in video and images. Later versions of the software can identify up to 100 faces in a crowd.
An Amazon executive has praised the software for its usefulness to law enforcement. With Rekognition, authorities can track and recognize suspects or ‘persons of interest,’ not only in pictures and videos but also in real time.
This use case in particular has caused an uproar among civil rights groups. Forty-one of them, including the ACLU, Human Rights Watch, and the Electronic Frontier Foundation, have co-signed a letter stressing how the government could abuse this software.
“This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build. People should be free to walk down the street without being watched by the government. Facial recognition in American communities threatens this freedom.”
The ACLU has also obtained, through the Freedom of Information Act, emails exchanged between Amazon employees and local law enforcement in which the facial recognition software and its usage were discussed.
In those emails, the Washington County Sheriff’s Office in Oregon stated that it had uploaded around 3,000,000 images to Rekognition, the vast majority of them provided by citizens or taken by security cameras. The office said it was using the software to identify crime suspects, possible witnesses, accomplices, and persons of interest whom it could not previously identify.
Jeff Talbot, Public Information Officer for the Washington County Sheriff’s Office, said in an email to CNNMoney that the software had been in use for over a year.
“During that time, [we] have been transparent and forthcoming so the local public knows what it is, and equally importantly, what it is not. A facial recognition match alone, no matter the percentage match, does not establish probable cause to arrest a suspect. […] The Sheriff’s Office does not use the technology for mass and/or real time surveillance. In fact, state law and our policy prohibits it for such use.”
Amazon has stated that the software can be used for many other purposes and that it requires customers to comply with the law and use Rekognition responsibly. In Orlando, where the technology is used to search for ‘persons of interest,’ the Orlando Police Department has said it is not using the software for investigations “at this time.”
With all this in mind, after the ACLU’s concerns were made public, Amazon removed the mention of police body cameras from its website.