Amazon faces facial recognition rebellion
Shareholders of Amazon will demand, today, that the company stop selling its system to the police. It is simply too powerful and invasive, they say. Is facial recognition going too far?
The text from a friend was totally unexpected and shocking. Jillian York was on holiday in February when Adam told her that he had found photos of her in an American government database.
It was a database used to train facial recognition algorithms and the images went a long way back. The earliest were from 2008. Two had been copied from Google: private photos of her, as she says, goofing about with friends. Another bunch had been clipped from YouTube videos.
Her face was being used to build systems for the government to recognise illegal immigrants, criminals and terrorists.
As automatic facial recognition (AFR) software gets more accurate and cheaper to use, it is being swiftly adopted by businesses, police forces and governments across western society. But, today, two events (one in Britain; the other in the US) are high-profile examples of the fightback.
This morning in Seattle, USA, Amazon is facing a concerted push by shareholders to stop selling its AFR technology to US police forces.
While in Cardiff, Wales, the first major legal challenge to police use of AFR has just begun.
At Amazon, angry shareholders are demanding an independent study into whether its system, called Rekognition, threatens civil rights. They also want a vote on stopping the company from selling it to government agencies.
“It could enable massive surveillance, even if the technology was 100% accurate which, of course, it’s not. And we don’t want it used by law enforcement because of the impact that will have on society — it might limit people’s willingness to go in public spaces where they think they might be tracked,” said one Amazon shareholder.
In Cardiff, Ed Bridges is suing the police. His picture was taken without permission while he was Christmas shopping in 2017. He is being supported by the civil rights group Liberty, which says AFR is like taking DNA or fingerprints without consent.
There is a 50-year history behind facial recognition, and a huge global industry already exists. First tested by the CIA in 1964, the technology was, by 2006, advanced enough for an event called the Face Recognition Grand Challenge, at which algorithms first beat humans.
Today, China leads the world but AFR is also widespread in western liberal democracies. Facebook’s DeepFace system identifies faces in digital images. Apple introduced Face ID on the iPhone X. And the US Department of State operates one of the largest face recognition systems in the world, with a database of 117 million American adults.
The market for defensive technology is also growing. A Japanese company sells “privacy visors” that use infrared to confuse facial recognition software. A Chicago company sells “reflectacles” that make the user’s face a white blur to cameras.
Hide and seek
Supporters of AFR use the argument that we must move with the times. Look, they say, the technology works. It is an effective and cheap way of supporting police and government agencies under pressure because people don’t want higher taxes to pay for them. How else are we to keep our nations safe? Got a better idea?
This is the thin end of a very scary wedge, say opponents. Just imagine. There is already a model which has 81% accuracy at identifying gay men. What if that got into the wrong hands? And shops are already planning “dynamic pricing” in which price tags are replaced with digital screens. These prices can change based on the identity of the shopper. Facial recognition is simply biometric identification at a distance, with no need for consent. Anonymity is an important right and must be defended.
- Do you feel that your image is private property?
- Are you happy to swap some privacy to help the police fight crime?
- Write a short story set in 10 years’ time when every single move you make is captured on a camera somewhere. It doesn’t necessarily have to be a horror story!
- Hold a class debate. Divide into two teams. Each team appoints a speaker. Prepare the case for and against AFR. Let the speakers address the whole class. Then hold a vote.
Some People Say...
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” Benjamin Franklin, American statesman
What do you think?
Q & A
- What do we know?
- Amazon’s Rekognition is an online tool that works with both video and still images, and allows users to match faces to pre-scanned subjects in a database containing up to 20 million people. It can detect “unsafe content”, such as whether there is nudity. It can suggest whether a subject is male or female. It can deduce a person’s mood, and it can spot text in images and transcribe it for analysis.
- What do we not know?
- Whether Amazon’s software is biased. A study published in January by researchers at Massachusetts Institute of Technology and the University of Toronto suggested Amazon’s algorithms suffered greater gender and racial bias than four competing products. Amazon denies this.
- The Amazon website says: “Rekognition makes it easy to add image and video analysis to your applications. You just provide an image or video to the Rekognition API, and the service can identify the objects, people, text, scenes, and activities, as well as detect any inappropriate content.”
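To make the Rekognition capabilities described above more concrete: the DetectFaces API returns its findings (gender, emotions and so on) as a JSON list called `FaceDetails`. Below is a minimal, stdlib-only Python sketch of reading such a response. The sample data mimics the API's documented shape, but the values are made up for illustration and the `summarize_faces` helper is hypothetical, not part of any AWS SDK.

```python
# Sketch of parsing a Rekognition DetectFaces-style response.
# SAMPLE_RESPONSE mimics the API's documented JSON shape; the
# confidence numbers are invented for illustration.
SAMPLE_RESPONSE = {
    "FaceDetails": [
        {
            "Gender": {"Value": "Female", "Confidence": 99.2},
            "Emotions": [
                {"Type": "CALM", "Confidence": 3.1},
                {"Type": "HAPPY", "Confidence": 95.0},
            ],
        }
    ]
}

def summarize_faces(face_details):
    """Reduce each verbose FaceDetails entry to gender + strongest emotion."""
    summaries = []
    for face in face_details:
        # Sort detected emotions so the highest-confidence one comes first.
        emotions = sorted(face.get("Emotions", []),
                          key=lambda e: e["Confidence"], reverse=True)
        summaries.append({
            "gender": face.get("Gender", {}).get("Value"),
            "top_emotion": emotions[0]["Type"] if emotions else None,
        })
    return summaries

print(summarize_faces(SAMPLE_RESPONSE["FaceDetails"]))
# → [{'gender': 'Female', 'top_emotion': 'HAPPY'}]
```

In a live application the same `FaceDetails` list would come back from an authenticated API call rather than a hard-coded sample; the point here is only how much personal inference (gender, mood) a single response contains.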
- Liberty — A charity that challenges injustice, defends freedom, and campaigns to make sure everyone in the UK is treated fairly.
- DNA — A molecule composed of two chains that coil around each other to form a double helix, carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known organisms. Each person’s DNA is unique.
- CIA — The Central Intelligence Agency of the USA. Its website says it is the nation’s premier agency “providing global intelligence in an ever-changing political, social, economic, technological and military landscape”.