![](/wp-content/uploads/2024/01/012124-02-History-Technology.jpg)
This technology follows earlier biometric surveillance techniques, including fingerprints, passport photos and iris scans.
![](/wp-content/uploads/2024/01/SharronaPearl100.jpg)
By
Associate Professor of Bioethics and History
Introduction
American Amara Majeed was wrongly identified as a terrorism suspect by the Sri Lankan police in 2019. Robert Williams was wrongfully arrested in Detroit and detained in jail for 18 hours for allegedly stealing watches in 2020. Randal Reid was jailed in 2022 for supposedly using stolen credit cards in a state he'd never even visited.
In all three cases, the authorities had the wrong people. In all three, it was face recognition technology that told them they were right. Law enforcement officers in many U.S. states are not required to disclose that they used face recognition technology to identify suspects.
Face recognition technology is the latest and most sophisticated version of biometric surveillance: using unique physical characteristics to identify individual people. It stands in a long line of technologies, from the fingerprint to the passport photo to the iris scan, designed to monitor people and determine who has the right to move freely within and across borders and boundaries.
In my book, I explore how the story of face surveillance lies not just in the history of computing but in the history of medicine, of race, of psychology and neuroscience, and in the health humanities and politics.
Viewed as part of the long history of people-tracking, face recognition technology's incursions into privacy and limitations on free movement are carrying out exactly what biometric surveillance was always meant to do.
The system works by converting captured faces, whether static images from photographs or moving ones from video, into a series of unique data points, which it then compares against the data points drawn from images of faces already in the system. As face recognition technology improves in accuracy and speed, its effectiveness as a means of surveillance becomes ever more pronounced.
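The matching step described above can be sketched as a toy nearest-neighbor search over face embeddings. Everything here is an illustrative stand-in: the random 128-dimensional vectors, the names, and the 0.6 similarity threshold are assumptions, not any real system's values.

```python
import numpy as np

# Hypothetical 128-dimensional face embeddings, the kind a trained
# recognition model produces from an image. Random vectors stand in
# for real model outputs.
rng = np.random.default_rng(0)
database = {name: rng.normal(size=128) for name in ["alice", "bob", "carol"]}

def cosine_similarity(a, b):
    """Similarity between two embeddings, 1.0 meaning identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, db, threshold=0.6):
    """Compare a captured face's embedding against every stored one and
    return the closest identity, or None if nothing on file is similar
    enough to count as a match. The threshold is an arbitrary choice."""
    name, score = max(
        ((n, cosine_similarity(probe, v)) for n, v in db.items()),
        key=lambda pair: pair[1],
    )
    return (name, score) if score >= threshold else (None, score)
```

A probe embedding identical to a stored one matches it with similarity 1.0; an unrelated random vector will usually fall below the threshold and return no match.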
Accuracy Improves, but Biases Persist
![](/wp-content/uploads/2024/01/012124-03-History-Technology.jpg)
Surveillance is predicated on the idea that people can be watched and their movements limited and controlled in a trade-off between privacy and security. The assumption that less privacy leads to more security is built in.
That may be the case for some, but not for the people disproportionately targeted by face recognition technology. Surveillance has always been used to identify the people whom those in power wish to most closely track.
On a global scale, surveillance cameras and related tracking systems are concentrated in particular cities, neighborhoods and even individual buildings, often those with low-income and majority-Black populations. Some people, in short, are watched far more closely than others.
In addition, the cases of Amara Majeed, Robert Williams and Randal Reid are not anomalies. As of 2019, face recognition technology misidentified Black and Asian faces at rates up to 100 times higher than white faces, including, in 2018, a disproportionate number of the 28 members of Congress who were falsely matched with mug shots on file using Amazon's Rekognition tool.
When the database against which captured images were compared had only a limited number of mostly white faces upon which to draw, face recognition technology would offer matches based on the closest alignment available, leading to a pattern of highly racialized, and racist, false positives.
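The failure mode described above follows directly from the design: a matcher that must return the closest face on file will "identify" someone even when the person in the probe image is not in the database at all. A minimal sketch, with made-up embeddings and names standing in for a real gallery:

```python
import numpy as np

rng = np.random.default_rng(1)
# A small gallery whose embeddings cover only a narrow region of the
# space, a stand-in for a database dominated by one demographic group.
gallery = {f"person_{i}": rng.normal(size=64) for i in range(3)}

def closest_match(probe, db):
    """Always return the nearest stored identity, with no similarity
    threshold. This is the design flaw behind many false positives:
    an absent person still gets matched to somebody."""
    return min(db, key=lambda name: np.linalg.norm(db[name] - probe))

# A probe face far from everyone in the gallery still gets a "match".
unseen_probe = rng.normal(size=64) * 5
matched = closest_match(unseen_probe, gallery)  # one of the three names
```

Adding a distance threshold, and a gallery that actually represents the population being searched, is what separates a usable matcher from one that manufactures suspects.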
With the expansion of images in the database and increased sophistication of the software, the rate of false positives (incorrect matches between specific individuals and images of wanted people on file) has declined. Improvements in pixelation and in mapping static images onto moving ones, along with increased social media tagging and ever-larger scraped image databases like those developed by Clearview AI, have helped decrease the error rates.
Biases, however, remain deeply embedded in the systems and their purpose, explicitly or implicitly targeting already targeted communities. The technology is not neutral, nor is the surveillance it is used to carry out.
Latest Technique in a Long History
![](/wp-content/uploads/2024/01/012124-04-History-Technology.jpg)
Face recognition software is only the most recent manifestation of global systems of tracking and sorting. Precursors are rooted in the now-debunked belief that bodily features offer a unique index to character and identity. This pseudoscience was formalized in the late 18th century under the rubric of physiognomy.
Early systemic applications included anthropometry (body measurement), fingerprinting and iris or retinal scans. They all offered unique identifiers. None of these could be done without the participation, willing or otherwise, of the person being tracked.
The framework of bodily identification was adopted in the 19th century for use in criminal justice detection, prosecution and record-keeping, allowing governments to control their populaces. The intimate relationship between face recognition and border patrol was galvanized by the introduction of photographic passport requirements in some countries, including Great Britain and the United States, in 1914, at the start of World War I.
Face recognition technology provided a way to conduct human biometric surveillance covertly. Much early research into face recognition software was funded by government agencies for the purposes of border surveillance.
That research tried to develop a standardized framework for face segmentation: mapping the distances between a person's facial features, including eyes, nose, mouth and hairline. Inputting that data into computers let a user search stored photographs for a match. These early scans and maps were limited, and the attempts to match them were not successful.
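The segmentation scheme described above can be illustrated with a short sketch: reduce a face to the pairwise distances between hand-marked landmarks, then compare those distance signatures. The landmark names and coordinates here are invented for illustration, not taken from any historical dataset.

```python
import math
from itertools import combinations

# Hypothetical landmark coordinates (x, y) for a face on file, in the
# spirit of early face-segmentation work: eyes, nose, mouth, hairline.
face_on_file = {
    "left_eye": (30, 40), "right_eye": (70, 40),
    "nose": (50, 60), "mouth": (50, 80), "hairline": (50, 10),
}
# A newly captured image of (supposedly) the same person, with small
# measurement error in each landmark.
captured_face = {
    "left_eye": (31, 41), "right_eye": (69, 40),
    "nose": (50, 61), "mouth": (51, 79), "hairline": (50, 11),
}
# A different face with visibly different geometry.
other_face = {
    "left_eye": (20, 45), "right_eye": (80, 45),
    "nose": (50, 70), "mouth": (50, 95), "hairline": (50, 5),
}

def distance_signature(landmarks):
    """Flatten a face into the pairwise distances between its features,
    a small numeric signature that can be stored and searched."""
    points = [landmarks[k] for k in sorted(landmarks)]
    return [math.dist(p, q) for p, q in combinations(points, 2)]

def signature_difference(a, b):
    """Sum of absolute differences between two signatures; smaller
    values suggest the same face, by this crude early-style measure."""
    return sum(abs(x - y) for x, y in zip(distance_signature(a),
                                          distance_signature(b)))
```

The captured face scores much closer to the face on file than the other face does, which is exactly the comparison the early systems automated, and also why they failed: the signature shifts with pose, expression and measurement error.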
![](/wp-content/uploads/2024/01/012124-05-History-Technology.jpg)
More recently, private companies have adopted surveillance tools, including face recognition, as part of a long practice of collecting and monetizing customer data.
Face recognition technology works not only to unlock your phone or help you board your plane more quickly, but also in promotional store kiosks and, essentially, in any photo taken and shared by anyone, with anyone, anywhere around the world. These photos are stored in a database, creating ever more comprehensive systems of surveillance and tracking.
And while that means that today it is unlikely that Amara Majeed, Robert Williams, Randal Reid and Black members of Congress would be ensnared by a false positive, face recognition technology has invaded everyone's privacy. It, and the governmental and private systems that design, run, use and capitalize upon it, is watching, and paying particular attention to those whom society and its structural biases deem to be the greatest risk.
Originally published 01.19.2024, under the terms of a license.