Introduction to Identification Technologies

Most modern smartphones, and even some desktop computers, include features that allow people to authenticate themselves using their face. For example, Apple’s iPhones include a Face ID system that uses facial recognition to identify the owner of the phone and unlock the device without requiring a password. People welcome this application of facial recognition because it offers convenience. Furthermore, it raises few privacy concerns because the facial data never leaves the person’s device, where it is typically stored in a heavily encrypted and secured form.

Other applications of facial recognition, however, have been controversial because of their implications for privacy. For example, Facebook uses facial recognition to identify images containing your face and suggest that you be tagged in them. Although these “tag suggestions” are shown only to your Facebook friends, it is easy to imagine how individuals’ expectations of privacy might be violated if Facebook removed, or accidentally disabled, this restriction. Or consider the example of Clearview AI: this startup scrapes images of individuals from social media websites without their knowledge or consent, compiles the images into a massive facial recognition database, and then sells access to that database to unknown third parties. Taken to its extreme, a powerful facial recognition system paired with extensive data enables unprecedented social control, as the world is witnessing in the Xinjiang region of China.

Public-interest technologist Bruce Schneier observes that facial recognition is just one of many identification technologies that currently exist or are in development. Others include voice recognition, gait recognition (the ability to recognize an individual based on how they walk), and biometric markers that can be read at a distance, such as fingerprints, iris scans, and even heartbeat patterns. Any one of these technologies, let alone several used in combination, could have profound implications for individual privacy if it were widely deployed, especially by a party that already holds significant power.

Privacy and Obscurity

When discussing the concept of individual privacy, there is a tendency to define it in terms of private and public spaces. In private spaces, like your home or office, you do not expect to be observed or recorded by third parties. Indeed, if someone were to observe or record you within one of these private spaces, they might be violating the law by, e.g., trespassing or wiretapping. In contrast, the law offers little to no privacy protection in public spaces like the sidewalk, a park, or a public school. While you are walking down the street, someone may record what you are saying or take your photograph, incidentally or on purpose, and this is generally legal. More importantly, it is utterly banal: we understand that we are constantly being recorded by security cameras and in the background of other people’s selfies as we navigate public spaces.

Being observed and recorded in public spaces, however, does not automatically mean that we have no privacy in these contexts. In the examples above, nothing reveals our identity to third parties: just because you happen to appear in the background of someone’s photo doesn’t mean the photographer knows who you are. Rather, you remain an anonymous bystander until they undertake an affirmative effort to link your image to your identity.

Scholars Evan Selinger and Woodrow Hartzog refer to this property as “obscurity”:

“Obscurity is the idea that information is safe—at least to some degree—when it is hard to obtain or understand.”

Obscurity is what typically affords us privacy even when we are in nominally public spaces: the effort it would take to track down a person based on the sound of their voice, an image of their face, the outline of a distinctive tattoo, and so on is so great that, the vast majority of the time, no one bothers. Indeed, obscurity is exactly what makes investigative work challenging, e.g., when the police have to track down a suspect based on an image from a security camera.

Identification Technologies vs. Obscurity

The key problem created by identification technologies like facial recognition is that they erode obscurity. All of a sudden it becomes much easier, even trivial, to map from a recording of an individual to that individual’s identity. This erosion of obscurity fundamentally changes the norms and expectations around public spaces, shifting them away from a presumption of anonymity towards individuation.
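To make this concrete, here is a minimal sketch, in Python, of what that mapping looks like once a labeled database of faces exists. It uses the open-source face_recognition library purely as an illustration; the file names and the tiny “database” are hypothetical stand-ins for the vastly larger collections real systems rely on.

```python
# Sketch: turning an "anonymous" photo into a name, given a labeled database.
# The photos (alice.jpg, bob.jpg, bystander.jpg) are hypothetical placeholders.
import face_recognition

# A tiny, hypothetical database: one known, labeled photo per person.
known_people = {
    "Alice Example": "alice.jpg",
    "Bob Example": "bob.jpg",
}

known_names = []
known_encodings = []
for name, path in known_people.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos with no detectable face
        known_names.append(name)
        known_encodings.append(encodings[0])

# A photo of a bystander, e.g. cropped from the background of someone's selfie.
unknown_image = face_recognition.load_image_file("bystander.jpg")
unknown_encodings = face_recognition.face_encodings(unknown_image)

if unknown_encodings:
    # Nearest-neighbor lookup: the smallest distance is the best candidate match.
    distances = face_recognition.face_distance(known_encodings, unknown_encodings[0])
    best = distances.argmin()
    if distances[best] < 0.6:  # the library's common default matching threshold
        print(f"Bystander identified as {known_names[best]}")
    else:
        print("No match in the database")
```

The lookup itself is a few lines of freely available code; everything that makes it powerful lives in the database of labeled images it searches.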

The problems created by identification technologies are compounded when we consider how they shift power away from individuals and towards powerful institutions. Although the underlying technologies needed to build systems like facial recognition (e.g., cheap cameras, source code, database software) are widely available, they are necessary but not sufficient to build a functioning identification system. The missing component is massive amounts of data: images, voice and video recordings, biometric scans, and so on. This data is time-consuming to compile and costly to store. Further, sensitive data like biometric scans is typically available only to privileged organizations, such as law enforcement, that can compel people to provide it. In short: the institutions that are capable of building mass identification systems are the same institutions that already hold significant power in society.
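As a rough illustration of that asymmetry, the “enrollment” side of such a system is just as easy to sketch, again using the open-source face_recognition library; the scraped_photos directory layout and face_db.pkl output file are hypothetical. What cannot be sketched here is the part that actually confers power: acquiring millions of labeled images in the first place.

```python
# Sketch: enrolling a folder of scraped, labeled images into a lookup database.
# The directory layout (scraped_photos/<person_name>/*.jpg) is hypothetical.
from pathlib import Path
import pickle

import face_recognition

database = []  # list of (name, 128-dimensional face encoding) pairs

for person_dir in Path("scraped_photos").iterdir():
    if not person_dir.is_dir():
        continue
    for photo in person_dir.glob("*.jpg"):
        image = face_recognition.load_image_file(photo)
        for encoding in face_recognition.face_encodings(image):
            database.append((person_dir.name, encoding))

# Persist the database for later identification lookups.
with open("face_db.pkl", "wb") as f:
    pickle.dump(database, f)

print(f"Enrolled {len(database)} face encodings")
```

The loop is the trivial part; the scraped_photos directory, scaled up to billions of labeled images, is the asset that only a handful of institutions can assemble.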