Now that we have introduced the concept of identification technologies and framed their potential to impact personal privacy, we can proceed to apply VSD to these technologies. As always, we start by thinking about the stakeholders whose lives would be impacted by these technologies and about how their values are implicated.
VSD reminds us to always think about the direct and indirect stakeholders impacted by a given technology. Identification technologies are a unique case, however, in that there are no indirect stakeholders: by design, the lens of identification technologies can be focused on anyone, which means everyone is a direct stakeholder. As such, we now discuss a potential set of direct stakeholders and their values. These groups are meant to be illustrative rather than exhaustive; we intentionally provide examples that range from the very broad to the very specific.
The technology owner. Some institution always owns and controls a given identification technology, and their values will help steer how the technology is developed and used in practice. Commercial entities like Facebook, Clearview AI, and Ever AI may primarily value the profit-making potential of the technology, although they may also pay lip service to other values like improving accessibility (e.g., through automated photo tagging) and public safety (e.g., by making the tool available to law enforcement).
The technology developers. The people who build a technology, in technical and non-technical roles, will have their own ideas about the values that should be promoted by the systems they build. With respect to identification technologies, the developers may have a genuine desire to improve convenience and accessibility for their users. Or they may feel strongly that helping law enforcement improve public safety justifies any downsides associated with identification technologies.
The public. Members of the public hold conflicting values with respect to identification technologies. On one hand, some people may feel that the erosion of obscurity is an unacceptable breach of privacy that will fundamentally chill personal autonomy, and they may invoke George Orwell and China as examples of why these technologies should not be built or deployed. On the other hand, some may favor the use of identification technologies for policing, i.e., to improve public safety, individual calmness, and collective accountability.
Law enforcement. Law enforcement agencies at the local and federal level are very interested in identification technologies. These technologies have the potential to deter crime (i.e., because potential criminals know they will be recorded and caught) and to help solve a variety of crimes, ranging from identifying robbers in security camera footage to stopping smugglers at the border.
Members of historically oppressed groups. The focus and weight of law enforcement and government are not felt equally across all members of society – historically, members of minority groups (e.g., Black and LGBTQIA+ people) and low socio-economic status groups have borne the brunt of institutional power. Given this history, it is reasonable for members of these groups to assume that identification technologies will be used against them with greater frequency, which crystallizes these groups’ concerns about the inherent biases in identification technologies. In addition to freedom from bias, members of these groups may also care deeply about the impact of identification technologies on privacy, autonomy, and social justice. For example, facial recognition could be used to identify people at a gay bar or gay pride event – even if those people have not come out and do not wish to be publicly outed.
Victims of sexual violence. Identification technologies enable surveillance, and surveillance is a key tool for abusers to control their victims. Consider how identification technologies could be leveraged by an abuser to stalk their victims, or re-identify a victim who has fled.
Sex workers. Various forms of sex work are perfectly legal in the US (e.g., acting in pornographic films), yet this occupation is precarious because of the stigma attached to it (among other reasons). Sex workers routinely try to protect themselves by adopting stage names, but the separation between their professional and personal identities may be untenable in the face of identification technologies.
Journalists and other “undercover” investigators. There are a variety of legitimate occupations that depend on deception about identity in order to function, e.g., investigative journalists and undercover police officers. The success and safety of these professionals may be jeopardized by widely available identification technologies.
People living under oppressive regimes. As the example of Uighurs living in the Xinjiang region of China vividly demonstrates, identification technologies can be immensely powerful tools of social control, with commensurate consequences for individual privacy, autonomy, and human rights.