VSD in Practice

Implementing VSD in practice starts from the moment of project conception. Before you even commit to an idea, the full range of implications stemming from that idea needs to be investigated and critiqued. The following list proposes a model process for integrating VSD into the technological development lifecycle.

  1. Framing Technical Work
  • Clarify explicitly supported project values and designer stance
  • Situate the work within a social context
  2. Empirical Investigation
  • Identify key direct and indirect stakeholders
  • Elicit potential values from stakeholders
  • Systematically identify benefits and harms for stakeholders
  • Refine understanding of the social context
  3. Value Investigation
  • Develop working definitions of key values and identify potential value tensions
  • Analyze project from diverse perspectives
  • Define technical and technological success objectives
  • Identify potential ethical concerns
  • Map tensions to success objectives
  4. Technical Investigation
  • Identify choice points where the design team has the mandate, control, or power to intervene
  • Build technological and social interventions
  5. Monitor and Respond to Change Over Time

Visualized as a flowchart, this process might look like this:

graph LR
    F(Framing Technical Work)
    E(Empirical Investigation)
    V(Value Investigation)
    T(Technical Investigation)
    M(Adapt Over Time)
    F ==> E
    E ==> V
    V ==> T
    T ==> M
    V --> E
    T --> E
    M --> E

The thick arrows capture the linear path through the process. At a minimum, this is what it takes to implement VSD in practice. In reality, however, new questions will arise during the value and technical investigations that may require additional empirical investigation to answer; these feedback loops appear as the thin arrows in the flowchart. You should not be afraid to loop back around, collect more data, revise your assumptions, and adapt your designs!

Framing Technical Work

There are some new steps and terms introduced in this process that are worth discussing. “Framing technical work” refers to the initial idea for your project. Right at the outset, it is useful to think about what the intended goals of the technology are and where you want it to fit into society. This will help guide the empirical investigation by allowing it to focus on a reasonable set of hypothesized social contexts and stakeholders. Of course, these initial, hypothesized goals and contexts will need to be expanded and revised during the value investigation and in light of the knowledge gleaned from the empirical investigation.

During the initial framing process, it is also useful to spend some time thinking about your own values, the values of your team (if you’re not working alone), and the values of your organization (e.g., your company). What ethical concerns, if any, does the team have about the project from the outset? Are there corporate values or objectives (e.g., profit maximization, user growth, etc.) that will need to be grappled with later on? Has your organization adopted or instituted value statements or principles that are relevant to the project?

Working Definitions of Values

By “working definitions” of values, we mean definitions that are closer to operational directives. Useful working definitions will be actionable during the technical investigation, i.e., they can guide the design of technical and social interventions.

For example, consider the value of free expression. One way to operationalize this value is to define it as “no censorship”. This would imply that the design goal is to minimize content moderation at all costs. However, you could also operationalize free expression as “facilitating maximum communication”. This implies a design that strategically uses content moderation to eliminate abusive speech (e.g., hate speech and bullying) that discourages other, productive speech.
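To make this concrete, here is a minimal sketch, in Python, of how the second working definition might be turned into an actionable policy during the technical investigation. The names here (moderate, classify_abuse, and the thresholds) are hypothetical illustrations, not references to any real library:

from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    PUBLISH = "publish"
    REVIEW = "review"    # route to a human moderator
    REMOVE = "remove"

@dataclass
class Post:
    author_id: str
    text: str

def moderate(post: Post, classify_abuse) -> Decision:
    """Working definition: 'facilitating maximum communication'.
    Remove speech (e.g., hate speech, bullying) that silences others;
    otherwise err on the side of publishing."""
    score = classify_abuse(post.text)  # hypothetical classifier, returns 0.0-1.0
    if score > 0.9:
        return Decision.REMOVE
    if score > 0.6:
        return Decision.REVIEW
    return Decision.PUBLISH

A design guided by the “no censorship” definition would instead publish every post unconditionally, so the choice of working definition directly shapes the code that gets written.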

Thoughtful and careful operationalization of values into working definitions is crucial to VSD and often requires diverse expertise. For example, translating justice-oriented concerns into a working definition is a difficult challenge that involves contextual understanding (what exactly are the justice-oriented concerns in the context at issue?), theoretical knowledge (what are reasonable conceptions of justice in the context?), and technical capabilities (how can the conceptions of justice be technically realized?). Therefore, addressing the problem can require domain-specific expertise, ethical expertise, and technical expertise.

Choice Points for Technical and Social Interventions

Recall that practitioners are not limited to just designing technology – they can also help to design the policies and processes that surround the technology. This gives designers the flexibility to design technical and social interventions to resolve apparent ethical dilemmas and balance value tensions. See the previous section for more information.

“Choice points” refer to the places in the design of technology where designers have the power to make changes to address ethical concerns or incorporate values. For example, suppose you decide to make a smartphone app for tracking fitness data. You can’t decide to support iOS but not Android, or vice versa – this simply isn’t feasible given current market conditions. But, you can decide what versions of iOS and Android to target, with the idea being that targeting older versions will make your app more accessible to people with older smartphones. Similarly, you can’t decide not to leverage GPS location data in your app – it simply isn’t useful without this data. But, you can decide where to store this data, for how long, and in what form. A privacy-conscious design might store the location data on the users’ smartphone in an aggregated form, rather than on a remote server in a highly precise form.
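As a minimal sketch of that privacy-conscious choice, the app could aggregate each day’s GPS trace into the summary statistics it actually needs and discard the raw fixes, all on the device. Everything below (GpsFix, daily_summary, the haversine helper) is an illustrative assumption rather than part of any real fitness-tracking API:

import math
from dataclasses import dataclass

@dataclass
class GpsFix:
    lat: float
    lon: float
    timestamp: float      # seconds since the epoch

def haversine_km(a: GpsFix, b: GpsFix) -> float:
    """Approximate great-circle distance between two GPS fixes, in km."""
    r = 6371.0
    dlat = math.radians(b.lat - a.lat)
    dlon = math.radians(b.lon - a.lon)
    h = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(a.lat)) * math.cos(math.radians(b.lat))
         * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

def daily_summary(fixes: list[GpsFix]) -> dict:
    """Reduce a day's GPS trace to the aggregates the app actually needs,
    so the precise trace never has to leave the device."""
    total_km = sum(haversine_km(a, b) for a, b in zip(fixes, fixes[1:]))
    return {"total_km": round(total_km, 2), "num_fixes": len(fixes)}

The choice point here is not whether to use location data, but what form of that data is ever retained or transmitted.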

Technical vs. Technological Success Metrics

It is typical for technologists to focus on technical metrics of success. Common examples might be: test coverage; number of open issues in the bug tracker; benchmarks of speed or predictive accuracy; counts of app installations, user clicks, pages viewed, or interaction time; or number of users who make a purchase. This focus is understandable, and these metrics can be valuable because they are quantifiable.

However, VSD asks that designers expand their definition of success to include technological success metrics. A myopic focus on narrow metrics runs the risk of missing the forest for the trees. For example, just because users are clicking links and spending time on your website does not necessarily mean that they are happy, or that your website is having a positive impact on their lives. Maybe users are spending lots of time on the website because they are confused, lost, and frustrated. Or consider how low-quality “clickbait” articles online can be effective at capturing clicks and attention, yet provide little to no actual value for individuals.

Assessing technological success requires expanding the scope of what it means to succeed by looking at the impact of technology on individuals, society, and the environment. Example metrics might include: assessments of quality of life; measures of bias; reports of bullying or hate speech; or carbon footprint. Some of these qualities can be measured quantitatively, but gathering the data may necessitate engaging directly with users through surveys and interviews. In other cases, like assessing impact on society, qualitative evaluation may be more appropriate.
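As an illustration, a team might report technical and technological metrics side by side so that neither is optimized in isolation. The structures and field names below are assumptions made up for this sketch:

from dataclasses import dataclass

@dataclass
class TechnicalMetrics:
    daily_active_users: int
    avg_session_minutes: float
    crash_rate: float               # crashes per 1,000 sessions

@dataclass
class TechnologicalMetrics:
    survey_wellbeing_score: float   # e.g., mean of a 1-5 well-being survey
    harassment_reports: int         # reports of bullying or hate speech
    estimated_kg_co2: float         # carbon footprint of running the service

def success_report(tech: TechnicalMetrics, impact: TechnologicalMetrics) -> str:
    """Render both kinds of metrics side by side for design reviews."""
    return (
        f"DAU: {tech.daily_active_users}, "
        f"session: {tech.avg_session_minutes:.1f} min, "
        f"crash rate: {tech.crash_rate:.2f}/1k sessions\n"
        f"well-being: {impact.survey_wellbeing_score:.1f}/5, "
        f"harassment reports: {impact.harassment_reports}, "
        f"CO2: {impact.estimated_kg_co2:.0f} kg"
    )

Fields like survey_wellbeing_score would be populated from the surveys and interviews mentioned above, not from server logs alone.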

Responding to Change Over Time

Society is constantly changing. The implication is that your technology will also need to adapt over time, in response to both technological innovations and shifting social norms. Stakeholders’ values may shift or change in priority, which may have implications for your designs.

It is also critical to recognize that by releasing technology into society, you impact society. Your technology may create ripple effects that alter social norms and thus necessitate reevaluating your own designs.

VSD calls for constant measurement and reevaluation to ensure that your technology continues to operate as intended and with positive impacts on individuals and society.
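One lightweight way to put this into practice, sketched below with made-up metric names and thresholds, is to periodically compare current technological metrics against an agreed-upon baseline and flag when the drift is large enough to warrant looping back to the empirical investigation:

def needs_reinvestigation(current: dict, baseline: dict,
                          tolerance: float = 0.10) -> list[str]:
    """Flag metrics that have drifted beyond the tolerance. Both dicts map
    metric names to numbers; metrics in HIGHER_IS_WORSE signal harm when
    they rise, all others signal harm when they fall."""
    HIGHER_IS_WORSE = {"harassment_reports", "estimated_kg_co2"}
    concerns = []
    for name, base in baseline.items():
        now = current.get(name, base)
        if name in HIGHER_IS_WORSE and now > base * (1 + tolerance):
            concerns.append(f"{name} rose from {base} to {now}")
        elif name not in HIGHER_IS_WORSE and now < base * (1 - tolerance):
            concerns.append(f"{name} fell from {base} to {now}")
    return concerns

# Any non-empty result would prompt the team to loop back to the
# empirical investigation rather than silently continuing as before.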

In the next section, we examine some common challenges that practitioners of VSD face, and discuss ways to address them.