Discussion:
That technology itself determines what is to be done by a process of extrapolation and that individuals are powerless to intervene in that determination is precisely the kind of self-fulfilling dream from which we must awaken.
I don't say that systems such as I have mentioned [gigantic computer systems, computer networks, and speech recognition systems] are necessarily evil; only that they may be and, what is most important, that their inevitability cannot be accepted by individuals claiming autonomy, freedom, and dignity. The individual computer scientist can and must decide. The determination of what the impact of computers on society is to be is, at least in part, in his hands.
It is possible, given courage and insight, for man to deny technology the prerogative to formulate man's questions. It is possible to ask human questions and to find humane answers. (Joseph Weizenbaum, 1972, p. 614.)
Heeding the call of computer scientists like Joseph Weizenbaum and cyberneticist Norbert Wiener before him, the emerging field of Value Sensitive Design seeks to design technology that accounts for human values in a principled and comprehensive manner throughout the design process (Friedman, 1997; Friedman and Kahn, 2003; Friedman, Kahn, and Borning, 2006). Value Sensitive Design is primarily concerned with values that center on human well-being, human dignity, justice, welfare, and human rights. This approach is principled in that it maintains that such values have moral standing independent of whether a particular person or group upholds them (e.g., the belief in and practice of slavery by a certain group does not a priori mean that slavery is a morally acceptable practice). At the same time, Value Sensitive Design maintains that how such values play out in a particular culture at a particular point in time can vary, sometimes considerably.
Value Sensitive Design articulates an interactional position for how values become implicated in technological designs. An interactional position holds that while the features or properties that people design into technologies more readily support certain values and hinder others, the technology's actual use depends on the goals of the people interacting with it. A screwdriver, after all, is well-suited for turning screws, and yet amenable to use as a poker, pry bar, nail set, cutting device, and tool to dig up weeds. Moreover, through human interaction, technology itself changes over time. On occasion, such changes can mean the societal rejection of a technology, or that its acceptance is delayed. But more often it entails an iterative process whereby technologies are invented and then redesigned based on user interactions, which then are reintroduced to users, further interactions occur, and further redesigns are implemented.
To date, Value Sensitive Design has been used in a wide range of research and design contexts including: an investigation of bias in computer systems (Friedman and Nissenbaum, in Friedman, 1997), universal access within a communications company (Thomas, in Friedman, 1997), Internet privacy (Ackerman and Cranor, 1999), informed consent for online interactions (Friedman, Howe, & Felten, 2002), ubiquitous sensing of the environment and individual rights (Abowd & Jacobs, 2001), computer simulation to support democratization of the urban planning process (Borning, Friedman, Davis, & Lin, 2005), social and moral aspects of human-robotic interaction (Kahn, Freier, Friedman, Severson, and Feldman, 2004), privacy in public (Friedman, Kahn, Hagman, Severson, & Gill, 2006), value analyses in reflective design (Sengers, Boehner, David, & Kaye, 2005), and the place of designer values in the design process (Flanagan, Howe, & Nissenbaum, 2005).
Methodologically, at the core of Value Sensitive Design lies an iterative process that integrates conceptual, empirical, and technical investigations. Conceptual investigations involve philosophically informed analyses of the central constructs and issues under investigation. Questions include: How are values supported or diminished by particular technological designs? Who is affected? How should we engage in trade-offs among competing values in the design, implementation, and use of information systems? Empirical investigations involve both social-scientific research on the understandings, contexts, and experiences of the people affected by the technological designs as well as the development of relevant laws, policies, and regulations. Technical investigations involve analyzing current technical mechanisms and designs to assess how well they support particular values, and, conversely, identifying values, and then identifying and/or developing technical mechanisms and designs that can support those values.
How then to practice Value Sensitive Design? Some suggestions follow (see also Friedman, Kahn, & Borning, 2006):
- Start With a Value, Technology, or Context of Use. Any of these three core aspects (a value, technology, or context of use) easily motivates Value Sensitive Design. Begin with the aspect that is most central to your work and interests.
- Identify Direct and Indirect Stakeholders. Systematically identify direct and indirect stakeholders. Direct stakeholders are those individuals who interact directly with the technology or with the technology's output; indirect stakeholders are those individuals who are also impacted by the system, though they never interact directly with it.
- Identify Harms and Benefits for Each Stakeholder Group. Systematically identify how each category of direct and indirect stakeholder would be positively or negatively affected by the technology under consideration.
- Map Harms and Benefits onto Corresponding Values. At times the mapping between harms and benefits and corresponding values will be one of identity; at other times the mapping will be multifaceted (that is, a single harm might implicate multiple values, such as both security and autonomy).
- Conduct a Conceptual Investigation of Key Values. Develop careful working definitions for each of the key values. Drawing on the philosophical literature can be helpful here.
- Identify Potential Value Conflicts. For the purposes of design, value conflicts should usually not be conceived of as either/or situations, but as constraints on the design space. Typical value conflicts include accountability vs. privacy, trust vs. security, environmental sustainability vs. economic development, privacy vs. security, and hierarchical control vs. democratization.
- Technical Investigation Heuristic: Value Conflicts. Technical mechanisms will often adjudicate multiple, if not conflicting, values, often in the form of design trade-offs. It may be helpful to make explicit how a design trade-off maps onto a value conflict and differentially affects different groups of stakeholders.
- Technical Investigation Heuristic: Unanticipated Consequences and Value Conflicts. In order to be positioned to respond nimbly to unanticipated consequences and value conflicts, when possible, design flexibility into the underlying technical architecture to support post-deployment modifications.
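The stakeholder and mapping steps above can be sketched as a simple worksheet data structure. Everything in the sketch below is illustrative: the class and function names, the example stakeholders, and the harm-to-value mappings are assumptions for demonstration, not constructs prescribed by Value Sensitive Design itself.

```python
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    """One direct or indirect stakeholder group (hypothetical structure)."""
    name: str
    direct: bool                       # interacts with the technology itself?
    harms: list = field(default_factory=list)
    benefits: list = field(default_factory=list)

# Map each identified harm or benefit onto one or more values;
# note a single harm can implicate multiple values.
value_map = {
    "location tracking": ["privacy", "autonomy"],
    "faster emergency response": ["physical welfare"],
}

# Typical value conflicts to check for (accountability vs. privacy, etc.).
conflict_pairs = {
    frozenset({"privacy", "accountability"}),
    frozenset({"privacy", "security"}),
}

def implicated_values(stakeholders):
    """Collect every value implicated by any stakeholder's harms or benefits."""
    values = set()
    for s in stakeholders:
        for item in s.harms + s.benefits:
            values.update(value_map.get(item, []))
    return values

def value_conflicts(values):
    """Return the known conflict pairs present among the implicated values."""
    return {pair for pair in conflict_pairs if pair <= values}

# Illustrative use: a fleet-dispatch system with two stakeholder groups.
drivers = Stakeholder("delivery drivers", direct=True,
                      harms=["location tracking"])
dispatchers = Stakeholder("dispatchers", direct=True,
                          benefits=["faster emergency response"])

vals = implicated_values([drivers, dispatchers])
conflicts = value_conflicts(vals)
```

Here `vals` gathers privacy, autonomy, and physical welfare, and `conflicts` stays empty until a value such as accountability also enters the picture; a real analysis would of course rest on empirical work with the stakeholders themselves rather than a hard-coded table.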
Note: Much of the material in this pattern was adapted from Friedman and Kahn (2003) and Friedman, Kahn, and Borning (2006).