Pattern number within this pattern set: 
Douglas Schuler
Public Sphere Project

Because technology and technological systems can play out in so many ways, this is one of the longest problem statements in the pattern language. Technological systems are often portrayed as nearly miraculous solutions to problems both real and imagined. For this reason, people put faith in technology that is not always warranted. Moreover, the technologists peddling techno-utopian visions in which technology causes problems to vanish — essentially by magic — are not subjected to the same scrutiny that other societal prognosticators receive. For one thing, technologists, by virtue of their special knowledge and unfathomable jargon, often intimidate non-technologists. An unquestioning reliance on technology can result in a technocratic culture in which people come to expect technological solutions. Technology puts major decisions in the hands of the technologists, degrades public discussion, and diverts attention and funds. Socio-technological systems generally have implicit trajectories. They are often implemented as "total programs" when, almost by definition, they are partial solutions that don't address the social aspects with analysis, co-design, education, or funding. This is what is generally lacking when computers are introduced into the classroom, or in discussions about bringing inexpensive laptops to the children of Africa. The use of technology often introduces new problems, including ones that humankind is not prepared for. (And then, of course, "technology will solve the new problems.") Introducing mandatory laptop computers in a middle school or high school, for example, soon leads to additional issues. Should students be able to use Instant Messenger during class? Download movies? Play fantasy baseball? 
As was pointed out in the play Mitzi's Abortion (Heffron, 2005), based on a true story, technology can tell a pregnant woman that the baby she's carrying has no brain, but it can't provide any guidance on what she can do about the situation or how to negotiate with her insurance company over the financial burdens that may arise. Technology can be used for dumbing down (and having technology shouldn't be an excuse for ignorance — why learn anything when I can simply find the knowledge on the Internet whenever I want?). Moreover, it almost goes without saying that technology is the near-perfect candidate for systems of exploitation, control, and surveillance. Machines will never be seized with doubts about ethics or morality as a human pressed into an inhuman situation may be. On the other hand, we must continually remind ourselves that technology breaks down. It is not perfect and never will be. The large number of failed tests of the Strategic Defense Initiative, the mixed results of laptops in schools, the quiet withdrawal of facial recognition systems for "homeland security," and the potential for economic collapse due to unanticipated results of automated buying and selling all show the imperfection of our technological creations. The irony, however, is that technology might be most dangerous when it works correctly. For example, wouldn't a total failure of nuclear or biological weapons be preferable to "success"? Although Isaac Asimov presumed that humans would never allow robots to make life-or-death decisions or take the life of a human, the "launch on warning" computerized systems in the US (and presumably Russia) are virtually the same, minus the anthropomorphic features we've come to expect in our robots.


Virtually anybody who is alive today will be confronted with new technology that is likely to change the circumstances of their life.


The more interesting and useful sense of the word criticism is the one found in art or literary criticism, namely the analysis, evaluation, interpretation, and judging of something. Technology — or, rather, its practice, including discourse, development, use, education, funding, regulation, and disposal, in addition to its physical embodiment — deserves this type of attention like all other aspects and creations of humankind.

Although technology, and ICT particularly, has a variety of attributes — or "affordances" — that allow or encourage new capabilities (while discouraging others), and individual people obviously play important roles in technology use (in the small), the extreme "weight" of the social context will always exert considerable impact. The mistake that people most frequently make is forgetting the fundamental fact that technology in all of its guises is applied within specific social contexts. In other words, the phrase "Guns don't kill people, people kill people" would be more accurate if it read, "Guns don't kill people, they just vastly improve the ease and efficiency of doing so." Without the "need" to shoot people, only a fraction of the world's arsenal would exist. There is no such thing as "technology by itself" and, therefore, it makes no sense to view it in those terms.

The Strategic Defense Initiative, commonly called "Star Wars," illustrates many of the reasons why technocriticism is so necessary. Basically untestable and demonstrably unreliable, the SDI effort escalates militarism at the expense of non-military solutions while diverting large sums of money from other, more worthwhile enterprises. The further militarization of space and the development of the next generation of nuclear weapons also cry out for very deep technocriticism.

One of the most visible current manifestations of techno-utopianism revolves around the prospects of a new "$100 Laptop," ostensibly for the children of Africa. Although many people believe that computers have an intrinsically "subversive" nature that would empower people around the world, it's not at all clear to me why African kids would be less attracted to Grand Theft Auto or other violent, time-squandering video games than, say, their American counterparts if a brand-new laptop computer were suddenly in their possession.

A last example provides a glimpse of what can happen when fast computers and knowledge of human behavior are combined within specific systems of power. In a provocative article entitled "AI Seduces Stanford Students" (200_), Kevin Poulsen describes a phenomenon called the "chameleon effect," in which "people are perceived as more honest and likeable if they subtly mimic the body language of the person they're speaking with." Now scientists at Stanford University's Virtual Human Interaction Lab have demonstrated that computers can exploit the same phenomenon, but with greater success and on a larger scale. Sixty-nine student volunteers interacted with a realistic human face, a computer-generated "digital agent" that delivered a three-minute persuasive speech. Unbeknownst to, and undetected by, seven out of eight students, the talking head was mimicking their every expression — eye movements, head tilts, and so on.

The ominous result of this experiment was that the students reported that the echoing "agent" was "more friendly, interesting, honest, and persuasive" than one that didn't blindly ape the facial movements of its mark. One doesn't need excess paranoia to imagine what lies in store for us when ubiquitous mass media systems, perhaps two-way, are joined with the system described above. Poulsen describes one way in which this could be accomplished:

"Bailenson [the Stanford researcher] says the research not only shows that computers can take advantage of our psychological quirks, but that they can do it more effectively than humans can because they can execute precise movements with scientifically optimized timing. The killer app is in virtual worlds, where each inhabitant can be presented with a different image, and the chameleon effect is no longer limited to one-on-one interaction. A single speaker — whether an AI or a human avatar — could mimic a thousand people at once, undetected, transforming a cheap salesman's trick into a tool of mass influence."
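The unsettling part of Bailenson's observation is how little machinery the trick requires: the agent simply replays each viewer's own movements after a delay long enough that the mimicry goes unnoticed, and a computer can do this for any number of viewers at once. A minimal sketch of that delayed-echo mechanism (the pose representation and delay length are illustrative assumptions, not details from the study):

```python
from collections import deque

class MimicAgent:
    """Echoes a viewer's head poses after a fixed delay, so the
    mimicry is hard for the viewer to notice consciously."""

    def __init__(self, delay_frames):
        # Ring buffer of the viewer's recent poses; the agent outputs
        # the pose observed `delay_frames` frames ago.
        self.buffer = deque(maxlen=delay_frames)
        self.neutral = (0.0, 0.0, 0.0)  # pitch, yaw, roll

    def observe(self, viewer_pose):
        """Record the viewer's current pose; return the agent's pose."""
        self.buffer.append(viewer_pose)
        if len(self.buffer) < self.buffer.maxlen:
            return self.neutral  # not enough history yet: stay neutral
        return self.buffer[0]   # oldest buffered pose = delayed echo

# Each viewer gets a private agent, so one "speaker" can mimic
# a thousand people at once, each seeing only their own echo.
agent = MimicAgent(delay_frames=3)
poses = [(0.1, 0.0, 0.0), (0.2, 0.1, 0.0), (0.3, 0.2, 0.1), (0.4, 0.3, 0.2)]
outputs = [agent.observe(p) for p in poses]
# First two outputs are neutral; from the third frame on, the agent
# replays the viewer's pose from three frames earlier.
```

The point of the sketch is how cheap the "tool of mass influence" is: per viewer, it is a few lines of buffering, which is exactly why scale, not sophistication, is the danger.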

Ironically, the people who are best equipped to apply this pattern are the people who know the most about how technology is designed, deployed, marketed, and so on. Technophiles probably make the best technocritics. This is an argument for technical education that is integrated with the humanities and the social sciences, a marriage that many people in the non-technical disciplines might find as distasteful as those in the technical disciplines do. A society that was technologically literate would not "throw technology at problems" any more than it would "throw money at problems." That, however, is not at all the same as saying that technology or money can never help solve problems, as both resources, when applied wisely, can help immensely.

Organizations like Computer Professionals for Social Responsibility, which has worked on issues like SDI and electronic voting, and groups like the Union of Concerned Scientists and the Electronic Privacy Information Center are working in this area. The Bulletin of the Atomic Scientists provides thoughtful discussion on matters of weapons and national security. The world is ready for discussions in this area that aren't dominated by the media and the digerati. Many policy options come to mind, but in general they should be based on informed public discourse. One intriguing example is the "co-determination laws" in Scandinavia in the 1980s, under which new technology could not be introduced without the consent of the workers. Another ripe field is the genetic engineering of seeds and other biological entities.

Luddism is no more an answer to the question of "autonomous technology" than uncritical embrace is. The solution is not to totally eschew the use of technology in society. Technology is an integral part of the human condition. At the same time, for the reasons discussed above, it is important to acknowledge and consider how technology is presented, designed, discussed, implemented, and used, just as other activities — particularly ones with similar potential for large-scale disruption — should be subjected to this scrutiny. Unfortunately, there are a surprising number of people who interpret any such discussion as being "anti-technology" (which is barely even thinkable). At any rate, this reaction has a chilling effect on the idea of actual conversations about technology (as would befit a democratic society) and has the adverse effect of reinforcing the stereotype that technologists are binary thinkers who are simply not capable of more nuanced thought.

Langdon Winner, one of the intellectual founders of technocriticism (along with Lewis Mumford, Norbert Wiener, and, even, Dwight Eisenhower), made these statements in relation to the advent of ubiquitous digital computer networks:

As we ponder horizons of computing and society today, it seems likely that American society will reproduce some of the basic tendencies of modernism:
  • unequal power over key decisions about what is built and why;
  • concerted attempts to enframe and direct people's lives in both work and consumption;
  • the presentation of the future society as something nonnegotiable;
  • the stress on individual gratification rather than collective problems and responsibilities;
  • design strategies that conceal and obfuscate important realms of social complexity.

Although technological systems can be extremely powerful, they are subject to a number of limitations that must be understood and probed thoroughly if these systems are to be deployed effectively in society. A more visible, inclusive, and engaging practice of technocriticism could go a long way toward educating society on the myriad implications — both creative and destructive — of technology in today's world.


Technology often alters power relations between people, generally amplifying the power of some and not others. The development of new military technology through history dramatically illustrates this phenomenon. The distribution of computers in society is yet another example: generally, rich people have them and poor people don't. If computers enable people to be more productive (as computer-related companies assert), then economic benefits will obviously accrue to those who have them. People need to understand, or at least anticipate to some degree, not only the effects of specific technological artifacts (RFID in running shoes, for example) but also the socio-technological systems that they support or destabilize.

Verbiage for pattern card: 

Unquestioning reliance on technology can create a culture in which people expect technological solutions to all problems. This blind faith can put decisions in the hands of the technologists, degrade public discussion, and divert attention and funds. It often alters power relations by amplifying the power of some. We need to understand and anticipate not only the effects of specific technological artifacts but the broader implications as well.

Pattern status: 
Information about introductory graphic: 
Baby Monkeys Playing, photograph by Richard Sclove