In the language of technology studies, technical codes may be conceived as the rules under which “black boxing” occurs. At the end of the development process, when a technology finally assumes its standard configuration, we know “what” it is; it acquires an essence.64 This essence is of course revisable, but only with difficulty compared with the very fluid situation of the first innovative attempts to make the device. The technical code prescribes some important aspects of the standard configuration, specifically those which translate between social demands and technical requirements.
Fig. 2 Schematic diagram showing the relationship between technical elements, the design space, and a concrete device or technology. In critical theory of technology, a technical code (TC) is what enables the selection of a “best” design from a multitude of design possibilities. Exactly how this code is selected and applied is an empirical question, which will vary depending on the case being studied. The researcher’s task is to draw out the TC from a particular context through sociological analysis.
We began this chapter by asking questions about the role of intentionality within the design process. Specifically, we have suggested that the path from designers’ intentions to the design of products is not a straightforward one. Though on the surface designers may seem like powerful actors, they are caught in the same web of constraints confronting other actors. Designers do not work in a vacuum. And all too often design demands, implicitly or explicitly, that new devices fit with established ways of being. In other words, designers must accommodate themselves to existing social worlds, which implies submitting to existing power relations and hierarchies. The stifling effect of such passive coercion is a significant obstacle to the realization of alternative designs.
We then outlined a critical theory of technology and explained how a greater focus on the historical and cultural conditions underlying the design process could help illuminate paths to different kinds of design. Technical elements, which in principle could be combined in any number of ways to form a device, are brought together under the constraints of a technical code to produce a concrete device that “fits” a specific social context. Moreover, designers are influenced by what has gone before: yesterday’s tools inform today’s designs, even when yesterday’s tools may have been less than optimal.65 This means that of the many technically feasible options available in the design space, only a small percentage are ever realized. We have argued that the process of resolving technically underdetermined choices should be the focal point of a philosophy of design. We have also argued that, rather than understanding this process solely in terms of the interests or strategies of specific actors (à la SCOT and ANT), we should look at the values and practices that are taken for granted in the broader culture.
If we understand technologies to be underdetermined, then the question facing society is not whether to accept or reject technology, but rather how alternative values can be brought into the design process so that the technical codes that determine design are humane and liberating rather than oppressive and controlling. An important first step in this process is to acknowledge that neither proximate designers nor the immediate design environment is decisive in determining the outcome of complex design processes. Instead, people’s taken-for-granted assumptions about the forms and meanings of specific technologies, what we have called here our technical heritage, are crucial. Critical theory of technology draws attention to these background assumptions and asks that the researcher take them seriously. Our hope is that by questioning technology vigorously we can help open a space for designing technology differently.
Abbate, J., 1999, Inventing the Internet, MIT Press, Cambridge, MA.
Bowker, G. C., and Star, S. L., 2000, Sorting Things Out: Classification and Its Consequences, MIT Press, Cambridge, MA.
Bucciarelli, L. L., 1994, Designing Engineers, MIT Press, Cambridge, MA.
Chandler, A. D., 1977, The Visible Hand: The Managerial Revolution in American Business, Belknap Press, Cambridge, MA.
David, P. A., 1985, Clio and the economics of QWERTY, Am. Econ. Rev. 75(2):332-337.
Downey, G., 1998, The Machine in Me: An Anthropologist Sits Among Computer Engineers, Routledge, New York.
Edwards, P., 1996, The Closed World: Computers and the Politics of Discourse in Cold War America, MIT Press, Cambridge, MA.
Feenberg, A., 1999, Questioning Technology, Routledge, New York.
Feenberg, A., 2002, Transforming Technology: A Critical Theory Revisited, Oxford University Press, Oxford.
Kunda, G., 1993, Engineering Culture: Control and Commitment in a High-Tech Corporation, Temple University Press, Philadelphia.
Noble, D., 1977, America by Design: Science, Technology, and the Rise of Corporate Capitalism, Alfred A. Knopf, New York.
Norman, D., 1988, The Design of Everyday Things, Basic Books, New York.
Pinch, T., and Bijker, W. E., 1987, The social construction of facts and artifacts, in: The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology, W. E. Bijker, T. P. Hughes, and T. Pinch, eds., MIT Press, Cambridge, MA.
Sclove, R., 1995, Democracy and Technology, Guilford Press, New York.
Winner, L., 1986, The Whale and the Reactor: A Search for Limits in an Age of High Technology, University of Chicago Press, Chicago.
Woodhouse, E., and Patton, J. W., 2004, Introduction: design by society: science and technology studies and the social shaping of design, Des. Issues 20(3):1-12.
Design Culture and Acceptable Risk
Kiyotaka Naoe
Abstract Technological design is usually considered to be a process of stipulating target functions. Technological artifacts, however, are not determined entirely by the intent of the engineers who designed them: they unavoidably contain unpredictable and uncertain characteristics that transcend engineers’ intent, and they cannot be understood purely from a functionalist perspective. In aviation, for example, the smooth execution of a flight is ensured by a system that includes pilots interacting with each other and with a suite of technological devices. Emphasizing the human aspect of technological design, this article presents a theoretical framework that takes the socio-cultural aspects of technology as primary for a philosophical and ethical analysis. An analysis of the acceptability of risks shows that the reliability of a technology is determined by the reliability of technological decisions and, ultimately, by the existence of a reliable technological culture. The task of an ethics of risk is therefore to provide ways to reform our technological culture.
64. Note that we do not mean “essence” in a Heideggerian sense, nor do we mean it in the ahistorical sense that essentialist philosophers of technology posit. The “essence” here is historically constituted and remains open to revision.
65. See, for instance, David’s (1985) classic study of the QWERTY keyboard and how, despite being less than optimal in terms of layout and typing efficiency, it has remained the standard keyboard layout.