The problem of how to deal with the risks posed by technology is becoming increasingly important.
Engineering is often considered a cultural activity, i.e., an activity that people undertake within a social context. Thus, the ethics of engineering, including the ethics of risk, are to be found within this cultural process. However, risk is also considered quantifiable and objective, particularly in scientific risk analysis. Moreover, since the situations with which risk analysis deals are complicated and involve some degree of uncertainty, a complete optimization of technology cannot be expected, and the rationality of risk analysis must be a “bounded rationality.” This might remind us of the well-known conflict between cultural relativism and naive positivism. In this chapter, however, I take a different path that avoids this conflict, i.e., avoids both under- and overestimating risk analysis. Instead, I focus on the problem of the acceptability of risks.
As an introduction to the discussion that follows, let us consider a statement by E. S. Ferguson. In “Engineering and the Mind’s Eye” (1992), discussing computer-aided design (CAD), he observes that “numerical calculations always embody human judgment”:
The precise outcome of the [design] process cannot be deduced from its initial goal. [...] Computerized illusions of certainty do not reduce the quantity or the quality of human judgment required in successful design. To accomplish a design of any considerable complexity [...] requires a continuous stream of calculations, judgments, and compromises that should only be made by engineers experienced in the kind of system being designed. (Ferguson, 1992, 37)
We tend to distinguish traditional techniques, supported by human expertise and skill, from modern technology, supported by science. Such expertise and skills, which are usually not visually or verbally articulated, are supposed to have been replaced by or translated into scientific knowledge. In reality, however, they are not entirely removed from modern technology (hereafter referred to simply as “technology” unless otherwise indicated). As the case of CAD shows, they remain as constitutive elements, even though they are partly objectified and thoroughly modified in modern technological procedures. Ferguson calls this kind of knowledge the “mind’s eye” or “intuitive sense.” At first, this “mind’s eye” seems purely personal in nature. When analyzed from a reflective viewpoint, however, one can identify a cultural “style” strongly connected to it, because a calculation or judgment is made on the basis of accumulated tacit information and tacit understanding. It is therefore possible to say that certain cultural elements are incorporated in technology. If technology considered as existing within a social and cultural context is characterized as “technology in culture,” then the cultural elements incorporated in technology can be characterized as “culture in technology.” I will refer to these two cultural aspects of technology as “technical culture” in the wide and the narrow sense, respectively (this distinction will be indicated explicitly only when necessary).
From this perspective, we can discuss the problem of the acceptability of risks within a cultural context without denying the need for scientific analysis. The issues to be addressed include the following: how a particular risk comes to be recognized as a risk; how some risks come to be considered acceptable in a society; and in which cases people regard such acceptance as reasonable. Studying the acceptability of risk from this perspective, I seek in this chapter to situate the problem of risk within the “ethos of technology” and thereby to find answers to practical and ethical debates regarding technology. In this way, the technical culture of a society, or of an organization, can be discussed critically, paving the way for an inquiry into the public nature of technology.
In section 2, I review the Challenger space shuttle accident in order to discuss the notion of acceptability more concretely and to show that it is deeply rooted in technical culture (in the narrow sense). In sections 3 and 4, I generalize this notion to technology as a whole and indicate that the reliability of technology depends on that of technical culture. In section 5, I focus on technology in culture, i.e., technical culture in the wide sense. Based on an examination of the Ford Pinto case, I argue that the definition and reliability of design concern not only engineers but also society at large. Finally, in section 6, I further explore the notion of the public determination of technology. Highlighting the limitations of technological design and the engineer’s responsibility, I suggest the possibility of a narrative ethics devoted to the improvement of design culture, or of technical culture in general.
First, let us examine the explosion of the space shuttle Challenger in 1986, a standard case in textbooks on the ethics of technology. The Challenger exploded shortly after lifting off from the Kennedy Space Center, killing all seven crew members aboard. In the ensuing investigation, the O-rings that seal the joints in the shuttle’s solid rocket boosters were identified as the direct cause of the accident. Textbook descriptions typically identify two points: 1) Roger Boisjoly, an engineer with Morton Thiokol, the firm that manufactured the boosters, had previously identified this problem and reported the risk to his supervisors; indeed, on the night before the launch, he had recommended that the mission be delayed. 2) He was ultimately overruled by a management decision that was eventually responsible for the accident. In other words, the responsible behavior of Boisjoly, who doggedly continued to raise the problem, and the actions and attitudes of the Morton Thiokol and NASA management, who prioritized the schedule and proceeded with the launch even though they were aware of the risk involved, can be depicted as the “professional ethics of engineers” versus the “logic of management.” This analysis raises ethical issues regarding the responsibility of experts, honest and unbiased inquiry, reliability, and the conflict between engineers and their organizations (e.g., Harris et al., 1995, 41f.).
However, the ethnographic research of the sociologist Diane Vaughan (1996), who carefully reviewed the extensive testimony of individuals involved in the accident, and the subsequent discussion by Harry Collins and Trevor Pinch (1998) based on that research, raise different issues.
To avoid any misunderstanding, it should be noted that the Morton Thiokol and NASA engineers were not unaware of the risk surrounding the joints. On the contrary, they were well aware of the problem and had dealt with it for a number of years. However, as Vaughan and others have pointed out: a) what they sought was not absolute certainty but an “acceptable” solution. Complete sealing would require unlimited time and expense, and even if it were achieved, the stability and safety of the entire system would still not be ensured unless the seal were properly integrated with the other parts. In general, technology invariably involves some incompleteness, since it depends on various factors and on deviations arising in particular situations. Determining which of these factors or deviations is decisive at a given moment is possible only within a system of experience and knowledge. In this case, the engineers of NASA and Morton Thiokol, who partly shared common views based on a common intellectual “horizon,” decided to “go ahead” with the launch because the effects of the O-ring damage were within workable limits owing to redundancy.

In addition, b) conflicts between the technical opinions of engineers are, by their nature, normal, and whichever of the conflicting views is considered valid from the perspective of this intellectual horizon is generally deemed the “winner.” Boisjoly and his colleagues were unable to present persuasive data on the reduction in the elasticity of the O-rings at low temperatures; moreover, their data analysis was rife with inconsistencies. The engineers of Morton Thiokol and NASA therefore concluded that the opinions of Boisjoly and his colleagues were not supported by adequate data. In other words, their opinions lacked the validity required, under the conditions that a technical argument at NASA had to fulfill, to reverse a decision.