Given these facts, the descriptions provided in the textbooks are extremely simplified, and judging the process from the perspective of its result, i.e., the failure, seems to be mere hindsight. First, the engineers of Morton Thiokol and NASA believed that, despite the uncertainties, the joint was an acceptable risk. Their managerial decision-making was rule-based, i.e., no rule was violated; the launch decision was, so to speak, the outcome of a strict technical discussion (see Vaughan, 1996, 336). Second, there were no absolute criteria for the validity of technical knowledge, i.e., the validity of technological knowledge depends on the situation; in other words, technological knowledge is situated in nature. Third, although a “technical culture” shared by engineers typically determines the nature of the technical discussions concerning the validity of technical knowledge, irrespective of the existence of biases, this technical culture, or culture in technology, is often taken for granted. As a cognitive basal stratum, certain systems of experienced implicit (and explicit) knowledge form part of this culture, and it was on the basis of this technical culture that the engineers reached a consensus on what was acceptable. Following this line of argument, Vaughan stated that “the launch decision resulted not from managerial wrongdoing, but from structural factors that impinged on the decision making, resulting in a tragic mistake” (Vaughan, 1996, 335). Clearly, however, these “structural factors” do not refer to the physical structure of the space shuttle; rather, they refer to NASA’s organizational culture. As the above discussion shows, although the Challenger case initially appears to be a moral issue of engineers, at its core it is an issue concerning the sanity of technical culture.66
Such a determination of the acceptability of risk on the basis of technical culture is typical of technology in general. In other words, it is specific neither to technology accompanied by enormous risk and uncertainty, as in the case of the space shuttle Challenger, nor to the design process of technology. In fact, a culturally, or experientially, dependent nature is a fundamental characteristic of technical knowledge. Very similar situations can also be observed in more established technologies and in the management and operation of technical systems. In these cases, cultural determination operates not through technical discussions and calculations but through practical human-artifact relationships. Above all, embodied tacit knowledge plays an important role here.
For example, in the cockpit of an aircraft, the large control devices seen in the past are now considered outdated. However, during take-off and landing and in emergency events, the presence of several people working close together can be extremely significant for handling the situation and sharing the burden of making appropriate decisions. With a large control device, for instance, the pilot’s action of lowering the landing-gear lever is noticed almost subconsciously by the copilot, who is thereby informed of what the pilot is doing to control the aircraft. Such an “awareness of the situation” obviously serves to develop natural communication between the pilot and copilot. In this example, the mechanical control serves as the medium for a message; the synchrony of intersubjective communication and action through mechanical media, training, and teamwork therefore permits the smooth operation of the overall system (Norman, 1993, 139 ff.).
This case reveals that the human aspect of a technological system, which remains latent in ordinary situations, becomes evident in emergencies. In current engineering practice, the involvement of humans in mechanical systems is generally believed to cause human error; therefore, it is preferred to keep human involvement to a minimum. At the same time, however, humans are indispensable for rectifying the problems and errors that constantly occur. Humans, in a sense, use artifacts and one another as extensions of their knowledge system, or rather of their own body. In fact, one could say that a technological system is created through the interaction of humans and devices (cf. Hutchins, 1995; Norman, 1993). Thus, when increased workload or a decline in proficiency degrades human reliability, effects that automation itself tends to produce, automation through machinery does not increase the safety and reliability of the human-artifact system. Lisanne Bainbridge termed such situations the “ironies of automation” (Bainbridge, 1987).
Humans design, produce, and manage complex systems. Thus, when a major accident occurs, the individuals who made the mistakes are often held responsible. Engineers’ morals and their self-awareness as professionals are then appealed to as the remedy, although these morals, and the types of behavior they comprise, presuppose human beings acting rationally in pursuit of optimality (cf. Renn et al., 2001). The problem, however, is that the vast majority of knowledge has become routine; even if this knowledge was once accompanied by careful consideration, it is no longer perceived as such. Nonetheless, acts are carried out in accordance with the knowledge “in hand” (Schutz, 1970), and therefore we are usually unable to identify “dis-situated” or disembodied subjects. Moreover, such knowledge is difficult to deal with, because unless one adopts a retrospective viewpoint by asking “why,” it is never thematized as such (Schutz, 1970). Such knowledge allows the smooth and reliable operation of a system; however, it also carries the possibility of reducing the reliability of the system in certain respects, such as safety and product quality. The reliability of a system thus depends on the reliability of the technical culture. In this context, James Reason pointed to the “latent conditions” in an organization that induce errors, such as unsuitable design that neglects human factors and inadequate direction, and accordingly proposed the concept of “organizational accidents” (Reason, 1997). Again, the issue here concerns the improvement of culture and organization. The nature of culture, i.e., embodied knowledge, and the nature of the corresponding designs, organizations, and systems will therefore be examined in the next section.
Let us return once more to the example of the Challenger accident. With regard to the launch decision, Collins and Pinch merely observed the familiar scenario in which “one opinion won and another lost”; the engineers “looked at all the evidence they could, used their best technical standards, and came up with a recommendation” (Collins and Pinch, 1998, 55). However, the conclusion that everything possible was done cannot be drawn from such a description of the situation, i.e., from who won or lost the debate. Such a discussion is merely a kind of afterthought and relativism. To decide what is right or wrong, one must delve further into the situation itself. Vaughan, as cited previously, pointed to the “normalization of deviance” among the structural factors that caused the accident. In the Challenger accident, no explicit infractions were necessarily committed; rather, an activity that could be considered natural in the organization was responsible for the accident. In this case, since the criteria that a discussion by the engineers had to fulfill were rigidly applied, there was little scope for recognizing any such deviance, yet this very fact encouraged the decisive situation. We can therefore proceed to a discussion of normativity in technical culture.
66 M. Davis, for example, insists that there was “wrongdoing” (self-deception) in the attitude of R. Lund, Vice President of Engineering at Morton Thiokol. Lund had initially supported Boisjoly’s position; however, during the pre-launch caucus, he changed his mind following the advice of J. Mason, Senior Vice President at Morton Thiokol: “It’s time to take off your engineering hat and put on your management hat” (Davis, 1989). In her detailed analysis, however, citing the evidence presented in the caucus by Thiokol Vice President J. Kilminster and others, Vaughan describes Mason’s decision as typical of cases in which engineering disagreements could not be resolved by data that drew everyone to a consensus: “Someone has to collect that information from both sides and make a judgment” (Vaughan, 1996, 315 ff.). If this was the case, then although Lund undoubtedly found himself in an extremely difficult position, it seems difficult to regard his decision simply as an act of neglecting his loyalty toward engineering and replacing it with the logic of management. On this view, it would be possible to argue that this is not an issue of personal morals but rather one of structure.