6 Unintended Results and Public Nature

As mentioned in the previous section, design can be considered a process of stipulating target functions. Given that technological design embodies social needs and relationships and that it creates a new social order (see the examples given above),68 one could say that designing artifacts means simultaneously designing and defining the order of our world. In a sense, it resembles a “legislative act” (Winner, 1986, 29). However, the power of this “legislation” is limited, since one cannot presuppose the perfect predictability or the analytical separability of means and ends. Moreover, identifying objectives with “the intent of the designer,” and the design process with the implementation of that intent, is problematic. As the discussion above shows, which items are established as objectives, what is emphasized in the process of design, and what is treated as secondary are all dictated by culture, that is, by routine knowledge that is often taken for granted. This bears strongly on the assessment of the uncertainty and incompleteness of technology.

First, besides directly intended objectives, there may be latent secondary intentions that cause unexpected results. For example, a designer may, without intending it, design an artifact primarily for non-disabled people; the artifact might then be dangerous for disabled people and leave them feeling discriminated against.

Second, the results of technology are never purely the intended ones; they are accompanied by numerous effects and side effects. Technology exceeds the intent of the designer, producing unintended and unpredictable by-products. In the words of Tenner, technology “bites back” (1996). The results of technology cannot be controlled completely. In risk analysis, the problem of side effects is often addressed by insisting on a “risk trade-off,” i.e., comparing the probability and weight of a target risk with those of the countervailing risk that would take its place, and determining on that basis whether an action should be performed. However, which effects of technology should be valued can only be determined within a cultural and social context.

Third, changes in the context incorporated in the design, and in the significance of the technology itself, brought about by transformations in lifestyle due to technology and other factors, are also important. As Don Ihde states, all technologies are double-edged because they have “ambiguous, multistable possibilities” (1999, 44) that exceed the intent of the designer. He terms this phenomenon the “designer fallacy,” modeled on the intentional fallacy in literary criticism. Such instances change the criteria by which the risks and features of a technology are assessed.

Therefore, the question arises: Who should be responsible for this decision? Since no one can fully manage technological uncertainties, the question of what overall benefits a particular technology produces should not be assessed paternalistically and decided solely by engineers. Rather, it should be determined in public, from a wide range of perspectives rather than a narrowly technical one. In this case, engineers cannot possess all the rights and responsibilities; the perspectives of non-engineers must be incorporated. This is the reason Shrader-Frechette (1994, 94) advocates the principle of “giving priority to third-party or public responsibilities in situations of uncertainty.”

At the beginning of this chapter, I mentioned “culture in technology”; the existence of such a system of experiential knowledge, however, implies that it serves as a barrier preventing the participation of people who do not share it. It must be admitted that in our present society, experts hold a monopoly on technological matters, and there appears to be an asymmetrical relationship of dominance and subordination between experts and laypersons. However, such a culture cannot remain closed, either as a matter of fact or as a normative demand.

On the one hand, as claimed in risk theory, experts have noted the “risk-perception bias” of laypersons. Here, experts often appeal to “literacy” in the sense of the capacity to understand science and technology. The thought is that unbiased acceptance is possible only by redistributing knowledge, i.e., by educating the public and enabling them to understand modern science and technology “correctly.” On the other hand, if this barrier is disregarded, participation in discussions remains at most a formality for obtaining consent. As this discussion shows, the situation is rather one of “cultural friction”: because the systems of relevance of experts and non-experts differ, matters that non-experts consider problematic are not viewed as problems by experts. What is needed in the first place, therefore, is “literacy” on the engineers’ side: literacy in the sense of a competency in understanding and responding to the questions raised by laypersons. This could be termed the engineers’ “responsiveness” to the public.

To clarify this further, I use the metaphor of a narrative or novel written by many authors, in this case engineers, managers, laypersons, and so on. In this sense, the current master narrative is that of the engineers. What is required is a rewriting of the narrative of design through mutual recognition between experts and non-experts. This means that both parties recognize each other in dialogue as co-authors of the narrative, i.e., as agents with the rights and obligations to ask and to answer (responsibility). Trust, identity (on both sides), and solidarity are founded on such mutual recognition. This, in turn, can serve as a foundation for improving technical culture in general, or what might be called design culture.

7 Conclusion

By focusing on “acceptability,” we can concretely elucidate “culture within technology” and discern technology as a social and cultural activity. In general, the history of technology is not only a history of creations and choices but also a history of the acceptance of some and the oblivescence of others. Various decisions, interpretations, and valuations are embedded in the history of technology; they are sedimented and taken for granted. In a sense, technology is a narrative told by many people, including laypersons, and technological activities are conducted on this historical basis. For example, the reliability of a technology is determined by the reliability of the technological decisions behind it and, ultimately, by the existence of a reliable technological culture. Particularly in organizations, therefore, reliability depends on cultural and social relations; the same can be said of risk.

We shall undertake a detailed discussion of this issue in the future; with regard to the ethics of risk, however, we can state that the morals of the individual engineer and the moral rules of the engineering profession, though not incidental, are not the only central problems. When designing artifacts, engineers must anticipate numerous effects, side effects, and possible influences. In this context, to be recognized as qualified, engineers must be competent in appropriately understanding and responding to the questions of laypersons. Responsibility, in this sense, is the basis for ethics. On this approach, we can move beyond the opposition between scientifically quantified risk, the alleged bias of non-experts, and the cultural relativism of risks. Thus far, we have emphasized “culture in technology” and “technology in culture”; this does not imply, however, that we should confine ourselves to a descriptive point of view: the improvement of technological culture is called for at every step. Design through mutual recognition between experts and non-experts engaged in dialogue is one such way. Technology and its risks are central to our discussion of human well-being.


68. The problem of technical mediation demands a separate study and is beyond the scope of this chapter. For an example from classical literature, see E. Cassirer (1985): “Tool carries out the same function in the sphere of objects that can be found in the sphere of logic: it is, as it were, a ‘terminus medius’ which is grasped in objective conception (gegenständliche Anschauung), not in mere thinking” (ibid., 61).