Independent of these different views, the empirical observation remains that the practice of creating software rarely resembles top-down engineering models like the lifecycle or the waterfall model, where the process of going from neat requirements to a working program is thought of as a progression through clear-cut stages. The “real world” of software development is most often described as “messy, ad hoc, atheoretical” (Coyne, 1995), as consisting of “bricolage, heuristics, serendipity, and make-do” (Ciborra, 2004), or as the result of “methodological and theoretical anarchism” (Monarch et al., 1997). While this does not automatically make software production “art”, as Paul Graham (2003) suggests, we have to accept that the engineering ideal is just that: an ideal. In practice, software production commonly takes paths that diverge from engineering in various ways. Two important factors have to be taken into account: changing problems and increasing complexity.
First, the problems software is expected to solve are becoming more “cultural” and less “technical.” If computers were still doing what they did in the 1960s, namely number crunching and data storage, there would probably be no discussion about software engineering or design. With computers now performing semantic and social functions, this has changed. Methods like participatory design or end-user development now try to accommodate the fuzziness of software specifications by integrating future users into the construction process.
Second, the complexity of software is increasing rapidly, making it ever more difficult to plan a program in every detail before starting to write code. It is often impossible to foresee problems early on; plans and models have to be changed, tests have to be made, and specifications have to be modified during the construction process. Agile methods like extreme programming and rapid prototyping strive to make complexity more manageable and transform the top-down waterfall into a long series of iterations.
The properties of software, the distribution of these properties into space by means of the Internet, and the changing technological landscape are slowly eroding the modern ideal of a neat separation between technology and culture, between detached rationality and human motivations. This argument is supported by a closer look at the diverse landscape of software production. As an example, we will briefly analyze the open source scene to show how a whole new array of actors, strategies, and practices can emerge in a situation where material cost is no longer a limiting factor.
On one level, the term “open source” refers to a certain way of handling and sharing computer software.78 It implies that programs are not just available in machine code, but also in source code, i.e., in text files written in a programming language accessible to human beings. To qualify as open source, it is essential that the public is allowed to modify and redistribute the product. On another level, the term refers to communities79 built around this notion of openness and sharing that are responsible for a considerable amount of today’s software production. There is now an open source equivalent for nearly every type of program.
The open source scene is rather diverse, but it is possible to sketch a rough ideal type of how it functions. Most importantly, it is impossible to imagine open source without the existence of the Internet. Platforms like sourceforge.net, along with mailing lists and newsgroups, are the tools used to organize and coordinate a globally dispersed and mostly voluntary workforce. A project usually starts with an embryonic program, written by an individual or a group, which is released under an open source license together with an invitation to participate in its development. If it stimulates enough interest, a lively process is set in motion: following the “release early, release often” maxim, versions of the program are regularly published on the Web, where anybody interested can add code, report bugs, and fix them. Which features and fixes are integrated is usually decided by a moderator (a group or an individual), supplemented by a community process very similar to scientific peer review. The linear structure of classic engineering is thus translated into a rapid succession of coding, building, and debugging, where requirements specification, interface design, and user testing are carried out concurrently and are subject to constant change. Collaboration is the main “tool” for tackling complexity. The Internet-based development platforms provide the infrastructure for a project’s representation, for communication between its participants, and for the coordination of bug tracking and code maintenance; they are the media that render possible what could be called a “virtual factory” where a diverse and dispersed public channels its collective intelligence.
The open source scene also distinguishes itself from traditional engineering in its social norms and general mindset. Mathematical rigor is valued less than an open and involved communication style. As in other (youth) subcultures, the demonstration of skill (rather than diplomas) is the main source of symbolic capital. Inclusiveness, discussion, collaboration, and the open circulation of information are more important than the clear-cut attribution of tasks, positions, and responsibilities.
On an institutional level, the open source scene has become an important element in the socialization and education of programmers. The lively and helpful online communities allow one to get help and learn from individuals who have achieved status based on their contributions to the field. The accessible code landscape and participatory culture of the open source scene make for a powerful learning environment for individuals of all levels of skill. While engineering is traditionally connected to the somewhat authoritarian institutions of school and university, the open source community supplements these forms by offering a learning-by-doing environment based on playful imitation and autodidactic skill acquisition.
To show that open source products are an important part of the software landscape, we will briefly discuss three examples: the Linux operating system, the Apache Web server, and the Internet browser Firefox.
Linux started out in 1991 when a Finnish student, Linus Torvalds, wrote a very basic kernel program, the core of any operating system, as a hobby project and released it on the Web, inviting others to participate. Since then, Linux has developed into a modern, robust, and complete operating system and is now probably the only serious remaining competitor to Microsoft Windows. It is available for free and is constantly maintained and extended by a community of thousands of programmers around the globe. Most Fortune 500 companies now use Linux, as do the metropolitan administrations of Vienna, Munich, and Paris. One reason for this success is cost, but other factors come into play, including reliability, platform independence, and the possibility of fixing bugs directly without having to go through a vendor company.
The Apache project was initiated in 1995 and has since steadily grown to become the dominant Web server application, with a market share of over 52%.80 Open source and available for free, it is developed and maintained under the guidance of the Apache Software Foundation, a non-profit organization that helps to organize the development process, provides legal support for the community, and protects the brand. Linux and Apache, coupled with the free database system
78 We are referring here to the open source definition given by the Open Source Initiative (http://www.opensource.org/docs/definition.php).
79 The