
Communications of the ACM

Historical reflections

Five Lessons from Really Good History


Figure: Covers of the four books discussed in this column.

My last column (September 2012) explored the lessons to be found in "bad history" of the invention of email. Writing it reminded me just how little many people whose work focuses on information technology know about its evolution over the past 50 years. In this column, I look at some of the very best historical writing about computing from the past few years. I highlight one big lesson from each of four books, giving four ways in which learning more about history can change the way you think about computing. A bonus lesson sums up what the books tell us about the field as a whole.

You do not have to take just my word on the "very best" part of the preceding description. The four books are the first winners of the Computer History Museum Prize, given each year to the author of an outstanding book on the history of information technology. The prize was created in 2008 by SIGCIS, the organization for historians of computing, with money pledged by Paul Baran, networking pioneer and inventor of packet switching. Like many other pioneers, Baran took a keen interest in preserving and documenting the heritage of his field. He was a strong supporter of the Charles Babbage Institute, the leading academic and archival center for the history of computing, and a fellow and advisory board member of the Computer History Museum, in whose honor he suggested the name of the prize. When Baran died last year, he left instructions for a gift of $25,000 to SIGCIS to endow the prize in perpetuity.

One of the rewarding aspects of working on the history of computing is that even cutting-edge research is potentially accessible to a broad audience. Historians make an effort to write clearly, at least compared to a typical technical paper in computer science, and we generally do our best to avoid technical jargon. I highly recommend you include one or more of these books on your reading list.


1. Making Stuff Creatively Made Silicon Valley Creative

In Making Silicon Valley: Innovation and the Growth of High Tech, 1930–1970 (MIT 2006, CHM Prize winner 2009) Christophe Lécuyer tackles one of the most familiar stories in the history of computing: from the invention of the transistor at Bell Labs, through William Shockley's creation of a company in California to exploit his invention and the founding of Fairchild Semiconductor by refugees from his erratic management style, to the founding of Intel by some of the same people. This is the creation myth for computing in Silicon Valley and has been told and retold by journalists and biographers over the years. It explains how, over the course of a single working lifetime, transistors went from sizable handmade blobs sold at prices only the military could afford to microscopic metal smears so cheap that we package millions of them into singing greeting cards and other disposable fripperies.

Scholarly and journalistic histories generally rely on different kinds of evidence. Journalists tend to shun endnotes and conduct their research largely by interviewing people. Scholarly historians place great emphasis on finding original written documents from the time in question. If handed a new book close to his or her own research area, a historian will often go first to the endnotes, thumbing the back pages to evaluate the range and appropriateness of the archival sources used before looking at the main text.

Lécuyer's book is a great example of the depth of insight and detail this approach can provide. He broadens the story to encompass less widely celebrated firms, such as Litton Industries, National Semiconductor, and Varian Associates, and pushes earlier in time to document the importance of radio component manufacturing to the Valley. The book is based on careful research in the preserved archival records of the people and companies concerned, rather than the recycled anecdotes often used by journalists. While giving due credit to the importance of military sponsorship and Stanford University, he puts the development of a pool of skilled labor and amateur electronics enthusiasts at the heart of the Valley's success.

More than anything else, Lécuyer's careful accumulation of detail shows that the early success of Silicon Valley was based on innovation in manufacturing techniques and processes, so that the design and production of its products were closely coupled. This makes one wonder how well the physical and organizational separation of the two now practiced by Apple and other modern firms will sustain long-term innovation.


2. Computing Was Built at the Intersection of Many Other Fields

Perhaps the most important choice facing a historian is deciding what the book he or she is writing is really about. History is a kind of storytelling, and stories have protagonists. These protagonists might be specific individuals, as in biography, but they might also be technologies, ideas, companies, occupations, groups of people, countries, or even the entire world. There is also the question of when to start the story and when to stop, as it is rarely possible to cover the entire lifespan of the protagonist.


The topic of Atsushi Akera's book Calculating a Natural World: Scientists, Engineers, and Computers During the Rise of U.S. Cold War Research (MIT 2007, CHM Prize winner 2010) is difficult to sum up in a single sentence, which is deliberate on his part. Like Lécuyer, Akera brings a new perspective to one of the best-known stories in the history of computing: the creation during the 1940s of the programmable electronic computer, initially as a scientific instrument, and its rapid spread into universities, companies, and government agencies over the subsequent 15 years. One long chapter is a biography of John Mauchly, a creator of ENIAC (remembered by historians as the first useful and flexibly programmable electronic computer). Other chapters explore topics as diverse as IBM's drive to sell its equipment to the new market of corporate computing centers, the role of the SHARE user group in creating programming as a new occupation, and the connection between work on timesharing operating systems in university computer centers and the emergence of computer science as an academic field of study. These choices reflect in part the availability of archival source material, but Akera's shifts of focus and topic from individuals to institutions and technologies are also supported by his choice of the "ecology of knowledge" as an analytical framework. Put simply, this suggests that early computing developed as it did only because people with skills of many different kinds converged for reasons of their own around this new kind of technology. Because computer science, and computing more generally, emerged through the interactions of different kinds of experts and institutions, we need to understand the entire intellectual "ecosystem" rather than fixating on any one part in isolation.

If that sounds daunting, you might want to skip the introductory chapter. But the various stories told in the book are simply written and well researched, and the strength of Akera's holistic approach is made tangible through many unexpected insights. For example, we learn that Mauchly flitted from topic to topic in his early career, trying to turn his Ph.D. in molecular physics into a stable research career in the dreadful economic climate of the 1930s. He approached computing through statistics, meteorology, and tinkering with electronics. Akera shows how this complex background influenced his design for ENIAC.


3. Scientists Know the World Through Computers

Like most other academics, when historians of computing get together we tend to bemoan the tendency of the world to completely ignore our groundbreaking work addressing vital issues. We then go back to our studies and spend years writing narrowly focused, painstakingly researched books of intense interest to a few dozen of our colleagues. Of the thousand or so copies published by a major academic press, a couple of hundred are given away as review or prize submission copies and the rest, once purchased, usually languish unread on the shelves of the ever-dwindling number of libraries that can still afford to err on the side of completeness. The problem is that writing a book the wider world might actually notice takes a lot of work. It is not easy to tackle a big topic, or to articulate the relevance of historical work to present-day debates without falling victim to what historians call "presentism" (misinterpreting historical events in the light of present-day knowledge or perspectives).

The third CHM Prize winner, A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming by Paul Edwards (MIT 2010, CHM Prize winner 2011), is an outstanding example of the potential for historians to contribute to broader public debates and to give non-specialists insight into the work done by scientists and the process by which computer simulation has transformed scientific practice. Edwards tackles one of the most politically polarizing topics in U.S. science today: the connection of climate models to the real world. Without computers we could calculate average temperatures and plot trends, but only computer models can separate underlying climate trends from local or random fluctuations, project their future course, or test explanations of the physical processes at work against the underlying data.

Our traditional idea of science, embraced by many scientists, is that scientists collect objective observations about the world and then formulate theories to explain them. Scholars in the field of science and technology studies, in which Edwards is trained, have instead stressed that nothing can be perceived except through one or another set of theories and assumptions. In climate science, as in much modern science, data points from the natural world become knowledge of a kind that can support or challenge a theory only after they are processed in computer models. These models are themselves based on theories. Thus, as Edwards succinctly puts it in an introduction aimed at general readers, "without models there are no data." This is not to say that Edwards is content, as some earlier radical scholars in science and technology studies were, simply to establish that the scientific knowledge he is examining was "socially constructed" and exit in triumph. We have arrived at an odd moment where this strategy, once associated with the academic left, is now a mainstay of the political right.

Instead, Edwards dives into decades of history to show the slow process by which these models were developed and to explore their relationship to technological change. The first computerized weather forecasts were made in 1950 using ENIAC, by members of a team sponsored by John von Neumann. However, even this was made possible only by human networks to record and consolidate weather observations. Since then, ever more powerful network, sensor, and computer technologies have been used to construct a global "information infrastructure" to collect climate data and drive ever more complex models of weather forecasting and climate change. He focuses particularly on the work needed to integrate information from different sources and the "data friction" technology imposes on its flexible use.

Edwards believes the public should understand how the "sausage" of scientific knowledge is made, to better understand its strengths and weaknesses. The fact that scientific knowledge is created by social processes and with simulation techniques does not mean all ideas about climate change are equally valid or that scientific knowledge has no special reliability. His success in this mission was confirmed when The Economist named A Vast Machine as one of just six "Books of the Year" in science and technology for 2010. It was the only one about computing.


4. Computer Technologies Are Always Political

Eden Medina's book Cybernetic Revolutionaries: Technology and Politics in Allende's Chile (MIT 2011, CHM Prize winner 2012) takes a close look at the period from 1971 to 1973 when the British operations research specialist Stafford Beer was hired by Salvador Allende's short-lived democratic Marxist government in Chile to implement his newly developed vision of cybernetic control. The boldest version of this Cybersyn project imagined traditional political control replaced entirely by a new system in which decisions were influenced to the greatest possible extent by the input of ordinary citizens, industrial production was organized with the help of constantly updated computer models of the entire national economy, and decisions were based on information rather than bureaucratic self-interest. Beer modeled his plans on an abstracted view of the human nervous system, in accordance with the central idea of cybernetics: that artificial, natural, and living systems are all governed by conceptually equivalent processes of feedback and control. Cybernetics originated in the 1940s with close ties to early work on computers and the support of many of America's brightest minds across a range of disciplines. By the 1970s it was still fairly prominent in popular culture but was already sliding into obscure eccentricity within science, as researchers favored more focused disciplinary approaches such as artificial intelligence and cognitive science over the ostentatious universality of cybernetic theory.

Beer and his collaborators never came close to fulfilling their grand vision, though they did produce some economic models of little practical use and an "operations room" with an obvious debt to the bridge of the Starship Enterprise. As the revolution crumbled under a U.S. economic blockade and a series of strikes, their most practical contribution to its defense was the national telex network, low tech even by the standards of the 1970s, which proved useful for centralized control of emergency responses. Beer himself was changed by his experiences in Chile, devoting himself to fixing the world rather than making money. The urbane lover of fine living gave up his Rolls-Royce to spend much of his later career as a mystic, living simply in a primitive rural cottage.


This would seem to offer rich material for a farce, or perhaps a tragicomic opera like those featuring talk show host Jerry Springer and Canadian Prime Minister Brian Mulroney. To her credit, Medina avoids mockery while doing justice to the gripping weirdness of the story. She puts the Chilean experience center stage, examining tensions between Beer's vision and the agendas pursued by his various hosts and collaborators. Medina's heart is open to the hopes for a better world that drove Allende's revolution and the faith her characters put in Beer's approach, but she is not shy about speaking up when she catches them exaggerating its actual accomplishments or making contradictory statements. One contribution of her work is to remind us that computer technology has been in use outside the U.S. and Western Europe for a long time, and that its history in the developing world may follow a quite different path.

To me, the most fundamental lesson is that all technology is political and most new approaches to computing are promoted with utopian fantasies that later come to seem embarrassing. We instantly recognize the political nature and unhinged ambition of the Cybersyn project because they are alien to our own experience in wealthy countries during an economically liberal era. But, as I have discussed elsewhere, a similarly impractical vision of gigantic, real-time systems incorporating forecasting models was a mainstream part of corporate America in the 1960s.1 Even the science fiction control room idea was already established in the business press.2 Likewise, the banking industry first embraced the idea of the "cashless society" almost 50 years ago, but it still retains a futuristic allure. You may also remember all the predictions that the Internet would transform politics, revitalize democracy, and solve the problems of U.S. education. Snake oil, utopian dreams, and science fiction narratives have played a much more important role in the adoption of information technology than we would usually like to admit.


5. The History of Computing Is Maturing

This kind of prize plays an important role in the development of a field. By honoring excellence, it helps to shape a canon of exemplary work and to build a consensus on topics and approaches of central importance. So what can we learn about the history of computing by looking at these books together?

One thing that jumps out is just how far the field has developed from its earliest days in the 1970s. The history of computing used to focus on the history of computers themselves. While many scholars continue to look closely at particular machines, such as ENIAC, there has been an unmistakable shift from hardware to applications and from narrow technical histories to broad portrayals of technologies in their social contexts. These books, in particular, demonstrate the rewards of tackling big topics of fundamental importance, such as the rise of Silicon Valley or the spread of computer use within scientific practice.

There has also been a shift in the kinds of people telling the stories. Early activity on the history of computing was driven by computer scientists and pioneers such as Herman Goldstine, Brian Randell, Bernie Galler, Donald Knuth, and Nick Metropolis. In contrast, all four prize-winning authors discussed in this column have Ph.D.'s in some variety of science and technology studies or history of technology. Two also hold degrees in computer science or electrical engineering. Three hold faculty positions: one in a department of science and technology studies and two within information schools. None are appointed primarily in history departments or history of science programs.

This hiring pattern reflects the openness of other disciplines to historical scholarship in computing, but it also has a negative impact on the development of the field, as most of the best scholars have limited opportunities to teach in their specialist areas or to train doctoral students in historical research. I recently learned that Medina's book is also the first history of computing book to win the annual Edelstein Prize from the Society for the History of Technology, which is indisputably a good sign for the recognition of work on computing by other historical specialists. Hopefully this column and other historical activities of the ACM and IEEE can maintain a similar connection between computer people and historians. I like to think that Paul Baran would have approved.


References

1. Haigh, T. Inventing information systems: The systems men and the computer, 1950–1968. Business History Review 75, 1 (2001), 15–61.

2. Widener, W.R. New management concepts: Working and profitable. Business Automation 15, 8 (1968), 28–34.


Author

Thomas Haigh ([email protected]) is an associate professor of information studies at the University of Wisconsin, Milwaukee, and chair of the SIGCIS group for historians of computing. A guide to other outstanding historical work is at http://www.sigcis.org/resources.


Copyright held by author.
