
Communications of the ACM

Historical Reflections

Becoming Universal


Figure. Computers in clouds (illustration). Credit: Getty Images

How to fit the history of computing into a book that can be picked up without needing a forklift truck? That was my challenge in writing A New History of Modern Computing5 (hereafter the "new history") with Paul Ceruzzi. My previous book, ENIAC in Action,6 explored a single computer. Now we had to tell the story of billions of them, drawing on the work of an ever-expanding research community to help us find a story hiding among all the model numbers.

I should be clear up front that this is an academic history of computing. Trade books are the ones that get stocked in bookstores, reviewed in newspapers, and so on. Their editors will select and rewrite manuscripts with a mass audience in mind. Trade publishers appear to have decided, perhaps correctly, that the only way to sell books on the history of computing is to stuff them with people and stories that readers already know about while nevertheless insisting they are tragically forgotten. Their books feature a lot of Charles Babbage, Alan Turing, and other "geniuses." They obsess over the question of the "first computer" and spend a lot of time in the 1940s laboriously weighing evidence for the primacy of one invention or another before awarding the crown. Once computers are invented, their authors lose interest in them. Popular histories that make it out of the 1940s tend to repeat the focus on invention with later innovations—the first personal computers, the first Web browser, and so on. In recent years the more forward-looking authors, like Walter Isaacson, whose book The Innovators now dominates the market, have taken pains to include a few women geniuses, like Ada Lovelace, alongside the men.7

In contrast, academic titles typically have print runs of a few hundred and consequently earn little to no money for anyone involved. Most are written by professors, whose salaries come from other sources. Academic books are peer reviewed, a process that focuses more on accuracy than on snappy prose. Most trade authors do little research in archives and thus tend to repeat and embellish the mistakes of their predecessors. In contrast, academic histories often emerge from doctoral dissertations involving years spent with dusty boxes full of old papers. Their authors look for unfamiliar topics but usually tell a narrow story in detail and focus on technologies, institutions, or cultures rather than heroic protagonists. Over the past decade dozens of exciting new academic histories of computing have been published. By drawing on them we could weave a big tapestry based on the research of an entire community.

Figure. Computer circuits were shrinking long before Moore's Law was defined. In 1962, four computing staff at the Ballistics Research Laboratory posed with one digit of storage from each of four computers installed there, beginning with ENIAC in 1947.

The two most important previous attempts along these lines were made about 25 years ago: Martin Campbell-Kelly and William Aspray's Computer: A History of the Information Machine1 appeared in 1996 and was followed two years later by Ceruzzi's book.3 Both books were updated, Ceruzzi's in 2003 with a bonus chapter and Campbell-Kelly and Aspray's most recently in 2014 with additional coauthors.2

Our new history began as a new edition of Ceruzzi's classic, but we soon agreed on a more fundamental reimagining. We drew up the outline (chapters, headings, and topics) we would use if writing a new and ambitious overview history of computing from scratch. Our goal was to connect the dots between the detailed studies to bind together the histories of computer users and applications with the invention and adoption of new kinds of hardware and software. Then we looked through Ceruzzi's existing book to see which topics were already covered and began to reassemble bits of legacy text into the new structure. A lot of our topics were not in the old book at all. Even when we could reuse legacy material we condensed and reshaped it. In his original book, Ceruzzi used a lovely quote from Mark Twain, "Very few things happen at the right time, and the rest do not happen at all. The conscientious historian will correct these defects."

This column describes the insights we used to structure the new history. Some are ideas widely shared in the history of computing community; others are quirkier.


You Have to Start It Somewhere

We start in 1946, when the front page of the New York Times featured a computer called ENIAC built at the University of Pennsylvania. That report spread both the idea of electronic computation and a visual impression of the computer as something festooned with lights, switches, and wires. ENIAC wasn't the first computer, but it embodied an unprecedented conjunction of electronic speed and programmability. The interactions of its designers with John von Neumann led directly to a conceptual design for a successor—EDVAC—that set out the key architectural features of most subsequent computers. ENIAC also precipitated the computer industry. Before ENIAC was even delivered to the U.S. Army's Ballistics Research Laboratory, its main designers, John Mauchly and J. Presper Eckert, had set up a company to market what eventually became the Univac 1 computer.


The Computer Became Universal Gradually

Many histories of computing written by computer scientists or mathematicians have positioned actual computers as physical instantiations of the "universal machine" described by Alan Turing in the 1930s. To them almost any machine able to branch could, with sufficient time and storage, duplicate the work of any other computer. In that sense, even ENIAC was universal back in 1946. From the viewpoint of practice, however, things look very different. ENIAC had a writable memory of only 200 decimal digits, was challenging to reprogram, filled around 2,000 square feet, and cost (allowing for inflation) many millions of dollars. It could output results only by flashing lights and punching cards. It ran numerical computations and simulations, mostly for military and atomic projects. In any meaningful sense it was a highly specialized machine. Today, more than half of the people in the world own smartphones. They are universal in a practical sense because we use them for almost everything imaginable. Other computers, from supercomputers and cloud systems to embedded microcontrollers, collectively carry out a still broader range of tasks.



Plenty of other technologies are just as widespread. Electric light, for example. Gears, levers, and wheels appear even more widely as components of machines of many kinds, though they themselves do only one job each. Neither is it unusual for technologies to become more general over time. The steam engine, for example, spent decades pumping water out of mines before being reconfigured to move trains and power factories. But the computer's journey from being a 1940s-era scientific instrument akin to a cyclotron to a spectacularly flexible general-purpose technology seems unique in the history of technology. That big transformation is the result of a series of smaller transformations.


Sustained Exponential Growth Does Not Happen by Itself

Writing the book drove home the impact of exponential growth in the affordability of computing power when sustained for decade after decade. To put an iPhone next to ENIAC is to feel as if, in the famous parable, someone really did figure out how to put the entire global harvest of rice onto a single square of a chessboard. This phenomenon is often associated with Moore's Law, though that misses the fact that the process was under way in the 1950s and 1960s before computers relied on integrated circuits. In 1962, the U.S. Army Ballistics Research Lab posed four of its computer staff to celebrate 15 years of miniaturization, as shown in the photograph on the first page of this column. Patsy Simmers, on the left in the photo, holds an ENIAC module that used 28 vacuum tubes to store a single decimal digit. The other three staff members hold modules with equivalent or greater memory storage on three subsequent BRL computers. Having lifted one of those ENIAC modules myself, I admire Simmers' smile and stance in the Army photo.
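A quick back-of-the-envelope sketch of the parable's arithmetic (my illustration, not a figure from the book): doubling a single grain of rice on every square of a chessboard yields

\[
\sum_{n=1}^{64} 2^{\,n-1} \;=\; 2^{64} - 1 \;\approx\; 1.8 \times 10^{19} \text{ grains},
\]

with the final square alone holding \(2^{63} \approx 9.2 \times 10^{18}\) grains, far more rice than the world harvests in a year. Sustaining that kind of doubling across decades of hardware is what the comparison between ENIAC's 200 digits of storage and a modern phone makes tangible.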

There was nothing natural about this progression. As several observers have noted, the seemingly scientific nature of Moore's Law hides the vast amounts of money, human labor, and incremental innovation that were needed to keep increasing transistor densities.10,11 Now that heat-dissipation issues, physical challenges, and the enormous cost of constructing next-generation chip manufacturing facilities have slowed that advance, at least temporarily, we can better appreciate how exceptional this era was. The rapid proliferation of ever more powerful computers was not just an inevitable playing out of the universality inherent in a Turing machine.


Computers Have Dissolved Many Other Technologies

Ceruzzi brought to our discussions the interesting metaphor of the computer as a "universal solvent." That invokes a mythical liquid able to dissolve any substance. A very partial list of the technologies dissolved by electronic computers would include adding machines, mechanical control systems, typewriters, telephones, televisions, music recording systems, paper files, amusement machines, film-based cameras, videotapes, slide projectors, paper maps, letters, and newspapers. All have been largely displaced, though some, like vinyl records and Polaroid cameras, have been revived by enthusiasts. Others, like televisions and video players, still exist as distinct classes of machine but their internal workings have been replaced with computer chips running specialized software.

This process depended on computers becoming cheaper, faster, and more power efficient but it also involved the creation of other technologies, including sensors and screens, and the development of new algorithms for things such as compressing audio and video information. Our new history pays attention not just to the boxes most people would recognize as "computers" but to the digital media devices and embedded systems that account for most of the computer systems in the world.

Ceruzzi's original book had also started in the 1940s, bucking the tradition of spending early chapters with Charles Babbage, with office machines, or with mechanical calculators. Providing that longer history has become increasingly difficult now that computers have supplanted so many other technologies. Why talk about adding machines but not movie projectors; filing cabinets but not pinball machines; or astrolabes but not telephones? No single book has room to do justice to all those things.



These changes of scale, cost, form, and application make the computer particularly difficult to chronicle. The automobile, for example, was just as important to the 20th century as the computer is to the 21st century. It remains the most expensive consumer product and the one with the biggest influence over our daily lives and urban environments. Its history is rich and complex, entangled with everything from racial segregation to foreign policy. But over the century from 1920 to 2020, the typical car had a relatively stable physical form: a self-propelled box used to move between two and eight people over asphalt at a maximum speed that has approximately doubled, from 40 mph to a (legally mandated) 70 mph or 80 mph. Cars are still built on assembly lines by huge companies. Ford, General Motors, and Chrysler were the "big three" U.S. automakers by volume in the 1920s and in 2021. Cars are still distributed by franchised dealers. A basic car still costs a skilled worker several months of pay. The story of computing offers no comparable continuities. Few, if any, other technologies have changed their scale, dominant applications, and users so often and so fundamentally.


Users and Applications Have Always Driven Computing

Looking at users has helped historians broaden the history of computing beyond the traditional focus on inventors. Sometimes the users are individuals, sometimes large organizations such as NASA or the IRS. When looking at the business use of computers, we explore patterns of gendered labor around them, as new jobs were created from keypunch operator to systems analyst. We also explore the experience of people using home computers, not just the companies producing them.

Different users shaped different histories. The late Princeton historian Michael Mahoney criticized the assumption that there was just one history of the computer, in which various earlier technologies converged in the 1940s to create the modern computer, after which the new technology started to transform the world. Instead, "the histories and continuing experience of the various communities show they wanted and expected different things from the computer. They encountered different problems and levels of difficulty in fitting their practice to it. As a result, they created different computers or (if we may make the singular plural) computings."9


Computers Are Always Becoming New Things …

Mahoney's insight is reflected in the structure of our book. Most of the chapters are entitled "The computer becomes [X]." In each we explore how the perceived needs of a particular community of users drove the development of new kinds of hardware and software. Rather than following a strict time sequence, these stories overlap. In the second, third, and fourth chapters, for example, the computer becomes a scientific supertool, a data processing device, and a real-time control system. Each of these runs from the early 1950s to the late 1970s. The second chapter is centered on the emergence of supercomputers, running from the IBM 701 and its successors to the Cray 1. All those machines were built around the needs of nuclear weapons labs and aerospace firms. They were hugely expensive and had small production runs.

The core technologies developed in one context often resurface in others. That gives the story coherence: our protagonist—"the computer"—is having a series of adventures and changing as it goes. We see it as a stack of hardware, architectural capabilities, software tools, algorithms, and the human skills needed to exploit them. Features invented for one purpose become part of the stack of technologies used as infrastructure for the creation of other kinds of system. For example, capabilities originally developed for exotic supercomputers or mainframes, such as instruction pipelining, virtual memory, and parallel processing, eventually made their way into smartphones. Over time the stack grows taller.


… But They Stay All the Old Ones Too

I always knew the endpoint of our story would be a Tesla Model S, not an iPhone. As the catastrophic impact of the current chip shortage on the car market has shown, a typical middle-class family in the U.S. spends more on the computer power packed into its vehicles than on smartphones, laptops, or desktops. The family just does not realize it, because its self-propelled computer cluster still looks from the outside like a traditional car.



Examining the many dozens of computers packed into the Model S shows that computers still carry out all the applications we described throughout the book. This is an attempt to get away from the impression of a serial evolution in the form of computers: first mainframes, then minicomputers, then personal computers, and so on. The computers in the car's battery systems and air bags, for example, carry out real-time control functions similar to those used in the space rockets of the 1960s. As the car rolls past an electronic toll, the transaction is processed against a credit card using batch systems that still run on mainframes. More visibly, the car's giant screen and Internet connections highlight Tesla's deliberate efforts to blur lines between cars and tablet computers.


Communications Were Central Long Before the Web

Earlier histories of computing tended to ignore computer communications for most of the narrative, discovering them only with the arrival of the World Wide Web close to the end of the book.a In contrast, we weave the development of communications into the overall computing story, including: the transmission of data for military and aerospace applications in the 1950s and 1960s; timesharing and its use, together with remote terminals, to make on-line interactive access to computers common by the end of the 1960s; how applications such as email, the Plato educational environment, videotext systems like Minitel, and the packet-switched ARPANET were constructed around timesharing systems; discussion of bulletin board systems; and the integration of Ethernet into the story of electronic office work. This broad focus on many types of communications and networks avoids the common tendency to focus exclusively on the Internet and its direct ancestors. All that is before we get to the last few chapters, which return communications to the forefront with explorations of digital audio and video, the Web, the cloud, and smartphones.


Video Games and Graphics Have Driven Computing in Profound Ways

In the present day the importance of video games is difficult to miss. Big games have budgets, sales, and cultural profiles comparable to the largest Hollywood blockbusters and consume far more of people's lives. Gamers are the only people still buying desktop computers, while their constant thirst for computer power has driven the development of graphics cards so powerful they are being snapped up for cryptocurrency mining and AI applications. Even smartphones have been engineered to produce remarkably smooth and detailed real-time visualizations. Retrogaming has become a mass-market phenomenon, evidenced by YouTube channels with millions of subscribers and the constant release of modern replicas of classic gaming platforms.

Little wonder the historical literature on video games has grown rapidly during the past decade. Yet earlier overview histories had had little time for arcade video games, games consoles, or game-oriented home computers such as the Sinclair Spectrum and Commodore 64. In our new history, we try to do justice to this aspect of computing, looking at key gaming platforms and their relationship to the emergence of particular styles of video game as well as the role of gamers and games in steering the evolution of personal computing technology.


The PC Really Matters but Its Story Has Never Been Properly Told

The development of the IBM PC into an evolving hardware standard underpinning almost all of the laptops, desktops, and servers of the past few decades has received remarkably little attention, not just in previous overview histories (which tend to take the arrival of the original IBM PC as an ending rather than a beginning) but in the historical literature as a whole. Many "platform studies" of games consoles have been published, but nothing on the most important platform in personal computer history. For me, the important thing about the IBM PC was never the original model, or even IBM, but the gradual process by which the PC went from being a single proprietary computer model, different from dozens of others only because of its market success, to the template for an entire industry.

We follow the PC story into the 1990s, a decade that began with 386-based desktop computers running MS-DOS and finished with the robust and powerful Windows 2000 running on sleek Pentium III laptops. The speed of that development was quite remarkable and set a foundation for personal computing in the 20 years since. Intertwined with that are the stories of the RISC challenge to Intel and the evolution of the PC itself from compatibility with specific IBM models to a cluster of open standards maintained by industry groups.


Browsers and Smartphones Are the New Terminals

At a time when both "Internet History" and "Web History" often signify scholarly communities with little connection to the broader history of computing, our challenge was not just to cover the development of the Internet but to do so in a way that makes clear the benefits of taking a longer perspective. One advantage of starting in 1946 rather than 1991 is to uncover all the ways in which the Web rested on existing hardware and software infrastructures. These were shaped in turn by the academic context of the original ARPANET. In particular, the lack of integrated security and the lack of an inbuilt billing mechanism, both unnecessary in the original context, have had profound implications for things such as spam, online fraud, and the devastation of the traditional news industry (and liberal democracy) when online publishing settled on an advertising-supported model.

Although the first of our chapters on the Web focuses on the development of online publishing, the second is centered on the cloud and shows the extent to which the original idea of publishing static hypertext was almost entirely replaced by new approaches in which Web browsers essentially replaced the terminals and terminal emulators of earlier generations as a way of rendering the dynamically generated output of software running in remote data centers. Today, smartphones rely on cloud computing to power the applications behind app-based services like Google Maps, Facebook, or Uber, all of which tie back to our extensive discussion of earlier timesharing systems. A reader, particularly one too young to remember the 1980s, is likely to consider the short era of free-standing personal computers, and even the era of client-server systems, a brief aberration. John McCarthy reportedly called it the "Xerox heresy" (after PARC, where client-server computing and graphical workstations were invented).b


Government Funding Was Crucial to the Development of Computer Technology

We frequently found ourselves writing about highly influential government-run projects, from the Army's sponsorship of ENIAC, through the central importance of the Cold War in supporting IBM's early mainframes and custom systems such as the SAGE air defense system, to the role of the space and missile programs in kickstarting the market for integrated circuits. The development of the ARPANET and its successor, the Internet, was a government initiative. Other government contracts supported vital early research on things including timesharing and computer graphics. We also explore some developments outside the U.S., such as the French government's decision to replace telephone directories with Minitel terminals. All this can be difficult to square with rhetoric that future U.S. technological leadership will rest primarily on lowering taxes and getting the government out of the way of private development.


It Wasn't Just Silicon Valley …

Sometimes the history of computing and the history of the Valley seem in danger of blurring together in the popular imagination. Flipping through our book reminds me of how much innovation happened elsewhere. Within the U.S., we spend time with ENIAC and Univac in the Philadelphia area; MIT, Lotus, BBN, and DEC in Massachusetts; ERA, Control Data, and Cray in the Midwest; and IBM in (mostly) New York state. We also tried to internationalize the story, though it remains U.S.-dominated. Aside from the rise of Asian electronics manufacturing and the rapid global adoption of smartphones in the final chapters, which would be difficult to ignore, our attention goes overseas mostly when a distinctively different path from the U.S. is taken, as with the British home computers from Sinclair and Acorn or the French deployment of Minitel.


… but Silicon Valley Has Eaten the World

The ending of the book underscores the outsized importance of Silicon Valley (and the Seattle area) to our lives and to the global economy. Most of the companies we talked about in our final chapters were headquartered within a short drive of the original Tesla factory. In the book's epilogue we look briefly at these changes and at the role of the COVID-19 pandemic in heightening our dependence on computer systems. As computers have expanded their roles so profoundly, they have come to matter in different ways. For most of the book, IBM was the only major global firm involved in the computer industry. The fate of firms such as Apple and Commodore in the 1980s mattered greatly to their fans, but less so to the world as a whole. In contrast, the last few years have seen a growing "techlash" against firms like Facebook and the enormous power their leaders exert over our lives. The first five companies to reach market valuations of $1 trillion were Apple, Microsoft, Amazon, Alphabet, and Tesla.

It is almost a twist ending. While critics were of course warning about the perils of computer technology all along, our narrative mostly followed the development of computer technology without editorializing. Was our protagonist secretly evil all along? Some will wish we had focused on oppression and inequality throughout, while others may wish we had stuck with bits and bytes to the very end. But the change of focus does highlight the sudden realization of many in the computing field that they are no longer plucky underdogs and that it is time to take some responsibility for the state of the world.


Computer Science ≠ Computing

Readers may recall Donald Knuth's complaint that the shift of "historians of computer science" away from technical material had reduced him to tears.8 My response then was that very little work on the history of computing actually fell into that category of "history of computer science," because neither the history of science community nor the computer science community had made investments to support that field.4 On the copy of the new history I sent to Knuth, I wrote "This is not the history of computer science, but I hope you like it anyway." What Ceruzzi and I tried to produce is a history of computer technology and practice engaged with how computer systems work, meaning we talk about computer scientists only when their work leads directly to changes in computing practice.

An actual history of computer science that took a comparably broad approach would look at the growth of the academic discipline, leading university programs, key areas of research, changes in career patterns and funding, relations between industry and academia, and above all at the rise and fall of areas including artificial intelligence, computer graphics, computer architecture, and numerical analysis within the loose federation of research communities that calls itself "computer science." I hope to live long enough to see someone write that book, but we are not much closer to laying the groundwork for it with detailed studies than we were 20 years ago.


Conclusion

Ceruzzi's original book was the most widely cited scholarly history of computing. It attracted readers looking for an in-depth, technically sound history of computing that reached beyond the usual anecdotes. We hope that A New History of Modern Computing will fill a similar role for a new generation of readers, by serving as an approachable point of entry to the growing scholarly literature on the topic.

Getting too close to the present has its perils for any historian. As the new history progresses, our sources shift from painstakingly researched academic histories to journalistic books and newspaper articles. It is the newest material that will surely go out of date fastest, as historians begin to dig in and stories start to come into focus. I am already taking notes for the next iteration.


References

1. Campbell-Kelly, M. and Aspray, W. Computer: A History of the Information Machine. Basic Books, New York, NY, 1996.

2. Campbell-Kelly, M. et al. Computer: A History of the Information Machine. Third Edition. Westview Press, Boulder, CO, 2014.

3. Ceruzzi, P.E. A History of Modern Computing. MIT Press, Cambridge, MA, 1998.

4. Haigh, T. The tears of Donald Knuth. Commun. ACM 58, 1 (Jan. 2015), 40–44.

5. Haigh, T. and Ceruzzi, P.E. A New History of Modern Computing. MIT Press, Cambridge, MA, 2021.

6. Haigh, T., Priestley, M. and Rope, C. ENIAC in Action: Making and Remaking the Modern Computer. MIT Press, Cambridge, MA, 2016.

7. Isaacson, W. The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution. Simon and Schuster, New York, 2014.

8. Knuth, D.E. and Shustek, L. Let's not dumb down the history of computer science. Commun. ACM 64, 2 (Feb. 2021), 33–35; https://www.youtube.com/watch?v=gAXdDEQveKw.

9. Mahoney, M.S. The histories of computing(s). Interdisciplinary Science Reviews 30, 2 (2005), 119–135.

10. Mody, C.C.M. The Long Arm of Moore's Law. MIT Press, Cambridge, MA, 2017.

11. Mollick, E.R. Establishing Moore's Law. IEEE Annals of the History of Computing 28, 3 (July–Sept. 2006), 62–75.


Author

Thomas Haigh ([email protected]) is a professor of history at the University of Wisconsin—Milwaukee and a Comenius visiting professor at Siegen University.


Footnotes

a. In contrast to the general neglect of computer communications in overview histories, communication technologies such as telegraphs have often been treated as precursors of the computer itself as, for example, in the 1990s Smithsonian exhibit The Information Age.

b. https://amturing.acm.org/pdf/DiffieTuring-Transcript.pdf

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)—Project-ID 262513311—SFB 1187 Media of Cooperation.


Copyright held by author.