
Communications of the ACM

Viewpoints

Computing's Paradigm



Computing rightfully comes up in many discussions of university organization and curricula, high school courses, job qualifications, research funding, innovation, public policy, and the future of education. In repeated attempts to characterize our field in these discussions, our leaders continue to encounter sometimes contentious debate over whether computing is a field of engineering or science. That debate negatively influences policies involving computing because it leaves others with the sense that our field lacks a clear focus.

There seems to be agreement that computing exemplifies both engineering and science, and yet that neither engineering nor science alone characterizes it. What, then, does characterize computing? In this column, we discuss computing's unique paradigm and offer it as a way to leave the debilitating debate behind.

The word "paradigm" for our purposes means a belief system and its associated practices, defining how a field sees the world and approaches the solution of problems. This is the sense Thomas Kuhn used in his famous book, The Structure of Scientific Revolutions. Paradigms can contain sub-paradigms: thus, engineering divides into electrical, mechanical, chemical, civil, and other branches; science divides into the physical, life, and social sciences, which further divide into separate fields of science.


Roots of the Debate

Whether computing is engineering or science is a debate as old as the field itself. Some founders thought the new field a branch of science, others engineering. Because of the sheer challenge of building reliable computers, networks, and complex software, the engineering view dominated for four decades. In the mid-1980s, the science view began to assert itself again with the computational science movement, which claimed computation as a new sub-paradigm of science, and stimulated more experimental research in computing.

Along the way, there were three waves of attempts to provide a unified view. The first wave was by Alan Perlis,9 Allen Newell,8 and Herb Simon,11 who argued that computing was unique among all sciences and engineering in its study of information processes. Simon went so far as to call computing a science of the artificial.

The second wave started in the late 1960s. It focused on programming, seen as the art of designing information processes. Edsger Dijkstra and Donald Knuth took strong stands favoring programming as the unifying theme. In recent times this view has foundered: the field has expanded well beyond programming, and the public understanding of "programmer" has narrowed to mean simply a coder.

The third wave was the NSF-sponsored Computer Science and Engineering Research Study (COSERS), led by Bruce Arden in the mid-1970s. It defined computing as automation of information processes in engineering, science, and business. It produced a wonderful report that explained many exotic aspects of computing to the layperson.1 However, it did not succeed in reconciling the engineering and science views of computing.


Peaceful Coexistence

In the mid-1980s, the ACM Education Board was concerned about the lack of a common definition of the field. The Board charged a task force to investigate; its response was the report Computing as a Discipline.4 The central argument of the report was that the computing field is a unique combination of the traditional paradigms of mathematics, science, and engineering (see Table 1). Although all three had made substantial contributions to the field, no single one told the whole story. Programming—a practice that crossed all three paradigms—was essential but did not fully portray the depth and richness of the field.

The report in effect argued for the peaceful coexistence of the engineering, science, and math paradigms. It found a strong core of knowledge that supports all three paradigms. It called on everyone to accept the three and not try to make one of them more important than the others.

Around 1997, many of us began to think the popular label IT (information technology) would reconcile these three parts under a single umbrella unique to computing.3,7 Time has proved us wrong. IT now connotes technological infrastructure and its financial and commercial applications, but not the core technical aspects of computing.


A Computing Paradigm

There is something unsatisfying about thinking of computing as a "blend of three sub-paradigms." What new paradigm does the blend produce?

Recent thinking about this question has produced new insights that, taken together, reveal a computing paradigm. A hallmark of this thinking has been to shift attention from computing machines to information processes, including natural information processes such as DNA transcription.2,6 The great principles framework interprets computing through the seven dimensions of computation, communication, coordination, recollection, automation, evaluation, and design (see http://greatprinciples.org). The relationships framework interprets computing as a dynamic field of many "implementation" and "influencing" interactions.10 There is now a strong argument that computing is a fourth great domain of science alongside the physical, life, and social sciences.5

These newer frameworks all recognize that the computing field has expanded dramatically in the past decade. Computing is no longer just about algorithms, data structures, numerical methods, programming languages, operating systems, networks, databases, graphics, artificial intelligence, and software engineering, as it was prior to 1989. It now also includes such exciting new subjects as the Internet, Web science, mobile computing, cyberspace protection, user interface design, and information visualization. The resulting commercial applications have spawned new research challenges in social networking, endlessly evolving computation, music, video, digital photography, vision, massively multiplayer online games, user-generated content, and much more.


The newer frameworks also recognize the growing use of the scientific (experimental) method to understand computations. Heuristic algorithms, distributed data, fused data, digital forensics, distributed networks, social networks, and automated robotic systems, to name a few, are often too complex for mathematical analysis but yield to the scientific method. These scientific approaches reveal that discovery is as important as construction or design. Discovery and design are closely linked: the behavior of many large designed systems (such as the Web) is discovered by observation; we design simulations to imitate discovered information processes. Moreover, computing has developed search tools that are helping make scientific discoveries in many fields.

The newer frameworks also recognize natural information processes in many fields including sensing and cognition in living beings, thought processes, social interactions, economics, DNA transcription, immune systems, and quantum systems. Computing concepts enable new discoveries and understandings of these natural processes.

The central focus of the computing paradigm can be summarized as information processes—natural or constructed processes that transform information. They can be discrete or continuous.

Computing represents information processes as "expressions that do work." An expression is a description of the steps of a process in the form of an (often large) accumulation of instructions. Expressions can be artifacts, such as programs designed and created by people, or descriptions of natural occurrences, such as DNA and DNA transcription in biology. Expressions are not only representational, they are generative: they create actions when interpreted (executed) by appropriate machines.
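
To make "expressions that do work" concrete, here is a minimal sketch (ours, purely for illustration; the two-instruction machine and all its names are invented) of an expression that is at once representational data and a generator of action when interpreted:

```python
def interpret(expression, memory):
    """Execute a sequence of instructions against a mutable memory."""
    for op, *args in expression:
        if op == "set":                   # set <name> <value>
            name, value = args
            memory[name] = value
        elif op == "add":                 # add <target> <source>
            target, source = args
            memory[target] += memory[source]
        else:
            raise ValueError("unknown instruction: " + op)
    return memory

# The same expression can be inspected as data or executed for effect.
program = [("set", "x", 2), ("set", "y", 3), ("add", "x", "y")]
print(len(program))             # representational: 3 instructions
print(interpret(program, {}))   # generative: {'x': 5, 'y': 3}
```

The expression is inert until an appropriate machine interprets it; the interpreter is what turns description into action.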

Since expressions are not directly constrained by natural laws, we have evolved various methods that enable us to have confidence that the behaviors generated do useful work and do not create unwanted side effects. Some of these methods rely on formal mathematics to prove that the actions generated by an expression meet specifications. Many more rely on experiments to validate hypotheses about the behavior of actions and discover the limits of their reliable operation.
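
As a sketch of the experimental route (our illustration, not a method from this column), one can state a hypothesis about an expression's behavior and search random instances for a counterexample. Here the hypothesis is that a greedy interval-scheduling heuristic matches exhaustive enumeration on small inputs:

```python
import random
from itertools import combinations

def greedy_schedule(intervals):
    """Count a maximum set of non-overlapping intervals, greedily by earliest finish."""
    count, last_end = 0, float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:
            count += 1
            last_end = end
    return count

def brute_force(intervals):
    """Exact maximum by trying every subset; feasible only for small inputs."""
    best = 0
    for r in range(len(intervals) + 1):
        for subset in combinations(intervals, r):
            ordered = sorted(subset, key=lambda iv: iv[1])
            if all(b[0] >= a[1] for a, b in zip(ordered, ordered[1:])):
                best = max(best, r)
    return best

for trial in range(200):
    ivs = [tuple(sorted(random.sample(range(20), 2))) for _ in range(6)]
    assert greedy_schedule(ivs) == brute_force(ivs), ivs
print("hypothesis survived 200 random instances")
```

Surviving many random trials does not prove the hypothesis (for this particular heuristic a proof happens to exist), but it is exactly the kind of evidence the experimental method supplies when proof is out of reach.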

Table 2 summarizes the computing paradigm with this focus. While it contains echoes of engineering, science, and mathematics, it is distinctively different because of its central focus on information processes.5 It allows engineering and science to be present together without having to choose.

There is an interesting distinction between computational expressions and the normal language of engineering, science, and mathematics. Engineers, scientists, and mathematicians endeavor to position themselves as outside observers of the objects or systems they build or study. Outside observers are purely representational: traditional blueprints, scientific models, and mathematical models are not executable. (When combined with computational systems, however, they yield automatic fabricators, model simulators, and mathematical software libraries.) Computational expressions are not constrained to be outside the systems they represent. The possibility of self-reference makes for very powerful computational schemes based on recursive designs and executions, and also for very powerful limitations on computing, such as the noncomputability of the halting problem. Self-reference is common in natural information processes; the cell, for example, contains its own blueprint.
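
The halting limitation just mentioned can itself be sketched as an expression. This is the standard diagonal argument rendered as code (a sketch for illustration; the decider halts is hypothetical and, as the argument shows, cannot actually be implemented):

```python
def halts(program, argument):
    """Hypothetical total decider of termination; provably cannot exist."""
    raise NotImplementedError("no correct implementation is possible")

def paradox(program):
    # Do the opposite of whatever halts() predicts about program(program).
    if halts(program, program):
        while True:       # predicted to halt, so loop forever
            pass
    return "done"         # predicted to loop, so halt

# paradox(paradox) halts exactly when halts(paradox, paradox) says it does
# not, so no total, correct halts() can be written. Self-reference gives
# computing both its power (recursion) and this hard limit.
```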

The interpretation "computational thinking"12 embeds nicely into this paradigm. The paradigm describes not only a way of thinking, but a system of practice.


Conclusion

The distinctions discussed here offer a coherent, higher-level description of what we do, permitting us to better understand and improve our work and better interact with people in other fields. The engineering-science debates present a confusing picture that adversely affects policies on innovation, science, and technology, the flow of funds into various fields for education and research, the public perception of computing, and the choices young people make about careers.

We are well aware that the computing paradigm statement needs to be discussed widely. We offer this as an opening statement in a very important and much needed discussion.


References

1. Arden, B.W. What Can Be Automated: Computer Science and Engineering Research Study (COSERS). MIT Press, 1983.

2. Denning, P. Computing is a natural science. Commun. ACM 50, 7 (July 2007), 15–18.

3. Denning, P. Who are we? Commun. ACM 44, 2 (Feb. 2001), 15–19.

4. Denning, P. et al. Computing as a discipline. Commun. ACM 32, 1 (Jan. 1989), 9–23.

5. Denning, P. and Rosenbloom, P.S. Computing: The fourth great domain of science. Commun. ACM 52, 9 (Sept. 2009), 27–29.

6. Freeman, P. Public talk "IT Trends: Impact, Expansion, Opportunity," 4th frame; www.cc.gatech.edu/staff/f/freeman/Thessaloniki

7. Freeman, P. and Aspray, W. The Supply of Information Technology Workers in the United States. Computing Research Association, 1999.

8. Newell, A., Perlis, A.J., and Simon, H.A. Computer science (letter). Science 157, 3795 (Sept. 1967), 1373–1374.

9. Perlis, A.J. The computer in the university. In Computers and the World of the Future, M. Greenberger, Ed. MIT Press, 1962, 180–219.

10. Rosenbloom, P.S. A new framework for computer science and engineering. IEEE Computer (Nov. 2004), 31–36.

11. Simon, H. The Sciences of the Artificial. MIT Press (1st ed. 1969, 3rd ed. 1996).

12. Wing, J. Computational thinking. Commun. ACM 49, 3 (Mar. 2006), 33–35.


Authors

Peter J. Denning ([email protected]) is the director of the Cebrowski Institute for Information Innovation and Superiority at the Naval Postgraduate School in Monterey, CA, and is a past president of ACM.

Peter A. Freeman ([email protected]) is Emeritus Founding Dean and Professor at Georgia Tech and Former Assistant Director of NSF for CISE.


Footnotes

DOI: http://doi.acm.org/10.1145/1610252.1610265


Tables

Table 1. Sub-paradigms embedded in computing.

Table 2. The computing paradigm.



Copyright held by author.



Comments


Juergen Pabel

Undoubtedly, the core aspects are very technical, but computing is also very much about social and interactional aspects. Any IT project implies a great number of challenges, and more often than not these challenges must be divided among project members. The way these tasks are defined, distributed, executed, and reported (for further coordination of the outstanding project tasks) has a significant impact on the result. For example, consider a software project created according to the Secure Development Lifecycle; a similar project developed using an agile methodology will inevitably result in a vastly different software product.

These non-technical aspects are very dominant in large IT projects. More often than not, they have a greater impact on the success or failure of a given project than the concrete technical challenges. I would have loved to see these aspects addressed in this article.

Yours,
Juergen Pabel


CACM Administrator

The following letter was published in the Letters to the Editor in the April 2010 CACM (http://cacm.acm.org/magazines/2010/4/81506).
--CACM Administrator

Beginning with the headline, "Computing's Paradigm," The Profession of IT Viewpoint by Peter J. Denning and Peter A. Freeman (Dec. 2009) reflected some confusion with respect to Thomas Kuhn's notion of "paradigm" (a set of social and institutional norms that regulate "normal science" over a period of time). Paradigms, said Kuhn, are incommensurable but determined by the social discourse of the environment in which science develops.

The crux of the matter seems to be that computing can't be viewed as a branch of science since it doesn't deal with nature but with an artifact, namely the computer. For guidance, we can reflect on at least one scientific antecedent: thermodynamics, which originated from the need to understand the steam engine but is distinguished from steam engineering by its search for general principles, detached from any specific machine. The Carnot cycle and the entropy theorem are scientific results, not feats of engineering.

The metatheoretical problem of computing seems mainly semiotic. Suppose, 200 years ago, somebody had created a discipline called, say, Thermozap, that included the study of the Carnot cycle and the building of new steam engines. Somebody might have come up with the insoluble problem of whether the new discipline was science or engineering. It was neither but rather a hodgepodge of things better left separated.

Computing is in a similar situation. There is an area (call it Knuth-Dijkstra computing) that studies scientific problems posed by the existence of computing devices. Thermodynamics was part of physics because steam engines use physical forces. Computing devices are formal machines, so Knuth-Dijkstra computing is a mathematical discipline. Then there is the computing discipline that builds systems (call it Denning-Freeman computing), which is definitely part of engineering. The error is in thinking they are the same. Both refer to the same device, generically called "computer," but this is a misleading connection, since the two disciplines describe the computer in different ways: as a formal model of computation in Knuth-Dijkstra computing, and as an actual machine in Denning-Freeman computing.

Denning and Freeman proposed a "framework" that takes the side of engineering computing (which is why I call it Denning-Freeman computing), describing the development of an engineering system and leaving no doubt as to the envisioned nature of the discipline. All the purportedly different fields they proposed, from robotics to information processing in DNA, are actually different applications of the same paradigm. To consider them different would be like saying quantum physics is different for nuclear plants and for semiconductors. The physics is the same; what changes is the engineering process of its application, as in computing.

The abstract problem of symbol manipulation is mathematical and the subject of computing science. The instantiation of the symbol-manipulation model in useful systems is a problem for the engineering of computing, a discipline that is theoretically, methodologically, and conceptually separated from the mathematical study of symbol manipulation.

Simone Santini
Madrid, Spain

----------------------------------------------

AUTHORS' RESPONSE

Our argument concerned computing's "belief system." Kuhn discussed belief systems in science. Whether or not we were true to Kuhn is irrelevant to our argument.

Santini says computing is about computers. We disagree. Computing is about information processes, and computers are machines that implement information processes. There are natural, as well as artificial, information processes. Computing is as much about computers as astronomy is about telescopes.

Computing does not separate neatly into math and engineering, as Santini claims. Computing increasingly employs experimental (scientific) methods to test hypotheses about complex information processes.

Santini's desire to parse computing into separate elements will fail, just as all such previous attempts have failed. Our collective concern with information processes keeps pulling all the elements together, no matter how hard we try to separate them.

Peter Denning
Monterey, CA
Peter Freeman
Atlanta, GA


CACM Administrator

The following letter was published in the Letters to the Editor in the April 2010 CACM (http://cacm.acm.org/magazines/2010/4/81506).
--CACM Administrator

I wish to suggest ways to improve Peter J. Denning's and Peter A. Freeman's proposed computing paradigm in their Viewpoint "Computing's Paradigm" (Dec. 2009). While I accept the five tentative phases (initiation, conceptualization, realization, evaluation, and action) in the proposed paradigm, they are, in practice, incomplete.

While I agree with initiation (the existential argument) followed by conceptualization (the design argument), three additional phases are missing. The first is a phase 0 I call understanding (or problem understanding). Before one can pose the existential argument (Denning's and Freeman's initiation), a phase must address problem understanding, a key element in all complex computing domains. Moreover, understanding is closely associated with modeling. One cannot determine whether a system can be built or represented without the understanding needed to pose hypotheses, theses, or formal requirements. Understanding is often not addressed very well by beginning computing researchers and developers, especially as it pertains to information processes.

The second missing element of conceptualization is an explicit statement about bounded rationality, per Herbert Simon (http://en.wikipedia.org/wiki/Bounded_rationality), a concept based on the fact that the rationality of individuals is limited by the information they possess, the cognitive limitations of their minds, and the finite amount of time they have to make decisions. Bounded rationality addresses the tentative nature of design and discovery as an evolving set of decisions posed against multiple criteria derived from understanding and initiation. The results from conceptualization, or design, must always be understood as both tentative and knowledge-limited.

Finally, a phase missing from evaluation and action is "technology readiness" (http://en.wikipedia.org/wiki/Technology_readiness_level), especially in deploying real systems. A new technology, when first invented or conceptualized, is not suitable for immediate application. It is instead usually subject to experimentation, refinement, and increasingly realistic contextual testing. When proven, it can be incorporated into a deployed system or subsystem. All information processes are realized and embedded within the context of existing deployed systems. Therefore, technology readiness of a posed information process must stand as a separate phase between evaluation and action.

1. Simon, H. Bounded rationality and organizational learning. Organization Science 2, 1 (1991), 125–134.
2. Simon, H. A mechanism for social selection and successful altruism. Science 250, 4988 (1990), 1665–1668.
3. Simon, H. A behavioral model of rational choice. In Models of Man, Social and Rational: Mathematical Essays on Rational Human Behavior in a Social Setting. John Wiley & Sons, New York, 1957.

David C. Rine
Fairfax, VA


CACM Administrator

The following letter was published in the Letters to the Editor of the October 2014 CACM (http://cacm.acm.org/magazines/2014/10/178772).
--CACM Administrator

Regarding Peter J. Denning's and Peter A. Freeman's Viewpoint "Computing's Paradigm" (Dec. 2009), Simone Santini's letter to the editor "Computing Paradigm Not a Branch of Science" (Apr. 2010) said computing can be categorized as both a branch of mathematics and a branch of engineering, claiming, "The abstract problem of symbol manipulation is mathematical..." and "The instantiation of the symbol-manipulation model in useful systems is a problem for the engineering of computing." In response, Denning and Freeman said, "Computing does not separate neatly into math and engineering, as Santini claims." But what is indeed wrong with Santini's distinction, which has been endorsed by many others over the years? Denning and Freeman even predicted, "Santini's desire to parse computing into separate elements will fail, just as all such previous attempts have failed."

Virtually all software development is done through trial and error, (unit) testing, and never-ending patching following delivery, with stupendous (productivity) costs; recall the classic Microsoft software alert: "You may have to reboot your system." There are good reasons for this practice. One is that there are no tools (or supporting theory) for systematic, top-down, iterative formal development, from requirements to running system. Another is that most software products do not need meaning-preserving transformations or formal verifications.

This state of the art does not mean we can dismiss a mathematical approach to development, validation, and the annotation of library elements for machine-assisted reuse. It is actually a failure of computer science, better called "informatics," not to have developed a mathematical approach to the software development life cycle. Consider recent unwelcome consequences of the lack of formal verification techniques: the Heartbleed flaw in OpenSSL, the "goto fail" bug in Apple's OS, and the CVE-2014-1776 patch for Internet Explorer.

Though sorting is among the oldest problems in computer science, the Cygwin library (part of a Unix-like command-line interface for Microsoft Windows) had (still has?) an "improved" but defective version of qsort with guaranteed quadratic behavior on certain inputs. In any case, I have never encountered even an informal proof that the output of a sorting algorithm is a permutation of the input.
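
Such a proof obligation is at least easy to state as an executable property. Here is a minimal sketch (illustrative only; check_sort is an invented name, not a library function) that tests both ordering and the permutation property on random inputs:

```python
import random
from collections import Counter

def check_sort(sort_fn, trials=1000):
    """Randomized check that sort_fn both orders its input and permutes it."""
    for _ in range(trials):
        data = [random.randint(-50, 50) for _ in range(random.randint(0, 30))]
        out = sort_fn(list(data))
        assert all(a <= b for a, b in zip(out, out[1:])), "output not ordered"
        assert Counter(out) == Counter(data), "output not a permutation"
    return True

print(check_sort(sorted))  # Python's built-in passes both properties
```

Comparing multisets catches a sorter that drops or duplicates elements, which checking order alone would miss.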

This is not to say I think computer science should be viewed as a branch of mathematics but rather as a way to urge more research in formal techniques, hopefully yielding tools for all phases of the development life cycle. Redeveloping the Linux operating system this way would be a genuine advance, making it possible to maintain it at a high level instead of exclusively tinkering with its code.

Denning's and Freeman's response should not have demeaned Santini's distinction while endorsing, yet again, the pathologically optimistic approach to software development (such as Scrum and agile). In the meantime, see my Technical Opinion "Software Engineering Considered Harmful" (Nov. 2002).

Dennis de Champeaux
San Jose, CA

