We are all members of the Association for Computing Machinery. Sounds sort of electromechanical, doesn't it? Given today's computing technology, it is probably a good thing we are mostly known as ACM! There was a time when the physical artifact, the computer, really was the focus of attention. These behemoths occupied rooms full of equipment. Now, in fairness, if you have ever visited a cloud computing data center, the dominant impression is still a (vast) room full of machinery. But we carry huge quantities of computing power in our pockets and purses too. Computing is a remarkable artifact, and its origins centered on the ability to make a piece of equipment calculate under programmable control. Alan Turing, whose 100th birthday we celebrated this year, drew dramatic attention to the artificiality of these systems with what we now call the Universal Turing Machine. This conceptual artifact emphasizes the artificial nature of computation.
In the physical world, science is largely about models, measurement, predictions, and validation. Our ability to predict likely outcomes based on models is fundamental to the most central notions of the scientific method. The term "computer science" raises expectations, at least to my mind, of an ability to define models and to make predictions about the behavior of computers and computing systems. I think we have a fairly good capability to measure and predict the physical performance of our computing devices. We can measure clock speeds, latencies, memory sizes, and computational capacity against standard computing tasks. In my view, however, we are much less able to make models and predictions about the behavior and performance of the artifact we label "software." An almost flippant analogy is the difference between measuring, modeling, and predicting neural brain functions and trying to do the same for "thought."
That software is an artifact seems obvious. Moreover, it is a strikingly complex artifact filled with layer upon layer of components that exhibit dependencies and complex and often unpredicted (not to say unpredictable) behaviors. Even though we design software systems and ought to have some clues about how these systems behave and perform, we generally do not have a reliable ability to anticipate the states these systems can get into, their vulnerabilities, their performance, and ability to adapt to changing conditions.
When we write a piece of software, do we have the ability to predict how many mistakes we have made (that is, bugs)? Do we know how long it will take to find and fix them? Do we know how many new bugs our fixes will create? Can we say anything concrete about vulnerability? What about the probability of exploitation? Murphy's Law suggests that if there is a bug that can be exploited for nefarious purposes, it will be. ACM Turing Award recipient Fred Brooks' wonderful book, The Mythical Man-Month [1], captures some of the weakness of our understanding of the nature of software. A complementary look at this topic is found in ACM Turing recipient Herbert A. Simon's The Sciences of the Artificial [2]. Chapter 8 deals with hierarchy and complexity, touching on the way in which we try to bound complexity through modular and hierarchical structures but are still challenged by the emergent behaviors masked, in some ways, by the beguiling apparent simplicity of the hierarchy.
The richness of our field has only grown in the 65 years of our existence as an organization. Computers, computing, software, and systems are seemingly omnipresent. We are growing increasingly dependent upon what must be billions of lines of code. Some unknown wag once quipped that the only reason all the computers in the world have not failed at once is that they are not yet all on the Internet. But that may be coming (not the collapse, I hope, but the interconnection of a vast number of programmable devices through the Internet or its successor(s)).
As a group of professionals devoted to the evolution, understanding, and application of software and hardware to the myriad problems, opportunities, and activities of modern society, we have a responsibility to pursue the science in computer science. We must develop better tools and much deeper understanding of the systems we invent and a far greater ability to make predictions about the behavior of these complex, connected, and interacting systems. I consider membership in the ACM a mark of recognition of that responsibility. I hope you share that view and will encourage others in our profession to join ACM in the quest for the science in our discipline.
1. Brooks, F.P. The Mythical Man-Month, Anniversary Edition. Addison-Wesley, Reading, MA, 1995. ISBN 0-201-83595-9.
2. Simon, H.A. The Sciences of the Artificial, 3rd Edition. MIT Press, Cambridge, MA, 1996. ISBN 0-262-19374-4.
The "science" of computer science is as elusive as always. The science aspect was well discussed in Minds & Machines some time ago, and the outcome was that science can be understood in so many ways that it's very hard to make any strong claims about computing being or not being a science.
I'm not sure there is really any question about the science of computer science amongst those who practice it. Let's take one particular section of the article that is supposed to convince us that computer science overlooks essential problems that are in dire need of science: "When we write a piece of software, do we have the ability to predict how many mistakes we have made (that is, bugs)? Do we know how long it will take to find and fix them? Do we know how many new bugs our fixes will create?" It honestly takes more effort to complain about how we know nothing about bugs than it does to search and quickly find approaches to answering those questions that are indistinguishable from any other exemplar of "science" anyone can think up. See "Adaptive Bug Prediction by Analyzing Project History," a nice piece of work by Sunghun Kim that references other equally valid and strongly scientific approaches to answering precisely those questions (see the table on page 113).
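To give a flavor of that line of work, the core cache-style heuristic can be sketched in a dozen lines of Python. This is a toy illustration of my own, not Kim's actual algorithm or code; the history format, file names, and cache size are invented for the example. The idea is simply that recently bug-fixed files are the best bet for where the next bug will be found.

    from collections import OrderedDict

    CACHE_SIZE = 3  # assumed here; real studies tune this as a fraction of all files

    def build_bug_cache(history):
        """Predict fault-prone files by caching the most recently bug-fixed ones."""
        cache = OrderedDict()  # file name -> None, least recently faulty first
        for changed_files, was_bug_fix in history:
            if not was_bug_fix:
                continue
            for f in changed_files:
                cache.pop(f, None)             # refresh recency if already cached
                cache[f] = None
                if len(cache) > CACHE_SIZE:
                    cache.popitem(last=False)  # evict the least recently faulty file
        return set(cache)

    # Hypothetical history: (files touched by the commit, was it a bug fix?)
    history = [
        (["parser.c"], True),
        (["ui.c", "parser.c"], True),
        (["net.c"], False),
        (["net.c"], True),
    ]
    print(build_bug_cache(history))  # -> {'parser.c', 'ui.c', 'net.c'}

Real studies then evaluate such predictors against recorded project histories, which is precisely the model-predict-validate loop the article asks for.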
I'm sure that I could provide equally "sciency" examples of approaches to answering any of the questions posed in the article (so sorry, but nothing new there), as could any other fellow Google-savvy computer scientist (I'm thinking not many non-CS folk will even care to read the article above), but we won't, because we're busy doing science.
If math is a part, or even the core, of science in your mind, then you can say "computing" science is a science. Computing science and the applications derived from it should be seen as two interdependent parts, and it is the essence of computing that makes the former more of a "science." People are apt to say that the lambda calculus, type-system analysis (including category theory and formal logic), and algorithms and data structures are parts of math and science, but it is hard to think of the Internet, software, hardware architectures, and other such things as "science."
"we generally do not have a reliable ability to anticipate the states these systems can get into, their vulnerabilities, their performance, and ability to adapt to changing conditions."
I once got into a discussion about this with a French PhD who had written his doctoral thesis on formal methods. He said academia has pretty good ideas about it, but industry tends to ignore them.
It left me wondering whether formal methods should be required knowledge for all software developers. It's hard training, but so is the training for all engineering professions.
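For a concrete taste of what that training buys, even a toy explicit-state exploration, the idea underlying model checking, answers "what states can this system get into?" exhaustively rather than by testing a few paths. A minimal sketch, assuming a made-up two-process mutex as the system under check (my own illustration, not any particular tool):

    from collections import deque

    # Each process is "idle", "waiting", or "critical"; lock holds the index
    # of the process currently in its critical section, or None.
    def successors(state):
        pcs, lock = list(state[:2]), state[2]
        for i in (0, 1):
            if pcs[i] == "idle":                        # request the lock
                nxt = pcs.copy(); nxt[i] = "waiting"
                yield (nxt[0], nxt[1], lock)
            elif pcs[i] == "waiting" and lock is None:  # acquire the lock
                nxt = pcs.copy(); nxt[i] = "critical"
                yield (nxt[0], nxt[1], i)
            elif pcs[i] == "critical":                  # release the lock
                nxt = pcs.copy(); nxt[i] = "idle"
                yield (nxt[0], nxt[1], None)

    def check(init):
        """Breadth-first search over all reachable states, checking mutual exclusion."""
        seen, queue = {init}, deque([init])
        while queue:
            s = queue.popleft()
            assert not (s[0] == "critical" and s[1] == "critical"), s
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return len(seen)

    print(check(("idle", "idle", None)), "reachable states; invariant holds")

The state space here is tiny, of course; the engineering challenge industry faces is that real systems have state spaces that explode, which is exactly where the research the PhD was describing comes in.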
I think a strict separation of computer science and software engineering must be made. Computer science, i.e., theory of computation, databases and information theory, data structures, algorithms, artificial intelligence, etc., does maintain a reliable ability to anticipate the states a system can get into (state machines), its performance (analysis of algorithms, the Master Theorem), and its ability to adapt to changing conditions (artificial intelligence, etc.). It is in the application of the science (i.e., engineering) where uncertainty opens up, sloppiness occurs, and reliability is weakened. Just as designing a perfectly safe and sound bridge is impossible due to the pragmatic transcendence of theory into the physical realm, designing a perfectly reliable software system of even slight complexity is just as difficult.
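To make the performance point concrete, the Master Theorem gives closed-form asymptotics for the recurrences that divide-and-conquer algorithms produce. In its common statement, for constants $a \ge 1$ and $b > 1$ and a recurrence

    T(n) = a\,T(n/b) + f(n),

the solution is

    T(n) = \begin{cases}
      \Theta\bigl(n^{\log_b a}\bigr) & \text{if } f(n) = O\bigl(n^{\log_b a - \varepsilon}\bigr) \text{ for some } \varepsilon > 0, \\
      \Theta\bigl(n^{\log_b a} \log n\bigr) & \text{if } f(n) = \Theta\bigl(n^{\log_b a}\bigr), \\
      \Theta\bigl(f(n)\bigr) & \text{if } f(n) = \Omega\bigl(n^{\log_b a + \varepsilon}\bigr) \text{ and } a\,f(n/b) \le c\,f(n) \text{ for some } c < 1.
    \end{cases}

Merge sort, for instance, has $a = b = 2$ and $f(n) = \Theta(n)$; the second case applies and yields the familiar $\Theta(n \log n)$, a prediction made before a single line of code runs.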
"In the physical world, science is largely about models, measurement, predictions, and validation."
Not only that. In the physical world, science is also about discoveries of entirely new, unexpected things - like electricity (Galvani) or radioactivity (Becquerel) - or at least of things previously unseen or unconfirmed - like electromagnetic waves (Hertz). In this sense, I think, the inventors of Simula, APL, LISP, and relational databases did real science (and deservedly received Turing Awards for their discoveries). In the physical world, science is also about the collection and systematization of large amounts of factual information - like Lamarck or Darwin in biology - and I think that The Art of Computer Programming by D. Knuth is somehow comparable with the former (as for the latter - well, it takes more time to reach that stage) - so this is also an example of science (and again, a Turing Award was deservedly received for it).
Information - facts that can flow or persist
Knowledge - interpreted information
Phenomena - observable information
Architecture - designed knowledge
Science - systematized knowledge
Engineering - transformation of science or architecture into the practical
Computers - information engines
Computer Science, or information engine science, is a confusing, almost nonsensical term that should be avoided. Computers are devices; they are enablers that transform the logical into the physical. They have an engineering relationship to science.
Information Science, like Natural Science, is a scientific category, not a science in and of itself. The natural sciences abstract knowledge from preconstructed phenomena. The information sciences construct the phenomena, then abstract knowledge from them. Information Architecture (IA) is an architecture used to create a science. It is also a science used to create an architecture. The information sciences include IA, Management Science, Mathematics, and Philosophy.
Each of the information sciences is centered on a single concept: IA/information design, Mathematics/measurement, Management/process, and Philosophy/origin.
IA can design a body of knowledge bounded only by human imagination. Entire universes can be created. Even that is an understatement of what is possible.
I appreciate the thoughtful comments. I especially liked the observation that we should distinguish engineering from science. This does lead one to ask how the scientific aspects (formal aspects?) of computing can inform the applied science and engineering activities that try to make practical the insights that science produces.
vint cerf
Information Architecture is transformational. It is a science that is one level of abstraction removed from other sciences. The first derivative of Information Architecture (the science) transforms the science into an architecture. Information Architecture (the architecture) can be used to create an information science. It is a system for creating a design. It then takes that design and creates a system. A system can also be created through many iterations of learning (knowledge advancement). The body of knowledge gradually matures until it becomes vast and complete enough to be considered a science. Information Architecture designs a science. Applying Information Architecture to a body of knowledge accelerates the learning process. Science matures at an accelerated pace.
The science of IA and the architecture of IA have a recursive engineering relationship. I have observed that this engineering relationship is not intuitive. Those immersed in creating IA (the science) are removed from those creating IA (the architecture). The architects are not learned enough in the science. The scientists are not well versed in the application of the science to an architecture. The science is applied to the development of the practical (applications).
The architecture and the science mature more slowly as a result.
"The Sciences of the Artificial" [Simon] contains an interesting discussion and viewpoint on "science", which does not focus on models but on the absence of arbitrariness (i.e. within the applicability range of the scientific result). Newton's Laws are unavoidable (i.e. well below the speed of light and well above the microscopic). In contrast, a computer language (or a natural language) is one possibility out of zillions other designs. In Simon's viewpoint, the science in such a language design consists of the unavoidable aspects (only). Simon's own work starts from the bounds on human intelligence and communication abilities (which are unavoidable) and derives insights that have no arbitrariness short of the conditions under which they apply (typically a competitive and dynamic environment, implying pressures to adapt within finite time windows). A very old information technology also has this absence of arbitrariness : maps (not the choice of symbols but the representation of a corresponding reality). When two maps disagree, observation of the corresponding reality reveals which map is unavoidably correct, analogous to experiments proving Newton to be right and Aristotle to be wrong. The real world is full of sources of inevitability, which IT artefacts may model without introducing the arbitrariness that distinguishes science from design/engineering. Moreover, utilizing the capabilities to observe, sense, adapt and even actuate in IT systems, we can go far beyond ancient map technology while building software artefacts that share this absence of arbitrariness with Newton's Laws and Einstein's theories.