I hope readers will forgive me for plagiarizing the title of Vannevar Bush's famous essay,[a] which appeared 70 years ago in the pages of the Atlantic Monthly. The title is so apt, however, that I dare to use it. Two items arrived in my inbox recently: one is the Winter 2015 edition of Daedalus, the journal of the American Academy of Arts and Sciences, and the other is the recent book Great Principles of Computing[b] by Peter J. Denning, a former president of ACM and editor of Communications, and Craig H. Martell. Together, they provoked this essay.
The Daedalus issue focuses on neuroscience and spans topics from perception to the role of sleep, consciousness, and much else besides. The Great Principles book digs deeply into the fundamental principles underlying what we call computer science. Not surprisingly, the two deal with some overlapping notions. The article that caught my immediate attention in Daedalus is titled "Working Memory Capacity: Limits on the Bandwidth of Cognition." As I read it, I immediately thought of Denning's early work on the Working Set concept: a program has a natural span of memory requirement that must be met in real memory to avoid thrashing in the implementation of virtual memory. Apparently the human brain has a working set limitation of its own. In vision, it appears to be roughly two objects in visual space per brain hemisphere. That is, four in total, but limited to two supported by the visual cortex in each hemisphere. More than that, and our ability to recall and process questions about what we have seen diminishes, rather like the thrashing that happens when we do not have sufficient real memory to support the working set needed in virtual space. Denning and Martell address this under their "Principle of Locality."
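For readers less familiar with the working set idea, here is a minimal sketch of my own (not drawn from either publication) of Denning's definition: the working set W(t, τ) is the set of distinct pages a program has referenced in its most recent τ memory references. When real memory holds fewer frames than that set requires, page faults dominate and the program thrashes.

```python
# Illustrative sketch (not from the column): Denning's working set W(t, tau)
# is the set of distinct pages referenced in the last tau memory references.
# If real memory holds fewer frames than |W(t, tau)|, the program thrashes,
# the analogy being the brain's limited "working memory."

def working_set(reference_string, t, tau):
    """Return the set of distinct pages referenced in the window (t - tau, t]."""
    return set(reference_string[max(0, t - tau):t])

# A toy reference string with strong locality: the program loops over a small
# set of pages, then shifts to a new locality.
refs = [1, 2, 3, 1, 2, 3, 1, 2, 3, 7, 8, 9, 7, 8, 9]

for t in range(1, len(refs) + 1):
    ws = working_set(refs, t, tau=4)
    print(f"t={t:2d}  W(t,4)={sorted(ws)}  size={len(ws)}")
```

Running the sketch shows the working set staying small while the reference string exhibits locality, swelling briefly only as the program moves from one locality to the next.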
Despite the wonders of the human brain, which we are far from understanding, it does not appear to have a convenient way to grow its processing capacity, while we can achieve that objective with our artificial computers by adding memory or processors. This is not to say that adding more conventional computing devices necessarily produces an increase in effective computing for particular tasks. One has only to remember Fred Brooks' The Mythical Man-Month[c] to recall that this also applies to programming and the rate at which "finished" code can be produced. Adding more programmers does not necessarily produce more or better code. It might even produce worse code for lack of coordination, and Brooks draws attention to exactly this phenomenon.
Still, there appears to be a growing sense that computing may in fact benefit from adopting unconventional computational paradigms. The so-called neural chips, such as IBM's TrueNorth,[d] are indicative of this interesting trend, as is the rapidly evolving exploration of quantum computing. Adding more state by adding more neural nodes and interconnections seems to improve the scope and accuracy of pattern recognition, for example. One then begins to wonder whether there might be utility in combining neural computing with conventional computing to achieve something that neither might be very good at alone. This reminds me of work done by my thesis advisor, Gerald Estrin, in the mid-1950s and early 1960s on what he called "Fixed plus Variable" computing.[e] In these designs, a general-purpose computer was combined with a variable-structure computer that, like today's Field Programmable Gate Arrays (FPGAs), could be adapted to special-purpose computations. As I understood Estrin's work, the idea also extended to combining analog and digital computation, in which the former produced approximate solutions that could be refined further by digital methods.
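To make the hybrid pattern concrete, here is a small sketch of my own, not Estrin's actual design: a crude, fast estimator stands in for the analog or special-purpose stage, and a conventional digital iteration (Newton's method) refines its output to full precision.

```python
# Illustrative sketch of an "approximate then refine" pipeline in the spirit
# of fixed-plus-variable computing. The coarse stage is a stand-in for an
# analog or special-purpose unit; the refinement stage is ordinary digital
# computation.

import math

def coarse_sqrt(x):
    """Stand-in for the approximate stage: a rough guess at sqrt(x)
    obtained by halving the binary exponent of x."""
    mantissa, exponent = math.frexp(x)        # x = mantissa * 2**exponent
    return math.ldexp(1.0, exponent // 2)     # roughly 2**(exponent/2)

def refine_sqrt(x, guess, iterations=6):
    """Digital refinement stage: Newton's method for r**2 = x."""
    r = guess
    for _ in range(iterations):
        r = 0.5 * (r + x / r)
    return r

x = 12345.678
approx = coarse_sqrt(x)
exact = refine_sqrt(x, approx)
print(f"coarse={approx:.3f}  refined={exact:.6f}  error={abs(exact*exact - x):.2e}")
```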
These ideas lead me to wonder whether, in the context of today's very flexible and scalable cloud computing environment, one might find ways to harness a variety of computing methods, including neural networks, conventional scalar and vector processing, graphical processors, and perhaps even analog or quantum processing, to solve various special sorts of problems. Assuming for a moment that any of this actually makes sense, one is struck by the challenge of organizing the aggregate computation so that the results are reproducible, the appropriate intermediate results reach the right next computing step, and the ability to expand and contract the computing elements to match need is preserved.
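As a rough illustration of the organizational challenge, consider the following sketch of a tiny coordinator that routes each stage of a computation to a named class of processor and records enough provenance to reproduce a run. Every name in it (BACKENDS, run_pipeline, the stage labels) is hypothetical, not a real cloud API.

```python
# Hypothetical sketch: route pipeline stages to different classes of
# processor and log provenance (stage, backend, input hash) so a run can be
# reproduced. The "backends" are plain Python functions standing in for
# neural, vector, and scalar processing elements.

import hashlib
import json

BACKENDS = {
    "neural": lambda data: [round(v, 1) for v in data],  # e.g., a pattern recognizer
    "vector": lambda data: [v * 2.0 for v in data],      # e.g., a SIMD/GPU stage
    "scalar": lambda data: sum(data),                     # e.g., a conventional CPU stage
}

def run_pipeline(stages, data):
    """Run (backend_name, stage_name) pairs in order, logging provenance."""
    provenance = []
    for backend_name, stage_name in stages:
        digest = hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()[:12]
        data = BACKENDS[backend_name](data)
        provenance.append({"stage": stage_name, "backend": backend_name, "input_sha256": digest})
    return data, provenance

result, log = run_pipeline(
    [("neural", "classify"), ("vector", "scale"), ("scalar", "reduce")],
    [0.11, 0.27, 0.93],
)
print("result:", result)
print("provenance:", json.dumps(log, indent=2))
```

The point of the hashing is not security but reproducibility: if the same intermediate data reaches the same backend, the recorded digests let one verify that a rerun followed the same path.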
I hope readers who are far more experienced than I am in the design of complex computing systems will take time to voice their opinions about these ideas. In their book, Denning and Martell dive deeply into the importance of design in all aspects of computing. Without adherence to serious and deep design principles and attention to systems engineering, the usability and utility of computing systems of all kinds suffer. The Internet design adopted a variety of tactics, including layering, information hiding, and loose coupling, to achieve a scalable and evolvable system. There were only 400 computers on the Internet in 1983; today there are billions of them. Design and systems engineering should have priority places in the curriculum of computer science. They are the beginning of everything and, in some sense, the end as well.
a. V. Bush. "As We May Think." The Atlantic Monthly (July 1945); http://theatln.tc/1ahQVW2
b. P.J. Denning and C.H. Martell. Great Principles of Computing. MIT Press, Cambridge, MA, 2015; ISBN 978-0-262-52712-5.
c. http://en.wikipedia.org/wiki/The_Mythical_Man-Month
d. http://www.research.ibm.com/articles/brainchip.shtml
e. G. Estrin and C.R. Viswanathan. "Organization of a 'fixed-plus-variable' structure computer for computation of eigenvalues and eigenvectors of real symmetric matrices." JACM 9, 1 (Jan. 1962).