Communications of the ACM

Embedding the Internet: Introduction


To most people, today's Internet is an astonishing technological marvel. But future networked computing systems will achieve a degree of sophistication and functionality that will make today's Internet appear primitive in comparison. While the Internet creates a new cyberspace separate from our physical world, technological advances will enable ubiquitous networked computing in our day-to-day lives. The power of this ubiquity will follow from the embedding of computation and communications in the physical world—that is, embedded devices with sensing and communication capabilities that enable distributed computation.

With this ability, we will begin to see the application of computing technologies in settings where they are unusual today: device and appliance networking in the home; faithful capture of scientific experiments in the laboratory; and automated full-time monitoring of patient health. We already see the elements of the technological advances necessary for such applications: faster, smaller, power-conserving processors; larger and cheaper computer memory; and early software tailored for such devices. Some of these technologies are, unbeknownst to us, already ubiquitous in our lives; an astounding 98% of all processors on the planet are not in traditional desktop computer systems but in household appliances, vehicles, and machines on factory floors. If we were to add two simple technologies—reliable wireless communication and sensing and actuation functions—very interesting application scenarios would become possible.

Gaetano Borriello of the University of Washington and Roy Want of the Xerox Palo Alto Research Center point out that these applications are closer to realization than even an educated observer might imagine. Industry consortia, such as Bluetooth and HomeRF, are already designing standards for wireless networks of embedded devices. Durable and accurate microelectromechanical systems (MEMS)-based sensors have been added to automobiles in recent years. Software vendors are advancing naming and directory service standards for networks of embedded computers. The Web already provides a standard interface that can be leveraged to integrate data harvested from these embedded systems.


When there are hundreds or thousands of computers per human, radically new challenges await us.


Such applications, writes David Tennenhouse, chief scientist of Intel Corp. and former director of the Defense Advanced Research Projects Agency's Information Technology Office, will get us to the human/machine/network "breakpoint" at which the number of computers approximately equals the number of human beings on the planet. Beyond this point, when, say, there are hundreds or thousands of computers per human, radically new challenges would await us. We would be able to "get physical" (accurately and closely instrument our physical environments), "get real" (do away with computing paradigms imposed by interactive computing), and "get out" (eliminate humans from low-level decision-making responsibilities imposed by today's interactive systems). Tennenhouse calls on the computer science community to focus some of its research and teaching efforts on the questions along the way toward physically embedding computing.

"Distributed microsensing" is one example of a situation well beyond the human/machine/network breakpoint. Future embedded devices are likely to have multimodal environmental sensing abilities, including acoustic and seismic sensing, even miniature cameras. Networking these sensors—empowering them to coordinate themselves on a larger sensing task—will revolutionize information gathering and processing in many of these situations. Large-scale, dynamically changing, robust sensor colonies would then be deployable in such inhospitable physical environments as remote geographic regions and toxic urban locations. They will also enable low-maintenance sensing in more benign, but less accessible, environments, including smart homes, large industrial plants, aircraft interiors, and more.

G.J. Pottie and W.J. Kaiser, experts in low-power wireless sensing and cofounders of Sensoria Corp., emphasize that an architecture in which the sensor network is also a distributed computer system makes more sense than today's centralized sensor systems from several perspectives, including signal processing, power efficiency, and theoretical communication limits. But designing distributed sensor networks is not without challenges. From low-power circuit design to networking protocols that support robust, large-scale coordinated sensing, sensor networks open up a variety of fascinating research areas.
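As a concrete, if simplified, illustration of the power argument, the following sketch (our own, not drawn from Pottie and Kaiser's systems; the node count, sample count, and detection threshold are assumed values) compares the radio traffic of a centralized design, in which every node streams raw samples to a base station, with an in-network design, in which each node processes its own samples and transmits only detected events. Since communication typically dominates a sensor node's power budget, fewer messages translate directly into longer battery life.

    # Hypothetical comparison of message counts: centralized collection of raw
    # samples versus in-network event detection. All numbers are illustrative
    # assumptions, not measurements from any fielded system.
    import random

    NUM_NODES = 50          # assumed size of the sensor colony
    SAMPLES_PER_NODE = 200  # assumed number of samples each node collects
    EVENT_THRESHOLD = 0.95  # assumed local detection threshold

    random.seed(0)
    readings = [[random.random() for _ in range(SAMPLES_PER_NODE)]
                for _ in range(NUM_NODES)]

    # Centralized: every raw sample becomes a radio message to the base station.
    centralized_messages = NUM_NODES * SAMPLES_PER_NODE

    # Distributed: each node detects events locally and transmits only those.
    distributed_messages = sum(
        sum(1 for r in node_readings if r > EVENT_THRESHOLD)
        for node_readings in readings
    )

    print(f"centralized collection: {centralized_messages} messages")
    print(f"in-network processing:  {distributed_messages} messages")

The same reasoning extends beyond thresholding: the more signal processing a node can perform near the physical phenomenon, the less raw data it must push over the radio.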

For example, what if these sensors were autonomously mobile? This additional degree of freedom would open up very interesting possibilities. A swarm of robots could disperse rapidly to map a toxic spill or an urban disaster area. A human operator could interact with the robot ensemble to selectively gather detailed information, helping devise a recovery strategy. A robotics perspective involves several notable challenges: a robot has to be able to compute its own position; teams of robots have to be able to efficiently disperse and monitor the physical region under surveillance; and multiple robots might collectively recognize environmental features and produce maps to guide rescue teams.

The key challenge in designing such capabilities, write Gaurav Sukhatme and Maja Matarić, roboticists in the University of Southern California's Computer Science Department, is dealing with uncertainty in sensor readings and unpredictability in the behavior of robots and other entities in the environment. If not accounted for properly, uncertainty could trigger unwanted behaviors in the robot swarm, such as all the robots finding themselves clustered in one location. The approach Sukhatme and Matarić espouse—decentralized coordination with local decision making to achieve the intended global goal—is a design theme central to several articles in this issue.
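A toy simulation suggests how purely local rules can produce the intended global behavior. In the sketch below (our own illustration, not Sukhatme and Matarić's algorithm; the sensing radius, step size, and robot count are assumed values), every robot sees only the neighbors within its sensing range and repeatedly steps away from their centroid. No robot has global knowledge, yet a swarm started in a tight cluster spreads out rather than collapsing into one location.

    # Decentralized dispersal sketch: each robot reacts only to neighbors it can
    # sense locally. Parameters are illustrative assumptions.
    import math
    import random

    NUM_ROBOTS = 20
    SENSING_RADIUS = 2.0   # assumed local sensing range
    STEP = 0.1             # assumed per-iteration movement
    STEPS = 200

    random.seed(1)
    # Start all robots clustered near the origin (the failure mode to avoid).
    positions = [(random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5))
                 for _ in range(NUM_ROBOTS)]

    def neighbors(i, pts):
        """Indices of robots within sensing range of robot i (local information only)."""
        xi, yi = pts[i]
        return [j for j, (xj, yj) in enumerate(pts)
                if j != i and math.hypot(xi - xj, yi - yj) < SENSING_RADIUS]

    for _ in range(STEPS):
        new_positions = []
        for i, (x, y) in enumerate(positions):
            nbrs = neighbors(i, positions)
            if not nbrs:
                new_positions.append((x, y))  # no visible neighbors: stay put
                continue
            # Step directly away from the centroid of visible neighbors.
            cx = sum(positions[j][0] for j in nbrs) / len(nbrs)
            cy = sum(positions[j][1] for j in nbrs) / len(nbrs)
            dx, dy = x - cx, y - cy
            norm = math.hypot(dx, dy) or 1.0
            new_positions.append((x + STEP * dx / norm, y + STEP * dy / norm))
        positions = new_positions

    # The smallest pairwise distance grows as the swarm disperses.
    min_dist = min(math.hypot(a[0] - b[0], a[1] - b[1])
                   for i, a in enumerate(positions) for b in positions[i + 1:])
    print(f"minimum pairwise distance after dispersal: {min_dist:.2f}")

The sketch sidesteps the hard parts named above, notably sensing noise and position uncertainty, but it shows the shape of the approach: simple local decisions, replicated across the team, yielding a useful global outcome without a central controller.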

Most of these visions, which are likely to be achieved in the next five to 10 years, are predicated on predictable advances in chip fabrication and radio and sensor design. A more radical presupposition is the emergence of microfabrication techniques that allow simple computation, communication, and sensing. But what can we do with embedded devices the size of grains of sand? And what can we do with cellular engineering techniques allowing us to fabricate molecular inverters? If, for example, individual cells had simple computation and sensing abilities, would we be able to network them in interesting ways? Such advances could lead to computationally enhanced materials, such as smart paint, that report weather conditions and unusual building stress.

This is all fine, write Hal Abelson et al., most from the MIT Artificial Intelligence Laboratory, but how do we program smart paint? They argue that such "amorphous computing" forces us to fundamentally rethink our programming abstractions, as well as how we organize computing elements. In particular, they contend we should program each individual element to perform exceedingly simple tasks (such as maintaining and propagating information "markers") and exchange information only with neighboring elements. This simple computation and local communication can, if designed properly, lead to predictable ensemble behavior, even in the presence of failures of individual elements.
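One way to see what such programming might look like is a hop-count "marker" that spreads outward from a single seed element. The sketch below is our own simplified rendering of that general idea, not the MIT group's language or system; the grid size and neighbor topology are assumptions. Every element runs the same exceedingly simple program and exchanges information only with its immediate neighbors, yet the ensemble converges to a coherent global gradient.

    # Marker propagation across a sheet of identical computing elements.
    # Each element keeps the smallest hop count heard from a neighbor, plus one.
    GRID = 8  # assumed 8x8 sheet of elements ("smart paint" particles)

    def neighbors(x, y):
        """Adjacent elements only: no element communicates beyond its neighbors."""
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < GRID and 0 <= ny < GRID:
                yield nx, ny

    # marker[x][y] is each element's local state: hops from the seed, or None.
    marker = [[None] * GRID for _ in range(GRID)]
    marker[0][0] = 0  # the seed element injects the marker

    changed = True
    while changed:  # repeat local update sweeps until the marker field stabilizes
        changed = False
        for x in range(GRID):
            for y in range(GRID):
                best = min((marker[nx][ny] for nx, ny in neighbors(x, y)
                            if marker[nx][ny] is not None), default=None)
                if best is not None and (marker[x][y] is None or best + 1 < marker[x][y]):
                    marker[x][y] = best + 1
                    changed = True

    # The resulting field of hop counts is predictable ensemble behavior arising
    # from an exceedingly simple per-element program.
    for row in marker:
        print(" ".join(f"{m:2d}" for m in row))

Gradients of this kind are building blocks, not applications; the point of the sketch is only that global structure can emerge from identical elements doing nothing more than comparing notes with their neighbors.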

Regardless of which of these complementary visions of a future ubiquitous computing universe emerges first, and when, we expect that computing, communications, and the world at large will be profoundly changed by the impending revolution in embedded Internet devices.


Authors

Deborah Estrin ([email protected]) is a professor of computer science at the University of Southern California in Los Angeles and a project leader in USC's Information Sciences Institute.

Ramesh Govindan ([email protected]) is a research assistant professor at the University of Southern California in Los Angeles and a project leader in USC's Information Sciences Institute.

John Heidemann ([email protected]) is a research assistant professor at the University of Southern California in Los Angeles and a project leader in USC's Information Sciences Institute.



©2000 ACM  0002-0782/00/0800  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.



 
