Communications of the ACM

Inside Risks: What to Know About Risks


In this column, we assert that deeper knowledge of fundamental principles of computer technology and their implications will be increasingly essential in the future for a wide spectrum of individuals and groups, each with its own particular needs. Our lives are becoming ever more dependent on understanding computer-related systems and the risks involved. Although this may sound like a motherhood statement, wise implementation of motherhood is decidedly nontrivial—especially with regard to risks.

Computer scientists who are active in creating the groundwork for the future need to understand system issues in the large, including the practical limitations of theoretical approaches. System designers and developers—including those responsible for the human interfaces used in inherently riskful operational environments that must be trusted—need broader and deeper knowledge; interface design is often critical. Particularly in those systems that are not wisely conceived and implemented, operators and users of the resulting systems also need an understanding of certain fundamentals. Corporate executives need an understanding of various risks and countermeasures. In each case, our knowledge must increase dramatically over time, to reflect rapid evolution. Fortunately, the fundamentals do not change as quickly as the widget of the day, which suggests an important emphasis for education and ongoing training.

An alternative view suggests that many technologies can be largely hidden from view, and that people need not understand (or indeed, might prefer not to know) the inner workings. David Parnas's early papers on abstraction, encapsulation, and information hiding are important in this regard. Although masking complexity is certainly possible in theory, in practice we have seen too many occasions (for examples, see the RISKS archives) in which inadequate understanding of the exceptional cases resulted in disasters. The complexities arising in handling exceptions apply ubiquitously: to defense, medical systems, transportation systems, personal finance, and security; to our dependence on critical infrastructures that can fail; and to anticipating the effects of such exceptions in advance.

The importance of understanding the idiosyncrasies of mechanisms and human interfaces, and indeed the entire process, is illustrated by the 2000 Presidential election—with respect to hanging chads, dimpled chads (due to stuffed chad slots), butterfly ballot layouts, inherent statistical realities, and the human procedures underlying voter registration and balloting. Clearly, the election process is problematic, and the technology and the surrounding administration must be considered as parts of the overall system. Looking into the future, a new educational problem will arise if preferential balloting becomes more widely adopted, whereby voters rank the competing candidates and the votes for the candidate with the fewest first-choice votes are iteratively reallocated according to those rankings. This concept has many merits, although it would certainly further complicate ballot layouts!
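To make that reallocation concrete, here is a minimal sketch of instant-runoff counting in Python, assuming each ballot is simply an ordered list of candidate names from most to least preferred; the function name, data layout, and arbitrary tie-breaking are illustrative assumptions rather than a specification of any actual balloting system.

from collections import Counter

# A sketch of preferential (instant-runoff) counting: the candidate with the
# fewest first-choice votes is repeatedly eliminated, and those ballots are
# recounted for their next surviving choice, until someone holds a majority.
def instant_runoff(ballots):
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate;
        # ballots with no surviving candidates are exhausted and dropped.
        tallies = Counter(
            next(c for c in ballot if c in remaining)
            for ballot in ballots
            if any(c in remaining for c in ballot)
        )
        total = sum(tallies.values())
        leader, leader_votes = tallies.most_common(1)[0]
        if 2 * leader_votes > total or len(remaining) == 1:
            return leader
        # Eliminate the lowest-vote candidate (ties broken arbitrarily here).
        remaining.discard(min(tallies, key=tallies.get))

# Example: "B" has the fewest first choices and is eliminated; its ballot
# transfers to "C", which then holds a majority of the five ballots.
ballots = [["A", "B", "C"], ["A", "C", "B"], ["B", "C", "A"],
           ["C", "B", "A"], ["C", "B", "A"]]
print(instant_runoff(ballots))  # prints C

Even this toy version exposes policy choices (exhausted ballots, tie-breaking, majority thresholds) that voters, election administrators, and ballot designers would all need to understand.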

Thus, computer-related education is vital for everyone. The meaning of the Latin word "educere" (to educate) is literally "to lead forth." However, in general, many people do not have an adequate perception of the risks and their potential implications. When, for example, the media tell us that air travel is safer than automobile travel (on a passenger-mile basis, perhaps), the comparison may be less important than the concept that both could be significantly improved. When we are told that electronic commerce is secure and reliable, we need to recognize the cases in which it isn't.

With considerable foresight and wisdom, Vint Cerf has repeatedly said that "The Internet is for everyone." The Internet can provide a fertile learning medium for anyone who wants to learn, but it also creates serious opportunities for the unchecked perpetuation of misinformation and counterproductive learning that should eventually be unlearned.

In general, we learn what is most valuable to us from personal experience, not by being force-fed lowest-common-denominator details. In that spirit, it is important that education, training, and practical experiences provide motivations for true learning. For technologists, education needs to have a pervasive systems orientation that encompasses concepts of software and system engineering, security, and reliability, as well as stressing the importance of suitable human interfaces. For everyone else, there needs to be much better appreciation of the sociotechnical and economic implications—including the risk issues. Above all, what is most needed is a sense of the bigger picture.

Footnotes

For previous columns relating to education, see February 1996 (research), August 1998 (computer science and software engineering), and October 1998 (risks of e-education); the first two are by Peter Denning, the first and third by PGN. PGN's open notes for a Fall 1999 University of Maryland course on survivable, secure, reliable systems and networks, and a supporting report, can be found at www.CSL.sri.com/neumann.


©2000 ACM  0002-0782/01/0200  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2001 ACM, Inc.