
Communications of the ACM

Inside Risks

Trustworthy Systems Revisited


System trustworthiness is in essence a logical basis for confidence that a system will predictably satisfy its critical requirements, including information security, reliability, human safety, fault tolerance, and survivability in the face of wide ranges of adversities (such as malfunctions, deliberate attacks, and natural causes).

Our lives increasingly depend on critical national infrastructures that depend in varying degrees on the dependable behavior of computer-communication resources, including the Internet and many of its attached computer systems. Unless certain information system resources are trustworthy, our critical systems are at serious risk from failures and subversions. Unfortunately, for many of the key application domains, the existing information infrastructures are lacking in trustworthiness. For example, power grids, air-traffic control, high-integrity electronic voting systems, the emerging DoD Global Information Grid, the national infrastructures, and many collaborative and competitive Internet-based applications all need systems that are more trustworthy than we have today.

In this column, we have frequently considered risks associated with such systems and what is needed to make them more trustworthy. This month's column takes a higher-level and more intuitive view by considering analogies with our natural environment—expectations for which are rather similar to expectations for trustworthy information systems. For example, pure air and uncontaminated water are vital, as are the social systems that ensure them.

Although poorly chosen analogies can be misleading, the analogy with our natural environment seems quite apt. Each of the following bulleted items is applicable to both trustworthy information systems and natural environments.

  • Their critical importance is generally underappreciated until something goes fundamentally wrong—after which undoing the damage can be very difficult if not impossible.
  • Problems can result from natural circumstances, equipment failures, human errors, malicious activity, or a combination of these and other factors.
  • Dangerous contaminants may emerge and propagate, often unobserved. Some of these may remain undetected for relatively long periods of time, whereas others can have immediately obvious consequences.
  • Your well-being may be dramatically impeded, but there is not much you as an individual can do about aspects that are pervasive—perhaps international or even global in scope.
  • Detection, remediation, and prevention require cooperative social efforts, such as public health and sanitation efforts, as well as technological means.
  • Up-front preventive measures can result in significant savings and increases in human well-being, ameliorating major problems later on.
  • Once something has gone recognizably wrong, countermeasures are typically fruitless—too little, too late.
  • As we noted in the June 2004 column, long-term thinking is relatively rare. There is frequently little governmental or institutional emphasis on prevention of bad consequences.
  • Many of the arguments against far-sighted planning and proactive remediation are skewed, being based on faulty, narrowly scoped, or short-sighted reasoning.
  • Commercial considerations tend to trump human well-being, with business models sometimes considering protection of public welfare to be detrimental to corporate and enterprise bottom lines.

In some contexts, pure water is becoming more expensive than oil. Fresh air is already a crucial commodity, especially for people with severe breathing and health problems. Short- and long-term effects of inadequately trustworthy information systems can be similarly severe. Proactive measures are as urgently needed for system trustworthiness as they are for breathable air, clean water, and environmental protection generally. It is very difficult to remediate computer-based systems that were not designed and implemented with trustworthiness in mind. It is also very difficult to remediate serious environmental damage.

Anticipating and responding to compelling long-term needs does not require extraordinary foresight, whether for air, water, reversing global warming, or trustworthy systems upon which to build our infrastructures. Our long-term well-being—perhaps even our survival—depends on our willingness to consider the future and to take appropriate actions.


Author

Peter G. Neumann ([email protected]) moderates the ACM Risks Forum. This column was inspired by an article by Tim Batchelder, "An Anthropology of Air," Townsend Letter for Doctors and Patients (Nov. 2005): "Because [air] is negative space, it is difficult to see the value in preserving it."


©2006 ACM  0001-0782/06/0200  $5.00


