Last month we discussed risks in trusting entities that might not actually be trustworthy. And yet, people use flawed systems that may cause more security and reliability problems than they solve. There are various reasons why untrustworthy mass-market software might be used so extensively, even if the source code is proprietary and the vendor can arbitrarily download questionable software changes without user intervention. Sometimes this is a path of least resistance, with few perceived alternatives. Or it has the appearance of saving money in the short term. In some cases it is mandated organizationally, ostensibly to simplify procurement, administration, and maintenance, or because of a desire to remain within the monolithic mainstream. Often security, reliability, and the risks of networking are considered of lesser importance.
There is a misplaced trust that the free market will provide a cure. However, irrespective of any reasons why people might want to use flawed software, in certain cases it might be wiser not to use it, especially where the risks are considerable.
In my fourth testimony (August 2001) in five years for committees of the U.S. House of Representatives, I made the following statement: "Although there have been advances in the research community on information security, trustworthiness, and dependability, the overall situation in practice appears to continually be getting worse, relative to the increasing threats and risks, for a variety of reasons. The information infrastructure is still fundamentally riddled with security vulnerabilities, affecting end-user systems, routers, servers, and communications; new software is typically flawed, and many old flaws still persist; worse yet, patches for residual flaws often introduce new vulnerabilities. There is much greater dependence on the Internet, for governmental use as well as private and corporate use. Many more systems are being attached to the Internet all over the world, with ever-increasing numbers of users, some of whom have decidedly ulterior motives. Because so many systems are so easily interconnectable, the opportunities for exploiting vulnerabilities and the ubiquity of the sources of threats are also increased. Furthermore, even supposedly standalone systems are often vulnerable. Consequently, the risks are increasing faster than the amelioration of those risks."
The situation seems still worse in 2003, especially in mass-market software. The continuing cascade of viruses, worms, and system crashes raises the level of inconvenience to users and institutions. The incessant flow of identified vulnerability reports and the further existence of flaws that are not publicly known suggest serious problems. The continual need to install thousands of patches in mass-market software (and the iterative problems those patches sometimes cause) suggests we are not converging. Putting the blame on inadequate system administration seems fatuous. The August 2003 exploitations of Microsoft flaws (the Blaster worm and the SoBig virus) are examples of the endemic vulnerabilities in such systems that can be exploited. Unfortunately, too many people seem to be oblivious to the underlying security problems.
Suggestions that we need to raise the bar may be countered with the argument that past attacks have not really been serious, and we have never had any pervasive disasters of information system security, so why should we worry? However, it is precisely because past events have been relatively benign (compared with the damage they could have done) that there should be greater concern. Furthermore, a general overemphasis on short-term costs allows long-term concerns to be ignored.
The Free Software/Open Source movements have been touted as possible alternatives to the inflexibilities of closed-source proprietary code. Indeed, GNU-Linux/BSD Unix variants are gaining considerable credibility, and are seemingly less susceptible to malware attacks. However, by itself, availability of source code is not a panacea, and sound software engineering is still essential. Even if an entire system has been subjected to extremely rigorous open evaluation and stringent operational controls, that may not be enough to ensure adequate behavior.
Many users have grown accustomed to flaky software, perhaps because they do not have to meet critical requirements (as in nuclear power control, power distribution, and flight and air-traffic control) and suffer no liability for disasters. Perhaps it is time to follow the adage of "Just Say No" to bad software that is seriously unsecurable, and to demand that software development be dramatically improved.