
Communications of the ACM

BLOG@CACM

The Virus Analogy and Validation


Robin K. Hill, University of Wyoming

The current pandemic highlights the analogy that gave rise to the use of the word "virus," borrowed from biology, to label a malicious program that attacks computer systems. The situation moves us to look into that analogy, as another way to compare nature and artifact, and as an excuse to raise more abstract questions. We are moved also to stipulate that our mastery of both the biological and computational forms is shallow, and to invite other and better observations to follow. See Apvrille and Lovet [Apvrille] for greater depth and intriguing crossover speculation, Weis [Weis] for yet more intriguing comparison, and Wenliang Du's website for detailed virus examples [Du], which constitute dramatic reading for coders.

A virus is generally not regarded as a living organism, but it is sometimes described as (similar to) software. When the first self-replicating computer programs made the rounds, they were experiments or pranks [Wiki:Creeper]; for most, the point was solely reproduction. An early computer worm was beneficent, but escaped control [Chen]. We distinguish computer viruses from computer worms by the profligate scale of replication: viruses generate a broadcast of copies rather than a chain of copies. The obvious points of analogy across both types of virus include that viruses are tiny, invading a host much greater in size and complexity, without an overt signal; and that viruses disrupt some process in the host. Neither the computer virus nor the biological virus necessarily does damage. In biology, self-replication is an end, not a means, making the damage a side effect. In the modern computer virus, the end is likely to be the action of a payload of malicious code. Now the term "virus," in both environments, connotes an intrusive and damaging force carrying dangerous baggage.

To explore some points of analogy systematically, consider access: How is virus entry accomplished? Computer viruses look for an opening by probing known vulnerabilities. If one is found, maleficent code is injected. This is quite like the organic version. Consider gain: What does the virus get out of this, and how? The virus gets more virus, and the means of reproduction is the same: self-replication. Note the correspondence to the Unix system call fork(), which spawns a new process by replicating the current process. History tells us that this happened because it was easy: "...it seems reasonable to suppose that it exists in Unix mainly because of the ease with which fork could be implemented without changing much else" [Ritchie]. The heuristic, across both types: to start a new working structure, just copy the working structure on hand.
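To make the correspondence concrete, here is a minimal sketch of self-replication via the standard POSIX fork() call; the program and its printed messages are invented for illustration and are not drawn from any actual virus:

    /* Self-replication in miniature: fork() spawns a child process
       that is a copy of the running parent. Illustrative sketch only. */
    #include <stdio.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = fork();              /* duplicate the current process */
        if (pid < 0) {
            perror("fork");              /* replication failed */
            return 1;
        }
        if (pid == 0)                    /* both copies resume here */
            printf("child %d: a copy of parent %d\n", (int)getpid(), (int)getppid());
        else
            printf("parent %d: spawned copy %d\n", (int)getpid(), (int)pid);
        return 0;
    }

A virus exercises the same heuristic on a grander scale, copying its own code into new hosts rather than new processes.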

Consider pathology, the means of damage. A virus damages the host body by depleting cell resources, consumed by the virus; by bursting cell walls; or by generating toxic byproducts. Does each of these have an analog? Sure: Denial of Service; breaching buffer boundaries or opening a reverse shell; interference with the operating system, degrading its protection of system resources such as CPU cycles, files, and ports [SciAm], [Du]. We could consider defense, the host's prevention or cure mechanism, that is, the action taken if the host somehow notices that something is wrong. That panoply of fascinating mechanisms is beyond our expertise, but it is clear that vaccination is one of them, leading to counter-measures such as mutation. Both organic and computer viruses can mutate quickly. But mutation in organics is a quirk, a random unguided alteration, whereas mutation in computer programs is human-directed. And the brute-force options for repair and defense available in computing are off limits to the human host: we can't reboot a body to reset its memory, let alone re-install a clean operating system.
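The boundary breach can be seen in the classic textbook overflow, of the kind catalogued in Du's labs [Du]. A deliberately vulnerable sketch, with the function name and buffer size invented for illustration, assuming the usual unchecked C string copy:

    /* Classic unchecked copy: input longer than the buffer spills into
       adjacent stack memory, where a crafted payload can hijack control.
       Deliberately vulnerable sketch for illustration only. */
    #include <string.h>

    void greet(const char *name) {
        char buf[16];          /* fixed-size buffer (hypothetical) */
        strcpy(buf, name);     /* no bounds check: overflows when name exceeds 15 chars */
    }

    int main(int argc, char *argv[]) {
        if (argc > 1)
            greet(argv[1]);    /* attacker-controlled input */
        return 0;
    }

The cell-wall comparison holds up: the buffer's boundary, like the membrane, is all that separates the host's machinery from the intruder's payload.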

We've talked of viruses as troops in a war game, initiating and reacting, as they take over cells for the purpose of replication. But wait: is there a purpose? All we can say for sure is that viruses insert genetic material into cells, and those cells then generate more viruses. Is there a struggle? Is control being deliberately wrested from the cell, or is there actually no agent involved that gives a hoot, no intention at all? The vocabulary of aggression in cell science (the layperson's version) reflects our human phenomenology, projected onto what we see. It may be fair, or it may be distorted. It may be way off the mark; the cells might be "fulfilled," an odd thought. But why is it less odd to say that the cells are "defeated"? Why use the language of attack, when the language of hospitality might model the process just as well, and the language of indifference even better? Above, we said that in biology, damage is a "side effect," which assumes some kind of intention. We now question that assumption. Other natural forces bring about change; the wind threatens, intrudes, and damages. But to speak of its intention is only poetic.

In computing, similarly, a computer virus executes in order to create more copies of its code and then disseminate them... Does that statement of the analogy, through the phrase "in order to," lead us to attribute volition to the computer virus, inaccurately? We claim that it is misleading to speak as if the organic virus has volition. Imbued with purpose by its programmer, however, a computer virus does exhibit hostility. But wait: that means that the computer virus is more like the organic virus than the organic virus itself!

Of course, the question of volition, seen here on a small scale, bears on larger questions in the philosophy of computing as well, those in artificial intelligence and cognitive science connected to intentionality and consciousness. That inquiry could be aided by a new locution for the computer virus, which might even inform a new locution for the organic virus. My earlier piece on the articulation of responsibility [Hill2018] called for such locutions. Programs do not make decisions. Because it looks like they do, we need a way to talk about what they actually do that is not misleading. Viruses do not "intend" in any meaningful way; they just behave as if they were intending. Or perhaps they don't even "behave" in any particular way; they just exhibit actions that intentional beings would exhibit if they had as a goal the end result reached by the virus. We are so dependent on the vocabulary of intention and volition that we have no other non-awkward options.

Analogies between natural and computational phenomena, tight or stretched, have formed the subjects of several pieces in this space [Hill2016, Hill2017a, Hill2017b]. In the case happening right before our very eyes, we see that the analogy between the biological virus and the computer virus exhibits strengths and weaknesses, and may offer further possibilities. Points of positive similarity may not be due to cause and effect, but rather to effects of some common cause, something like the general vulnerability of processes that use input and output. We might even propose that the proper analogue to the biological microbe is the programmer-code pair, a self-contained system spanning the program and the programmer, enjoying some kind of collective semi-animate agency. We can turn to philosophy to ask: do agents have to be individual and human? That's debatable [Schlosser], but the debate lies beyond the scope of this inquiry.

But wait. The really interesting question is what a strong, successful analogy, matching computer viruses to organic viruses, would mean. Does it mean that some common notions, say, the general vulnerability of input (as mentioned above), or entry through a defined interface, or subversion of an external body's resources, are somehow universal? If so, have we gained anything beyond a pleasant self-validation? But wait! What does validation get us, anyway? Are we computer scientists to congratulate ourselves when our artifacts look like nature? What's so great about that? Or is there something great about that? If so, what's not so great about artifice?

References

[Apvrille] Apvrille, A. and Lovet, G. 2012. An Attacker's Day into Human Virology. The appendix comprises a table of vocabulary analogs.

[Chen] Chen, Thomas and Robert, Jean-Marc. 2004. The Evolution of Viruses and Worms. In Chen, W. (Ed.), Statistical Methods in Computer Security. Boca Raton: CRC Press.

[Du] Du, Wenliang. Undated. Computer & Internet Security: Videos, Slides, Problems and Labs. Website for the book Computer & Internet Security: A Hands-on Approach, Second Edition. 2019.

[Hill2016] Hill, Robin K. 2016. Fiction as Model Theory. Blog@CACM. December 30, 2016.

[Hill2017a] Hill, Robin K. 2017. Operating Systems as Possible Worlds. Blog@CACM. April 29, 2017.

[Hill2017b] Hill, Robin K. 2017. Human Acts and Computer Apps. Blog@CACM. November 28, 2017.

[Hill2018] Hill, Robin K. 2018. Articulation of Decision Responsibility. Blog@CACM. May 21, 2018.

[Ritchie] Ritchie, Dennis M. 1980. The evolution of the Unix time-sharing system. Proceedings of the Symposium on Language Design and Programming Methodology. Springer.

[Schlosser] Schlosser, Markus. 2019. Agency. The Stanford Encyclopedia of Philosophy, Winter 2019 Edition. Edward N. Zalta, editor.

[SciAm] Various experts. 1997. When and how did the metaphor of the computer 'virus' arise? Scientific American, online, September 2, 1997. The article lists answers to the given question, usually identifying Fred Cohen, a student of Adleman at the University of Southern California.

[Weis] Weis, Or. 2020. What if it was a software bug/virus? Cyber vs. COVID-19: A thought experiment. Rookout.

[Wiki:Creeper] Wikipedia contributors. 2020. Creeper (program). Wikipedia, The Free Encyclopedia. Retrieved May 29, 2020.

 

Robin K. Hill is a lecturer in the Department of Computer Science and an affiliate of both the Department of Philosophy and Religious Studies and the Wyoming Institute for Humanities Research at the University of Wyoming. She has been a member of ACM since 1978.


 
