
Communications of the ACM

Inside Risks

Digital Evidence


Those of you concerned with privacy issues and identity theft will be familiar with the concept of dumpster diving. Trash often reveals the dealings of an individual or a corporation. The risks of revealing private information through the trash have led to a boom in the sale of paper shredders (and awareness of the risks of reassembling shredded documents). However, how many of us take the same diligent steps with our digital information?

The recovery of digital documents in the Enron case and the use of email in the Microsoft antitrust case have brought these concerns to the fore. For example, we are all more aware of the risks inherent in the efficient ("lazy") method of deleting files used by modern operating systems, in which the file's directory entry is discarded but its data remains on the drive until the space is reused.

There will certainly be an increase in the sales of "wiper" software following this increased awareness, but that's not the end of the story. Overwriting data merely raises the bar on the sophistication required of the forensic examiner. To ensure reliable data storage, the tracks on hard-drive platters are wider than the influence of the heads, with a gap (albeit small) between tracks. Thus, even after wiper software has been applied, there may still be ghosts of the original data, just partially obscured.
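The contrast between lazy deletion and wiping can be sketched in a few lines. The following Python sketch is illustrative only, not a forensically sound tool; the function name and default pass count are my own choices, and on journaling or copy-on-write file systems (or flash media with wear leveling) overwriting "in place" gives no guarantee that the original blocks are ever touched.

```python
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file in place before unlinking it.

    A plain os.remove() merely drops the directory entry; the data
    blocks remain on disk until reused. Overwriting raises the bar
    for an examiner, but track-edge remanence may still leave ghosts.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # random pass, repeated
            f.flush()
            os.fsync(f.fileno())  # force each pass out to the device
    os.remove(path)
```

Even a multi-pass wiper of this sort only addresses the file's current blocks, not earlier copies scattered through swap space, temporary files, and backups.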

So, what more can we do? Clearly we are in a tradeoff between the cost to the user and the cost to the investigator. To take the far extreme, we would take a hammer to the drive and melt down the resulting fragments, but this is not feasible without a large budget for disks.

One could booby-trap the computer, such that if a certain action isn't taken at boot time, the disk is harmed in some way. Forensics investigators are mindful of this, however, and take care to examine disks in a manner that does not tamper with the evidence. If we're open to custom drives, we could push the booby trap into the drive hardware, causing it to fail when hooked up to investigative hardware (or, more cunningly, produce a false image of a file system containing merely innocent data).

Another approach is to consider file recovery as a fait accompli and ensure the recovered data is not usable as evidence. Encryption clearly has a role to play here. An encrypting file system built into your operating system can be helpful, but it may provide only a false sense of security unless you have adequate assurance of its cryptanalytic strength (which is likely to be weakened if there is common structure to your data) and of the strength of the underlying operating system. Per-file encryption with a plurality of keys might help, but that raises the questions of key management and key storage.
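To make the per-file-key idea concrete, here is a deliberately toy sketch in which every name is my own invention. It derives a keystream with HMAC-SHA256 in counter mode and XORs it with the data; it illustrates only the "one key per file" structure and is unauthenticated and unvetted, so any real deployment should use an established cipher from an established library.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from key (HMAC-SHA256, counter mode)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XOR with the keystream (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# One fresh random key per file: compromising one key exposes one file,
# not the whole disk -- at the price of managing and storing many keys.
file_keys = {name: os.urandom(32) for name in ("diary.txt", "ledger.txt")}
```

The dictionary in the last line is exactly the key-management problem the column alludes to: those keys must themselves live somewhere an investigator can reach.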

You might consider key escrow, backdoors, and poorly implemented cryptographic software to be below your paranoia threshold. Another useful step is secret sharing (see A. Shamir, "How to Share a Secret," Commun. ACM 22, 11, Nov. 1979, 612–613). Spread your data in fragments around the network such that any k of the n fragments must be brought together to decipher the original file. In a carefully designed system, any k−1 fragments yield no useful insight into the contents of the file; k and n can be tuned to the level of paranoia required, including placing no more than k−1 fragments within the jurisdiction of the investigating agency.
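Shamir's scheme itself is short enough to sketch. Below is a minimal Python illustration over a prime field (the function names and the choice of prime are mine): the secret is the constant term of a random polynomial of degree k−1, each share is a point on that polynomial, and any k shares recover the secret by Lagrange interpolation at x = 0. With fewer than k shares, every candidate secret remains equally likely, which is the information-theoretic guarantee cited above.

```python
import secrets

P = 2**127 - 1  # a Mersenne prime; the field must be larger than the secret

def make_shares(secret: int, k: int, n: int) -> list:
    """Split secret into n shares, any k of which suffice to recover it."""
    # Random polynomial f with f(0) = secret and degree k - 1.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]

    def f(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):  # Horner's rule, mod P
            acc = (acc * x + c) % P
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares: list) -> int:
    """Lagrange interpolation at x = 0 from any k distinct shares."""
    acc = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        acc = (acc + yi * num * pow(den, -1, P)) % P  # modular inverse
    return acc
```

Sharing an actual file would mean encoding its bytes as field elements and splitting each; the parameters k and n map directly to the jurisdictional tuning described above.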

Clearly, there are a number of steps we can take to push the evidence as far as possible beyond the reach of those who might use it to incriminate us. But one question not often raised on this topic is: why should we bother? Given the lack of strong authentication in most computing systems, it may be impossible to establish beyond reasonable doubt that the files in question are even yours.

Furthermore, there are many risks in trusting recovered digital evidence, given the ease with which digital documents can be fraudulently created or modified, accidentally altered, or have their time stamps manipulated. Corroboration by independent sources of evidence is usually required to establish a case, even for non-digital evidence; but when all of the corroborating sources are themselves digital, the risks remain. See, for example, the discussion of potential holes in the evidence in the 1994 Rome Labs intrusion case (P. Sommer, "Intrusion Detection Systems as Evidence," BCS Legal Affairs Committee, Mar. 2000; www.bcs.org.uk/lac/ids.htm).

So, things may not be what they seem. Supposedly deleted information may still be retrievable, whether from backup files, as residues on physical media, or from encrypted forms that, as computing power increases, eventually yield to brute force. Supposedly reliable evidentiary information may have been forged, tampered with, or bypassed altogether. Be careful what you believe.


Author

David Stringer-Calvert ([email protected]) is a senior project manager in the Computer Science Laboratory at SRI International.


©2002 ACM  0002-0782/04/0100  $5.00



 
