#036 Today In Technology History by: Amy Elk

08/31/10 “Kinetoscopes and Super Guppy” TiTH.TechJives.net by: Amy Elk

Securely disposing data on hard drives and other storage media

Date: August 31st, 2010
Author: Chad Perrin

Debates sometimes arise, both within academic circles and outside of them, over the necessity of high-intensity secure deletion techniques. Find out the true state of affairs for secure data disposal.

The state of the art in secure data disposal, as in most technical fields, is always subject to change as researchers do their work. One might imagine that this involves new techniques for more effective data recovery that employ magnetic force microscopes and similarly high-cost tools, countered by new advice on how to defeat such efforts when disposing of hard drives and other storage media.

One impressive data recovery effort involved the remains of hard drives from the Columbia space shuttle disaster, which ultimately led to the recovery of experimental data. Six months after the shuttle came apart on atmospheric reentry, a damaged hard drive was found in a dry lakebed and delivered to data recovery specialists at Kroll Ontrack Inc. Over roughly the next four years, 99% of the data stored on the drive was recovered. The drive was eight years old before the shuttle disaster; it arrived at the recovery specialists looking like a melted-down piece of slag and was damaged further during the recovery process, yet the recovery was a success.

On the other hand, two other drives involved in the shuttle disaster were complete losses.

There is a persistent myth that to securely delete everything from a hard drive, one must overwrite it thirty-five times with random data. This myth arises from a superficial reading and misunderstanding of Peter Gutmann’s 1996 paper, Secure Deletion of Data from Magnetic and Solid-State Memory. The truth of the matter, as presented in his paper, is that the 35 overwrite passes serve only to cover the patterns needed for any of several different drive technologies; any specific storage technology requires only a smaller subset of those passes to ensure secure deletion.

Perhaps more interesting is the fact that, for the most modern hard drive technologies, a single complete overwrite of the drive with zeros should be sufficient. Part of the reason is that data density on a drive is much greater than it used to be. In layman’s terms, “the bits are smaller,” which means that when rewriting, there is less room for old data to be left behind in recoverable form. Older, lower-density drives exhibited a fair amount of redundancy in stored data because the read/write heads were less precise, and small deviations would leave small random areas unaffected by a single overwrite.
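As a rough illustration of what a single zero-fill pass means, here is a minimal Python sketch that overwrites a file in place. The function name `zero_overwrite` is made up for this example; wiping an entire drive requires writing to the raw device (for example with `dd` on Unix-like systems and appropriate privileges), and SSD wear-leveling can leave copies of data in spare cells that a file-level overwrite never touches.

```python
import os

def zero_overwrite(path, chunk_size=1024 * 1024):
    """Overwrite a file's contents with zeros in a single pass.

    Illustrative only: this works on an ordinary file, not a whole
    disk, and does nothing about filesystem journals or SSD block
    remapping.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            n = min(chunk_size, remaining)
            f.write(b"\x00" * n)
            remaining -= n
        f.flush()
        os.fsync(f.fileno())  # push the zeros to the device
```

Note that the file is opened in `r+b` mode so the overwrite happens in place, preserving the file's size rather than truncating and reallocating blocks elsewhere on disk.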

In a recent epilogue to his paper, Gutmann quoted himself responding to a researcher who considered doing some data testing:

Any modern drive will most likely be a hopeless task, what with ultra-high densities and use of perpendicular recording I don’t see how MFM would even get a usable image, and then the use of EPRML will mean that even if you could magically transfer some sort of image into a file, the ability to decode that to recover the original data would be quite challenging. OTOH if you’re going to use the mid-90s technology that I talked about, low-density MFM or (1,7) RLL, you could do it with the right equipment, but why bother? Others have already done it, and even if you reproduced it, you’d just have done something with technology that hasn’t been used for ten years. This is why I’ve never updated my paper (I’ve had a number of requests), there doesn’t seem to be much more to be said about the topic.

Recent papers by other researchers may seem to contradict Gutmann’s results. He does address some of this in his epilogues. Judging by both his epilogues and an independent look at reporting on such papers, it seems that such papers are in some cases misguided, and in others not contradictory of Gutmann’s results so much as relating to a specific technology that falls within the range of Gutmann’s more general overview.

While no single storage technology requires the full technique Gutmann described for dealing with all technologies, few of us have the time or inclination to look up the specific technology of each drive, and the approach it requires, before tackling secure data disposal. If you run a secure data disposal service that must handle many different storage devices regularly, it pays to know and apply the specific techniques for specific technologies, if only because the time and resource costs of secure deletion add up quickly. If you are a more typical user who only needs to get rid of a hard drive every couple of years, the time you would spend keeping track of drive technologies and disposal techniques is probably worth more to you than the time it takes a computer to perform Gutmann’s thirty-five-pass “scorched earth” technique.
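For those who do opt for the many-pass approach anyway, the brute-force version can be sketched as repeated random overwrites. This is a simplified illustration, not Gutmann’s actual scheme, which interleaves specific fixed bit patterns (chosen per drive encoding) with random passes; the helper name `multipass_overwrite` is invented for this example, and as above it operates on a file rather than a raw device.

```python
import os

def multipass_overwrite(path, passes=35, chunk_size=1024 * 1024):
    """Overwrite a file repeatedly with random data.

    A simplified, file-level sketch of the many-pass "scorched earth"
    idea: each pass rewrites the whole file with fresh random bytes
    and forces the data out to the device before the next pass.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(os.urandom(n))  # cryptographic-quality randomness
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # ensure the pass actually hits the disk
```

Each pass multiplies the wall-clock cost, which is exactly the time-versus-effort tradeoff described above: for a multi-terabyte drive, thirty-five full passes can take days, while a single pass suffices for modern drives.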

For more, visit TechRepublic.com