
Securely disposing data on hard drives and other storage media

Date: August 31st, 2010
Author: Chad Perrin

Debates sometimes arise, both within academic circles and outside of them, over the necessity of high-intensity secure deletion techniques. Find out the true state of affairs for secure data disposal.

The state of the art of secure data disposal is, like that in most technical spheres of knowledge, always subject to change as researchers do their work. One might imagine that this involves new techniques for more effective data recovery that employ magnetic force microscopes and similarly high-cost tools, countered by new advice for defeating such efforts when disposing of hard drives and other storage media.

One impressive data recovery effort involved the remains of hard drives from the Columbia space shuttle disaster, and it ultimately led to the recovery of experimental data. Six months after the shuttle came apart on atmospheric reentry, a damaged hard drive was found in a dry lakebed and delivered to data recovery specialists at Kroll Ontrack Inc. Within the next four years or so, 99% of the data stored on the drive was recovered. The drive was already eight years old at the time of the disaster; it arrived at the recovery lab looking like a melted-down piece of slag and was damaged further during the recovery process, yet the recovery was a success.

On the other hand, two other drives involved in the shuttle disaster were complete losses.

There is a persistent myth that securely deleting everything from a hard drive requires overwriting it thirty-five times with random data. This myth arises from a superficial reading and misunderstanding of Peter Gutmann’s 1996 paper, Secure Deletion of Data from Magnetic and Solid-State Memory. The truth of the matter, as presented in his paper, is that the 35-pass overwrite simply bundles together the passes needed to securely delete data from any of several different drive technologies. Any one specific drive technology requires only a small subset of those passes to ensure secure deletion.

Perhaps more interesting is the fact that, for the most modern hard drive technologies, a single complete overwrite of a drive with zeros should be sufficient. Part of the reason for this is the fact that data density on a drive is much greater than it used to be. In layman’s terms, “the bits are smaller”, which means that when rewriting, there is less room for old data to be left behind in a recoverable manner. A fair amount of redundancy of stored data occurred on older, lower density drives because the reading and writing devices were not as precise, and small deviations would leave random small areas unaffected on a single overwrite.

In a recent epilogue to his paper, Gutmann quoted himself responding to a researcher who considered doing some data testing:

Any modern drive will most likely be a hopeless task, what with ultra-high densities and use of perpendicular recording I don’t see how MFM would even get a usable image, and then the use of EPRML will mean that even if you could magically transfer some sort of image into a file, the ability to decode that to recover the original data would be quite challenging. OTOH if you’re going to use the mid-90s technology that I talked about, low-density MFM or (1,7) RLL, you could do it with the right equipment, but why bother? Others have already done it, and even if you reproduced it, you’d just have done something with technology that hasn’t been used for ten years. This is why I’ve never updated my paper (I’ve had a number of requests), there doesn’t seem to be much more to be said about the topic.

Recent papers by other researchers may seem to contradict Gutmann’s results. He does address some of this in his epilogues. Judging by both his epilogues and an independent look at reporting on such papers, it seems that such papers are in some cases misguided, and in others not contradictory of Gutmann’s results so much as relating to a specific technology that falls within the range of Gutmann’s more general overview.

While no single storage technology requires the full battery of passes Gutmann described for dealing with all technologies, few of us have the time or inclination to double-check the specific technologies, and the approaches required for each of them, before tackling the task of secure data disposal. If you run a secure data disposal service, where you expect to deal with many different storage devices regularly, it pays to know and apply the specific techniques for specific technologies, if only because the time and resource costs of secure deletion add up quickly. If you are a more typical user who just needs to get rid of a hard drive every couple of years, the time spent keeping track of drive technologies and data disposal techniques is probably worth more to you than the time it takes a computer to perform Gutmann’s thirty-five-overwrite “scorched earth” technique.

For more, visit

Sticks and stones: Picking on users AND security pros

Nobody likes to get picked on. But is it sometimes necessary to snap people out of their apathetic approach to security?

By Bill Brenner, Senior Editor

August 25, 2010 — CSO

I took my share of name-calling as a kid. I did my share of name-calling, too. We’re taught that nothing good comes of such behavior. I’ve been thinking a lot about that since writing an article two weeks ago called “Security blunders ‘dumber than dog snot’” during the 2010 USENIX Security Symposium.

The story is based on a talk of the same title given by Roger G. Johnston, a member of the Vulnerability Assessment Team at Argonne National Laboratory. In the presentation, he gave surprising (or not so surprising) examples of what he has seen as a vulnerability assessor: security devices, systems and programs with little or no security, or security thought, built in. There are well-designed security products foolishly configured by those who buy them, creating more vulnerability than existed before the devices were installed.

Then there are the badly thought-out security rules and security programs laden with security theater, lacking muscle and teeth. Some policies accomplish nothing except making employees disgruntled because they are treated like fools; in turn, the company risks turning them into malicious insiders.

Also see “Ouch! Security pros’ worst mistakes”

Johnston described three common problems: people forgetting to lock the door, people too stupid to be helped and, worst of all, intelligent people who don’t exploit their abilities for the betterment of security. Enter what he calls the dog snot model of security, where intelligence and common sense exist but are not used.

He came up with the term by watching his dogs, who often crash themselves against the picture window facing the yard when they want to go chase a squirrel. Hence, the windows are covered in dog snot. Executives and lower-level users are often like the dogs in that they bang their heads against the firewall (or their fingers against the keyboard) in an effort to get at a shiny object online. The security pros themselves can get caught up in this too, usually banging up against the glass by trying to prevent bad things from happening by repeating the same failed practices.

Moments after the story went live and appeared on Twitter, I got a message from Adam Shostack, co-author of “The New School of Information Security” and a security specialist at Microsoft.

“Is that attitude helpful? Does anyone respond better when you call them ‘dumber than dog snot?'” he asked.

For the rest, visit CSO Online.

Symantec: A mid-year status check on security predictions

Fast-flux botnets, social networking attacks, mobile malware and more — Symantec looks back at 2010 security predictions and how reality is matching up

By Vincent Weafer, Symantec – August 23, 2010


As predictive analytics emerges as a sought-after business tool, Symantec continues to gather data that it uses to both analyze and predict trends in Internet security. Just as predictive analytics helps businesses make smart decisions, Symantec’s predictions, grounded in that analysis, give businesses and individuals important information on the changing threat landscape. In order to offer the best information possible, Symantec reevaluates its yearly predictions halfway through the year. Here’s a look at each prediction for 2010 and an evaluation of where it stands at the midyear mark.

What We Said: Antivirus Won’t Cut It
The multiplication of both malicious code and polymorphic threats was so great in 2009 that the amount of malicious software actually surpassed the amount of good software. While users should still maintain antivirus protection, they will need something more to be secure. Other approaches, such as reputation-based security, will emerge as key alternatives to the footrace of writing signatures for new malware.

Where it Stands
The increase of malicious code has not let up since making that prediction. While Symantec created 2,895,802 new malicious code signatures in 2009 (71 percent more than 2008), it has already created 1.8 million new malicious code signatures in the first half of 2010. It has also identified 124 million distinct new malicious programs.

The number of sources for new malicious code is huge and keeps growing. The security industry is simply not going to be able to keep up with the speedy spawning of malware. That doesn’t, however, mean cybercriminals have won. Reputation-Based security is catching on as a smart, innovative solution that promises security to those who are interested. Heuristic, behavioral and intrusion prevention technologies are also means of future protection as malware continues to spread.
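Reputation-based security, as contrasted here with signature matching, rates a file on signals such as how widespread it is, how long it has been in circulation, and whether it carries a valid publisher signature, rather than on whether it matches a known-bad pattern. The sketch below is a toy illustration of that idea only; the signals, weights, and threshold are invented for the example and are not Symantec’s:

```python
def reputation_score(prevalence, age_days, signed):
    """Toy reputation heuristic: common, long-circulating, signed files score higher."""
    score = 0
    if prevalence > 10_000:   # seen on many machines worldwide
        score += 2
    if age_days > 90:         # has been in circulation a while
        score += 1
    if signed:                # carries a valid publisher signature
        score += 2
    return score

def verdict(prevalence, age_days, signed, threshold=3):
    """Allow files above the trust threshold; flag the rest for closer inspection."""
    if reputation_score(prevalence, age_days, signed) >= threshold:
        return "allow"
    return "flag"
```

A rare, brand-new, unsigned executable scores zero and gets flagged even though no signature for it exists yet, which is the whole point of the approach.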

What We Said: Rogue Security Software Vendors Step it Up
Sellers of rogue security software have not yet reached their peak. They will become more active and more innovative. They have already begun to sell rebranded copies of free third-party AV software, and will likely begin to use tactics such as rendering computers useless and holding them for ransom until their owners pay.

Where it Stands
While cases of vendors holding computers for ransom have not yet been observed, Symantec has certainly seen more activity and more innovation from rogue security software sellers. One example is the practice of cold calling, in which sellers insist a person’s computer is infected and offer “solutions,” either by having the victim download something or by convincing the user to grant remote access to the computer. In some cases the sellers belong to actual companies that make a business out of such scams, as with one company Symantec investigated called Online PC Doctors.

What We Said: Social Networking Third-Party Applications Will Be Fraud Targets
Social networking sites have been awakened (rudely, in some instances) to the reality that their popularity makes them a target for fraud and other cybercrimes. Symantec predicted that many of them would react well and continue to take steps to secure their sites. Sadly, cybercriminals are not so easily deterred. They will turn to vulnerabilities in third-party applications to weasel their way in and wreak havoc.

Where it Stands
This trend is still developing, but it is developing in the predicted direction. Fortunately, social networking sites have reacted well and decreased the amount of malware breaking through their sites. Unfortunately, malicious efforts are increasing in the world of third-party applications. One app, for example, turned out to be part of an IQ testing scam that covertly signed up users for premium mobile service that cost $10 per month.

Also see ‘The 7 deadly sins of social networking security’ on

Social networking sites may already have begun working against this trend. Facebook recently updated its application authorization system in an effort to reduce the number of scams and misleading applications making their way into the site. Users are now informed when an application seeks to access their information or post on their wall.

For more, visit CSO Online

Google makes Chrome devs dig into pockets

By Gregg Keizer at Computerworld – Fri Aug 20, 2010 6:09pm EDT

Computerworld – Google on Thursday announced that it would require new Chrome extension developers to pay a one-time $5 registration fee as a way to stymie malicious add-ons for its browser.

The company also launched a preview of its Chrome Web Store, giving developers a chance to experiment with the online mart before it goes public later this year. Developers can use the store to give away or sell their browser extensions, themes and Web apps.

“[The signup fee] is intended to create better safeguards against fraudulent extensions in the gallery and limit the activity of malicious developer accounts,” said product manager Gregor Hochmuth in a blog post.

The payment must be made using Google Checkout, which links payments to a credit card, thereby creating a paper trail to the developer — or at least to the billing address and phone number recorded by the credit card company.

By charging the fee, “Google gets some more information about the human on the other end [of the developer account],” said Andrew Storms, director of security at nCircle Security. “It adds some legitimacy to the developer.”

A Chrome rival noted the paper trail aspect of the new registration fee, too. “Someone pointed out the $5 registration fee for Chrome Extension Gallery creates a paper trail, which is a good point,” said Mike Beltzner, Mozilla’s director of Firefox development, in a Twitter message on Thursday.

For more on this story, visit

This Day in Tech: Aug. 19, 1839: Photography Goes Open Source

1839: With a French pension in hand, Louis Daguerre reveals the secrets of making daguerreotypes to a waiting world. The pioneering photographic process is an instant hit.

Using chemical reactions to make images with light was not quite new. Doing it fast was. Inventor Joseph Nicéphore Niepce created a rough image using silver salts and a camera obscura, or “dark box,” in 1816. The image faded away quickly.

Another decade of work led to the first permanent photographic image, when Niepce fixed a shot of his courtyard onto a pewter plate. The exposure took eight hours in bright sunlight. Niepce continued researching in hopes of making the process faster and more practical.

Daguerre was a successful commercial artist hoping to increase the realism of his giant diorama paintings, some of them 70 feet long by 45 feet high. When using a camera obscura to sketch the outlines (or cartoons) for his paintings, he thought it would be better to create images directly with the camera. He began experimenting.

Daguerre’s optician told him about Niepce’s work. Daguerre and Niepce began a correspondence that turned into a partnership in 1829. Niepce died in 1833, and his son Isidore labored on. But it was Daguerre’s advances with silver-plated copper sheets, iodine and mercury that cut exposure time down to minutes and created positive rather than negative images.

Daguerre was unable to sell his process by subscription, but it caught the interest of François Arago, perpetual secretary of the French Academy of Sciences. It was under the auspices of the academy that Daguerre first displayed his daguerreotypes to the public on Jan. 9, 1839. They created a sensation.

For the rest of this story, and more This Day in Tech, visit

The 10 best IT certifications: 2010

  • Date: August 17th, 2010
  • Author: Erik Eckel

The certification landscape changes as rapidly as the technologies you support. Here’s an updated list of certs that currently offer the most value and validity for IT pros.

Just as with many popular arguments — Red Sox v. Yankees, Chelsea v. Manchester United, Ford v. Chevy — IT certifications are popular fodder for debate. Except that certifications, in an IT professional’s microcosm of a world, have a bigger impact on the future. Just which certifications hold the most value today? Here’s my list of the 10 accreditations with the greatest potential for technology support professionals, administrators, and managers seeking employment within consulting firms or small and midsize organizations.

Note: This article is also available as a PDF download.


1: MCITP

This best certification list could be built using 10 Microsoft certifications, many of which would be MCITP accreditations. The world runs on Microsoft. Those professionals earning Microsoft Certified IT Professional (MCITP) certification give employers and clients confidence that they’ve developed the knowledge and skills necessary to plan, deploy, support, maintain, and optimize Windows technologies. Specifically, the Enterprise Desktop Administrator 7 and Server Administrator tracks hold great appeal, as will Enterprise Messaging Administrator 2010, as older Exchange servers are retired in favor of the newer platform.


2: MCTS

With operating systems (Windows 2000, 2003, 2008, etc.) cycling every few years, many IT professionals simply aren’t going to invest the effort to earn MCITP or MCSE accreditation on every version. That’s understandable. But mastering a single exam is more than reasonable, especially when available examinations help IT pros demonstrate expertise with such popular platforms as Windows Server 2008, Windows 7, and Microsoft SQL Server 2008. That’s why the Microsoft Certified Technology Specialist (MCTS) accreditation earns a spot on the list; it gives IT pros the opportunity to demonstrate expertise on a specific technology that an organization may require right here, right now.

3: Network+

There’s simply no denying that IT professionals must know and understand the network principles and concepts that power everything within an organization’s IT infrastructure, whether running Windows, Linux, Apple, or other technologies. Instead of dismissing CompTIA’s Network+ as a baseline accreditation, every IT professional should add it to their resume.

4: A+

Just as with CompTIA’s Network+ certification, the A+ accreditation is another cert that all IT professionals should have on their resume. Proving baseline knowledge and expertise with the hardware components that power today’s computers should be required of all technicians. I’m amazed at the number of smart, seasoned IT pros who aren’t sure how to crack the case of a Sony Vaio or diagnose failed capacitors at a glance. The more industry staff can learn about the fundamental hardware components, the better.


5: CSSA

SonicWALLs power countless SMB VPNs. The company’s network devices also provide firewall and routing services, while extending gateway and perimeter security protections to organizations of all sizes. By gaining Certified SonicWALL Security Administrator (CSSA) certification, engineers can demonstrate their mastery of network security essentials, secure remote access, or secure wireless administration. There’s an immediate need for engineers with the knowledge and expertise required to configure and troubleshoot SonicWALL devices providing security services.

For the rest of the list, visit

Bulletin: Intel to buy McAfee for $7.68 billion

Chip maker says deal intended to beef up its mobile strategy

By Marc Ferranti – August 19, 2010 09:45 AM ET

IDG News Service – Intel said Thursday it plans to acquire security vendor McAfee in a cash deal valued at about $7.68 billion and aimed at enhancing the chip maker’s mobile strategy.

Both boards of directors have approved the deal, and McAfee is expected to become a subsidiary within Intel’s Software and Services Group.

“Hardware-enhanced security will lead to breakthroughs in effectively countering the increasingly sophisticated threats of today and tomorrow,” said Renée James, Intel senior vice president, and general manager of the group.

For more, visit

Critical Adobe Reader hole to be patched Thursday

Elinor Mills CNET News | August 19, 2010 4:52 AM PDT

Adobe will release a patch on Thursday for a critical hole in Reader that was disclosed at the Black Hat conference late last month, the company said on Wednesday.

Adobe had announced on August 5 that the emergency fix was coming this week, in advance of the next quarterly security release, scheduled for October 12.

The security update will resolve an undisclosed number of critical issues in Reader 9.3.3 for Windows, Mac, and Unix; Acrobat 9.3.3 for Windows and Mac; and Reader 8.2.3 and Acrobat 8.2.3 for Windows and Mac, according to Adobe’s advisory.

The flaw, which could be exploited to take control of a computer, is related to the way Adobe’s PDF (portable document format) reader software handles fonts, said Charlie Miller, principal analyst at Independent Security Evaluators, who disclosed the hole at the security conference.

Visit for the rest of this story.

Deep theater defense

We all know perimeter firewalls are necessary but not sufficient. But what’s the right strategy for building additional layers of security? Greg Machler dives in.

By Greg Machler

August 17, 2010 — CSO

As an executive, do you ever worry that a lack of technological integrity leaves your corporate brand unprotected? Corporations today hold sensitive HR data, financial data, and often consumer data. If this data is compromised, the outside world often finds out, lawsuits are initiated and the corporate brand is tarnished. That could lead consumers to think twice about purchasing your products or services.

In the case of retail organizations, how does one effectively protect customer credit card data? Consider deploying an IT architecture that information security professionals call a deep-theater defense. Let’s investigate the design of this protective architecture:

First, put sensitive data in a second-tier of firewall segments behind the main corporate firewalls. This second-tier firewall and corresponding network shields sensitive applications and their data from being easily accessed if the Web-facing firewalls are breached.

For example, many national retailers sell groceries and have a pharmacy. It would be wise to deploy at least five firewall/network segments: one for HR data, one for financial data, one for credit card PCI (Payment Card Industry) data, one for pharmacy (HIPAA) data, and one for services that the other segments share.

The segment containing services that are shared could contain common support services such as network and systems management, encryption and PKI functions, access control services, and security event management functions. Another architectural implementation that protects corporations from internal data theft is the creation of a tunneling access protocol. Often, critical systems are accessed by administrators and outside vendors.

It is important that all access to these applications be logged so that if an internal data breach occurs, the source can be discovered. It is important that the second-tier firewall close its administrative port access so that administration can only be initiated from the segment for common services. One wants to prevent access from administrative tools that exist in front of the second-tier firewalls.
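The logging requirement described above reduces to a simple rule: every administrative touch on a critical system gets a record of who acted, from which host, on which system, and when. A minimal sketch in Python follows; the system and action names are hypothetical, and a real deployment would ship these records to a log server that the administrators themselves cannot modify:

```python
import getpass
import logging
import socket
from datetime import datetime, timezone

# Dedicated audit logger; in production its handler would forward to a
# write-only collector in the shared-services segment.
audit = logging.getLogger("admin-audit")

def log_admin_access(system: str, action: str) -> str:
    """Record who performed which action on a critical system, and from where."""
    entry = "%s user=%s host=%s system=%s action=%s" % (
        datetime.now(timezone.utc).isoformat(),
        getpass.getuser(),
        socket.gethostname(),
        system,
        action,
    )
    audit.info(entry)
    return entry
```

Because every entry carries user, host, and timestamp, tracing the source of an internal breach becomes a log query rather than guesswork.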

Applications need to be ported behind the deep theater second-tier firewalls. Where does one start?

For the rest, visit CSO Online.

Hackers steal customer data by accessing supermarket database

Sunday 15th August, 03:50 AM JST

OSAKA – Hackers stole customer data from eight online supermarkets in Japan, including those of Uny Co. and Neo Beat Co., in July, using a hacking technique called SQL injection to access their databases, sources familiar with the matter said Saturday.
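SQL injection, the technique named here, works by splicing attacker-supplied text into the SQL statement itself, so the input can rewrite the query; the standard defense is to pass user input as bound parameters instead. A minimal demonstration using Python’s built-in sqlite3 module (the table and data are invented for illustration):

```python
import sqlite3

# Toy customer database standing in for a real supermarket backend.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, card TEXT)")
conn.execute("INSERT INTO customers VALUES ('alice', '4111-xxxx')")

def find_customer_unsafe(name):
    # Vulnerable: attacker input becomes part of the SQL text itself.
    return conn.execute(
        "SELECT card FROM customers WHERE name = '%s'" % name
    ).fetchall()

def find_customer_safe(name):
    # Parameterized: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT card FROM customers WHERE name = ?", (name,)
    ).fetchall()
```

With the classic payload `' OR '1'='1`, the unsafe version returns every card number in the table, while the parameterized version returns nothing because no customer has that literal name.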

A source close to Neo Beat, which also operates the websites of these online supermarkets, said it believes that the approximately 30,000 unauthorized accesses to its database server were likely “perpetrated by a group of professional hackers.”

The intrusions, conducted from Japan and China on July 24-26, resulted in the theft of data on a total of 12,191 customers of the Osaka-based company as well as its seven business partners, including supermarket chains Izumiya Co, Maruetsu Inc and Ryukyu Jusco Co.

Neo Beat has since filed a damage report with the Osaka prefectural police, and the companies have closed their online markets since late last month. Police investigators are now looking into the case and gathering relevant information.

Some major credit card companies have confirmed cases in which the credit card data stolen by the hackers in the July incident were used by third parties to buy goods.

An official at a credit card company said there have been more than 100 cases in which such parties either used or attempted to use the stolen card data.

Although credit card companies do not charge customers whose card data were illicitly used, some card companies have recommended that affected customers get new cards and invalidate their old ones.

For more, visit