Has the malware problem gotten out of control? An aggressive test of anti-virus products indicates that it has, at least by some measures. Still, it's a tough call.
Personally, I worry less and less about malware, even though I'm barraged by it day and night. I've got a gateway security device that scans for it with a Kaspersky scanner, my mail server runs Sunbelt Software's Ninja, which uses both Authentium and BitDefender to scan everything coming through, *and* I have desktop anti-virus on almost all my systems. I've got my belt and suspenders, and my pants are nailed to my gut.
And it's a good thing I've got all this protection, because tests by the independent testing group AV-Test paint a dark picture of the detection capabilities of most products.
For many years there has been a standard of sorts for testing anti-virus products called the "WildList." The problem with the WildList is that it's relatively small, it contains only certain types of malware, and everyone knows its contents. As Andreas Marx of AV-Test puts it, "the WildList is not reflecting today's threats, but more or less historical threats only (e.g. which malware was widespread two months ago?)." So it's not surprising that (according to AV-Test) most scanners can detect 100 percent of it. What you should worry about is the huge number of other threats out there.
AV-Test ran a huge test of backdoors (59,053), bots (70,658) and Trojan horses (159,971), for a total of 289,682 malware samples, against 33 products. We have separate numbers for the bots, backdoors and Trojans, but check out the table below for the ranked results of overall detection percentage.
How do the numbers look? If you ask me, not good. Five vendors scored over 99 percent, which has to be considered excellent against so large a test sample. Another six scored over 95 percent, and another seven over 90 percent, which is roughly where the median sits, at 90.42 percent. Half the products did worse than that; 10 were under 75 percent, and four were under 50 percent. That's pretty bad.
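If you want to check the distribution claims against the full table yourself, the summary statistics are trivial to recompute. Here's a minimal sketch; the rates list is hypothetical filler standing in for the 33 published per-product percentages.

```python
# Minimal sketch: recomputing the summary statistics from the ranked
# detection percentages. The list below is hypothetical filler, not
# the actual 33 published results.
from statistics import median

rates = [99.9, 99.5, 99.2, 96.1, 95.3, 91.0, 90.4, 88.2, 74.1, 45.0]  # hypothetical

print(f"median: {median(rates):.2f}%")
print(f"over 90%: {sum(r > 90 for r in rates)}")
print(f"under 75%: {sum(r < 75 for r in rates)}")
print(f"under 50%: {sum(r < 50 for r in rates)}")
```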
Several of the best products, like my own mail security product, use multiple engines. The No. 1 product, WebWasher by Secure Computing, for example, detected 99.97 percent, or all but 87 of the 289,682 samples. It uses the AntiVir engine (the No. 2 product) in combination with an engine Secure Computing developed on its own. Not all products that use multiple engines score better as a result; some may configure those engines less aggressively.
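To make the multi-engine point concrete, here's a minimal sketch, assuming the simplest possible combination strategy: a sample counts as detected if any engine flags it. The 87-miss figure for WebWasher comes from the test results; the per-engine split below is purely hypothetical.

```python
# Minimal sketch: why multi-engine scanners can score higher.
# A sample is "detected" if ANY engine flags it (logical OR), so the
# combined scanner only misses samples that ALL engines miss.
TOTAL_SAMPLES = 289_682

def detection_rate(missed: int, total: int = TOTAL_SAMPLES) -> float:
    """Detection percentage given the number of missed samples."""
    return 100.0 * (total - missed) / total

# WebWasher: all but 87 samples detected (figure from the AV-Test results).
print(f"WebWasher: {detection_rate(87):.2f}%")  # -> 99.97%

# Hypothetical per-engine split: the combined product misses only the
# intersection of the two engines' miss sets.
engine_a_misses = {1, 2, 3, 4, 5}   # sample IDs missed by engine A (hypothetical)
engine_b_misses = {4, 5, 6, 7}      # sample IDs missed by engine B (hypothetical)
combined_misses = engine_a_misses & engine_b_misses
print(f"combined misses: {len(combined_misses)} "
      f"(vs. {len(engine_a_misses)} and {len(engine_b_misses)} individually)")
```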
Analysis
Some of the best products surprised me. I didn't expect Symantec to do so well, but hats off to them. People like to complain about the company, mostly for reasons unrelated to detection percentage, but this test does seem to show that it has taken detection of non-viral malware very seriously. The other high scorers, AntiVir in particular, don't get a lot of attention in the press, and perhaps we have not done them justice.
Then down in the Hall of Shame section we have many products that can't seem to keep up with the flood of malware. I'll start this list with Microsoft, which, at 76.18 percent, is seriously third-rate. I expected better from Sophos, although that's based on reputation, not personal experience. The eTrust engines and ClamAV are just where I expected them. Of course, even if they didn't dispute the tests in some way, the authors of these products might claim that users are highly unlikely to encounter most of the threats in the sample.
What does this test show? Is it more important that it's possible for products, especially those using multiple engines, to detect a very high percentage of attacks? Or that the majority of products let through a very high number of attacks? I have to focus on the latter point. It makes me want to consider, once again, alternative approaches to malware.
Most, if not all, of these products detect many classes of malware generically, by common characteristics. I asked Andreas Marx, and he confirmed that for this test they didn't break out which detections were based on specific signatures and which were generic, a distinction that is difficult for an outsider to draw in any case.
But I have to think that the percentage of such generic detections is increasing over time, especially in products like Symantec's. They can't actually have anywhere near 290,000 individual signatures.
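As a rough illustration of why generic detections scale where per-sample signatures can't, here's a minimal sketch. Both byte patterns are invented for illustration, and real engines use far more sophisticated techniques (emulation, heuristics, behavioral rules) than a wildcard pattern.

```python
# Minimal sketch: exact signature vs. generic (family) detection.
# Both byte patterns below are invented for illustration only.
import re

def exact_signature_match(sample: bytes) -> bool:
    # One signature covers one specific sample (or a very close variant).
    SIGNATURE = b"\x4d\x5a\x90\x00\xde\xad\xbe\xef"
    return SIGNATURE in sample

def generic_family_match(sample: bytes) -> bool:
    # One pattern covers a whole family: fixed bytes with wildcard gaps
    # where variants differ (packer stubs, encrypted payloads and so on).
    FAMILY_PATTERN = re.compile(rb"\x4d\x5a.{2,16}\xde\xad.{0,8}\xbe\xef", re.DOTALL)
    return FAMILY_PATTERN.search(sample) is not None

variant = b"\x4d\x5a\x90\x00\x01\x02\x03\xde\xad\x00\xbe\xef"
print(exact_signature_match(variant))  # False: bytes differ from the one signature
print(generic_family_match(variant))   # True: the family pattern still matches
```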
And then there are many companies trying to come at the problem from the other direction, whitelisting the programs a user should be allowed to run and disallowing everything else. This is an old idea that has proven difficult to manage in the past, and it misses malicious code injected through vulnerabilities such as buffer overflows. I hear from just about all of these "alternative" approach vendors, and I wonder if their time will ever come, but their mission is becoming more important.
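For what it's worth, the core of the whitelisting idea fits in a few lines. This is a minimal sketch with a hypothetical allow list, and it also shows the limitation mentioned above: the check gates which programs start, not what an already-approved process does after being exploited.

```python
# Minimal sketch of hash-based application whitelisting. The digest
# below is hypothetical. Only executables whose SHA-256 digest appears
# on the allow list may run; everything else is blocked. Limitation:
# this gates which programs *start*, so malicious code injected into an
# approved process (e.g. via a buffer overflow) is never checked.
import hashlib
from pathlib import Path

ALLOWED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # hypothetical
}

def may_run(executable: Path) -> bool:
    """Allow execution only if the file's digest is on the allow list."""
    digest = hashlib.sha256(executable.read_bytes()).hexdigest()
    return digest in ALLOWED_HASHES
```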
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983.