As more malware is detected, anti-malware signature databases grow. The size of those databases and how up-to-date they are affect detection rates as well as overall system performance. As the pressure to meet performance demands increases, the need for anti-malware products to evolve becomes more apparent.

To give you an idea of the sheer volume of malicious programs, Andreas Marx of AV-Test.org recently wrote (at the URL below) that his organization received nearly 5.5 million unique malware samples in 2007. That's quite a lot. Marx also wrote that his organization's in-house testing platform (which handles 45 different anti-malware solutions) downloaded a total of 148,869 unique updates in 2007, totaling 1.6TB of data. I did some quick math and found that that works out to an average of 9 updates per day for each of the 45 products. http://www.av-test.org/down/papers/2008-02_vb_comment.pdf
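
If you want to verify that figure yourself, here's a quick back-of-the-envelope sketch in Python; the numbers are simply those cited above:

  # Rough check of the updates-per-day figure cited above
  total_updates = 148_869   # unique updates downloaded across all products in 2007
  products = 45             # anti-malware solutions on AV-Test.org's test platform
  days = 365                # days in 2007

  updates_per_product_per_day = total_updates / products / days
  print(round(updates_per_product_per_day, 1))   # prints 9.1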

So how does increased malware volume affect detection rates? AV-Test.org recently released a report that compares the performance of 30 anti-malware solutions across multiple tests. The tests included on-demand signature-based detection, adware and spyware detection, false-positive rates, retrospective and proactive detection, rootkit detection, response times to new outbreaks, and malware cleanup. The retrospective and proactive tests measure how well a tool that hasn't been updated for one week can detect new malware, relying on whatever built-in heuristic and behavior-based defenses it might have.

The on-demand tests used 1.1 million Trojans, backdoors, bots, worms, and viruses collected in January and February. The adware and spyware tests used 80,000 samples that are no longer active. To determine false positive rates, 100,000 known clean files were run through each scanner. Thirty-five hundred samples were used for the retrospective tests, and 20 active samples were used to conduct proactive tests. In addition, the rootkit detection tests used 12 active rootkits, and cleanup was tested against 20 active malware samples. The cleanup test checked whether a solution could remove the malware and repair any damage, such as changes to the registry or modifications to the system's "hosts" file. To gauge response times to new outbreaks, the tests monitored update turnaround time for 55 outbreaks in 2007 and 3 outbreaks in 2008.
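
To make the rate figures that follow concrete, here's a minimal sketch of how detection and false-positive rates are typically calculated from sample sets like these. The sample counts match the tests described above, but the detection and false-alarm counts are hypothetical, purely for illustration:

  # Illustrative only: the flagged counts below are hypothetical, not AV-Test.org results
  malware_samples = 1_100_000   # on-demand malware test set
  clean_files = 100_000         # known-clean files used for false-positive testing

  detected = 1_092_300          # hypothetical number of malware samples flagged
  false_alarms = 12             # hypothetical number of clean files flagged

  detection_rate = detected / malware_samples * 100
  false_positive_rate = false_alarms / clean_files * 100

  print(f"Detection rate: {detection_rate:.1f}%")            # 99.3%
  print(f"False-positive rate: {false_positive_rate:.3f}%")  # 0.012%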

Avira, Sophos, and Trend Micro all ranked at the top overall, each with a particular strength compared with the competition. For example, Trend Micro's rootkit detection is superior, Sophos's proactive detection is superior, and Avira has the best overall scan speed and the fastest response time for issuing updates after a new outbreak (clocked at less than 2 hours on average).

If you look at the results from a narrower perspective and consider only the detection rates for malware and for adware/spyware, then Webwasher and G Data are the clear winners: each achieved 99.9 percent detection rates in both categories. Right behind them was TrustPort, with a 99.6 percent detection rate for malware and 99.8 percent for adware/spyware, followed by Avira with 99.3 and 99.1 percent, respectively. You can view the full results at Virus Bulletin's site at the URL below: http://www.virusbtn.com/news/2008/03_13a.xml

It seems obvious that anti-malware defense needs to evolve toward better behavior-based detection. Otherwise, we'll all wind up with gigantic signature databases, which of course translates into performance problems, both in raw system resource requirements and in bandwidth use. Stronger gateway products could be another solution because they could offload a huge amount of the burden from desktops; however, those solutions don't address malware that never transits a gateway, as is the case with desktop-to-desktop or desktop-to-server transfers.

If you're interested in how anti-malware technology might evolve, Marx outlined some of his ideas for better behavioral testing in a recent presentation given at the AVAR 2007 Conference in Seoul, South Korea. The presentation is detailed in a paper (in PDF format) available at the AV-Test.org site at the URL below. http://www.av-test.org/down/papers/2007-11_avar_2007_dynamic.zip

If you're curious about other anti-malware performance-related reports, check the data available from VirusTotal (at the first URL below), AV-comparatives.org (second URL below), and Okie Island Trading Company (third URL below). http://www.virustotal.com/estadisticas.html http://www.av-comparatives.org/ http://winnow.oitc.com/malewarestats.php