Let’s start with the good news: antivirus products all work. If what you care about is protecting your system from viruses and similar digital pathogens, just about every major vendor in the AV space does a respectable job. But don’t take my word for it. Check out AV-Comparatives, which currently evaluates 20 of today’s most recognized names in the antivirus world.
The above chart shows AV-C’s results for August 2011. Now compare against the same tests done in August 2010.
While it’s interesting that the three top performers on these charts are consistent, the point is that most players can shift considerably from year to year, and even month to month. Does anyone really think that Microsoft went from 98% accuracy in 2010 to 92.5% and stayed there? As if the company suddenly forgot how to write virus definitions? No. Sometimes AV companies simply have bad days. In the 2011 AV-C tests, Sophos and Webroot (which uses Sophos technology) were the only vendors dropped from the testing because the cloud-based portion of Sophos’ definition set was down.
As multiple vendors agreed in interviews with us, when it comes to detection and isolation of modern viruses, worms, bots, and so on, just about everyone does at least an adequate job.
“Most of us are close,” says Dodi Glenn, product manager at Vipre Antivirus. “The thing is, you could say that you detected some sample set, and your AV is better than mine. But if you fast forward, I could say that you missed X, Y, and Z. There’s a notion of when you gathered the data. And is it a zero-day? Has it never been seen before? Is it a proactive or reactive type of detection? It all depends on the sample set you’re using. In theory, your efficacy rate is going to change any time you update.”
Accuracy may no longer be a primary criterion for product selection, though it should still factor in as a secondary consideration alongside pricing and full AV suite functionality. There is another side to consider with AV products, though, and long-time Tom's Hardware readers know it well. What impact is the software having on your system? Loads of features and stunning detection accuracy may be impressive, but if the background AV product is sucking minutes or hours of your productivity and performance away from foreground tasks, you have a problem. In general, for reasons we’ll discuss soon, we are less concerned about the impact of scheduled scans than we are about the low-level monitoring that today’s AV products perform constantly. Will they slow your gaming? Will they balloon your Web page load times?
We don’t need an exhaustive answer from examining two dozen names. We figured that half a dozen would do for establishing whether AV products in general drag on your system, and whether that drag varies significantly between products.
We felt it was important to represent a couple of the leading free AV products in our selections, and by our totally informal reckoning, AVG Free and avast! are two of the most popular. We certainly had several people in our last AV roundup request avast!. However, since that company is based in the Czech Republic and didn’t answer our press query, while AVG (located only three time zones away) bent over backward to help and participate, it made our choice simpler.
AVG Free (free.avg.com), as with most no-charge AV products, provides you with a solid antivirus core and little else in the way of free amenities. Most notably, the product lacks a firewall, so there’s no inspection of inbound or outbound traffic. Does it matter whether you detect a malicious file before or after it reaches your system? Probably about the same as it matters if you extinguish a burning torch in your driveway instead of your hallway. You just don’t want something dangerous coming inside.
AVG Free still monitors your browsing and social networking. You’ll find a nifty tool called AVG Advice that proactively monitors Chrome, Firefox, and Internet Explorer, alerting you in the event of any “overuse of memory” by a browser. You’ll also be warned if AVG’s LinkScanner determines that a site is untrustworthy before you visit it. For $36, you can step into AVG Anti-Virus 2012 and thereby grab a firewall and priority tech support. For $49.49, AVG’s Internet Security 2012 will safeguard your wireless connections and email, as well as allegedly accelerate system start-up and rich content downloading.
GFI Vipre Antivirus (www.vipreantivirus.com/) comes to this story as our dark horse, a new contestant in our testing. Previously sold by Sunbelt before being acquired by GFI, the standard edition of Vipre covers antivirus, antispyware, email protection, and rootkit detection/removal. Vipre Premium adds in a firewall, ad blocking, Web site blocking, malicious script blocking, and more.
We can debate whether Vipre brings much that’s new to the AV space, but we definitely like having readily available phone tech support here in the U.S. Equally inviting are the product's flexible billing options. There are no auto-billing subscriptions—nice. Instead, you can select whether you want a license for one, two, or three or more systems, as well as a license duration of one, two, or three years or the PC’s lifetime. The Premium product spans from $30 for one year on one PC to $150 for lifetime support on three or more systems.
We aimed in the middle of Kaspersky’s AV stack and trialed the Internet Security 2012 version ($65; usa.kaspersky.com). For another $5, the Pure Total Security edition adds centralized security management for all household systems, parental controls management, a password manager, file wiping, and data backup.
Kaspersky recently overhauled its GUI, making it much simpler to work with. The company now hosts its reputation-based advice system in the cloud, which may add a little latency. But, on the other hand, it’s good to have a resource able to caution you about a given file’s trust level before you open it. The same cloud-based resources can also check application components, such as DLLs, and email for spam screening. Another clever Kaspersky addition here is malware rollback, which eliminates any system changes made by malware to the conditions of a previous session.
McAfee Internet Security ($50; home.mcafee.com/store) comes from a company that was recently acquired by Intel. Why? We asked, but reps only said that we’d find out very soon, wink-wink. The AV app has been around forever, of course, and is a perennial favorite for OEMs and service providers to offer as a customer incentive. In addition to antivirus, antispyware, and firewall protection, McAfee likes to talk up the product’s new anti-bot capabilities. Safeguards against malicious iFrames also bolster the app’s Web-based resume, as does “Deep Page Protection,” which alerts users before they step into a suspect Web site. We also like that McAfee automatically scans any added USB or other removable storage device.
Stepping up to McAfee Total Protection ($60) raises the package’s online storage from 1 GB to 2 GB, which is still lackluster in an age when 2 GB Dropbox and 50 GB ADrive accounts are free. The best draw in the higher-end product is anti-phishing protection. Otherwise, stick with the already well-endowed Internet Security 2012, which still provides functions like network monitoring (to show if you have rogue devices on your network) and Super Mode (a deeper inspection mode that kicks in when malware is detected).
As the closest thing to a free "de facto" AV product on the market, Microsoft’s Security Essentials is sort of a must-have in a round-up of this sort, if only to use as a possible baseline for judging against other products. As you can see in the earlier AV-C charts, Security Essentials isn’t known for being best-of-breed, but it also doesn’t have to be. As a fairly stripped down, antivirus-only tool, it just has to be good enough. Which it is. You don’t see forums filled with people lamenting how viruses killed their systems because Security Essentials is useless. You see lots of complaints from people who didn’t install any AV product.
Security Essentials installs quickly and updates with no hassle. As usual, we accepted all of the program’s default settings, save for disabling the scanning schedule. Security Essentials also prompted us to enable Windows Firewall since we didn’t have a third-party app serving in its place, so we did that as well. You don’t get much ability to configure or customize here, but that shouldn’t bother most mainstream users.
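Incidentally, if you’d rather script that step than click through the prompt, Windows 7 exposes the same toggle on the command line; a minimal example:

# Enable Windows Firewall across all profiles (domain, private, public)
netsh advfirewall set allprofiles state on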

Symantec’s Norton Internet Security ($70; http://us.norton.com/internet-security/) descends from what may be the oldest, most successful, and most criticized AV product in the market. Norton has always strived to be the most feature-rich AV title available, and there was a long stretch of time in which that also meant being the most demanding on system resources. In this sense, the cure was often worse than the disease, and a Norton AV scan could often bring a single-core system to its knees. Of course, this led the market to value features like off-hours scheduling and scanning deferred to idle times, and made low CPU impact a top priority. The company is so phobic about this now that the phrase “Stop online dangers without sacrificing performance” is its top marketing bullet.
As the next image shows, Symantec makes heavy use of reputation analysis (branded as Download Insight 2.0) in its AV. You also get identity protection, antispyware, antispam, firewall, and phishing protection, all wrapped up in a slick UI that puts a fairly simple front end on a ton of options settings. The $70 price is for up to three systems for one year. A two-year license runs $115, and three years notches to $165. If you don’t mind giving up parental controls, malicious Web site blocking, some identity protection features, and Symantec’s password wallet, you could slide back to Norton AntiVirus 2012 ($40 for one year/one PC, $70 for three years/three PCs).
We tested on an open bench system built from the following components:
- Intel DX50SO2 motherboard (BIOS version 0876)
- Intel Core i7-2600K processor (Hyper-Threading enabled, 3.4 GHz)
- 8 GB (2 x 4 GB) OCZ Gold Edition 1333 MT/s DDR3, 9-9-9-20
- Sparkle Calibre X560 DF
- Western Digital Scorpio Blue 750 GB (primary drive)
- Patriot Wildfire 240 GB (secondary drive for image cloning)
- PC Power & Cooling Turbo-Cool 860
On this hardware platform, we installed Windows 7 Professional 64-bit and applied all critical update patches effective October 2, 2011. This also included:
- Nvidia GeForce driver version 280.26
- Intel / Realtek ALC audio driver 6449
- Intel / Renesas USB 3.0 driver 2.1.25.0
- Intel Pro Network Connections LAN driver 16.5
- Intel / Marvell eSATA driver 1.2.0.7700
- Intel RST driver 10.6.0.1002
- Intel chipset device software 9.2.0.1030
Rather than run a totally clean configuration, we additionally installed the following applications to better mimic a real-world system:
- Google Chrome 14.0.835.163
- Firefox 7.0.1
- OpenOffice 3.3.0
- Google Picasa 3.8
- Apple iTunes 10.4.1
- Apple QuickTime 7.7
- Adobe Flash Player
- Adobe Reader X 10.1.1
- 7-Zip 9.20 for 64-bit Windows x64
- CPU-Z 1.58
- Microsoft .NET 4
- Microsoft SDK 7.1
- PCMark 7
- HTTPWatch
With all of this software installed, we then disabled UAC, the Windows Update service, the screen saver, Windows System Restore, all Task Scheduler events, Action Center notifications, automatic Windows Error Reporting, taskbar notifications, all hibernate/standby power settings, and Windows Media Player services. Firefox, Chrome, and IE were all configured to skip their initial wizards and crash prompts, and each browser’s home page was set to about:blank.
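Most of these tweaks can be scripted for repeatability. Here’s a partial sketch covering two of the steps in PowerShell, run from an elevated prompt (the remaining changes we made through the Windows UI):

# Stop and disable the Windows Update service so background checks can't skew timings
Stop-Service -Name wuauserv -Force
Set-Service -Name wuauserv -StartupType Disabled

# Turn off System Restore on the system drive
Disable-ComputerRestore -Drive "C:\"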
Finally, we copied 10.0 GB of photos, video, and document files into a sample data folder. This comprised our “clean” image. The image used only a single account with no password to ensure automatic login after rebooting, and no unknown devices were present in Device Manager. All automatic updating tools were disabled, and we rebooted Windows before using Symantec Ghost 15.0 to back up the image to a secondary drive. Only after this image was created and finalized did we run our first tests and subsequently install our AV contenders. After testing was completed on one AV product, we would reformat the Scorpio drive, then copy the clean image back to the newly formatted drive for loading of the next AV product.
We used a conventional 5400 RPM, 2.5” hard drive (rather than a more enthusiast-oriented SSD) specifically to stretch out our test times and magnify any differences the AV products might exert on storage operations. In the same vein, this is also why we needed a higher caliber of timing tools. A simple stopwatch is too imprecise for several of these tests. Instead, we turned to Microsoft’s Windows Performance Analysis toolkit. The need for this should be clear from the following Microsoft data (found at http://bit.ly/oOg71J):
| System Configuration | Manual Testing Variance | Automated Testing Variance |
|---|---|---|
| High-end desktop | 453,000 ms | 13,977 ms |
| Mid-range laptop | 710,246 ms | 20,401 ms |
| Low-end netbook | 415,250 ms | 242,303 ms |
We combined methodology suggestions from AVG, GFI, McAfee, and Symantec to arrive at our final test set as described below.
1. Install time. Using Windows PowerShell running with admin rights, we measured the installation time of LibreOffice 3.4.3 with this command:
# Time a silent (/qn) LibreOffice install and report the elapsed seconds
$libreoffice_time = Measure-Command {
    Start-Process "msiexec.exe" -Wait -NoNewWindow -ArgumentList "/i .\libreoffice34.msi /qn"
}
$libreoffice_time.TotalSeconds
2. Boot time. We used Windows Performance Analyzer’s xbootmgr and xperf tools to measure time elapsed across five looping boot cycles. Our score shows the mean of the five cycle times (see the averaging sketch after this list). Our command was:
xbootmgr -trace boot -prepSystem -numRuns 5
3. Standby time. We used the same xbootmgr and xperf tools to measure time elapsed across five looping standby cycles. Again, our score shows the mean of the five cycle times. Our command was:
xbootmgr -trace standby -numRuns 5
4. Synthetic performance. For our only conventional benchmark in this group, we used PCMark 7 to illustrate performance across a range of conventional computing tasks.
5. Page loads. We selected three element-dense pages (detailed in the results) and used HTTPWatch to measure their load times in Internet Explorer 9.
6. Scan time. This time, a simple stopwatch would do, although most AV vendors display their scanning run times within the application. Given the time scale involved, we felt confident simply using these rougher tools. Because many vendors cache scanned files, we’ve broken out data for the first full scan and a mean value of three subsequent scans. The test system was rebooted between each scan.
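For the boot and standby tests, xbootmgr records each cycle as a separate trace, so we averaged the per-cycle times ourselves. A minimal sketch of that step (the values below are placeholders, not measured data):

# Hypothetical per-cycle times in seconds, read from the five xbootmgr traces;
# the same averaging applies to both the boot and standby runs
$cycleTimes = 168.2, 166.9, 165.4, 164.8, 165.1
$mean = ($cycleTimes | Measure-Object -Average).Average
"Mean cycle time: {0:N3} s" -f $mean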
On McAfee’s advice, we opted to test application installation times with the extracted .msi installer for LibreOffice 3.4.3, run under Windows PowerShell with Administrator rights. PowerShell’s UI looks much like a command prompt, and our command reports back the exact time spent on file installation. Note that the /qn argument suppresses all of the usual installation pop-ups through which you’d otherwise click Next.

The standout surprise here is GFI Vipre Antivirus, which actually finished faster than our clean configuration. We expected all application installs to be slower with AV running, since AV products run in the background, monitor file unpacking, and add resource overhead. Honestly, we’re stumped as to how GFI achieves this, though the company does stake much of its product marketing on speed. We suspect that additional image re-installations and test averaging might have yielded numbers closer to the clean image’s, and that what we’re seeing is a statistical outlier. Still, even at parity with our clean time, GFI blows away the rest of the field on app installation performance.
Perhaps not surprisingly, AVG and Microsoft deliver our next best times. Compared to Kaspersky, McAfee, and Symantec, our two free AV products are considerably simpler and more pared down in their feature breadth. Perhaps an abundance of analysis is leading to installation paralysis? Certainly, installing an office suite to a 5400 RPM hard drive in under three minutes is nothing to sneeze at, but the other way to look at this is that Symantec added 60% to our application install time. If you tend to do a lot of program loading, this could be a concern.
This was our first experience testing with Windows Performance Analysis tools. No doubt, this is old news to software developers, but for us, it was a great opportunity to dig into the individual software traces that happen during regular operations. For those who think that Microsoft gives short shrift to things like boot-up time, know that the company has devoted ample effort to making sure that OEMs have such tools and know-how to use them. A system builder has the power to see exactly how this or that bit of pre-installed code impacts other traces and where stalls are happening at the millisecond scale. Whether builders make effective use of these tools is a different story.

We ran this test three times, but there was no getting around the data before us: our clean config was the slowest for boot-up. Every AV product seemed to somehow accelerate the booting process. Perhaps this is somehow done with caching? We don’t know, but the average improvement over clean running was about 15 seconds.
GFI alone pulls ahead of the pack, shaving nearly 30 seconds off of our AV-free boot times. The other five contenders are statistically in a dead heat.
Note that we discovered an “aggressive boot time” setting in Symantec’s options, which is disabled by default. This default setting is what we reported above, but then we ran a second set of data with the option changed from “off” to “aggressive.” The resulting average time was 165.131 seconds—virtually no difference from the disabled score. If that’s the case, what is this feature doing, why is it disabled, and what does “aggressive” mean here?
The funny thing about standby time as a performance metric is that it’s commonly cited, yet infrequently relevant. Think about it. How often do you actually command your system to enter standby mode and then sit around waiting for it to happen? You don’t. Much more often, your system will slip into standby or hibernation as a result of power saving during prolonged idle time—when no one is present and waiting for the process to complete.
Still, standby is generally held up as another indicator of system impact, so we ran the tests.

Here we see a bit more differentiation than in our boot time results. Again, GFI turns in the best score of the group, still somehow beating our clean configuration. Now, though, McAfee and Microsoft also slip in ahead of our AV-free system. Symantec registers only a slight impact, and Kaspersky and AVG come in close together at the back of the pack.
Despite there being a 28% variance from first to last place in this group, we’re not sure how much emphasis to place on this as a statement about overall performance. A positive or negative range of only nine seconds on a relatively slow system disk is fairly minor no matter how you look at it.
Here we come to the meat of our testing. When you have real applications running and you’re manipulating real media content, how much of a burden is your AV software? There are two ways to look at the answer, and we’ll show both. Here are the overall PCMark 7 score results with the horizontal axis set to Auto:

This view magnifies differences in the results. The obvious conclusion is that Kaspersky seized the day and Microsoft...didn’t. Now let’s set the horizontal axis to 0 and see how things look:

It’s an even field, right? Suddenly, you no longer care about performance—they’re all practically the same—and you’re back to thinking about pricing and features. In fact, the real take-away here might be that there seems to be little to no impact from AV products on application performance. Kaspersky and McAfee even slip in just under our clean config score, again defying intuition and making the case for normal statistical variance in this test load.
So much for our lesson in statistics manipulation. You know that PCMark 7 comprises several tests examining different usage scenarios, so let’s break this down.

A few AV products claim to improve rich media content performance, but if this test, which focuses on video playback frame rates, is any indication, any such enhancement fails to materialize in our test group. We found no statistically significant variance.

With video downscaling, we see a bit of wiggle. In an odd twist, every competitor except Microsoft narrowly outperforms our clean configuration. AVG squeaks in with the win, but it’s practically a photo finish for all players. Again, we see little influence from antivirus background scanning in either direction.

We expected to see some of the largest variance within the PCMark suite here, as gaming storage throughput might be the most sensitive to impacts from background file scanning. Not so. Only GFI and Microsoft show any real influence on scores, and even then the difference is minimal. Once more, we see AV not hindering system performance.

Perhaps this is our most damning data point: a roughly 12% hit to DirectX 9 performance across the board. No AV product does better than another at preventing this. It seems to be an inherent liability baked into the background scanning process.

Still image manipulation performance? Zero impact from AV presence. Nothing to see here...move along.

Importing photos is another place where we expected to separate the wheat from the chaff, since it keeps AV scanning constantly busy watching incoming files for bad apples. Yet again, there’s not much variability. GFI and Microsoft tie for the best storage performance. And strangely, our clean configuration shows the worst performance of the bunch, lagging the leaders by about 10 percent. So AV is good for storage and bad for graphics? Weird.

Finally, some data that matches our presuppositions. The clean configuration shows the highest Web page browsing speed, which makes sense, as the results are totally unscanned. GFI seems to spend the most cycles on page element analysis, bringing up the rear of our fairly tight-knit pack. Microsoft comes the closest to native speed, although, given AV-C’s detection results, we have to wonder if this is because Security Essentials isn’t doing as thorough a job.

The data decryption side of Web browsing tends to lean more heavily on CPU performance, so we would expect AV apps with higher CPU utilization to turn in lower scores here. It’s possible that such an effect is present, but with the group’s laggard (Symantec) turning in a score only 3% below native speed, we’d call this a non-issue. The preliminary conclusion would be that AV is not affecting CPU performance.

Finally, we have Windows Defender performance. Defender guards against spyware; it’s built into Windows 7 (and a free download for older versions), and fairly necessary if you’re only going to run a basic, free AV app. This time, Kaspersky falls behind while all other AV products show a fair performance gain over our clean baseline, led by AVG and GFI.
We have PCMark for our synthetic tests, so we wanted to use three real, heavily trafficked, element-stuffed Web pages for our assessment of AV impact on real-time browsing. As you might expect, pages can vary considerably in how they integrate third-party elements, such as hosted video and banner ads. We suggest considering all three result sets together.
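HTTPWatch times the full in-browser load, including every page element. For readers without it, a rough approximation can be had from .NET’s WebClient, though it fetches only the base HTML, so expect numbers well below a full render (the URL here is illustrative, not necessarily one of our test pages):

# Rough page-load approximation: times only the base HTML download,
# not images, scripts, or ads, so it understates real load times
$client  = New-Object System.Net.WebClient
$elapsed = Measure-Command { $client.DownloadString("http://online.wsj.com/") | Out-Null }
"{0:N0} ms" -f $elapsed.TotalMilliseconds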



We can only make a few general statements in looking over these charts. First, we see that the clean configuration is almost always the fastest. This is to be expected. Only once, on the Disney home page, does a single product slip in with a slightly faster score, and we attribute this to temporary Internet congestion slightly skewing our clean config numbers.
Of the three charts, we prefer the Wall Street Journal results as the pattern that best fits our expectations. It may be that Microsoft puts the least effort into advance page scanning and so reports back the fastest results of the AV contestants. We can debate whether the other products are simply slower and more cumbersome or if they do a more thorough job of applying reputation analysis and other factors to provide advance warning to users if needed. Either way, we like seeing that the AV products are doing something.
Is a 10-second slow-down worth worrying about? Yes. Believe it or not, a 10-second delay experienced 100 times in the course of a day adds up to 1,000 seconds, more than 16 extra minutes of sitting around waiting for pages to load. That’s an entire break time broken up so minutely that you have no chance to actually enjoy the break. On the other hand, consider the potential time lost and heartbreak gained by picking up malware from an unscreened site. We’ll take the 16-minute loss and hope AV vendors can improve their scanning algorithms in the future.
Antivirus scanning is an odd beast. Everybody wants to know the numbers, but the means by which results are generated vary so much that comparisons can be meaningless. Moreover, we come back to real-world applicability. If you’re running a deep scan in the middle of the night, do you care if it takes 10 minutes or two hours? If, for some strange reason, you run a scan while working on other tasks, do you care so long as there’s no noticeable impact on system resources? Honestly, we don’t. But for those who like numbers as a way to grade product options, here goes.

The first full system scan will almost always be the slowest. This is because most AV products perform some caching. The rationale is that, provided no malware is detected in the system, why reinvent the wheel? Why burn a lot of time and cycles performing deep inspection on files that are already known to be safe? This is akin to doing a differential backup rather than a full system backup.
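To make the caching idea concrete, here’s a minimal PowerShell sketch of how a scanner might skip known-clean files by hashing them. This is purely illustrative, not any vendor’s actual implementation; real products also track file modification state and definition versions, and the folder path is hypothetical.

# Illustrative scan cache: hash each file, deep-inspect only unknown hashes,
# and persist the known-clean set between runs
$cache = @{}
if (Test-Path .\cleancache.csv) {
    Import-Csv .\cleancache.csv | ForEach-Object { $cache[$_.Hash] = $true }
}
$sha = [System.Security.Cryptography.SHA256]::Create()
Get-ChildItem C:\SampleData -Recurse | Where-Object { -not $_.PSIsContainer } | ForEach-Object {
    $bytes = [System.IO.File]::ReadAllBytes($_.FullName)
    $hash  = [System.BitConverter]::ToString($sha.ComputeHash($bytes))
    if (-not $cache.ContainsKey($hash)) {
        # A real scanner would deep-inspect the file here; we simply mark it clean
        $cache[$hash] = $true
    }
}
$cache.Keys | ForEach-Object { New-Object PSObject -Property @{ Hash = $_ } } |
    Export-Csv .\cleancache.csv -NoTypeInformation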
AVG and McAfee are the big surprises here. In fact, there almost seem to be two approaches to deep scanning. Is it a coincidence that AVG and Kaspersky are in the back half of AV-C’s missed samples results? Maybe, but so is Microsoft, which is our second-slowest scanner behind GFI.

Subsequent deep scans seem generally to accrue cached data until a steady performance level is reached. For example, our three subsequent scan times for Symantec were 29:50, 6:01, and 6:15. Microsoft progressed from 1790 seconds to 361 and 375, or just over six minutes. So, our mean times reported here are actually skewed to the high side in cases where caching progresses to an eventual baseline. The notable exception is GFI, which clearly does not cache and exhibited no drop-off from the first scan to subsequent scans.
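The skew is easy to see with Microsoft’s numbers; the first, still-caching pass pulls the mean well above the eventual steady state:

# Microsoft's three follow-up scans, in seconds
$times = 1790, 361, 375
($times | Measure-Object -Average).Average   # 842 s, versus a 361-375 s steady state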
We’ve hit the highlights in terms of time and system performance considerations here, although there are a few loose ends. For instance, AV product installation could be an issue. Most products install fairly quickly and smoothly, but some may be less intuitive than others in guiding you through an initial definitions update.
For instance, initial setup with GFI Vipre was slightly unusual. The program installed normally and requested a reboot, which we granted. Upon resuming, Vipre did its usual thing and had us download a definition update, which took about three minutes and requested another reboot. However, on our next return to the application, the main UI still showed that Active Protection was not enabled, even though the options said otherwise, and the definition update still appeared to be pending. So we polled for new definitions again, and this time the download took over 20 minutes on our 15 Mb/s FiOS downlink. Once obtained, though, the new definitions installed quickly, and both the Updates and Active Protection icons changed to green check marks. Good to go.
Another potential time sink comes after the scan when you assess results. Symantec excels at finding tracking cookies, regardless of whether you consider these important (the program’s options will let you filter out certain threat types, if you’re really annoyed). Our initial scan found 22 tracking cookies while, for example, McAfee found four. Do we care? Not really, and we’d rather not have to consider if the “threat” is worth worrying about. We use AV products in part to make such decisions for us.
As for impact on CPU utilization, all products seem to be fairly even, falling in the 4% to 11% range. We found that McAfee edged a bit higher, trending in the 10% to 15% range, which could explain some of the company’s lower scanning times, although we’re still baffled by that first deep scan score. One likely reason why AVG performs so quickly in scanning is that it will ratchet into high priority mode when your system goes idle. When you return, the app reverts to low priority mode.
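The priority ratcheting AVG uses maps onto a standard Windows mechanism: a process’s priority class. A hedged sketch of the idea (the process name is hypothetical, and a real product would toggle on idle/resume events):

# Sketch of idle-priority ratcheting; "avgscan" is a hypothetical process name
$scan = Get-Process -Name "avgscan" -ErrorAction SilentlyContinue
if ($scan) {
    # While the user is active, keep the scanner out of the way...
    $scan.PriorityClass = [System.Diagnostics.ProcessPriorityClass]::BelowNormal
    # ...and when the system goes idle, let it run flat out:
    # $scan.PriorityClass = [System.Diagnostics.ProcessPriorityClass]::High
}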

In retrospect, we have to wonder if it was a mistake to run these tests on a modern quad-core processor (Core i7-2600K). In reality, the CPU is so fast that it’s hard to register a significant load from even the most demanding AV apps. In our opinion, that’s the big message of this article. We’ve seen some AV companies trumpet CPU utilization scores showing how much less impact their product has than others. Apparently, this is somewhat like saying you can boil water at 230 degrees Fahrenheit instead of 260 degrees. As long as the water is at 212 degrees or higher, no one really cares. What might have once been a serious concern in the single-core days, or even still on low-end, low-power processors, is no longer a worry on modern platforms. The load from any of these AV products is negligible with such horsepower under the hood.
On that basis, we conclude that, in most cases, performance is at most a secondary concern when selecting an antivirus product; it shouldn’t be one of your first criteria.