
Zotac GeForce GTX 480 AMP! Edition

Three Factory-Overclocked, High-End Graphics Cards

We've been impressed by Zotac's factory-overclocked AMP! edition cards in the past. This time around, the company is working with Zalman to provide a card with a more effective aftermarket cooler. Certainly, one of the main drawbacks of Nvidia's GeForce GTX 480 is how hot the card gets, and how noisy the stock cooler can be. So, if a new cooler can fix these issues, this product has the potential to be a particularly attractive option. At $510 on Newegg, this board is about $60 more than the lowest-priced reference GeForce GTX 480s.

The $60 price premium gets you a Zalman VF3000 graphics card cooler, a large and effective unit with dual axial fans and five heat pipes designed to pull heat away from the hot GF100 GPU. When we first priced this card, the delta between it and reference models was only $20. Since the Zalman VF3000F cooler will cost in the neighborhood of $50 when it's released (it hasn't made it to retail at the time of writing), that seemed like a very reasonable deal. Now, it's a little less impressive in the face of cratering prices on reference GeForce GTX 480s.

As with the rest of our factory-overclocked models, good case airflow is a must, as the hot air is not forced out of the back of the case. Instead, most of it will find its way back into the enclosure. Note that the Zotac GeForce GTX 480 AMP! edition card is the only model in our roundup that monopolizes three expansion card slots, due to the massive Zalman cooler.

The card's bundle includes some standard items, such as a driver CD, a manual, a DVI-to-VGA dongle, and a Molex-to-PCIe power adapter. But there are a couple of adapters that I'm not used to seeing: a mini-HDMI-to-HDMI adapter and a dual-six-pin-to-eight-pin PCIe power adapter. The card doesn't come with any full versions of value-added software, but there are 30-day trials of CUDA-accelerated applications like the Badaboom video encoder. Zotac backs this AMP! edition card with a five-year warranty (and a limited lifetime warranty within the US), which is fantastic compared to the competition.

The PCB is 100% reference, complete with unused holes for the cross-flow fan with which the standard model comes equipped. Of course, this doesn't detract from the product. Just like the reference card, the GeForce GTX 480 comes with 1536 MB of GDDR5 memory. The outputs mirror the reference card, with two dual-link DVI options and a single mini-HDMI port. Because the GF100 includes two independent display pipelines, you can only use two of this card's three outputs at any given time.

This factory-overclocked AMP! edition card has a core speed of 756 MHz (56 MHz above reference), a shader speed of 1512 MHz (111 MHz over reference), and a memory speed of 950 MHz (26 MHz/104 MT/s effective over reference). As far as we know, the fastest factory-overclocked GeForce GTX 480 is the EVGA GeForce GTX 480 SuperClocked+ model, with a mere 4 MHz more on the core (760 MHz). Yet, the EVGA card has a 61 MHz lower shader speed, so we think it's reasonable to say that the Zotac AMP! card has the highest factory overclock you can get on a GeForce GTX 480. At idle, the card's clocks drop to a miserly 50.5 MHz core/101 MHz shader/67.5 MHz memory to keep things as efficient as possible.
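To put those numbers in context, here is a small back-of-the-envelope script (a Python sketch of our own, not part of Zotac's or Nvidia's materials) that works out the percentage gains and the effective GDDR5 data rate from the figures above. The reference clocks are simply derived from the quoted deltas, and the 4x multiplier reflects GDDR5's four transfers per memory clock.

```python
# Sketch: compare the Zotac AMP! clocks against the reference GeForce GTX 480
# clocks implied by the deltas quoted above
# (756 - 56 = 700 MHz core, 1512 - 111 = 1401 MHz shader, 950 - 26 = 924 MHz memory).

REFERENCE = {"core": 700, "shader": 1401, "memory": 924}   # MHz
ZOTAC_AMP = {"core": 756, "shader": 1512, "memory": 950}   # MHz

GDDR5_MULTIPLIER = 4  # GDDR5 performs four transfers per memory clock

for domain in REFERENCE:
    delta = ZOTAC_AMP[domain] - REFERENCE[domain]
    percent = 100.0 * delta / REFERENCE[domain]
    print(f"{domain:>6}: +{delta} MHz ({percent:.1f}% over reference)")

# Effective memory data rate: 950 MHz x 4 = 3800 MT/s vs. 924 MHz x 4 = 3696 MT/s,
# a 104 MT/s bump, matching the figure quoted in the text.
print("memory data rate:",
      ZOTAC_AMP["memory"] * GDDR5_MULTIPLIER, "MT/s vs.",
      REFERENCE["memory"] * GDDR5_MULTIPLIER, "MT/s reference")
```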

Overclocking

As a testament to the effectiveness of the cooling system, we were able to overclock the Zotac card's core to 825 MHz, its shaders to 1650 MHz, and its memory to 1050 MHz. We achieved this with MSI's Afterburner overclocking utility, which, fortunately, allowed us to adjust both clock rates and voltages. We increased the voltage from 1.05 to 1.138 V, and we raised the fan speed to 100% to keep temperatures down.
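For perspective on how much headroom that represents, the following sketch (again just illustrative Python, using the clocks reported above) computes the manual-overclock gains relative to both the factory AMP! clocks and the derived reference clocks, along with the size of the voltage bump.

```python
# Manual overclock reached with MSI Afterburner, compared against the factory
# AMP! clocks and the derived reference GTX 480 clocks (all values in MHz).
FACTORY   = {"core": 756, "shader": 1512, "memory": 950}
REFERENCE = {"core": 700, "shader": 1401, "memory": 924}
MANUAL_OC = {"core": 825, "shader": 1650, "memory": 1050}

def gain(new, old):
    """Percentage increase of new over old."""
    return 100.0 * (new - old) / old

for domain, clock in MANUAL_OC.items():
    print(f"{domain:>6}: {clock} MHz "
          f"(+{gain(clock, FACTORY[domain]):.1f}% vs. factory, "
          f"+{gain(clock, REFERENCE[domain]):.1f}% vs. reference)")

# Voltage increase applied in Afterburner to keep the overclock stable.
print(f"voltage: 1.05 V -> 1.138 V (+{gain(1.138, 1.05):.1f}%)")
```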

Comments
  • 20 Hide
    knutjb , August 12, 2010 6:20 AM
    Good to see sensible conclusions, bang for the buck.

    Amazing how well the ATI cards are doing given their time on the market.
  • 13 Hide
    Jax69 , August 12, 2010 7:39 AM
    I am amazed by ATI cards; after one year on the market they are still strong as hell. Very good, AMD.
  • 13 Hide
    jonsy2k , August 12, 2010 7:48 AM
    I'm not liking the trend of these cards consuming more and more PCI slots, to be honest.
  • -5 Hide
    carlhenry , August 12, 2010 8:55 AM
    GTX 480 is looking very good and sexy
  • 23 Hide
    ohim , August 12, 2010 9:35 AM
    Did your lights flickered when you powered up that GTX480 ? :) 
  • -3 Hide
    Anonymous , August 12, 2010 11:00 AM
    ^^^^^
    hahahahahhaah.
    liked the flickered thing.
    LOL
  • 13 Hide
    h83 , August 12, 2010 11:12 AM

    So, the conclusion is that the only good point about these factory-overclocked cards is their coolers...
  • -6 Hide
    Tamz_msc , August 12, 2010 11:58 AM
    Quote:
    "Aliens vs. Predator favors the Radeons, just like Crysis favors the GeForce cards. However, the playing field remains very close."

    The graphs tell otherwise.
  • -5 Hide
    The Lady Slayer , August 12, 2010 12:02 PM
    It's a shame the Big Green has paid off so many game developers that we'll never see a 'true' comparison between ATI & nVidia
  • -5 Hide
    LaloFG , August 12, 2010 12:06 PM
    On my previous card (an HIS 4870 with a stock Zalman heatsink and an HIS fan), the fan failed. Not much of a problem; I replaced it with a better one, and the card was better off, because while the Zalman heatsink is good, the original fan is crap (thin and very sensitive).

    On the HIS card in the article, the heatsink is not superior to the reference one, and if the fans are the same type as before, well...

    I like the Gigabyte one; those heat pipes shine and offer superior cooling (ignoring the smaller performance gain from its overclock).

    The GTX 480... well, I don't like PhysX.
  • -3 Hide
    juliom , August 12, 2010 12:25 PM
    Far Cry 2 is so Nvidia-biased; why the hell do you use it to compare the cards? If that test disappeared, the huge advantage in the conclusion graphic would get much smaller.
  • -8 Hide
    tony singh , August 12, 2010 12:28 PM
    Why are Far Cry 2, Crysis, and DiRT 2 chosen again and again, knowing that Nvidia performs better in these? To show the 480 as much better than the 5870? I want an answer.
  • 0 Hide
    vaughn2k , August 12, 2010 12:43 PM
    Quote (ohim): Did your lights flickered when you powered up that GTX480 ?

    Did not notice, because they were busy with the benchmark...

    ... but their neighbor did...
  • 3 Hide
    kikireeki , August 12, 2010 12:51 PM
    Quote (from the article): At 100% fan speed, the card is noisier than a stock Radeon HD 5870, but it manages to keep the GPU cool, with temperatures under 70 degrees Fahrenheit at full load.

    Wow, this fan must be blowing liquid nitrogen!
  • 1 Hide
    chunkymonster , August 12, 2010 1:08 PM
    Not as big of a gain with the factory-overclocked 5870 cards over the 5870 reference design; certainly not enough to justify the additional cost, IMO.

    No surprise that the 480 performs better than the 5870 overall; this is something ATI stated would happen when they announced they would use dual-GPU cards to compete with the high-end nVidia single-GPU cards.

    Again with the Crysis and Far Cry benches...sheesh!
  • 0 Hide
    halls , August 12, 2010 1:20 PM
    Just a heads up: on the second page, you mention that the fan keeps the card at under 70 degrees Fahrenheit on full load. I don't believe you!
  • 2 Hide
    rrobstur , August 12, 2010 1:59 PM
    These cards are both bad-ass. You can't expect a year-old card to compete with that GTX 480. Thumbs up, Nvidia, but watch out: ATI's next gen is inbound.
  • 1 Hide
    ares1214 , August 12, 2010 2:17 PM
    Next-gen ATI isn't changing much, so I doubt performance will be massively increased. Maybe 15-30%, but nothing like the jump from the 4xxx series to the 5xxx series. However, their architecture change comes with 7xxx, and my money is that, by the time Nvidia finishes the 4xx series, 6xxx will be out. By the time Nvidia releases 5xx (depending on whether they go fast or go good), 7xxx will only be around the bend. 7xxx, however, is scheduled for around Q3-Q4 2011, so they have some time.
  • 13 Hide
    coolvoodoo , August 12, 2010 2:17 PM
    I like the ATI vs Nvidia wars because that is what drives progress, but it seems that some people want any benchmarks where the Nvidia card is faster to be removed. What would that prove? It is funny that the commenters above seem to think that when ATI performs better, it is because of the card, and when Nvidia performs better it is because they "paid off Tom's". That just seems like fanboys spouting off. I admit that I still like to see how any graphics card performs in Crysis because:
    1. It still taxes even the mightiest of cards.
    2. Every comparison for the past few years has used it, thus giving me an idea how my current card performs against the new cards.
    3. Someday I want to own a graphics card that can beat it down.
    4. I actually still play Crysis (and Far Cry 2).