Nvidia GeForce GTX 1080 Graphics Card Roundup

Zotac GTX 1080 AMP! Extreme

Zotac's GeForce GTX 1080 AMP! Extreme is one of the company's highest-end models based on GP104, and it actually sells for more than a Founders Edition card. There are six different GTX 1080 cards on Zotac's site: two are built specifically for water cooling, two employ Nvidia's reference design, and the remaining pair sport slightly different air coolers with their own clock rates. That relatively straightforward structure helps minimize the confusion that often plagues line-ups with too many versions.

Its AMP! Extreme aims for the performance crown. And while that distinction isn't out of reach, the competition is pretty fierce. Zotac helps make its case with a bundled application it calls FireStorm, used for configuring clock rates, fans, and the LED lighting. Of course, we always test graphics cards as they arrive out of the box for fairness' sake. 

Technical Specifications


MORE: Best Graphics Cards


MORE: Desktop GPU Performance Hierarchy Table


MORE: All Graphics Content

Exterior & Interfaces

The cooler shroud is made of a light alloy with carbon fiber-look accents. The card weighs 47.5oz (1350g) and measures an impressive 13 inches (32.5cm) long, five inches (12.5cm) tall, and two inches (5.3cm) wide.

All of that extra length comes from the arrangement of three fans, each with a rotor blade diameter of 3⅓ inches (8.5cm), and the top cover. As you might imagine, it's not easy to get such a long card situated in shallow cases.

The back of the board is covered by a one-piece plate that doesn't help cooling, but also doesn't have a negative impact thanks to plenty of openings for ventilation.

If you're interested in a multi-GPU configuration, plan to accommodate an additional one-fifth of an inch (5mm) in depth beyond the plate. However, this card isn't ideal for SLI setups, where you need every little bit of space between boards. Pulling off the backplate hurts the 1080 AMP! Extreme's stability.

Up top, a centered Zotac logo is inset into an acrylic plate, and its color and lighting effects are controlled through software. Two eight-pin power connectors are rotated by 180° and positioned at the end of the card. The design actually feels rather slim, despite its extreme dimensions. Surely there are enthusiasts who'll appreciate this.

A closed-off end indicates that Zotac's cooling fins are oriented vertically, so all of this card's waste heat is going to get pushed out the top and bottom, rather than the front and back.

The rear bracket features five outputs, of which a maximum of four can be used simultaneously in a multi-monitor setup. In addition to one dual-link DVI-D connector (be aware that there is no analog signal), the bracket also exposes one HDMI 2.0b and three DisplayPort 1.4-ready outputs. The rest of the plate is mostly solid, with several openings cut into it that look like they're supposed to improve airflow, but don't actually do anything.

Board & Components

Zotac mounts the GP104 package in a modified frame with a raised edge. In cases like this, where heavy coolers put quite a bit of pressure on the processor, this comes as a welcome relief (even if it isn't needed; during normal operation the thermal solution's weight shouldn't be a problem).

The card uses GDDR5X memory modules from Micron, which are sold along with Nvidia's GPU to board partners. Eight memory chips (MT58K256M32JA-100) transferring at 10 Gb/s per pin are attached to a 256-bit interface, allowing for a theoretical bandwidth of 320 GB/s.
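That 320 GB/s figure follows directly from the interface width and the per-pin data rate. Here's the arithmetic as a quick sketch, using only the values quoted above:

```python
# Theoretical GDDR5X bandwidth: per-pin data rate x interface width / 8 bits per byte
data_rate_gbps = 10        # Gb/s per pin
bus_width_bits = 256       # memory interface width
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)      # 320.0 GB/s
```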

Zotac relies on a uPI Semiconductor µP9511P for PWM control of the GPU's power phases. It's actually a 6+2-phase model, so the trick is to manage GPU and memory power delivery independently of each other. That way, Zotac can use all of the controller's outputs for GPU phases, each of which is equipped with two 100A Sinopower SM4502NHKPs on the low side and one 60A SM4502NHKP on the high side.

The memory is controlled by a smaller uP1666 with two separate phases, each equipped with the same three-part combination of N-channel MOSFETs as the GPU phases.

Unfortunately, Zotac's chokes are no-name clones of Foxconn's Magic series. These AIO chokes appear to have been soldered in by hand (in a rather messy way). They're not really quiet, but still better than the cheapest coils we've seen.

Zotac's load distribution is also noteworthy: only one of the two memory phases gets power from the motherboard slot; the other is attached to the auxiliary power connectors. The situation looks similar for the GPU phases, of which only one draws its power from the motherboard. Taken against the power target, and before subtracting the memory's share, the eight GPU phases handle up to 270W between them, or roughly 34W per phase.
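As a quick sanity check on that per-phase figure, dividing the full power target across the eight GPU phases (ignoring the memory's share, which would only lower the result) gives:

```python
# Upper bound on per-phase load: the full 270W power target spread across eight GPU phases
power_target_w = 270
gpu_phases = 8
print(round(power_target_w / gpu_phases, 2))  # 33.75 -> the ~34W per phase quoted above
```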

In order to satisfy PCI-SIG compliance testing, the memory can get all of its power from the motherboard, or the load can be split between one memory and one GPU phase. Zotac clearly went with the latter option. This also explains why some manufacturers limit their power targets so strictly and can't (or don't want to) provide further BIOS updates.

Two capacitors are installed right below the GPU to absorb and equalize peaks in voltage. The large and (thanks to a recess in the backplate) highly visible "Power Boost" capacitor is probably more of a marketing gimmick, though Zotac claims it helps reduce ripple noise and minimizes power fluctuations, extending the card's life.

Power Results

Before we look at power consumption, we should talk about the correlation between GPU Boost frequency and core voltage, which are so similar that we decided to put their graphs one on top of the other.

Right out of the box, the card already runs at 2025 MHz during our gaming loop after warming up. This can be explained by a combination of the raised 319 MHz idle frequency (which, under Nvidia's rules, buys the card an extra GPU Boost bin), Zotac's excellent thermal solution, and a high power target of up to 270W. Using Zotac's own software, we were able to get the card stable at over 2100 MHz, though the fans were extremely loud at that point.

After warm-up, GPU Boost drops to 2025 MHz during our gaming workload. The stress test nudges it down even more, and we see clock rates as low as 1936 MHz. That means the voltages start at 1.05V and end up in the 0.962V range.

Summing up measured voltages and currents, we arrive at a total consumption figure we can easily confirm with our test equipment by monitoring the card's power connectors.
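For illustration only, that bookkeeping amounts to summing voltage times current for each supply rail. The values in the sketch below are invented placeholders, not our measurements:

```python
# Illustrative only: total board power as the sum of voltage x current per supply rail.
# The (volts, amps) pairs below are made-up placeholder values, not measured data.
rails = {
    "12V PCIe plug 1":     (12.1, 7.5),
    "12V PCIe plug 2":     (12.1, 7.0),
    "12V mainboard slot":  (12.0, 2.4),
    "3.3V mainboard slot": (3.3, 0.8),
}
total_w = sum(volts * amps for volts, amps in rails.values())
print(round(total_w, 1))  # one consumption figure, comparable to the power-connector readings
```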

As a result of Nvidia's restrictions, manufacturers sacrifice the lowest possible frequency bin in order to gain an extra GPU Boost step. So, Zotac's idle power consumption is disproportionately high, since the card idles at 319 MHz.

Power Consumption
Idle: 15W
Idle (Multi-Monitor): 16W
Blu-ray: 17W
Browser Games: 115-136W
Gaming (Metro: Last Light at 4K): 207W
Torture (FurMark): 272W

Now let's take a more detailed look at power consumption when the card is idle, when it's gaming at 4K, and during our stress test. The graphs show the distribution of load between each voltage and supply rail, providing a bird's eye view of variations and peaks:

Temperature Results

Zotac uses a massive copper sink to cool GP104. It transfers heat into a large aluminum base plate, which simultaneously cools the memory modules and their two power phases. An array of aluminum fins helps dissipate thermal energy over lots of surface area with the help of four 8mm and two 6mm heat pipes made from a copper composite material.

The GPU power supply's eight phases are connected to 24 MOSFETs, which are covered by a nonsensical passive cooler that's supposed to receive some air flow from above. It turns out Zotac was overly optimistic about how this all works, especially when we consider the card's 270W default power target that you can push even higher through software.

The backplate is attached with several screws and doesn't help cool the card in any way. Instead, its sole purpose is supporting the PCB's structural integrity.

Zotac's thermal solution and default fan curve facilitate a 70°C reading from the GPU while gaming. This gives us no real reason to worry. Even during our stress test, a maximum of 74°C isn't a problem (those numbers are 73°C and 78°C inside a case).

That's just the GPU, though...

The 69°C measured behind GP104's package is on par with what the processor's own diode reports. But the 89°C observed at the VRMs is more critical due to the spread of heat across the board. An 84°C reading just below the memory is barely within specification.

This gets more troubling at power levels in excess of 220W. Using the default 270W power target, we measured 107°C under the MOSFETs due to a lack of sufficient cooling (technically still acceptable) and 95°C right under the three memory modules closest to that hot-spot. The second reading exceeds the specification by a long shot, and is reason enough not to mess with a higher power target.

Sound Results

Looking at the chart below might suggest a sloppy implementation of hysteresis. The constant on/off/on/off cycling during our gaming workload is extremely annoying. In less demanding titles at lower resolutions (like Fallout 4 at 2560x1440), the cycle gets stuck in an endless loop, since the card never significantly exceeds 64 to 66°C. You might try fixing this with the card's FireStorm software; unfortunately, that doesn't work.

Zotac should seriously question the quality of its fans. The rear-most fan (whose RPM value is reported to the controller) stops spinning just below 900 RPM, so the firmware's hold value of approximately 700 RPM can never be reached; it's simply too low for the fan to sustain. In turn, the firmware constantly tries to restart all three fans at their full 1300 RPM, then lowers the speed until the fan stalls again.
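To make that failure mode concrete, here is a minimal sketch of such a control loop. The stall and restart thresholds are the ones we observed on our sample; the hold value and ramp step are illustrative assumptions, not anything taken from Zotac's firmware:

```python
# Illustrative sketch (not Zotac's firmware): a hold speed below the fan's
# stall threshold produces an endless restart/ramp-down cycle.
STALL_RPM = 900      # fan physically stops below this (observed on our sample)
HOLD_RPM = 700       # firmware's quiet-mode target, which the fan can't sustain (assumption)
RESTART_RPM = 1300   # full-speed kick used to get a stalled fan spinning again
RAMP_STEP = 100      # arbitrary ramp-down step for illustration

rpm = RESTART_RPM
for tick in range(12):
    if rpm < STALL_RPM:
        rpm = RESTART_RPM                     # fan stalled: kick back to full speed
    else:
        rpm = max(HOLD_RPM, rpm - RAMP_STEP)  # ramp down toward the hold value
    print(tick, rpm)
# The speed cycles 1200, 1100, ..., 800, 1300, 1200, ... forever: the fan never
# settles, which matches the on/off behavior audible during gaming.
```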

While manually configuring a 900 RPM fan speed does somewhat ameliorate the situation, a proper silent mode followed by a moderately rising curve is still not possible. Within our sample, the minimum RPM of all fans varied between approximately 820 and 900. Anything lower and they would just stop.

When the card is idle, a semi-passive mode keeps the 1080 AMP! Extreme silent. We abstained from taking any measurements in that state.

After running at full load for a long time, the card registers an impressive 34 dB(A) thanks to constant fan speeds of about 1200 RPM, though the bass-heavy bearing and motor sounds are clearly audible. These are transmitted as structure-borne noise to the card's housing, which may result in further resonance and vibration.

Here's the spectrum of our gaming workload, which reflects the starting/stopping fan behavior really well. The frequency changes, from the start impulse (measurable up to ~4 kHz) through the subsequent RPM decay all the way to a standstill, are especially visible in the range between about 80 and 250 Hz. Thanks to peak values of up to 1300 RPM, the average noise level rises to 35 dB(A), with measured peaks of almost 37 dB(A).

The one drop-out in the treble range, which shows up as a narrow, horizontal, rather blue stripe, is precisely the moment where one loop ends and the next begins.

As brutal as this cooler might look, and no matter how much potential it might have, its performance is just thwarted by the fans. Zotac needs to invest in solving this issue with double ball-bearing fans and a significantly lower start-up rotational speed.

Zotac GTX 1080 Amp! Extreme

Reasons to buy

+ Clock speeds
+ FireStorm tuning software
+ GPU temperature
+ Spectra lighting

Reasons to avoid

- Constantly changing fan speed
- Price
- Size
- Voltage regulator temperature



MORE: Best Deals


MORE: Hot Bargains @PurchDeals

  • ledhead11
    Love the article!

    I'm really happy with my two Xtremes. Last month I cranked our A/C to 64°F, closed all the vents in the house except the one over my case, and set the fans to 100%. I was able to game at 2-2.1GHz all day at 4K. It was interesting to see GPU usage drop a couple of percent while fps gained a few at 4K, all while keeping the temps below 60°C.

    After it was all said and done though, the noise wasn't really worth it. Stock settings are just barely louder than my case fans and I only lose 1-3fps @ 4k over that experience. Temps almost never go above 60c in a room around 70-74f. My mobo has the 3 spacing setup which I believe gives the cards a little more breathing room.

    The Zotacs were actually my first choice, but Gigabyte made it so easy on Amazon and all the extra stuff was pretty cool.

    I ended up recycling one of the SLI bridges for my old 970s, since my board needed the longer one from Nvidia. All in all, a great value in my opinion.

    One bad thing I forgot to mention, and it's in many customer reviews, videos, and a fair number of images: bent fins on a corner of the card. The foam packaging slightly bends one of the corners on the cards. You see it right when you open the box. Very easily fixed, and it happened on both of mine. To me, not a big deal, but again worth mentioning.
    Reply
  • redgarl
    The EVGA FTW is a piece of garbage! The video signal drops randomly and makes my PC crash on Windows 10. Not only that, but my first card blew up after 40 days. I am on my second one, and I am getting rid of it as soon as Vega is released. EVGA dropped the ball big time on this card. Their engineering design and quality assurance are as bad as Gigabyte's. This card's VRAM literally burns over time. My only hope is waiting a year and RMAing the damn thing so I can get another model. The only good thing is the customer support... they take care of you.
    Reply
  • Nuckles_56
    What I would have liked to have seen was a list of the maximum overclocks each card got for core and memory and the temperatures achieved by each cooler
    Reply
  • TheSilverSky
    No love for the Asus Turbo?
    Reply
  • Hupiscratch
    It would be good if they get rid of the DVI connector. It blocks a lot of airflow on a card that's already critical on cooling. Almost nobody that's buying this card will use the DVI anyway.
    Reply
  • Nuckles_56
    Hupiscratch said:
    It would be good if they get rid of the DVI connector. It blocks a lot of airflow on a card that's already critical on cooling. Almost nobody that's buying this card will use the DVI anyway.

    Two things here, most of the cards don't vent air out through the rear bracket anyway due to the direction of the cooling fins on the cards. Plus, there are going to be plenty of people out there who bought the cheap Korean 1440p monitors which only have DVI inputs on them who'll be using these cards
    Reply
  • ern88
    I have the Gigabyte GTX 1080 G1 and I think it's a really good card. Can't go wrong with buying it.
    Reply
  • The best card out of the box is the EVGA FTW. I am running two of them in SLI under Windows 7, and they run freaking cool. No heat issues whatsoever.
    Reply
  • Mike_297
    I agree with 'THESILVERSKY': why no Asus cards? According to various reviews, their Strix line includes some of the quietest cards going!
    Reply
  • trinori
    LOL, you didn't include the ASUS STRIX OC?!?
    Well, you just voided the legitimacy of your own comparison/breakdown post, didn't you...
    "hey guys, here's a cool comparison of all the best 1080s by price and performance so that you can see which is the best card, except for some reason we didn't include arguably the best performing card available, have fun!"
    lol please..
    Reply