
EVGA GeForce GTX 1080 Ti FTW3 Gaming Review

Final Analysis

To do EVGA's GeForce GTX 1080 Ti FTW3 Gaming justice, we have to take a step back. After all, the company shows courage by refusing to accept that a GeForce GTX 1080 Ti needs to monopolize three expansion slots. Its offering thus joins the Founders Edition board as one of only two respectable dual-slot cards out there.

That takes us to the physical limits of such a design, which can't be circumvented (not even by Nvidia). But despite the challenges of cooling such a powerful card quietly, for the most part, EVGA does this well. Delivering a higher-performing card able to operate at lower temperatures while generating less noise would have required more heft all around, and that appears to be exactly what EVGA was trying to avoid. In a dual-slot form factor, then, it doesn't get much better.

As a trade-off, you don't get much overclocking headroom from a stock GeForce GTX 1080 Ti FTW3 Gaming. If EVGA's cooler were replaced by a real water block, the card would certainly be capable of much higher clock rates given its maximum power target of 350W. But the end result would still depend on winning the silicon lottery.

We very much appreciate the innovation that went into EVGA's sensor-controlled fans. The many sensors must be paradise for enthusiasts who thrive on collecting, processing, and dialing in their settings based on real data. This feature, together with a unique (but still capable) dual-slot cooler, gives us cause to explicitly recommend the GeForce GTX 1080 Ti FTW3 Gaming.

So far, all of the 1080 Tis we've reviewed have special characteristics that differentiate them, all but guaranteeing excitement from their target audiences. This card is no exception. It may be a bit of a niche product, but it's convincing for those who can appreciate it.


MORE: Best Graphics Cards


MORE: Desktop GPU Performance Hierarchy Table


MORE: All Graphics Content

  • AgentLozen
    I'm glad that there's an option for an effective two-slot version of the 1080Ti on the market. I'm indifferent toward the design but I'm sure people who are looking for it will appreciate it just like the article says.
    Reply
  • gio2vanni86
I have two of these, and I'm still disappointed in the SLI performance compared to my 980s. What can I do but complain? Nvidia needs to do a driver overhaul; these puppies should scream together. Instead they do the opposite, which makes me turn SLI off, and boom, I get better performance from one. It's pathetic. Since Nvidia already got rid of triple SLI, they might as well kill SLI altogether.
    Reply
  • ahnilated
    I have one of these, and the noise at full load is very annoying. I am going to install one of Arctic Cooling's heatsinks. I would think that with a three-fan setup this card would cool better and not have a noise issue like this. I was quite disappointed with the noise levels on this card.
    Reply
  • Jeff Fx
    19811038 said:
    I have two of these, and I'm still disappointed in the SLI performance compared to my 980s. What can I do but complain? Nvidia needs to do a driver overhaul; these puppies should scream together. Instead they do the opposite, which makes me turn SLI off, and boom, I get better performance from one. It's pathetic. Since Nvidia already got rid of triple SLI, they might as well kill SLI altogether.

    SLI has always had issues. Fortunately, one of these cards will run games very well, even in VR, so there's no need for SLI.
    Reply
  • dstarr3
    19811038 said:
    I have two of these, and I'm still disappointed in the SLI performance compared to my 980s. What can I do but complain? Nvidia needs to do a driver overhaul; these puppies should scream together. Instead they do the opposite, which makes me turn SLI off, and boom, I get better performance from one. It's pathetic. Since Nvidia already got rid of triple SLI, they might as well kill SLI altogether.

    It needs support from nVidia, but it also needs support from every developer making games. And unfortunately, users sporting dual GPUs are a pretty tiny sliver of the total PC user base, so devs aren't too eager to pour that much support into something that doesn't work out of the box.
    Reply
  • FormatC
    Dual-GPU is always a problem and not so easy to realize for programmers and driver developers (profiles). AFR is totally limited, and I hope that in the future we will see more Windows/DirectX-based solutions. If....
    Reply
  • Sam Hain
    For those praising the 2-slot design for its "better-than" fit for SLI... True, it does make for a better fit, physically.

    However, SLI is and has been fading for both NV and devs. Second, the heat signature and fan-profile requirements in a closed case for just one of these cards should be warning enough to veer away from running 2-way SLI on stock (and sometimes even third-party) air cooling solutions.
    Reply
  • SBMfromLA
    I recall reading an article somewhere that said Nvidia is trying to discourage SLI and purposely makes cards underperform in SLI mode.
    Reply
  • Sam Hain
    19810871 said:
    Unlike Asus & Gigabyte, which slap 2.5-slot coolers on their GTX 1080 Tis, EVGA remains faithful to a smaller form factor with its GTX 1080 Ti FTW3 Gaming.

    EVGA GeForce GTX 1080 Ti FTW3 Gaming Review : Read more

    Great article!
    Reply
  • photonboy
    Nvidia does not "purposely make them underperform in SLI mode." And to be clear, SLI has different versions; it's AFR that is disappearing. In the short term I wouldn't use multi-GPU at all. In the LONG term we'll be switching to Split Frame Rendering.
    http://hexus.net/tech/reviews/graphics/916-nvidias-sli-an-introduction/?page=2

    SFR really needs native support at the GAME ENGINE level to minimize the work required to support multi-GPU. That can and will happen, but I wouldn't expect it to have much support for about TWO YEARS or more. Remember, games usually take 3+ years to build, so anything complex usually needs to be part of the game engine when you START making the game.
    Reply