
Gigabyte GeForce GTX Titan Black: Do-It-Yourself Cooler Swap
By Igor Wallossek

Performance Benchmarks

We're in the process of adding Gigabyte's GeForce GTX Titan Black GHz Edition to our 2014 VGA Charts, which will facilitate a complete performance breakdown. In the meantime, I wanted to touch on a normalized comparison in this review as well.

Additionally, I thought it'd be good to explore whether or not 6 GB of graphics memory makes a big difference in the benchmarks. Originally, it was believed that Nvidia's partners would start selling GeForce GTX 780 and 780 Ti cards with 6 GB rather than 3 GB. There's currently one 780 from EVGA with 6 GB. However, plans for 6 GB 780 Tis were scrapped when it became clear they'd cannibalize sales of the Titan Black among affluent enthusiasts.

GPU Boost Frequency Under Load

We've gone into great depth about how Nvidia's GPU Boost technology works in theory and practice, so let's compare the company's reference GeForce GTX Titan, an overclocked Gigabyte GeForce GTX Titan Black, and the overclocked Gigabyte card with its modified cooler.

The overclocked Gigabyte board with Nvidia's reference cooler stays well under its 1100 MHz ceiling; its average frequency hovers around just 1050 MHz. The modified version is quite a bit faster, averaging 1150 MHz. That's a gain of about 100 MHz, with lower temperatures and less noise to boot. Nvidia's reference GeForce GTX Titan really can't compete. The overclocked Gigabyte model with its WindForce cooler lands just behind the Gigabyte GeForce GTX 780 Ti WindForce OC's Boost clock frequency of almost 1170 MHz.
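
If you'd like to watch this behavior on your own card, a minimal sketch is below. It polls the core clock and temperature through NVML and assumes the third-party pynvml Python bindings are installed; this is an illustration, not the tooling we used for these charts.

    # Poll the graphics clock and temperature once a second via NVML.
    # Assumes: pip install nvidia-ml-py (pynvml), one Nvidia GPU at index 0.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    try:
        for _ in range(60):  # one sample per second; run your game or load test alongside
            clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            print("core: %4d MHz  temp: %2d C" % (clock, temp))
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()

Averaging the logged clocks over a long session is how the Boost figures above should be read: a sustained average under load, not a momentary peak.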

We're using the 1080p benchmarks from our VGA Charts at the highest possible settings for this normalized performance comparison.

Comparing 1080p and 2160p: Is 6 GB Worth It?

If you go by the performance benchmarks, 6 GB doesn't appear necessary at all. There's barely a difference between the two configurations. This is complicated by the fact that a single-GPU configuration still can't really handle 3840x2160. In order to enjoy the highest resolutions and most demanding detail settings, you want a couple of high-end cards in SLI or CrossFire. To keep the comparison fair, I overclocked the modified Gigabyte GTX Titan Black GHz Edition with the WindForce cooler to the same 1020 MHz base clock speed as the factory-overclocked Gigabyte GTX 780 Ti WindForce OC.

The head-to-head comparison paints a sobering picture. Apart from double-precision compute performance (for the handful of folks who actually need it), a similarly-clocked GeForce GTX 780 Ti is clearly the better choice, especially since it's a lot less expensive.

Now, here's the wild card: 6 GB is very likely more important to an SLI configuration. I don't have the cards or the FCAT suite here to properly test such a decadent setup. However, I also know from Chris Angelini that a couple of 3 GB cards rendering cooperatively start to demonstrate stuttering artifacts at 2160p, particularly as you crank up the detail settings. In those cases, we think you'll lament the fact that there aren't any GeForce GTX 780 Ti 6 GB boards available.

Comments
  • ShadyHamster, June 9, 2014 2:10 AM (+9)
    Quote:
    So we have to change the cooler ourselves? Now that's weird.


    If you bothered reading the first page, you'd know why.
  • bloodgigas, June 9, 2014 2:25 AM (-7)
    Quote:
    Quote:
    So we have to change the cooler ourselves? Now that's weird.


    If you bothered reading the first page, you'd know why.


    "Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with proprietary cooling. However, Gigabyte now offers a GHz Edition of the card that comes bundled with its WindForce solution, which you can install on the overclocked board yourself."

    This one, right? What's the difference between installing it yourself and Gigabyte taking the initiative to install it at the factory? Or does it void the warranty?
  • FormatC, June 9, 2014 3:21 AM (+7)
    It is one of Nvidia's funny rules.

    OK, for better understanding:
    Nvidia doesn't allow its partners to sell the GeForce GTX Titan Black with factory-installed proprietary cooling.
  • envy14tpe, June 9, 2014 5:11 AM (-1)
    If you're dropping that kind of cash on a Titan, I really wish you'd go with liquid cooling. It seems to be the best.
  • chaosmassive, June 9, 2014 9:17 AM (-3)
    Nvidia doesn't permit installing any 3rd-party cooling on their chip? Now that's weird!
  • rohitbaran, June 9, 2014 9:23 AM (+10)
    That's some way to circumvent Nvidia's rule. Nicely done, Gigabyte!
  • wolverine96, June 9, 2014 10:27 AM (0)
    Very nice, Gigabyte! I almost wish I had bought one. I have one of those "out of stock ASUS cards from Newegg". I am not disappointed, though. The card handles 84 degrees Celsius just fine!

    Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you properly set the tile size!
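
    For anyone trying that, here is a minimal sketch of the Cycles GPU and tile-size setup in the Blender 2.7x Python API (240x135 is just one choice that evenly divides a 1920x1080 frame; check the property names against your build):

        # Blender 2.7x: render with Cycles on the GPU using explicit tile sizes.
        import bpy

        scene = bpy.context.scene
        scene.render.engine = 'CYCLES'
        scene.cycles.device = 'GPU'   # requires CUDA enabled in User Preferences > System
        scene.render.tile_x = 240     # 240x135 tiles a 1920x1080 frame evenly (8 x 8 tiles)
        scene.render.tile_y = 135
        bpy.ops.render.render(write_still=True)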
  • Damn_Rookie, June 9, 2014 10:29 AM (+2)
    Quote:
    It is one of Nvidia's funny rules.

    OK, for better understanding:
    Nvidia doesn't allow its partners to sell the GeForce GTX Titan Black with factory-installed proprietary cooling.

    Probably a silly question, but why does Nvidia allow only EVGA to break this rule with the Hydro Copper Signature edition you mentioned? Is it just because it's a water-cooled model? Do you think Nvidia specially signs off on the design?

    I'm genuinely curious.
  • Gunbuster, June 9, 2014 11:49 AM (+1)
    Do the individual OEMs even make the reference cards, or does Nvidia just sell/ship them cards binned to its clock speed specification from one central ODM factory, with the OEM putting them in its own box?
  • FormatC, June 9, 2014 12:12 PM (0)
    All these cards come from Nvidia. There's no chance for a custom PCB or cooler.
  • ingtar33, June 9, 2014 4:34 PM (+2)
    Wait, wait, wait... am I reading this wrong, or is the Titan Black with Nvidia's blower-style reference cooler thermally throttling?

    That's what it looks like to me.
  • FormatC, June 9, 2014 9:38 PM (+2)
    You are right; it's shown in the graphs for temperature and clock speed. EVERY reference card from Nvidia throttles under sustained load. That's the downside of combining the temperature target with a quieter fan profile. Nothing new; it's been a "feature" since the GTX 780. You have to run the original fan at a fixed 65% rpm to keep the card away from its thermal limit. But that is really loud. :D 
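
    A quick way to check this on your own card is to sample clock, temperature, and fan speed together; here is a rough sketch using the pynvml NVML bindings (the 80 C target and the 1000 MHz floor are illustrative assumptions, not Nvidia's published numbers):

        # One-shot throttle check via NVML (pynvml bindings).
        import pynvml

        pynvml.nvmlInit()
        h = pynvml.nvmlDeviceGetHandleByIndex(0)
        clock = pynvml.nvmlDeviceGetClockInfo(h, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(h, pynvml.NVML_TEMPERATURE_GPU)
        fan = pynvml.nvmlDeviceGetFanSpeed(h)  # percent of maximum rpm
        print("core %d MHz, %d C, fan %d%%" % (clock, temp, fan))
        # Temperature pinned at the target while the core sits below its rated
        # boost is the throttling pattern described above (thresholds assumed).
        if temp >= 80 and clock < 1000:
            print("likely thermal throttling - raise the fan speed or swap the cooler")
        pynvml.nvmlShutdown()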
  • SessouXFX, June 11, 2014 3:11 AM (0)
    ...still like the EVGA option better.
  • mapesdhs, June 11, 2014 3:29 PM (+1)
    wolverine96 writes:
    > Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use
    > Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you
    > properly set the tile size!

    Arion Bench 2.5.0 would be a better test, because it scales perfectly with
    multiple GPUs.

    Or the AE CUDA test my friend has created, but it's pretty intense, maybe
    takes too long with just one card (about 20 minutes with a single 780Ti).

    Ian.

  • wolverine96, June 11, 2014 4:00 PM (+1)
    Quote:

    Arion Bench 2.5.0 would be a better test, because it scales perfectly with
    multiple GPUs.

    Or the AE CUDA test my friend has created, but it's pretty intense, maybe
    takes too long with just one card (about 20 minutes with a single 780Ti).

    Ian.



    I agree. The BMW scene is not the best CUDA benchmark. I just didn't want them to mess it up if they decided to use it. I heard some people complaining about this benchmark, although I don't know if they were right or wrong.

    My Titan Black renders the BMW in just over 24 seconds! :D  (Not including the post-process compositing, which uses the CPU. Tile size was set to 512x512.)
    For comparison, an Intel Core 2 Duo @ 2.33 GHz took 16 minutes!

    Have you run a Titan Black on that AE CUDA test? If so, I am curious to see the results!
  • mapesdhs, June 11, 2014 5:53 PM (+1)

    wolverine96 writes:
    > I agree. The BMW scene is not the best CUDA benchmark. ...

    I've tried it numerous times with various setups; it just seems to behave a
    bit weird, IMO.


    > My Titan Black renders the BMW in just over 24 seconds! :D  ...

    Main problem I find is I can't work out how to make it use all available GPUs.
    Is that possible? One of my 580s does it in about 43s, but my system has 4 of
    them, so it's a bit moot really. Mind you, I'm using an older version of
    Blender (2.61), stuck with it to ensure consistent CPU-based testing.

    And as you say, it also involves some CPU stuff (scene setup doesn't use the
    GPU).


    > Have you run a Titan Black on that AE CUDA test? If so, I am curious to see
    > the results!

    Alas no, atm I don't have access to anything newer than some top-end 3GB GTX
    580s (MSI LX, 832MHz); my system has 4 of them. Final version of the test file
    takes 14m 48s to render in AE using 16bpc and 8 samples (ie. just average
    quality), so on a Titan Black I'm guessing it would take maybe 25 mins? Hard
    to say. Would certainly be interesting to find out. Note the 'max' quality
    setting would be 32bpc and 10 samples (likewise, for the full animation, avg
    quality is 1080p @ 25Hz, max quality is 50Hz).

    I'll sort out the test readme, download archive, web page, etc., next week,
    but need to talk to C.A. first about some things. Anyway, here's the rendered
    image in original Targa format (just one frame, the default test frame 96, the
    last frame in the main animation sequence):

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.tga

    Here's the file converted to BMP and SGI IRIS/RGB:

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.bmp
    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.rgb

    and for those who don't mind losing a bit of quality, here's a 100% JPEG:

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.jpg


    The full 4 second animation takes hours to compute even at average quality and
    is thus intended more as a stress test for those interested in checking that
    their system can handle long renders or other GPU tasks without falling over
    (I've seen many people asking for a test like this on forums). I suspect at max
    quality the whole sequence would take about a week to crunch on my system. :D 
    Also interesting for exploring power consumption & energy cost issues for
    different GPU configs (load draw on my system during the render is around 920W).

    Ian.

  • wolverine96, June 11, 2014 8:23 PM (0)
    Did you say you are having trouble getting multiple GPUs to work? I only use one GPU, but here's a very informative link. More specifically, see this section.
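
    The gist of that section, as a sketch against the Blender 2.70-era Python API (the exact CUDA_MULTI_* value depends on how many GPUs your build detects, so treat these names as assumptions to verify):

        # Blender 2.7x: enable CUDA and select multiple GPUs for Cycles.
        import bpy

        prefs = bpy.context.user_preferences.system
        prefs.compute_device_type = 'CUDA'
        prefs.compute_device = 'CUDA_MULTI_2'   # e.g. two cards; 'CUDA_0' for one GPU
        bpy.context.scene.cycles.device = 'GPU'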

    Your system with 4 GTX 580s is much faster than mine! (Two GTX 580s are about as fast as one GTX Titan Black.) I guess the only time mine would be faster is if the scene used more than 3GB of RAM. I actually was planning on getting two GTX 580s, but then I discovered the Titan Black.

    Is that Cycles in those images you posted?

    By the way, Blender 2.71 is coming out very soon. In the past 10 versions, there have been some major performance gains for Cycles. I think it's like 30-50% faster in some cases.
  • mapesdhs, June 12, 2014 3:49 AM (+1)
    wolverine96 writes:
    > Did you say you are having trouble getting multiple GPU's to work? I only use one
    > GPU, but here's a very informative link. More specifically, see this section.

    Thanks!! My goof, looks like V2.61 doesn't have the Compute Panel. Will try
    the newer 2.70a in a moment... (downloading now)


    > Your system with 4 GTX 580's is much faster than mine! ...

    Yup, though I suspect your power bill is less likely to make your eyeballs explode. :D 


    > ... I guess the only time mine would be faster is if the scene used more than
    > 3GB of RAM. ...

    I had been hoping we'd see 6GB 780Tis, but seems like that's been killed off. Shame.


    > I actually was planning on getting 2 GTX 580's, but then I discovered the Titan Black.

    The real advantage of multiple 580s is just low upfront cost. Standard 580s are pretty
    cheap (I have four 1.5GB 797MHz models which cost about 400 UKP total), if one's ok
    with the VRAM limit. 3GB 580s cost a bit more, but not much more (I've bought/sold
    nine Palit 3GB 580s in the past year). The MSI LXs though can be a tad pricey, depends
    on luck really I guess. I got mine (five total) for good prices though overall, and they do
    oc like crazy (1GHz+ is possible).


    > Is that Cycles in those images you posted?

    No, it's the RayTrace3D renderer within After Effects.


    > By the way, Blender 2.71 is coming out very soon. In the past 10 versions,
    > there have been some major performance gains for Cycles. I think it's like 30-50%
    > faster in some cases.

    Good that they keep boosting it, but a nightmare for benchmarking consistency. :D 


    Ok, download done, quick test...

    Cycles does the BMW in 11.56s (blimey!), tile size 240x135. Just curious btw,
    you mentioned using 512x512 tile size, but surely it'd be optimal to use an even
    divisor of the image dimensions in both X and Y? What do you get if you try
    a tile size of 240x135?
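
    (The even divisors are easy to enumerate; plain arithmetic, nothing Blender-specific:)

        # Tile sizes that divide a 1920x1080 frame evenly in both axes.
        W, H = 1920, 1080
        print([(W // n, H // n) for n in range(1, 17) if W % n == 0 and H % n == 0])
        # n = 8 gives (240, 135); a 512x512 tile cannot divide either axis evenly.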

    Ian.
