Gigabyte GeForce GTX Titan Black: Do-It-Yourself Cooler Swap

A Step-by-Step Guide to Upgrading the GeForce GTX Titan Black

We’re taking apart a thousand-dollar GeForce GTX Titan Black today. Obviously, the operation requires deliberate movements and a heaping dose of caution. Simply ripping off the reference cooler and bolting down Gigabyte's replacement is not the way to approach this.

First, the reference cooler needs to be unscrewed. Gigabyte does include a universal tool for this, but using your own screwdriver is both faster and easier. Fortunately, the bundled manual is comprehensive enough to walk you through the process without any confusion that could lead to mistakes.

Because the original reference cooler should be saved, I collected all of its tiny screws in a small bowl. From there, it's easy to lift the heat sink up and off. Don't rush, though; you need to unplug one connector up top and another down below before the heat sink clears the card.

With Nvidia's cooler removed, you're greeted by a bunch of gunk. The company doesn't shy away from blanketing its GK110 GPU with thick, heavy, hard-to-remove thermal paste.

Gigabyte must anticipate this for each of its GeForce GTX Titan Blacks, so it provides a cloth for cleaning up the compound. That's a solid idea, though we'd also recommend using the right cleaning solutions.

As mentioned, the reference cooler (which is actually quite valuable) and its screws should be stowed away in case warranty-related issues necessitate sending the card back. Better safe than sorry.

I chose to set aside the thick thermal paste Gigabyte bundles with its GeForce GTX Titan Black GHz Edition. There are higher-end solutions out there, after all. You want a non-electrically conductive compound that doesn’t need a lot of burn-in, is easy to apply and spread, and performs well. After quite a bit of comparison testing, I standardized on Gelid’s GC-Extreme for my graphics card reviews, so I lean on that compound to normalize one more variable as I compare Gigabyte's card to other modified graphics cards in the lab.

It’s time to install the new cooler. Remove the protective covers from the thermal pads and pull off the big warning label. I took the extra step of cleaning the copper surface with Arctic’s purifier solution to make sure any remnants of glue were gone.

First, glue the four washers to their respective holes. Then, plug the fan connector into its socket on the bottom and rest the cooler on the graphics card. The WindForce logo's LED connector is attached next. Fasten the heat sink using the spring-loaded screws, tightened in a cross pattern.

Next, three small screws are secured into place around the voltage regulators. Finally, you'll find yourself with just one screw left. It requires a nut, which is also included. With it in place, the cooler installation is complete.

It’s a little disappointing that Gigabyte doesn't include a backplate. The cooler is fastened securely and solidly in seven places, and the card seems safe from flex, but additional cooling under the voltage regulators and rear-mounted memory modules would have been nice as well.
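
Once the card is back in a system, it's worth confirming that the WindForce cooler is doing its job. Here's a minimal sketch, assuming Nvidia's driver and its bundled nvidia-smi tool are installed (the query fields are standard nvidia-smi fields):

    # One-shot readout of temperature, core clock, fan speed, and power
    # draw via nvidia-smi, which ships with Nvidia's driver.
    import subprocess

    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.sm,fan.speed,power.draw",
         "--format=csv"],
        universal_newlines=True,
    )
    print(out)

Run it once at idle and again under a gaming or benchmark load; the delta tells you more than either number alone.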

Comments
  • ShadyHamster, June 9, 2014 2:10 AM
    Quote:
    So we have to change the cooler by ourselves? Now that's weird.


    If you bothered reading the first page you'd know why.
  • bloodgigas, June 9, 2014 2:25 AM
    Quote:
    Quote:
    So we have to change the cooler by ourselves? Now that's weird.


    If you bothered reading the first page you'd know why.


    "Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with proprietary cooling. However, Gigabyte now offers a GHz Edition of the card that comes bundled with its WindForce solution, which you can install on the overclocked board yourself."

    This one, right? What's the difference between installing it yourself and Gigabyte installing it at the factory? Or does a self-install void the warranty?
  • FormatC, June 9, 2014 3:21 AM
    It is one of Nvidia's funny rules.

    Ok, for your better understanding:
    Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with factory-installed proprietary cooling.
  • envy14tpe, June 9, 2014 5:11 AM
    If you are dropping that kind of cash on a Titan, I really wish people would go with liquid cooling. It seems to be the best option.
  • chaosmassive, June 9, 2014 9:17 AM
    Nvidia doesn't permit installing any third-party cooling on their chips. Now that's weird!
  • rohitbaran, June 9, 2014 9:23 AM
    That's some way to circumvent nVidia's rule. Nicely done, Gigabyte!
  • wolverine96, June 9, 2014 10:27 AM
    Very nice, Gigabyte! I almost wish I had bought one. I have one of those "out of stock ASUS cards from Newegg". I am not disappointed, though. The card handles 84 degrees Celsius just fine!

    Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you properly set the tile size!
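
For anyone reproducing that benchmark, here's a minimal sketch against Blender's 2.7x-era Python API (these property names were renamed in later releases); it points Cycles at the GPU and sets the tile size before rendering:

    # Blender 2.7x-era sketch: enable CUDA for Cycles and set the render
    # tile size. Run with: blender -b bmw.blend -P this_script.py
    import bpy

    # Expose CUDA devices to Cycles (2.7x API; moved in Blender 2.80+).
    bpy.context.user_preferences.system.compute_device_type = 'CUDA'

    scene = bpy.context.scene
    scene.cycles.device = 'GPU'    # render on the GPU instead of the CPU
    scene.render.tile_x = 512      # large tiles generally favor GPUs...
    scene.render.tile_y = 512      # ...smaller tiles favor CPUs

    bpy.ops.render.render(write_still=True)  # render the current frame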
  • Damn_Rookie, June 9, 2014 10:29 AM
    Quote:
    It is one of Nvidias funny rules.

    Ok, for your better understanding:
    Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with factory-installed proprietary cooling.

Silly question probably, but why does nVidia allow only EVGA to break this rule, with their Hydro Copper Signature edition you mentioned? Is it just because it's a water-cooled model? Do you think nVidia specially signs off on the design?

    I'm genuinely curious.
  • Gunbuster, June 9, 2014 11:49 AM
    Do the individual OEMs even make the reference cards, or does Nvidia just sell/ship them cards binned to their clock speed specification from one central ODM factory, with each OEM putting the card in its own box?
  • FormatC, June 9, 2014 12:12 PM
    All of these cards come from Nvidia. There's no chance for a custom PCB and cooler.
  • ingtar33, June 9, 2014 4:34 PM
    Wait, wait, wait... am I reading this wrong, or is the Titan Black with Nvidia's proprietary blower-style cooler thermally throttling?

    That's what it looks like to me.
  • FormatC, June 9, 2014 9:38 PM
    You are right; this is shown in the graphs for temperature and clock speed. EVERY reference card from Nvidia throttles under longer loads. This is the disadvantage of combining the temperature target with a quieter cooler profile. It's nothing new, because it has been a "feature" since the GTX 780. You have to run the original fan at a fixed 65% rpm to keep the card away from its thermal limits. But that is really loud. :D
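
To watch that throttling happen on your own card, here's a rough monitoring sketch. It assumes nvidia-smi is on the path; the 80 °C temperature target reflects GPU Boost 2.0's default on GK110 boards, and the baseline clock is a hypothetical value to adjust for your card:

    # Sample temperature and SM clock once a second and flag readings
    # where the clock falls below a chosen baseline while the GPU sits
    # at its temperature target -- the signature of GPU Boost backing off.
    import subprocess
    import time

    BASELINE_MHZ = 980   # hypothetical sustained boost clock; adjust per card
    TEMP_TARGET = 80     # GPU Boost 2.0 default target on GK110 boards

    def read_gpu():
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=temperature.gpu,clocks.sm",
             "--format=csv,noheader,nounits"],
            universal_newlines=True,
        )
        temp, clock = (int(v) for v in out.strip().split(", "))
        return temp, clock

    for _ in range(300):  # five minutes of one-second samples under load
        temp, clock = read_gpu()
        flag = "THROTTLING" if temp >= TEMP_TARGET and clock < BASELINE_MHZ else ""
        print(temp, clock, flag)
        time.sleep(1)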
  • SessouXFX, June 11, 2014 3:11 AM
    ...still like the EVGA option better.
  • mapesdhs, June 11, 2014 3:29 PM
    wolverine96 writes:
    > Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use
    > Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you
    > properly set the tile size!

    Arion Bench 2.5.0 would be a better test, because it scales perfectly with
    multiple GPUs.

    Or the AE CUDA test my friend has created, but it's pretty intense, maybe
    takes too long with just one card (about 20 minutes with a single 780Ti).

    Ian.

  • wolverine96, June 11, 2014 4:00 PM
    Quote:

    Arion Bench 2.5.0 would be a better test, because it scales perfectly with
    multiple GPUs.

    Or the AE CUDA test my friend has created, but it's pretty intense, maybe
    takes too long with just one card (about 20 minutes with a single 780Ti).

    Ian.



    I agree. The BMW scene is not the best CUDA benchmark. I just didn't want them to mess it up if they decided to use it. I heard some people complaining about this benchmark, although I don't know if they were right or wrong.

    My Titan Black renders the BMW in just over 24 seconds! :D  (Not including the post-process compositing, which uses the CPU. Tile size was set to 512x512.)
    For comparison, an Intel Core 2 Duo @ 2.33 GHz took 16 minutes!

    Have you run a Titan Black on that AE CUDA test? If so, I am curious to see the results!
  • mapesdhs, June 11, 2014 5:53 PM

    wolverine96 writes:
    > I agree. The BMW scene is not the best CUDA benchmark. ...

I've tried it numerous times with various setups; it just seems to behave a
    bit weird, IMO.


    > My Titan Black renders the BMW in just over 24 seconds! :D  ...

    Main problem I find is I can't work out how to make it use all available GPUs.
    Is that possible? One of my 580s does it in about 43s, but my system has 4 of
    them, so it's a bit moot really. Mind you, I'm using an older version of
    Blender (2.61), stuck with it to ensure consistent CPU-based testing.

    And as you say, it also involves some CPU stuff (scene setup doesn't use the
    GPU).


    > Have you run a Titan Black on that AE CUDA test? If so, I am curious to see
    > the results!

    Alas no, atm I don't have access to anything newer than some top-end 3GB GTX
    580s (MSI LX, 832MHz); my system has 4 of them. Final version of the test file
    takes 14m 48s to render in AE using 16bpc and 8 samples (ie. just average
    quality), so on a Titan Black I'm guessing it would take maybe 25 mins? Hard
    to say. Would certainly be interesting to find out. Note the 'max' quality
    setting would be 32bpc and 10 samples (likewise, for the full animation, avg
    quality is 1080p @ 25Hz, max quality is 50Hz).

    I'll sort out the test readme, download archive, web page, etc., next week,
    but need to talk to C.A. first about some things. Anyway, here's the rendered
    image in original Targa format (just one frame, the default test frame 96, the
    last frame in the main animation sequence):

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.tga

    Here's the file converted to BMP and SGI IRIS/RGB:

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.bmp
    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.rgb

    and for those who don't mind losing a bit of quality, here's a 100% JPEG:

    http://www.sgidepot.co.uk/misc/cuda.101_Frame96.jpg


    The full 4 second animation takes hours to compute even at average quality and
    is thus intended more as a stress test for those interested in checking that
    their system can handle long renders or other GPU tasks without falling over
    (I've seen many people asking for a test like this on forums). I suspect at max
    quality the whole sequence would take about a week to crunch on my system. :D 
    Also interesting for exploring power consumption & energy cost issues for
    different GPU configs (load draw on my system during the render is around 920W).

    Ian.

  • wolverine96, June 11, 2014 8:23 PM
    Did you say you are having trouble getting multiple GPUs to work? I only use one GPU, but here's a very informative link. More specifically, see this section.

    Your system with 4 GTX 580s is much faster than mine! (Two GTX 580s are about as fast as one GTX Titan Black.) I guess the only time mine would be faster is if the scene used more than 3GB of RAM. I actually was planning on getting 2 GTX 580s, but then I discovered the Titan Black.

    Is that Cycles in those images you posted?

    By the way, Blender 2.71 is coming out very soon. In the past 10 versions, there have been some major performance gains for Cycles. I think it's like 30-50% faster in some cases.
  • mapesdhs, June 12, 2014 3:49 AM
    wolverine96 writes:
    > Did you say you are having trouble getting multiple GPU's to work? I only use one
    > GPU, but here's a very informative link. More specifically, see this section.

    Thanks!! My goof, looks like V2.61 doesn't have the Compute Panel. Will try
    the newer 2.70a in a moment... (downloading now)


    > Your system with 4 GTX 580's is much faster than mine! ...

    Yup, though I suspect your power bill is less likely to make your eyeballs explode. :D 


    > ... I guess the only time mine would be faster is if the scene used more than
    > 3GB of RAM. ...

    I had been hoping we'd see 6GB 780Tis, but seems like that's been killed off. Shame.


    > I actually was planning on getting 2 GTX 580's, but then I discovered the Titan Black.

    The real advantage of multiple 580s is just low upfront cost. Standard 580s are pretty
    cheap (I have four 1.5GB 797MHz models which cost about 400 UKP total), if one's ok
    with the VRAM limit. 3GB 580s cost a bit more, but not much more (I've bought/sold
    nine Palit 3GB 580s in the past year). The MSI LXs though can be a tad pricey, depends
    on luck really I guess. I got mine (five total) for good prices though overall, and they do
    oc like crazy (1GHz+ is possible).


    > Is that Cycles in those images you posted?

    No, it's the RayTrace3D renderer within After Effects.


    > By the way, Blender 2.71 is coming out very soon. In the past 10 versions,
    > there have been some major performance gains for Cycles. I think it's like 30-50%
    > faster in some cases.

    Good that they keep boosting it, but a nightmare for benchmarking consistency. :D 


    Ok, download done, quick test...

    Cycles does the BMW in 11.56s (blimey!), tile size 240x135. Just curious btw,
    you mentioned using 512x512 tile size, but surely it'd be optimal to use an even
    divisor of the image dimensions in both X and Y? What do you get if you try
    a tile size of 240x135?

    Ian.
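
A quick way to sanity-check that divisor point: tile sizes that divide the frame evenly in both axes split the image into whole tiles, with no slivers left at the edges. For a 1920x1080 frame, 240x135 is the exact 1/8 case:

    # List tile sizes that divide a 1920x1080 frame evenly in both axes,
    # so the image splits into whole tiles with nothing left over.
    def even_tiles(width, height, max_div=16):
        return [(width // n, height // n)
                for n in range(1, max_div + 1)
                if width % n == 0 and height % n == 0]

    print(even_tiles(1920, 1080))
    # [(1920, 1080), (960, 540), (640, 360), (480, 270), (384, 216),
    #  (320, 180), (240, 135), (192, 108), (160, 90), (128, 72)]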
