Gigabyte GeForce GTX Titan Black: Do-It-Yourself Cooler Swap
Tags:
- Graphics Cards
- Gigabyte
- Nvidia
FormatC
June 8, 2014 11:00:04 PM
Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with proprietary cooling. However, Gigabyte now offers a GHz Edition of the card that comes bundled with its WindForce solution, which you can install on the overclocked board yourself.
Gigabyte GeForce GTX Titan Black: Do-It-Yourself Cooler Swap : Read more
bloodgigas
June 9, 2014 1:47:51 AM
bloodgigas
June 9, 2014 2:25:56 AM
Quote:
bloodgigas said:
So we have to change the cooler by ourselves? Now that's weird.
If you bothered reading the first page, you'd know why.
"Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with proprietary cooling. However, Gigabyte now offers a GHz Edition of the card that comes bundled with its WindForce solution, which you can install on the overclocked board yourself."
This one, right? What's the difference between installing it yourself and Gigabyte installing it at the factory? Does it void the warranty?
Score
-7
FormatC
June 9, 2014 3:21:30 AM
chaosmassive
June 9, 2014 9:17:42 AM
rohitbaran
June 9, 2014 9:23:18 AM
Very nice, Gigabyte! I almost wish I had bought one. I have one of those "out of stock ASUS cards from Newegg". I am not disappointed, though. The card handles 84 degrees Celsius just fine!
Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you properly set the tile size!
Score
0
FormatC said:
It is one of Nvidia's funny rules. OK, for your better understanding:
Nvidia doesn’t allow its partners to sell the GeForce GTX Titan Black with factory-installed proprietary cooling.
Silly question probably, but why does Nvidia allow only EVGA to break this rule with the Hydro Copper Signature edition you mentioned? Is it just because it's a water-cooled model? Do you think Nvidia specially signs off on the design?
I'm genuinely curious.
Score
2
Gunbuster
June 9, 2014 11:49:53 AM
FormatC
June 9, 2014 12:12:15 PM
FormatC
June 9, 2014 9:38:18 PM
You are right; this is shown in the temperature and clock-speed graphs. EVERY reference card from Nvidia throttles under longer load; this is the drawback of combining the temperature target with a quieter cooler profile. Nothing new; it has been a "feature" since the GTX 780. You have to run the original fan at a fixed 65% rpm to keep the card out of thermal throttling, but that is really loud.
Score
2
SessouXFX
June 11, 2014 3:11:04 AM
mapesdhs
June 11, 2014 3:29:47 PM
wolverine96 writes:
> Igor Wallossek, I wonder if you could put up a graph for 3D rendering? If you use
> Blender's BMW scene by Mike Pan (a popular benchmark scene), make sure you
> properly set the tile size!
Arion Bench 2.5.0 would be a better test, because it scales perfectly with
multiple GPUs.
Or the AE CUDA test my friend has created, but it's pretty intense, maybe
takes too long with just one card (about 20 minutes with a single 780Ti).
Ian.
Score
1
mapesdhs said:
Arion Bench 2.5.0 would be a better test, because it scales perfectly with
multiple GPUs.
Or the AE CUDA test my friend has created, but it's pretty intense, maybe
takes too long with just one card (about 20 minutes with a single 780Ti).
Ian.
I agree. The BMW scene is not the best CUDA benchmark. I just didn't want them to mess it up if they decided to use it. I heard some people complaining about this benchmark, although I don't know if they were right or wrong.
My Titan Black renders the BMW in just over 24 seconds!
(Not including the post-process compositing, which uses the CPU. Tile size was set to 512x512.)
For comparison, an Intel Core 2 Duo @ 2.33 GHz took 16 minutes!
Have you run a Titan Black on that AE CUDA test? If so, I am curious to see the results!
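For scale, those two timings work out to roughly a 40x speedup. A quick back-of-the-envelope check, using the figures quoted above:

```python
# Timings quoted above for the BMW scene (Mike Pan's benchmark)
gpu_s = 24.0        # GTX Titan Black, 512x512 tiles, ~24 seconds
cpu_s = 16 * 60.0   # Intel Core 2 Duo @ 2.33 GHz, ~16 minutes

speedup = cpu_s / gpu_s
print(f"GPU is about {speedup:.0f}x faster")  # about 40x
```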
Score
1
mapesdhs
June 11, 2014 5:53:53 PM
wolverine96 writes:
> I agree. The BMW scene is not the best CUDA benchmark. ...
I've tried it numerous times with various setups, it just seems to behave a
bit weird IMO.
> My Titan Black renders the BMW in just over 24 seconds!
...Main problem I find is I can't work out how to make it use all available GPUs.
Is that possible? One of my 580s does it in about 43s, but my system has 4 of
them, so it's a bit moot really. Mind you, I'm using an older version of
Blender (2.61), stuck with it to ensure consistent CPU-based testing.
And as you say, it also involves some CPU stuff (scene setup doesn't use the
GPU).
> Have you run a Titan Black on that AE CUDA test? If so, I am curious to see
> the results!
Alas no, atm I don't have access to anything newer than some top-end 3GB GTX
580s (MSI LX, 832MHz); my system has 4 of them. Final version of the test file
takes 14m 48s to render in AE using 16bpc and 8 samples (ie. just average
quality), so on a Titan Black I'm guessing it would take maybe 25 mins? Hard
to say. Would certainly be interesting to find out. Note the 'max' quality
setting would be 32bpc and 10 samples (likewise, for the full animation, avg
quality is 1080p @ 25Hz, max quality is 50Hz).
I'll sort out the test readme, download archive, web page, etc., next week,
but need to talk to C.A. first about some things. Anyway, here's the rendered
image in original Targa format (just one frame, the default test frame 96, the
last frame in the main animation sequence):
http://www.sgidepot.co.uk/misc/cuda.101_Frame96.tga
Here's the file converted to BMP and SGI IRIS/RGB:
http://www.sgidepot.co.uk/misc/cuda.101_Frame96.bmp
http://www.sgidepot.co.uk/misc/cuda.101_Frame96.rgb
and for those who don't mind losing a bit of quality, here's a 100% JPEG:
http://www.sgidepot.co.uk/misc/cuda.101_Frame96.jpg
The full 4 second animation takes hours to compute even at average quality and
is thus intended more as a stress test for those interested in checking that
their system can handle long renders or other GPU tasks without falling over
(I've seen many people asking for a test like this on forums). I suspect at max
quality the whole sequence would take about a week to crunch on my system.
Also interesting for exploring power consumption & energy cost issues for
different GPU configs (load draw on my system during the render is around 920W).
Ian.
Score
1
Did you say you are having trouble getting multiple GPUs to work? I only use one GPU, but here's a very informative link. More specifically, see this section.
Your system with 4 GTX 580s is much faster than mine! (Two GTX 580s are about as fast as one GTX Titan Black.) I guess the only time mine would be faster is if the scene used more than 3GB of RAM. I was actually planning on getting 2 GTX 580s, but then I discovered the Titan Black.
Is that Cycles in those images you posted?
By the way, Blender 2.71 is coming out very soon. Over the past 10 versions, there have been some major performance gains for Cycles. I think it's like 30-50% faster in some cases.
Score
0
mapesdhs
June 12, 2014 3:49:31 AM
wolverine96 writes:
> Did you say you are having trouble getting multiple GPU's to work? I only use one
> GPU, but here's a very informative link. More specifically, see this section.
Thanks!! My goof, looks like V2.61 doesn't have the Compute Panel. Will try
the newer 2.70a in a moment... (downloading now)
> Your system with 4 GTX 580's is much faster than mine! ...
Yup, though I suspect your power bill is less likely to make your eyeballs explode.
> ... I guess the only time mine would be faster is if the scene used more than
> 3GB of RAM. ...
I had been hoping we'd see 6GB 780Tis, but seems like that's been killed off. Shame.
> I actually was planning on getting 2 GTX 580's, but then I discovered the Titan Black.
The real advantage of multiple 580s is just low upfront cost. Standard 580s are pretty
cheap (I have four 1.5GB 797MHz models which cost about 400 UKP total), if one's ok
with the VRAM limit. 3GB 580s cost a bit more, but not much more (I've bought/sold
nine Palit 3GB 580s in the past year). The MSI LXs though can be a tad pricey, depends
on luck really I guess. I got mine (five total) for good prices though overall, and they do
oc like crazy (1GHz+ is possible).
> Is that Cycles in those images you posted?
No, it's the RayTrace3D renderer within After Effects.
> By the way, Blender 2.71 is coming out very soon. In the past 10 versions,
> there have been some major performance gains for Cycles. I think it's like 30-50%
> faster in some cases.
Good that they keep boosting it, but a nightmare for benchmarking consistency.
Ok, download done, quick test...
Cycles does the BMW in 11.56s (blimey!), tile size 240x135. Just curious btw,
you mentioned using 512x512 tile size, but surely it'd be optimal to use an even
divisor of the image dimensions in both X and Y? What do you get if you try
a tile size of 240x135?
Ian.
Score
1
Thanks! I tried 240x135, but that took 31 seconds. I doubled it to 480x270, and it rendered in just under 25 seconds with compositing turned on. So it's about a second quicker (4%).
The reason I used 512x512 is because it fits nicely into the graphics card. Graphics cards handle images best at resolutions with dimensions that are powers of two (128x64, 1024x1024, 16x16, etc.).
If I had multiple GPUs, I would see a greater gain in performance by switching to 480x270 (with one GPU, I'm only rendering one tile at a time anyway). I learned this while rendering on my 8-core CPU. It is an FX-8350, and it renders the scene in 1 minute and 53 seconds, only 4.5 times slower than one Titan Black!
The GTX 580's I wanted were 3GB, I think. They were refurbished for $450 each. I got my brand-spanking-new Titan Black for $1000, so that was $100 well spent!
Do you have a 780 ti? I'm just wondering how it compares to the Titan Black. Or maybe I should just ask on BlenderArtists.org...
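The tile-size trade-off discussed above can be sketched by enumerating the tile sizes that divide the render exactly in both X and Y, so no splinter tiles are left at the edges. This is only an illustration; the 960x540 render size is an assumption here (inferred from 240x135 tiles dividing it 4x4), not something Blender itself reports:

```python
def even_tile_sizes(width, height, lo=64, hi=512):
    """Return (w, h) tile sizes that divide the render exactly,
    so no partial 'splinter' tiles remain at the right/bottom."""
    ws = [w for w in range(lo, hi + 1) if width % w == 0]
    hs = [h for h in range(lo, hi + 1) if height % h == 0]
    return [(w, h) for w in ws for h in hs]

tiles = even_tile_sizes(960, 540)
# Both tile sizes discussed above divide a 960x540 render exactly;
# 512x512 does not, leaving partial tiles along the edges.
print((240, 135) in tiles, (480, 270) in tiles, (512, 512) in tiles)
# True True False
```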
Score
0
mapesdhs
June 14, 2014 3:09:48 AM
wolverine96 writes:
> The reason I used 512x512 is because it fits nicely into the graphics card. Graphics cards handle
> images best at resolutions with dimensions that are powers of two (128x64, 1024x1024, 16x16, etc.)
Interesting, I found it was fastest when using a tile size that was an even divisor of the image size.
Otherwise it ends up having to render splinter pieces towards the end.
> The GTX 580's I wanted were 3GB, I think. They were refurbished for $450 each. ...
I bought about nine 3GB 580s in the last year, typically for around $220 each, mostly from eBay.
Sold four of them for AE machine builds.
> Do you have a 780 ti? ...
Not yet. Can't justify the cost atm.
> ... I'm just wondering how it compares to the Titan Black. ...
It'll be identical for anything where the RAM limit is not an issue or 64bit fp doesn't matter.
Ian.
Score
1
mapesdhs
June 18, 2014 8:40:12 AM
wolverine96, a small followup: I tried the BMW with a more unusual setup just for a laugh:
P55 config with an i7 875K at 3.2GHz default (mbd is an ASUS P7P55 WS Supercomputer,
16GB RAM at only 1333 CL9), using four EVGA 1.5GB 580s at 797MHz (somewhat slower
than my MSIs). It completed the BMW test in 12.51s.
Proves one does not need a
modern chipset or crazy CPU to have good CUDA performance, though of course in reality
the 16GB max RAM could be limiting for some tasks. Total cost of these four 580s was a
little over 400 UKP, all bought about a year ago.
Ian.
Score
0
mapesdhs said:
wolverine96, a small followup: I tried the BMW with a more unusual setup just for a laugh:
P55 config with an i7 875K at 3.2GHz default (mbd is an ASUS P7P55 WS Supercomputer,
16GB RAM at only 1333 CL9), using four EVGA 1.5GB 580s at 797MHz (somewhat slower
than my MSIs). It completed the BMW test in 12.51s.
Proves one does not need a modern chipset or crazy CPU to have good CUDA performance, though of course in reality
the 16GB max RAM could be limiting for some tasks. Total cost of these four 580s was a
little over 400 UKP, all bought about a year ago.
Ian.
Interesting.
I guess it's okay that I didn't go for an Intel CPU on my $3000 PC, LOL! And yes, I have heard that RAM can affect render times, even while rendering on the GPU.
Score
0
mapesdhs
June 18, 2014 1:20:40 PM
Certainly true that for a small pure GPU test, the CPU and RAM often don't matter. A more real-world
task can be very different though. That AE test I mentioned earlier takes 10% longer to compute on
my friend's 3930K/4.7 system if the RAM is reduced from 2133 to 1866 (his setup has a Quadro 4000
and three 3GB 580s, though only the 580s form the CUDA pool).
Ian.
PS. The P55 config above gives 4912 for Arionbench 2.5.0.
Score
0
nascarf1
June 25, 2014 12:52:19 AM
mapesdhs
June 25, 2014 4:00:05 PM
I read recently that the possibility of 6GB 780 Tis has been ditched, because NVIDIA was
afraid of hurting Titan Black/Z sales. Anyone know if this is definitely confirmed? I think
NVIDIA are completely wrong if so, but c'est la vie.
Ian.
PS. Hey wolverine96, I sold a couple of reference GTX 580 1.5GB cards to a friend this
week which he's put into a Q9550 system for use with Blender rendering. He knows
someone with an i7 system who told him their Blender scene took 17 seconds to render
in software on their i7; my friend's dual-580 did it in 1.5 seconds.
There was much boasting, I gather.
He also plans on using it in SLI for Elite Dangerous, so a win/win all round. 8)
PPS. I hereby declare that anyone who has the spare time to properly play Elite Dangerous
when it's fully released is officially annoying.
I remember talking to friends in the mid-80s about exactly what they've created; alas, these days I don't have the time for such in-depth
gaming. Rats... lottery winners suck. :}
Score
0
Rehuel Galzote
June 26, 2014 5:24:46 PM
mapesdhs
June 26, 2014 6:15:01 PM
For what task? It all depends on the nature of the task.
290X is pretty good for gaming
(I forget offhand, but I'm sure it's quicker than a Titan for some games), but that's just one
very narrow field in the world of computing as a whole. No GPU is the best for everything.
Even for gaming, some titles favour AMD, others NVIDIA, and then there's the way driver
changes can mess up what may have been better performance or stability in the past. Plus
there's the way some games scale better with CF instead of SLI, others vice versa, so one
AMD card might be faster than one NVIDIA card, but doubling them ends up with the SLI
pair being faster/smoother than the AMD pair (or vice versa). It's complicated, you have to
make each decision based on the task you're interested in, not hunt for a general average
answer, because the latter can be misleading and isn't that useful anyway. Also remember
the issue of absolute performance: imagine a situation where, for 4K gaming, a 290X gives
29fps vs. 24fps for a Titan Black; the AMD is quicker, but neither is remotely good enough
for acceptable play. Add a 2nd card? Sure, but that's when the CF/SLI scaling issue matters,
and how it varies depending on the game. This is a made-up example btw, could easily be
the other way round, etc.
It's enough to induce a headache. :}
Ian.
Score
0
BustedGPU
August 27, 2014 5:26:10 PM