
4x9800GX2: will it overheat?

March 20, 2008 5:17:15 PM

I work as a computer scientist at an academic image-processing department. In my research, I regularly use NVIDIA CUDA (http://www.nvidia.com/object/cuda_home.html) for high-performance computing on the graphics card (GPGPU). These are general computations that have nothing to do with rendering 3D scenes. For GPGPU, SLI is not required. Therefore, it seems possible to me to put four 9800GX2 cards on a single motherboard, for example the MSI K9A2 Platinum V2 (http://global.msi.com.tw/index.php?func=proddesc&prod_no=1395&maincat_no=1), which has four (physical) PCI-Express x16 slots with double spacing between them.

So far, I have tried to figure out whether there will be any obvious problems with driver support. Since SLI will not be used, it seems that it can be done (see this thread http://www.tomshardware.co.uk/forum/249450-15-four-9800gx2-cards-work and this one http://forums.nvidia.com/index.php?showtopic=62618 for more info).

Anyway, let's assume for now that it will work on the software side. So far, I am thinking of the following case and PSU:

case: Lian-Li Armorsuit PC-P80 (http://lian-li.com/v2/en/product/product06.php?pr_index=131&cl_index=1&sc_index=25&ss_index=61).
This is the only big tower case I could find that has more than 7 expansion slots. Using four 9800GX2 cards means that at least 8 slots are required. Any suggestions for alternatives?

PSU: Thermaltake Toughpower 1500W (http://www.thermaltake.com/product/Power/ToughPower/w0171/w0171.asp).
This is the only PSU I could find that has four 8-pin PCI-Express connectors and four 6-pin PCI-Express connectors, which should be enough to power the four cards.

I have no intention of overclocking (it seems like enough heat already) :) 
I am trying to get a feeling of whether or not it will be possible to cool this case sufficiently. Any thoughts on whether or not air cooling would be sufficient?


March 20, 2008 6:09:14 PM

Just a guess, but I'd think you're going to need water cooling for the thing not to melt.

You might start thinking about getting a dual-PSU case. I've heard there is a Cooler Master Stacker that has two PSU mounting points, and the Antec P190 comes with two power supplies.

Air-cooling-wise, you might be able to leave the case open and stick a full-size fan blowing across everything.
March 20, 2008 6:14:17 PM

Thanks! Unfortunately, none of the dual-PSU cases I've seen (including the Stacker 832 and the Antec P190) have the required 8 expansion slots...
March 20, 2008 6:16:01 PM

Hi, I was just wondering: you must already have some sort of computer that you use for day-to-day work? Have you looked at the Tesla products from NVIDIA?! They do what you want to do, and some products are self-contained in a separate case, eliminating the need for an uber motherboard and also for uber cooling.

See this: http://www.nvidia.com/object/tesla_computing_solutions.html

The Tesla S870 GPU Computing System comes with 4 GPUs and 6GB of RAM.

Just my 2 cents.
March 20, 2008 6:58:37 PM

Unfortunately, the Tesla S870 costs $14,000, and then you still have "only" 4 GPUs (instead of eight, with a <$5000 single PC). Part of the reason for doing it is also the "fun factor", actually :)  If you could cram four 9800GX2 cards into a PC, the computation speed for suitable tasks would be 3+ TFlops, easily more than a 512-node supercomputer...
March 20, 2008 7:05:21 PM

The Thermaltake Armor+ also has 10 expansion slots ( http://www.newegg.com/Product/Product.aspx?Item=N82E168... ). As far as PSUs go, I'm not sure another one is made with 8 PCIe connectors. I heard something about a 2000W PSU at CeBIT this year, but I don't remember any specifics.

If you scaled back to 3 9800GX2s you'd have a significantly lower cost, and less headache about actually powering and containing it.

I'd suggest filling your case with as many of these as will fit:

Scythe SY1225SL12SH 120mm "Slipstream" Case Fan 1900RPM
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
or
Scythe DFS123812-2000 120mm Case Fan
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
March 20, 2008 7:51:33 PM

Go for it, man. Four 9800GX2s would be pretty cool to see. If you build this, you gotta post pics so we can see, and then maybe a little explanation of what you do, so that the rest of us understand why SLI was not important for GPGPU work.

Have as many large fans as you can, so that you have plenty of airflow; the more the better, as heat will be a huge problem, and water cooling might be too complex for all 8 GPUs...
March 20, 2008 8:22:16 PM

KyleSTL said:
If you scaled back to 3 9800GX2s you'd have a significantly lower cost, and less headache about actually powering and containing it.


I think I will eventually settle for three, but maybe after trying four for a short while :)  We can use the cards in other PCs around the workplace too, so that should be no problem.

Thank you all for your comments so far!
March 20, 2008 9:06:00 PM

Screw the case! Just get some plywood and a drill :-D
March 20, 2008 9:52:38 PM

zyberwoof said:
Screw the case! Just get some plywood and a drill :-D

Plywood might be the duct tape of computer building...
March 26, 2008 5:55:52 PM

Whatever you do, post pictures when you have completed it with the four 9800GX2s!!!!
March 28, 2008 8:31:22 AM

I will certainly post pictures as soon as it's up and running, hopefully within a few weeks :-)
March 29, 2008 2:39:13 AM

I am concerned about whether you will be able to find a power supply to support four of those. I don't know if they require any less power when not in SLI mode, but you can go here and see what is actually certified for your PSU needs:

http://www.slizone.com/object/slizone_build_psu.html

These specs are for two of these cards, not four. You may have to get out the plywood and mount two of these units to get the job done (from a certified standpoint)!

Good luck!
March 29, 2008 3:17:25 AM

You will need two decent power supplies in a case that supports that, from what I see at least.
March 29, 2008 4:19:58 AM

More importantly...
Will it blend?

If you plan on putting these cards in other PCs after a while, it's kind of a waste having a mobo, PSU, case etc. that won't be needed.

Cooling shouldn't be a big issue, TBH. Considering you have four dual-GPU cards, which are two cards in one, yes? You then have 8 GPUs to cool. Doing this with water would be stupid, really:
8 GPU blocks, a mile of hose, 2 pumps, 1-2 360mm rads... you're looking at ~$1,000 to do it properly.

Look at the design of the cards carefully, along with your case. You will need to make four separate ducts, each with its own fresh-air intake. These can be made out of cardboard etc. Add a side fan plus one on top and bottom just to get the still air moving around a bit, and you should be able to run the cards 24/7.
March 29, 2008 5:00:00 AM

OK, several questions about this thread need answering, so I will start by asking the questions and making a couple of suggestions as well.

#1: What software on the market (or not) will actually utilize four graphics cards designed to be linked via SLI (but which won't be), and what will these graphics cards (which, by the way, are not designed for the task you are assigning them) be used for, if not 3D acceleration or rendering?

#2: Being a computer scientist, I'm sure you have done your research and realize by now that four 9800GX2 graphics cards (giving you, in your situation, a proposed 8 GPUs with which to do computations) is a complete and utter waste of time, because the software that runs these cards is not designed to have all four cards working as a team. Nor would four cards in the four PCI-Express slots you mention give you balanced throughput or data-processing capability, as the bandwidth provided by each slot will differ: two of those slots will be converted to x8 bandwidth instead of the x16 of the other two. Also, the software you are using must be capable of using each card independently of the others, as the four PCI-Express slots will not present themselves to any software that I know of as a team or grouping of 8 GPUs, at least not on a standard motherboard like the one you are talking about, without being linked in some fashion; and you have already stated that they will not be linked.

#3: For you to run four of these in tandem without failure for any length of time over, say, an hour or two would require a PSU far above the normal wattage available from your average reseller. You would need to Frankenstein this build and put all non-priority devices (e.g. CD drives and fans) on a separate PSU, with the primary system components (the mobo and video cards) on another. Hooking all four up to a single PSU would require a whole lot of PCI-Express plugs and adapters: at least 8 PCI-Express connectors by the time it was done. However, with two PSUs and the correct timing when you flip the switches, this wouldn't be a problem. If you don't know how to do this, research how to put two PSUs in one computer; it is simply a matter of shorting certain pins on the motherboard connector cable and flipping the switch.

#4: Also, for the sake of mentioning it, you are forgetting that the amount of RAM the motherboard you are using is capable of supporting is much lower than, say, your "supercomputer" or the Tesla products mentioned earlier. This means that the actual data throughput and processing power of the 8 GPUs would be severely hindered as a result.

#5: As far as cooling goes, this will not pose a serious threat to the operation of such a computer. The easiest and most cost-effective solution would be to leave the case open on both sides (kind of like a stripped-down server rack) and have a standard box or floor fan positioned blowing fresh air directly across the cards at all times. This allows fresh air to shoot in one side of the case and forces hot air out the other side. Trust me, this works amazingly well, as I have tested it several times during 12-16 hour gaming sessions with SLI'd graphics cards in the past.

#6: There are cases which will support your slot requirements. I would suggest looking at server-class cases for this build, which are typically designed for high cooling capability and expandability. If that fails, you could always do a hack job on the back of a case using a Dremel tool.

Well, that's my 2 cents. I do not think this will work very well in execution, as the actual functionality of four of these together has not been tested, if at all, and they were not designed for these sorts of operations. Also, the power issue will be an "ISSUE". The cooling can be solved with some effort on your part, if you can withstand the noise levels, but all in all there's a reason why the Tesla system costs $14,000. I think you will find out why.
March 29, 2008 5:19:19 AM

@ thread title = YES
March 29, 2008 10:12:03 AM

Is there a proper NVIDIA driver to support 8 GPUs?!
March 29, 2008 10:21:03 AM

What exactly are you planning to do with 8 GPUs if you aren't doing 3D acceleration or rendering? I don't work for NASA, but I seriously doubt that any application in the world would require such a high-end GPU setup...
March 29, 2008 11:11:28 AM

Why not set up a small cluster?

@the people asking why he wants them: DID YOU NOT READ THE ORIGINAL POST? HE'S WORKING WITH CUDA.
March 29, 2008 11:30:43 AM

I guess this won't be coolable in a case. IMHO you should go for an open rig with external fans (the regular ones that are used to cool rooms) and water cooling on everything.
Or you could dip the whole thing into a pool of liquid nitrogen, but... yeah... ya know...
March 29, 2008 12:13:17 PM

Get any case which has the required 8 slots on the backside for the graphics cards...

Buy 4 of the 5.25" thermaltake GPU power supplies (250w each, one for each card) and 4 6pin to 8pin PCI-E adaptors...

Buy a dremel and 2-4 120mm fans...

Cut mounting spaces in the side panel for them over the graphics cards...

Mount the fans and connect them to the regular power supply which is fuelling motherboard etc...

Voila ... you have 8 GPUs and cooling and power sufficient to keep them cool and working.
March 29, 2008 1:24:10 PM

skittle said:
Why not set up a small cluster?

@the people asking why he wants them: DID YOU NOT READ THE ORIGINAL POST? HE'S WORKING WITH CUDA.

That might not be a bad idea. But, it depends on how much power the OP needs. Most of the time GPUs can do floating point calculations faster than a CPU.
March 29, 2008 2:02:30 PM

I wouldn't get that card. You said you don't need SLI, but the 9800GX2 is made up of two cards running in SLI together. The 9800GTX comes out at the end of this month and should be available soon after. I think it would better suit your needs, since it is one more powerful card instead of two less powerful ones working in SLI.

Tom's Hardware just built a high-end system, and it got too hot using three 8800 Ultra cards; they suggested water cooling for anyone doing this. You would probably need to water-cool your cards as well to keep the system cool enough. It's hard to get enough airflow to the graphics cards, so air might not be very good.
March 29, 2008 2:16:23 PM

coret said:
Get any case which has the required 8 slots on the backside for the graphics cards...

Buy 4 of the 5.25" thermaltake GPU power supplies (250w each, one for each card) and 4 6pin to 8pin PCI-E adaptors...

Buy a dremel and 2-4 120mm fans...

Cut mounting spaces in the side panel for them over the graphics cards...

Mount the fans and connect them to the regular power supply which is fuelling motherboard etc...

Voila ... you have 8 GPUs and cooling and power sufficient to keep them cool and working.


These were my exact thoughts. The case has three 120mm fans on the front and one at the back, I believe; just equip the side window/side panel with 3-4 120mm fans and you should be fine, but remember to use high-CFM fans to move as much air as possible.

Also my second thought was the mineral oil tank case mod here:

http://www.leetupload.com/tutorials/1337_fleet/

Not telling you this is what you should do, just giving you ideas to cool down the PC (You would have to nix the Lian-Li case though and go for some kind of tank)
March 29, 2008 7:25:01 PM

Dude, he doesn't need water cooling, for the last freaking time... if you know how to properly use good fans with a boatload of CFM, it will work fine...

And to kjoost: can you explain to these newbies what the CUDA program is, and why it is that you do not need SLI...
March 29, 2008 8:17:01 PM

coret said:
Get any case which has the required 8 slots on the backside for the graphics cards...

Buy 4 of the 5.25" thermaltake GPU power supplies (250w each, one for each card) and 4 6pin to 8pin PCI-E adaptors...

Buy a dremel and 2-4 120mm fans...

Cut mounting spaces in the side panel for them over the graphics cards...

Mount the fans and connect them to the regular power supply which is fuelling motherboard etc...

Voila ... you have 8 GPUs and cooling and power sufficient to keep them cool and working.


Perfect... just cut some vents in the case, since it will be positive pressure. Look at the Tagan case; it does that too, since there is so much intake on the side...
March 30, 2008 4:14:42 PM

^ Or build a custom 4' by 4' case and fill it with some Fluorinert (if you can buy it, that is; since you are a computer scientist, you should be able to get hold of some). Just a thought on extreme cooling ;) .

PS: They apparently used Fluorinert to cool the Cray-2 supercomputer.
http://en.wikipedia.org/wiki/Fluorinert
March 31, 2008 9:37:03 PM

Hi guys, sorry I have been away for a while and thanks for all your comments.

As I stated (underlined) in my original post: SLI is not required for CUDA. This is the only reason why it makes sense at all to use more than four GPUs on a single motherboard. So far, the only obstacles seem to be possible BIOS issues and possible NVIDIA (non-SLI) driver issues, but neither MSI nor NVIDIA sees a direct reason why it would not work.
BTW: CUDA does not even SUPPORT SLI, and still NVIDIA seems to think it makes sense, right? :) 

So, what is this "CUDA" thing about? It is about doing general computations (sometimes not even related to graphics) on your graphics card. In my case, I write programs for medical CT scanners that can quickly reconstruct a three-dimensional volume from X-ray data. There will be no monitor attached to the graphics cards. The data is uploaded from general RAM to the card memory, computations are performed, and then the data is written back to general RAM. My computations can easily be split into parts that do not require communication between them, so they can be executed on different GPUs (as many as you have available). I work on "advanced" iterative reconstruction algorithms that easily take weeks to run on a quad-core PC for a complete 3D volume. Throughput of the PCI-Express bus is not the bottleneck for me, as that bus is only used to upload the initial dataset of the computation and read out the results at the end.

To summarize:
- I am CERTAIN that this thing is useful. We are already using it to speed up our computations by a factor of 50 on a single 8800GTX card.
- It scales well for my application. More GPUs directly means more speed, almost linearly, as PCI-Express bandwidth is NOT the bottleneck.
- For MY application (and many others, take a look at gpgpu.org), a single system has the potential to replace a supercomputer consisting of 256+ nodes.
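The splitting described above can be sketched in plain Python (a hypothetical illustration, not my actual code): each GPU gets a contiguous, non-overlapping range of the volume's slices, so no inter-GPU communication is needed, which is exactly why SLI is irrelevant here.

```python
# Hypothetical sketch: splitting a CT volume's z-slices evenly across N GPUs.
# Each range is independent, so each GPU can reconstruct its slab alone.

def partition_slices(num_slices, num_gpus):
    """Return (start, stop) ranges, one per GPU, covering all slices."""
    base, extra = divmod(num_slices, num_gpus)
    ranges, start = [], 0
    for gpu in range(num_gpus):
        stop = start + base + (1 if gpu < extra else 0)
        ranges.append((start, stop))
        start = stop
    return ranges

# e.g. a 512-slice volume over the 8 GPUs of four 9800GX2 cards:
ranges = partition_slices(512, 8)
# ranges[0] == (0, 64), ranges[-1] == (448, 512)
```

Each worker would then upload its slab, run the iterative reconstruction kernel, and copy the result back, with no cross-talk between GPUs.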

About the power supply: can anyone give me a reason why the Thermaltake Toughpower 1500W will not do the job? It has all the required connectors. Won't 1500W be enough?
March 31, 2008 9:46:34 PM

I'm sure a 1500W power supply would be adequate... but personally, for the situation you're looking at, I'd be more inclined to use multiple 5.25" bay power supplies designed to power just graphics cards.

That way, if one PSU dies, you could switch the card it powers over to the regular internal PSU until a replacement arrives... basically less potential downtime.
April 1, 2008 3:09:00 AM

Nah, I think he'll be good with a 1500W unit. What's the possible downtime if it's REALLY necessary to get back in action? One day, if you use overnight shipping; truly not an issue in my eyes. I believe there are a couple of possible cooling alternatives here; what you must do now is find out which is more viable for you and do it. And for the love of all that is sacred, PLEASE POST a DETAILED review and PICS when you're done.
April 1, 2008 6:11:07 AM

Look, simply put: the 1500W Thermaltake will be fine. It's a very good quality PSU and was built for purposes like this. To keep it cool with the Armorsuit, try cutting holes in the side panel and installing fans; remember, at least Scythe high-range or Delta. If you can't do that yourself, just pick up the directory (we have the Yellow Pages in AUS; what does the USA have?), find a PC mod shop, and get them to do it. With your budget it couldn't make much of a difference.
April 1, 2008 1:36:22 PM

Hi, just to comment on your PSU needs: I don't think 1500 watts will give you enough punch. I would suggest shelling out another 50-60 bucks and getting a secondary PSU that's good enough to power one of your four beasts. http://www.tigerdirect.com/applications/SearchTools/ite...

I just picked one up and it's amazing. My current setup below actually draws 400-500 watts at full power during gaming, which caused my 550VX to spool up and get quite noisy. The reason I bought this secondary PSU was to keep the noise down, and it works like a charm.

I'm sure you could even run two of those 325W beasts.
April 1, 2008 2:03:05 PM

quanger said:
hi. just to comment on your psu needs. i dont think 1500watts will give you enough punch.

WHAT?!?!? :kaola: 
1500W will be more than enough! A GX2 is rated at, what, pretty much the same as an 8800 Ultra/GTX? NVIDIA recommended a minimum of 1000W (which is overkill, remember!) for tri-SLI Ultras, so 1500W is going to be easily more than enough! Much more than 1500W and you'd need the computer on its own circuit, otherwise it'd be tripping fuses all the time. Some people really do go overboard on recommending PSUs...
I reckon that for a quad-core, 2 DVD drives, 4 HDs, 4 x 2GB sticks of RAM and 4 GX2s, 1.2kW would be ample, but a bit of headroom with such a system is never going to be a bad thing!
I would imagine my PSU (6x 12V rails and 850W continuous load with 1kW peak) would be more than adequate for tri-SLI, but NVIDIA has to cover their asses the moment they certify a product for use!
Anyhow, it'll be interesting to see the beast in action :) 
April 1, 2008 2:08:35 PM

CPU - 125w
4 x GPU - 600w (GX2 is a 150w card)
mobo, hdd's etc - <100w

825w total.

Max efficiency is at about half load, so 825w x 2 = 1650w ... 1500watts is close enough
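The back-of-envelope budget above can be checked with quick arithmetic (the per-component wattages are this poster's rough estimates, not measured values):

```python
# Rough power-budget check using the figures quoted above
# (125 W CPU and 150 W per GX2 are the poster's estimates).

cpu_w      = 125
gpu_w      = 150 * 4      # four 9800GX2 cards
mobo_etc_w = 100          # motherboard, drives, fans (upper bound)

total_w = cpu_w + gpu_w + mobo_etc_w
print(total_w)            # 825

# PSUs are typically most efficient near ~50% load, so a PSU rated at
# roughly twice the expected draw stays in its sweet spot:
ideal_psu_w = total_w * 2
print(ideal_psu_w)        # 1650 -- a 1500 W unit is close enough
```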
April 1, 2008 2:28:14 PM

I have the Lian Li P80. It's a great case with excellent airflow. I used the windowed side panel from a Lian Li G70 and also the video-cooling fan from the G70. The four 140mm fans work great, and the fan controller is a plus. It has the 10 expansion slots you need, so you're all set. If I were you, I would also buy the new Freezone Elite to cool your processor.
April 1, 2008 2:32:26 PM

One more thing: you're going to need, at the very least, a 1500W power supply. They suggest at least 1200W for two GX2s, which is what I have installed in my P80, along with a Silverstone 1200W. My second rig is the photo under stan116; that's a Lian Li G70. If you would like to see photos of my P80 setup, let me know.
April 1, 2008 2:44:35 PM

I honestly don't think a 1500W PSU will give you much headroom. You have to consider the fact that you'll be running a crapload of 120mm fans to cool the beasts down.

The efficiency rating is not 85% throughout the power band, so expect it to drop to 80-ish%. The other thing is that PSU life deteriorates over time, so having the extra power prevents anything from popping or catching fire later on. Besides, I can't afford to have four 9800GX2s go up in smoke. A 1500W PSU costs a lot less than a 9800GX2, so why skimp out?
April 1, 2008 3:24:00 PM

LukeBird said:
WHAT?!?!? :kaola: 
1500W will be more than enough! A GX2 is rated at, what, pretty much the same as an 8800 Ultra/GTX? NVIDIA recommended a minimum of 1000W (which is overkill, remember!) for tri-SLI Ultras, so 1500W is going to be easily more than enough! Much more than 1500W and you'd need the computer on its own circuit, otherwise it'd be tripping fuses all the time. Some people really do go overboard on recommending PSUs...
I reckon that for a quad-core, 2 DVD drives, 4 HDs, 4 x 2GB sticks of RAM and 4 GX2s, 1.2kW would be ample, but a bit of headroom with such a system is never going to be a bad thing!
I would imagine my PSU (6x 12V rails and 850W continuous load with 1kW peak) would be more than adequate for tri-SLI, but NVIDIA has to cover their asses the moment they certify a product for use!
Anyhow, it'll be interesting to see the beast in action :) 


AGREE!!! AGREE!!! AGREE!!!

1500w is EASILY ENOUGH Power
April 1, 2008 8:48:39 PM

I work in Belgium, where direct availability of components is sometimes a bit of a problem...

Anyway, the case (PC-P80) and PSU (Thermaltake 1500W) should arrive by the end of the week. I will be travelling a lot within the next two weeks, but I expect the system to be up and running (if all goes as planned) in about three weeks from now. I will certainly post benchmarks and pictures when everything is finished :) 
April 1, 2008 9:20:01 PM

For argument's sake the Thermaltake is not 80Plus, but something like the PCP&C TurboCool 1200W is (as well as being a 'Tier 1' not 'Tier 2' - like the TT, and manufactured by Seasonic rather than Channel Well). With the TC 1200's 3 x 6-pin and 3 x 8-pin connectors you could power three cards and add on an FSP BoosterX to power the fourth card. You may be able to knock out a few watts by going that route with a more efficient PSU. Remember 2% @ 1500W = 30W of heat, and I think you need all the efficiency you can get, along with the high-cfm Delta fan that's included with all the Turbo-Cool PSUs.
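The efficiency point can be made concrete (the 1200 W load and the efficiency percentages below are illustrative assumptions, not specs of either PSU): since efficiency is DC output over AC input, the waste heat at a given load is load/efficiency minus load, so a couple of efficiency points save tens of watts at this scale.

```python
# Waste heat a PSU dumps into the room/case at a given DC-side load.
# 1200 W and the two efficiency figures are illustrative assumptions.

dc_load_w = 1200

def wall_draw(load_w, efficiency):
    """AC power pulled from the wall for a given DC load."""
    return load_w / efficiency

heat_83 = wall_draw(dc_load_w, 0.83) - dc_load_w   # waste heat at 83% efficiency
heat_85 = wall_draw(dc_load_w, 0.85) - dc_load_w   # waste heat at 85% efficiency
print(round(heat_83 - heat_85, 1))                 # 34.0 -- watts saved by 2 points
```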

Downside: the TT 1500W gives 120A on 12V vs. the PCP&C 1200W's 90A.
I'm also not sure how they compare cost-wise, since I can find the TT on the popular sites, and the PCP&C is a lofty $550 @ Newegg. It may be a moot point, since it seems like budget is not a big factor.

Edit: Nevermind, looks like parts are on their way and we should see some awesome pictures in a week or so.
April 2, 2008 7:49:27 PM

Mmmmm...

Lead-laden fries, yummy!!!

:D 
April 2, 2008 8:10:34 PM

At least with RoHS there is "less" to worry about :p 

Not that even that system would make enough heat to cook fries (since I doubt it would release enough heat to boil the oil)...
April 2, 2008 11:05:58 PM

I think it was an April Fools' joke.
April 3, 2008 12:25:30 PM

TiSP FTW :bounce: 
April 8, 2008 3:23:33 AM

Why so many pranks by Google?!