
Nvidia's Fermi Cards Said to Run Very Hot?

Last response: in Graphics & Displays
January 7, 2010 11:54:00 PM

http://www.tomshardware.com/news/nvidia-fermi-gpu-graph...

Quote:
Graphics enthusiasts eagerly await the arrival of Nvidia's Fermi GPU-based cards. At this point, Nvidia is trailing behind ATI and its 5000-series cards, but expectations are high for Fermi.

Expected to turn things around for Nvidia in a big way, Fermi is supposed to be vastly superior to the company's current line of 200-series cards.

Speaking to several case vendors at CES, we were told that while running one Fermi card alone or two single-GPU cards is fine, going any higher may introduce thermal issues. Though no firm temperatures were revealed, manufacturers said that users need to be extra careful about how they set up the innards of their gaming chassis.

A rep from one manufacturer said that Fermi-based cards will run hotter than the hottest ATI Radeon HD 5000 series.


Do you guys agree we should hear names from Huan on this one? A vendor is not a manufacturer, so I'm not sure which he means, since he uses both as if they were the same. Also, there is no reason either would have any sort of inside information on an Nvidia project. A video card just goes into a PCIe slot, so there's no reason anyone from Thermaltake to Lian Li would know anything more than what has been officially posted on the web.

Do you guys believe this story has any truth to it? No, I am not concerned with your personal opinions about Nvidia, just with quality reporting, because to me this article appears to be nothing but troll juice to satisfy the immaturity that plagues the comment sections under front-page articles these days.

Personally, I remember a Tom's that didn't spread FUD, but I think those days are over. After reading such things, I begin to question the accuracy and honesty behind other articles, such as hardware reviews.
January 8, 2010 12:34:19 AM

We can only wait and see.
January 8, 2010 12:42:19 AM

37% faster could mean any number of things from performance figures to clock speeds (quite likely given the heat issues).
January 8, 2010 12:43:38 AM

Everything ever written about Fermi would suggest it will run hot. That's no surprise. The 37% faster might not even refer to FPS, but to processing speed for their GPGPU stuff. Who knows, we just have to wait.
January 8, 2010 1:24:51 AM

Sweet, I needed a space heater. Heat isn't something I care about; I'm not going to try to further OC the card. It's probably a turn-off for the more hardcore air-cooled OC crowd, but I don't see it as too much of an issue.
January 8, 2010 1:26:12 AM

In March, hopefully, the battle will be official. I can't wait for the price drops due to competition.
January 8, 2010 1:41:28 AM

It can't be hotter than my old 8800 GT with the reference cooler.
January 8, 2010 1:57:24 AM

spoofedpacket said:
No, I am not concerned with your personal opinions about Nvidia...


I'm not concerned with your personal opinion about the article either.

If you have something more than your personal opinion to refute it with, I'd be happy to see it; otherwise it sounds like you're simply pissed that it wasn't positive rumours he overheard and wrote about.

I'm not surprised it's hot. The GTX 280 generated more heat than the HD 4870, which was 'hotter' only because its cooling was less efficient. However, there's only so much you can do to improve the HSF assembly, and despite the earlier rumours to the contrary, if this is the final model of the HSF it doesn't look too different from the previous one, and might not be enough to make a 'cool' card.
However, people won't care if it performs well enough for the price. The only way it will be an issue is if the performance isn't enough for people to forgive things like power and temperature.

Kinda reminds me of the FX5800 again: talk of heat before it delivered a space heater too.
January 8, 2010 5:56:33 AM

Just like when the 4870X2 came out... it ran as hot as a brick pizza oven, but it still crushed every game out there, which in turn made me care little about the heat it poured into my case.
January 8, 2010 12:05:44 PM

In theory, the GF100 can't run much hotter than a 5970, even though the 5970 supposedly has very good heat dissipation. They both use close to 300 watts, and the die size of the GF100 is much larger, so the GF100 should have a lower heat flux.
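That heat-flux argument is easy to put numbers on. A minimal back-of-the-envelope sketch; the die areas and board powers below are assumed round figures for illustration, not official specs:

```python
# Power density (heat flux) = board power / total die area.
# All figures here are rough assumptions, not confirmed specs.

def heat_flux(power_w, die_area_mm2):
    """Watts dissipated per square millimetre of silicon."""
    return power_w / die_area_mm2

# GF100: one large die. Assume ~300 W over ~530 mm^2.
gf100 = heat_flux(300, 530)

# HD 5970: two smaller dies. Assume ~294 W over 2 x ~334 mm^2.
hd5970 = heat_flux(294, 2 * 334)

print(f"GF100  : {gf100:.2f} W/mm^2")   # ~0.57
print(f"HD 5970: {hd5970:.2f} W/mm^2")  # ~0.44
```

With these particular guesses the single GF100 die actually comes out denser than the 5970's two dies combined; the conclusion flips or holds depending entirely on which area and power figures you assume, which is rather the point of the debate.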
January 8, 2010 12:13:55 PM

KidHorn said:
In theory, the GF100 can't run much hotter than a 5970, even though the 5970 supposedly has very good heat dissipation. They both use close to 300 watts, and the die size of the GF100 is much larger, so the GF100 should have a lower heat flux.


This.

Although a dual-GPU version of the GF100, if they even make one, would be a monster in heat production lol. I could probably cook breakfast on it and keep mortar warm for bricklaying. Although if one GF100 is almost as powerful as a 5970, you don't really need a dual-GPU solution, I guess.

That being said, if you can fit two 5970's in a case without heat being an issue, three GF100's doesn't seem like it would be that much more of a problem.

If a GF100 produces, let's say, even 80% of the heat a 5970 does, that would mean three GF100s in a case would be equivalent to 2.4 5970s in terms of heat, which really isn't that much more, and 80% of a 5970 in power dissipation seems unrealistic. I'm guessing 70% or less, which would be roughly equivalent to two 5970s in a case.

If Fermi even marginally delivers what they promise, the heat issue isn't an issue :p
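The scaling arithmetic above checks out; a quick sketch, where the 70–80% heat ratios are the poster's guesses rather than measurements:

```python
# Express the heat of N GF100 cards as an equivalent number of HD 5970s,
# given a guessed per-card heat ratio (GF100 heat / 5970 heat).

def equivalent_5970s(num_cards, heat_ratio):
    return num_cards * heat_ratio

print(f"{equivalent_5970s(3, 0.8):.1f}")  # 3 cards at 80% -> 2.4
print(f"{equivalent_5970s(3, 0.7):.1f}")  # 3 cards at 70% -> 2.1
```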
January 8, 2010 1:34:31 PM

TheGreatGrapeApe said:
I'm not concerned with your personal opinion about the article either.

If you have something more to refute it than your personal opinion I'd be happy to see it, otherwise it sound like you're simply pissed that it's not positive rumours he overheard and wrote about.


No, I couldn't care less if real news is about whether or not they run hot. I'm far from what you would call a fanboy; I always just buy whatever suits my needs, which is determined 100% by which chip maker has the better product when I actually need to upgrade. I often try to hold off until one of them releases some blockbuster advance, in hopes of prices evening out, and go with the 2nd or 3rd best performer for the price point.

What I do care about is source-less derogatory front-page articles. Scale it down a bit: say you own a small storage company with a new product rollout coming up in a few months. Then you hop onto Tom's Hardware and see a bunch of fabricated, source-less stories about your product, with not predictions but claims from unnamed case vendors that your hardware performs badly. Wouldn't you be upset when you look down the list of comments and see half of them taking this as hard news?

Sure, it is fine to guess it'll run hot; it is fairly likely if history holds true. But to report it as fact with such nonsense sourcing, then yeah, there's a problem.


January 8, 2010 1:48:27 PM

Why wouldn't case manufacturers know more than what's posted on the web? It seems very likely that those that want to put the SLI sticker on their cases would be informed well in advance of what kind of thermal loads they will have to deal with. There are lots of cases that cost a fortune because they are painted green (or red) with brand stickers all over them; these companies are probably just as important to Nvidia as any other part manufacturer, and probably get the same NDA insider info.

like this:
http://www.tomshardware.com/news/thermaltake-nvidia-cha...

As for not having specific names... well, that is not uncommon. You are not required to like every article a site writes.
January 8, 2010 5:04:26 PM

spoofedpacket said:
No, I couldn't care less if real news is about whether or not they run hot. I'm far from what you would call a fanboy; I always just buy whatever suits my needs, which is determined 100% by which chip maker has the better product when I actually need to upgrade. I often try to hold off until one of them releases some blockbuster advance, in hopes of prices evening out, and go with the 2nd or 3rd best performer for the price point.

What I do care about is source-less derogatory front-page articles. Scale it down a bit: say you own a small storage company with a new product rollout coming up in a few months. Then you hop onto Tom's Hardware and see a bunch of fabricated, source-less stories about your product, with not predictions but claims from unnamed case vendors that your hardware performs badly. Wouldn't you be upset when you look down the list of comments and see half of them taking this as hard news?

Sure, it is fine to guess it'll run hot; it is fairly likely if history holds true. But to report it as fact with such nonsense sourcing, then yeah, there's a problem.


You claim not to be a fanboy, but damn did you jump on that article super fast. I also like how you rant about 'sourceless bashing' when the guy above me, with probably less than 5 minutes of research, propped up some evidence from the THG team with a vendor name to boot. Do your homework before you make a rant thread.
January 8, 2010 10:36:34 PM

azgard said:
You claim not to be a fanboy, but damn did you jump on that article super fast. I also like how you rant about 'sourceless bashing' when the guy above me, with probably less than 5 minutes of research, propped up some evidence from the THG team with a vendor name to boot. Do your homework before you make a rant thread.


Amazing how everyone has to have a turn at shitting on this thread.

Anyway, there is no source for the person making statements about the thermal characteristics of the cards; there is just a Thermaltake case with an Nvidia logo. Great, this is the stuff I was looking for, and I appreciate his link. But evidence is one thing; clear and concise reporting is another. If it's not FUD, people shouldn't be making statements to the "press" if they are under an NDA, and if they're not, they shouldn't be anonymous. It's as simple as that, and it goes back to my "put yourself in the shoes of a hardware designer" remarks, where the media finds it popular to pick at your product without knowing any details.

As for being a fanboy, I have more ATI cards and AMD processors installed in boxes in my home than Intel and Nvidia, dating back to my first AMD, a 486 DX40, which was a lovely system for its time. I guess it doesn't matter. Front page or the forums, there seem to be people who have nothing better to do than troll posts and shit on threads. Congrats, you have won the Internets?!
January 8, 2010 10:38:46 PM

Also, if Fuad @ Fudzilla is hearing/saying it's gonna be hot in his article, with his pro-Fermi style so far, then it's far from sensational for THG to report what they seem likely to have overheard at CES.

What happens in Vegas doesn't always stay in Vegas. ;) 
January 8, 2010 11:50:17 PM

http://translate.google.be/translate?hl=fr&sl=fr&tl=en&...
So here's another; next someone will say it's an entire conspiracy? On the entire internet? There are reasons why we don't have Fermi sitting on shelves right now; this could very well be the biggest, or part of it.
To accuse Tom's, or Fuad, or PCWorld, or whoever's next, is reaching at this point.
January 8, 2010 11:52:07 PM

Considering that when one news outlet on the internet links something, everyone else just copies them, it could be false info, yes.

Then again, it could be true; no one knows yet :p
January 9, 2010 12:02:44 AM

So when are these cards due out? March? Anyone guessing what the prices might be? With ATI's cards having gone up in price over the last few months, I'd expect the competitive situation to reverse that. It should be interesting to see how the battle plays out.
January 9, 2010 12:11:25 AM

Since the partners have upcoming products to support this, and word was all over CES about it from those same partners, I'd guess you'd have to include them in this conspiracy too. At least they have motive.
January 9, 2010 12:16:11 AM

I don't see it as ATI's responsibility to lower prices. I see it as Nvidia needing to come in with a price-competitive part, and then we'll see if ATI wants to lower prices, unless Nvidia comes in low on price/perf, which would be different.
January 9, 2010 12:34:02 AM

ATI doesn't have to lower prices; they want to make money too lol. Is AMD still in debt? Eh, it's been so long.

It's Nvidia's job to compete with ATI (because they released later), which in turn gives us consumers better prices for performance, because they are competing for us to buy their products.
January 9, 2010 12:57:15 AM

If the price is really right, I just might have the money for one, but I'll wait for some reviews first; plus I want to see the GPU side of the PCB and the cooler, to guess at the build quality.
January 10, 2010 12:17:29 AM

IzzyCraft said:
ATI doesn't have to lower prices; they want to make money too lol. Is AMD still in debt? Eh, it's been so long.

It's Nvidia's job to compete with ATI (because they released later), which in turn gives us consumers better prices for performance, because they are competing for us to buy their products.


Yeah, I agree. It just seems to be the natural flow of things. The channel distributors were actually pricing the ATI cards below retail, but after companies like Newegg jacked them up, the channel prices quickly rose to grab a bit more of the margin (they pay attention to the end sales price and will often adjust if you are making mad margin). Sadly, I don't think the inflated prices get more money back to ATI, but at least the cards fly off the shelves like hotcakes.

The same happened with Intel's Gen2 SSDs when they first hit the channels. The channel price for QTY 1 was right at $450, likely about $10 more than what Newegg was able to source them for, but that didn't stop them from marking it up $200. When posters on Tom's started bashing Intel for its pricing, it made me want to gouge my eyes out, since Intel was selling at the same price; it was the distributors and online retailers that were cashing in on supply and demand while they could.

Back to the journalism and FUD stuff: I think Tom's could have a much more positive impact on behalf of gamers if they wrote articles about how much demand there is for more DX11 titles, playing up the importance of DX11 and PC gaming in general. I'm quite tired of TF2 and COD:WaW; even though they were really nice games, damn, how long can this go on? Not being into dirt-track racing (some groundbreaking DX11 Gran Turismo knock-off for the PC would be a whole different story: I'd pay $500 for video and $200 for a good wheel for that mythical game), I can't quite use Dirt 2 as a reason to upgrade anything right now.

January 10, 2010 12:48:45 AM

Here may be a clue as to Fermi's heat:

"Element V Nvidia Edition chassis also incorporated graphic card “air duct” system engineered by Thermaltake and Nvidia to provide added cooling for high-performance 3-way SLI or Quad SLI setup based on Nvidia’s next generation of enthusiast graphic card. The proprietary “air duct” system brings cool and fresh air directly from the outside of the chassis and accelerates it to graphic card’s intake to increase heat displacement and achieve optimal cooling efficiency. Without Nvidia SLI certified chassis, system powered by the next generation of high-performance graphic cards may not be able to operate at their highest setting due to inadequate cooling."
http://www.tweak.dk/nyheder2.php?id=22680
January 11, 2010 1:08:18 PM

The proprietary “air duct” system brings cool and fresh air directly from the outside of the chassis and accelerates it to graphic card’s intake to increase heat displacement and achieve optimal cooling efficiency.

Sounds like Nvidia is becoming Apple Computer.
January 11, 2010 2:20:25 PM

If I'm reading that correctly, which I may not be, people are going to need a specific case to properly cool these cards. Is that wrong? If it has its own "air duct" that takes in outside air, then it will have to be vented through the case/chassis. If that's the case (pun intended), then that blows (more pun intended). Of course, I could be reading that wrong.
January 11, 2010 2:22:19 PM

JAYDEEJOHN said:
Here may be a clue as to Fermi's heat:

"Element V Nvidia Edition chassis also incorporated graphic card “air duct” system engineered by Thermaltake and Nvidia to provide added cooling for high-performance 3-way SLI or Quad SLI setup based on Nvidia’s next generation of enthusiast graphic card. The proprietary “air duct” system brings cool and fresh air directly from the outside of the chassis and accelerates it to graphic card’s intake to increase heat displacement and achieve optimal cooling efficiency. Without Nvidia SLI certified chassis, system powered by the next generation of high-performance graphic cards may not be able to operate at their highest setting due to inadequate cooling."
http://www.tweak.dk/nyheder2.php?id=22680


Epic fail.
January 11, 2010 2:37:25 PM

I really, really don't understand why people keep going on about the heat and power.
IF Fermi comes out and performs at around the 5850 level and is as hot and power hungry as, from what we know, it seems sure it will be, THEN and only then is it the epic failure of a power-hungry, over-hot piece of crap most everybody wants to label it, even before they know the exact form the card will take. Sure, we have a good idea, but it's a very rough idea.
But.
IF Fermi comes out and a single-chip card performs close to, or shock horror beats, a 5970, THEN it will be an astounding piece of hardware that can presumably only get better with refreshes etc., and Nvidia did it again.

I very strongly agree with what TGGA first posted:
"However, people won't care if it performs well enough for the price. The only way it will be an issue is if the performance isn't enough for people to forgive things like power and temperature."

That's the bottom line, people.

Mactronix
January 11, 2010 2:38:36 PM

real world said:
If I'm reading that correctly, which I may not be, people are going to need a specific case to properly cool these cards. Is that wrong? If it has its own "air duct" that takes in outside air, then it will have to be vented through the case/chassis. If that's the case (pun intended), then that blows (more pun intended). Of course, I could be reading that wrong.


They have had SLI-certified cases for years. It is just marketing mumbo jumbo. Sure, the cards will run hot, but anyone could have told you 5 years ago that triple SLI needs a very good case with complete airflow (or water).

Any case worth its salt will still work fine for Fermi multi-GPU setups; they just want you to buy the special cases with the SLI sticker on them. Mind you, marketing it this way seems strange. I kind of get the feeling they do think the cards are very hot, which may be fine, but why basically say that your product is so hot that you have to spend more money to run it properly? Shrug.

Remember that when someone says "may not be able to run" in PR terms, they often mean that unless you use an autoclave as a case you will be fine. The "air duct" BS is just a sheet of plastic that does little more than a side fan.
January 11, 2010 2:40:50 PM

JAYDEEJOHN said:
Here may be a clue as to Fermi's heat:

"Element V Nvidia Edition chassis also incorporated graphic card “air duct” system engineered by Thermaltake and Nvidia to provide added cooling for high-performance 3-way SLI or Quad SLI setup based on Nvidia’s next generation of enthusiast graphic card. The proprietary “air duct” system brings cool and fresh air directly from the outside of the chassis and accelerates it to graphic card’s intake to increase heat displacement and achieve optimal cooling efficiency. Without Nvidia SLI certified chassis, system powered by the next generation of high-performance graphic cards may not be able to operate at their highest setting due to inadequate cooling."
http://www.tweak.dk/nyheder2.php?id=22680


Not surprising. I would imagine 3 or 4 of these would produce a lot of heat, maybe on the order of a kilowatt: the equivalent of cooling ten 100-watt bulbs.

I would expect 3 or 4 5890-series cards to require special cooling as well.

I can't imagine how 4 of these would fit on a normal ATX motherboard.
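That kilowatt figure is just multiplication; a minimal sketch, assuming roughly 300 W of board power per card (a guess, not a spec):

```python
# Total heat of a multi-card setup at an assumed ~300 W per card,
# expressed as an equivalent number of 100 W light bulbs.
CARD_POWER_W = 300

for n in (3, 4):
    total_w = n * CARD_POWER_W
    print(f"{n} cards: {total_w} W (~{total_w // 100} x 100 W bulbs)")
```

Three cards land at 900 W and four at 1200 W, which brackets the "on the order of a kilowatt" estimate above.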
January 11, 2010 2:47:41 PM

mactronix said:
I really, really don't understand why people keep going on about the heat and power.
IF Fermi comes out and performs at around the 5850 level and is as hot and power hungry as, from what we know, it seems sure it will be, THEN and only then is it the epic failure of a power-hungry, over-hot piece of crap most everybody wants to label it, even before they know the exact form the card will take. Sure, we have a good idea, but it's a very rough idea.
But.
IF Fermi comes out and a single-chip card performs close to, or shock horror beats, a 5970, THEN it will be an astounding piece of hardware that can presumably only get better with refreshes etc., and Nvidia did it again.

I very strongly agree with what TGGA first posted:
"However, people won't care if it performs well enough for the price. The only way it will be an issue is if the performance isn't enough for people to forgive things like power and temperature."

That's the bottom line, people.

Mactronix


That is very true at the high end, but temperature is a big consideration for low-end and OEM sales. All you need is a couple of morons putting a GTX 380 in a 10-year-old case, causing the thing to overheat, to cause a really large wave of "this card is too hot, don't buy it" PR.

For enthusiasts like most of us, temperature only matters if the card is not designed to run much hotter (no OC headroom, flaky stability), though I'd have to assume Fermi is probably supposed to run hot, or if it produces a lot of extra heat that must be dealt with before it hurts other components' OC. It's all well and good if my GPU is supposed to run at 120C, but that wouldn't help me OC my CPU that peaks at 70C. But if the performance is there, we will deal with it. This is not to say I wouldn't like a card with lower temps.

Temperature/performance is always going to be a consideration, but provided this ratio isn't absolutely ridiculously out of whack, there won't be an issue.
January 11, 2010 3:20:38 PM

mactronix said:
I really, really don't understand why people keep going on about the heat and power.
IF Fermi comes out and performs at around the 5850 level and is as hot and power hungry as, from what we know, it seems sure it will be, THEN and only then is it the epic failure of a power-hungry, over-hot piece of crap most everybody wants to label it, even before they know the exact form the card will take. Sure, we have a good idea, but it's a very rough idea.
But.
IF Fermi comes out and a single-chip card performs close to, or shock horror beats, a 5970, THEN it will be an astounding piece of hardware that can presumably only get better with refreshes etc., and Nvidia did it again.

I very strongly agree with what TGGA first posted:
"However, people won't care if it performs well enough for the price. The only way it will be an issue is if the performance isn't enough for people to forgive things like power and temperature."

That's the bottom line, people.

Mactronix

I agree, but I'm in the in-between camp, where we are likely to see it land below what you proposed for top perf, and above minimum perf.
Either way, a multi-card setup is going to get a negative response if it needs these types of cases, etc.
Even if the perf is that great, it won't matter then, though that's for multi-card only; single cards would still hold promise, and that's where we're at now, which lessens the negative scenario, but it still appears to be a high-power, hot card.
If it doesn't beat the 5970, here's my take: since the 5970 is maxed out, just under the PCI spec, and it loses a lot due to its CF setup, a single core with the same power usage should beat it.
Will this be a deciding factor? Possibly. But more, it speaks to each company's approach, where it'll really hit home later. If Fermi isn't easily scalable, and carries these traits down the line, it'll be quite apparent that nVidia will have to adopt ATI's approach of a smaller-than-x2 chip for the halo part, even though that sacrifices perf in the process, plus a future need to rid ourselves of AFR.

PS: If this translates to a simple SLI setup, then this is where a problem will occur regarding sales. I understand tri or quad setups needing something more, but a simple SLI setup shouldn't require a special case. A better one, yes, but a special one?
Anonymous
January 11, 2010 3:33:53 PM

If the PCI spec allowed more power, ATI could have made a 1 GHz 5970. The strange thing is that the card is more stable at 1 GHz with extra voltage than at the default 735 MHz and 1.04 V. The 5970 cores are undervolted and run at lower clocks than reference. Logically, and from the specs, Fermi will land between the 5870 and the 5970: a bit faster than a 5870, but not a 5970, unless they can magically get higher clocks. Cooling should not be an issue; manufacturers like Asus and XFX will get the job done. I hope it's not as hot as the GTX 280 was. A good PSU will go well past the PCI spec. Fermi will beat the 5970 in non-multi-GPU games, though.
January 11, 2010 4:07:05 PM

As was already said, anything labeled "SLI-ready" is just a PR gimmick.

Any decent airflow case will do just fine even if these are hot cards. The air channel in those special Element cases is nothing but a flimsy piece of plastic that won't beat side fans at cooling anyway.

I realize this has already been said, but I figure this statement needs its own post to be fully recognized.
January 11, 2010 4:18:11 PM

This may be just speculative anticipation for Fermi, but the hints have kept coming on the heat issue, and on the perf issue as well.
What we're hearing now may be fixed later, or, if nVidia decides to crank it up to give it even more perf for final clocks etc., the temps could end up slightly higher still.
While I agree this is all speculation, and it won't likely be a deal-breaker for single cards, it may play a part in a dual-SLI-on-up setup, and that's where it could be a problem. Previous SLI cert or not, this one's a little hotter.
January 11, 2010 4:34:33 PM

Hmmmm, fried eggs and tomatoes, anyone? Just go water or redneck if any of you guys buy this card. :sleep:
January 11, 2010 4:41:05 PM

JAYDEEJOHN said:
This may be just speculative anticipation for Fermi, but the hints have kept coming on the heat issue, and on the perf issue as well.
What we're hearing now may be fixed later, or, if nVidia decides to crank it up to give it even more perf for final clocks etc., the temps could end up slightly higher still.
While I agree this is all speculation, and it won't likely be a deal-breaker for single cards, it may play a part in a dual-SLI-on-up setup, and that's where it could be a problem. Previous SLI cert or not, this one's a little hotter.


Oh, don't get me wrong... I'm sure this indicates the thing is going to be piping hot, though I don't think that matters to most.

I do think they are using this as a chance to make some cash by 'forcing' users to buy into PR nonsense and get a special case to go with the GPU, though. If heat were not an issue, I think they would have worded it a lot more carefully... or perhaps it is as simple as someone making a mistake. The last thing you want to do while trying to sell Nvidia-certified cases is give off the impression your product is flawed in some way (heat, size, whatever). SLI/Crossfire/etc. cases make money either because people think they look cool, or because ignorant people think they are required. No company could have such disdain for their consumers as to think they are all the latter by designing something too hot for the cases everyone already owns.
January 11, 2010 4:52:29 PM

Well, looking at the 5970 and its special design, with its thermal cooling setup being overdone etc., Fermi comes in a smaller package/card, so its heat is concentrated in a smaller overall piece of silicon.
This tells me a few things: if it's close to the 5970 in power usage, it has a lot less space on the chip, and on the card itself, for heat dissipation.
Looking at the 280 vs the 285, we saw this again, as the die shrink really allowed for a lot with the 285, not just clocks. The 280 was the highest in returns/RMAs for that gen, for both sides, and that's where I think Fermi will also end up.
If you get a good one, you'll have a great card, but I expect a few RMAs, especially early on, and with no shrink this time around, it'll have to rely only upon process maturity, meaning lower yields etc., higher costs or lower profits.
But, like I said, if you get a good one, you'll be happy.
January 11, 2010 4:58:47 PM

daedalus685 said:
No company could have such disdain for their consumers as to think they are all the latter by designing something too hot for the cases everyone already owns.


This has already happened in the past. Back in the day, cases had zero ventilation and no heat sinks. As hardware improved and heat output increased, we gained heat sinks, and later cases added ventilation. As heating issues grew, we added fans to the heat sinks and fans to our cases.

There are 3 things that can be done in order to keep advancing:
1) improve cooling methods to account for hotter chips due to adding more transistors (Fermi has increased the transistor count by a lot);
2) shrink the chips;
3) improve performance with the same transistor count.

All three are always happening. It makes perfect sense that at some point a heat duct will become the normal design, and even prebuilt computers will have them some day, or some other new cooling method.
January 11, 2010 6:05:59 PM

bystander said:
This has already happened in the past. Back in the day, cases had zero ventilation and no heat sinks. As hardware improved and heat output increased, we gained heat sinks, and later cases added ventilation. As heating issues grew, we added fans to the heat sinks and fans to our cases.

There are 3 things that can be done in order to keep advancing:
1) improve cooling methods to account for hotter chips due to adding more transistors (Fermi has increased the transistor count by a lot);
2) shrink the chips;
3) improve performance with the same transistor count.

All three are always happening. It makes perfect sense that at some point a heat duct will become the normal design, and even prebuilt computers will have them some day, or some other new cooling method.


I understand that, but a "duct" doesn't do anything unless you move away from ATX-designed cases. Obviously things have gotten hotter, though stating that cases used to have zero ventilation is a bit of a stretch; a lot less, certainly, but I can't think of a time when a high-end computer, server, etc. did not have at least one fan (even if that fan was part of the building's air conditioning).

Cases obviously have improved as parts need better cooling but that was not my point. You took what you quoted out of context.

The point is that any high-end case has enough cooling; there is no such thing (until the MB layout changes) as a case where a simple duct improves GPU cooling in any meaningful way. If you have a full-tower case with good airflow, that is good airflow for any GPU that can function without water cooling. The colour of the sticker is irrelevant.

The part of my post you quoted was referring to how no company would assume all of their customers are idiots by stating something as asinine as "it is hotter, so you had better buy one of our certified cases or it won't work." A high-end case is a high-end case. Nvidia knows they must design within certain open standards, and they do. All this smells like to me is their cards running hotter than the old ones did, thus heating up the case a lot unless it is particularly large, and Nvidia (as any company would) trying to make the best of a bad thing.

To add, you left a lot of ways to keep advancing off your list :D . What about improving materials to reduce resistance? Moving away from the transistor entirely? No longer using electrons as our communication medium? Lots can be done; more power does not have to mean more heat (the work to actually compute things is very low, we just need lots of power to overcome the shortcomings of our materials' resistance). It just so happens it generally does within the same paradigm, but that is not a constant.
January 11, 2010 6:27:19 PM

Most of your methods of advancing are part of those 3, or a combination of those 3. Cooling methods do include reducing resistance to produce less heat.

Anyway, my point is simple: we are going to continue to come up with new and improved methods of cooling. Diamond wafers might be the next cooling method, as diamond doesn't heat up nearly as much as silicon, but that's probably a long way off before it's ready.

Is the heat duct really that effective? I don't know, but if it is, it just might become the next design feature for the future. It has to start somewhere.

My first computer was an 8086; it had a box with nearly zero ventilation, but you are probably right, it might have had a couple of holes out the back. I can't recall.

This new duct is only being asked for in SLI solutions. If it catches on, proves to be useful, and is a cost-effective design, I'd bet it will become a standard as a result of Nvidia asking for it to be used for SLI Fermi solutions. You'd better believe that means ATI will take advantage of it too.
January 11, 2010 6:47:06 PM

bystander said:
Most of your methods of advancing are part of those 3, or a combination of those 3. Cooling methods do include reducing resistance to produce less heat.

Anyway, my point is simple: we are going to continue to come up with new and improved methods of cooling. Diamond wafers might be the next cooling method, as diamond doesn't heat up nearly as much as silicon, but that's probably a long way off before it's ready.

Is the heat duct really that effective? I don't know, but if it is, it just might become the next design feature for the future. It has to start somewhere.

My first computer was an 8086; it had a box with nearly zero ventilation, but you are probably right, it might have had a couple of holes out the back. I can't recall.

This new duct is only being asked for in SLI solutions. If it catches on, proves to be useful, and is a cost-effective design, I'd bet it will become a standard as a result of Nvidia asking for it to be used for SLI Fermi solutions. You'd better believe that means ATI will take advantage of it too.


Cooling ducts have been used for more than a decade on GPUs and CPUs. Putting a tube over the CPU fan was all the rage 5 years ago. This is nothing new; it is just a gimmick that doesn't function as well as something like a side fan or rearranging the motherboard.

Not everything can be broken down into general rules. Though I suppose if I am as general as possible, I can avoid my list ever being incomplete. After all, the first and only rule of computer progress is this:

Stuff will get better as time goes on.

But that being said, "cooling" means to cool, which means that something has to have been hot. Changing the chip to prevent that heat from ever being produced is not a cooling method; it is a production improvement. Perhaps rule 2 should be "production improvements" and not just shrinking. But this is not the place for a lame semantics discussion: there is no such thing as "the three rules of improving computers." Take my comment for what it is, a joke, as no logical individual would limit their creativity to any finite set of rules.

At any rate, the subtle comments in my prose seem to be lost on the majority of people so let me rephrase it:

The duct will do nothing; cooling is adequate for what we have. Until ATX standards change and more space is left between a graphics card and the card above it in CrossFire/SLI (or the orientation changes), mass-market cooling will not improve using conventional means. Besides that, conventional means are plenty to cool a card that meets the 300W limit.

Perhaps Fermi runs hotter than the previous generation, but unless it breaks spec it will be cooled fine. The only reason Nvidia is commenting on it at all is because they want to sell more of their partners' cases, which I was contending is a silly thing to do, as it leaves intelligent users with a bad taste in their mouth. Thus they either do have heat issues for one reason or another, or they think everyone is some sort of barnyard animal who will buy a new case because they vaguely say it is required. A regular case is already not sufficient for triple SLI... Perhaps Nvidia is in damage control for what might be an unusually poor performance/temperature part, but one cannot read too much into that from a case. It is a massive chip, it will be warm; it is far more likely that, being a huge chip, Nvidia is trying to prepare us for what might be the hottest thing we have seen, but with no worse performance/temperature than anything else, just more area to heat up.
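For what it's worth, the "conventional means are plenty for 300W" claim is easy to sanity-check with the standard airflow heat equation Q = ṁ·cp·ΔT. Here is a minimal back-of-the-envelope sketch; the air constants and the 15°C allowed temperature rise are my own ballpark assumptions, not figures from Nvidia or any case vendor:

```python
# Rough sanity check of "conventional air cooling is plenty for a 300 W card":
# heat carried away by airflow is Q = m_dot * cp * dT.

AIR_DENSITY = 1.2   # kg/m^3, air near sea level at room temperature
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air

def cfm_required(watts, delta_t_c):
    """Airflow in CFM needed to remove `watts` of heat while letting the
    exhaust air warm by `delta_t_c` degrees Celsius."""
    mass_flow = watts / (AIR_CP * delta_t_c)   # kg of air per second
    volume_flow = mass_flow / AIR_DENSITY      # m^3 per second
    return volume_flow * 2118.88               # 1 m^3/s ~= 2118.88 CFM

# A single 300 W card with a 15 C allowed air-temperature rise needs on the
# order of 35 CFM, well within what a couple of ordinary 120 mm case fans move.
print(round(cfm_required(300, 15), 1))
```

Note the required airflow only scales linearly: a second card doubles the CFM, which a decent tower still moves easily. The real difficulty is routing that air between two closely stacked cards, which is the ATX-layout point made above.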
January 11, 2010 7:10:31 PM

I'm having a hard time following you.

First you tell us that the duct doesn't do anything, then you go on to tell us that maybe the card is running super hot, so they are trying to do damage control by adding the duct. Which would suggest you believe it does help.
January 11, 2010 8:27:26 PM

bystander said:
I'm having a hard time following you.

First you tell us that the duct doesn't do anything, then you go on to tell us that maybe the card is running super hot, so they are trying to do damage control by adding the duct. Which would suggest you believe it does help.


What a waste of time....

Fine,

Let me try this one more time.

The duct does not do anything; some cases have used them over the last decade, with no real results. But it is a distinguishing feature in some cases.

They won't even add a duct in all SLI-certified cases; they will in some. Nvidia (and everyone else) wants you to buy more than you need. If they can get you to do this by saying the cards are hot and you need a duct, then they will.

I said damage control because I cannot think of a rational reason Nvidia would want to say that one needs better cooling, unless they are simply getting our feet wet because the cards do run a bit hot. That, or, as I pointed out, they believe everyone who buys their products is a moron who will buy an extra case because they are told to, without realizing: "hey, why am I doing this... why is it so hot anyway?"

This does not imply that I think the duct does anything. It surely does not, as all it does is inhibit airflow to the rest of the system; you want more fans to push more air, not more restrictive air movement. Sure, a duct can help in some respects, but that has nothing to do with Nvidia and is not new.

It means that I believe Nvidia is not so stupid as to go around implying they have hot cards for the purpose of selling more cases, given the damage that would do to their reputation, unless they really are hot cards (and really... duh? What do people expect from the top of the line?). That the cards are hot does not mean the duct works. It means that they actually do think people need nice cases, and in order to get people to spend all that money they don't have, instead of saying "buy a nice case" they stick to recommending an SLI-certified case, as they have done for the past several years. If that means that to get an SLI certificate the cases have to jump through hoops and add a duct, then so be it, but the duct has nothing to do with this at all. It is just a feature they can add to differentiate the old case you already own from the one they want you to buy.

I'm not sure how I got dragged into a talk about ducts. Sure, they can help at times (I contend that until they change the layout of the motherboard a duct won't help much, and may hurt other parts of the case at the same time), but a good case is a good case, with or without a duct. That is my point. It is about why Nvidia is marketing the way they do.
January 11, 2010 8:44:52 PM

A well-fitting water block, plus any pump and rad you can buy that won't corrode, and the problem is solved.
January 11, 2010 9:29:11 PM

@ daedalus Somehow, I liked my answer better heheh
" I understand tri or quad setups needing something more, but a simple SLI setup shouldnt require a special case, a better one yes, but a special one? "
January 11, 2010 9:36:20 PM

JAYDEEJOHN said:
@ daedalus Somehow, I liked my answer better heheh
" I understand tri or quad setups needing something more, but a simple SLI setup shouldnt require a special case, a better one yes, but a special one? "


But where is the flair, JDJ! ;)  Why say in a few words what you can creatively say in many ;) .. or something...

But no, it shouldn't, and I am assuming it doesn't, require anything extra... but why would that stop them from trying to get people to think it does? $$$!

Of course, as stated, I think either Nvidia PR went AFK for a while when the case vendors talked about how hot the Fermis were, Nvidia is staffed by people with nothing but contempt for society, or the cards are actually a tad hot and they figure, what the hell, maybe we can sell more *** this way.
