Nvidia's Fermi Cards Said to Run Very Hot
Hot graphics with hot temperatures.
Graphics enthusiasts eagerly await the arrival of Nvidia's Fermi GPU-based cards. At this point, Nvidia is trailing behind ATI and its 5000-series cards, but expectations are high for Fermi.
Expected to turn things around for Nvidia in a big way, Fermi is supposed to be vastly superior to the company's current line of 200-series cards.
Several case vendors we spoke to at CES told us that while running one Fermi card alone or two single-GPU cards is fine, going any higher may introduce thermal issues. Though no firm temperatures were revealed, manufacturers said that users will need to be extra careful about how they set up the innards of their gaming chassis.
A rep from one manufacturer said that Fermi-based cards will run hotter than the hottest ATI Radeon HD 5000 series.
I'm no big fan of Nvidia, but saying a GPU will suck before we get some benchmarks is absurd. Wait until Fermi is out; then you can criticize the final product as much as you want.
This way you can have all the hot java you want when you want.
I want Fermi to come out already so that we will have some real numbers to end this debate over whether it's going to rock or suck.
"I'm no big fan of Nvidia, but saying a GPU will suck before we get some benchmarks is absurd. Wait until Fermi is out; then you can criticize the final product as much as you want."
So was the HD 2900 XT....
"I want Fermi to come out already so that we will have some real numbers to end this debate over whether it's going to rock or suck."
I had a pair of 8800 GTXs, and they ran extremely hot in my small, unventilated mid-tower case until I upgraded to a bigger chassis. That didn't make them "fail" cards by any stretch of the imagination!
Yeah, but they work fine when air-cooled by a decent case; they don't need water cooling at all.
"This way you can have all the hot java you want when you want."
Haha, agreed.
Although I'm like 90% sure that Nvidia said multiple cards can cause heating problems, because even one GPU would run at around 75°C.
They said two is fine, but that's probably because they use liquid helium to cool their rigs. I don't think it will run anywhere remotely as cool as my 5850, which sits at 29°C.
One GPU will be more than most cases can take; if you have something from four years ago, you are SCREWED.
Dude, I totally agree with you, but what gets me fired up is how this site has lately been sinking in terms of credibility and professionalism. They even manage to take second- and third-hand info and twist it into trash talk. I visit other sites, and the classics still provide me with interesting news, whereas TH, which used to be my top pick, seems to go for the cheap shot of writing a catchy title to get me to read the story, only for me to find out they're just twisting things around. I give this site my time; the least they can give me is accurate news, not some sensationalized BS! Grrr!
50% higher transistor count. Wider 384-bit memory interface. Huge L2 cache.
I think Fermi's going to be slow. /sarcasm.
On the article's subject:
The 5970 runs hot and pushes the power limit of the PCIe specification. So will Fermi, but as with the 5970, a dual-slot air cooler will push most of that hot air out of your case anyway. These heat and power consumption concerns are pretty much... STUPID.
Haha, I'm guessing all this means is that there won't be an X2 dual-GPU card coming any time soon, which to me doesn't sound like an issue if Fermi runs enough faster than the 5870 to fall between it and the 5970.
Fermi needs to beat the 5970 while also costing less. Don't hold your breath.
The smart money is on Nvidia retreating from the gaming arena completely. Fermi will be a decent-performing gaming GPU, pretty good in terms of computational stuff, but there is no way Nvidia will continue to lose money and resources trying to beat ATI.
Between this, the delay, and the reported yield issues, it's really not looking good. Let's just hope they don't panic like everyone else has in this situation and release a dodgy, slow, or broken product.
Even if it is awesome, enough is enough; I hardly see how giving us twice the power for twice the power usage and heat output is progress. I don't want a noisy computer with a MASSIVE case just to accommodate the graphics card. Up until the DirectX 10 cards, you could get a really good card that was still small and manageable while being awesome, the last one being the 8800 GT. Now they all have to be massive; I don't consider this progress...
They are not; I remember cooling problems when using 8800 Ultras causing some components to overheat, especially on Nvidia boards.