Upcoming 260/280GTX heat and power vs 9800GTX

June 6, 2008 5:47:35 PM

I know they are still a couple weeks away, and eventually there will be some comparisons done of course, but does anyone have a link or info on:

1) How does the heat generated by either the 260GTX or 280GTX compare to, say, the 9800GTX?

2) How does the power consumption of either the 260GTX or 280GTX compare to, say, the 9800GTX?

I did notice in one of the leaked pictures/videos of the 260 (this site said it was the 280, but then people said it was the 260) that it only has two 6-pin connectors, not a 6-pin and an 8-pin, so power could be less?

If less power, I'd assume less heat?

Of course, I'm pulling all this out of my arse as I have no other metrics...
June 6, 2008 5:49:56 PM

The GTsuX is gonna eat lots of power according to rumors; heat, maybe about the same, I'm not sure. Everything here is gonna be assumptions until the real thing comes out =D
June 6, 2008 6:08:25 PM

260+W is too much, wouldn't you guys say?
June 6, 2008 6:25:46 PM

dirtmountain said:
http://www.atomicmpc.com.au/forums.asp?s=2&c=7&t=9354
for power usage for the 3


Thanks for the link, looking it over.

And yes, I realize this is all mental masturbation and we should just wait 2 weeks and we'll have our answer, but isn't that what enthusiasts do?

(not the masturbation part, I mean...)
June 6, 2008 6:34:46 PM

I've heard the TDP for the 280 is 234 watts. I've also heard that the 280 runs as hot as 2x 9800GTX.
June 6, 2008 6:39:26 PM

dirtmountain said:
http://www.atomicmpc.com.au/forums.asp?s=2&c=7&t=9354
for power usage for the 3


Terrible resource based on a BS compilation, most of which is based on system power draw.

The only true tests are ones done by places like Xbit, which test power draw at the connectors, not of the system.

And no one knows the power draw until it ships.

nV is posting BS numbers showing sub-150W figures, yet in the same slide they're saying the GF9800GTX draws 80W max under load, while most other sites show the draw at the pins is over 100W.

Which to me means that the GTX280 will draw as much as or more than an HD2900XTX; I would suspect 160-200W would be more realistic.
And that crap site says the HD2900XTX draws 240W, which is BS, and the OEM 270W, yet it only has spec support for 250W on all connectors (75+75+100), while Xbit et al. found it was closer to 175W.
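
For anyone following the connector math here, a quick back-of-the-envelope sketch of the spec ceilings being argued about. The 75W slot and 75W 6-pin figures are the standard PCIe numbers; the 100W 8-pin value is the one quoted above, though 150W is the figure usually cited for an 8-pin, so treat this as an illustration rather than a measurement.

# Rough PCIe power-budget calculator (illustration only, not a measurement).
SLOT_W = 75                                 # PCIe x16 slot
CONNECTOR_W = {"6-pin": 75, "8-pin": 100}   # 8-pin per the figure quoted above

def max_spec_power(connectors):
    """Upper bound a card can pull while staying inside connector specs."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(max_spec_power(["6-pin", "8-pin"]))   # 250 W, the HD2900 case above
print(max_spec_power(["6-pin", "6-pin"]))   # 225 W, e.g. the rumored 260 with two 6-pins
# Note: quoting only the connector draw and leaving out the slot's 75 W is one
# easy way a headline number can end up looking "low".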
June 6, 2008 6:40:23 PM

well, nVidia wants to burn our CPUs.
June 6, 2008 6:45:31 PM

Well, I think the 9800 GTX (at least the EVGA version) seems efficient, both in its temperature and in how it sheds its heat.

I know this is not a scientific test, but of all the cards I've had in the last 4-6 years, the EVGA 9800 GTX seems to have the best ratio of performance to heat generated IN the system.

Again, an extremely ghetto method, but just running my hands around every crevice of the card while running benchmarks shows very little heat dissipating into my box compared to other higher-end cards I've had.

Sure, my internal cooling (even with both side panels off) is different from box to box, but that is why I say this is the layman's test. Still, I'm very happy with its performance, so I'm wanting to upgrade, but not take a step back in terms of heat and power.

I do not want the pressure-cooker heat of the 9800GX2 for example.

Of course I say that now, but if the 280, for example, performs 2 to 3 times as fast as a 9800 GTX, I'll tear my pants reaching into my pocket so fast to grab my credit card and order...

...so maybe I'm just more bark than bite.

*waits in line like a patient, obedient, NVidia fanboi*
June 6, 2008 6:49:52 PM

True, Ape. My numbers are for theoretical FULL usage, but that'd never really happen. I've seen numbers at 150 too, but I discount them. I wonder if some of these power numbers don't get fudged in a real simple way: they don't include the mobo's 75 watts when they want to show the "low" numbers, just the connector draw?
June 6, 2008 7:39:34 PM

arrpeegeer said:
Of course I say that now, but if the 280, for example, performs 2 to 3 times as fast as a 9800 GTX, I'll tear my pants reaching into my pocket so fast to grab my credit card and order...

...so maybe I'm just more bark than bite.

*waits in line like a patient, obedient, NVidia fanboi*


LOL. Dude, problem is, if this beast performs 2 to 3 times better than a 9800GTX...holy snap, that'd be sexy...then you'd be hitting areas where games can't even utilize that much juice yet!

But dude, I say that, but I'd have some torn pants too hehehe...but for me it's a little more legit...I'm coming from a 6X00 Ultra I think...it's so old I forget WTF it is, LOL.
June 6, 2008 7:49:17 PM

Here's nV's official slide:

[nV power-consumption slide]

So no matter how you slice it, that's essentially a 70W or 80+% increase in power consumption compared to the GTX, even using nV's generous numbers.

And JDJ, Xbit's under-load numbers come from running 3DMark SM3.0/HDR at 16x12 with AA in a loop, which means it's what actually happens: not an average, but a peak, of course;
http://www.xbitlabs.com/articles/video/display/geforce9...

So using those numbers instead of the generous ones, that would equate to about 200W (198.63W) for the GTX280 in 3Dmark.
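
In case anyone wants to reproduce that ~200W figure, here's the scaling I'm assuming: take nV's claimed GTX280 draw and inflate it by the same ratio their 9800GTX claim missed the measured reality. The ~80W/~147W slide values and the ~108W Xbit pin measurement below are my reading of the numbers in this thread, not figures I can verify against the slide itself.

# Scale nV's claimed GTX280 draw by how far their 9800GTX claim missed reality.
# All three inputs are assumptions taken from this thread, not official specs.
CLAIMED_9800GTX_W = 80.0     # nV slide claim for the 9800GTX under load
MEASURED_9800GTX_W = 108.1   # roughly what Xbit measured at the pins in 3DMark
CLAIMED_GTX280_W = 147.0     # nV's "sub-150W" GTX280 claim from the same slide

estimated_gtx280_w = CLAIMED_GTX280_W * MEASURED_9800GTX_W / CLAIMED_9800GTX_W
print(round(estimated_gtx280_w, 2))  # -> 198.63 W, i.e. the ~200 W quoted above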

And considering a lot of people buy these for 1920x1200+ with AA, that test may be even less than max power draw. Your numbers, of course, are TDP, so the thermal MAX they need to be designed for, with the idea that over time the heat builds up, etc., so they need to be able to handle more than the actual heat generated by just the chip at that moment.
June 6, 2008 7:58:50 PM

Thanks for clarifying that. And of course splainin it proper like. heheh
June 6, 2008 9:53:49 PM

Nice post, TGGA. So I guess if max power is roughly ~150W for 3D gaming/load, then I should be OK.

Based on the following components, for those more in tune with exact power consumption, should I have any issues throwing in a 280GTX?

ASUS RAMPAGE FORMULA
Intel E8500 + Noctua NH-U12P cooler
G.SKILL 4GB (2x2GB) DDR2-1000
EVGA GeForce 9800 GTX [replacing eventually with a 260/280]
X-Fi XtremeGamer
Velociraptor 300GB / WD6400AAKS 640GB / Pioneer DVR-215DBK
PCP&C QUAD Silencer 750W PSU
Lian Li PC-A70 + 6 Scyth S-Flex SFF21F

Thanks
June 6, 2008 11:43:49 PM

With PC Power & Cooling you'll be more than fine; that thing is like most people's 1KW PSU.

But be advised (for your electric bill, maybe :lol: ) I still think it'll be north of that nVidia figure by a few dozen watts, but even my high-end guesstimate of 200W would be well handled by your PSU. The only concern might be SLI, but if a single one isn't fine for you, then it's gonna be tough all over for people with crap PSUs.
June 9, 2008 5:14:48 PM

Yeah I was thinking the same thing - but then again, how many people with 'crap PSUs' will be spending $450 or $650 on a card? :) 

And I agree, since it's a single-rail 750. I don't think I'm drawing more than 400 watts right now in gaming with that build, but I'm not a great calculator :/ 
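
For what it's worth, here's the kind of rough tally I'd do for that build. Every per-component wattage below is a generic ballpark I'm assuming, not a measurement of these specific parts, so take the total as an order-of-magnitude check only.

# Very rough load estimate for the build listed above; all figures are
# assumed ballpark draws, not measurements.
estimated_draw_w = {
    "E8500 CPU (loaded)": 65,
    "motherboard + RAM": 50,
    "GTX280 (TGGA's high-end guess)": 200,
    "X-Fi sound card": 10,
    "2x HDD + optical drive": 30,
    "6 case fans + CPU cooler fan": 15,
}

total = sum(estimated_draw_w.values())
psu_rating_w = 750  # PCP&C Silencer 750, single 12V rail
print(f"~{total} W estimated load on a {psu_rating_w} W PSU "
      f"({total / psu_rating_w:.0%} of rating)")
# -> ~370 W, comfortably inside the 750 W rating; SLI would be the only
#    configuration worth recalculating for.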
June 9, 2008 5:29:26 PM

BTW, found some great info on the Anandtech forums. I love both this site and theirs, and since I'm not sure if I can cross-post, here is the info.

If I can cross-post, I'll use the link to give them credit.

GT200
-Size: 576mm^2
-Manufacturing architecture: 65nm
-2nd gen unified shaders
-1.1-1.4B transistors
-Costs $110-$120 to make
-DX10.0 & SM4.0 capable

----------
Cards

Geforce GTX 260
Core clock: 575MHz
Shader clock: 1250MHz
Memory clock: 2000MHz
Memory interface: 448-bit
Memory bandwidth: 112GB/s
Frame buffer: 896MB GDDR3
Power consumption: 182W TDP
Length: 10.5"
Power connectors: 2x 6-pin
Price: $449

Geforce GTX 280
Core clock: 600MHz
Shader clock: 1300MHz
Memory clock: 2200MHz
Memory interface: 512-bit
Memory bandwidth: 140.8GB/s
Frame buffer: 1024MB GDDR3
Power consumption: 236W TDP
Length: 10.5"
Power connectors: 1x 6-pin + 1x 8-pin
Price: $650

----------
Release date: 17.06.2008
----------
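
As a sanity check, the memory bandwidth figures in that list do fall straight out of the clocks and bus widths, assuming the listed memory clocks are effective GDDR3 data rates:

# Cross-check the listed memory bandwidths against clock x bus width.
# Assumes the "memory clock" figures above are effective data rates in MT/s.
def mem_bandwidth_gb_s(effective_mts, bus_bits):
    return effective_mts * bus_bits / 8 / 1000   # bits -> bytes, MB/s -> GB/s

print(mem_bandwidth_gb_s(2000, 448))  # GTX 260 -> 112.0 GB/s
print(mem_bandwidth_gb_s(2200, 512))  # GTX 280 -> 140.8 GB/s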

Links

GTX 260 info + pic:
http://plaza.fi/s/f/editor/images/0604gtx260.jpg

GT200 tech demo [with GTX 260]:
http://www.youtube.com/watch?v=K9gwJwCNvT8

GTX 280 pictures:
http://www.vr-zone.com/articles/Detailed_Geforce_GTX_28...

GTX 280 power consumption figures:
http://www.dvhardware.net/news/nvidia_geforce_gtx_280_p...

GTX 260 and GTX 280 prices:
http://resources.vr-zone.com//newspics/May08/30/gtx200-...

GTX 260 and GTX 280 performance (according to Nvidia):
http://resources.vr-zone.com//newspics/May08/30/gtx200-...

55nm GT200 in the works:
http://www.nordichardware.com/news,7815.html

GT200 has 2nd gen unified shaders:
http://www.dvhardware.net/article27294.html

GT200 launch date:
http://www.nordichardware.com/news,7803.html
June 9, 2008 5:32:58 PM

..... And it still can't do DirectX 10.1.......
June 9, 2008 5:39:54 PM

Kaldor said:
..... And it still can't do DirectX 10.1.......


Even though the top 9 of 10 game companies (I read an article in, I think, PC Gamer) said they have ___--->>> zero <<<---___ plans to support 10.1, as they are barely interested in supporting 10.0...

...so despite MS/Vista pushing 10.1 out, no one cares.

Literally, no one.

So as a consumer, if leaving 10.1 out makes it cheaper...

/fixed
June 9, 2008 6:28:12 PM

Ever wonder why companies don't utilize DirectX 10.1? And have you ever noticed how many games have that lame "the way it's meant to be played" BS? Take that into account before you open your mouth. Nvidia PAYS companies big bucks NOT to use 10.1. I think Assassin's Creed showed that 10.1 worked a lot better on ATI hardware than on Nvidia hardware, and then the game was back-pedaled to 10.

There are some very compelling reasons for programmers to use 10.1 from a performance standpoint. I'm not going to link a bunch of crap in here, but by all means, go search around on the net. Wikipedia actually has some good info on it.

Leaving 10.1 out doesn't make the software any cheaper......

And PC Gamer, what an awesome source of info. I'd be willing to bet that Nvidia has quite a bit at stake with those 9/10 companies...

Nvidia is actually doing customers a disservice by not pushing the envelope and embracing new technology. ATI has embraced the new tech; why can't Nvidia?

I'm no fanboi for either side. I own an EVGA 8800 GTX. I buy whatever is fastest at the time I buy my hardware, and in January 2007, that card was king. I hope to god that ATI's 4870X2 crushes Nvidia's new offerings at $200 cheaper per card. Nvidia needs to get off their ass and actually innovate for a change. Hopefully they get smoked, and then they'll actually have to make a good product and be competitive, which is only good for the customer.
June 9, 2008 6:50:46 PM

arrpeegeer said:
Yeah I was thinking the same thing - but then again, how many people with 'crap PSUs' will be spending $450 or $650 on a card? :) 


More than you'd think, unfortunately.

It's like the people who put $4,000 rims and tires on a broken-down car with worn-out suspension.
Only people who know better realize that a quality PSU makes a difference, versus just some big numbers. nVidia says 550W? Hey, then this 600W one that shipped with my case should be fine...
June 9, 2008 6:57:02 PM

Kaldor said:

Nvidia is actually doing customers a disservice by not pushing the envelope and embracing new technology. ATI has embraced the new tech; why can't Nvidia?


And S3 has a DX10.1 part as well, so it's obviously not that difficult to do.
Personally I would've been fine with slow support for DX10.1, but having it come out and then lame excuses used for its withdrawal is just sketchy.
Overall I think it's bad for the customer, especially since it's a global M$/DX standard and not some niche made-up one like TruForm or an FX renderpath.

I would love to see someone compare Assassin's Creed on the new cards with and without the patch and then see what happens; I have a feeling it would have a greater impact than it did on the HD3K's viability.
June 11, 2008 5:47:11 PM

Kaldor said:
Ever wonder why companies don't utilize DirectX 10.1? And have you ever noticed how many games have that lame "the way it's meant to be played" BS? Take that into account before you open your mouth. Nvidia PAYS companies big bucks NOT to use 10.1.


My mouth is opening again. And it is warning you to duck or those black helicopters might hit you in the head ;) 

(I think I'm too late)
June 11, 2008 5:49:41 PM

TheGreatGrapeApe said:
More than you'd think, unfortunately.

It's like the people who put $4,000 rims and tires on a broken-down car with worn-out suspension.
Only people who know better realize that a quality PSU makes a difference, versus just some big numbers. nVidia says 550W? Hey, then this 600W one that shipped with my case should be fine...



Very true, I guess. Not until I recently spent weeks building my system (and it's the best, fastest, most solid machine I've ever owned, thanks to Tom's Hardware forums and Newegg reviews helping out) did I realize how much of a difference a PSU has made in the last 3-5 years.

Also, I never even understood the advantages (hell, I didn't even know how it worked) of single-rail vs. multi-rail, or single-wire design vs. modular (some would debate this), etc.

But yes, I went with the PPC 750 silencer and it's a beast for 750.