Some Real results of GTX280 testing.

June 18, 2008 12:40:31 AM

So.......it takes $1400 worth of video cards to beat/match $1000 worth in performance. Gotcha.
June 18, 2008 12:41:43 AM

I don't know where you get $1400 from. Two 280's will cost $1200-1300 :) 
June 18, 2008 12:43:39 AM

gadgetnerd said:
I don't know where you get $1400 from. Two 280's will cost $1200-1300 :) 


Yeah, I'm waaaay off :heink: 
June 18, 2008 12:44:12 AM

Why not just buy 3x 9800GTX's for the price of 1 gtx280?
June 18, 2008 12:49:22 AM

I was thinking about it, but the 9800GTX has heat issues. Also, I believe one GTX280 will beat them, if I'm not mistaken.
June 18, 2008 12:56:46 AM

Um, no. 2x 9800GTX beat a single 9800gx2, which beats a gtx280, and reviews have shown that 3x 9800gtx beat or are on par with 2x 9800gx2's... I've been shopping around and managed to snag a 9800gtx for $200. As for the heat issue, they run cooler than the 8800GTX's I previously had, so I see no problem.
June 18, 2008 1:04:28 AM

Hmm, I heard they had heat issues. I should have gone with tri-SLI 9800 GTX's; I ordered a GTX280 lol
June 18, 2008 1:15:58 AM

Tom's review also showed that those GTX280's produce an unbearable amount of noise under load, ~66.5 dBA, almost 10 dBA more than a 9800 (13 at idle). At load both the 9800 and the 280 run at the same temperature, though at idle the 280 runs much cooler due to the ridiculously louder fan and advanced power management.
June 18, 2008 1:19:16 AM

Fans will make noise... lol, games overpower the noise. Or use a headset :D 
June 18, 2008 1:20:20 AM

lx_flier said:
Tom's review also showed that those GTX280's produce an unbearable amount of noise under load, ~66.5 dBA, almost 10 dBA more than a 9800 (13 at idle). At load both the 9800 and the 280 run at the same temperature, though at idle the 280 runs much cooler due to the ridiculously louder fan and advanced power management.


Yeah, I remember when the 9800GTX came out; the thing was beastly loud! Now the GTX280 drowns the 9800GTX's noise out, LOL!
June 18, 2008 1:23:36 AM

Not at idle they don't, lol. Add another $100 cooling solution to the $650 card. At least you'll save some money on the power bill :-)
June 18, 2008 1:25:03 AM

Dude, that's one fine system you've got there.
June 18, 2008 1:40:11 AM

I ordered a GTX 280 yesterday, is it really THAT loud?! My case has a fan right where the graphics card goes, so I might be able to turn the fan down on the GTX if my case gets good enough airflow. That's not a stupid idea, is it?
June 18, 2008 1:44:50 AM

Nah, I'm sure if you've got good airflow you won't need to turn it up like me. I have the side panel with 4 fans, so yeah, I'm sure I'll hit 30% at idle and about 50% at full load.
June 18, 2008 2:11:02 AM

gadgetnerd said:
http://www.evga.com/forums/tm.asp?m=407279

and yes 2 SLI 280's will beat the quad 2900's gx2's


This is what I call the "Nvidiot" effect. It's marketing the high end that most people can't afford in order to sell the low end, which doesn't perform better than the competition and lacks full DirectX support.

Well, marketing works. Intel relied upon marketing during the Netburst days (but AMD's marketing can't get the Phenom out of its rut). Nvidia has always relied upon marketing plus strong-arm tactics like the Assassin's Creed patch ("The Way It's Meant to Be Played"), and dodgy drivers like the Crysis demo water or the blurriness of the 7000 series.

Is the GTX280 in quad better than 2 3870x2's in CrossfireX? Probably (I don't know what you mean by quad 2900 gx2's). Is it better than the 9800gx2 in SLI? Maybe. Overall, the results show that the GTX280 is a bad deal. It's not much better than Nvidia's own last generation, and it's horribly overpriced.

If you're in the category of someone who can afford 4 GTX 280's then go for it, but if you're not in that price range of lucky gamers, don't ditch an 8800gts 512, a pair of 8800gt's in SLI, a 9800gx2 or even a 3870x2 just for one or two of these. It's just not worth it.

The GTX280 is as much of a refresh as the 4xxx series, leaving out Folding @ Home stuff. I used to hear Nvidia fans say that AVIVO or image quality didn't matter all that much, that it's about games and Nvidia delivers the framerates. Now we see an overpriced high end Nvidia card that relies upon something other than gaming to sell, because it loses to the last gen Nvidia dual GPU card, and the last gen ATI dual GPU card is often in third place right behind the costly GTX280.

Get the GTX260 instead, if you like Nvidia, or the 4870 if you don't. We'll see if the later versions of the GTX280 bring yields that make a difference after a die shrink. I'll be interested in seeing how the 4870x2 performs, as that's in my price range later in the year.

If I hear that Nvidia tells people that drivers will improve GTX280 performance, or that the later die shrink of the GTX280 will change things, then I'll really LOL. It will be so much like what we've heard from ATI regarding improved Radeon performance or AMD regarding Phenom.

June 18, 2008 2:22:48 AM

So why not just get 2 8800GTS G92's and save the money... an 8800GTS G92 is a 9800GTX... oh wait, a tad bit less, sorry.
June 18, 2008 2:36:23 AM

yipsl said:
The GTX280 is as much of a refresh as the 4xxx series, leaving out Folding @ Home stuff.

G92, G80 and derivatives back to the 8400 can fold too, so that is another refreshed feature :lol: 
June 18, 2008 2:36:57 AM

I put a 9800GTX in my wife's system, which I built from the ground up last week. I set the fan speed to 75% with RivaTuner and it's not really that loud; I was expecting louder. It also runs cooler at idle than my 8800GT with a DuoOrb on it: her 9800GTX idles at around 38-41C but hits 60C in AoC, while my 8800GT never tops 52C. But then again, the 9800 is stronger than the GT, barely.
June 18, 2008 2:55:07 AM

randomizer said:
G92, G80 and derivatives back to the 8400 can fold too, so that is another refreshed feature :lol: 


Okay, I thought that the CPU did Folding @ Home on earlier generations. My mistake. I do think that the GTX260 is the better choice in the new Nvidia generation, and that anyone with a 9800gx2 or 3870x2 doesn't need to upgrade unless their favored game doesn't support SLI or Crossfire.

Overall, the GTX280 is looking to be as disappointing as the 2900XT. People should wait for the die shrink; otherwise they're buying into the marketing.
June 18, 2008 3:33:39 AM

Right now you need to edit an INF file to get the CUDA drivers to work with pre-GTX2x0 cards, but they do work. Once Nvidia releases a driver that natively installs for all cards, things will run a lot smoother. The same thing happened with the 9x00 cards when they were released, so give it a few weeks and universal drivers should start showing up.
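For anyone curious, the edit is just a matter of adding your card's PCI device ID to the device-install section of the driver's .inf before you install it. Roughly something like the lines below (purely an illustration; the section name and the 8800 GT device ID are examples, so pull your card's real ID from Device Manager > Details > Hardware Ids and match the section names actually present in the .inf that ships with the driver):

    ; add a line for your card under the driver's device-install section, e.g.
    [NVIDIA_SetA_Devices.NTx86.6.0]
    %NVIDIA_DEV.0611.01% = Section001, PCI\VEN_10DE&DEV_0611

    ; ...and give that device ID a display name under [Strings]
    [Strings]
    NVIDIA_DEV.0611.01 = "NVIDIA GeForce 8800 GT"

After that the CUDA-capable driver should offer to install on the older card; no guarantees the edit survives a driver update, so keep the modified .inf around.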
June 18, 2008 5:17:29 AM

Yes, drivers are pretty much the only reason why the card is testing like this.
June 18, 2008 5:20:34 AM

I doubt drivers will help it much. Drivers can make a decent performance difference if they really, really suck to start with, but they can't make up for the price difference.
June 18, 2008 5:26:02 AM

I was expecting better numbers from the GTX 280; it's not a new architecture, just more shaders and ROPs with more transistors. I am waiting for the card that at the very least doubles the 8800 GTX 768MB card.
June 18, 2008 5:37:11 AM

Keep on waiting. The 8800GTX is still king of the bang-for-buck hill if you bought it in '06 or early '07.
June 18, 2008 6:21:06 AM

Maybe ATI has some magic 2900 drivers left around that they haven't used... just kidding. The refresh will be the gamers' card; this card has all the extras in it so nVidia can start in the direction they need to go. When the apps show up that use these so-called wasted transistors, heheh, then it'll start showing its worth. If anyone's ever tried encoding a Blu-ray or any video, this is going to help immensely, plus the GPGPU. A product like this is set to last, and I believe it will, though the refresh is going to be better.
June 18, 2008 6:58:30 AM

I just don't get some of you guys. I think people are spoiled from the 8 series, to be honest, and now if nvidia can't do an 8-series leap every year people complain. Another thing that drives me nuts (because I hear it constantly) is when people say, "Well, why not just get 2 of these in SLI since it's cheaper and will beat X or Y's single card solution?"
Enough of that! Not everybody wants SLI. There are more than a few things about SLI that I personally don't like, not least that you need to buy a more expensive mobo (and need more case space), buy better or larger cooling/case/fans, eat vid RAM, need a bigger & better PSU, deal with dual furnaces, deal with twice the noise from fans, and then of course other things about SLI itself that I just don't like. And on top of all that, I'm not a fan of all that extra power consumption, for various reasons.
Point is, some people don't want SLI or don't have the rig for it, even if you can get more performance by running 2 of these than 1 of those. I would rather have the latest and greatest single card solution and be very happy with it, rather than deal with SLI, where oftentimes you are running older technology unless you are running the 200 line in SLI.

Then there are these complaints about how the GX2 beats the 280 in some benchmarks. However, actually look at those benchmarks and you'll see nothing surprising. First off, the GX2 is a hellish beast of a card (single-card SLI), and is also an expensive card in its own right (one of the better GX2's goes for $570, just $80 less than the 280). It was also the latest and greatest card on the market, at least until today. So why is anyone surprised or upset that the 280 doesn't somehow blow it out of the water in all circumstances? The 280 is no doubt still the better card. More importantly, those GX2's, whether you have them in SLI or not, will only net you 512MB of vid RAM.

And we're all seeing how RAM requirements are growing fast in the latest games; with the hardware people have, 512MB just won't cut it for long, especially for those of us who love AA & AF loaded up on games.
And the benchmarks showed that. Sure, if you turn off AA & AF, the GX2 is going to give the 280 a run for its money in some cases. But who the hell is going to buy a 280 and not crank up some AA & AF? Or who would buy a GX2 and not want to do that? If you don't care about AA & AF, then you can get a cheaper card than the 280 or GX2 and still be in great shape for any game, unless you run at mega high resolutions.

So, needless to say, once I saw the benchmarks with AA & AF cranked up, the 280 and its better memory specs (among other things) really started to shine through and leave everything else in the dust. Maybe it isn't another 8 series, but seriously, we can't be expecting that kind of leap all the time. Come back to reality. The truth is that the 280 comes stocked with some great specs that should keep owners happy for quite some time. As for the price, I think $650 is fine, especially when you consider what the going prices have been for the GX2's and GTX/Ultras of the world, and considering how long the latter have been out on the market (a year and a half). Spend $100 or $200 more and you get the latest card with notably better performance. What's wrong with that?

If you can't afford the 280, the 260 is a superb second-best card, much more affordable, and you will still kill games. Or try the new ATI card. Better yet, grab a GTX/Ultra, as they will be even more affordable, and you'll find tons of them used now with the 280/260 release.
Point is, people just shouldn't be complaining about the state of graphics cards, the prices, or anything. We've got it good as far as GPUs and all the great choices. It's not like you need a 280/260 to play ANY game out there, including Crysis. I've been playing Crysis at 1680x1050 on High settings with 4xAA on both my 8800GTS and GTX!

As for me, I preordered the eVGA GTX 280 SSC. What a beast! And since I'm a fan of cranking up AA & AF, along with maxing out settings, I'll be plenty happy with this single card solution and I consider it money well spent. I know the price will drop soon enough, but I've never bought a flagship card (or been able to afford one) on launch day before. Just being able to do that is a treat for me.
And also, let's not forget another thing this card has over all others: PhysX.
So in the very near future we will be seeing games that showcase PhysX and the 200 series in general; some games will be built with this series of cards in mind, as we have seen done in the past to some extent with other series. Definitely expect some optimizations in upcoming games that really accommodate this GPU. Oh, and let's not forget that this card is being released today, with baby infant drivers. We all know these drivers will mature and help performance to some extent. So if any card, despite TODAY's benchmarks, has the most potential, the most room to grow, it is far and away the 280/260. As time goes on I think we'll see these cards pull away even more in benchmarks, if not in synthetic ones then at least in real-life gaming.
June 18, 2008 7:21:15 AM

Pretty much sums it up. The pricing will come down, especially if the 4870 does well.
June 18, 2008 8:50:10 AM

I think that the GTX280 is a victim of its own hype... people expected that it would beat the 9800gx2, and rightly so: it has almost twice the shaders, twice the memory, twice the bandwidth, etc. Add to that the general inefficiencies of SLI, and I expected that it would beat it; not by a huge margin, but still beat it.

The performance alone wouldn't be an issue if it was reasonably priced and had decent power requirements, fan noise, etc. Another thing that irks me is the lack of progress/innovation with this card. Nvidia have taken the same tech (65nm/GDDR3) and just crammed in more of everything. The result is that it consumes a lot of power, needs noisy cooling, and still can't achieve the same clock speeds as the previous generation; clock speeds which, if they could have been reached, would have made the 280 a real winner (imagine those 240 shaders running at 1625MHz or higher).
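For what it's worth, here's the back-of-the-envelope on that wish (a rough sketch in Python; the 3 FLOPs per SP per clock figure is simply how Nvidia counts GT200's peak throughput, and 1296MHz is the GTX280's stock shader clock):

    # peak shader throughput, counted the way Nvidia quotes it (dual-issue MAD + MUL per SP per clock)
    sps = 240                    # stream processors on the GTX280
    flops_per_clock = 3          # MAD (2 FLOPs) + MUL (1 FLOP)
    stock_clock_ghz = 1.296      # GTX280 stock shader clock
    wished_clock_ghz = 1.625     # the clock mused about above

    print(sps * flops_per_clock * stock_clock_ghz)    # ~933 GFLOPS at stock
    print(sps * flops_per_clock * wished_clock_ghz)   # ~1170 GFLOPS at 1625MHz

So those wished-for clocks would be worth roughly another 25% of raw shader throughput, on paper at least.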

When it has fallen in price a little and moved to a 55nm process, Nvidia should be able to reach higher clocks and this card will be a real winner... that is, so long as the 4870x2 doesn't eclipse it.
June 18, 2008 9:55:28 AM

Robx24, we can see that you're a fan of nVidia, having just bought a GTX280, and I envy you! If you have that much money to spend on a card then it is a sound choice, as the fact is it is the fastest card on the market today. However, the 9800GX2 is much better bang for your buck. I don't know what you're talking about with $570; I've found one for $425. Now that is over $200 cheaper than the GTX280, and you think that the, on average, 6% gain from a GTX280 over a GX2 is worth that $200? No one in their right mind would say that.
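Just to put those numbers side by side, a quick back-of-the-envelope (a rough Python sketch; the $425, $650 and ~6% figures are the ones quoted in this thread, not independent measurements):

    # cost vs. performance using the prices quoted above
    price_gx2 = 425.0      # 9800GX2 deal mentioned in this post
    price_gtx280 = 650.0   # GTX280 launch price cited earlier in the thread
    perf_gain = 0.06       # average GTX280 advantage over the GX2, per the reviews cited here

    extra_cost = price_gtx280 / price_gx2 - 1                      # ~0.53
    cost_per_perf = (price_gtx280 / (1 + perf_gain)) / price_gx2   # ~1.44

    print(f"{extra_cost:.0%} more money for {perf_gain:.0%} more performance")
    print(f"~{cost_per_perf:.2f}x the price per unit of performance")

That works out to roughly 53% more money for 6% more performance, which is the point being made here.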
June 18, 2008 11:56:26 AM

I think we can all suspect who Robx24 may be...
June 18, 2008 12:53:15 PM

Screen size determines whether you will be using SLI: if you want all the bells & whistles at 1920x1200, you've got to go SLI (excluding the 9800GX2). A single card solution fits the bill if you plan on, or have, a 19-22 inch monitor. When I finally do go 1920x1200, I will have to consider SLI as an option if I want all the bells & whistles at that resolution.
June 18, 2008 2:07:00 PM

gadgetnerd said:
Yes, drivers are pretty much the only reason why the card is testing like this.


Do you really believe that? I'll admit that both ATI and Nvidia cards improve somewhat from better driver revisions, but when ATI fans say that the wrong or immature drivers used by benchmarking sites are the reason their cards don't do as well, Nvidia fans mock that argument big time.

Drivers only improve things a bit; the underlying architecture and design choices affect things more. ATI goes for image quality over sheer frame rates, whereas Nvidia has fudged image quality in favor of sheer frame rates. While that makes a difference in a few games like Crysis (i.e. 36 fps over 26 fps is important), it's not as important when the difference is 50 fps vs. 40, or 60 fps vs. 70. Yet people prefer monster overheating GPUs like Nvidia's, or inelegant, poorly designed dual cards like the 9800gx2.

That's why drivers fudging anything but sheer frame rates are important upon release for Nvidia cards. A high-end win (even if just by a nose) influences the buying decisions of gamers down the line, from ultra high end to mainstream. It's good marketing, just like the semi-ethical TWIMTBP program (the Assassin's Creed fiasco brought Nvidia into really shady territory, more so than the Crysis demo water incident).

Lest I sound like sour grapes, I'll simply say that almost all is fair in business competition. If company A can convince people to buy their products even when they aren't as good as the competition, then that's okay by me. I don't want ATI to survive just to lower the price of Nvidia cards; I tried Nvidia chipsets and cards in the 405 and 7xxx generation and found them wanting, and I want ATI to survive (even if AMD doesn't) because I prefer their approach to both games and video.

I just think that anyone wanting a new generation Nvidia card should either wait for the die shrink to see if the GTX280 improves, or should get a GTX260; but even the 8800gt in SLI is a better option than Huang's latest $650 monster.

robx46 said:

Enough of that! Not everybody wants SLI. There are more than a few things about SLI that I personally don't like, not least that you need to buy a more expensive mobo (and need more case space), buy better or larger cooling/case/fans, eat vid RAM, need a bigger & better PSU, deal with dual furnaces, deal with twice the noise from fans, and then of course other things about SLI itself that I just don't like. And on top of all that, I'm not a fan of all that extra power consumption, for various reasons.
Point is, some people don't want SLI or don't have the rig for it, even if you can get more performance by running 2 of these than 1 of those. I would rather have the latest and greatest single card solution and be very happy with it, rather than deal with SLI, where oftentimes you are running older technology unless you are running the 200 line in SLI.

Then there are these complaints about how the GX2 beats the 280 in some benchmarks. However, actually look at those benchmarks and you'll see nothing surprising. First off, the GX2 is a hellish beast of a card (single-card SLI), and is also an expensive card in its own right (one of the better GX2's goes for $570, just $80 less than the 280). It was also the latest and greatest card on the market, at least until today. So why is anyone surprised or upset that the 280 doesn't somehow blow it out of the water in all circumstances? The 280 is no doubt still the better card. More importantly, those GX2's, whether you have them in SLI or not, will only net you 512MB of vid RAM.



I don't expect Nvidia or ATI to double performance every year. That's ridiculous, but I expect marketing to be realistic and not shady. Nvidia promised as much from the GTX280 as ATI did with the 2900XT. We learned that the 2900XT was a competitor to the 8800gts 320 and not the gts 640 or gtx Ultra. I doubt the GTX280 will be a competitor to the 4870x2 or 9800gx2, for that matter.

SLI and Crossfire setups are a costly affair, which is why dual GPU cards (which should all be on one PCB; why did Nvidia fudge that?) are the future. In fact, multi-core GPUs are the further future. Huang needs to recognize this. IMHO, that man needs more criticism from stockholders and customers than Ruiz on his worst day. Nvidia's on top financially, but his rants against Intel and Nvidia's forcing Ubisoft to drop DX10.1 from Assassin's Creed via a patch show that they have leadership issues.

Almost all is fair, but not everything is fair; else Intel would not be embroiled in investigations on several continents. Right now, Nvidia's not only picking a needless fight with Intel that it might lose by 2010, but it's doing what in the graphics industry is almost like an OEM rebate program -- if TWIMTBP marketing saves cash for developers and they are "encouraged" not to support (or, worse, to remove actual support for) features found only on ATI cards, then that's worthy of an investigation by regulators.
June 18, 2008 2:45:49 PM

yipsl said:
Do you really believe that? I'll admit that both ATI and Nvidia cards improve somewhat from better driver revisions, but when ATI fans say that the wrong or immature drivers used by benchmarking sites are the reason their cards don't do as well, Nvidia fans mock that argument big time.

Drivers only improve things a bit; the underlying architecture and design choices affect things more. ATI goes for image quality over sheer frame rates, whereas Nvidia has fudged image quality in favor of sheer frame rates. While that makes a difference in a few games like Crysis (i.e. 36 fps over 26 fps is important), it's not as important when the difference is 50 fps vs. 40, or 60 fps vs. 70. Yet people prefer monster overheating GPUs like Nvidia's, or inelegant, poorly designed dual cards like the 9800gx2.

That's why drivers fudging anything but sheer frame rates are important upon release for Nvidia cards. A high-end win (even if just by a nose) influences the buying decisions of gamers down the line, from ultra high end to mainstream. It's good marketing, just like the semi-ethical TWIMTBP program (the Assassin's Creed fiasco brought Nvidia into really shady territory, more so than the Crysis demo water incident).

Lest I sound like sour grapes, I'll simply say that almost all is fair in business competition. If company A can convince people to buy their products even when they aren't as good as the competition, then that's okay by me. I don't want ATI to survive just to lower the price of Nvidia cards; I tried Nvidia chipsets and cards in the 405 and 7xxx generation and found them wanting, and I want ATI to survive (even if AMD doesn't) because I prefer their approach to both games and video.

I just think that anyone wanting a new generation Nvidia card should either wait for the die shrink to see if the GTX280 improves, or should get a GTX260; but even the 8800gt in SLI is a better option than Huang's latest $650 monster.



I don't expect Nvidia or ATI to double performance every year. That's ridiculous, but I expect marketing to be realistic and not shady. Nvidia promised as much from the GTX280 as ATI did with the 2900XT. We learned that the 2900XT was a competitor to the 8800gts 320 and not the gts 640 or gtx Ultra. I doubt the GTX280 will be a competitor to the 4870x2 or 9800gx2, for that matter.

SLI and Crossfire setups are a costly affair, which is why dual GPU cards (which should all be on one PCB; why did Nvidia fudge that?) are the future. In fact, multi-core GPUs are the further future. Huang needs to recognize this. IMHO, that man needs more criticism from stockholders and customers than Ruiz on his worst day. Nvidia's on top financially, but his rants against Intel and Nvidia's forcing Ubisoft to drop DX10.1 from Assassin's Creed via a patch show that they have leadership issues.

Almost all is fair, but not everything is fair; else Intel would not be embroiled in investigations on several continents. Right now, Nvidia's not only picking a needless fight with Intel that it might lose by 2010, but it's doing what in the graphics industry is almost like an OEM rebate program -- if TWIMTBP marketing saves cash for developers and they are "encouraged" not to support (or, worse, to remove actual support for) features found only on ATI cards, then that's worthy of an investigation by regulators.


this guy knows what he's talking about
words of wisdom
June 18, 2008 8:28:45 PM

It still amazes me that people on this forum will boast and praise a product as though it were their own, and nothing could be wrong with it.


@ Gadgetnerd: Deal with it already, the GTX 280 is a huge letdown. It makes no sense to buy it when there are so many cheaper solutions that perform the same. I am getting tired of scrolling through the graphics card section of the forums and seeing you've started five new threads about how great the GTX 280 supposedly is. God damn, stop fixating!

June 18, 2008 8:36:41 PM

The_Abyss said:
I think we can all suspect who Robx24 may be...

You're not thinking THAT, are you? :o  :o  :o  A man's innocent till proven guilty.
June 19, 2008 1:22:43 AM

I see no point in arguing about launch prices when most of you won't be buying the cards for at least a month or two anyway (except you, rob :D ). I don't intend to buy another card to replace my 9600GT until I'm going to see a significant benefit. While I can still play at 1680x1050, with or without AA, I see no benefit in shelling out money. Being able to play Crysis at 80 FPS is not something that motivates me to waste money, because the game isn't worth spending more money on unless other games will benefit significantly.
June 19, 2008 1:00:02 PM

Basically, the GTX280 is what the 7900GTX was to the 7800GTX: a little bump in performance, nothing more. Nvidia seems to have a track record of a new architecture every four years; look at the 7800GTX when it came out. Two years later the 7900GTX was released, then two more years and the legendary 8800GTX popped its popular head out. Since the original G80 (8800GTX) and the G92 (an improved G80) are pretty much the same architecture, the next great card from Nvidia will be a completely new architecture in two more years.

Nvidia needs to make money (profit) off the G80 product before releasing a new breed; only then will they give the go-ahead to release something completely new. I don't expect the next great card from Nvidia for another two years, if they follow their trend.