
ATI or NVIDIA?

Tags:
  • Graphics Cards
  • Chipsets
  • Nvidia
  • Graphics
  • Product
August 31, 2008 3:41:55 AM

God I hate competition; why can't some all-powerful company simply make everything and tell us exactly what to buy? Just kidding. It's nice to see some competition between the card manufacturers again, but it leaves a consumer with a tricky decision. I know the mainstream choice is to take the NVIDIA route and get a 260 or 280, but that brings up the more important choice of motherboard chipset. While NVIDIA and their SLI setup seem to be more mainstream, from my understanding their motherboard chipsets run hot and aren't as good as the ones offered by Intel. ATI, on the other hand, with the 4870, offers even better performance and runs on the Intel chipsets.

From my understanding, NVIDIA is going to stop manufacturing its chipsets in the near future, which makes it likely that future graphics solutions either won't be SLI or will at least work on Intel chipsets. With this in mind, should I go for a P45 motherboard and some brand of ATI card for my new system? That's the way I'm leaning, for instance toward a system with a Radeon 4870 or something of the like, since it seems to be the better long-term system. So what's everyone's input, what's the better long-term solution?


August 31, 2008 3:50:14 AM

Nvidia finally buckling under the mounting pressure from AMD/ATI and Intel has forced them to give SLI to Intel's X58 chipset (with strings attached for the motherboard makers), so that is a strong, if early, clue that they are getting out of the chipset business, and personally I think they need to. The 600i and 700i series are quite bad overall, with questionable overclocking, really high vdroop, and hot northbridges. Many people have had no problems, and personally I've worked with a 750i for my wife's PC; it took about 4 days to get it working correctly, but it's been solid since.

Anyway, back on topic. You should probably take a look at the P45 or the X38/X48 boards. DFI has a very good X48 board for $220, and a very good X38 for $180: http://www.newegg.com/Product/Product.aspx?Item=N82E168... . I've pondered both of these boards myself. If you want Crossfire at full x16/x16 then go for these; if not, look at the P45s.

As far as the video card goes, it's no secret the ATI 4800 cards have been in the spotlight for quite a while now. But honestly, the best buy on the market right now is probably the 9800GX2. It hovers around $270 right now, roughly $100 more than the 4870, but it handles it easily, although it runs hot and consumes far more power.
August 31, 2008 3:56:39 AM

There is a large chance of Nvidia leaving the chipset market, simply because Intel's new X58 chipset already has SLI functionality built in, rendering Nvidia's chipsets (750i, 780i, 790i) redundant. So at the moment, if you are getting a GPU, I would make it Nvidia, so that next year with Nehalem you can buy a new mobo and CPU and SLI the GPU.
August 31, 2008 3:59:13 AM

The most cost-effective option is to get a P43/P45 (starting as low as $75) with a 9800gx2.

The P43/P45 boards cost less than the 750i/780i while performing far better, and cost half the price of the X38/X48 while performing on par. The only disadvantage is the lack of cf/sli support.
http://www.newegg.com/Product/ProductList.aspx?Submit=E...

The 9800gx2, at its current price of $275 (or $244 after MIR), costs far less than the gtx280 ($420) while outperforming it, and costs less than the 4870 ($280) while far outperforming it.
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
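Just to make the price/performance argument explicit, here is a minimal back-of-the-envelope sketch of that math. The prices are the ones quoted above; the relative performance numbers are purely illustrative placeholders, not benchmark results, so swap in real figures from the reviews linked below before drawing any conclusions.

# Price/performance comparison sketch. Prices are from this thread;
# the "relative_perf" values are illustrative assumptions, NOT benchmarks.
cards = {
    "9800gx2": {"price": 275, "relative_perf": 1.00},  # treated as the baseline
    "gtx280":  {"price": 420, "relative_perf": 0.95},  # assumed, for illustration
    "hd4870":  {"price": 280, "relative_perf": 0.85},  # assumed, for illustration
}

for name, c in cards.items():
    # Higher perf-per-dollar means better value under these assumptions.
    print(f"{name}: ${c['price']}, perf per dollar = {c['relative_perf'] / c['price']:.4f}")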

There's a misconception floating around that the 9800gx2 performs the same as the gtx280. That's not true; the performance gap between them is significant.
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=13
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=14
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=15
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=16
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=17
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=18
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=19
August 31, 2008 5:02:25 AM

Hmm, that's an interesting idea: get a nice NVIDIA card now, then upgrade my system in 6 months or so when the new chipsets and processors come out. However, that raises a question: would the current generation of SLI cards be supported on this new Intel chipset?

Whether going with ATI or NVIDIA, it seems the next generation of chipsets will support both. So in the short term, what do you all think is better: going with an ATI 4870 or an NVIDIA 280?
August 31, 2008 5:10:29 AM

Chipset does not matter when it comes to single-card solutions, and there is no new interface with X58, so yes, it will work. Also be aware that the X58 will not support SLI on every board; it's up to the board manufacturers (ASUS, MSI, Gigabyte...) to enable it, by PAYING Nvidia on what seems to be a "commission" basis, so expect SLI to only be on the higher-end boards.
August 31, 2008 2:48:48 PM

spathotan said:
Chipset does not matter when it comes to single-card solutions, and there is no new interface with X58, so yes, it will work. Also be aware that the X58 will not support SLI on every board; it's up to the board manufacturers (ASUS, MSI, Gigabyte...) to enable it, by PAYING Nvidia on what seems to be a "commission" basis, so expect SLI to only be on the higher-end boards.


SLI will not be supported natively, but rather through a bridge chip. It's not so much a commission system. Technically, the chip isn't necessary, since the motherboard chipset can support SLI just fine alongside cf, but Nvidia requires the purchase of their severely overpriced bridge chip instead, as part of a deal with Intel. Many people think Intel should have pushed harder. The same bridge chip is already being used in the ultra-high-end dual-CPU Intel Skulltrail motherboard, which supports both cf and sli. Basically, expect sli-supporting x58 boards to cost significantly more. Just go with ATI if you want dual cards on x58.

As for "gtx280 or 4870," forget both unless you're shooting for ultra high end graphics. 9800gx2 blows them both away for less. It's the second fastest single card at this point. The 4870x2 is faster, in first place, but cost over twice as much. Unless you go for 4870x2, get 9800gx2, forget anything in between that cost more but performs less.
August 31, 2008 3:03:44 PM

heavymetalsmith said:
Hmm, that's an interesting idea: get a nice NVIDIA card now, then upgrade my system in 6 months or so when the new chipsets and processors come out. However, that raises a question: would the current generation of SLI cards be supported on this new Intel chipset?

Whether going with ATI or NVIDIA, it seems the next generation of chipsets will support both. So in the short term, what do you all think is better: going with an ATI 4870 or an NVIDIA 280?


From what I have read, the new chips and chipsets should be released in the next couple of months, but it will only be the Extreme CPUs. The mainstream Nehalem chips won't be available until late spring or early summer next year, so you're looking at 8-10 months before you will be able to get them.

August 31, 2008 3:42:28 PM

dagger said:
SLI will not be supported natively, but rather through a bridge chip. It's not so much a commission system. Technically, the chip isn't necessary, since the motherboard chipset can support SLI just fine alongside cf, but Nvidia requires the purchase of their severely overpriced bridge chip instead, as part of a deal with Intel. Many people think Intel should have pushed harder. The same bridge chip is already being used in the ultra-high-end dual-CPU Intel Skulltrail motherboard, which supports both cf and sli. Basically, expect sli-supporting x58 boards to cost significantly more. Just go with ATI if you want dual cards on x58.

As for "gtx280 or 4870," forget both unless you're shooting for ultra-high-end graphics. The 9800gx2 blows them both away for less. It's the second-fastest single card at this point. The 4870x2 is faster, in first place, but costs over twice as much. Unless you go for the 4870x2, get the 9800gx2; forget anything in between that costs more but performs worse.



Keep up with the news: the Intel X58 chipset is getting SLI without the bridge chip, through software, and that makes me think we will get it on the P45 and X48 chipsets with a wee bit of modding.

And don't go for an Nvidia card; they don't do DX10.1, and ATI cards are going to shine when the DX10.1 games come out, very shortly.
August 31, 2008 4:28:25 PM

rangers said:

And don't go for an Nvidia card; they don't do DX10.1, and ATI cards are going to shine when the DX10.1 games come out, very shortly.


Don't count on it. They've been saying that for ages, and it hasn't happened yet, aside from Assassin's Creed, in the form of a patch. Speaking of which, games haven't even truly taken advantage of dx10 yet. Game developers don't just jump on the bandwagon; they have programming complexity and cost considerations. It'll take a while. :p 
August 31, 2008 4:40:48 PM

dagger said:
Speaking of which, games haven't even truly taken advantage of dx10 yet. :p 



DX10.1 makes games go faster, so to say they haven't truly taken advantage of DX10 yet is just not thought out.
August 31, 2008 4:53:53 PM

If developers make their games run faster for only half their customers, the game might gain that reputation and the other half may be turned off from buying it. Besides, it costs extra money and manpower to develop. It's just not in their interest.
August 31, 2008 9:34:30 PM

Okay, well, I made a decision: the card I will go with is an ATI Radeon 4870, just one for now, and about 6 months from now or so I'll upgrade to two. The question I have is, can anyone recommend a good system to go along with this card? I am definitely looking to go Crossfire in a few months with a twin 4870 setup. The big question is whether anyone can recommend a processor/mobo/memory combination that won't bottleneck this future Crossfire. Also, is it better to get one of these dual-GPU single cards or to get two single-GPU cards?
August 31, 2008 10:20:16 PM

Now this is just bothersome. Okay, back to the drawing board; I'm going to keep looking between the GTX 280 and the 4870. I know it will be one or the other, it's just hard to draw a conclusion at the moment; they both seem to have their benefits and drawbacks.
August 31, 2008 10:27:28 PM

heavymetalsmith said:
Now this is just bothersome. Okay, back to the drawing board; I'm going to keep looking between the GTX 280 and the 4870. I know it will be one or the other, it's just hard to draw a conclusion at the moment; they both seem to have their benefits and drawbacks.


Between the gtx280 and 4870? Isn't 4870 vs gtx260 more fitting, since they cost the same? The gtx280 costs $120 more; for most people that's an important factor. :p 

Well, I guess you're different, otherwise you'd have picked the 9800gx2, as it costs the least while performing the best among the 3.
August 31, 2008 10:42:21 PM

dagger said:
Between the gtx280 and 4870? Isn't 4870 vs gtx260 more fitting, since they cost the same? The gtx280 costs $120 more; for most people that's an important factor. :p 

Well, I guess you're different, otherwise you'd have picked the 9800gx2, as it costs the least while performing the best among the 3.


So, basically the 9800GX2 is kinda the BEST thing you can get ($/perf wise) right now, right?

I'm a little confused about the defective chips in nVidia's cards, and you seem to know quite a bit. Do they affect the 9800s (GT, GTX, GTX+ and GX2), and if they do, how do they affect them?

Anyhow, I'd say the DX10.1 on the ATI cards is not a minor issue, like you said in the other topic. Developers are actually 'watching' DX10.1 and might develop some titles for it before DX11. Assassin's Creed's step forward might be an example of that.

I'd say sooner rather than later we'll see some DX10.1 titles showing off the performance increase over DX10. Or at least, for the HD2xxx's, HD38xx's and HD48xx's sake, they'd better, lol.

This is some sort of bet, I might add, but an interesting one. I'd bet on DX10.1 -> DX11 rather than DX10 now and a jump to DX11 hardware.

Also, there's the release of the +'s from nVidia in the near future (Q1-2009?).

Argh, so many choices! >_<

Esop!
August 31, 2008 10:43:11 PM

dagger said:
Between the gtx280 and 4870? Isn't 4870 vs gtx260 more fitting, since they cost the same? The gtx280 costs $120 more; for most people that's an important factor. :p 

Well, I guess you're different, otherwise you'd have picked the 9800gx2, as it costs the least while performing the best among the 3.



Buying the 9800gx2 is like buying a time bomb.
August 31, 2008 10:46:33 PM

Okay, this is where I am very confused: the GX2 is a dual-GPU single card, right? And while it outperforms a single 4870 or 280, it doesn't outperform two of them. So in essence, getting one 4870 or 280 now and a second one half a year from now makes more sense than getting a second GX2, since running in quad-GPU mode is pointless from what I have seen. So yes, on price/performance a 9800GX2 makes more sense, but not over a two-year-plus upgrade model. So really it comes down to choosing an ATI 4870 or an NVIDIA 280. It's a hard choice to make; they both have their pluses and minuses.
August 31, 2008 10:48:32 PM

And yes, it's more like between the 260 and the 4870 because of the price comparison, but for me the extra cost is not as much of an issue as the future performance.
August 31, 2008 11:21:50 PM

heavymetalsmith said:
Okay, this is where I am very confused: the GX2 is a dual-GPU single card, right? And while it outperforms a single 4870 or 280, it doesn't outperform two of them. So in essence, getting one 4870 or 280 now and a second one half a year from now makes more sense than getting a second GX2, since running in quad-GPU mode is pointless from what I have seen. So yes, on price/performance a 9800GX2 makes more sense, but not over a two-year-plus upgrade model. So really it comes down to choosing an ATI 4870 or an NVIDIA 280. It's a hard choice to make; they both have their pluses and minuses.


As for 4870x2 or 4870 cf outperforming 9800gx2 quad sli, that's not actually true. See benchmarks:
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-...
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-...
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-...
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-...
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-...

Performance was quite close, but the only case where 4870x2/4870 cf outperformed 9800gx2 quad sli is in Half Life 2. 9800gx2 is faster in all the remaining tests. The reason for this is simple: while quad sli scales horribly, the gap between a single 4870 and the 9800gx2 is so big to begin with that it can't be erased just by quad sli/cf's bad scaling. Of course, the 4870 still closed the gap A LOT.

In this case, the quad sli/cf handicap just isn't enough to counter the raw power. Two 4870s or one 4870x2 still cost more and perform worse than two cheaper 9800gx2s.
September 1, 2008 8:27:33 AM

dagger said:

Performance was quite close, but the only case where 4870x2/4870 cf outperformed 9800gx2 quad sli is in Half Life 2. 9800gx2 is faster in all the remaining tests.


No it's not!
Once again you don't bother to actually look at the benchmarks in depth.
QuadSLI GX2 is faster at the lower resolution; at the higher resolution the X2 in CF beats it. So the GX2 gets you one or two frames in the easier mode, but move to the higher resolution, where these aren't CPU bound, and the X2 CF is ahead.
And once again you pull out DX9 Crysis results. I don't know why you or reviewers even bother with those. It's like running games in 16-bit mode to show higher fps: pointless with setups that can handle full DX10 mode, or at least the GTX and HD4K can handle full DX10 Very High; the GX2, not so much.

And many of these posts you're putting up counter your previous GX2 vs GTX280 argument, where HL2, UT3, and Quake Wars all have the GTX well out in front, and in the DX9 Crysis benchmarks both the GTX280 SLI and even GTX260 SLI well outperform the GX2 Quad SLI.

Next time actually look at what you're posting.

Then go beyond the 1920x1200 range and the GX2 falls off the end of the earth, running out of VRAM, making quad SLI very limited for it.

Seriously you try to pimp the hell out of the GX2 like you have stock in them. :heink: 

September 1, 2008 4:21:46 PM

TheGreatGrapeApe said:
No it's not!
Once again you don't bother to actually look at the benchmarks in depth.
QuadSLI GX2 is faster at the lower resolution; at the higher resolution the X2 in CF beats it. So the GX2 gets you one or two frames in the easier mode, but move to the higher resolution, where these aren't CPU bound, and the X2 CF is ahead.
And once again you pull out DX9 Crysis results. I don't know why you or reviewers even bother with those. It's like running games in 16-bit mode to show higher fps: pointless with setups that can handle full DX10 mode, or at least the GTX and HD4K can handle full DX10 Very High; the GX2, not so much.

And many of these posts you're putting up counter your previous GX2 vs GTX280 argument, where HL2, UT3, and Quake Wars all have the GTX well out in front, and in the DX9 Crysis benchmarks both the GTX280 SLI and even GTX260 SLI well outperform the GX2 Quad SLI.

Next time actually look at what you're posting.

Then go beyond the 1920x1200 range and the GX2 falls off the end of the earth, running out of VRAM, making quad SLI very limited for it.

Seriously you try to pimp the hell out of the GX2 like you have stock in them. :heink: 


No offense, but you're the one that hasn't looked at the benchmarks, not me.

Look closer: the benchmarks include 2 bars for each game. The higher one is for 1920x1200, the lower one for 2560x1600. In each and every case other than Half Life 2, 9800gx2 quad performs higher than 4870x2/4870 cf at both resolutions respectively. I'm not sure where "Then go beyond the 1920x1200 range and the GX2 falls off the end of the earth, running out of VRAM, making quad SLI very limited for it" comes from. While you're certainly right that the lower vram will tank performance for the 9800gx2 at some point, 2560x1600 just isn't quite there yet, not to mention the smaller 1920x1200. While there are resolutions larger than 2560x1600, it's safe to say most monitors can't handle those. The 2560x1600 resolution is hardly "cpu bound," especially since their test rig uses an "extreme" quad cpu. And those games do not run in 16-bit mode.

As for dx9 vs dx10, honestly, I hadn't noticed that the benchmarks are dx9 and not dx10. They didn't say on the test pages. Side-by-side comparisons of 4870x2 vs 9800gx2 quad sli are extremely rare. They're hard enough to find to begin with, so I was happy to come across one at all, dx9 or dx10. As for your claim that the 4870x2 will outperform 9800gx2 quad sli, I believe you. You're a mod and have been here for a long time; your word is good enough for me.

But to say that dx9 benchmarks are somehow invalid is just an elitist viewpoint. Most benchmarkers still use either dx9 or a mixture of dx9 and dx10 instead of dx10 only. Most games still use either dx9 or a dx9/dx10 dual mode; no game in existence so far uses dx10 only. The majority of gamers also still run on dx9. To say that the only side-by-side benchmark I could find that includes this rare setup is invalid because it happens to use dx9 is like saying that unless you drive a Mercedes, you're not a driver. It's unfair. Nothing wrong with the still-mainstream dx9.

Just as a side note, where in the benchmark did they actually say the tests are done in dx9 and not dx10? I reread it and wasn't able to catch that part. They did say they use Vista, and the 3DMark Vantage test is certainly done in dx10, since it doesn't run in dx9. They just didn't say whether the games are run in dx10 under Vista or forced into dx9 mode.

As for this:
"And many of these posts you're putting up counter you previous GX2 vs GTX280 argument, where HL2, UT3, and Quake wars all have the GTX well out in front and in the DX9 Crysis benchmarks both the GTX280SLi and even GTX260SLi well outperform the GX2 Quad SLi."

I never posted that 9800gx2 quad sli outperforms gtx280 sli, in this thread or anywhere else. I know you're working and you're busy, but please try to keep track of things and don't say things that don't actually exist. All I said in other threads is that a single 9800gx2 outperforms a single gtx280 or single 4870 (when the 9800gx2 isn't affected by the horrible quad scaling), which is true, using these benchmarks:
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=13
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=14
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=15
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=16
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=17
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=18
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=19
This particular benchmark does use dx10, including the Very High shader setting for Crysis. I try to use dx10 benchmarks whenever possible; it's the new thing, after all. Benchmarks comparing 4870x2 and 9800gx2 quad sli side by side are just far too rare, and that one is all I could find, so there's no choice there.

For HL2 and UT3, the earlier benchmarks didn't include them in their lineup. Many benchmarkers don't include them either, probably because they are slightly older and so well optimized that you max out with hundreds of fps. But you're right, those are popular games and should be included in benchmarks. As for Quake Wars, which was included in both benchmarks, the quad benchmark uses OpenGL instead of dx, which might have contributed to the difference in performance. Although I guess it's really better and more logical that way, since it runs on OpenGL natively and dx is only involved for sound and compatibility.

I knew from the beginning that going against conventional wisdom is going to result in taking some heat, even if backed up by benchmarks. I just didn't expect so much of it from a level headed and well respected poster like you. :( 

Since you're from Canada, in a time zone close to my own, and your post is timestamped 4:27 am, I'll just assume you're fatigued. :p 

I still respect you very much. But please try to look at things more carefully and check for possible errors before hitting the submit button.
September 1, 2008 4:36:08 PM

I wouldn't go for the GX2 either; it's like buying a dead horse! That card's life is basically over, with drivers that are 100% mature and not getting any better. On top of that, the GX2 consumes a ton of power and emits a lot of heat! Nvidia never even wanted to make the card, and they will probably stop making them very soon to make way for newer cards.

Yes, it performs pretty well at lower resolutions, but if you go up in res with AA on, the other cards outperform it.
September 2, 2008 5:08:26 PM

dagger said:
No offense, but you're the one that hasn't looked at the benchmarks, not me.


No, I read the benchies; what I missed was that you were comparing Quad to a single X2 (because of how the CF was written), which makes little sense for someone who was talking about SLIing 2 GTX280s, and then you couch the comparison by omitting the proper comparison. Compared to the GTX280 in SLI and the proper X2 in CF, the GX2 Quad is dead in the water.

Quote:
I'm not sure where "Then go beyond the 1920x1200 range and the GX2 falls off the end of the earth, running out of VRAM, making quad SLI very limited for it" comes from. While you're certainly right that the lower vram will tank performance for the 9800gx2 at some point, 2560x1600 just isn't quite there yet, not to mention the smaller 1920x1200. While there are resolutions larger than 2560x1600, it's safe to say most monitors can't handle those. The 2560x1600 resolution is hardly "cpu bound," especially since their test rig uses an "extreme" quad cpu.


You need to look at the reviews that include that resolution; even the GF9800GTX, HD4850, HD4870 and HD4870 in CF die off at that resolution with settings on high.

Quote:
But to say that dx9 benchmarks are somehow invalid is just an elitist viewpoint. Most benchmarkers still use either dx9 or a mixture of dx9 and dx10 instead of dx10 only. Most games still use either dx9 or a dx9/dx10 dual mode; no game in existence so far uses dx10 only. The majority of gamers also still run on dx9.


It has nothing to do with being an elitist viewpoint; it's the viewpoint that the reason to test with Crysis is to test with the toughest game, and crippling that test makes no sense, as little sense as turning down the shader quality in HL2 or ETQW because "the majority of gamers" game at lower settings than these cards can handle. And you say most benchmarks use DX9 or a mixture; I agree with the mixture, especially when benchmarking the 2nd & 3rd generation DX10 cards, but this review takes the one DX10 benchmark they have (Vantage doesn't count for anything but like tests, the way all Bungholimarks only matter for that architecture) and turns it into a DX9 benchmark, which makes no sense. BTW, I find it hilarious that you're talking about the "majority of gamers" and calling my position elitist in a thread about SLIing GTX 280s, GX2s and X2s. C'mon, really.

Quote:
To say that the only side-by-side benchmark I could find that includes this rare setup is invalid because it happens to use dx9 is like saying that unless you drive a Mercedes, you're not a driver. It's unfair. Nothing wrong with the still-mainstream dx9.


Sure it's wrong, and sure it's pointless (not invalid, there's no error from what I can see): you're buying DX10 cards with a future in mind, so why would you be testing based on other people's pasts? Especially when cards like the GTX280 and HD4870 perform better than the older designs. It's like testing the GF7800 with SM3.0 and FP16 settings to expose that the GF6800 was weak at that task and that the new design made using HDR in games like Far Cry usable. This is the point of testing the cards in a stressful fashion that matches their intended future and a large reason they were made.

Quote:
Just as a side note, where in the benchmark did they actually say the tests are done in dx9 and not dx10?


Pretty simple way to determine it: shaders High = DX9, shaders Very High = DX10.

Quote:
I never posted that 9800gx2 quad sli outperforms gtx280 sli, in this thread or anywhere else. I know you're working and you're busy, but please try to keep track of things and don't say things that don't actually exist. All I said in other threads is that a single 9800gx2 outperforms a single gtx280


However, a single GX2 doesn't outperform a single GTX280 in those hothardware benchies you posted; at both resolutions in both UT3 and ETQW the single GTX280 beats the single GX2, and that was my point. You keep saying it as if the GX2 always beats a single GTX280, but in all those game benchies that's not the case.

Quote:
I knew from the beginning that going against conventional wisdom is going to result in taking some heat, even if backed up by benchmarks. I just didn't expect so much of it from a level headed and well respected poster like you. :( 


The problem is I see you picking and choosing too much from this or that benchmark instead of looking at the overall picture, and you're flooding threads with that, with the same benchies; it's overboard, dude. I admit I reacted more to what looked like you comparing quad to X2 CF, but really the second half, showing the GTX280 beating the GX2 in almost all of Hothardware's tests, should show you that it's not as simple as what you've been flooding the threads with.

Quote:
Since you're from Canada, in a time zone close to my own, and your post is timestamped 4:27 am, I'll just assume you're fatigued.


You wanna blame anything for my disposition, blame the sunburn from my drive in the Mustang with the top down coming back from Spokane and Boise, and posting in the US. :sol:  Who's ever tired on a long weekend? It's afterwards, returning to work, that would be the issue, and then again I have the rest of the week off too.

Posting a suggestion about the GX2 is all well and good; however, you're making blanket assertions even your own benchmarks don't support, and you also aren't giving these cards anywhere near the time to mature that the GX2 was given, when you compare tests that arrived at or before launch, when the drivers were far from optimized. This is the GX2's swan song: it's on its way out, but its optimization is such that it outperforms the newer cards that have only recently arrived.

I agree that the GX2 is a good value, but your selective use of these benchmarks, without taking into account what they're saying overall, is a problem when advising people what to buy.
September 2, 2008 5:24:28 PM

xx12amanxx said:
I wouldn't go for the GX2 either; it's like buying a dead horse! That card's life is basically over, with drivers that are 100% mature and not getting any better. On top of that, the GX2 consumes a ton of power and emits a lot of heat! Nvidia never even wanted to make the card, and they will probably stop making them very soon to make way for newer cards.

Yes, it performs pretty well at lower resolutions, but if you go up in res with AA on, the other cards outperform it.


Wrong, NVIDIA has stated that the next driver release (Big Bang II) will improve performance on the 9xxx series cards, so the drivers are not mature yet.

Secondly, despite what people say, the GX2 still outpaces most everything out there by itself. While it's true you don't get as large a gain at higher res and higher AA, the GX2 still often comes out on top.

Seriously dude, get your facts straight.
September 2, 2008 5:29:15 PM

xx12amanxx said:
I wouldn't go for the GX2 either; it's like buying a dead horse! That card's life is basically over, with drivers that are 100% mature and not getting any better. On top of that, the GX2 consumes a ton of power and emits a lot of heat! Nvidia never even wanted to make the card, and they will probably stop making them very soon to make way for newer cards.

Yes, it performs pretty well at lower resolutions, but if you go up in res with AA on, the other cards outperform it.

Now don't go that far...

The GX2 uses a lot less power than the GTX 280 and 4870x2/4870 CF... don't say it uses a lot when there are others that use more.
And same for heat... sure it's hot, but so are the 4850 and 4870, and they're still good cards...

As for me, I think I'm going GX2, best bang for my buck as far as I can read... I mean, I'm not using a crazy high-resolution 30" monitor (wish I was)... at the moment I have a simple 19" that might become a 22" or 24" (so I can use 1080p for my PS3), so at those resolutions the GX2 wins in most tests (and in the games I play, which are Crysis a little but mainly CoD4).
September 2, 2008 5:51:10 PM

Silverion77 said:

The GX2 uses a lot less power than the GTX 280 and 4870x2/4870 CF... don't say it uses a lot when there are others that use more.
And same for heat... sure it's hot, but so are the 4850 and 4870, and they're still good cards...


Well, the same goes for you: don't say it uses a lot less than the GTX280. When you actually test at the card itself (not CPU etc.), it shows the GX2 uses more than the GTX280, not by much, but definitely not "a lot less":

http://www.xbitlabs.com/articles/video/display/radeon-h...

The X2 is in another league, but I think Xbit's numbers are off, because it's not power by Watts, it's power by the souls of raging grannies. [:zorg:2]
September 2, 2008 5:58:32 PM

Using the numerous power calculators (eXtreme Outervision Lite is my favorite), the load they calculated came out lower for the GX2 than for the 280 and X2. Now, I understand that they are all basically the same... about 700 watts, to be clear, and higher if you have a lot of fans or are overclocking the CPU... but people shouldn't go out of their way to say the GX2 is a "power hungry space heater" when other cards are just the same.
And sorry for the "a lot less"; I just glanced at the numbers and saw that the figures for the GX2 came up lower... I didn't actually compare them side by side. It looks big when it's like 570 to 620, since the hundreds digit changes.
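For what it's worth, those calculators are basically just summing per-component draw and adding a safety margin. Here's a minimal Python sketch of that arithmetic; every wattage figure below is an illustrative assumption, not a measured value for any particular card or CPU, so treat the output as a ballpark only.

# Rough PSU load estimate, the kind of arithmetic the online power
# calculators perform. All wattages are illustrative assumptions only.
components_watts = {
    "cpu": 130,              # assumed quad-core under load
    "graphics_card": 200,    # assumed dual-GPU card under load
    "motherboard": 50,
    "ram": 20,
    "drives": 30,
    "fans": 15,
}

load = sum(components_watts.values())
recommended_psu = load * 1.3   # ~30% headroom so the PSU isn't running at its limit
print(f"Estimated load: {load} W, suggested PSU size: ~{recommended_psu:.0f} W")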

Personally, a space heater would be nice. It's cold in my room when the heat isn't on... (Dad doesn't believe in heat until freezing, sub-zero temps arrive.)

Edit: But in terms of a 4870 at $250 versus a GX2 for $270, is it better to go with the GX2? I mean, I probably will never CF or SLI (I believe in buying whole new, better cards instead of getting another old one).
September 2, 2008 7:22:25 PM

The GX2 does get a little warm (mine idles at 72), but that's due to a low fan setting by default. Use RivaTuner to turn the fan up; temps dropped to the mid-50s idle for me.

Considering that the GX2 (~$250), the X2 (~$350) and 280 (~$450) often swap which is best depending on the game, the GX2 wins the 'Budget Card' award.

...Now let's hope that Big Bang II makes SLI decent for these babies...
September 2, 2008 8:31:11 PM

gamerk316 said:
Considering that the GX2 (~$250), the X2 (~$350) and 280 (~$450) often swap which is best depending on the game, the GX2 wins the 'Budget Card' award.

If there were an X2 for $350 I'd buy it in 2 seconds :lol:  :lol: 
September 2, 2008 8:59:59 PM

In all honesty, I guessed at the price. I have no intention to buy one, because my GX2 wrecks everything I throw at it :D 
September 2, 2008 9:25:16 PM

how is it??

I'm considering one because of the huge price drops; it's basically at 4870 prices now.

Some complain of microstuttering... do you get it badly?
September 2, 2008 9:54:42 PM

Here's a problem that nVidia had better clear up. Just like they ignored ATI, concentrated on their GPGPU solutions, and kept an eye on Intel instead of ATI... well, you know the rest. It's been almost a year since everyone started saying DX10.1 doesn't matter. The longer this farce goes on, the sooner it becomes a lie. That's right, swallow the nVidia line "DX10.1 doesn't matter". If people are still saying this 6 months from now, and so is nVidia, there'll be another surprise coming.
September 2, 2008 11:10:48 PM

JAYDEEJOHN said:
Here's a problem that nVidia had better clear up. Just like they ignored ATI, concentrated on their GPGPU solutions, and kept an eye on Intel instead of ATI... well, you know the rest. It's been almost a year since everyone started saying DX10.1 doesn't matter. The longer this farce goes on, the sooner it becomes a lie. That's right, swallow the nVidia line "DX10.1 doesn't matter". If people are still saying this 6 months from now, and so is nVidia, there'll be another surprise coming.


That's just the problem. We've been hearing about the promised "surprise" for a long time now, but it's nowhere to be seen. When dx10 and dx10.1 first came out and people said dx10.1 would never be widely adopted, I was very skeptical. But with each month passing and no dx10.1, it's getting harder and harder to dismiss them. When the Assassin's Creed dx10.1 patch came out, I thought "this is the beginning." But that turned out to be a dead end. Even more depressing, few people with Assassin's Creed even bothered to patch their installation.

All of the upcoming blockbusters like Starcraft 2 and Crysis Warhead have been announced to specifically not support dx10.1. There are two upcoming games, Stormrise and Cloud 9, that are announced to support dx10.1, but no one's even heard of them, and they're just console ports.

Basically, if dx10.1 hasn't been accepted into the mainstream at this point, it's not looking good. There's the looming possibility that it'll get skipped over for the next full version of dx, because it can be.
September 3, 2008 12:23:01 AM

And it can also be added in as well; this is the tip of the iceberg. I'm not saying every game will be released with DX10.1 from now on, but we're getting close. As soon as those "no name" games come out, it's going to make a difference. Most people don't actually play Crysis so much as use it as a comparative benchmark across setups. I'm thinking the same will happen here, and it could just as easily bite nVidia in the arse. Adding to this, it's easy to dismiss one game, especially one which went out of its way to accommodate nVidia with its "patch". What's funny is, the "patch" didn't seem to work on the nVidia cards, only on the cards with DX10.1. That being said, when there are 3 or even 4 games out, no matter how popular they are, that show the differences, it won't be ignored. So "beware nVidia" is quite apropos.