heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
God I hate competition, why can't some all-powerful company simply make everything and tell us exactly what to buy. Just kidding. It's nice to see some competition between the card manufacturers again, but it leaves a consumer with a tricky decision. Now I know the mainstream choice is to take the NVIDIA route and get a 260 or 280, but that brings up the more important choice of motherboard chipset. While NVIDIA and their SLI setup seem to be more mainstream, from my understanding their motherboard chipset runs hot and isn't as good as the ones offered by Intel. ATI and the 4870, on the other hand, offer even better performance and work with the Intel chipsets.

From my understanding, NVIDIA is going to stop manufacturing its chipsets in the near future, making it likely that any future graphics solution will either not be SLI or will at least work on the Intel chipsets. With this in mind, should I go for a P45 motherboard and some brand of ATI card for my new system? That's the way I'm leaning, for instance toward a system with a Radeon 4870 or something of the like, since it seems to be the better long-term system. So what's everyone's input, what's the better long-term solution?
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
Nvidia is finally buckling under the mounting pressure from AMD/ATI and Intel, and has been forced to license SLI to Intel's X58 chipset (with strings attached for the motherboard manufacturers), so that is a strong, if early, clue that they are getting out of the chipset business. Personally I think they need to; the 600i and 700i series are quite bad overall, with questionable overclocking, really high voltage drop/droop, and hot northbridges. Many people have had no problems, but personally I've worked with a 750i for my wife's PC and it took about 4 days to get it working correctly, though it's been solid since.

Anyway, back on topic. You should probably take a look at the P45 or the X38/X48 boards. DFI has a very good X48 board for $220, and a very good X38 for $180: http://www.newegg.com/Product/Product.aspx?Item=N82E16813136051 . I've pondered both of these boards myself. If you want Crossfire at full x16/x16 then go for these; if not, look at the P45s.

As far as the video card goes, it's no secret the ATI 4800 cards have been in the spotlight for quite a while now. But honestly, the best buy on the market right now is probably the 9800GX2. It hovers around $270 right now, roughly $100 more than the 4870, but it beats it easily, although it runs hot and consumes far more power.
 

random1283

Distinguished
Oct 26, 2007
222
0
18,680
There is a large chance of Nvidia leaving the chipset market, simply because Intel's new X58 chipset already has SLI support built in, rendering Nvidia's chipsets (750i, 780i, 790i) redundant. So at the moment, if you are getting a GPU I would make it Nvidia, so that next year with Nehalem you can buy a new mobo and CPU and SLI the GPU.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
The most cost-effective option is to get a P43/P45 (starting as low as $75) with a 9800GX2.

P43/P45 boards cost less than the 750i/780i while performing far better, and cost half the price of X38/X48 while performing on par. The only disadvantage is the lack of CF/SLI support.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=2010200280%20107172615&bop=And&Order=PRICE

The 9800GX2, at its current price of $275 (or $244 after MIR), costs far less than the GTX280 ($420) while outperforming it, and costs less than the 4870 ($280) while far outperforming it.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133217
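The cost argument above boils down to simple price-per-performance arithmetic. Here is a minimal sketch using the prices quoted in this thread; the relative-performance numbers are rough illustrative placeholders (not measured benchmark figures), so plug in your own from reviews before relying on the ranking.

```python
# Rough price/performance comparison using the prices quoted in this thread.
# The "rel_perf" values are illustrative assumptions, NOT benchmark data.
cards = {
    "9800GX2": {"price": 275, "rel_perf": 1.00},
    "GTX 280": {"price": 420, "rel_perf": 0.95},  # assumed relative performance
    "HD 4870": {"price": 280, "rel_perf": 0.85},  # assumed relative performance
}

for name, c in cards.items():
    # Normalize to "performance units per $1000" so the numbers are readable.
    perf_per_dollar = c["rel_perf"] / c["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf units per $1000")
```

Under these assumed numbers the 9800GX2 comes out ahead on value, which is the shape of the argument being made; swapping in real review figures may change the ordering.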

There is a misconception floating around that the 9800GX2 performs the same as the GTX280. That's not true; the performance gap between them is significant.
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=13
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=14
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=15
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=16
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=17
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=18
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=19
 

heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
Hmm, that's an interesting idea: get a nice NVIDIA card now, then upgrade my system in 6 months or so when the new chipsets and processors come out. However, that raises a question: would the current generation of SLI cards be supported on this new Intel chipset?

Whether I go with ATI or NVIDIA, it seems the next generation of chipsets will support both. So in the short term, what do you all think is better: an ATI 4870 or an NVIDIA 280?
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
Chipset does not matter when it comes to single-card solutions, and there is no new interface with X58, so yes, it will work. Also be aware that the X58 will not support SLI on every board; it's up to the board manufacturers (ASUS, MSI, Gigabyte, etc.) to enable it, by PAYING Nvidia on what seems to be a per-board licensing basis, so expect SLI only on the higher-end boards.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780


SLI will not be supported natively, but rather via a bridge chip. It's not so much a commission system. Technically the bridge isn't necessary, since the motherboard chipset can support SLI just fine alongside CF, but Nvidia requires the purchase of their severely overpriced bridge chip instead, as part of a deal with Intel. Many people think Intel should have pushed harder. The same bridge chip is already used in the ultra-high-end dual-CPU Intel Skulltrail motherboard, which supports both CF and SLI. Basically, expect SLI-supporting X58 boards to cost significantly more. Just go with ATI if you want dual cards on X58.

As for "GTX280 or 4870," forget both unless you're shooting for ultra-high-end graphics. The 9800GX2 blows them both away for less; it's the second-fastest single card at this point. The 4870X2 is faster, in first place, but costs over twice as much. Unless you go for a 4870X2, get the 9800GX2, and forget anything in between that costs more but performs worse.
 

ausch30

Distinguished
Feb 9, 2007
2,210
0
19,790


From what I have read, the new chips and chipsets should be released in the next couple of months, but it will only be the Extreme CPUs. The mainstream Nehalem chips won't be available until late spring or early summer next year, so you're looking at 8-10 months before you will be able to get them.

 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790



Keep up with the news: the Intel X58 chipset is getting SLI without the bridge chip, through software, and that makes me think we will get it on the P45 through X48 chipsets with a wee bit of modding.

And don't go for an Nvidia card; they don't do DX10.1, and ATI cards are going to shine when the DX10.1 games come out, very shortly.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780


Don't count on it. They've been saying that for ages and it hasn't happened yet, aside from Assassin's Creed, in the form of a patch. Speaking of which, games haven't even truly taken advantage of DX10 yet. Game developers don't just jump on the bandwagon; they have programming complexity and cost considerations. It'll take a while. :p
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790



DX10.1 makes games run faster, so to say they haven't truly taken advantage of DX10 yet is just not thought out.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
If developers make their games faster for half their customers, the game might gain that reputation and the other half may be turned off from buying it. Besides, it costs extra money and manpower to develop. It's just not in their interest.
 

heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
Okay, well, I made a decision: the card I will go with is an ATI Radeon 4870, just one for now, and in about 6 months or so I'll upgrade to two. The question I have is: can anyone recommend a good system to go along with this card? I am definitely looking to go Crossfire in a few months with a twin 4870 setup. The big question is whether anyone can recommend a processor/mobo/memory combination that won't bottleneck this future Crossfire. Also, is it better to get one of these dual-GPU single cards or to get two single-GPU cards?
 

heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
Now this is just bothersome. Okay, back to the drawing board: I'm going to keep looking between the GTX 280 and the 4870. I know it will be one or the other; it's just hard to draw a conclusion at the moment since they both seem to have their benefits and drawbacks.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780


Between the GTX280 and 4870? Isn't 4870 vs. GTX260 more fitting, since they cost the same? The GTX280 costs $120 more, and for most people that's an important factor. :p

Well, I guess you're different; otherwise you'd have picked the 9800GX2, as it costs the least while performing the best among the three.
 


So, basically the 9800GX2 is kinda the BEST thing you can get ($/perf wise) now, right?

I'm a little confused about the defective chips in nVidia's cards, and you seem to know quite a bit. Do they affect the 9800GTs (GTX, GTX+ and GX2), and if they do, how do they affect them?

Anyhow, I'd say the DX10.1 support on the ATI cards is not a minor issue, like you said in the other topic. Developers are actually 'watching' DX10.1 and might develop some titles for it before DX11. Assassin's Creed's step forward might be an example of that.

I'd say sooner rather than later we'll see some DX10.1 titles showing off the performance increase over DX10. Or at least, for the sake of the HD2xxx's, HD38xx's and HD48xx's, they'd better, lol.

This is some sort of bet, I might add, but an interesting one. I'd bet on DX10.1 -> DX11 rather than DX10 now and a jump to DX11 hardware.

Also, there's the release of the +'s from nVidia in the near future (Q1 2009?).

Argh, so many choices! >_<

Esop!
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790



buying the 9800gx2 is like buying a time bomb
 

heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
Okay, this is what I am very confused about: the GX2 is a dual-GPU single card, right? And while it outperforms a single 4870 or 280, it doesn't outperform two of them. So in essence, getting one 4870 or 280 now and a second one half a year from now makes more sense than getting a second GX2, since running in quad-GPU mode is pointless from what I have seen. So yes, price/performance-wise a 9800GX2 makes more sense, but not over a two-year-plus upgrade model. So really it comes down to choosing between an ATI 4870 and an NVIDIA 280. It's a hard choice to make; they both have their pluses and minuses.
 

heavymetalsmith

Distinguished
Apr 10, 2008
45
0
18,530
And yes, it's more like between the 260 and the 4870 because of the price comparison, but for me the extra cost is not as much of an issue as the future performance.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780


As for the 4870X2 or 4870 CF outperforming 9800GX2 quad SLI, that's not actually true. See benchmarks:
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/?page=5
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/?page=6
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/?page=7
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/?page=8
http://www.hothardware.com/Articles/ATI-Radeon-HD-4870-X2--AMD-Back-On-Top/?page=9

Performance was quite close, but the only case where the 4870X2/4870 CF outperformed 9800GX2 quad SLI was in Half Life 2. The 9800GX2 is faster in all the remaining tests. The reason is simple: while quad SLI scales horribly, the gap between a single 4870 and a 9800GX2 is so big to begin with that it can't be overcome just by quad SLI/CF's bad scaling. Of course, the 4870 still closed the gap A LOT.

In this case, the quad SLI/CF handicap just isn't enough to counter the raw power. Two 4870s or one 4870X2 still cost more and perform worse than two cheaper 9800GX2s.
 


No it's not!
Once again you don't bother to actually look at the benchmarks in depth.
Quad-SLI GX2 is faster at the lower resolution; at the higher resolution the X2 in CF beats it. So the GX2 gets you one or two frames in the easier mode, but move to the higher resolution, where these aren't CPU-bound, and the X2 CF is ahead.
And once again you pull out DX9 Crysis results. I don't know why you or the reviewers even bother with those. It's like running games in 16-bit mode to show higher fps: pointless with setups that can handle full DX10 mode. Or, well, the GTX and HD4K can handle full DX10 Very High; the GX2, not so much.

And many of the posts you're putting up counter your previous GX2 vs GTX280 argument, where HL2, UT3, and Quake Wars all have the GTX well out in front, and in the DX9 Crysis benchmarks both GTX280 SLI and even GTX260 SLI well outperform the GX2 quad SLI.

Next time actually look at what you're posting.

Then go beyond the 1920x1200 range and the GX2 falls off the end of the earth, running out of VRAM, making quad SLI very limited for it.

Seriously you try to pimp the hell out of the GX2 like you have stock in them. :heink:

 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780


No offense, but you're the one who hasn't looked at the benchmarks, not me.

Look closer: the benchmarks include 2 bars for each game. The higher one is for 1920x1200, the lower one for 2560x1600. In each and every case other than Half Life 2, 9800GX2 quad performs higher than the 4870X2/4870 CF at both resolutions respectively. I'm not sure where "Then go beyond the 1920x1200 range and the GX2 falls of the end of the earth running out of VRAM, making quad SLi very limited for it" comes from. While you're certainly right that the lower VRAM will tank performance for the 9800GX2 at some point, 2560x1600 just isn't quite there yet, not to mention the smaller 1920x1200. While there are resolutions larger than 2560x1600, it's safe to say most monitors can't handle those. The 2560x1600 resolution is hardly "CPU-bound," especially since their test rig uses an "extreme" quad CPU. And those games do not run in 16-bit mode.
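The VRAM question here can at least be roughed out with a back-of-the-envelope framebuffer estimate. This sketch only counts the basic color and depth buffers plus MSAA samples; it deliberately ignores textures, geometry, and driver overhead, so the numbers are a lower bound under those stated assumptions, not a real measurement of what any of these cards use.

```python
# Back-of-the-envelope framebuffer memory estimate.
# Counts only color + depth buffers and MSAA samples; textures,
# geometry, and driver overhead are ignored, so real usage is higher.
def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4):
    pixels = width * height
    color = pixels * bytes_per_pixel * msaa   # multisampled color buffer
    resolve = pixels * bytes_per_pixel        # resolved display target
    depth = pixels * 4 * msaa                 # 24-bit depth + 8-bit stencil
    return (color + resolve + depth) / (1024 * 1024)

for res in [(1920, 1200), (2560, 1600)]:
    for aa in (1, 4):
        mb = framebuffer_mb(*res, msaa=aa)
        print(f"{res[0]}x{res[1]} {aa}xAA: ~{mb:.0f} MB")
```

Even at 2560x1600 with 4xAA the bare buffers come to well under 200 MB, which is why the real pressure on a 512 MB card comes from textures and AA settings piled on top, not from the framebuffer alone.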

As for DX9 vs DX10: honestly, I hadn't noticed that the benchmarks were DX9 and not DX10; they didn't say on the test pages. Side-by-side comparisons of the 4870X2 vs 9800GX2 quad SLI are extremely rare. They're hard enough to find to begin with, so I was happy to come across one at all, DX9 or DX10. As for your claim that the 4870X2 will outperform 9800GX2 quad SLI, I believe you. You're a mod and have been here a long time; your word is good enough for me.

But to say that DX9 benchmarks are somehow invalid is just an elitist viewpoint. Most benchmarkers still use either DX9 or a mixture of DX9 and DX10 instead of DX10 only. Most games still use either DX9 or a DX9/DX10 dual mode; no game in existence so far uses DX10 only. The majority of gamers also still run on DX9. To say that the only side-by-side benchmark including this rare setup I could find is invalid because it happens to use DX9 is like saying that unless you drive a Mercedes, you're not a driver. It's unfair. There's nothing wrong with the still-mainstream DX9.

Just as a side note, where in the benchmark did they actually say the tests are done in DX9 and not DX10? I reread it and wasn't able to catch that part. They did say they use Vista, and the 3DMark Vantage test is certainly done in DX10, since it doesn't run in DX9. They just didn't say whether the games were run in DX10 under Vista or forced into DX9 mode.

As for this:
"And many of these posts you're putting up counter you previous GX2 vs GTX280 argument, where HL2, UT3, and Quake wars all have the GTX well out in front and in the DX9 Crysis benchmarks both the GTX280SLi and even GTX260SLi well outperform the GX2 Quad SLi."

I never posted that 9800GX2 quad SLI outperforms GTX280 SLI, in this thread or anywhere else. I know you're working and you're busy, but please try to keep track of things and don't attribute claims that don't actually exist. All I said in other threads is that a single 9800GX2 outperforms a single GTX280 or a single 4870 (when the 9800GX2 isn't affected by the horrible quad scaling), which is true, using these benchmarks:
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=13
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=14
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=15
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=16
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=17
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=18
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=19
This particular benchmark does use DX10, including the Very High shader setting for Crysis. I try to use DX10 benchmarks whenever possible; it's the new thing, after all. Benchmarks comparing the 4870X2 and 9800GX2 quad SLI side by side are just far too rare, and that one is all I could find, so there was no choice there.

For HL2 and UT3, the earlier benchmark didn't include them in its lineup. Many benchmarkers don't include them either, probably because they are slightly older and so well optimized that you max out at hundreds of fps. But you're right, those are popular games and should be included in benchmarks. As for Quake Wars, which was included in both benchmarks, the quad benchmark uses OpenGL instead of DX, which might have contributed to the difference in performance. Though I guess it's really better and more logical that way, since the game runs on OpenGL natively and DX is only a matter of sound and compatibility.

I knew from the beginning that going against conventional wisdom was going to result in taking some heat, even when backed up by benchmarks. I just didn't expect so much of it from a level-headed and well-respected poster like you. :(

Since you're from Canada, in a time zone close to my own, and your post is timestamped 4:27 am, I'll just assume you're fatigued. :p

I still respect you very much. But please try to look at things more carefully and check for possible errors before hitting the submit button.
 

xx12amanxx

Distinguished
Oct 27, 2007
584
16
18,995
I wouldn't go for the GX2 either; it's like buying a dead horse! That card's life is basically over, with drivers that are 100% mature and not getting better. On top of that, the GX2 consumes a ton of power and emits a lot of heat! Nvidia never even wanted to make the card, and they will probably stop making them very soon to make way for newer cards.

Yes, it performs pretty well at lower resolutions, but if you go up in res with AA on, the other cards outperform it.
 


No, I read the benchies. What I missed was that you were comparing quad to a single X2 (because of how the CF was written), which makes little sense for someone who was talking about SLI-ing two GTX280s; then you couch the comparison by omitting the proper comparison. Compared to the GTX280 in SLI and the proper X2 in CF, the GX2 quad is dead in the water.

I'm not sure where "Then go beyond the 1920x1200 range and the GX2 falls of the end of the earth running out of VRAM, making quad SLi very limited for it" comes from. While you're certainly right that the lower vram will tank performance for 9800gx2 at some point, 2560x1600 just isn't quite there yet, not to mention the smaller 1920x1200. While there are resolutions larger than 2560x1600, it's safe to say most monitors can't handle those. 2560x1600 resolution is hardly "cpu bound," especially since their test rig use an "extreme" quad cpu.

You need to look at the reviews that include that resolution; even the GF9800GTX, HD4850, HD4870 and HD4870 in CF die off at that resolution with settings on high.

But to say that dx9 benchmarks are somehow invalid is just an elitist viewpoint. Most benchmarkers still use either dx9 or a mixture of dx9 and dx10 instead of dx10 only. Most games still use either dx9 or both dx9 and dx10 dual mode, no game in existence so far use dx10 only. Majority of gamers also still run on dx9.

It has nothing to do with being an elitist viewpoint. The point of testing with Crysis is to test with the toughest game; crippling that test makes no sense, just as little sense as turning down the shader quality in HL2 or ETQW because 'the majority of gamers' game at lower settings than these cards can handle. And you say most benchmarks use DX9 or a mixture; I agree with the mixture, especially when benchmarking 2nd and 3rd generation DX10 cards, but this review takes the one DX10 benchmark they have (Vantage doesn't count for anything but like-for-like tests, the way all Bungholiomarks only matter for that architecture) and turns it into a DX9 benchmark, which makes no sense. BTW, I find it hilarious that you're talking about the 'majority of gamers' and calling my position elitist in a thread about SLI-ing GTX280s, GX2s and X2s. C'mon, really.

To say that the only side to side benchmark that include this rare setup I could find is invalid because it happen to use dx9 is like saying unless you drive a Mercedes, you're not a driver. It's unfair. Nothing wrong with the still mainstream dx9.

Sure it's wrong, and sure it's pointless (not invalid; there's no error from what I can see). You're buying DX10 cards with the future in mind, so why would you be testing based on other people's pasts? Especially when cards like the GTX280 and HD4870 perform better than the older designs. It's like testing the GF7800 with SM3.0 and FP16 settings to expose that the GF6800 was weak at that task and that their new design made using HDR in games like Far Cry usable. This is the point of testing the cards in a stressful fashion that matches their intended future, which is a large reason they were made.

Just as a side note, where in the benchmark did they actually say the tests are done in dx9 and not dx10?

There's a pretty simple way to determine it: shaders on High = DX9, shaders on Very High = DX10.

I never posted that 9800gx2 quad sli outperforms gtx280 sli, in this thread or anywhere else. I know you're working and you're busy, but please try to keep track of things and don't say things that doesn't actually exist. All I said in other threads is that a single 9800gx2 outperform a single gtx280

However, a single GX2 doesn't outperform a single GTX280 in those HotHardware benchies you cite. At both resolutions in both UT3 and ETQW, the single GTX280 beats the single GX2, and that was my point: you keep saying it as if the GX2 always beats a single GTX280, but in all those game benchies that's not the case.

I knew from the beginning that going against conventional wisdom is going to result in taking some heat, even if backed up by benchmarks. I just didn't expect so much of it from a level headed and well respected poster like you. :(

The problem is I see you picking and choosing too much at this or that benchmark instead of looking at the overall picture, and you're flooding threads with that, with the same benchies. It's overboard, dude. I admit I reacted more to what looked like you comparing quad to X2 CF, but really, the second half showing the GTX280 beating the GX2 in almost all of HotHardware's tests should show you that it's not as simple as you've been flooding the threads with.

Since you're from Canada, in a time zone close to my own, and your post is timestamped 4:27 am, I'll just assume you're fatigued.

You wanna blame anything for my disposition, blame the sunburn from my drive in the Mustang with the top down coming back from Spokane and Boise, and posting in the US. :sol: Who's ever tired on a long weekend? It's afterwards, returning to work, that would be the issue, and then again I have the rest of the week off too.

Posting a suggestion about the GX2 is all well and good; however, you're making blanket assertions even your own benchmarks don't support, and you also aren't giving these cards anywhere near the time to mature that the GX2 was given, since you compare tests that arrived at or before launch, when the drivers were far from optimized. This is the GX2's swan song: it's on its way out, but its optimization is such that it outperforms the newer cards that have only recently arrived.

I agree that the GX2 is a good value, but your use of these benchmarks selectively, without taking into account what they're saying overall, is a problem when advising people what to buy.
 


Wrong. NVIDIA has stated that the next driver release (Big Bang II) will improve performance on the 9xxx series cards, so the drivers are not mature yet.

Secondly, despite what people say, the GX2 by itself still outpaces most everything out there. While it's true you don't get as large a gain at higher res and higher AA, the GX2 still often comes out on top.

Seriously, dude, get your facts straight.