
3870X2 Vs 9800 GX2

Last response: in Graphics & Displays
March 18, 2008 2:48:28 PM

Ok, well, many reviews have been posted about these two cards, and I find it funny how the variables change from game to game to favor one card. For example, we all know ATI has issues with AA, for a reason I don't know. (If anyone knows, please post.) Also, many reviews switch AA settings off and on to favor a card. For instance, in Half-Life 2 the 3870X2 beats the GX2 with AA off, then drops behind with AA turned on, so the review decided to turn AA on to show how powerful the GX2 is compared to the 3870X2... If there is a site out there that does apples-to-apples comparisons, please tell me where, because all I see is Nvidia fan sites kissing Nvidia's ass.

Please do not post "oh, he is an ATI fanboy"; as I said before, I own both ATI and Nvidia cards, and I'm just tired of reviews screwing people over.


March 18, 2008 3:56:41 PM

As far as I know, the 3870X2 and the 9800GX2 perform very similarly. It is my understanding that Nvidia has better driver support, which would explain why anti-aliasing is much better supported on Nvidia cards.

I would like to clarify some things about anti-aliasing, though. Most people, I think, have no clue what anti-aliasing (AA) actually is, but for some reason like to toss the term around when discussing benchmarks.

Aliasing is a phenomenon that causes different continuous signals to become indistinguishable when sampled. For example, if you have extremely poor vision, then everything 20 ft in front of you may look exactly the same, like one big blur. That means you have very low visual acuity and cannot discern objects for what they really are: you would have aliased vision.

Anti-aliasing applies an algorithm to correct objects that would otherwise look indiscernible; in effect it applies a higher level of detail. I suppose you could think of glasses as a type of anti-aliasing.

In the gaming world, anti-aliasing lets us see a much higher level of detail. 90% of the time, I will not notice this detail, so I usually leave anti-aliasing around 2x (currently I am using a Sapphire 3870).
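The smoothing the posts above describe can be sketched as a toy supersampling (SSAA) pass; this is a hypothetical illustration of the general idea only, not how either card's hardware actually implements AA: render at a multiple of the target resolution, then average each block of subsamples into one output pixel, which turns a hard stair-stepped edge into a gradient.

```python
# Toy model of supersampling anti-aliasing (SSAA):
# render at factor x factor the target resolution, then average
# each factor x factor block of subsamples into one output pixel.

def downsample(hi_res, factor):
    """Average factor x factor blocks of a 2D grid of brightness values."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white diagonal edge rendered at 2x (a 4x4 grid of samples):
hi = [
    [1, 1, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]
aa = downsample(hi, 2)  # 2x2 output: edge pixels become grey, not jagged
```

The averaged edge pixels come out as intermediate greys instead of pure black or white, which is why the "jaggies" look softened; it also shows why AA costs performance, since the card is effectively shading several samples per displayed pixel.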

Benchmarking programs such as 3DMark06 have their purpose, but mean very little. A certain number of marks does not correlate meaningfully with how much we enjoy the graphics of a game. So where does this leave us with the 3870X2 vs the 9800GX2?

Assuming we are using a typical resolution, around 1280 x 1024 or so (like the vast majority of us):

Well, if you take anti-aliasing out of the picture (because it really does not make a difference during actual gameplay), then you will see that the 3870X2 may perform slightly better, although mostly the two cards perform the same. The real question is: how good are my graphics for a certain price? It looks like the 3870X2 is going to be somewhere between $150 and $200 cheaper. Performance-wise they are the same. Go with the 3870X2.
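That cost-vs-performance argument boils down to a frames-per-dollar division. The prices and average FPS below are hypothetical placeholders for illustration only, not measured numbers from any review:

```python
# Rough frames-per-dollar comparison with placeholder numbers.
cards = {
    "3870X2": {"price_usd": 450, "avg_fps": 60},
    "9800GX2": {"price_usd": 600, "avg_fps": 62},
}

for name, c in cards.items():
    value = c["avg_fps"] / c["price_usd"]  # higher = more frames per dollar
    print(f"{name}: {value:.3f} FPS per dollar")
```

With numbers anywhere in this neighborhood, a small FPS lead does not offset a $150+ price gap, which is the point being made about value.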
March 18, 2008 3:58:42 PM

ghost_uwi said:
http://www.tomshardware.com/2008/03/18/nvidia_geforce_9...

Tom's reviews are pretty good, I find.

I think Tom's reviews are more intuitive, but sometimes lack variety. The AnandTech review had more to choose from; they had a 9600GT SLI setup that showed it runs with the 9800GX2 in a lot of areas. You can't base anything off just one review; you have to read a few and build your own conclusion.

My conclusion is that an SLI setup is cheaper and would outperform the GX2; it's a waste unless you are stuck with a non-SLI board.
March 18, 2008 4:01:47 PM

Agreed, I was just showing him a decent review.
March 18, 2008 4:46:02 PM

After reading some of the AnandTech review I can say one thing: I don't like their test setup. Using FRAPS with 20-second demos? I just don't think 20 seconds is enough to get an accurate feel for how well a game will play, especially for the 3870X2, which can get some low initial FPS when first loading scenes into memory. All the reviews probably use extremely short demos, though, even though no one plays a game for twenty seconds.
March 18, 2008 5:01:29 PM

honor said:
Ok, well, many reviews have been posted about these two cards, and I find it funny how the variables change from game to game to favor one card. For example, we all know ATI has issues with AA, for a reason I don't know. (If anyone knows, please post.) Also, many reviews switch AA settings off and on to favor a card. For instance, in Half-Life 2 the 3870X2 beats the GX2 with AA off, then drops behind with AA turned on, so the review decided to turn AA on to show how powerful the GX2 is compared to the 3870X2... If there is a site out there that does apples-to-apples comparisons, please tell me where, because all I see is Nvidia fan sites kissing Nvidia's ass.

Please do not post "oh, he is an ATI fanboy"; as I said before, I own both ATI and Nvidia cards, and I'm just tired of reviews screwing people over.


Tom's does a nice apples-to-apples comparison. They test 8 games at 4 different settings, and the last page of the review shows the average frames at all 4 settings for each card; there are no "variables changing from game to game to favor a card". Looking at the average frame rates across all 8 games, the GX2 beats the X2 by 32% with no AA and 57% with AA at the lower resolution, and by 37% with AA off and 40% with AA on at the highest resolution. Reviewers test games with AA turned on not to show AMD in a bad light, but because no one buys a $600 card to play at 640x480 with no AA or AF. Just because AMD can't handle AA well doesn't mean reviewers are shilling for Nvidia when they turn AA on.


http://images.tomshardware.com/2008/03/18/nvidia_9800_g...
http://images.tomshardware.com/2008/03/18/nvidia_9800_g...

EDIT: I also want to note that I don't really like the GX2. It's about on par with the performance I expected, but hearing all the ATI fanboys talk about how the X2 would still beat up on it was driving me crazy. I like the fact that it does beat the X2 and takes over the "fastest single card on the planet" title, but it's not worth the money.
March 18, 2008 5:46:37 PM

Once both cards' drivers are ironed out they should be pretty close. I don't doubt Nvidia will pass ATI slightly in performance; it's expected. The only real issues to think about here are the cost of the card and compatibility. I still lean toward ATI for DX10.1 support and better overall CrossFire support and motherboard performance on CrossFire systems. If you change your GPU every year it really doesn't matter; by this time next year you will be using the latest hardware. For guys like me who like to keep their hardware for 2+ years and do a minimal amount of upgrading, ATI is the clear pick.
March 18, 2008 6:19:29 PM

jerseygamer said:
Once both cards' drivers are ironed out they should be pretty close. I don't doubt Nvidia will pass ATI slightly in performance; it's expected. The only real issues to think about here are the cost of the card and compatibility. I still lean toward ATI for DX10.1 support and better overall CrossFire support and motherboard performance on CrossFire systems. If you change your GPU every year it really doesn't matter; by this time next year you will be using the latest hardware. For guys like me who like to keep their hardware for 2+ years and do a minimal amount of upgrading, ATI is the clear pick.

I changed my cards after two years, 7900GTs for 8800GTs, Nvidia being the clear pick :kaola: . So what's all this about DX10.1? What games will utilize it? When will it be released to the masses? And don't say 'soon', because that's getting old now.
March 18, 2008 6:30:31 PM

I think DX10.1 will not be an issue until a lot of cards can use it.
No game will require it unless most of the cards sold can use it.
March 18, 2008 6:59:39 PM

Vista won't even have DX10.1 until SP1.
March 18, 2008 7:07:56 PM

njalterio said:
As far as I know, the 3870X2 and the 9800GX2 perform very similarly. It is my understanding that Nvidia has better driver support, which would explain why anti-aliasing is much better supported on Nvidia cards.


Not really. The 9800GX2 is a notably stronger card, but the 3870 X2 is cheaper...

Nvidia and ATI both have great driver support; ATI's slow AA this generation is due to a hardware limitation.
March 18, 2008 7:33:25 PM

bfellow said:
Vista won't even have DX10.1 until SP1.


So, what you are saying is that Vista does support DX10.1? In case you didn't know, SP1 was released today.
March 18, 2008 7:36:09 PM

I can sell a Voodoo3 3000 perfectly working and with all accessories if someone is interested.
March 18, 2008 7:44:00 PM

Re:AA, SP1-DX10.1
Thanks Cleve, I have seen that on several sites, and that is supposed to be one of the big things with the 4800 series, the RV770: it is supposed to "double" the ROPs, and that is where ATI is falling behind in AA, since its "back end" is insignificant next to the processing power the rest of the chip has. As far as that goes, I would say the 3870 is a vastly superior card in terms of raw processing power; it just was not really made to do graphics the way they are done now. It was designed more as a GPGPU, and its gaming performance shows it.

As far as DX10.1 and SP1 go, just because DX10.1 is available doesn't mean any games are going to support it; it just means the interface and command sets are available. ATI needs to stop advertising DX10.1, because most game developers have downplayed its usefulness. It makes me wonder whether there will ever be any DX10.1 games. If there are, though, there are going to be quite a few happy ATI card owners.

T
March 18, 2008 7:54:19 PM

Can't wait for the 4870X2. Then Nvidia will jump past that, but I trust AMD more because they don't fudge benchmarks nowadays, like Nvidia did with the Crysis demo water, or with LinkBoost for the 9600GT on Nvidia boards. Huang loves marketing, when his company doesn't need to spread FUD with benchmarks; Nvidia's cards are good enough.

I just don't think the 9800gx2 is worth it for less than a 30" monitor.
March 18, 2008 8:33:06 PM

teldar said:
Re:AA, SP1-DX10.1
Thanks Cleve, I have seen that on several sites, and that is supposed to be one of the big things with the 4800 series, the RV770: it is supposed to "double" the ROPs, and that is where ATI is falling behind in AA, since its "back end" is insignificant next to the processing power the rest of the chip has. As far as that goes, I would say the 3870 is a vastly superior card in terms of raw processing power; it just was not really made to do graphics the way they are done now. It was designed more as a GPGPU, and its gaming performance shows it.


No, it is not that it was made poorly; it's that both companies take risks in design. From early on, chip devs (including Matrox, 3dfx and others) each take a look at where they think the market and software devs are going. Sometimes they have "inside info" on what a game is going to do or what M$ is creating in DX10, other times not. Going back a few years, the Radeon 9700 Pro took a radical approach to its architecture against a dominant GF4 and won the gamble; it even dominated Nvidia's response in the FX. So Nvidia took a different approach after the GF5 did not pan out: the 6-series came back strong, while ATI hit supply and manufacturing issues with the X800. It was on par with the GF6, but too late. Similar things happened with the GF7 and the initial X1800 generation, but you saw the X1800 diverging quite drastically from the texture-heavy days of old. With the X1900 it was so shader-heavy that it obviously showed their leaning toward the soon-to-be-standard "stream processors". The X1900 was actually a great architecture that in most cases showed dominance over the GF7.

It was at this point, though, that you saw games themselves running better on ATI designs than Nvidia's and vice versa, Oblivion and the Unreal engine being two perfect examples.

Enter today's "generation" (arguably a naming mess, with the GF8 and 9 and the Radeon 2x00 and 3x00 being essentially evolutionary and not totally "new") and you see that ATI went further with its design plan and missed. Perhaps not as badly as with the GFFX, maybe so... but they seem to have missed here, to this point. No worries, it happens. Nvidia hit the ball out of the park, so to speak, with the GF8. ATI's design is not "bad" either, though, which is why I still think the GFFX is the bigger miss. The current-gen Radeon has a lot going for it, not least better dual/tri/quad-card support than current SLI, better HTPC functionality, and better HDMI support (sound). Overall, if you do anything else along with games, then the Radeon is stronger.

It may not be as efficient in some (most?) games of this gen, but it could prove more competitive if software design changes... who knows? Maybe their magic 8-ball was wrong... ;) 

teldar said:
As far as DX10.1 and SP1 go, just because DX10.1 is available doesn't mean any games are going to support it; it just means the interface and command sets are available. ATI needs to stop advertising DX10.1, because most game developers have downplayed its usefulness. It makes me wonder whether there will ever be any DX10.1 games. If there are, though, there are going to be quite a few happy ATI card owners.

T


lol, never underestimate the power and money gained from marketing a "useless" feature to the uneducated masses.

n00b buyer: "which card is better for my budget p.o.s. pc I just bought a week ago to play games?"

best buy leech-clerk: "well, this one has 10.1 and the other one is only 10"

n00b buyer: "wow, so that extra .1 will get me lots of useless marketing stickers and a more colorful box?"

best buy leech: "...and cost you more money!"

n00b buyer: "well then, give me that and I also want the overpriced warranty you were trying to shackle me with earlier. 10.1 must have more to break."


Edit: changed 9800pro to 9700pro... my bad ;) 
March 18, 2008 9:38:29 PM

sojrner said:
lol, never underestimate the power and money gained from marketing a "useless" feature to the uneducated masses.


Yes, and the biggest marketing tool to use on those uneducated, tool-focused masses will likely be the supposedly DX10.1-equipped 3DMark Vantage product.

Likely we'll see something similar to the 3DMark05 situation with SM3.0, where the GF8800 Ultra gets X in Vantage and the HD3600 gets X+n Bungholiomarks, so the HD3600 "must" be better than the GF8800U, of course. [:mousemonkey:1]
March 18, 2008 10:05:57 PM

lol, ya....

funny how much better marketing can be if you have a product that can back your claims... otherwise you need something to change the consumer's "vantage"-point... eh? eh? rofl :lol: 

[:mousemonkey] ...ahh, I kill me sometimes... [:mousemonkey:5]

Seriously though, this kind of competition is good. Even as bad as the FX bomb was, it was good for Nvidia, because it made them do better while ATI hedged on the whole R300 series. Even the R420 (X800) was only evolutionary from that. Now, just like the current GF8/9, that was not "bad" on their part, as the chip was a great performer. Nvidia is still sandbagging (IMO) because they really do have the better design for current games (putting aside the Radeon strong points I mentioned earlier). I just expect that ATI will come out swinging like Nvidia did after the FX.

and to all: I meant 9700pro earlier, not 9800.
March 18, 2008 10:25:46 PM

Well, recently I purchased two 3870X2s, and I am just aggravated that sites post BS benchmarks, that's all... Thanks for the info on the AA, everyone... Not to piss anyone off, but some of the benches Tom's has posted are far lower than what I am getting... I am gaming on a 27-inch monitor at 1920 res, with DDR3 1800 MHz Dominator RAM, a Maximus Extreme mobo and a Q6600 at 3.5 GHz, so something does not look right to me with these benchmarks... Maybe I have a flux capacitor in my computer, lmfao, but idk; I love this site and for the most part I agree.
March 18, 2008 10:39:38 PM

sojrner said:
Seriously though, this kind of competition is good. Even as bad as the FX bomb was, it was good for Nvidia, because it made them do better while ATI hedged on the whole R300 series. Even the R420 (X800) was only evolutionary from that. Now, just like the current GF8/9, that was not "bad" on their part, as the chip was a great performer. Nvidia is still sandbagging (IMO) because they really do have the better design for current games (putting aside the Radeon strong points I mentioned earlier). I just expect that ATI will come out swinging like Nvidia did after the FX.


The only problem, IMO, is that ATi (and many others) feel that they have the technical advantage, just not the speed advantage, and they may get trapped into thinking that programmers and consumers will eventually come around to their way of thinking, and that just making the chip a little bigger will then be enough. I'm not sure that strategy will work for performance at the high end. It's a solid thought for keeping the feature list similar to what it is, but I sure do hope they properly tweak the core ratios to ensure performance, because technical superiority is just checkboxes on a list, not worth much if you're still rendering similar-looking frames at 75% the speed of the competition.

Hopefully both nV and AMD can take this as an evolutionary step which will benefit everyone. I don't expect nV to jump ahead again like they did with the GF6800 or GF8800 to take the technical lead, but we'll see. I suspect this will be like the GF7800 refresh for nV, and the ATi refresh will be, like you say, more like the X800 one.
March 19, 2008 4:24:36 AM

I agree ape, Nv is already taking those evolutionary steps...

...what I was implying about ATI having the tech but not the raw performance was that it was not as bad a bomb as the GFFX. But fear not, I have not drunk their Kool-Aid; it is still a far cry from the in-game performance of the 8800.

HTPC usage notwithstanding, the current ATI cards are disappointing, IMO. I really hope their next one (Radeon 4000 or whatever) is NOT just evolutionary, like Nvidia with the GF9, but rather revolutionary, like the R300 or the GF6/GF8... but I doubt it. ;) 

rock on man.
March 19, 2008 3:48:44 PM

Yep agree completely.

I think the unfortunate thing is that nV is going to stick to base DX10 for the G100/T200, and this doesn't push ATi to consider adding more features, given the lack of adoption. Cube maps are nice and tessellation is handy, but if people don't adopt it, and people like Carmack pan it as an 'artificial' method, it may never see wide application, despite its obvious short-term advantage until VPU power matches the requirements to do this 'non-artificially'.

Unfortunately, when companies don't get rewarded for pushing the yardsticks ahead, they tend to go from leaders to followers, which is just as bad as being complacent about leading. I think nV's experience with the GF6800, with the lack of speedy SM3.0 adoption or even utility, kept them from pushing hard on the next refresh.

Technically the R600 may be more advanced, but AMD didn't need the technological boost above the X1900; however, they were rewarded with the Oblivion performance and feature advantage over the GF7 series, so that was a positive boost. If AMD had the option to increase units or increase features, I think we both feel they would increase units. Also, nV is in a position where the lack of features doesn't hurt them; if anything, bringing out a DX10.1 part might hurt their previous, 'now obsolete' parts, so it's in their interest to resist a feature boost and give developers even more reason to ignore DX10.1 and just code for DX10.0.

Hopefully this perception/guess is wrong, but that's the feeling I get right now.
March 19, 2008 4:09:50 PM

Yup, and in addition to keeping their DX10.0 parts selling, they nullify any perceived advantage that DX10.1 parts have over theirs... thus hurting ATI even more.

very marketing-centric but wise nonetheless on Nv's part for their bottom line.

It really is a bummer that some of these things are being "forced" on devs (just my perception), like the "proper" way to implement AA and the like. Wasn't there an early brouhaha over ATI doing it "correctly" for DX10 and Nvidia not, but devs went with Nvidia's method? And didn't they also allow something like "custom" AA methods for devs?

Yeah, it looks like just another feature that goes unused on their part, but on an initial gloss-over it almost looks like Nvidia went behind the scenes to make sure their "improper" implementation was used by devs, while ATI just assumed the "proper" way would be implemented and that they would get the advantage... much like the FP precision and color-depth issues in DX9 between the Radeon 9-series and the GF5, only Nvidia learned from that and made sure they stayed on top...

...pure speculation sure, but whaddaya think?
March 19, 2008 4:36:54 PM

For me (I use 2 monitors and span my desktop) the ability to do this with CrossfireX got my vote, I got the 3870x2 and am really enjoying the performance.

Both are monster cards, but from what I have read the 3870X2 is slightly ahead on cost vs. performance (not outright performance).
March 19, 2008 5:27:51 PM

At the new price of the Radeon 3870 ($160), you can buy THREE 3870s for the price of a 3870X2... or THREE 3870s AND a mobo for less than a GX2... I'm guessing three 3870s beat the potty out of a GX2.
March 19, 2008 8:13:37 PM

sojrner said:

It really is a bummer that some of these things are being "forced" on devs (just my perception), like the "proper" way to implement AA and the like. Wasn't there an early brouhaha over ATI doing it "correctly" for DX10 and Nvidia not, but devs went with Nvidia's method? And didn't they also allow something like "custom" AA methods for devs?


Well, it is actually the way to do HDR correctly with AA, and even soft shadows (remember our FEAR issue with soft shadows and AA). It doesn't work with standard hardware resolve; it needs to be done in shaders. However, it is somewhat optional, and depends on whether people really care if it's pixel-correct or just looks OK. Most people were fine with the X1900's HDR+AA, so the case that it's a very important issue just won't stick, IMO, but it is where AA is headed for all DX10 titles eventually, and it's not like the GF8s can't do it; they just lose the benefit of their dedicated AA hardware resolve. To me that difference is like the difference between GF8 AF and HD AF: it's there, but so small as to not really matter.

And it is a case where ATi basically crippled their DX10 cards' DX9 AA performance because they expected shader resolve to be a requirement and the way things would go (and the cards would still be fast enough in software mode to play DX9 titles). The problem is they were wrong about the speed of DX10 adoption. First Vista was delayed, then the DX10 titles, so by the time the R600 arrived, developers weren't as deep into DX10 development and adoption as ATi thought, and with the partial adoption and the removal of some requirements, they were left in a situation where leaving out hardware resolve was now a major issue.
March 19, 2008 9:10:06 PM

Well, I wonder how quad CrossFire vs quad SLI will pan out...
To me CrossFire scaled better; we will see.
April 28, 2008 9:20:12 PM

Actually, Assassin's Creed supports DX10.1, yielding a 20% performance boost. The graphics are still the same as DX10, just faster, and I'm glad I'm using a Radeon 3870 and Vista :) 

source:
http://www.rage3d.com/articles/assassinscreed/

April 29, 2008 5:37:36 AM

honor said:
http://www.bit-tech.net/hardware/2008/03/18/xfx_geforce...

If you look at this review you will see what I'm talking about with AA... I have Crysis and CoD4, and with AA turned off (I don't even see a difference with it on) I'm getting crazy FPS with quad CrossFire.


Just like you said, and also someone earlier on: saying you can't notice any difference with AA on is hard to believe, in my opinion. I can easily see the difference with it on, and it looks much better. So if the GX2 runs better with AA, then it is the faster card. If you're spending $400+ on a card, it should be able to handle AA. AA smooths out the jagged lines, giving a much crisper picture. Go stand right in front of a big TV and you will see jagged edges to the extreme, lol. AA smooths that out the more you turn it up. I usually go for 4x, because 8x is excessive.