3870X2 Vs 9800 GX2

honor

Distinguished
Jun 22, 2005
175
0
18,680
Ok, well, many reviews have been posted about these 2 cards, and I find it funny how the variables change from game to game to favor one card. For example, we all know ATI has issues with AA, for a reason I don't know. (If anyone knows, please post.) Also, many reviews switch AA settings off and on to favor a card. For instance, in Half-Life 2 the 3870X2 beats the GX2 with AA off, then drops with AA turned on, so the review decided to turn AA on to show how powerful the GX2 is compared to the 3870X2... If there is a site out there that does apples-to-apples comparisons, please tell me where, because all I see is Nvidia fan sites kissing Nvidia's ass.

Please do not post "oh, he is an ATI fanboy"; as I said before, I own ATI and Nvidia cards, and I'm just tired of reviews screwing people over.
 

njalterio

Distinguished
Jan 14, 2008
780
0
18,990
As far as I know, the 3870X2 and the 9800GX2 perform very similarly. It is my understanding that Nvidia has better driver support, which would explain why the anti-aliasing feature is much better supported on Nvidia cards.

I think I would like to clarify some things about anti-aliasing, though. Most people, I think, have no clue what anti-aliasing (AA) actually is, but for some reason like to toss the term around when discussing benchmarks.

Aliasing is a phenomenon that causes different continuous signals to become indistinguishable when sampled. For example, if you have extremely poor vision, then everything 20 ft in front of you may look exactly the same, like one big blur. This means you have very low visual acuity and are unable to discern objects as what they really are. You would have aliased vision.

The anti-aliasing feature applies an algorithm to correct objects that may look indiscernible; in effect, it is applying a higher level of detail. I suppose you could think of glasses as a type of anti-aliasing.

In the gaming world, anti-aliasing helps us see a much higher level of detail. 90% of the time, I will not notice this detail. I usually leave anti-aliasing around 2x (currently I am using a Sapphire 3870).
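To put the idea in concrete terms: one simple flavor of AA, supersampling, just takes several color samples inside each pixel and averages them, so a hard stair-stepped edge picks up in-between shades. A toy Python sketch of that averaging (invented scene and sample counts, nothing like what the driver actually does):

```python
# Toy supersampling sketch: the jagged edge of a diagonal line is
# softened by averaging a grid of sub-pixel samples per pixel.

def coverage(x, y):
    """Ideal scene: 1.0 (white) below the line y = x, else 0.0 (black)."""
    return 1.0 if y > x else 0.0

def render(width, height, samples_per_axis):
    """Render the scene, taking samples_per_axis^2 samples per pixel."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            n = samples_per_axis
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    # Sample at evenly spaced points inside the pixel.
                    total += coverage(px + (sx + 0.5) / n,
                                      py + (sy + 0.5) / n)
            row.append(total / (n * n))   # average = final pixel shade
        image.append(row)
    return image

aliased = render(4, 4, 1)   # 1 sample per pixel: hard 0/1 edge
smoothed = render(4, 4, 2)  # 2x2 supersampling: edge pixels go gray
```

With one sample per pixel, every pixel snaps to pure black or white; with a 2x2 sample grid, the pixels straddling the diagonal come out partway gray, which is the edge smoothing you see in-game.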

Benchmarking programs such as 3DMark06 have their purpose, but mean very little. A certain number of marks does not really correlate meaningfully with how much we enjoy the graphics of a game. So where does this leave us with the 3870X2 vs the 9800GX2?

Assuming we are using a resolution of around 1200 x 1000 or so (like the vast majority of us):

Well, if you take anti-aliasing out of the picture (because it really does not make a difference during actual gameplay), you will see that the 3870X2 may perform slightly better, although mostly the two cards perform the same. The real question is: how good are my graphics for a certain price? It looks like the 3870X2 is going to be somewhere between $150 and $200 cheaper. Performance-wise they are the same. Go with the 3870X2.
 

I think Tom's reviews are more intuitive, but sometimes lack variety. The AnandTech review had more to choose from; they had a 9600GT SLI setup that showed it runs with the 9800GX2 in a lot of areas. I don't think you can base anything off just one review. You have to read a few and build your own conclusion.

My conclusion is that an SLI setup is cheaper and would outperform the GX2, and it's a waste unless you are stuck with a non-SLI board.
 

San Pedro

Distinguished
Jul 16, 2007
1,286
12
19,295
After reading some of the AnandTech review, I can say one thing: I don't like their test setup. Using FRAPS with 20-second demos? I just don't think 20 seconds is enough to get an accurate feel for how well a game will play, especially for the 3870X2, which can get some low initial FPS when first loading scenes into memory. All the reviews probably use extremely short demos, though, even though no one plays a game for twenty seconds.
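To illustrate why a 20-second capture can mislead: average FPS is just total frames over total seconds, so a handful of slow loading frames weigh far more heavily in a short run than in a long one. A toy sketch with made-up frame times (not FRAPS data):

```python
# Toy sketch (invented numbers): a few slow "warm-up" frames while a
# scene loads drag down a short benchmark's average far more than a
# long one's.

def avg_fps(frame_times_ms):
    """Average FPS over a capture: total frames / total seconds."""
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

# Hypothetical capture: 5 slow frames while textures load (100 ms each),
# then steady-state frames at 16 ms each (62.5 FPS).
warmup = [100.0] * 5
steady = [16.0] * 1200

short_run = warmup + steady[:295]   # ~5 s capture, warm-up included
long_run = warmup + steady          # ~20 s capture, same warm-up

print(round(avg_fps(short_run), 1))
print(round(avg_fps(long_run), 1))
```

The same five slow frames cost the short capture several FPS off its average while barely denting the long one.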
 

blackened144

Distinguished
Aug 17, 2006
1,051
0
19,280


Tom's does a nice apples-to-apples comparison. They test 8 games at 4 different settings, and the last page of the review shows the average frames at all 4 settings for each card; there's no "variables change from every game to favor a card". You can see that, averaging the frame rates for all 8 games, the GX2 beats it by 32% with no AA and 57% with AA at the lower res. The GX2 beats it by 37% with AA off and 40% with AA at the highest res. Reviewers test the games with AA turned on not to show AMD in a bad light, but because no one buys a $600 card to play at 640x480 with no AA or AF. Just because AMD can't handle the AA doesn't mean that reviewers are shilling for Nvidia when they turn AA on.


http://images.tomshardware.com/2008/03/18/nvidia_9800_gx2/avg1.png
http://images.tomshardware.com/2008/03/18/nvidia_9800_gx2/avg2.png
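For what it's worth, the roll-up on that last page is plain arithmetic: average the per-game frame rates at a given setting, then express one card's lead over the other as a percentage. A sketch with invented FPS numbers (not the review's actual data):

```python
# Sketch with invented FPS figures (not Tom's actual data): averaging
# per-game results and expressing one card's lead as a percentage.

def percent_lead(card_a_fps, card_b_fps):
    """How far card A's average FPS is above card B's, in percent."""
    avg_a = sum(card_a_fps) / len(card_a_fps)
    avg_b = sum(card_b_fps) / len(card_b_fps)
    return (avg_a / avg_b - 1.0) * 100.0

# Hypothetical per-game averages over 8 games at one setting:
gx2 = [60, 80, 45, 90, 70, 55, 100, 65]
x2 = [45, 60, 40, 70, 50, 45, 75, 50]

print(round(percent_lead(gx2, x2), 1))
```

Plug in the review's own per-game numbers at each of the 4 settings and you get percentage figures of the same kind quoted above.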

EDIT: I also want to note that I don't really like the GX2. It's about on par with the performance I expected, but hearing all the ATI fanboys talk about how the X2 would still beat up on it was driving me crazy. I like the fact that it does beat up on the X2 and takes over the "fastest single card on the planet" title, but it's not worth the money.
 

jerseygamer

Distinguished
Nov 9, 2007
334
0
18,780
Once both cards are ironed out they should be pretty close. I don't doubt Nvidia will pass ATI in performance slightly. It's expected. The only real issues to think about here are the cost of the card and compatibility. I still lean to ATI for DX10.1 support and better overall CrossFire support and motherboard performance on CrossFire systems. If you change your GPU every year, it really doesn't matter; by this time next year you will be using the latest hardware. For guys like me who like to keep their hardware for 2 years+ and do the minimal amount of upgrading, ATI is the clear pick.
 

I changed my cards after two years, 7900GTs for 8800GTs, Nvidia being the clear pick :kaola: . So what's all this about DX10.1? What games will utilize it? When will it be released to the masses? And don't say "soon", because that's getting old now.
 

tjoepie

Distinguished
Jan 9, 2008
206
0
18,680
I think this DX10.1 will not be an issue until a lot of cards can use it.
No game will require it unless most of the cards sold can use it.
 

cleeve

Illustrious


Not really. The 9800GX2 is a notably stronger card, but the 3870 X2 is cheaper...

Nvidia and ATI both have great driver support; ATI's slow AA in this generation is due to a hardware limitation.
 

weskurtz81

Distinguished
Apr 13, 2006
1,557
0
19,780


So, what you are saying is that Vista supports DX10.1? In case you didn't know, SP1 released today.
 

teldar

Distinguished
May 8, 2006
58
0
18,630
Re:AA, SP1-DX10.1
Thanks Cleeve, I have seen that on several sites, and that is supposed to be one of the big things with the 4800(?) series, the RV770: it is supposed to double(?) the ROPs. That is where ATI is falling behind in the AA; its "back end" is so insignificant compared to the processing power the rest of the chip has. As far as that goes, I would say the 3870 is a vastly superior card in terms of processing power; it's just that it was not made terribly well to do graphics the way they are done now. It was designed more as a GPGPU, and its gaming performance shows it.

As far as DX10.1 and SP1 go, just because DX10.1 is available doesn't mean that any of the games are going to support it. It just means that the interface and command sets are available. ATI needs to stop advertising DX10.1, because most game developers have downplayed its usefulness. Makes me wonder if there are actually going to be any DX10.1 games, ever. However, if there are, there are going to be quite a few happy ATI card owners.

T
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
Can't wait for the 4870X2. Then Nvidia will jump that, but I trust AMD more because they don't fudge benchmarks nowadays, like Nvidia with the Crysis demo water or the LinkBoost for the 9600GT on Nvidia boards. Huang loves marketing, even though his company doesn't need to do FUD with benchmarks. Nvidia's cards are good enough.

I just don't think the 9800gx2 is worth it for less than a 30" monitor.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790


No, it is not that it was made poorly... just that both companies take risks in design. From early on, chip devs (including Matrox, 3dfx, and others) each take a look at where they think the market and software devs are going. Sometimes they have "inside info" on what a game is going to do or what M$ is creating in DX10, other times not. Going back a few years, the Rad9700Pro took a radical approach to the architecture against a dominant GF4 and won the gamble... even dominated Nv's response in the FX. So Nv took a different approach after the GF5 did not pan out; the 6 series came back strong, and ATI hit supply/manufacturing issues with the X800. It was on par with the GF6, but too late. Similar things happened in the GF7 and initial X1800 gen, but you saw the X1800 diverging quite drastically from the texture-heavy days of old. With the 1900 it was so shader-heavy that it obviously showed their leaning towards the soon-to-be-standard "stream procs". The 1900 was actually a great architecture that in most cases showed dominance over the GF7.

It was at this point, though, that you saw games themselves running better on ATI over Nv designs and vice versa. Oblivion and the Unreal engine are two perfect examples.

Enter today's "generation" (arguably a naming mess, with GF8 & 9 and Radeon 2x00 and 3x00 being essentially evolutionary and not totally "new") and you see that ATI went further with its design plan and missed. Perhaps not as much as the GF FX, maybe so... but they seem to have missed here to this point. No worries, it happens. Nv hit the ball out of the park, so to speak, with the GF8. ATI's design is not "bad" either, though, which is why I still think the GF FX is the bigger miss. The current-gen Radeon has a lot going for it, not the least of which is better dual/tri/quad card support than current SLI. Better HTPC functionality. Better HDMI support (sound). Overall, if you do anything else along with games, then the Radeon is stronger.

It may not be as efficient on some (most?) games of this gen, but it could prove more competitive if software design changes... who knows? Maybe their magic 8-ball was wrong... ;)



lol, never underestimate the power and money gained from marketing a "useless" feature to the uneducated masses.

n00b buyer: "which card is better for my budget p.o.s. pc I just bought a week ago to play games?"

best buy leech-clerk: "well, this one has 10.1 and the other one is only 10"

n00b buyer: "wow, so that extra .1 will get me lots of useless marketing stickers and a more colorful box?"

best buy leech: "...and cost you more money!"

n00b buyer: "well then, give me that and I also want the overpriced warranty you were trying to shackle me with earlier. 10.1 must have more to break."


Edit: changed 9800pro to 9700pro... my bad ;)
 


Yes, and the biggest marketing tool to use on those uneducated, tool-focused masses will likely be the supposedly DX10.1-equipped 3DMark Vantage product.

Likely we'll see something similar to the 3DMark05 situation with SM3.0, where the GF8800 Ultra gets X in Vantage and the HD3600 gets X+n Bungholiomarks, thus the HD3600 must be better than the GF8800U, of course. [:mousemonkey:1]
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
lol, ya....

funny how much better marketing can be if you have a product that can back your claims... otherwise you need something to change the consumer's "vantage"-point... eh? eh? rofl :lol:

[:mousemonkey] ...ahh, I kill me sometimes... [:mousemonkey:5]

Seriously though, this kind of competition is good. Even as bad as the FX bomb was, it was good for Nv because it made them do better while ATI hedged on the whole R300 series. Even the R420 (X800) was only evolutionary from that. Now, just like the current GF8/9, that was not "bad" on their part, as the chip was a great performer. Nv is still sandbagging (IMO) because they really have the better design with current games (putting aside the strong points of the Radeon I mentioned earlier). I just expect that ATI will come out swinging like Nv did after the FX.

and to all: I meant 9700pro earlier, not 9800.
 

honor

Distinguished
Jun 22, 2005
175
0
18,680
Well, recently I have purchased 2 3870X2s, and I am just aggravated that sites put up BS benchmarks, that's all... Thanks for the info on the AA, everyone... Not to piss anyone off, but some of the benches Tom's has posted are far lower than what I am getting... I am gaming on a 27-in monitor at 1920 res, with DDR3 1800 MHz Dominator RAM, a Maximus Extreme mobo, and a Q6600 at 3.5 GHz, so something does not look right to me with these benchmarks... Maybe I have a Flux Capacitor in my computer, lmfao, but idk. I love this site, and for the most part I agree.
 


The only problem IMO is that ATI (and many others) feel that they have the technical advantage, just not the speed advantage, and they may get trapped into thinking that eventually programmers and consumers will come around to their way of thinking, and then just making it a little bigger will help. I'm not sure that strategy will work for performance at the high end. It's a solid thought for keeping the feature list similar to what it is, but I sure do hope they properly tweak the core ratios to ensure performance, because technical superiority is just like checkboxes on a list: not worth much if you're still rendering similar-looking frames at 75% the speed of the competition.

Hopefully both nV and AMD can take this as an evolutionary step which will benefit everyone. I don't expect nV to jump ahead again like they did with the GF6800 or GF8800 to take the technical lead, but we'll see. I suspect this will be like the GF7800 refresh for nV, and the ATI refresh will be, like you say, more like the X800 one.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
I agree ape, Nv is already taking those evolutionary steps...

...what I was implying about ati having the tech but not the raw performance was to say that it was not as bad a bomb as the gfFX. But fear not, I have not drank their kool-aid, it is still a far cry from the performance (in games) of the 8800.

HTPC usage notwithstanding, the current ATI cards are disappointing IMO. I really hope their next one (Rad4000 or whatever) is NOT just evolutionary like Nv is with the GF9, but rather revolutionary like the R300 or the GF6/8... but I doubt it. ;)

rock on man.
 
Yep agree completely.

I think the unfortunate thing is nV is going to stick to base DX10 for the G100/T200, and this doesn't push ATI to consider adding more features, because of the lack of adoption. Cube maps are nice and tessellation is handy, but if people don't adopt it, and people like Carmack pan it as an 'artificial' method, it may never see wide application, despite its obvious short-term advantage until VPU power matches the requirements to do this 'non-artificially'.

Unfortunately, when companies don't get rewarded for pushing the yardsticks ahead, they tend to go from leaders to followers. Which is just as bad as being complacent about leading. I think nV's experience with the GF6800, with the lack of speedy SM3.0 adoption or even utility, kept them from pushing hard on the next refresh.

Technically the R600 may be more advanced, but AMD didn't need the technological boost above the X1900; however, they were rewarded with the Oblivion performance and features advantage over the GF7 series, so that was a positive boost. I think if AMD had the option to increase units or increase features, we both feel they would increase units. Also, nV is in the position where the lack of features doesn't hurt them, and if anything, bringing out a DX10.1 part might hurt their previous, 'now obsolete' parts, so it's in their interest to resist a feature boost and give developers even more reason to ignore DX10.1 versus just coding for DX10.0.

Hopefully this perception/guess is wrong, but that's the feeling I get right now.