Geforce 9800gx2 review

Last response: in Graphics & Displays
March 13, 2008 1:28:44 PM

I do expect the 9800gx2 will "pwn" (amusing phrase, does it mean anything?) the 3870x2. Nvidia's known for dodgy drivers and they might boost the PCIe x16 bus behind the scenes to get a few extra fps, they did it with their 9600gt in SLI. If I were buying Nvidia now, I'd go 9800gtx over 9800gx2.

At any rate, I can't connect to that second rate site, so I'll wait for Tom's or Anandtech to actually have a review. Then, we'll have complete benchmarks. Xbit Labs is another site I'll trust with reviews. H is one that I'll ignore because, while they do "real world game tests" they cherry pick drivers for their favored cards, while testing on too few games.

Yes, the 9800gx2 will be a bit faster than the 3870x2, but at what cost? Is it $599 like the Inquirer said? If what I read recently is true, then Nvidia will come out with the G100/200/Whatever once AMD comes out with R770. So, anyone who buys a 9800gx2 might be in for a shock, unless it works in Triple SLI the way a 3870x2 will work in CrossfireX.

I expect both Nvidia and ATI to have their 3 or more GPU drivers worked out before too long and to support more games later this year. Whether Huang likes it or not, multi-GPU cards and multi-card setups will "pwn" massive single GPU solutions.

March 13, 2008 1:40:00 PM

the link to the sites on nordic hardware
March 13, 2008 1:41:44 PM

darthvaderkenneth said:
the link to the sites on nordic hardware


Why didn't you say so the first time? Why type in a link to a site all in Chinese? I'll check it out there. I trust Nordic Hardware too.

Here's their news report, but it's still based on testing at a site we know nothing about, not even the drivers used. It is amusing, though, that you need to enable SLI, whereas that's not needed for the 3870x2.

Here's the article:

http://www.nordichardware.com/index.php?news=1&action=m...

Quote:

Even though it's still a week or so to go before GeForce 9800GX2 will be officially unveiled, Chinese website PCOnline has already posted the first review. Do keep in mind that the drivers are premature and buggy; for example, the card refuses to work in 3D on X38, even though it is supposed to work. However, if you want to run Quad-SLI you will need an NVIDIA chipset. The overall performance is, as expected, just breathtaking at high resolutions. The Radeon HD 3870X2 gets whipped in every benchmark, brutally with AA/AF activated.

Amusingly, you need to activate SLI in the control panel for the card to work properly, whether this is also a prematurity or something people will have to get used to, remains to be seen.

Alas, water cooling is the most extreme cooling you will ever see with these cards due to the design, but that will probably still be enough to reclaim the 3DMark 03 crown. The other benchmarks will probably need two or three 9800GTX under some LN².

Stories are also going around that Quad-SLI will not be available when the GeForce 9800GX2 hits the market on March 18. The reason is supposed to be the drivers and poor scaling of more than two GPUs.


Nvidia fans, get 9800gtx's and stay away from this card. Even when Nvidia gets their Triple SLI (Quad SLI? That's a new one) drivers ready, I doubt that the 9800gx2 will bring anything that G100 plus a 9800gtx won't. The 9800gx2 is a badly designed two PCB monster that will roar for a few scenes before dying in the crossfire of a 4870x2 vs. a G100 (or 200 or whatever Huang wants to call it).

I want a 4870x2 already!
March 13, 2008 2:01:29 PM

yipsl said:

Nvidia fans, get 9800gtx's and stay away from this card. Even when Nvidia gets their Triple SLI (Quad SLI? That's a new one) drivers ready, I doubt that the 9800gx2 will bring anything that G100 plus a 9800gtx won't. The 9800gx2 is a badly designed two PCB monster that will roar for a few scenes before dying in the crossfire of a 4870x2 vs. a G100 (or 200 or whatever Huang wants to call it).

I want a 4870x2 already!



I have an 8800GTS 512 and I'm thinking of taking advantage of the step-up program so that I can get this card.
I found your statement curious, and I'd like to know how you arrived at this conclusion considering how little is really known about the card.
March 13, 2008 2:24:01 PM

Quite a lot is known about the card, just not its in-game performance with mature drivers. I admit it will most likely beat a 3870x2 in most, if not all, games (especially considering Nvidia's The Way It's Meant to be Played program). I simply think that Nvidia cheats too much with drivers and image quality to get frame rates up in benchmarks. Then there's the issue of the 9600gt's PCIe x16 bus being boosted in the background on Nvidia boards, where even Nvidia reps aren't aware of the tweak.

The 9800gx2 is two PCB's. It doesn't look like it will cool very well. Nvidia could have done much better with this card. I really think a single 9800gtx is a better choice for a brand new purchase. If you want to step up then that's more understandable than buying it outright.
March 13, 2008 2:28:32 PM

yipsl said:
Quite a lot is known about the card, just not its in-game performance with mature drivers. I admit it will beat a 3870x2 in most, if not all, games (especially considering Nvidia's The Way It's Meant to be Played program).

It's two PCB's. It doesn't look like it will cool very well. Nvidia could have done much better with this card. I really think a single 9800gtx is a better choice. If you want to step up then that's more understandable than buying it outright.



I appreciate the response. Thanks. I'm really torn about this though. My step-up expires at the end of this month, so I will probably not be able to step up to the 9800GTX, which certainly would be ideal.

My hope is in line with your skepticism, that driver tweaks could make this a more viable card. Perhaps they could have done better, sure, but comparatively, it seems that it will be a very capable card that I should not have to replace for quite some time. And in April, I will be upgrading to a new mobo, CPU, and DDR2. All that together should give me a potent machine.

Such a money-sucking hobby!
Doesn't help that my other hobbies are dirt bikes and my WRX. :pt1cable: 
March 13, 2008 2:33:42 PM

yipsl said:
Then there's the issue of the 9600gt's PCIe x16 bus being boosted in the background on Nvidia boards, where even Nvidia reps aren't aware of the tweak.



This, I had not heard about. More detail pls! :) 
March 13, 2008 2:34:20 PM

rallyimprezive said:
I appreciate the response. Thanks. I'm really torn about this though. My step-up expires at the end of this month, so I will probably not be able to step up to the 9800GTX, which certainly would be ideal.

My hope is in line with your skepticism, that driver tweaks could make this a more viable card. Perhaps they could have done better, sure, but comparatively, it seems that it will be a very capable card that I should not have to replace for quite some time. And in April, I will be upgrading to a new mobo, CPU, and DDR2. All that together should give me a potent machine.

Such a money-sucking hobby!
Doesn't help that my other hobbies are dirt bikes and my WRX. :pt1cable: 


In your case, I'd get it. You have nothing to lose but the difference in cost between what you paid and what you're willing to pay for the extra performance. I don't like being mistrustful of Nvidia, but I am nowadays. They fixed the blurring on the 7xxx series with the 8xxx series, but they fudged the Crysis demo drivers and did a behind-the-scenes boost on the 9600gt; all of that seems aimed at benchmarks, to show Nvidia cards beating ATI cards.

Here's a post at Nordic Hardware's forums with links to the two articles on the 9600gt boost:

http://www.nordichardware.com/forum/viewtopic.php?topic...

Yes, I'll admit to an ATI preference, but I didn't have too many issues when I had a 7600gs with an Nvidia 405 chipset board. My modder wife felt it was blurrier, but I'm not the artist in the family and I didn't see it as much. I almost stayed with Nvidia once the 8800gt came out but I decided I wanted to go back to a total ATI chipset and GPU platform.

Okay, here are the original links, just in case that thread disappears:

http://www.techpowerup.com/reviews/NVIDIA/Shady_9600_GT...

Quote:

The automatic increase of 25 MHz on the PCI-Express bus frequency yields an increase of 25% or 162.5 MHz over the stock clock (assuming a 650 MHz clock board design). With a final clock of 812.5 MHz you can bet this card will perform much better, when used by an unsuspecting user, on an NVIDIA chipset motherboard with LinkBoost.

Also it will have an extra performance advantage when reviewers compare the non-overclocked GeForce 9600 GT against any other card which is commonly done in most reviews. Unfortunately such a massive overclock can often cause instability of the graphics card, maybe so much that the system won't POST at all.

The idea of implementing a mechanism that directly increases the GPU frequency (and performance) based on the PCI-Express base frequency is a great novelty. It has the potential to offer hassle-free performance improvements to a large number of less experienced users. Being able to adjust this frequency in most modern BIOSes is a big plus because it will be applied without any software installation requirement in Windows (or any other operating system - there is your Linux overclocking).

The execution of this from NVIDIA's side is less than poor in my opinion. They did not communicate this new feature to reviewers at all, nor invented a marketing name for it and branded it as a feature that their competitors do not have.
Even when asked directly we got a bogus reply: "the crystal frequency is...". No, there is no 25 MHz crystal and its frequency is not fixed either. I'm not accusing the sender of the E-Mail of course, I just believe he didn't know, maybe this fact wasn't communicated to the marketing team at all. However, if you would get such an inquiry wouldn't you look into this further if it was your job to properly promote a product?

More room for speculation can be found in the driver. Why does it always return the "normal" frequency and not the real one? Maybe the driver developers didn't know about this either, who knows. I find it hard to believe that the internal communication lacks that much in a company which constantly delivers excellent, high-performing products.

It is certainly nice for NVIDIA to see their GeForce 9600 GT reviewed on NVIDIA chipsets with LinkBoost enabled where their card leaves the competition behind in the dust (even more). Also it could send a message to customers that the card performs considerably better when used on an NVIDIA chipset? Actually this is not the case, the PCI-Express frequency can be adjusted on most motherboards, you will see these gains independent of Intel/AMD CPU architecture or Intel/NVIDIA/AMD/VIA chipset.

While nothing is wrong with having a better product (and I do believe this is a feature that makes the product better), being transparent about such changes should be the proper way to do it in this industry.
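The clock math in the TechPowerUp quote above can be double-checked with a quick sketch (assuming, as the article does, a 100 MHz stock PCIe clock and a 650 MHz stock GPU core clock; the linear scaling is their reported behavior, not something verified here):

```python
# Sketch of the reported 9600 GT behavior: the GPU core clock
# scales linearly with the PCI-Express bus clock.
# Assumed stock values from the quote: 100 MHz PCIe, 650 MHz core.

def boosted_gpu_clock(gpu_stock_mhz: float,
                      pcie_actual_mhz: float,
                      pcie_stock_mhz: float = 100.0) -> float:
    """Effective GPU clock if the core scales with the PCIe bus clock."""
    return gpu_stock_mhz * (pcie_actual_mhz / pcie_stock_mhz)

# A LinkBoost-style +25 MHz on the PCIe bus is a 25% bump:
clock = boosted_gpu_clock(650.0, pcie_actual_mhz=125.0)
print(clock)  # 812.5 MHz, i.e. +162.5 MHz over stock, matching the article
```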


http://www.nordichardware.com/index.php?news=1&action=m...

The following site says that Nvidia boards no longer have LinkBoost, but also says that the issue will be encountered with more 9 series cards, and that only Nvidia can answer what's going on. Don't hold your breath on that. Benchmarking tweaks have been done by both ATI and Nvidia, but more by Nvidia in the past few years. The Crysis demo drivers were notorious for not displaying the water the way it was meant to be seen (and many gamers would accept that for better framerates, but it needs to be transparent so benchmarkers know what's going on and can compensate for any tweaks when comparing apples to apples).

Quote:

When we increased the PCIe clock, the MT score did not increase. But when we increased the GPU clock, performance rose the same way it did on the 9600GT. The 8800GS and 8400GS behave the same as the 8800GTS: when we increased their PCIe clock, the MT score did not increase.

So, other cards' performance is not affected by the PCIe clock. For now, only the 9600GT has this issue. But will it be the only one? No. We ran a test with a 9500GT ES and the issue is the same: the card gains performance when we increase the PCIe clock. Maybe it is a new feature NVIDIA just did not want us to know about.


http://en.expreview.com/2008/03/11/follow-up-to-nvidias...
March 13, 2008 2:42:54 PM

rallyimprezive said:
I appreciate the response. Thanks. Im really torn about this though. My Step-up expires at the end of this month, so I will probably not be able to step up to the 9800GTX, which certainly would be ideal.

My hope is in line with your skepticism, that driver tweaks could make this a more viable card. Perhaps they could have done better, sure, but compartively, it seems that it will be a very capable card that I should not have to replace for quite some time. And in April, I will be upgrading to a new mobo CPU and DDR2. All that together should give me a potent machine.

Such a money sucking hobby!
Doesnt help that my other hobbies are dirt bikes and my WRX. :pt1cable: 


I didn't read what you had to say until I saw the last part.... I have a WRX myself. An '02 wagon, to be exact.

Are you on NASIOC?
March 13, 2008 3:26:56 PM

spaztic7 said:
I didn't read what you had to say until I saw the last part.... I have a WRX myself. An '02 wagon, to be exact.

Are you on NASIOC?


Yep! Been on NABISCO under the same SN since April of 2001. :) 

I don't go there too often anymore though. The value of information has decreased, and the amount of trolls, flamers, and flat-out idiots has gone up a lot.

I've got an '05 WRX. :) 

Love 'em!
March 13, 2008 3:49:42 PM

I'm really beginning to question whether NordicHardware even reads other reviews, or just looks at 3DMark scores and that's it. How do they come to the following statement:

yipsl said:

Here's the article:

http://www.nordichardware.com/index.php?news=1&action=m...

Quote:

...The overall performance is, as expected, just breathtaking at high resolutions. The Radeon HD 3870X2 gets whipped in every benchmark, brutally with AA/AF activated.


When in 4 games the GX2 gets 0 fps with AA enabled, and in Crysis it's slower than the X2?
http://www.pconline.com.cn/diy/graphics/reviews/0803/12...


IMO the only brutal whipping is the infinite-percent increase in performance that any number above 0 represents, and then it gets beaten in the major game that matters for these uber rigs. Personally I don't care about AA in 3DMark; it doesn't make it a more compelling benchmark. [:thegreatgrapeape:5]

Seriously, Nordic needs to look harder; this isn't the first time they've come out with these oversimplistic and overall wrong synopses.

We were discussing the AA and compatibility issue in this other thread, which has a few of the graphics and a link to an English version of the review; see if you come to the same conclusion as Nordic did :heink:  ;
http://www.tomshardware.com/forum/249328-33-full-review...
March 13, 2008 4:17:12 PM

I agree, TGGA. I was also confused when I read "brutally with AA/AF activated".
March 13, 2008 11:41:43 PM

TheGreatGrapeApe said:
I'm really beginning to question whether NordicHardware even reads other reviews, or just looks at 3DMark scores and that's it. How do they come to the following statement:



When in 4 games the GX2 gets 0 fps with AA enabled, and in Crysis it's slower than the X2?
http://www.pconline.com.cn/diy/graphics/reviews/0803/12...


IMO the only brutal whipping is the infinite-percent increase in performance that any number above 0 represents, and then it gets beaten in the major game that matters for these uber rigs. Personally I don't care about AA in 3DMark; it doesn't make it a more compelling benchmark. [:thegreatgrapeape:5]

Seriously, Nordic needs to look harder; this isn't the first time they've come out with these oversimplistic and overall wrong synopses.

We were discussing the AA and compatibility issue in this other thread, which has a few of the graphics and a link to an English version of the review; see if you come to the same conclusion as Nordic did :heink:  ;
http://www.tomshardware.com/forum/249328-33-full-review...


I don't know why I couldn't get the images to load earlier this morning when I first posted. All I got was the Chinese text, and that's no help! The 9800gx2 does beat the 3870x2 in Crysis here:



But I'm not sure that it really matters all that much. I'll check out the English-language version of the review for more details. As for Nordic's language, it's the way it's meant to be spun :lol:  It happens more when Nvidia "pwns" ATI than when ATI beats Nvidia, so I see it as linked to whatever gene makes people think that Nvidia's faster overall and at every price point.

Sort of like the thread where a guy wanted to replace his 690G onboard 4-pixel-pipeline X1250 (essentially an X700 SE) with a PCI FX 5200 just to get the memory back, when he could have bought more RAM or spent $50 on a 3450 PCIe x16 card. Even as far as enthusiast rigs go, Nvidia doesn't win all the time, because they have genuine competition in ATI. Plus, I do think the Crysis water issue and the recent 9600gt PCIe LinkBoost issue show they fudge to get better performance, without the transparency reviewers need to be objective.

Let's hope, for Nvidia fans' sake, that the AA issue is driver related. Then we can hear them say "it will improve with drivers!"



Anyways, thanks for the link to the thread and article in English. That clears things up quite a bit. Since reviews of the 3870x2 said it really performed at 1920 x 1200 and above, I'm sure Nvidia engineered this card just to beat the 3870x2 at those resolutions; since Huang doesn't like multi-GPU cards. I still expect to see the 9800gtx do better in SLI than a single 9800gx2.
March 14, 2008 1:27:02 AM

rallyimprezive said:
Yep! Been on NABISCO under the same SN since April of 2001. :) 

I don't go there too often anymore though. The value of information has decreased, and the amount of trolls, flamers, and flat-out idiots has gone up a lot.

I've got an '05 WRX. :) 

Love 'em!



NICE CAR!

You should try to make it to the next 48 hrs. http://48hrs.info

I don't go there much either; I get my info from my sister and husband. We hang out with the weazel from time to time. I don't know if you know of him.


Anywho, take care and see you around.
March 18, 2008 8:00:55 PM

I would buy two 8800GTs and SLI them any day over buying this GX2. $400 vs. $600? If the GT pair even came close to matching the GX2 in these benchmarks, the GX2 would be a waste of cash IMO (and it does come close, aplenty).

http://www.overclockersclub.com/reviews/xfx_9800gx2/