The 9800GX2 pwned the HD3870X2: http://www.pconline.com.cn/diy/graphics/reviews/0803/1241871_13.html
I'm not sure the performance is worth the price though.
Even though it's still a week or so before the GeForce 9800GX2 is officially unveiled, Chinese website PCOnline has already posted the first review. Do keep in mind that the drivers are premature and buggy; for example, the card refuses to work in 3D on X38, even though it is supposed to. However, if you want to run Quad-SLI you will need an NVIDIA chipset. The overall performance is, as expected, just breathtaking at high resolutions. The Radeon HD 3870X2 gets whipped in every benchmark, brutally so with AA/AF activated.
Amusingly, you need to activate SLI in the control panel for the card to work properly; whether this is also down to the immature drivers or something people will have to get used to remains to be seen.
Alas, due to the design, water cooling is the most extreme cooling you will ever see on these cards, but that will probably still be enough to reclaim the 3DMark 03 crown. The other benchmarks will probably need two or three 9800GTX cards under some LN².
Stories are also going around that Quad-SLI will not be available when the GeForce 9800GX2 hits the market on March 18. The reason is supposed to be the drivers and poor scaling beyond two GPUs.
The automatic increase of 25 MHz on the PCI-Express bus frequency (from the standard 100 MHz to 125 MHz) is a 25% increase, which translates to 162.5 MHz over the stock clock (assuming a 650 MHz clock board design). With a final clock of 812.5 MHz, you can bet this card will perform much better when an unsuspecting user runs it on an NVIDIA chipset motherboard with LinkBoost.
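To make the arithmetic concrete, here is a toy model in Python of the scaling described above. The proportional relationship is an inference from this article's observations, not anything NVIDIA has documented, and the 650 MHz stock clock is the assumption stated above:

# Toy model: the 9600 GT's core clock appears to be derived from the
# PCI-Express reference clock instead of a fixed crystal, so it scales
# proportionally with the bus frequency.

PCIE_STOCK_MHZ = 100.0  # standard PCI-Express reference clock
GPU_STOCK_MHZ = 650.0   # stock core clock assumed for this board design

def effective_gpu_clock(pcie_mhz, gpu_stock_mhz=GPU_STOCK_MHZ):
    """Scale the GPU core clock with the PCIe reference clock."""
    return gpu_stock_mhz * (pcie_mhz / PCIE_STOCK_MHZ)

for pcie in (100, 110, 125):  # 125 MHz is what LinkBoost sets
    print("PCIe %3d MHz -> GPU %.1f MHz" % (pcie, effective_gpu_clock(pcie)))
# PCIe 125 MHz -> GPU 812.5 MHz, matching the numbers above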
It will also have an extra performance advantage when reviewers compare the non-overclocked GeForce 9600 GT against other cards, as is commonly done in most reviews. Unfortunately, such a massive overclock can often cause instability of the graphics card, possibly so severe that the system won't POST at all.
The idea of implementing a mechanism that directly increases the GPU frequency (and performance) based on the PCI-Express base frequency is a great novelty. It has the potential to offer hassle-free performance improvements to a large number of less experienced users. Being able to adjust this frequency in most modern BIOSes is a big plus, because the overclock takes effect without installing any software in Windows (or any other operating system; there is your Linux overclocking).
The execution of this on NVIDIA's side is poor, to say the least, in my opinion. They did not communicate this new feature to reviewers at all, nor did they invent a marketing name for it and brand it as a feature their competitors do not have.
Even when asked directly we got a bogus reply: "the crystal frequency is...". No, there is no 25 MHz crystal, and its frequency is not fixed either. I'm not accusing the sender of the e-mail of anything, of course; I just believe he didn't know. Maybe this fact wasn't communicated to the marketing team at all. Still, if you received such an inquiry, wouldn't you look into it further if it were your job to properly promote the product?
More room for speculation can be found in the driver. Why does it always return the "normal" frequency and not the real one? Maybe the driver developers didn't know about this either; who knows. I find it hard to believe that internal communication is that lacking in a company which constantly delivers excellent, high-performing products.
It is certainly nice for NVIDIA to see its GeForce 9600 GT reviewed on NVIDIA chipsets with LinkBoost enabled, where the card leaves the competition in the dust (even more than usual). It could also send customers the message that the card performs considerably better on an NVIDIA chipset. Actually, this is not the case: the PCI-Express frequency can be adjusted on most motherboards, and you will see these gains independent of Intel/AMD CPU architecture or Intel/NVIDIA/AMD/VIA chipset.
While nothing is wrong with having a better product (and I do believe this feature makes the product better), being transparent about such changes is the proper way to do things in this industry.
When we increased the PCIe clock, the MT score did not increase. But when we increased the GPU clock, the scaling curve was the same as the 9600 GT's. The 8800GS and 8400GS behave the same as the 8800GTS: when we increased their PCIe clock, the MT score did not increase.
So the other cards' performance is not affected by the PCIe clock. For now, only the 9600 GT has this issue. But will it be the only one? No. We ran a test with a 9500GT ES and the issue is the same: the card gains performance when we increase the PCIe clock. Maybe it is a new feature NVIDIA just did not want us to know about.
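For what it's worth, the methodology in that quote boils down to a simple loop. Here is a hypothetical sketch, where set_pcie_clock_mhz() and run_benchmark() are placeholders for the manual BIOS change and the benchmark run, not real APIs:

def set_pcie_clock_mhz(mhz):
    # Placeholder: in practice this is changed by hand in the motherboard BIOS.
    raise NotImplementedError

def run_benchmark(card):
    # Placeholder: e.g. a 3DMark run on the card under test.
    raise NotImplementedError

def pcie_scaling_test(card, clocks=(100, 110, 125)):
    """Record the benchmark score at each PCIe reference clock."""
    scores = {}
    for mhz in clocks:
        set_pcie_clock_mhz(mhz)
        scores[mhz] = run_benchmark(card)
    return scores

# Per the quote: on the 9600 GT (and a 9500GT ES) the score climbs with the
# PCIe clock, while on the 8800GTS / 8800GS / 8400GS it stays flat.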