'NVIDIA's shady trick to boost the GeForce 9600GT'

Interesting read.

I'm not so concerned about how it was done, but reporting the 'normal clocks' instead of the actual clocks is a little questionable, and their lack of replies to W1zz is a concern, especially after their terse initial brush-off.
 

badgtx1969

Go one more page for the meat of the story....
[Image: scaling.gif]

PCIe Freq.   Clock reported   Actual clock   Fillrate   % change vs. 100 MHz
GeForce 9600 GT
100 MHz      725 MHz          725 MHz        17368       0.0%
105 MHz      725 MHz          761 MHz        18266       5.2%
110 MHz      725 MHz          794 MHz        19074       9.8%
115 MHz      725 MHz          834 MHz        19873      14.4%
GeForce 8800 GT
100 MHz      660 MHz          660 MHz        28916       0.0%
105 MHz      660 MHz          660 MHz        28912       0.0%
110 MHz      660 MHz          660 MHz        28908       0.0%
115 MHz      660 MHz          660 MHz        28913       0.0%
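
For what it's worth, the pattern in that table is just linear scaling: the real core clock tracks the PCIe reference frequency while the driver keeps reporting the nominal value. A quick sketch of the relationship it implies (my own back-of-the-envelope code, nothing from the article):

```python
# Back-of-the-envelope check of the scaling in the table above: the actual
# clock looks like reported * (pcie_mhz / 100), while the driver keeps
# reporting the nominal 725 MHz.

def estimated_actual_clock(reported_mhz, pcie_mhz):
    """Estimated real core clock if the GPU derives its PLL from the PCIe reference."""
    return reported_mhz * pcie_mhz / 100.0

for pcie in (100, 105, 110, 115):
    print(pcie, "MHz bus ->", round(estimated_actual_clock(725, pcie)), "MHz core")
# -> 725, 761, 798, 834 MHz, within a few MHz of the measured
#    725 / 761 / 794 / 834 in the table.

# The 8800 GT rows stay flat, presumably because that card takes its clock
# from a fixed crystal rather than the bus reference, so the formula above
# wouldn't apply to it.
```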
 

SpinachEater

Agreed. There shouldn't be a discrepancy between what a rep says and what actually is. Perhaps they didn't know about it but that is no excuse.

I am a little concerned about NV finding ways to put out a new generation of products that don't change too much from the last. I am curious whether this was just an innocent performance booster or a cover-up to make the card look better than it is. The hush-hush nature of it sort of makes you wonder.
 
Yeah I saw that, but go one more page for the dark meat of the LinkBoost tech, which gives you an idea how a card may perform differently from mobo to mobo and from test to test and review to review:

[Image: linkboost.jpg]
That to me says that, while the PR folks didn't know, the engineers obviously should be aware of this kind of thing.

Now you'd need to kind of discount any nVidia mobo test until you found out whether LinkBoost was active (even if unbeknownst to the reviewer at the time, rather than deliberately 'enabled').
So it could still report 650 MHz and really be at 800+ for all we know.

It's more annoying than truly nefarious or anything. I hate having to check the AA settings used because of the discrepancies, and this is just another thing to take into consideration when gathering info.
 

SpinachEater

That boost is a little misleading if it is being reviewed as a 650 MHz card and compared to something else that isn't overclocked. So is this the source of the SLI scaling boost?
 

KyleSTL

nVidia's on the juice; send them to Capitol Hill with Clemens and Bonds. I like better performance as much as the next guy, but not informing the consumer of the reason for the disproportionately high performance relative to its SP/TMU/ROP count and clocks is a pretty low business practice.
 

marvelous211

I wouldn't call it a shady trick. Why would Nvidia do this? What if someone decided to raise the PCI-E clock and found out their card was unstable? People would be returning cards that weren't broken in the first place.

What if the card isn't stable at 800 MHz and you set your PCI-E bus to 115 MHz?

Does this mean the card will be stable with the PCI-E bus at 115 MHz pushing the core clock to 834 MHz?
 


Yeah I don't know, it could be, especially since it needs an nV board to work to begin with.

The place of concern would be people seeing a 'default' clock of 650 MHz (but on a 115+ MHz PCIe bus), comparing it to something like a GF8800GT at stock speed, and then thinking they have overclocking room on top of that performance based on other people's OCs done on a 100 MHz bus, when in reality the card is already close to its max OC.

Anywhoo, it's an interesting technology, although it's kind of like Overdrive for dummies (what we don't tell them, they won't worry about). Most overclockers would want more knowledge and control, though, since they may want to OC independently and tweak the voltages while they're at it.

One thing that hasn't been answered yet is what it actually OCs: just the core minus the shaders, or the core and the shaders together in tandem ratios?
 

badgtx1969

Yeah I hoped to see if the shader clock would increase as well. Maybe part 2?

I wonder about LinkBoost increasing PCIe frequencies above 110 to OC the GPU. That could cause problems with other hardware and overall system stability.

 

SpinachEater



Dude, I know!!! I was just thinking that it is like the video card version of AOL. I wonder if it is going to be specific to the 9600 series, since they figure it is a low-end card that isn't of interest to the OC crowd, or if we will see this even in the 9800 series.
 
SE - Yeah, I don't know if it's limited to the G9600. It's a nice optional feature for quick and easy overclocking, but I'd prefer the option of increasing the bus throughput to a card like the GX2 without the worry of increasing the core freq and generating more heat, etc.

gtx, LinkBoost can obviously do it above 110 MHz since the illustrated option is 125 MHz, but whether or not it would work on the GF9600 is another story. If you think about the scaling, a 125 MHz bus would equate to just over a 905 MHz core. Without serious cooling I'd say that'd fry it for sure.
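
That ~905 number is just the same linear scaling from the table earlier in the thread, extrapolated to the 125 MHz LinkBoost setting (assuming the 725 MHz factory-OC card from the review; this is my extrapolation, not anything nVidia has confirmed):

```python
# Same linear scaling as before, extrapolated to the 125 MHz LinkBoost
# setting shown in the screenshot.  Assumes the 725 MHz factory-OC card
# from the review; projections, not measurements.
reported = 725                   # MHz, what the driver would report
pcie = 125                       # MHz, LinkBoost bus frequency
print(reported * pcie / 100)     # 906.25 -> "just over a 905 MHz core"

# A reference 650 MHz card on the same bus would land around
# 650 * 1.25 = 812.5 MHz, which lines up with the "report 650 and really
# be at 800+" point earlier in the thread.
```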
 

yipsl



No, I would not call many of the things that Nvidia does "love". It's more like the old-fashioned "breach of promise": from blurry 7xxx series image quality, to fudged Crysis drivers that give more fps but do not show the game the way the developers intended, to this little bit of chicanery.

Nvidia, the way the gaming customer is meant to be fooled.



If it makes the 9600gt out to be a better card on an Nvidia board, then they can use the marketing to contrast Nvidia "performance" vs. ATI's alleged second-place status at that price point. The problem is that Nvidia's not being transparent about their behind-the-scenes overclocking of the PCIe bus, so people buying the cards for non-Nvidia boards might think their card performs better than ATI under stock conditions.

I find it interesting that the marketing and driver departments are in the dark. While it could be marketed as a feature that recommends Nvidia boards, it comes across as just another behind-the-scenes Nvidia exploit that they don't want made public, even by their own team.

Intel should have bought Nvidia instead of developing their own GPUs, because both companies operate with a questionable business ethic, even in generations where they have very competitive products.
 

marvelous211



If you read the whole article, Nvidia has stopped OCing the PCI-E bus.
 

wingless

I like Nvidia but this is all very sheisty. I'm an overclocker and I don't want my clock rates or voltages changing in my system without me knowing about it. If it makes my system unstable I'll be pissed.
 

homerdog


That's what I thought. I don't understand why Nvidia would do this if they weren't trying to "cheat" though...
 

yipsl



It's a fudge. Not as noticeable as the blurry 7xxx generation graphics, but the hardware equivalent to the Crysis Demo drivers not displaying water correctly.

Let's face it. What Nvidia did would be cool for Nvidia board and 9xxx series owners if it were out in the open. It would be a great feature to sell more expensive high-end SLI boards to enthusiasts who don't really want to overclock themselves. It's like me getting a factory-overclocked 3870x2 instead of doing it myself.

However, I wonder if it affected that benchmark of two FPS titles that showed the 9600gt in SLI beating a 3870x2. If the 3870x2 remained at stock PCIe but the 9600gts got a boost, then that skews the already limited benchmarks.

IMHO, the card that will be benchmarked against the 3870x2 is the 9800gx2. Then, come summer, the 4870x2 will be benchmarked against the 9800gx2. ATI and Nvidia leapfrog, but Nvidia has a bad reputation for fudging demo drivers, and now for making hardware tweaks behind the scenes, just to win a few benchies. That's just unethical and not something that Nvidia needs to do with a good product.