
'NVIDIA's shady trick to boost the GeForce 9600GT'

February 29, 2008 3:04:37 PM

Interesting read from techPowerUp!:
http://www.techpowerup.com/reviews/NVIDIA/Shady_9600_GT...

nVidia using the PCIe clock to boost performance? The only shady part about it is not letting anyone know.
February 29, 2008 3:31:20 PM

My 8800GTX always reports an incorrect clockspeed in Rivatuner's hardware monitor, even when I manually set the core, shader, and memory speed.
February 29, 2008 3:57:15 PM

Interesting read.

I'm not so concerned about how it was done, but the reporting of the 'normal clocks' instead of the actual clocks is a little questionable, and their lack of replies to the W1zz is a concern, especially after their terse initial brush-off.
February 29, 2008 4:01:59 PM

Go one more page for the meat of the story....




PCIe Freq. | Clock reported by driver | Actual clock | Fillrate | % Change vs. 100 MHz
GeForce 9600 GT
100 MHz | 725 MHz | 725 MHz | 17368 | 0.0%
105 MHz | 725 MHz | 761 MHz | 18266 | 5.2%
110 MHz | 725 MHz | 794 MHz | 19074 | 9.8%
115 MHz | 725 MHz | 834 MHz | 19873 | 14.4%
GeForce 8800 GT
100 MHz | 660 MHz | 660 MHz | 28916 | 0.0%
105 MHz | 660 MHz | 660 MHz | 28912 | 0.0%
110 MHz | 660 MHz | 660 MHz | 28908 | 0.0%
115 MHz | 660 MHz | 660 MHz | 28913 | 0.0%
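
If you want to sanity-check those numbers, here's a minimal back-of-envelope sketch (mine, not from the article), assuming the actual core clock simply scales linearly with the PCIe reference clock:

# Assumption: actual core clock = reported clock * (PCIe freq / 100 MHz)
reported_mhz = 725                                    # clock the driver keeps reporting
measured = {100: 725, 105: 761, 110: 794, 115: 834}   # actual clocks from the table above

for pcie_mhz, actual_mhz in measured.items():
    estimate = reported_mhz * pcie_mhz / 100.0
    print(f"PCIe {pcie_mhz} MHz: estimated {estimate:.2f} MHz, measured {actual_mhz} MHz")

The estimates come out to 725.00, 761.25, 797.50 and 833.75 MHz, all within a few MHz of the measured clocks (the small gaps are presumably clock-generator granularity), which fits the article's finding that the card derives its core clock from the PCIe reference clock while the driver keeps reporting the nominal 725 MHz.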
February 29, 2008 4:14:27 PM

Interesting indeed. Possibly a sneaky move on Nvidia's part, but as they say, all's fair in love and graphics processing units :sol: 
February 29, 2008 4:14:40 PM

Agreed. There shouldn't be a discrepancy between what a rep says and what actually is. Perhaps they didn't know about it but that is no excuse.

I am a little concerned about NV finding ways to put out a new generation of products that don't change too much from the last. I am curious whether this was just an innocent performance booster or a cover-up to make the card look better than it is. The hush-hush nature of it sort of makes you wonder.
February 29, 2008 4:22:22 PM

Yeah I saw that, but go one more page for the dark meat of the Link Boost tech, which gives you an idea of how a card may perform differently from mobo to mobo and from test to test and review to review:



That to me says that while the PR people didn't know, it's obvious the engineers should be aware of this kind of thing.

Now you'd need to kinda discount any nVidia mobo tested until you found out whether Link Boost was active (even if unbeknownst to the reviewer at the time, not 'enabled').
So it could still report 650 MHz and really be at 800+ for all we know.

It's more annoying than truly nefarious or anything. I hate having to check the AA settings used because of the discrepancies, and this is just another thing to take into consideration when gathering info.
February 29, 2008 4:30:21 PM

That boost is a little misleading if the card is being reviewed at 650 MHz and compared to something else that isn't overclocked. So is this the source of the SLI scaling boost?
February 29, 2008 4:51:08 PM

nVidia's on the juice; send them to Capitol Hill with Clemens and Bonds. I like better performance as much as the next guy, but not informing the consumer of the reason for the disproportionately high performance for its SP/TMU/ROP count and clock is a bit low as business practices go.
February 29, 2008 4:55:37 PM

I wouldn't call it a shady trick. Why would Nvidia do this? What if someone decided to raise the PCI-E clock and found out their card was unstable? People would be returning cards that weren't actually broken in the first place.

What if the card isn't stable @ 800 MHz and you set your PCI-E to 115 MHz?

Does this mean the card will be stable with PCI-E @ 115 MHz, taking the core clock to 834 MHz?
February 29, 2008 5:10:46 PM

SpinachEater said:
That boost is a little misleading if the card is being reviewed at 650 MHz and compared to something else that isn't overclocked. So is this the source of the SLI scaling boost?


Yeah, I don't know; it could be, especially since it needs an nV board to work to begin with.

The place of concern would be people seeing a 'default' clock of 650 (but on a 115+ MHz PCIe bus), then comparing it to something like a GF8800GT at stock speed, and then thinking they have overclocking room on top of that performance based on other people's OCs at 100 MHz, when in reality it's already getting close to max OC.
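
To put rough numbers on that (my own back-of-envelope math, using the same linear scaling as the table earlier in the thread):

# Hypothetical scenario: card 'reports' 650 MHz but sits on a 115 MHz PCIe bus
reported_mhz = 650
pcie_mhz = 115
print(reported_mhz * pcie_mhz / 100.0)   # 747.5 MHz before the owner touches anything

So a good chunk of what looks like untapped headroom would already be spent before you ever open an OC tool.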

Anywhoo, it's an interesting technology, although kinda like Overdrive for dummies (what we don't tell them they won't worry about), but most overclockers would want more knowledge and control, since they may want to OC independently and tweak the voltages while they're at it.

One thing that didn't get answered yet is what it actually OCs: just the core minus the shaders, or the core and the shaders together in tandem ratios?
February 29, 2008 5:19:27 PM

Yeah I hoped to see if the shader clock would increase as well. Maybe part 2?

I wonder about LinkBoost increasing PCIe frequencies above 110 MHz to OC the GPU. That could cause problems with other hardware and overall system stability.

February 29, 2008 5:51:57 PM

TheGreatGrapeApe said:

Anywhoo, it's an interesting technology, although kinda like Overdrive for dummies (what we don't tell them they won't worry about), but most overclockers would want more knowledge and control, since they may want to OC independently and tweak the voltages while they're at it.



Dude, I know!!! I was just thinking that it is like the video card version of AOL. I wonder if it is going to be specific to the 9600 series since they figure it is a low end card that isn't of interest to the OC crowd or if we will see this even in the 9800 series.
February 29, 2008 8:53:34 PM

SE - Yeah I don't know if it's limited to the G9600. It's a nice optional feature for quick and easy overclocking, but I'd prefer the option of increasing the bus throughput to a card like the GX2 without the worry of increasing the core freq and generating more heat, etc.

gtx, Link Boost can obviously do it above 110 MHz since the illustrated option is 125 MHz, but whether or not it would work on the GF9600 is another story. If you think about the scaling, a 125 MHz bus would equate to just over a 905 MHz core. Without serious cooling I'd say that'd fry it for sure.
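
Quick check on that figure, same linear-scaling assumption as before:

print(725 * 125 / 100.0)   # 906.25 MHz - "just over 905" on a 125 MHz bus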
February 29, 2008 9:28:47 PM

homerdog said:
Interesting indeed. Possibly a sneaky move on Nvidia's part, but as they say, all's fair in love and graphics processing units :sol: 


No, I would not call many things that Nvidia does "love". It's more like the old-fashioned "breach of promise". From blurry 7xxx series image quality, to fudged drivers for Crysis that give more FPS but do not show the game the way the developers meant, to this little bit of chicanery.

Nvidia, the way the gaming customer is meant to be fooled.

marvelous211 said:
I wouldn't call it a shady trick. Why would Nvidia do this?


If it makes the 9600gt out to be a better card on an Nvidia board, then they can use the marketing to contrast Nvidia "performance" vs. ATI's alleged second place status at that price point. The problem is that Nvidia's not being transparent about their behind the scenes overclocking of the PCIe bus, so people buying the cards for non-Nvidia boards might think their card performs better than ATI under stock conditions.

I find it interesting that the marketing and driver department are in the dark. While it could be marketed as a feature that recommends Nvidia boards, it comes across as just another behind the scenes Nvidia exploit that they don't want to be made public by their own team.

Intel should have bought Nvidia instead of developing their own GPU's, because the companies both operate with a questionable business ethic, even in generations where they have very competitive products.
February 29, 2008 11:01:16 PM

yipsl said:

If it makes the 9600gt out to be a better card on an Nvidia board, then they can use the marketing to contrast Nvidia "performance" vs. ATI's alleged second place status at that price point. The problem is that Nvidia's not being transparent about their behind the scenes overclocking of the PCIe bus, so people buying the cards for non-Nvidia boards might think their card performs better than ATI under stock conditions.

I find it interesting that the marketing and driver department are in the dark. While it could be marketed as a feature that recommends Nvidia boards, it comes across as just another behind the scenes Nvidia exploit that they don't want to be made public by their own team.

Intel should have bought Nvidia instead of developing their own GPU's, because the companies both operate with a questionable business ethic, even in generations where they have very competitive products.


If you read the whole article, Nvidia stopped OCing the PCI-E bus.
February 29, 2008 11:45:08 PM

I like Nvidia but this is all very sheisty. I'm an overclocker and I don't want my clock rates or voltages changing in my system without me knowing about it. If it makes my system unstable I'll be pissed.
March 1, 2008 12:49:34 AM

Eh.... When millions of $$$ are at play I would never put it past a company to lie like this.
March 1, 2008 4:11:56 AM

BTW, thanks for posting that badgtx. I would have missed that article this week.
March 1, 2008 2:29:43 PM

marvelous211 said:
If you read the whole article, Nvidia stopped OCing the PCI-E bus.

That's what I thought. I don't understand why Nvidia would do this if they weren't trying to "cheat" though...
March 1, 2008 2:42:40 PM

They didn't lie, just didn't tell the complete truth :) 
March 1, 2008 10:11:11 PM

Hatman said:
They didn't lie, just didn't tell the complete truth :) 


It's a fudge. Not as noticeable as the blurry 7xxx generation graphics, but it's the hardware equivalent of the Crysis demo drivers not displaying water correctly.

Let's face it. What Nvidia did would be cool for Nvidia board and 9xxx series owners if it was out front and open. It would be a great feature to sell more expensive high end SLI boards to enthusiasts who don't really want to overclock themselves. It's like me getting a factory overclocked 3870x2 instead of doing it myself.

However, I wonder if it affected that benchmark, won by just two FPS, that showed the 9600gt in SLI beating a 3870x2? If the 3870x2 remained at stock PCIe but the 9600gts got a boost, then that skews the already limited benchmarks.

IMHO, the card that will be benchmarked against the 3870x2 is the 9800gx2. Then, come summer, the 4870x2 will be benchmarked against the 9800gx2. ATI and Nvidia leapfrog, but Nvidia has a bad reputation for fudging demo drivers and now hardware tweaks behind the scenes just to win a few benchies. That's just unethical and not something that Nvidia needs to do with a good product.

