
Quick question about video card past...

Last response: in Graphics & Displays
August 12, 2008 11:11:55 PM

Can someone help me remember which video card company, ATI or Nvidia, was supposedly 'cheating', or optimizing their drivers to give them better benchmark results? This couldn't have been more than a couple/few years ago... and it was probably both companies... ;)  Please and thanks
August 13, 2008 2:23:20 AM

Both companies, but the major stink that rose to the 'cheating' level was nVidia's.

ATi's transgression was re-ordering the way shaders were handled by the driver; while the output was the same, it was still a violation of Futuremark's rules. Funny thing is, this is similar to what nV did with the PhysX drivers in Vantage to get the GPUs to do the work.
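
To picture what that kind of re-ordering looks like, here's a toy C sketch (my own hypothetical example, not ATi's actual driver code): two versions of the same per-pixel math, where the driver swaps in the re-arranged one when it recognizes the benchmark's shader. Since IEEE float add and multiply are commutative, the output is bit-identical, which is why it broke the rules without breaking the rendering.

#include <stdio.h>

/* The shader math as the benchmark submitted it: a dependent chain
   of multiply, then add, then multiply. */
static float shade_original(float light, float albedo,
                            float ambient, float shadow)
{
    float diffuse = light * albedo;
    float lit     = diffuse + ambient;
    return lit * shadow;
}

/* The driver's re-scheduled replacement: the same operations grouped
   differently to suit a (hypothetical) hardware pipeline better.
   Commutativity means the result is bit-identical for these inputs. */
static float shade_reordered(float light, float albedo,
                             float ambient, float shadow)
{
    return shadow * (ambient + albedo * light);
}

int main(void)
{
    float a = shade_original (0.8f, 0.5f, 0.1f, 0.9f);
    float b = shade_reordered(0.8f, 0.5f, 0.1f, 0.9f);
    printf("original=%.9f reordered=%.9f identical=%s\n",
           a, b, (a == b) ? "yes" : "no");
    return 0;
}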

nV's transgression in 3Dmark involved partial precision and pre-defined render paths (meaning it rendered a narrower FOV, and if the camera turned outside the 'rails' it would show blank space). It caused a huge furor:
http://www.futuremark.com/companyinfo/pressroom/company...
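
If it helps to see why that produces blank space, here's a minimal C sketch (a hypothetical illustration, not nVidia's actual driver code) of culling against a visibility list pre-baked for the benchmark's fixed camera rail instead of testing against the camera's real view:

#include <stdbool.h>
#include <stdio.h>

/* Honest culling: test visibility against the camera's real view
   every frame, wherever the camera happens to be. */
static bool visible_now(float obj_x, float cam_x, float half_width)
{
    float dx = obj_x - cam_x;
    return dx > -half_width && dx < half_width;
}

/* The cheat: visibility pre-baked offline for the benchmark's known
   camera rail. Cheap at runtime -- and wrong the moment the camera
   leaves the rail. */
static const bool baked_visible[3] = { true, true, false };

int main(void)
{
    const float objects[3] = { 0.0f, 1.0f, 5.0f };
    const float free_cam_x = 4.0f;  /* user dragged the camera off the rail */

    for (int i = 0; i < 3; i++) {
        bool honest = visible_now(objects[i], free_cam_x, 2.0f);
        printf("object %d: honest=%d baked=%d%s\n",
               i, honest, baked_visible[i],
               (honest && !baked_visible[i]) ? "  <- blank space" : "");
    }
    return 0;
}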

There were also issues with Aquamark and a few other benchies.

The most recent issue (other than the PhysX controversy) was the Crysis floptimization.

There isn't really a clean company out there; all of them have had floptimizations. But I doubt anyone would disagree that nV had the worst controversy over it, with 3Dmark in the FX series; the closest thing ATi had was their Quack issue in the R8500 generation.
August 13, 2008 2:25:41 AM

Best answer, EVER!! :)  very nice man
August 13, 2008 2:32:24 AM

Man, the FX series was so terrible. Not only did they do "floptimizations" as you put it, but in a lot of cases the cards performed worse than the previous generation. Not to mention the insane heat the 5900 Ultra produced. I remember how badly my 5900 Ultra struggled when BF2 came out; then my friend gave me his 9800XT because he upgraded to a 6800 Ultra, and the difference was night and day.

That was seriously the worst generation of graphics cards I've ever seen. That's right around the time the P4s and Athlon XPs were duking it out, and the same type of thing was going on with Intel.

It's pretty amazing how they turned the tables back then, but it looks like ATI has once again turned the tables on Nvidia; maybe AMD might do the same again.
August 13, 2008 2:43:05 AM

doormatderek said:
Best answer, EVER!! :)  very nice man


Thanks; be sure to follow the Futuremark PDF's links. The ExtremeTech one illustrates some of the errors with pics.
August 13, 2008 3:02:49 AM

IndigoMoss said:
Man, the FX series was so terrible. Not only did they do "floptimizations" as you put it, but in a lot of cases the cards performed worse than the previous generation. Not to mention the insane heat the 5900 Ultra produced. I remember how badly my 5900 Ultra struggled when BF2 came out; then my friend gave me his 9800XT because he upgraded to a 6800 Ultra, and the difference was night and day.

That was seriously the worst generation of graphics cards I've ever seen. That's right around the time the P4s and Athlon XPs were duking it out, and the same type of thing was going on with Intel.

It's pretty amazing how they turned the tables back then, but it looks like ATI has once again turned the tables on Nvidia; maybe AMD might do the same again.


I've been through that too... I had a 5200, and later a 5700. That last one was OK and a big improvement over the 5200, but I eventually got an ATI 9800SE that I could convert to a non-SE 9800, and it blew them all away. Around that time I also remember jumping from a Slot A Athlon 750 (@950MHz with the micro-resistance multiplier mod; the chip inside was actually a 950MHz part) to an Athlon XP 1700+, and later a Barton 2500+.
August 13, 2008 3:18:10 AM

I couldn't care less if they cheat or call it optimization...
Also, my vote for the most complete answer to a question in the history of the internet... The GreatGrapeApe
August 13, 2008 6:47:37 PM

As TGGA has said, both companies have been guilty of such things more than once in the past. Some of these "floptimizations" are minor, some are genuinely reasonable trade-offs, while others are outright underhanded tricks.

In the case of writing your drivers to selectively ignore and cull work in a benchmark to artificially inflate your results, that's inexcusable, since the results then have no bearing on ANYTHING the actual user will be able to get outside of that benchmark.

There are, of course, lesser crimes, some of which people may be able to justify, such as the "brilinear" filtering ordeal, which both nVidia and ATi have done at various points: applying full trilinear filtering only near the mip-mapping transitions rather than across whole surfaces. In both companies' cases it tended to improve performance over full trilinear while still providing most to all of the image-quality improvement expected from trilinear, so it could make sense, especially since it applied to everything the card did rather than just benchmarks.
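
For anyone curious what that trade-off looks like, here's a toy C model of the mip-blend weight (my own sketch, not either vendor's actual filter): full trilinear blends the two nearest mip levels across the whole fractional LOD range, while "brilinear" only blends inside a narrow band around the transition and samples a single mip everywhere else, which is where the bandwidth savings come from.

#include <stdio.h>

/* frac = fractional part of the LOD, 0..1 between two mip levels. */
static float trilinear_weight(float frac)
{
    return frac;                      /* blend across the whole range */
}

static float brilinear_weight(float frac, float band)
{
    /* Only blend within +/- band of the mip transition at frac = 0.5;
       outside it, sample the nearer mip alone (weight 0 or 1). */
    float lo = 0.5f - band, hi = 0.5f + band;
    if (frac <= lo) return 0.0f;
    if (frac >= hi) return 1.0f;
    return (frac - lo) / (hi - lo);   /* steep blend inside the band */
}

int main(void)
{
    /* 0.125 is exact in binary, so this loop hits 0, 0.125, ..., 1. */
    for (float f = 0.0f; f <= 1.001f; f += 0.125f)
        printf("frac=%.3f  trilinear=%.3f  brilinear=%.3f\n",
               f, trilinear_weight(f), brilinear_weight(f, 0.125f));
    return 0;
}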