
AMD Cheating to get better Benchmarks than Nvidia?

Last response: in Graphics & Displays
November 24, 2010 3:41:49 AM

http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia...

Quote:
Getting directly to the point, major German Tech Websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default “Quality” setting in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. AMD obtains up to a 10% performance advantage by lowering their default texture filtering quality according to ComputerBase.

AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to ”High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).


I just read that, and I know it's a bit biased coming from Nvidia's blog, but the sources it references are unbiased. I'm wondering what you guys think of this.
November 24, 2010 4:02:07 AM

Just another shot in the never-ending AMD/nVidia war. I know that nVidia has been guilty of similar "optimizations" in the past.

Also, on the blog, I love it when they say that "This is not NVIDIA-generated data, though of course we verified their findings too."
November 24, 2010 4:07:14 AM

Quote:
I will only believe it when other websites also start publishing it.


It was published on other sites. Nvidia just quoted those sites.
November 24, 2010 4:26:18 AM

The fact that nVidia is quoting this on their blog makes me suspicious. We will see where the truth lies soon enough.
November 24, 2010 8:29:56 AM

nVidia sounds like a little whining kid pointing and saying "oooo, you're a cheater, you're a cheater!" That's fine, since it's only nVidia fanboys who will read their blog. I'm not against nVidia or anything; hell, the past two cards I've owned are nVidia, but bitching and gossiping like a little girl is something I don't like. If AMD wants that to be their default setting, then they're allowed to do that, just like the default settings in nVidia's drivers change from time to time.
November 24, 2010 8:36:28 AM

This is just nVidia getting their excuses in before Antilles is out. Not that it'll matter because Antilles is going to crush the 580 out of sight at any settings.
November 24, 2010 11:55:59 AM

iam2thecrowe said:
nVidia sounds like a little whining kid pointing and saying "oooo, you're a cheater, you're a cheater!" That's fine, since it's only nVidia fanboys who will read their blog. I'm not against nVidia or anything; hell, the past two cards I've owned are nVidia, but bitching and gossiping like a little girl is something I don't like. If AMD wants that to be their default setting, then they're allowed to do that, just like the default settings in nVidia's drivers change from time to time.



I don't consider myself a fanboy by any means (I own a 6850), but I think they should both be on even ground when benchmarking. That seems like the only fair way to do it.
November 24, 2010 12:23:44 PM

Some replies have had me in stitches... you can quickly come to the conclusion that there are AMD fanboys here.

But, in my view, do I believe it? Not really. nVidia and AMD are in a profit war and will do anything to slander the other to promote their own products and boost sales. It's as simple as that.

My Personal Opinion Of Course :) 
November 24, 2010 12:51:22 PM

It appears to be a legitimate claim, but to be honest I have to look really, really hard to even see a difference. If I had to give up that minute amount of detail for a good FPS boost, I'd do it. nVidia isn't guiltless either: they subsidize and work with developers to get PhysX and their cards working better through their "The Way It's Meant to Be Played" program. Really, it's the pot calling the kettle black. As always, look at benchmarks for what you buy, and this is just another thing to weigh before you make your decision. I still think the 5770 is best at the low-mid end, the 460/465 if you can get it on sale for mid, the 6850/6870 for mid-high, the 470 for high, and the 5970 for top-tier performance. What matters is performance, not brand.
November 24, 2010 2:06:11 PM


Nvidia or ATI calling out the other company as cheaters? How is this news? :lol:

November 24, 2010 2:53:31 PM

Interesting that a thread here on THG was posted about this a few months ago. Amazing how big the memory hole on the net has become.
November 24, 2010 3:15:55 PM

nforce4max said:
Interesting that a thread here on THG was posted about this a few months ago. Amazing how big the memory hole on the net has become.

Something about Metro 2033 IIRC.
November 24, 2010 5:45:58 PM

Quote:
More on the News:
Nvidia claimed GPU based physics rendering is better on Geforce Graphics than an Intel Core i7-980x based on the company's PhysX software


That would be because PhysX is horribly coded for running on a CPU.
November 24, 2010 7:59:26 PM

I had a realization after thinking about this a bit more: it is not really AMD's fault for using default settings that may reduce image quality to boost speed; it is the reviewers' fault for not testing everything with fair settings. You would think that after reviewing thousands of cards, you would pick up on these things. Also, can someone provide me a link to some screenshots of both AMD 6xxx and Nvidia 4xx series cards using all default settings (from a reputable source)? I really want to see for myself. And I don't just want to see particular zoomed-in textures that may be affected if this is not noticeable in game; I want to see a whole screenshot.
November 24, 2010 11:33:32 PM

It was already posted by Kari above.

It's just more garbage from nVidia. They've been pushing this for months now, and they must actually believe that if they keep repeating it, it'll stick. The evidence won't go away, though.

http://www.guru3d.com/article/radeon-hd-6850-6870-revie...

They reckoned that the AMD card was slightly better in one of them; the other two were a wash. Nobody would be able to tell any difference while gaming; only extremely picky people studying stills might find something.
November 24, 2010 11:49:37 PM

Thanks Kari & eyefinity; it turns out this is a crock of BS after all.
November 25, 2010 12:00:28 AM

http://www.anandtech.com/show/3987/amds-radeon-6870-685...

You can see in Anandtech's review that the 6800s are as good as, if not better than, the GTX 480 (mouse over each card's name at the bottom of the graphic to see). It's kind of subjective: some people might see banding, while others see the 6800 as being much smoother. The 480's filtering pattern is also much less rounded (oval-shaped), meaning it's easier to filter as well. If anything is cheating, it's theirs, but I guess AMD has more important things to care about.
November 25, 2010 2:44:32 AM

This really isn't news, as they have been at this for the past decade and have cheated numerous times. What I hate is how the two of them got together and made a deal to keep prices inflated regardless of market share. All one has to do is a few Google searches and you will find news articles about how much they were fined, etc. It's worse than driver tricks when it's money out of your pocket.
November 25, 2010 8:27:03 AM

They weren't fined anything and both were found not guilty.

I'll quote a bit of it

Quote:
In December 2006, antitrust regulators began to investigate ATI and Nvidia, the two largest add-in graphics technology players, for possible antitrust violations within the graphics processing unit and cards industry.

At the time, one industry analyst noted that because ATI and Nvidia were the two main players in the graphics chip market, the pricing was often similar for products of theirs offering comparable performance.


Quote:
Analysts were puzzled about the nature of the investigation. Unlike the memory industry--the subject of recent DOJ investigations into anticompetitive practices--there are only two main players in the market for add-in graphics technology. Nvidia and ATI, now part of AMD, have always been thought of as fierce competitors not inclined to work together outside of industry standards associations


That seems like a pretty weak case, and it was. Most nVidia and Intel fans like to use this as "proof" that AMD is just as corrupt as they are. AMD didn't even own ATI when this alleged price fixing was supposed to have happened.