More nVidia bias?

December 8, 2010 11:42:06 PM

Quote:
Moving forward, I'm inclined to use AMD's High Quality Catalyst AI setting for graphics card reviews, if only to get the best possible filtering quality.


Tomshardware is the first high-volume site to cave in to nVidia's lies over IQ.

Note: not a single shred of evidence of this "inferior IQ" has been shown in any review on THG so far, yet the reviewer has taken it upon himself to decide that AMD cards should be unfairly tested at the highest settings while nVidia cards are tested at medium settings.

FYI, other major sites have tested it as well and found AMD's solution to be just as good. THG is the only major English-speaking website that disagrees so far.


December 9, 2010 12:22:22 AM

I forgot to add, AMD will be releasing their two fastest cards next week - the 6950 and 6970. It's quite unfortunate timing that Tom's would decide to change their reviewing methods now. What a sad coincidence.

Quote:
I didn’t have time to dig too far into Nvidia’s recent accusation that AMD is tinkering with texture filtering quality in its newest drivers, mostly impacting the Radeon HD 6800-series cards.


Quote:
I went through seven different modern DirectX 11 titles looking for problematic areas that’d make for easy demonstrations, and, despite knowing about the issues and squinting at a 30” screen, came away with very little conclusive in the real world


Quote:
Moving forward, I'm inclined to use AMD's High Quality Catalyst AI setting for graphics card reviews, if only to get the best possible filtering quality.


Something isn't quite...right here.
December 9, 2010 12:26:55 AM

I disagree with you about there being no shred of evidence.
Nvidia cites many in their blog.

AMD's filtering took shortcuts roughly equivalent to putting NV at its Performance level, and also relative to their own quality standard in the 10.9 driver.
It's not visible in every game in the same way.

The NV control panel's default setting is 'Quality'. There is only one higher setting, High Quality, and two lower ones: Performance and High Performance.

This post breaks it down very well. Xbit Labs tested the 68XX at the High Quality setting, and Hardware Canucks went back and retested.
http://forums.anandtech.com/showpost.php?p=30709253&pos...

[screenshot: NVIDIA control panel texture filtering quality setting]
http://www.microsofttranslator.com/BV.aspx?ref=IE8Activ...
December 9, 2010 12:27:04 AM

Do you own stock in AMD or something? Or do you work for AMD?
December 9, 2010 12:28:02 AM

Metroidman said:
Do you own stock in AMD or something? Or do you work for AMD?


No and no.
December 9, 2010 12:30:11 AM

eyefinity said:
No and no.


You seem to fight for them as if you did. Why does it even matter? There's bias in everything. Get used to it.
December 9, 2010 12:30:14 AM

@ Notty - where is the evidence? Look at what I posted from Chris - does that add up to evidence of inferior IQ? He's just admitted he can't see any difference, yet is going to increase the IQ to MAX on AMD cards anyway.

If it's all about the max IQ, surely he has to do the same with nVidia cards?
December 9, 2010 12:30:43 AM

Metroidman said:
You seem to fight for them as if you did. Why does it even matter? There's bias in everything. Get used to it.


Why do you care what I "fight" for? Get used to it.
December 9, 2010 12:31:34 AM

eyefinity said:
Why do you care what I "fight" for? Get used to it.



No, you just seem like those kids who fight over which system is better, PS3 or 360. I mean, I can understand if you have no other hobbies.
December 9, 2010 12:33:33 AM

Metroidman said:
No, you just seem like those kids who fight over which system is better, PS3 or 360. I mean, I can understand if you have no other hobbies.


What does that make you? The boring kid that wishes he was able to form an opinion of his own but can't?
December 9, 2010 12:35:30 AM

eyefinity said:
What does that make you? The boring kid that wishes he was able to form an opinion of his own but can't?


Nah, I'm above petty bickering between companies I have no stake in. You go right ahead, though.
December 9, 2010 12:35:37 AM

AMD is releasing their two fastest cards NEXT WEEK.

Tom's decides it's time to max out the settings.

You figure it out. They couldn't wait ONE WEEK? Did you forget already that Cleeve did an SLI vs CrossFire article ONE WEEK before the 6800s were released?

At what point does it finally click?
December 9, 2010 12:41:18 AM

Here's my opinion: it's too early to definitively say whether it should be on or off, and therefore either both settings should be presented, or it should be left at default like everyone else does. It's very likely they should be turned up; I just think Tom's jumped the gun.
December 9, 2010 12:43:36 AM

So far I've seen two scenarios where ATI's change affects anything.

A: Find a 'sweet spot' in a modern game (I believe Mass Effect 2 was used). Screenshot it on an ATI card, screenshot it on an nVidia card. Compare the screenshots by blowing them up 100x in photoshop.

B: Specifically design a texture/pattern that exploits the change.

From those two it is my understanding that in order to observe any change the user must deliberately search for it.
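
(As an aside: you don't strictly need Photoshop for scenario A. Below is a rough sketch of how one might amplify the per-pixel differences between two same-resolution screenshots; Python with Pillow and NumPy is just one arbitrary choice here, and the file names and amplification factor are placeholders, not anything a reviewer actually used.)

# Rough sketch: exaggerate the per-pixel differences between two screenshots
# so that subtle filtering differences stand out without manual zooming.
# Assumes both screenshots were captured at the same resolution.
import numpy as np
from PIL import Image

AMPLIFY = 8  # exaggeration factor; the raw differences are usually tiny

def diff_image(path_a, path_b, out_path="diff.png"):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("screenshots must have the same resolution")
    delta = np.abs(a - b)
    Image.fromarray(np.clip(delta * AMPLIFY, 0, 255).astype(np.uint8)).save(out_path)
    print("mean absolute difference per channel:", delta.mean())

# hypothetical file names
diff_image("radeon_shot.png", "geforce_shot.png")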
December 9, 2010 12:47:18 AM

My opinion is that you don't "jump the gun" just as a major release is about to take place. Can you imagine if this had happened with the initial Fermi release?

This is way off, way off. I'm sorry, but individuals simply cannot be trusted to make these billion-dollar decisions, and THG needs to look at it, fast.
December 9, 2010 6:43:33 AM

@ Eyefinity
Remember what I said. Back yourself up with facts. And let's keep this civil.
December 9, 2010 6:48:29 AM

eyefinity said:
They couldn't wait ONE WEEK?

Do you honestly think that Chris wrote that article yesterday?
December 9, 2010 7:20:40 AM

Timing can make fools of all of us; so can editorial staff.

As end users we all want apples-to-apples reviews; that's never going to be possible, so we need to get as near to it as possible.
Allowing the AMD cards to run at default driver settings won't do that, as some titles will get FPS gains thanks to the demotion issue.
For benchmarking, IQ really isn't the issue; it's about having the cards working with as similar a set of settings as possible, so that one side isn't doing less or more and the results can be as even as possible.

Mactronix :) 
December 9, 2010 8:52:33 AM

Who cares what the card is doing so long as you see the same result? The ONLY reason ANYONE decided to investigate is because nVidia spent who knows how many hours taking screenshots and blowing them up 100x. Otherwise this discussion wouldn't exist.
December 9, 2010 9:02:36 AM

randomizer said:
Do you honestly think that Chris wrote that article yesterday?


No I think it was completed not long before the 7th, and he started it the week before. That's how it works. Your reviewer knows that Cayman is coming next week and he also has the card in his hands, right now.
December 9, 2010 9:04:05 AM

mactronix said:
Timing can make fools of all of us; so can editorial staff.

As end users we all want apples-to-apples reviews; that's never going to be possible, so we need to get as near to it as possible.
Allowing the AMD cards to run at default driver settings won't do that, as some titles will get FPS gains thanks to the demotion issue.
For benchmarking, IQ really isn't the issue; it's about having the cards working with as similar a set of settings as possible, so that one side isn't doing less or more and the results can be as even as possible.

Mactronix :) 


AMD's top setting is higher quality than nVidia's default setting. If apples to apples is required, all cards need to use the top settings.
December 9, 2010 9:49:19 AM

Sigh.....moving along!
December 9, 2010 9:56:21 AM

Don't start. Don't even think about starting. Don't even start to think about starting.
December 9, 2010 10:02:03 AM

Sigh, all the players are here for another massive flame thread. BTW Eyefinity, what do you think of the AMD bias in only including AMD dual-card solutions? Hm? :sarcastic: 
December 9, 2010 10:04:59 AM

If there is another flame war this will be the last thread that it will happen in with this group. There are a couple of people in this thread who will not return from another flame war under their current usernames before the turn of the next century.
December 9, 2010 10:06:06 AM

Look again at Notty22's screenshot.

"Selecting High quality will turn off all the texture filtering optimizations in order to provide the highest visual quality"

nVidia's "High Quality" setting is the same as ATI's "High Quality" setting. If you want the application rendered exactly as it was created, with no performance optimizations, all cards need to be set to their maximum AF settings.

I really cannot see how anyone can disagree with this. It's there right in front of you.
December 9, 2010 10:08:37 AM

randomizer said:
If there is another flame war this will be the last thread that it will happen in with this group. There are a couple of people in this thread who will not return from another flame war under their current usernames before the turn of the next century.


I can think of a few :sarcastic:  In any event, eyefinity, stop making conspiracy threads, and stop biting the mouth that feeds you. The benchmark could be viewed as equally biased both ways. If you don't like the reviews, don't read them; if you don't like the forum, don't come back.
December 9, 2010 10:10:13 AM

ares1214 said:
I can think of a few :sarcastic:  In any event, eyefinity, stop making conspiracy threads, and stop biting the mouth that feeds you. The benchmark could be viewed as equally biased both ways. If you don't like the reviews, don't read them; if you don't like the forum, don't come back.


Biting the mouth...oh....there's a mental picture that's not going away any time soon! :lol: 
December 9, 2010 10:10:16 AM

I've never actually seen any difference in quality or performance between NVIDIA's quality levels, even after staring at screenshots. But perhaps with GF100 there is a noticeable difference.
December 9, 2010 10:12:32 AM

ares1214 said:
I can think of a few :sarcastic:  In any event, eyefinity, stop making conspiracy threads, and stop biting the mouth that feeds you. The benchmark could be viewed as equally biased both ways. If you don't like the reviews, don't read them; if you don't like the forum, don't come back.


How about, if you don't like my threads, don't read them? That seems like the best idea. Now you know what to do, I don't expect you'll be posting any more in this thread, right?
December 9, 2010 10:13:21 AM

eyefinity said:
AMD's top setting is higher quality than nVidia's default setting. If apples to apples is required, all cards need to use the top settings.



I disagree; for as near to apples-to-apples as possible, all cards need to be set at settings which have them functioning as close to each other as possible.
If you just set everything to its highest, you could end up with one card having an unfair advantage due to it doing less complex calculations, say like demotion of texture rendering :kaola: 
At least we have had this up front, so if, like yourself, we don't like it then we can go use another review site. I don't remember an AMD release about the demotion though.

I know you don't need me to say this, eyefinity, but I'm going to anyway.

Look, I'm not a moderator, guys, but I feel that eyefinity is fully within his rights to post about whatever takes his fancy or whichever issue he feels strongly about.
aford10 posted a little reminder to eyefinity about civility; let's all take notice, yes?

Mactronix :hello: 
December 9, 2010 10:14:39 AM

randomizer said:
I've never actually seen any difference in quality or performance between NVIDIA's quality levels even after staring at screenshots. But Perhaps with GF100 there is a noticeable difference.


I've never seen any difference with AMD's either.

The point is, nVidia is optimizing at their default setting. The only time optimizations aren't used is at the highest setting; therefore, if a true apples-to-apples comparison is needed, both cards must be set to the maximum setting.

Benchmarking the AMD cards at highest quality, and nVidia cards with optimizations is just plain biased and unfair.
December 9, 2010 10:16:49 AM

mactronix said:
I disagree; for as near to apples-to-apples as possible, all cards need to be set at settings which have them functioning as close to each other as possible.
If you just set everything to its highest, you could end up with one card having an unfair advantage due to it doing less complex calculations, say like demotion of texture rendering :kaola: 

Mactronix :hello: 


See my last post.

There is nothing closer to the same quality than turning off all optimizations. That can only be achieved with the High Quality setting in both drivers.
December 9, 2010 10:20:21 AM

eyefinity said:
See my last post.

There is nothing closer to the same quality than turning off all optimizations. That can only be achieved with the High Quality setting in both drivers.


Tell you what, can you link me to an article showing how Nvidia and AMD settings compare? I'm going on the basis of a review that said the demotion was on by default and that turning the settings up turns it back off again.
That's what I'm guessing is going on at Tom's: they don't want the demotion on, so they turn the IQ up.

Mactronix :) 
December 9, 2010 10:39:29 AM

mactronix said:
Tell you what, can you link me to an article showing how Nvidia and AMD settings compare? I'm going on the basis of a review that said the demotion was on by default and that turning the settings up turns it back off again.
That's what I'm guessing is going on at Tom's: they don't want the demotion on, so they turn the IQ up.

Mactronix :) 


It's all optimizations below "High Quality", mactronix.

Both AMD and nVidia optimize the filtering in some ways below "High Quality". Below it, AMD has two settings, Quality (default) and Performance (lowest), while nVidia has Quality (default), Performance, and High Performance (lowest quality).

nVidia claims that AMD's default setting is a cheat, and they use a really old game called Trackmania to prove it.

Check this out.

[screenshot: Trackmania texture filtering comparison, AMD card on the right]
You can see that the banding effect is pretty bad on the AMD card on the right. This is the worst example of the "cheat", as nVidia calls it, and it's only really noticeable in this game from 2003.

Now check the same screenshots again.

[screenshot: the same Trackmania comparison]
Notice how nVidia's image quality is worse in some places too? That's with both settings at Quality.


Sorry about the large screenshots; is there any way to reduce their size in the forum?
December 9, 2010 11:12:02 AM

OK, so I see what you are saying and have re-read some articles, but I still maintain that it's not about IQ settings or what settings you have on; it's about making both sets of cards have as similar a workload as possible.

Having said that, Tom's does seem to have this back to front, and it does seem there is a total of zero benchmarking benefit to be gained from enabling the High Quality Catalyst AI setting.
They should be turning Catalyst AI off to avoid any demotion issues, according to what I have just been reading.
I believe eyefinity is correct in as much as doing so can do nothing but slow down ATI hardware.
Nvidia themselves request that Catalyst AI be disabled for testing purposes.

Unless there has been a change to how this all works in recent driver updates that someone wants to shed some light on, that is.

Mactronix :) 
December 9, 2010 11:14:21 AM

People, please, stop it.

First and foremost, each company is responsible for its own, unique implementation of DirectX/OpenGL; there are IQ differences as a result, even if they are incredibly minor. Likewise, any DirectX/OpenGL effect will be computed differently, so any true apples-to-apples comparison is technically impossible.

Secondly, is this REALLY necessary? I mean, I was expecting this thread to be about devs using PhysX, or more games developed with aid from NVIDIA's devs, but we're back to the old IQ argument again? Seriously?

I mean, we're down to arguing which default setting offers the closest IQ between the brands, right? If that's the case, I now argue AA is invalid, as each unique AA mode offers significantly varying levels of IQ/performance.

Please people, stop it.
December 9, 2010 11:29:17 AM

Oh look, another flame war propagated by fanboys... ::rolls eyes::
December 9, 2010 11:33:16 AM

Psychoteddy said:
Oh look, another flame war propogated by fanboys... ::rolls eyes::


I have yet to see any flame or war, just people like you spouting rubbish :pfff: 
It's as if some people want an eruption; it's like a bunch of kids standing around chanting "Fight! Fight!" :pfff: 
People have been posting stupid little digs like that ever since this thread opened. Just grow up, people.

Mactronix :) 
December 9, 2010 11:37:48 AM

gamerk316 said:
People, please, stop it.

First and foremost, each company is responsible for its own, unique implementation of DirectX/OpenGL; there are IQ differences as a result, even if they are incredibly minor. Likewise, any DirectX/OpenGL effect will be computed differently, so any true apples-to-apples comparison is technically impossible.

Secondly, is this REALLY necessary? I mean, I was expecting this thread to be about devs using PhysX, or more games developed with aid from NVIDIA's devs, but we're back to the old IQ argument again? Seriously?

I mean, we're down to arguing which default setting offers the closest IQ between the brands, right? If that's the case, I now argue AA is invalid, as each unique AA mode offers significantly varying levels of IQ/performance.

Please people, stop it.


You do realise that if AMD's optimizations are removed, and nVidia's remain, that would cost AMD up to 10% fps loss in just about every game?

Yes, I think this is really necessary gamerk.
December 9, 2010 11:38:28 AM

mactronix said:
I have yet to see any flame or war, just people like you spouting rubbish :pfff: 
It's as if some people want an eruption; it's like a bunch of kids standing around chanting "Fight! Fight!" :pfff: 
People have been posting stupid little digs like that ever since this thread opened. Just grow up, people.

Mactronix :) 


I'm used to it; the same people just cannot help themselves. I'm just ignoring them as best I can from now on.
December 9, 2010 11:47:45 AM

eyefinity said:
You do realise that if AMD's optimizations are removed, and nVidia's remain, that would cost AMD up to 10% fps loss in just about every game?

Yes, I think this is really necessary gamerk.


That really isn't fair, IMO; people come to sites like this to get a fair comparison between cards to help decide how they're going to spend their hard-earned cash. If they're not getting a fair comparison, they're being done a bit of a disservice.

Having said that, to the mainstream user, will any of this actually be applicable?
December 9, 2010 12:09:37 PM

It seems the problem eyefinity has is that THG is going to use the highest quality setting, which will bring down performance, while not using the highest quality settings in the Nvidia drivers, thus giving a performance edge to Nvidia.

The problem with whining about this is that, fair or not, THG (and anyone else) can test however they want. If this creates an unfair advantage for one over the other, it will become obvious when comparing benchmarks from THG and other reviewers. (We all look at 6 or 15 different reviews on launch day, don't we?)

If there is an anomaly, they will be outed, and not trusted, VERY quickly.
December 9, 2010 1:58:10 PM

I think I agree with Tom's insomuch as they should be using the highest quality settings; I thought they were doing the same with Nvidia. If that is not the case, I would want Nvidia cards set to the highest quality possible as well, to even the playing field.

Yes, both use their own drivers, but when looking at benchmarks I know I would like to see what the GAME developer intended to be shown, exactly the way it was intended to be shown, not what parts AMD or Nvidia want me to see.

Now, if you ask me whether I'd choose a filter that would allow me to use the same card and get performance gains for an almost imperceptible amount of image degradation, then yes, I would use that setting. However, I would not want these settings active during benchmarks, on principle! The only way I'd want them on is if we got a comparison with filters on both sides AND with all filters off, to see performance as the game devs intended, for a more apple-cider-to-heated-apple-juice comparison (I agree it's never apples to apples, but it is damn close).
December 9, 2010 3:01:59 PM

Thing is, g00fy, wasn't this exactly the point of driver improvements and optimizations?

That Trackmania game that has got nVidia making cheat accusations shows that there are differences between both cards. What makes the ATI optimization a "cheat" while theirs is just optimizing?
December 9, 2010 3:28:59 PM

Agreed that drivers and optimisation can improve performance.

However, I also don't want either company using these tweaks, as they are perceptible (though admittedly barely perceptible).

Give me the game how the devs intended, without altering it for performance in benchmarks, and then let me decide whether to turn the optimisations on...

I'm all for adding an asterisk for optimised vs non-optimised results, but then make it clear that you are not showing the game the way it was intended in some way.

*add* And no, I'm not aiming that solely at AMD here; Nvidia does it too, but that doesn't make either side right.
December 9, 2010 4:16:46 PM

http://www.guru3d.com/article/exploring-ati-image-quali...
"If a graphics card manufacturer chooses to forfeit on image quality then that is that company's sole decision and they will have to put their product into retail with the knowledge that the end-users KNOW they forfeit on image quality. In the end that's going to haunt, taunt and blow up in ATI's or NVIDIA's face without doubt."

"Forfeiting on image quality will cost the manufacturer business as end-users want the best product. Especially in the high-end performance graphics area people really care about image quality."

"We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and deal with the performance loss, as in the end everything is about objectivity, and when you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm than good. The drop of 3-4 FPS on average is much more acceptable than getting a reputation of being a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits."

"So the moral right thing to do for AMD/ATI is to make the High Quality setting the standard default."
December 9, 2010 4:24:07 PM

http://benchmarkreviews.com/index.php?option=com_conten...

1120-Core "Fixed" Radeon HD 6850 Video Card Review Samples Shipped to Media

Websites publish Radeon HD 6850 reviews with inflated results as a result of hidden 6870 GPU.
Shortly after AMD launched their new Radeon HD 6000 series, which featured their 960-core AMD Radeon HD 6850 and 1120-core AMD Radeon HD 6870 video cards, some reports surfaced that some retail manufacturers had shipped 6850 test samples with the same 1120 shader core Barts GPU that comprises the more powerful Radeon HD 6870 video card.

This creates a major problem for review websites, because many of them had unknowingly published their Radeon HD 6850 video card reviews to the public. Fortunately for AMD, although quite unfortunate for NVIDIA, many curious readers have now incorrectly perceived the Radeon HD 6850 to be capable of performance levels not possible from its correct GPU - even with maximum overclocking. While this incident excludes all unbranded AMD reference samples, partners such as Sapphire, HIS, PowerColor, and XFX appear to have sent "overpowered" samples to reviewers.
December 9, 2010 4:28:52 PM

Yet before that he said, and I quote...

Quote:
We have a hard time spotting differences as much as you do, and while making the screenshots we increased gamma settings to 50% and applied a resolution of 2560x1600 to try to make it more visible.

Do you spot the difference? Probably not. That is the rule we live by here at Guru3D: if you can not see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat.


Seems like a very mixed message, doesn't it?

In the forum thread for the article, he says he is concerned that it will start an image quality contest (which is what it already is), and he has also not remarked on the fact that the nVidia cards look worse in some aspects of the Trackmania title.