
Discuss Lars column

June 24, 2003 8:04:40 PM

Well, for those who have read it: what are your opinions on it?

The first page was good. It gave some brief information on what is to be discussed later on. The picture with the text also foretells some of the misrepresentation that occurs in the industry.
The second issue mentioned I found only partially good, because it didn't cover some of the things one could associate with 3DMark. A 3DMark score is not much use if you have nothing to compare it to. And how about trying to see it this way: if you got a good score on game test 4, which was DX8, let's say 60 fps, then you can quite safely assume that you can play most DX8 games at fairly OK performance. Not to say it will be 60 fps; it could be more, since games often have some slight optimizations in general and don't run neutral code like 3DMark01 does.
But it's not all bad. It describes how 3DMark went from tiny to huge, something not everyone might be aware of. And there is some more background on why NVIDIA left the Futuremark beta program, with some of the statements partially quoted as well. Good info on how the beta program works, to a certain degree.
I agree with some of the parts mentioned in the "NVIDIA under fire" section. I liked it when he mentioned that the cards support different levels of precision in the pixel shaders. It brings up memories of how fanboys used to argue that one card was superior to the other, and vice versa. It also gives some hint of the mudslinging between the two major competitors in the graphics sector over which is best. At the last part of the second page of the same section it mentions one of the things that is still here and that fools some: hype. The 32-bit entry war. One graphics card maker was able to handle 32-bit color in games and the other card maker only 16-bit (22-bit internally). Even if at 32-bit it went at a snail's pace, it was hyped like crazy and managed to plant in some people's minds that this card was the best because it could handle 32-bit color output.

When we get to the 3DMark03 issue we get to the part that might have been the thing that led to this column. Further information regarding 3DMark03 and that particular graphics card maker can be found almost everywhere, so I won't bring anything up about it. Search, read, search more, read more, evaluate.
Regarding the claim that there is no neutral code: wasn't that why standards like DX were created? (Yes, there is ARB also, but we mostly talk about DX anyway, so :D )
And again, good of you, Lars, to address how easily marketing fools a lot of people, by mentioning examples like the AGP 4x vs. AGP 8x talk.

I will summarize my opinion regarding the last 3 sections.
I think Lars should take a neutral stance on any issue that occurs between the two major competitors, and on any future issues, and provide unbiased info to the crowd here at THG. Not purely quoting any card maker's statement, but criticizing, evaluating, and giving suggestions on why it doesn't seem right, or anything else. Maybe it's only me, but I want to learn and not get PR statements when I come here.
Lars mentioned in the column that there were both pros and cons regarding the TWIMTBP program. But do the pros even out the cons? Since the developer would optimize mostly for that certain card maker's hardware and not for the other, wouldn't that make benchmarking even harder? That would invalidate many games as benchmarks right from the start, wouldn't it? (Yeah, I know UT2003 is one of the games on that list. And it turned out all right, didn't it? The developers themselves said they were surprised that UT2003 worked so well on the cards they hadn't even aimed at when developing.)
In these times, when reviewing video cards gets worse, harder, and more time-consuming, how much should the reviewers have to go through because the card makers can't play it clean?
Of course, most developers will probably develop their game for as many platforms as possible to get as many potential sales as possible. If the TWIMTBP program does go well and the public slowly moves in that card maker's direction, it will inevitably try to push more and more of the market in its direction, even if it means running over a lot of things along the way (for example, invalidating different kinds of opinion, invalidating benchmarking software). It's business: you want more and more, and you should try your best to get more.

Pretty OK column overall. I hope Lars goes the neutral way and provides THG with information, not PR statements, in the future.

Note: I was bored. Had to write something :) 

Wooba Wooba


June 24, 2003 8:21:52 PM

Just disappointed that they didn't give ATI's cheating as much space as they should have. Both companies need to have their little cheats pasted all over the place so they get plenty of publicity and lots of hate mail... it's the only way they're going to learn.

Shadus
June 24, 2003 8:35:17 PM

Yes, it does seem good. But what would you do if only one of the two parties listened and corrected itself, while the other kept doing the bad things (cheating, etc.)?
Hate mail isn't enough.
The community has to take a stance, through the reviewers and the forum participants who are consumers (fanboys excluded).

Wooba Wooba
June 24, 2003 8:45:10 PM

I'm going to have to disagree with you here, Shadus. I think it was represented fairly. ATi's transgressions were mentioned, even the old Quake scandal. And floating-point precision was addressed fully, IMHO.

You can call me a fanboy if you like, but I don't think I'm being unreasonable when I say that there is a big difference between a 23% and a 2% increase in 3DMarks. And an even bigger difference in policy.

Call them "optimizations" or "cheats"... I don't care what you call them. But only one company admitted to and retracted the practice, while the other simply ignores criticism from the community of their customers.

That is a hell of a differentiator, as far as I'm concerned.

If thinking that makes me a fanboy, so be it.
But my twisted fanboy mind is inclined to think that it's more fanboyish to ignore such a huge difference in corporate attitude and actions.
June 24, 2003 10:33:27 PM

Post deleted by Thomas

Sorry guys, but I've started to check out this forum on a regular basis now (well, it was about time, wasn't it?). From now on, I'd appreciate it if we could stick to some reasonable level of courtesy.

Can we do that?

Best wishes,

Tom

(Edited by Thomas on 06/26/03 07:16 PM.)
June 24, 2003 10:34:10 PM

Bangarang, Cleeve.

You're absolutely right, dude. The opts they used were legitimate, as they didn't change IQ or deceive anyone to gain an edge on the competition. ATi never tried to hide their optimizations. Hell, nVidia is still riding out the "driver bug" story! What CRAP! ATi said, "If you don't like it because it wasn't part of the original benchmark, cool, we'll remove it" (and they did). They lost 1-2% performance, and the IQ never changed. nVidia went and blamed it on everyone else, and when the problem was finally corrected (by Futuremark, in the 3DMark03 scandal), they lost 25% and their IQ went from crappy to normal (albeit still inferior to ATi!!!)

Tit for tat, butter for fat, ATi's dog kicks nVidia's cat

(Maximum PC)
June 24, 2003 10:37:42 PM

YIKES, GW... woke me up... that's okay, you're absolutely right.

Tit for tat, butter for fat, ATi's dog kicks nVidia's cat

(Maximum PC)
June 25, 2003 4:20:16 AM

It also goes to show how different parts of a company can have completely different mentalities.
It is clear the ATi side that responded so humbly and with honor to this issue is not the same side that created that PR PDF aimed at bashing nVidia's FX line.

Same thing with Intel's excellent production side compared to their marketing dept., which heavily influences their production (hurrying the Willamette out).

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 25, 2003 4:25:04 AM

Hmm, I didn't see his post as being flame bait here.

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 25, 2003 4:34:30 AM

I don't care... I don't like him.
Do not make my bunghole angry.
*edit*
A couple of days ago this POS posted in a thread, insinuating that we were all stupid.
I won't put up with that here...


"We need fangirls" - dhlucke<P ID="edit"><FONT SIZE=-1><EM>Edited by geneticweapon on 06/25/03 00:47 AM.</EM></FONT></P>
June 25, 2003 5:53:47 AM

HAHA you tell that fargot Gen!

The funny thing is, my 3DMark 2003 score at stock 9800 speeds went UP 100 points with the Cat 3.5s, so although they removed those optimizations they must have increased performance in other ways and areas.

"Every Day is the Right Day." -Pink Floyd
June 25, 2003 6:32:06 AM

Quote:
although they removed those optimizations they must have increased performance in other ways and areas.

Yea, instead of shader re-ordering, they've switched to shader replacement....just to compete :cool:

"We need fangirls" - dhlucke
June 25, 2003 6:39:40 AM

You don't really believe that, do ya Gen?
You're just an nv boi now 'cause u got that Ti4200!
I don't think ATI would be stupid enough to do that; there would be a 100% chance of them getting caught!
Why shouldn't new drivers unlock performance?
As long as it's legit.
I totally expect more performance from future driver updates. Esp. in, like, uh, teh Doom 3 and teh HL2, etc.

"Every Day is the Right Day." -Pink Floyd
June 25, 2003 6:45:24 AM

Quote:
It is clear the ATi side that responded so humbly and with honor to this issue is not the same side that created that PR PDF aimed at bashing nVidia's FX line.


It's just their marketing strategy... it never benefits the end users at all...
June 25, 2003 9:38:35 AM

Quote:
the 32-bit entry war. One graphics card maker was able to handle 32-bit color in games and the other card maker only 16-bit (22-bit internally). Even if at 32-bit it went at a snail's pace, it was hyped like crazy and managed to plant in some people's minds that this card was the best because it could handle 32-bit color output.


Actually it's 24-bit precision, not 22. And if you've actually seen any of the captures, the difference between 24 upgraded to 32 and static 16 IS major, but the difference between 24 rounded up to 32 and true 32 is imperceptible. Anandtech had a large article on it back in the era of the 5800 vs. 9700 debate.
The difference is that nVidia DID downgrade their options, while ATI ran at max, which most reviewers have viewed as being equal. The main thing is the resultant IQ. Pseudo-32 and actual 32 are pretty damn close; true 16 vs. pseudo-32, BIG difference.
The main thing to me is what is needed to provide a good picture. If you can give me the same perception of quality through a simpler (perhaps lesser) technique, do I care? If the picture suffers I will definitely care, especially if they run the 16-bit path DESPITE my settings in the control panel, simply because they want to win benchmarks; that's trading your consumers off for a benchmark score.
Considering the poor settings on most people's monitors, I doubt they would notice the difference between pseudo-32 and true 32 even in a blown-up comparison. However, even most of the crappiest monitors can show the difference between 16 and pseudo-32 (24), although not everyone will notice, since their eyes may not be that fine-tuned, or perhaps they're frag-strafing the hell out of their game and everything is happening so fast they can't really see quality anymore; it's just a radar screen. There! Kill! There! Argghh NOoo! :smile:

Anywhoo, I think it's a small issue, and it HAS been covered by the community already and isn't cheating, just different architecture, which brought better precision earlier. And yes, the CineFX 128-bit is better than the R300+'s 96-bit, but how much better? Perceptibly?


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
June 25, 2003 10:12:35 AM

Quote:
Actually it's 24-bit precision, not 22

actually he's talking about the Voodoo's 22-bit color precision against the TNT's 16-bit / 32-bit precision..

but yes, you're right..

some calc:

the 32bit values have a 1-bit sign (+-), a 23-bit mantissa and an 8-bit exponent..
the ati 24bit values have a 1-bit sign, 23-8 = 15-bit mantissa and an 8-bit exponent.. (i think..). so, the value range is the same, but the precision gets less..

the 23-bit mantissa can store 8388608 different values, the 15-bit mantissa can store 32768 different values. so down to changes of 1/32768 of the mantissa, they behave the same.. that's 0.000030517578125.. 0.030517578125 per mille, that is..

sure, that's relative to the size of the float (namely the exponent..), but 0.000030517578125-sized steps are precise enough for me.. at least for the only 64 arithmetic instructions you _can_ have in a pixel shader anyway.. it possibly accumulates to some.. percent difference at MAX per pixel.

but in the overall picture, it's a difference of some milli-percent.. at MAX..
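if you want to play with those numbers yourself, here's a quick sketch, assuming the mantissa widths quoted above (the 15-bit figure is my guess, not an official spec):

```python
# Rough sketch of the relative-precision estimate above, using the assumed
# mantissa widths (23 bits vs. 15 bits); these are not official hardware specs.
def mantissa_step(mantissa_bits: int) -> float:
    """Smallest relative change representable with this many mantissa bits."""
    return 1.0 / (2 ** mantissa_bits)

for bits in (23, 15):
    print(f"{bits}-bit mantissa: {2 ** bits} values, step = {mantissa_step(bits):.15f}")
# 23-bit mantissa: 8388608 values, step = 0.000000119209290
# 15-bit mantissa: 32768 values, step = 0.000030517578125
```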

"take a look around" - limp bizkit

www.google.com
June 25, 2003 10:58:54 AM

Ooops, missed that.
And I know that if I mention the technicals I can rely on you to come and explain things (as long as you're watching) to the point where I can say uh-huh and nod my head like a bobble-head doll. I get it while at the same time not getting it (some things just fly overhead while I watch in fascination); it's a very strange feeling. :eek:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
June 25, 2003 1:42:19 PM

Yes, one company cheated and said they removed the cheats... then got caught with more cheats later on. ATi *just* got caught again playing with the pixel shaders, by the same guy (not Futuremark) that found the slew of cheats nvidia was doing with pixel shaders. The Nature demo in 3DMark was cheated on by both nvidia and ati, and by just about an equal amount too. Both teams are cheating, severely. I just hate to see only one getting beaten on when both deserve it.

Shadus
June 25, 2003 3:24:53 PM

I wonder where Nvidia got the idea to support different floating-point modes, different from the competition's, with only tiny differences from the higher-quality mode. That seems to be the only noticeable thing they took when they acquired 3dfx.
June 25, 2003 3:33:34 PM

Interesting. I'd like to see all the cheats that have been found.

What's the link, Shadus?

------------------
Radeon 9500 w/256 bit memory bus @ 367/310
AMD AthlonXP 2000+
3dMark03: 3439
June 25, 2003 4:38:03 PM

They have covered ATi's mess.
Now why is nVidia being beaten down?
Excuse me, but they HAVE been accused 3 times, THEY HAVE been the center of attention with their accusations and convictions. THEY are the ones suspected of downgrading float precision in their drivers, THEY are the ones who cheated in 3DMark03 BADLY, THEY are the ones who were recently found to have messed with the anisotropy settings via application detection for 3DMark03.
What, do you want me to write you a book on ATi's 1.9% difference cheat and give nVidia a page of tabloid?

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 25, 2003 5:14:23 PM

Yeah Shadus, I agree with Cleeve; provide a link to back up your bluster. It doesn't need to be clickable, I can do that for you, just provide any proof of these 'NEW' cheats of which you speak that are equal in scope and number to those of Nvidia.
And while you're at it, how about providing those MATROX cheats you were talking about before? You're so full of BS counter-accusations you can't support, it's ridiculous.
Show me EITHER of these two references and maybe I'd believe ONE more thing you say isn't NV PR, but until then you're full of BS!

And in case you forgot, THIS (http://forumz.tomshardware.com/hardware/modules.php?nam...) is the thread with your BS Matrox accusations, which you backed down to "Oh I forget the exact card anymore.." and "If you care so much YOU look on Google" instead of backing up your BS. I refer you to my last post of that thread, and its opening statement: "Well then no one can trust ANYTHING you say." (http://forumz.tomshardware.com/hardware/modules.php?nam...)

Now's the time to put up or STFU! Post a link (and not to a friend's site, to a reputable site) backing up your statement.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
June 25, 2003 5:46:06 PM

err, YEA!!

"We need fangirls" - dhlucke
June 25, 2003 6:32:41 PM

I think they said something about it in a press release, although I can't find it. Something about reshuffling the code to increase precision... the code itself has not changed, just the order in which it is processed. So more efficient, I guess... and I would consider that an optimization, not a cheat. Of course, my opinion and a subway token will get you on the subway.

Insert_ending_here,
TKS
June 25, 2003 7:04:53 PM

the reasons are ... "simple"..

ati worked out the dx9 minimum requirements together with microsoft.. meaning their card _IS_ the minimum. not more, not less. they told microsoft they use 24bit, so microsoft had to say 24bit is the minimum, to not steal dx9 away from ati..

nvidia on their side thought, hey cool, floats on a PC (on the cpu, that is) are 32bit. so they thought (and marketing-wise they are indeed right) they could claim to be the only real 32bit-float gpu, meaning the same as a cpu. this is actually wrong: the size is the same, but the math is not the same (anyone remember that cpus calculate internally with 80bit floats?!??! ahh? ahh? :D )..

at the same time nvidia knew this would be [-peep-] slow.. and in a lot of situations a 16bit float type would be enough.. which _is_ true. the problem is, it's not enough according to the dx9 specs => the whole 16bit mode can get dropped in order to be dx9 compliant => their fast modes will be unused while the only remaining mode is the slow mode..

first of all: the IDEA of nvidia is NOT BAD! i like the thought. but there are two major issues with the multi-type system:
it is NOT what the dx9 spec asks for. so WHY DO IT? there is no NEED!! gamers want SPEED, not more features that no dx9 app can EVER EVER USE!!
second, it is complicated. for developers. they always have to think, hm.. is 16bit enough here? it would be faster then.. or is even the 12bit fixed point enough? .. or do i need 32bit at this point?.. this means tons of different shaders to test, to find out which one performs best with the best image result.. a lot of work you DON'T need on ati cards.

secondly, it was clear from the beginning (i knew it from the first time i saw that nvidia would do multiple-sized types at the same time.. and that's now about a year ago!!) that it would lead to a major issue: CHEATING. i mean.. nvidia KNEW their hw would suck in dx9-compliant mode and rock if they went to lower precision. it was CLEAR that they would have to play around replacing dx9 shaders with mixed shaders which do 16bit where it's about not visible (but it is measurable :D :D :D ) and 32bit where it's needed. that way they can gain a lot of performance. problem: THEY FALL BEHIND DX9 COMPATIBILITY.

said differently: the gfFX is a great card, but it has only a slow dx9 emulation. the fast mode you see currently is not dx, not gl, it is nvidia-made and is NOT dx9 compliant.. you can call it extended dx8 if you want..
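to give a feel for the difference, here's a toy comparison in numpy (not nvidia's or ati's actual shader pipeline; numpy's float16/float32 just happen to match the 16bit and 32bit shader formats, and dx9's required 24bit has no numpy equivalent):

```python
# Toy precision comparison: numpy float16 vs. float32 as stand-ins for the
# FP16 / FP32 shader formats; DX9's minimum FP24 sits between the two.
import numpy as np

x = np.float32(1.0) / np.float32(3.0)   # a typical shader intermediate value
print(np.float16(x))                    # 0.3333     (~3 decimal digits)
print(np.float32(x))                    # 0.33333334 (~7 decimal digits)
print(np.finfo(np.float16).eps)         # ~0.000977
print(np.finfo(np.float32).eps)         # ~1.19e-07
```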


don't get us started on the clipping planes. they simply, plain simply, and just simply SUCK!

well..

if you go back and read my posts, you'll find an important note from me, months ago: "the gfFX is not following the dx9 specs.. it does either more, slower, or less, faster, but not exactly what dx9 asks for.. that WILL LEAD TO TROUBLE" and guess what? IT DID/DOES.. more than i expected..

i knew nvidia would bleed.. :D

"take a look around" - limp bizkit

www.google.com
June 25, 2003 7:24:07 PM

If it bleeds, we can kill it.

------------------
Radeon 9500 w/256 bit memory bus @ 367/310
AMD AthlonXP 2000+
3dMark03: 3439
June 25, 2003 8:21:54 PM

Wouldn't it have been simpler for Nvidia just to have made the FX a dx8 card then? How much could that have hurt them? Obviously they haven't gotten to the point where they can do dx9 better than the competition so why not just make a dx8 card that kicks major tail?

<A HREF="http://forums.btvillarin.com/index.php?act=ST&f=41&t=38..." target="_new"><font color=red>dhlucke's system</font color=red></A>

<font color=blue>GOD</font color=blue> <font color=red>BLESS</font color=red> <font color=blue>AMERICA</font color=blue>
June 25, 2003 9:12:04 PM

it is a dx9 card. just a [-peep-] slow one. there is a fast dx9 path in the hw as well, but it has too low a precision, while the slow one has too high a precision..

would you call such a card a dx8 card if your competition has dx9 cards? no..

"take a look around" - limp bizkit

www.google.com
June 25, 2003 10:18:28 PM

I'm not sure why everyone argues about which company "cheated" less or more. That in itself is irrelevant. The bottom line is that they both are "cheating": whether you cheat by 20% or 5%, the result is essentially the same, because by cheating you are inflating the apparent performance of your product to make it look faster than it really is. Don't get me wrong, I am not saying that there aren't degrees of cheating; I am saying that once you are cheating, all credibility is lost.

To illustrate the point, people claim that ATI withdrawing a SINGLE instance of cheating is far better than Nvidia cheating by a larger margin and continuing to do so. But I don't see either company doing the RIGHT thing. My question is: had the "cheat" by ATI not been discovered, would ATI continue to exploit the "cheat" with unaware consumers? I believe, in the current industry, the answer to that question would be "yes" for both companies. In order for both companies to be credible, both must not "cheat" to begin with. That, however, is being too hopeful. I believe something more feasible is for both companies to be required to release detailed documentation on their products and driver software, but that alone does not solve the "cheating" problem.

Furthermore, I think the article went quite a ways toward describing the current graphics-chip mud-slinging war; this isn't about WHO is cheating more, it's about a competition that is lacking standards. The article also tries to say something about the current "standards" and how those standards are biased.

More specifically, the latest version of 3DMark (which to some degree is the current "standard") is greatly influenced by graphics chip manufacturers rather than software developers. In general for PC software, most games are NOT built first and do NOT have specialized hardware built for them. Games and software are built FOR the current hardware. To some extent, the development of 3DMark was backwards; by allowing hardware vendors to participate in the development, they could gain an advantage by building their hardware for the software, which is absurd. Very few games are built such that hardware is specifically built for the game. What is a small game software company going to do? Have big-name Nvidia or ATI make a graphics card specifically for their game, with no idea how well the game will sell? I think not. This is not a strict rule for everything; there are plenty of cases where hardware is built after software, but most software companies don't know how big their sales are going to be and therefore develop for the hardware.

This article is about how much credibility there is in the graphics chip competition, game benchmarking standards, the DIFFICULTY of benchmarking, and how the mud-slinging and the credibility of both companies continue to get worse. In my opinion, it will continue like this until a (neutral) system or rule is put in place to stop the current "cheating" practices.
June 26, 2003 12:44:47 AM

I agree that graphics chip manufacturers should have to publish their specifications. I also find it interesting that ATi went out of their way to tell us that they removed the optimizations that were *found*. Hmmmmm. What about the rest of them?

Build a foolproof system and they'll build a better fool.

(Edited by Confoundicator on 06/25/03 05:49 PM.)
June 26, 2003 8:27:10 AM

The question becomes 'what is a cheat?'. Quack and Futuremurk are cheats, pure and simple. It's a question of whether the intent was obviously to increase scores at the COST of image quality, or whether it's just a quicker way of achieving the same results. One is a cheat and one is an optimization, IMO. Now, BOTH cheated, but by blurring the issue you make it sound as if that's the end of the story. If people change their ways they should be given credit. And if it's minor transgressions versus larger ones, then the difference is substantial (there are degrees of things, like GRAND theft and PETTY theft). Making A=B despite the glaring differences in their methods/cheats just whitewashes the issue and doesn't really make a good start at solving the problem, which is kinda complicated.

Personally, from what's been seen so far, I'd say one company has engaged in PETTY cheating and the other in GRAND cheating. However, yes, BOTH are cheats and BOTH should stop, although it appears only one has felt it's an issue worth stopping (or at least stopping what's been discovered).

As for the methods of Futuremark, I don't see how game developers would be different. They are influenced by NV and ATI too. I also doubt that small game developers could spare the resources to help Futuremark (who are really software developers themselves) come up with a reasonable test (and it's not just ATI and NV, it's A LOT of companies, including the software big boys). While everyone's involvement is key, it is obvious who benefits most, and it's like most other free-market systems. As much as we may have issues with it, Futuremark built their product and reputation; it wasn't handed to them by someone else (like other governing bodies). I'm all for any other unbiased/objective replacement for 3DMark, I just haven't seen any good suggestions so far. Maybe Aquamark will be a good replacement, but I am CERTAIN that will depend on the legal teeth they can put behind the use of their product.
Remember, these are just tools, and it depends on how we use them. Unfortunately we've found that one of our tools is broken; the problem is there is no replacement at this time, so we're screwed.



- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
June 26, 2003 11:17:46 AM

Quote:
I agree that graphics chip manufacturers should have to publish their specifications. I also find it interesting that ATi went out of their way to tell us that they removed the optimizations that were found. Hmmmmm. What about the rest of them?

Exactly, what about all the other ATI optimizations, like the ones found out 2-3 weeks earlier in 3DMark2001SE?

The problem nowadays with drivers is that it's impossible to write a driver which will work out of the box with all the different programs/games/benchies. It's mandatory that today's drivers are coded in a way that recognizes a specific program and then applies the needed optimizations.
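Just to illustrate what I mean by that, here is a made-up sketch of such application detection; the executable names and settings are invented, not taken from any actual driver:

```python
# Toy illustration of driver-side application detection: the driver looks at
# the executable name and applies a per-application profile. All names and
# settings below are hypothetical.
import os
import sys

APP_PROFILES = {
    "3dmark03.exe": {"shader_path": "replacement", "aniso_level": 4},
    "quake3.exe":   {"texture_format": "compressed"},
}

def pick_profile():
    """Return tweaked settings if the host executable matches a known name."""
    exe = os.path.basename(sys.argv[0]).lower()
    return APP_PROFILES.get(exe, {"shader_path": "standard"})

print(pick_profile())
```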

Both nvidia and ATI have been "cheating" longer than any of us here are aware of, and it's only a matter of time until the next big cheating scandal (either from ATI or nvidia) is revealed. If anyone really believes that ATI is as harmless as they are currently saying, trust me, you have been fooled in this case.

The period of raw performance ended with the release of 3DMark03 and the resulting "cheating/optimization" findings. Currently the graphics card sector is changing, and hopefully all those driver optimizations which until now have been done in secret will finally be commonly accepted and maybe even publicly discussed. This evolution is inevitable in any case. Of course, all these optimizations should NOT change image quality; they should only make the rendering process faster for a specific piece of hardware.

I beg everyone to seriously reconsider their opinion of those driver optimizations and to accept them as legitimate, because it will be much easier for both nvidia and ATI fans to accept the truth when, in a few weeks, the next optimizations are discovered... because they do exist for every company, including Matrox and SiS.

And please be rational; don't come now with "ATI only optimized their driver for 3dmark03", because due to the complexity of writing a driver nowadays for all the different programs and games, it's impossible to write one which works with everything without any tweaking.

In fact, the performance increases found in the latest Catalyst 3.5 driver only seem to prove that. What 3DMark is for DirectX, GLExcess is for OpenGL, and what a weird coincidence: GLExcess scores went up by 12-14% with the 3.5 driver. A strange thing, considering that the best-performing cards in this benchmark have always been nvidia cards. A specific GLExcess optimization or a general OpenGL optimization? Same thing for the Viewperf benches; it's always suspect when only the most important games/benches gain performance with newer drivers.

Of course, I didn't say that nvidia never practised similar methods; it's just that it should be clear by now that everyone does optimizations, everyone cheats so to speak, and that it is about time to accept those optimizations in order for them to become more transparent to the end user.
June 26, 2003 12:46:37 PM

Lemme look around; it was either HardOCP, The Register, or Shacknews... those are the only three sites I read regularly (besides here, of course). I'll dig around as soon as I get some free time here at work and post it.

I still stand by what I said about the Matrox card also; I never backed down in the least, I just don't care enough to bother looking it up when it happened THAT long ago. If you care that much, go look it up, it's still out there, nothing disappears from the web entirely... if it bothers you so much and matters so much to you, do the obvious and go look. I didn't end up buying the card, so I really don't give a flying sh!t.

Shadus
June 26, 2003 1:48:28 PM

> everyone cheats so to speak, and that it is about
> time to accept those optimizations in order
> for them to become more transparent to the
> end user.

He who cheats best wins! That is what is going to happen in the long run, most likely, regardless of what we want. ATi cheated with Quake, they cheated in 3DMark, nvidia has cheated before and now got caught with their hand stuck in the cookie jar again; they're both going to cheat in the future when they feel they 'need' to, and bad publicity is better than NO publicity.


Shadus
June 26, 2003 3:15:36 PM

Naw, man, I think they've learned their lesson. Especially nVidia. They took the whole damn cookie jar, and were soundly slapped for doing so. I hope they've learned their lesson. As for ATi, I still stand by my opinion that they never did anything unforgivable. Everything they did was in the spirit of optimizing, not cheating, and they took it away for clarification, solving the problem. nVidia is STILL lying. That's what ticks me off.

Good publicity is better than all of that.

Tit for tat, butter for fat, ATi's dog kicks nVidia's cat

(Maximum PC)
June 26, 2003 3:18:15 PM

I am never going to accept a reality where GFX card companies make a hard-working graphics programmer's graphics look worse than intended so THEY can enjoy selling cards to the uninformed.

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 26, 2003 3:32:30 PM

Yeah, that's why I have no intention of buying the nVidia 5900 Ultra when I get a new computer. I'm getting a 9800 Pro 128MB, which I feel is the best deal on high-end cards anyway.

Tit for tat, butter for fat, ATi's dog kicks nVidia's cat

(Maximum PC)
June 26, 2003 3:34:39 PM

nvidia cheats the programmers as it tries to make them nvidia-dependent; both companies cheat programmers and gamers with image-quality issues (nvidia much, much more currently), and especially nvidia cheats gamers a lot in terms of performance.

to the question above, what is a cheat:
everything that changes if you alter the file name or anything else in an executable, while you DON'T TOUCH THE GRAPHICS PART.

the same graphics calls, made in the same order, have to result in exactly the same pictures, in exactly the same quality and with exactly the same performance. if they don't, it's a cheat.

and there is possibly a way out of it: recording. every program's calls to the outside (drivers and such) are recordable; there is an open-source wrapper for opengl which is just there to log opengl calls. it helps the programmer find bottlenecks, or helps to write an opengl wrapper for dx-only systems (the xbox for example), or helps to make Dawn run on the radeon, etc.
with such a wrapper you could record ALL opengl calls and make replayable test scenes, where you could change the whole executable (and its name) and everything, every time. so the driver could still optimize, but only if it's an optimization, and not a cheat.

optimization == the driver does a certain task better
cheat == the driver detects a special situation and doesn't behave as it normally would in this situation.
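something like this, as a toy sketch of the recording idea (not the actual opengl wrapper i mentioned, just the principle; the GL names and constants are only stand-ins):

```python
# Minimal sketch of the call-recording idea: wrap each graphics entry point,
# log its name and arguments, and the log becomes a replayable "test scene"
# that no longer depends on the executable's name.
import functools
import json

call_log = []

def record(fn):
    """Append the wrapped call's name and arguments to the log, then run it."""
    @functools.wraps(fn)
    def wrapper(*args):
        call_log.append({"call": fn.__name__, "args": list(args)})
        return fn(*args)
    return wrapper

@record
def glClear(mask):                      # stand-in for the real GL entry point
    pass

@record
def glDrawArrays(mode, first, count):   # stand-in as well
    pass

glClear(0x4000)
glDrawArrays(4, 0, 36)
print(json.dumps(call_log))             # the replayable scene, independent of the exe
```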

dunno if that would help.. for the first part, yes.. but how long till there would be cheats around that?

and no, nothing will change. i saw "Bowling for Columbine" today and learned a lot.. forget believing, dudes. the plain fact is nvidia doesn't suffer from this situation and we can't change that. so it will continue.

"take a look around" - limp bizkit

www.google.com
June 26, 2003 3:43:09 PM

Um, you just contradicted yourself!
Quote:
everything that changes if you alter the file name or anything else in an executable, while you DON'T TOUCH THE GRAPHICS PART.

I don't remember a cheat being something that alters performance without touching graphics :wink: .

Quote:
exactly the same quality and with exactly the same performance.


Then why bother optimizing, if only the code inside is better but no performance increase or noticeable difference reaches us gamers?!

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 26, 2003 5:32:05 PM

The only reason I can think of where an optimization might give the exact same quality and the exact same performance would be a stability/bug fix. Something which makes Application A run glitch-free, let's say reworking the drivers to run shader/code/etc. as 1324 instead of 1234; however, when run on Application B (a benchmark, let's say), it actually does affect performance, because that reworking changes something about the way the application runs which is outside the norm. You still end up with the same quality but also increased performance, but only in app B. Now, this would be an example of a rework which, for either product, could end up with unintended effects: maybe the goal was same performance, same quality, but it worked out as same quality, improved performance elsewhere. The opposite could also be true, where quality suffers but performance stays the same (or increases).
However, since these drivers detected the benchmarks and then set their paths/etc., this is not the case.
But it could happen that way.
Just an idea.

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
June 26, 2003 10:06:30 PM

you got it wrong:

if you don't change the graphics part of an application, meaning it renders the SAME SCENE EXACTLY WITH THE SAME CALLS, then it should result in exactly the same performance (on the same hw, same drivers, same configuration, etc.). if it does NOT, it's a cheat, because it performs differently depending on graphics-unrelated facts, like "quake3.exe", or "this is the loading screen of 3dmark03", or whatever.

it is ALWAYS a cheat if it does something BECAUSE of something unrelated, DEPENDING on something unrelated.

bugfixes in drivers that only work for one special app (say a bugfix for good old halflife) are a cheat, too. but it's understandable, because sierra possibly doesn't want to fix their own bugs years after production.. still it's a cheat, and it means vendors have to port that fix over generations and generations of hw, or in some years you can't play halflife "correctly" anymore.

if there is a bug, fix it. but fix it where it happens to be a bug. application-specific bugfixes are not reusable. they should get fixed with an application patch.

a gpu driver should behave like a mathematical function: same input, same output. ALWAYS. of course, without bugs, and with the best performance. but the most important thing is ALWAYS THE SAME RESULT WHEN DOING THE SAME THING.
and THAT is what is not guaranteed with cheats.

for optimizations as for bugfixes (as for every problem in life :D ): always find the real source of the problem, and solve that. most of the time you can solve many more problems that depend on that source as well, and you're sure the bug does not reappear.

"take a look around" - limp bizkit

www.google.com
June 26, 2003 11:57:46 PM

But I don't understand; by your statement, an optimization is a cheat, because you did not alter the graphics yet increased performance.

Was I right when I said you contradicted yourself, when you said it's a cheat if the image quality is unaffected?

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
June 27, 2003 12:26:25 AM

No personal attacks. Let's stick to the discussion of facts, please.

Best,

Tom
June 27, 2003 12:37:11 AM

We've got two topics at hand that should be considered wisely:

1) What can be considered a cheat? As long as an application runs faster and looks the same, you might be able to call it an optimization. It gets kinda tough when you look at NVIDIA's clipping-planes issue in 3DMark03. After all, it's not a game but a benchmark, so the 'camera' ain't supposed to move any other way.

2) Can we really classify cheating? Does it make a difference if the gain is 20% or 2%? Isn't cheating just cheating? What if your buddy or spouse cheated on you? Would you distinguish between "a little cheating" and "a lot of cheating"? Where's the limit?

I haven't even made up my own mind just yet. Lars and I discussed this issue at length. I am personally disgusted by the fact that we now suddenly learn that we can't trust anyone, as cheating has become rather fashionable.

What are your thoughts on that?

Cheers,

Tom
June 27, 2003 12:56:15 AM

I tend to see it the same way. People who cheat don't deserve to be trusted, regardless of how much they gained. Right now, all the dirt that has been piling up in the drivers for years is coming out. While NVIDIA has now been in the line of fire and in a permanent defensive position for months, ATi has been firing away without being really clean and fair itself.

Best,

Thomas
June 27, 2003 12:57:37 AM

Yes! We finally have a moderator!

Quote:


I haven't even made up my own mind just yet. Lars and I discussed this issue at length. I am personally disgusted by the fact that we now suddenly learn that we can't trust anyone, as cheating has become rather fashionable.

Mr. Thomas (or is it just Thomas?), many other review websites have been considering using custom timedemos for future reviews. I was wondering what Tom's Hardware's stance is on custom timedemos in reviews.

Quote:

2) Can we really classify cheating? Does it make a difference if the gain is 20% or 2%? Isn't cheating just cheating? What if your buddy or spouse cheated on you? Would you distinguish between "a little cheating" and "a lot of cheating"? Where's the limit?

The issue is that ATI's optimizations can be applied to any game. However, nVidia's performance-enhancement technique only applies to timedemos and synthetic benchmarks where the camera views are known in advance. I, and several people on both Tom's Hardware Community and other hardware communities, agree that what nVidia did should be classified as a cheat, while what ATI did should be classified as an optimization. I think most people agree there. However, some argue that a synthetic benchmark such as 3DMark2K3 is supposed to create a level playing field. They insist that graphics card manufacturers shouldn't optimize for 3DMark2K3, period.

Intelligence is not merely the wealth of knowledge but the sum of perception, wisdom, and knowledge.
June 27, 2003 1:06:58 AM

The GRX chip makers (and their almighty driver development teams) like to claim they are free from any evil, while preferably pointing at others -- until the very day they're caught cheating themselves.

Let me ask you a question:

We have strong reasons to believe that all of NVIDIA's cheats were actually found by ATi and then slipped to certain publications, who sold these findings as their own. It's not my place to judge this.

However, how much work do you think ATi invests in their own "optimizations" when they already use so much manpower to find cheats in NVIDIA's drivers? More? Less?

I wish I had nothing better to do than to look for the failures of other media and play them to the right sources, who use this information to their own benefit as well as, of course, mine.

Regards,

Thomas
June 27, 2003 1:27:00 AM

Yes, after years of wild wild west, here's the pseudo-almighty, but unfortunately not at all omniscient, moderator! As it has been since the inception of Tom's Hardware in 1996, just call me "Tom" or "Thomas", whichever you fancy.

I personally think that customized benchmarks that are unavailable to the GRX-chip makers are the ONLY way to ensure unbiased testing in the future. Of course we should also hear Lars' opinion on this.

As I already said, I don't feel comfortable getting caught up in the discussion of "did NVIDIA cheat or did ATi cheat?" I know one thing for sure, though: NEITHER OF THE TWO HAS BEEN TOTALLY HONEST with us, as we are learning these days in rather heavy doses.

I have my own problems with the actual legitimacy of 3DMark03. Games are mostly optimized for ALL GRX CHIPS with a reasonable user base. 3DMark03 was not. I think that Futuremark's sweet idea of a "level playing field" is seriously flawed. Today, in the era of programmable GPUs, we have the same situation as with other applications. Some run faster on AMD CPUs, some perform better on P4s. Why are we using e.g. BAPCo and still a huge number of other benchmarks? Because there is no single CPU benchmark that gives you a reliable picture. How can 3DMark03 say it's a fair benchmark? Would you trust Futuremark if they presented you with the ultimately fair and unbiased CPU/memory/etc. benchmark?

Then I just don't know what I should think about Futuremark's business model. They receive hundreds of thousands of $$$ from the very companies that they are meant to keep honest. They don't disclose the amount of money they receive from ATi, Intel, formerly NVIDIA, etc. What would you think if you heard that one company pays Futuremark more than the other? Would that make you feel good?

There's a good reason why synthetic benchmarks have close to no relevance in the CPU field. I think we are facing the same in the 3D area. It's a natural development.

Regards,

Thomas
June 27, 2003 1:30:51 AM

Well said!