Biased 3DMark03 Article (was NVidia is spoonfeed~

February 14, 2003 2:51:58 AM

Pff... I've been coming to Tom's Hardware for at least three years now, but if Tom allows such grotesquely biased articles as <A HREF="http://www4.tomshardware.com/column/20030213/index.html" target="_new">"3D Mark 2003 - The Gamers' Benchmark (?)"</A> to be published, I, like more and more of my peers, will dismiss the value of this website. You know, I don't know that 'Tom' himself is so biased per se... he is just pretty aloof and doesn't seem to closely monitor his page. He just lets little peons with their own agendas run their own little segments of the site.... and kicks them off if they go too far... Remember Van Smith?

So, I'm hoping he will redeem himself and give Lars a kick out the door! Look at this article from last year:

<A HREF="http://www4.tomshardware.com/blurb/20020825/index.html" target="_new">Tom's Blurb: Battle of Hypocrites</A>

In which he shares his disdain for Van Smith's and Kyle Bennett's regurgitating of stories spoon-fed to them by companies that had something to gain from the airing of the laundry. This line is particularly ironic:

Quote:
I thought "Aha! ATi is cheating. Let's have a look!" I checked it out and indeed, there was something cheesy going on. However, I did not consider the issue serious enough. In fact, I was a bit irritated by the fact that the information about it had come from NVIDIA. If I picked up on this issue, I had to tell the world that it came from NVIDIA and I would do nothing but NVIDIA's marketing job.

Today Tom's site has done just that: NVIDIA's marketing job for them. Tom should be thoroughly disgusted with this. I do not believe one line of the article above was researched by Lars; rather, it was just regurgitated from NVIDIA's spoon-fed PR spin doctors.

Lars' bias is obvious, and one needn't look only to his articles: the fact that he runs <A HREF="http://www.rivastation.com/go.htm?http://www.rivastatio..." target="_new">Riva Station</A>, an NVIDIA fansite, is proof of this. That said, a fan of one product has the ability to put aside his biases when it comes down to the wire; unfortunately, Lars has not done that consistently.

In <A HREF="http://www4.tomshardware.com/graphic/20030127/geforce_f..." target="_new">NVIDIA GeForceFX: Brute Force Attack Against the King</A>, Lars was the only tier-1 reviewer (if you like to consider yourself a tier-1 site) that awarded NVIDIA the crown.

It might be premature to eject Lars from Tom's Hardware, but I hope he is thoroughly dressed down for his latest, horribly biased article!

I am tired and going to bed now, but should you require an item-by-item scrutiny of Lars' regurgitated NVIDIA PR, I'll provide one tomorrow. In the meantime, goodnight. :lol:  If anyone is interested, a good measure of my faith in Tom's Hardware would be restored if I came back tomorrow to find Lars' most recent article retracted.

<i>Edited title, I would have done it earlier, but was unaware I could do so. My apologies for the derogatory comment. It was written in frustration/anger</i>
February 14, 2003 3:11:49 AM

I was just going to post on this, but you did a much better job. I've been coming to this site for a number of years myself as well, and the quality of the reviews is just not what it used to be.
I completely agree that Lars Weinand has been doing some pretty bad reviews lately; the last few seemed more than a little biased. The 3dmark review basically said what nvidia wanted (made?) someone (Lars?) to say. I'm not sure what's going on here, but after crowning the now canceled FX, you do have to wonder. Actually, all the reviews have taken a dump in some way or another. Take a look at the recent Barton review; it's full of inaccurate information.

Sex is like a card game: if you don't have a good partner, you'd better have a good hand.
February 14, 2003 3:35:43 AM

has ATi put out a comment on the program? i don't know, but if they haven't, then the only quotes he could get WOULD be from nVidia.
as to whether he's biased or not, i really don't know. could be. i thought the review was informative anyway, a decent analysis of what nVidia had to say and why.

--------------

<A HREF="http://forumz.tomshardware.com/modules.php?name=Forums&..." target="_new">mubla otohp eht ni ecaf ruoy teg</A>
February 14, 2003 3:59:49 AM

What a fine load of bollocks that article was.

This to me sounds like nVidia in damage control, trying to discredit a benchmark that doesn't do their cards any justice. They pull out of the Futuremark program a couple of months before the release of the benchmark, after participating for 16 months... something smells bad.

And the last paragraph... game developers are expected to use separate code paths for different cards? What a load of baloney! The whole point of DirectX and so forth was to unify the code base, so that only one code path was needed and all hardware would be compatible. Look what happened to Glide, and how many GeForce-only extensions are used in games? The S.T.A.L.K.E.R. developers are going to be in for a rude shock when only 100,000 FX cards are being made...

I feel nVidia is pushing their luck here... if they keep this up, they are going to end up alienating themselves from the gamers, the developers, and the graphics market.

-

I plugged my ram into my motherboard, but unplugged it when I smelled cooked mutton.
February 14, 2003 4:48:58 AM

That was the most myopic piece of writing I've ever seen. This review would be unacceptable at a high school level. Lars used only ONE source for his primary source material! Basic English courses teach a writer to use at LEAST 3 sources for a semi-objective review of written material, not ONE!

It's simply worthless as a piece of journalism.

rogo
February 14, 2003 5:26:45 AM

Maybe ATI doesn't have a REASON to comment, as the benchmarks seem fine to them? Perhaps these features that nVidia whines about are supported by ATI cards?

<font color=blue>There are no stupid questions, only stupid people doling out faulty information based upon rumors, myths, and poor logic!</font color=blue>
February 14, 2003 11:19:36 AM

Well said. What a load of crap! Did he even read the white paper from futuremark explaining how and what the benchmarks test??


The only reason nvidia is complaining is because their entire line of cards is reduced to crap in this benchmark.

This doesn't correlate to reality? Not now it doesn't, but wait until games with dx9 features come out and see how a DX8/7 card plays them..

and what is this thing about nvidia not participating with futuremark to create the benchmark?? they left the beta program 2 months before the launch!!

and PS1.4?! it is a big evolution from PS1.1 and 1.3, but just because their cards don't support it, it shouldn't be used? What about all those 8500/9000/9100 users (not to mention that newer DX9 cards also support PS1.4)? it isn't a brand-specific "thing", it's DX8.1!

they should concentrate on making better hardware and let it speak for itself!

and as for the rep of this site (which i've been reading for years now)... get on track again, because it's going down with every review you make.
February 14, 2003 11:56:26 AM

nvidia said this because their old hw sucks, because it doesn't have ps1.4. doom3, while not dx, _WILL_ use ps1.4-level features on cards that have them, and it _WILL_ run much faster, and look much more precious and sweet, thanks to ps1.4 features on those cards.

and yes, the bench doesn't have that much to do with games, there i agree. it's rather brute force being used here. but the poor performance is not a fault of 3dmark, it was planned. it's the fault of a stupid gpu.

this test shows one thing: how far behind nvidia really was all the time. they successfully hid over the last year that they did not evolve away from the gf3, that their cards don't have any future-proof features, simply nothing. ps1.4 is the first pixel shader that deserves to be called a "pixel shader", or "programmable". ps1.1 - ps1.3 (btw, nearly the same.. nvidia's claim that 3dmark should have used ps1.3 is stupid, there would not be much gain from ps1.3 features anyways..), ps1.1 - ps1.3 are the same as the multitexturing unit of dx7 cards in most cases. more powerful, but not much more. ps1.4 is the first pixel shader which allows you to calculate per pixel what position on a texture you want to read, and where (and this for up to 6 textures..). that's called a dependent texture read, and it is essential for providing advanced effects in fast, quick'n'dirty pixel shaders. nvidia can't do that on gf3/gf4. they never could.
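
if you wonder what a dependent texture read actually means, here's a rough sketch of the idea (python pseudo-logic of my own, not real shader code):

# conceptual model: the value fetched from one texture becomes
# the coordinate for the next fetch.
def sample(texture, u, v):
    # nearest lookup into a 2d grid of values (or coordinate pairs)
    h, w = len(texture), len(texture[0])
    return texture[int(v * (h - 1))][int(u * (w - 1))]

def dependent_read(offset_tex, color_tex, u, v):
    du, dv = sample(offset_tex, u, v)   # first read yields a new coordinate..
    return sample(color_tex, du, dv)    # ..the second read *depends* on the first

on ps1.1 - ps1.3 you only get a few fixed combinations; ps1.4 lets you compute the lookup position freely, per pixel, for up to 6 textures.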

3dmark 03 is not really a gaming benchmark. it's a hardware benchmark. and it shows which hw fits well with the standard features which will get used a lot in future games. nvidia's cards drifted off from this for quite some while, i always told them. nobody cared. this is the first time we can officially see it. no wonder nvidia cries..

and tom, i'm disappointed you're supporting nvidia's crying..

"take a look around" - limp bizkit

www.google.com
February 14, 2003 2:57:13 PM

ok i'm convinced, except for one thing
is the redundant code (redrawing a given object 36 times instead of 1 or whatever the issue was) truly a shortcoming of 3DMark03, or just a shortcoming of nVidia hardware?

--------------

<A HREF="http://forumz.tomshardware.com/modules.php?name=Forums&..." target="_new">mubla otohp eht ni ecaf ruoy teg</A>
February 14, 2003 3:37:17 PM

They didn't get rid of the article, but it looks like it was swept under the rug. It can't be found on the front page anymore, but it's still hidden under graphics cards, which got put wayyy down to the bottom of the page for some mysterious reason? =)

Sex is like a card game: if you don't have a good partner, you'd better have a good hand.
February 14, 2003 3:51:06 PM

Well, since they "crowned" the new FX the king of graphics cards, I've been very reluctant to come here. To me it seemed so odd that they named the FX the nr. 1. That was the bell that rang 'biased' for me.
February 14, 2003 3:56:41 PM

The FX sucks, and I just read this.

Quote:
yeah, GeForce FX cards will indeed ship from the 20th, but there's trouble at the fab mill, Jim.

According to her, there's an allocation of 2,000 cards for the US, 1400 for the European Union and a measly 750 for the whole of Asia.

And Terratec, Creative, PNY, BFG, Gainward, Leadtek, Asus, and MSI will all have to share this generous amount between themselves.

These cards will be 128MB cards, apparently, and you can kiss the 256MB cards farewell until the NV35 arrives.




<A HREF="http://www.theinquirer.net/?article=7796" target="_new">Take it for what it's worth</A>

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">If you were to have sex with your clone would that be considered incest or masturbation?</A></font color=red>
February 14, 2003 3:58:34 PM

Just when there couldn't be any more bias on this site, along comes a seemingly harmless Valentine's article.

One that proudly proclaims NVIDIA is selling more than ATI, so there is no reason to fear. Ummm... they chose to go back as far as July, just far enough back to make the total figures tilt in NVIDIA's favour. The R9700/9500/9000 obviously can't instantly turn around ATI's fortunes and give them better sales off the bat, especially with NVIDIA's solid distribution and OEM channels...

However, while they clearly state NVIDIA is selling more than ATI, look at the December sales...

NVIDIA 31,168
ATI 36,807

Hmm, seems to me that despite their disadvantages, ATI horsewhipped NVIDIA in sales in December. Isn't this more reflective of the current trends in the industry than numbers from July? Yet the way the piece reads, it's as if everything is A-OK in NVIDIA land... umm, this time last year, if you were to read that ATI had better September, October and December sales than NVIDIA, you'd be springing a leak in your pants about now.

But I guess getting your @ss handed to you, in the most important month for sales, by a company that never even came close the past few years, is not a sign of NVIDIA's star falling... ok Tom... ok... everything is just fine and dandy in NVIDIA land, uh huh... the GeForceFX is king, 3DMark 2003 has no merit, and despite lower sales numbers than ATI, NVIDIA still somehow sells more. I mean, they can always dig up numbers from 2002 once ATI's aggregate sales total is greater, thus giving NVIDIA more sales... brilliant, guys.

<i>Edited by wonderboy905 on 02/14/03 01:02 PM.</i>
February 14, 2003 6:47:31 PM

Well, I don't know; I personally found the article to simply explain some things which do make sense, if you remove all the marketing feel.
I think we need a game programmer like davepermen to tell us whether the tests nVidia singles out, such as Battle of Proxycon, are indeed very unoptimized.

And why would you suspect so much bias, if many other sources like HardOCP are also posting 3DMark03 editorials?
Come on people, at least look deeper before casting blame, geez.

EDIT: Ok, I replied to this thread while half of it wasn't yet posted on this page, so I didn't know dave and many others had posted. While it does seem like nVidia's cry for marketing here, I am still reluctant to judge the article as only bias and no info. Dave, tell us, are some of the engines used really unoptimized?
I do agree with Lars though, and I don't see why nVidia is wrong here, despite also throwing marketing intentions into the mix: 3DMark HAS become controversial; it HAS to go fetch real-world engines and stress them, not its own. Where is the lie, people?

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: <i>Edited by Eden on 02/14/03 03:58 PM.</i>
February 14, 2003 6:57:57 PM

Not to be too critical, but I don't care who sells more of what. I use both for different applications, and as long as they keep forcing each other to push down prices, I come out the winner. The numbers he stated can go to hell though; it would have been way more effective to use dollars instead of units sold.

As far as the V-DAY rant: I am tired of people blaming M$ for everything. It's just gotten old. Please get a new villain and let M$ go on hiatus for a year or two. I pretty much agreed with everything else.

HULK SMASH!!!
February 14, 2003 7:12:31 PM

Oddly enough....

I was waiting for the release of the FX so that it would push down prices of the Radeons. Instead, since the release I've seen prices rise. Nvidia's failed card has surely given ATI a lot of confidence in their product and it's costing consumers.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">If you were to have sex with your clone would that be considered incest or masturbation?</A></font color=red>
February 14, 2003 7:54:06 PM

I think it is more so the delay and rumours of limited quantities of the card, coupled with the FX's inability to thoroughly pound the 9700 into the ground. The 2 cards are equal in performance, and thus Nvidia gets crucified. Why should ATI lower their prices when they have a comparable product that isn't loud as hell? I would pick on them and call them stupid if they did.

By the way, the reason ATI hasn't really tooted their own horn is because they know the price of the GeForce 4 Ti series is about to come down, making it all the more attractive, and I believe they are readying the 9900 for combat with the FX so they can stay on top. ATI beat Nvidia to the punch, and NV knows this, so they are probably working extra hard on the NV31/34/35 and hoping to ride out the storm by pushing a lower-priced Ti lineup, probably all the way to the end of this year.

HULK SMASH!!!
February 14, 2003 8:03:58 PM

well there we go.. first: i downloaded the doom3 leak again, and it ran very well on my ati now. it didn't when i played it the first time. well, "ran well".. never forget i am on 256mb ram and the game consumes 1 gig of ram in this leak :D 

the 3dmark "doom3" demo ran very well as well..

the trick is simple: games will be optimized, at least most games (most will just buy the doom3 engine for their games, or the unreal engine, or something like that anyways). so they will run faster than 3dmark or the leaked doom3. BUT anyways, the 3dmark demos do a simple thing: brute force. this is not unoptimized, it's simply benchmark style. whenever you bench something, you overdo it. 36 passes at 5 lights is really a lot. then again, doom3 will do.. 7 passes for a full light on a gf3/gf4, 3 on the radeon8500, one on the 9500+.. if i remember right.. possibly not. but anyways.

what i want to say is yes, the 36 passes are stupid, but no, they are not useless.
first: it's only 36 passes on nvidia hw.
second: nvidia used 5 lights to get to 36 passes. in most games, for a single polygon, there will not be 5 lights illuminating it (see culling.. one of the most important parts of a modern engine..), so 5 lights is too much most of the time anyways. 20 passes for a normal pixel of a normal scene is more likely.
third: think about it again. it shows which cards can process the full objects 36 times, animating them every time, and the card can still draw tons of pixels (stencil shadows are hell for fillrate.. _REALLY_HELL_, a simple tiny triangle can fill a whole screen 2 times just for dropping a simple, triangular, few-pixels-big shadow on the other side of the screen..), and all that. such a card is _FAST_.

now, game devs will not animate a character 36 times, will they? possibly..
if they don't, they can do 36 times as much as seen in the demo. and that sounds like some impressive games, no?
if they do, they actually follow what nvidia told them to do: use vertex shaders.
why is that, and what's the fuss now?
well. you can't store the result of a vertex shader, it gets sent to the rasterizer to draw triangles directly. now nvidia, as well as all the others, motivates vendors to use vertex shaders to do skinning (animating a boned mesh). and nvidia, as well as all the others, motivates vendors to use pixel shading to do fancy stuff. the problem now simply is (and 3dmark proves it very well), nvidia's pre-gfFX pixel shaders are crap. and that results in those tons of passes, for the simplest per-pixel lighting. and it shows as well how much more advanced the radeon cards (8500+ i mean) always were in pixel shading.
so nvidia wants the coder to do pixel shading? then they need multipass (throwing the geometry several times at the full t&l unit, the rasterizer, and the pixel shaders) to do anything good looking. that's what 3dmark did. 3 passes for a light, and 2 for the shadow. (or so.. i don't remember numbers well :D )
then nvidia motivates them to do skinning in the vertex shader. that means for each pass, 5 per light according to my count, they have to do the whole transformation again, the whole skinning.
now they want to bench stencil shadows, as those are getting quite important. problem: you need the final mesh form (already animated) to create the shadow. that means shadow creation has to follow the skinning. that means it has to be done in a vertex shader as well. and that means the often mentioned 6x as much geometry per mesh (why that is, is another topic, and why it is not needed most of the time, and i don't believe 3dmark does it all the time, yet another topic.. :D ). that means you have to transform 6 times as much data, for 2 shadow passes, which results in.. 12 times as much data processed.
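
to put rough numbers on all that counting (my own guesses in python, the real 3dmark figures may differ):

# rough pass arithmetic, not official 3dmark numbers
lights = 5

# gf3/gf4-style: ~7 passes per full light (my doom3 recollection) + 1 base pass
print(1 + lights * 7)        # 36 -- the pass count nvidia complains about

# counting as above: 3 color passes + 2 shadow passes per light
print(1 + lights * (3 + 2))  # 26 -- my numbers are fuzzy, the principle stands

# shadow volumes need the *skinned* mesh, so skinning repeats per pass; with
# the often mentioned ~6x extruded geometry, 2 shadow passes push roughly
print(2 * 6)                 # 12x the vertex data of one plain pass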

3dmark actually followed the suggestions of nvidia (currently they claim they never said it. but guess from which dev page i've learned all of it? :D ), and those suggestions created that engine. and the suggestions of nvidia simply show their pixel shader is crap. the rest of the card isn't. it would perform well. but it has to draw 3 times as much. no chance for the card.
now nvidia started to cry. and that's understandable. at 3 times the work, you can't compare performance.
then i can simply reply: sorry dudes, but your card simply can't do tasks the easy way it should be possible. it's your fault you can't do it in one pass. not ours. and doom3 WILL be optimized for every card, it WILL use radeon features where there is speed gain, and it WILL use nvidia features where there is speed gain. that's what 3dmark does as well.

and again: 3dmark does not need to be optimized, it's a benchmark..

if you want to test how well a game really performs, you test it on slow computers, on bad hw, and check if you can still play it. then the game has proven to be a well-programmed game.
if you want to test how good a gpu, a cpu, a pc or whatever is, you send in barely optimized programs and check if they can still run them. then they've proven to be a good gpu/processor/pc.
and that's what 3dmark does. it uses the features available today and tomorrow (okay, dx9 is not much in it.. i could see much more of this stuff :D ), and it shows how well a card can handle them.

nvidia simply lost.. and they are not good losers.


after a full year (okay, more actually) of following nvidia's hyping and crying and how good they are everywhere, i am actually very happy to see 3dmark kicking their asses. finally something that shows how old their cards are. they are fast, but they are very old. not usable for future gaming.
end of story :D 

"take a look around" - limp bizkit

www.google.com
February 14, 2003 10:32:28 PM

the only thing i do when i read the articles is read the specs and compare the benchmark charts. that's all. the articles themselves have never really stood out as interesting... maybe the topic, but not the actual writing.

i'm not a big writing critic, but i also found that the article was kinda biased. i don't have anything else to contribute to this thread, other than to help validate it.
February 15, 2003 5:31:28 PM

FutureMark has released a comprehensive and well-written response to NVidia's allegations, which can be found here:

http://www.futuremark.com/companyinfo/Response_to_3DMar...

If Tom's Hardware wishes to retain any credibility it will either create a new article based on FutureMark's rebuttal, or thoroughly update Lars' original post with this updated information... <b>AND</b> make sure to draw attention to the fact that the article has been updated.
February 15, 2003 11:14:58 PM

After reading the article last night, I went to bed feeling insulted. I don't know. It wasn't something I wanted to read on the front page of Tom's; I don't think it belonged there.
Saving up to buy Crashman a Shuttle Mini PC....
February 16, 2003 6:11:23 AM

<b>Occam's razor: when multiple explanations are available, the simplest one is usually true.</b>

- Futuremark released 3DMark03.
- NVidia panics and tries to discredit the benchmark
- Lars concludes: "Is the main reason for NVIDIA's criticism at 3D Mark 2003 due to Futuremark's refusal to use optimized code?"

This has nothing to do with whether nvidia likes synthetic benchmarks or not. This has everything to do with the fact that nvidia doesn't support all DX8 features, and their DX9 part will only run as fast as ATI's if software is optimized for it. According to Lars, Nvidia has started developer campaigns where they explain that software _has_ to be optimized for the GFFX in order for it to run properly. According to John Carmack's .plan file, the GFFX is about half the speed of the R300 when run in an unoptimized environment! That's a drastic win for ATI. No wonder Nvidia's only chance is to a) get GFFX-optimized games out there, and fast, and b) claim that those GFFX-optimized games are the only real source of performance information.

<b>Conclusion: A clean and objective benchmark based on DirectX will hurt Nvidia in both high-end _and_ mainstream. Nvidia is doing everything it can to hide this.</b>

While we all have a lot of doubts... :)  let's hope that Lars is man enough to do the math and do something about this! While biased, his article was actually very enlightening. He's the only one who has openly said that this is about Futuremark refusing to do Nvidia optimizations.

Lars: If you're reading this you can also solve this the Sherlock Holmes way: Figure out what the motive for each company may be. Once you have the motive you'll have the answer. <i>Edited by martin_ on 02/16/03 03:25 AM.</i>
February 16, 2003 7:19:17 AM

It isn't about the content of the article. EVERY website has a certain connection with one or more of the market leaders. This is the rule; otherwise they couldn't learn anything before all of us, the simple users - consumers. It is about the way you use the information coming from a company with great interests. It is unacceptable to assess a piece of software using only the arguments handed to you. You test it on your own and arrive at your own conclusions.

Beyond the author, nVidia will pay for this false marketing move. nVidia has simply divided the gamers into two camps: the ones who support 3DMark03 or have good faith in what Futuremark is saying, and the others who are with nVidia. But I see from the messages that only a few support nVidia without any hesitation. nVidia has a loud, hot chipset which barely performs better than the Radeon 9700 Pro, on the run. That's the truth. nVidia has to face this and not play marketing tricks without looking far into the future. Do they want a chipset to damage their icon and prestige? If they want that, they can do it very easily by behaving like crying babies.

The serious problem for nVidia isn't the inefficiency of 3DMark03 but the inefficiency of a 500MHz core compared to a 325MHz core. This is the real problem. They thought that 1GHz memory could sell as a marketing trick. They are wrong. Cards like the GeForceFX and Radeon 9700 are addressed to gamers who know some of the underlying technology and can differentiate. It's not the OEM market! If they want to burn themselves, they can continue to behave like the blind in a market which they continued to build and develop after 3dfx. It's their choice.
February 16, 2003 9:49:39 AM

What's the probability that Lars will even bother doing some research into this? He runs an nvidia-biased site himself. Either Tom should lecture him or just kick him off.
February 16, 2003 11:19:45 AM

"I've been coming to Tom's Hardware for at least three years now"

And your posting status is Stranger. I guess you have been reading Tom's articles but have never bothered to post on this board in the past 3 years.

You haven't even reached 25 posts. In fact the date you registered was 2/13/03 :-P Just seems like a flamer looking for attention to me.

BTW I only read this guy's first post, none of the replies. I guess this thread is a bit worthless to me other than this statement I am making :-)

-----

Benchmarks don't lie :-)
February 16, 2003 4:11:28 PM

There are many people who mainly read at different sites and rarely or never post. Some sites I only read; at some I also post.
February 16, 2003 4:44:42 PM

As Unit01 mentioned (thanks, I take it you are an Eva fan), many people, like myself, visit many websites but only post in the forums of a few (Rage3D, NVNews, Beyond3D). I have visited Tom's for years and browsed the forums somewhat, but didn't feel the desire to post here until that article disgusted me to the point it did. If you go to SharkyExtreme's forums... you'll find I posted thousands of comments there YEARS ago, only to abandon my support of the site after Sharky left. I'm afraid I might have to abandon Tom's as well, but where then will I turn? The number of high-quality computer hardware sites is dwindling, so I'd rather see Tom's reformed than go looking for another 'hardware home'.

You say I just seem like a flamer looking for attention, yet whereas I wrote out a rational and well-defined comment that is critical of a comment made here, you just slap me with such a label without even bothering to read my thread. If you had, you'd find that many credible members of Tom's community have supported my opinion. I suggest you educate yourself before jumping to such conclusions in the future.

You can start by reading <A HREF="http://www.futuremark.com/companyinfo/Response_to_3DMar..." target="_new">FutureMark's response to NVidia's criticism</A>, along with their <A HREF="http://www.futuremark.com/companyinfo/3dmark03_whitepap..." target="_new">original whitepaper</A> and <A HREF="http://www.hardocp.com/article.html?art=NDMx" target="_new">ATI's stance on 3DMark03</A>. Oh, and you might take the time to read the entire thread.

If you have specific complaints with my argument address them. Don't make blanket conclusions based on insufficient information.
February 16, 2003 6:13:47 PM

Wow, I just read the ATi comment; this indeed shows, two against one, that nVidia was only crying. I am more convinced now than before, and considering FutureMark's reply with arguments to counter nVidia, and ATi's stance on it, I see why nVidia shouted.

We'll have to see ATi's more detailed view on this though.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 16, 2003 7:25:28 PM

First of all some direct comments... which I said I'd get around to in my initial post:

<b>Page 1:</b>

Quote:
These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn't support ps1.4. Conspicuously absent from these scenes, however, is any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4.

- NVidia's comment in the Tom's Hardware Editorial

Interestingly enough, both games NVidia mentioned above support PS1.4. Furthermore, many of the most well-respected posters at Beyond3D have stated that all DX9 games will fall back to PS1.4 in the absence of PS2.0. I'll let Futuremark speak for themselves on the absence of PS1.3 support.

Quote:
According to the DirectX 8 specification, there are 4 different pixel shader models. In order to do a fair benchmark, you want any hardware to do the minimum number of shader passes necessary to render the desired scene. We analyzed all 4 shader models and found that for our tests Pixel Shader 1.2 and Pixel Shader 1.3 did not provide any additional capabilities or performance over Pixel Shader 1.1. Therefore we provided two code paths in order to allow for the broadest compatibility.
A good 3D benchmark must display the exact same output on each piece of hardware with the most efficient methods supported. If a given hardware supports pixel shader 1.4, like all DirectX 9 level hardware does, then that hardware will perform better in these tests, since it needs less rendering passes. Additionally, 1.4 shaders allow each texture to be read twice (total 4 texture lookups in 1.1, but 12 (=6*2) in 1.4 shaders). This is why, not only Futuremark, but also game developers can only implement single pass per light rendering using a 1.4 pixel shader, and not using a 1.3 or lower pixel shader. A 2.0 pixel shader would not have brought any advantages to these tests either. Note that the DirectX design requires that each new shader model is a superset of the prior shader models. Therefore all DirectX 9 hardware not only supports pixel shader 2.0, but also Pixel Shader 1.4, 1.3, 1.2, and 1.1.

-FutureMark's response to NVidia's allegations
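
To spell out the arithmetic behind that (the per-pass lookup budgets are FutureMark's figures above; the 12-lookup workload is my own hypothetical example, sketched in Python):

import math

ps11_lookups_per_pass = 4       # PS1.1: 4 texture lookups per pass (FutureMark's figure)
ps14_lookups_per_pass = 6 * 2   # PS1.4: 6 textures, each readable twice = 12 lookups

lookups_for_one_light = 12      # hypothetical per-pixel light needing 12 lookups
print(math.ceil(lookups_for_one_light / ps11_lookups_per_pass))  # 3 passes under PS1.1
print(math.ceil(lookups_for_one_light / ps14_lookups_per_pass))  # 1 pass under PS1.4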

Next:
Quote:
These shaders don't only ensure bad results of PS 1.1 cards compared to those that support PS 1.4 (they need more passes for the effect) they are also hardly used in actual 3D games. <b>Xbox can't run PS 1.4 code as well.</b> Even more serious is that 3DMark03 test runs use different shader codes for different cards. This makes comparisons between different 3D-chips close to impossible. In 3D Mark 2001 SE, this was only the case with a special PS1.4 test. Now, however, all tests, including GT4 (Mother Nature), are actually no longer comparable to one another.

- Lars Weinand

First off, I wanted to vomit at the inclusion of a reference to the Xbox. What, pray tell, does that have to do with 3DMark at all? Last I checked, the 3DMark tests were meant to compare graphics cards, not consoles.

Now, taking that comment in its entirety, I first have to ask why Futuremark should limit itself to DirectX 8.0 when the newer standard, 8.1, is available. The fact that NVidia chooses not to support that standard is not Futuremark's fault. It isn't a card-specific test, it is a Microsoft standard! Furthermore, why would test four not be comparable amongst different cards? The fact that it integrates DirectX 8.1 shaders is irrelevant, since only DirectX 9 cards can run the test, and DirectX 8.1 is a subset of DirectX 9.

<b>Page 2:</b>

Quote:
The portion of this algorithm labeled "Skin Object in Vertex Shader" is doing the exact same skinning calculation over and over for each object. In a scene with five lights, for example, each object gets re-skinned 11 times.

- NVidia's comment in the Tom's Hardware Editorial

Quote:
Since each light is performance-wise expensive, game developers have level designs optimized so that as few lights as possible are used concurrently on one character. Following this practice, 3DMark03 sometimes uses as many as two lights that reach a character concurrently, not five as mentioned in some instances.

-FutureMark's response

This proves that:

a) NVidia was lying, or at least being very deceptive in the wording of their comment (they never actually claim that there are 5-light situations in FutureMark's test)...

and more importantly in the context of this thread...

b) Lars Weinand blindly regurgitated what NVidia told him, without any independent testing.

<b>Page 3:</b>

Quote:
This year's 3DMark has a new nature scene. It is intended to represent the new DirectX 9.0 (DX9) applications targeted for release this year. The key issue with this game scene is that it is barely DX9. Seven of the nine pixel shaders in this test are still ps1.4 from DX8. The same issues about ps1.4 shaders described above apply here. Only two of the pixel shaders are the new ps2.0. Consumers believing that this test gives a strong look at the future of games will find it merely provides a brief glimpse, if at all.

-NVidia's comment in the Tom's Hardware Editorial

Quote:
The argument here is that game test 4 is not “DirectX 9 enough”. Once again, a good application should draw a scene as efficiently as possible. In the case of game test 4 this means that some objects use Pixel Shaders 2.0, and some use 1.4 or 1.1 if a more complex shader is not required. Because each shader model is a superset of the prior shader models, this will be very efficient on all DirectX 9 hardware. In addition, the entire benchmark has been developed to be a full DirectX 9 benchmark:

-FutureMark's response

This is such an obvious point, and Weinand's failure to address it throws his credibility into question. Why use very taxing shaders when less complex ones can quite effectively render the same effect? No game developer would go with 2.0 shaders when 1.1 or 1.3 are adequate.

Quote:
It surprises to remember that while the argument of being non-realistic was also valid for earlier versions of 3D Mark, although not to this extent, NVIDIA's marketing machine used to promote this benchmark quite strongly. There's the nagging suspicion that NVIDIA was afraid that GeForce FX wouldn't be able to stand up against the Radeon 9700 PRO in 3DMark03.


Indeed they did promote the benchmark strongly; in fact, you may remember their performance analyzer that told you which NVidia card to upgrade to based on 3DMark scores. You also may recall that when 3DMark2K1 was released, only the GF3 supported the Nature scene. So when 3DMark placed NVidia in a favorable light, they were more than willing to support it, but once that changed, they did everything in their power to discredit the benchmark.

Quote:
An initial test with the GeForce FX "Launch Driver" (v42.63) confirms this suspicion immediately. Yesterday, however, NVIDIA made its new driver (v42.68) available, and here, the results look completely different:



I'll quote FirebirdTN from the Rage3D forums (with his permission) who noticed something suspicious about this, rather than commenting on it myself.

Quote:
Okay, now for the part of Toms that hasn't been mentioned yet that REALLY burned me...

"...it's no problem for a graphics driver to recognize a special and well-known shader code, and then to replace it with a special, optimized code while the game or benchmark is executed."

Hmmm.....so he claims its easy to for hardware manufacturers to "cheat" with driver optimizations? Is that what that means?

How about this:

"There's the nagging suspicion that NVIDIA was afraid that GeForce FX wouldn't be able to stand up against the Radeon 9700 PRO in 3DMark03. An initial test with the GeForce FX "Launch Driver" (v42.63) confirms this suspicion immediately. Yesterday, however, NVIDIA made its new driver (v42.68) available, and here, the results look completely different: ..."

"...With the new driver the shader performance has nearly doubled in some instances! ..."

Now, have I got this totally wrong, or was ATI accused of inflating QIII's performance with a set of drivers, what about 6 months ago?

So, its okay for the manufactuer to "optimize" a driver set for a common benchmark, but when a competitor does it for a REAL WORLD game as they say, then its cheating???

Beyond what FirebirdTN noticed, there have been many reports circulating the internet that these new drivers, with their exceptional speed boost, not only sacrifice image quality but in fact fail to render explosions, smoke, and machine gun fire in test 1. I am not sure if there have been such graphics anomalies in the other tests.
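
For those wondering what "recognizing a special and well-known shader code" could look like in practice, here is a purely hypothetical sketch in Python (my own illustration of the mechanism, not NVidia's actual driver code):

import hashlib

# Hypothetical table: fingerprint of a well-known benchmark shader mapped to a
# hand-tuned replacement. The entries here are invented for illustration.
REPLACEMENTS = {
    "a3f1c9...": "hand_tuned_variant",
}

def fingerprint(shader_source: str) -> str:
    return hashlib.md5(shader_source.encode()).hexdigest()

def select_shader(shader_source: str) -> str:
    # If the driver recognizes the shader, it silently substitutes its own code;
    # the benchmark score then reflects the replacement, not the code the
    # benchmark actually submitted.
    return REPLACEMENTS.get(fingerprint(shader_source), shader_source)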

Quote:
The Image Quality tests are only restricted to screen shots, instead of showing special scenes that could demonstrate advantages and disadvantages of a card's specific FSAA implementation.

-Lars Weinand

Okay, explain to me how the ability to do both image and filtering quality tests on ANY frame rendered in ANY test is inferior to having some contrived scene designed just for AA and AF quality? That is ridiculous. While the 3DMark03 test IS synthetic... the amazingly refined image comparison tests are REAL WORLD. The way AA and AF look in 3DMark03 is the way they will look in real games. Furthermore, the fact that you can capture EXACT frames in each test in this newest version of 3DMark means an exact frame-to-frame comparison can be made between video cards, rather than the old method, in which testers just did their best to get as close as possible.

Quote:
At the NVIDIA Developer Forum, it became clear the Pixel Shader (2.0) code doesn't run equally well on all hardware. Ultimately, it all boils down to the fact that developers need to use special code for different chips in order to ensure good performance on all possible platforms.
This explains the attention that NVIDIA devoted to developers. Is the main reason for NVIDIA's criticism at 3D Mark 2003 due to Futuremark's refusal to use optimized code? In this case, NVIDIA is right in saying that 3D Mark is unrealistic - for one thing, because it was developed without NVIDIA's input, and for another thing, because it's difficult to imagine that actual "neutral" code would be useful for performance evaluation.

So in other words, you are claiming that since an objective benchmark is objective and doesn't optimize for specific hardware, it is worthless? A benchmark, in my mind, is supposed to gauge the efficiency of hardware, NOT the developers' attempts to make code run as well as possible on all hardware. Also, it is disgusting that Weinand states that NVidia WAS NOT involved in the development of 3DMark03 when in fact they were part of the development team for 16 of its 18-month development cycle, but dropped out in December when they realized their cards weren't going to be shown in a favorable light.



More information on the subject can be found here:
<A HREF="http://hwextreme.com/reviews/misc/3dresponse/" target="_new">http://hwextreme.com/reviews/misc/3dresponse/</A>
<A HREF="http://www.rage3d.com/board/showthread.php?s=26cf8abbab..." target="_new">http://www.rage3d.com/board/showthread.php?s=26cf8abbab...</A>
and
<A HREF="http://www.beyond3d.com/forum/viewtopic.php?t=4302&post..." target="_new">http://www.beyond3d.com/forum/viewtopic.php?t=4302&post...</A>

*Deep breath* Okay, I spent over two hours compiling all that. I'd wager that is about one hour and forty-five minutes longer than Mr. Weinand spent on his initial article. I hope this will silence all of my critics.
February 16, 2003 7:45:22 PM

How can you be more convinced of NVidia's innocence after reading FutureMark's response? Unlike NVidia, Futuremark backed up their rebuttals solidly. I was particularly upset that NVidia implied that 5 light sources were used when only 2 were used in the engine. ATI didn't really say much, so I don't have any comment on their support of FutureMark. Oh, btw, Dell has also endorsed 3DMark03.
February 16, 2003 10:47:07 PM

Gosh, this should be fun.

While I marvel at the vitriol directed against Lars, here's my take on it.

It is a column that reflects a fact of life in the graphics business:

-drivers win benchmarks, and drivers impact real-world performance.
-because applications, in this instance games, are apt to target specific hardware, particularly if they know the installed base is sufficient to justify the work they put into optimization, or the optimization is a minor act of coding.

What Lars points out isn't that astonishing, but it is not in any way pro-Nvidia. It did include his feedback from an Nvidia game developer event, sure, but Lars is not Nvidia-biased. He is about the most stand-up guy you are going to meet, so knock it off with the stupid, childish name calling, which doesn't really do much for your arguments anyway.

It was a column that pointed out an issue that is going to get more cloudy as programmable graphics becomes more acceptable to the game developer community.

Remember, programmable graphics processing units are a very new thing in game graphics terms, and are not fully used by developers. Nvidia has added more support for coding than ATI, and that's a fact, so why shouldn't they be concerned that the work they have done to get developers to support their features will not show up in Futuremark?

All Lars is saying is: game developers are going to program their software differently for Nvidia boards than for ATI boards, and the new Futuremark benchmark may not take that into consideration. So, is the new benchmark an objective assessment of real-world gaming performance?

The real-world situation is that game developers are going to optimize code for Nvidia's GPUs as well as ATI's. In Nvidia's case, this may not be reflected in Futuremark, and yes, probably because Nvidia has done things in hardware that are not exactly "standard", but the issue is real game performance.

Frankly, you will find benchmarks coming under extreme pressure right now, because vendors have so little differentiation.

Yes, it doesn't make a huge amount of difference whether you have an Intel, AMD, Nvidia, or ATI product at the high-end of their respective categories. You are in the same ballpark, everything else being equal.

Lars' article is not that big a bombshell if you understand how optimizations for graphics cards are going to work in actual games, and it is not Nvidia-biased. It asks the question: does Futuremark give a fair assessment of graphics performance when one of the two big graphics chip vendors sees such discrepancies?

We had a similar situation with AMD and Bapco, at one point.

As for Futuremark's replies, I know that Lars, Tom, and the Futuremark guys are in contact (I got copied on the emails).

Now, take a chill pill, or whatever your sedative of choice is, and lay off Lars. The guy is one of the best reviewers I have seen in the last 10 years. You don't have to agree with him, but he is neither biased nor does he have a hidden agenda.

If it was anyone else, I'd let them answer you on their own, but not Lars.

Finally, on the subject of Van Smith, I will say one thing, because I am sick and tired of having people write about something they know nothing about:

-Van Smith went out of his way to personally attack Tom with lies, fabrications and distortions of the truth.
-The proof is that I stuck our lawyers on him and he caved knowing he didn't have a leg to stand on.
-He is not someone that I know personally, nor do I care to know him. What I do know is that if he ever refers to this site in the way that he did, and if he ever writes crap about THG, his time here, and the people here, I will go after him with all the legal firepower I can muster. While I don't expect everyone to love Tom, I do know that he is not in any shape or form a racist, and Smith took his cheap shots and wrote his crap.

Omid Rahmat
Tom's Guides Publishing
February 17, 2003 12:51:24 AM

i can't believe i just went from really convinced of one thing to really convinced of another in like...15 minutes
it shows how little i know about the industry =/
well written, omid
i think one standard benchmark is no longer possible for the gaming industry, but instead in-game benches are needed
fortunately, of course, they exist
i don't plan to use 3DMark03 as a standard because of all of this "optimization" talk
i just want to see what's gonna play my games well and buy it

--------------

<A HREF="http://forumz.tomshardware.com/modules.php?name=Forums&..." target="_new">mubla otohp eht ni ecaf ruoy teg</A>
February 17, 2003 12:58:26 AM

People, it is the 300th Episode of The Simpsons. There are more important things to think about than the performance of an ATI/Nvidia chip in Aquanox or Serious Sam 2 on this momentous Sunday night.
February 17, 2003 2:18:20 AM

No, I said I am convinced about nVidia's cry for attention just because their card did not run well, that's the complete opposite of what you thought I said!

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 17, 2003 2:28:28 AM

Now, I've always enjoyed the graphics guides Lars has written, mind you, so I am not against him on this one.
What I am against is nVidia's reply, which seemingly does not correlate with ATi's enthusiasm about 3DMark03. nVidia's FX, as Davepermen, our local game graphics programmer, has stated, is simply not THAT advanced in the DirectX 9 technologies IT ADDS, nor will CineFX be worth anything any time soon.
Point being that optimizing for the FX simply lets its brute force flex itself, while ruining other cards' "efficient" way of performing. Just like optimizing only for the Pentium 4's deep, slim pipeline and letting go of the Athlon. Both should be equally fed, not one. nVidia's card is overly inefficient at 500MHz, and no, we should not waste our time optimizing for it. The graphics card industry is not the same as the CPU industry, nor SHOULD it be. GPUs are still at 500MHz and, in clock speed, are evolving very slowly compared to CPUs. It is not time yet to go for insane clock speeds; it is time to keep improving IPC.

What can you say about the driver cheating nVidia did with the 42.68? Is it enough to warrant the sudden jump in 3DMark03 score seen in THG's article? I know smoke, explosion and fire effects mostly eat up the frame rate.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
February 17, 2003 12:50:46 PM

Omid,

I believe Lars was premature with his article at best, and it does look like he lacked feedback from FutureMark, and maybe even from ATI and others, prior to publishing. Fact is, 3DMark03 uses DX9 programming; fact is, Nvidia cards available now on the shelf lack DX9 capability, except for the mysterious GF FX which Nvidia promised in August of last year. He may be a stand-up guy, but stand-up guys do make mistakes and admit to them when they do. The article is full of statements which I believe stem from faulty assumptions or premises; it should be redone or at least updated. FutureMark's new 3DMark has always been forward-looking, just as the previous benchmarks were; no difference here. They have always been based on the newest DirectX capability, just like the previous ones. It does not make the previous 3DMark obsolete; 3DMark2001 should be used if you want to test DX8 hardware, plain and simple. The image quality comparison ability of 3DMark03 is second to none. I find the suite of tests to be insightful and well thought out, which I believe stemmed from a lot of involvement from a number of beta testers, including Nvidia. The article should be pulled as far as I am concerned, because it is not up to par with Tom's Hardware standards.

I too find myself coming here less and less. The reason being, Thomas Pabst is rarely seen or heard from here. He, or should I say his character, is what made this site; his absence or lack of appearance is starting to make this site like all the rest on the Internet: a dime a dozen.

Now when are you going to write another review???? <b>Oh I did like your Valentine's Day Special: Love Hurts</b> :smile:
Quote:
<i>Valentine's Day. When couples try and pretend that an arbitrary point in time will somehow hold emotional significance, and tide over the cracks in failing relationships. The time of year when baubles, trinkets, and cellulite-friendly sugar products turn men into knights on white steeds.</i>

I've been married for 20 years now, and I was wondering why a box of chocolates was sitting at my desk on Valentine's Day :lol:  . In short, I didn't even know it was Valentine's Day. My wife laughed when I told her I had nothing for her except myself, and she accepted that, as we strive to reach another 20 years. Now that is true candy for you. Geesh, that sounded terrible :redface: , I should be banned for saying such things here.

<i>Edited by noko on 02/17/03 10:30 AM.</i>
February 17, 2003 1:43:44 PM

Well, thanks for coming to hang out with us Omid, but still, Lars has to go.....

Saving up to buy Crashman a Shuttle Mini PC....
February 17, 2003 1:51:59 PM

Actually, Tom edited Lars' article himself, and Tom does write for the site. I was surprised that people failed to notice that he did two excellent, first-out-the-gate articles on Centrino these past few weeks.

I still don't believe that Lars' article says anything out of the ordinary. It merely asks whether 3DMark03 is as relevant to real world gaming as real world games.

Frankly, I doubt we will stop using it, just as we use a number of other tests. We always try to get as many tests as possible, because no one benchmark is definitive.

As for THG being a dime a dozen: well, I'll have to disagree with you there :) 
February 17, 2003 1:56:02 PM

Now that's just plain mean spirited. Let's all have a group hug.
February 17, 2003 2:23:10 PM

Omid, I wrote my initial comment while angered at the content of the article in question. If I were able to (maybe I am and just don't know how), I'd change the title; I regret having written it. But the fact of the matter is that the article in question is biased, even if Lars isn't. Ask any English comp 101 instructor whether an article written from only one source, without substantially questioning that source, is valid, and they will say no. At the very least, Weinand could have looked at FutureMark's whitepapers before writing that article. Furthermore, he could have contacted FutureMark for a response before flying off the handle and posting NVidia's criticism unopposed. It is just a truly horrible article, no matter who wrote or edited it.
February 17, 2003 2:57:19 PM

Yeah, the title is a little bit blunt, and I do think you can change it if you edit your first post. I would, and I would apologize if it were mine; just put a correction in the first post in italics or something. Still a very good discussion.
February 17, 2003 3:25:46 PM

Quote:
<i>Actually, Tom edited Lars' article himself, and Tom does write for the site. I was surprised that people failed to notice that he did two excellent, and first out the gate, articles on Centrino these past few weeks.

</i>

Oops, I guess Tom is stirring up some commotion, which I always found to be good in the past. I guess I've been away too long and missed those. I will read them, because I have always enjoyed his articles; I haven't always agreed with him, but I've always respected his viewpoints and the reasoning behind them.

I hope Tom's Hardware continues to use a broad approach in testing hardware, which is something I've always liked. In addition, the 'how to make it faster, better and cooler' articles are usually my favorite recipes on the net; you guys have provided some fine cooking.

Still, 3DMark03 does provide something useful for the reviewer (not the dime-a-dozen type of review), and that is the image quality tests. The average reviewer (cough cough) usually lacks comparison shots and explanations (my opinion) of what those performance numbers are pushing out at you. You can show apples-to-apples comparisons of all the different filtering schemes available on each card, with some of the most advanced DX coding ever done :smile: . I hope that aspect is used in the future with this new tool. Still, 3DMark03 is useful and needed, but a balanced approach including games would be the best way, as far as I am concerned.

Thanks saintbabs for joining us here, it is very well received, have a good day.
February 17, 2003 4:11:42 PM

No problem. You should see the hate mail we get during the course of the day.

I get the "sand [-peep-]" ones, and Lars and Tom get the "nazi" ones.

So, you have to remember that when people start to call us names, for whatever reason, it is not really in our best interest to get involved or reply or care.

Anyhow, I am sure that Lars will be on doing more on the subject, and he has very good relationships with all the parties involved, including Futuremark, ATI, and Nvidia.

Okay, I have to let go of this thread, and go do some work.

Thanks, guys.
Anonymous
February 17, 2003 4:49:58 PM

Quote:
Actually, Tom edited Lars' article himself...

Uh, well that makes the thing much worse - I hoped Tom would stand up here.

The most disturbing thing to me is that Lars/Tom actually wrote:
Quote:
it's difficult to imagine that actual "neutral" code would be useful for performance evaluation

That plainly says that Tom's Hardware isn't doing, and/or isn't going to do, any fair tests. How can there be fair tests if the code is optimized for one company? Sure, that particular game will run faster on that company's cards, but what about future games that might not have those optimizations? Or future cards from other companies?

Yes, Tom and Lars seem to be saying that most games are optimized for NVidia (and some even for ATI), so the cards should be tested with code that is optimized for them. But just consider companies like Matrox. How many games are optimized for Parhelia? Probably none. So testing their cards with code that is optimized for NVidia means Matrox will always lose, even with an equal card. It's just too bad that Tom's Hardware openly admits that they are not going to give any new manufacturer a chance of getting a good card out, as their tests will always be optimized for the old king. Fairness - that's what I thought Tom was about, but sadly it doesn't seem that way anymore.

Besides, Futuremark has had their well-written response out now for some time. Tom's Hardware hasn't even mentioned it. I believe that it, together with what was written here about Tom editing that story, just proves that the story wasn't written in a hurry - it was deliberately written as such. That's just how Tom's Hardware is nowadays. It used to be so much more reliable... It used to be my favorite site. I will still monitor this site for about a week to see how they write about Futuremark's reply.
February 17, 2003 6:42:57 PM

The article makes a point I find very valid: Why do we support benchmarking tools that wield so much influence that engineers spend the bulk of their time optimizing drivers to extract the best possible score from those benchmarks, rather than working on drivers for games? nVidia released new optimized drivers for 3dMark03, and suddenly the FX kicks butt in the benchmark, yet, by their own admission, this does not provide us a fair indication of how well actual games will perform with a specific graphics card. ATI is guilty of the same thing. Is that really the best way for graphics engineers to spend their time?

It really seems like some people are looking at this as an nVidia vs. ATI argument and it really shouldn't be. You should be interested in making sure you are getting good information from any benchmark that you will use to make your next purchasing decision.

Seriously, what do I care how well the 9700 Pro or GeForce FX will do in DirectX 9.0 applications right now? We won't see too many DirectX 9.0 applications for quite a while, and by that time nVidia will probably have the FX heating problem fixed and be running on a 256-bit memory bus (read: FX Ultra), while ATI will have gone to a .13 micron process, and BOTH of those cards will destroy what we have today. If you want a card that kicks butt in DirectX 9.x, you might as well buy neither of the two cards and just wait, because we're barely seeing games for DX 8.1.

Sorry, I just find it ridiculous to see all the carping over two graphics cards that are essentially equal with a variance in performance of 3-5% in most benchmarks. Too many fanboys. I'm supposed to love nVidia because they have been on top for a long time? I'm supposed to love ATI because they are the recent champions and that makes them somehow more noble?

Today, I love the Radeon. Six months from now, I may be in love with the newest GeForce. Who knows, maybe SiS will shock us all and release a darkhorse card that destroys all! (Okay, that's not gonna happen.) I just don't see why I'm supposed to love or hate any company that really only wants my money.
February 17, 2003 8:05:08 PM

I still do not like the 3dmark article.

Sex is like a card game: if you don't have a good partner, you'd better have a good hand.
February 17, 2003 8:27:03 PM

The thing is that the article/column itself is almost identical to the crap nvidia threw out.

3dmark03 puts the FX in a bad light.
Instead of trying to write better drivers for real, they do several things. 1. They say that 3dmark03 is not a real benchmark because it isn't optimized to run perfectly. It's a benchmark that's supposed to last one year minimum, at least until DX10, and they expect it to already run optimally. It's a benchmark, not a game. It's supposed to create bottlenecks, to stress current cards and be able to stress future cards in the same way. 3dmark03 is also able to stress specific parts of the graphics card with its variety of tests. And the new screenshot feature enables frame-to-frame comparison of identical frames.

2. Nvidia said that it's easy to replace certain code, when detected, with different code. They not only mentioned this to show that it's easy to get around 3dmark03's way of running things; they want to stomp futuremark and 3dmark03 to dust by claiming it is unreliable. Some forget that nvidia put out a "new driver release" to the reviewers just so they could surpass the radeon9700pro in the final score. If you look through some other sites that have detailed the scores, you will notice that the only improvement is in the tests that add to the final score (not sure which one), and that the other results remain the same. Which clearly shows that nvidia cheated. But many forget this; they merely look at "ooh, nvidia got 300 more 3DMarks". This is one of the things lars doesn't mention either. Nvidia's cheat, though, is overlooked by many, because they look at futuremark rather than nvidia.

3. This is just too silly, but i'll mention it anyways. Nvidia says 3dmark03 is a bad benchmark because nvidia didn't have any input in it. Well, is it futuremark's problem/fault that nvidia dropped out 2 months before the final release? They were in the beta testing from the beginning, but they dropped out 2 months before the final release. So they think 3dmark03 would've been a "real" benchmark if it had been optimized for the GF FX. Is it me or is that just plain silly? Just because it runs neutral code (well, it won't after the "new and improved" nvidia drivers), it's not a good benchmark?

4. Nvidia says that they have to waste valuable time and effort optimizing their drivers for the benchmark. WRONG. No one forces them to do it. I'm not very knowledgeable about this, but i'll say what i can. 3dmark03 runs most things the way many future games will run them, minus the optimization. This not only gives an indication of how imperfect the design of the FX is, but also shows that it needs specific optimizations to either beat or be on par with the radeon9700pro. Whereas the radeon9700pro runs exactly the same way it will in future games that execute the same type of commands as it does now. Performance will be different, of course, but it will work the same way.

Off topic: I find it quite hard to believe Lars is unbiased, considering 2 things. 1: He crowned the FX the new king when it wasn't really all that great. I won't repeat the reasons, because they've been mentioned so many times. The review didn't have screenshots of the quality tests he claimed he did.
2:Take a look at this site
http://www.rivastation.com/about.htm
Whether he really is unbiased is something the future will tell. Because at the moment he's just an nvidia-bribed reviewer, from my point of view.

I agree with some that the quality of this site has degraded, slowly but surely, for a long time. I hope tom will give it more input than he does now.

Unit01
February 17, 2003 11:09:15 PM

Look OICA, you started this thread expressing the way you feel, and it touched home with a lot of people here; that's why it got so many responses. These threads are here for exactly those purposes. Just because Omid showed up (which I appreciate) and expressed his opinion doesn't mean that you should back down from yours and have such a sudden change of heart.... As for the dribble that noko was just giving you about needing to apologize to THG, I can't believe that there is such a wuss in our midst. You need not apologize for anything; if you hadn't started this thread, I was going to, and I definitely think that you did a better job of showing a little rationalism than I ever would have.

Saving up to buy Crashman a Shuttle Mini PC....