Comment on HL2 performance from Tom's

kmolazz

Distinguished
Feb 14, 2003
9
0
18,510
"Valve did not allow us to also run test with a new Beta version of the upcoming NVIDIA Detonator 50 driver. Valve explained that they only want to see scores of already released drivers. We're still waiting for a more detailed explanation from Valve what exactly they don't like on that Beta driver"


You're still waiting? You didn't ask? Well, other sites have answers:

http://www.techreport.com/etc/2003q3/valve/index.x?pg=1
http://www.extremetech.com/article2/0,3973,1265063,00.asp


Apparently they thought the "optimizations" had gone a little too far. And you must have skipped the one slide from the presentation where they talk about the issues regarding someone's drivers in Half-Life 2...

This is a little quote from the TechReport article:
"As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game."




My other question is in regard to the anti-aliasing test in the article. Did you use AA forced from the control panel? I don't recall seeing AA tests from other sites. Because, from what I've read, in the final game AA will be controlled by the engine, providing more performance.

This is a quote from the ExtremeTech article:
"Improved full scene anti-aliasing will deliver more image quality bang for the processing buck, by sometimes selectively applying AA, and the Source engine will control AA from within the engine, rather than rely on driver control panel settings."




And this:
"But it's also up to Valve now to offer a solution and a more in-depth explanation to the million owners of NVIDIA cards as to why their cards perform so badly with Halflife 2 at the moment."

I think they gave a damn fine slide presentation explaining it.... Perhaps the makers of the last Tomb Raider should explain why FX performance is also bad in that game??
 

jmecor

Distinguished
Jul 7, 2003
2,332
0
19,780
RoFlmAo...


<font color=red>If your nose <b>RUNS</b>, and feet <b>SMELL</b>.
Then you must be born <b>UPSIDE DOWN</b>.</font color=red>
 

Cassius105

Distinguished
Sep 6, 2003
25
0
18,530
I'm not sure why Tom's did the AA/AF tests.

Valve told the other hardware reviewers that the AA/AF implementation in HL2 was not finished and was a bit dodgy, so it would be different from the performance in the final product.

This is probably why the FX5900 score didn't move with AF on.
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
"He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game."
Dang, talk about deception. Why does nVidia persist in deceiving its own customers? These actions are not forgivable unless nVidia pulls a 180 in the future. Why can't nVidia just come out and admit that they have a faulty product? I'd have much more respect for them if they did this instead of all these little cover-ups.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
THIS POST SHOULD BE KEPT ALIVE...especially with all the other crap people are posting right now. People are making excuses for Nvidia, saying it can only utilize a 4x1 architecture with DX9 instead of its standard 4x2 and that drivers will fix this...Bullcrap. I'm bumping this post as often as it takes so people will SNAP THE FRICK OUT OF IT.

BUMP!!!

<font color=blue>other people's lives
seem more interesting
cuz they aint mine
</font color=blue>
<font color=green>Modest Mouse</font color=green>

TKS
 
What's this 'Bump' you speak of?

:wink:

- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

rain_king_uk

Distinguished
Nov 5, 2002
229
0
18,680
Yeah, I read that article. Not only did it seem the benchmarks were performed by a bunch of headless chickens, but the article seemed to be penned by a 12-year-old who didn't pay any attention to the explanations that were given.

It did strike me that it was written either by someone incompetent or by someone biased. Take your pick.
 

Borsti

Distinguished
Apr 19, 2002
49
0
18,530
Hi kmolazz,

sure, I saw that slide as well. But it was a general slide and not related to the Det50 in particular. I asked what invalid stuff they found in the Det50 and HL2, and Valve did not want to give me an answer. The only thing they explained was the fog. NV told me that it's a bug and that it only happens on a 5600, because the driver they used had only been tested on the 5900 at that time. That info might be correct or incorrect; I had no chance to find out myself. Valve also did not say which Det50 it was. They told me they never saw the beta NV gave to the press and that the Det50 they have is an older one.

Regarding the FSAA test: the aniso settings were set up by ATI's Raja Koduri, who is pretty well aware of what the HL2 demo can and can't do. He was also the one telling us (other sites tested at the same time as I did, as I wrote in the article) that setting FSAA in the control panel is fine.

Because we only had limited time, we were not able to confirm the settings. I said that at the beginning of the article.

Regarding your last comment, there's one thing to consider. Valve wants to sell games, and that includes selling to people with NV cards as well. Valve says: ATI rocks, NV sucks. Fine. I highly respect their decision to tell that to anybody without the typical marketing blah!!! They also prove their point with the HL2 benchmarks. But before I sign on to their conclusion, I want to make sure that this really is the case, or whether there's a (valid!) way to get better performance out of NV cards in DX9 games or HL2 in particular. If this means time (=money), then NV has to pay for it. If it is because of the design of their shader code, then they have to say so. Maybe they fear that game studios won't license the Source engine? I don't know! But it's big business, and business is dirty most of the time. I have huge respect for the Valve people, but they are not gods! They chose a way to store textures that results in trouble with FSAA on current cards. It's not a bug in the cards or anything else; it's because of the way they store textures in the game.

To bring it to a point: I won't tell people owning NV cards that there won't be a chance of better DX9 performance until I'm 100% sure. The only thing I'll say right now is: ATI is a safe harbour, and it looks like NV cards have a problem. So if you want to buy a new card and want to be sure it's good in DX9, then go with ATI. But this won't help people who already own an FX card. And one thing everybody has to consider as well: DX9 titles are still very rare or in beta, and NV promises better performance with the Det50. I want to find out if that means there is hope. Gloating won't help FX owners; they want to see solutions. And if Valve says: sorry, there's no solution, then I would welcome a bit more detailed explanation and not only a few PowerPoint slides.

If you look at it from a fanboy's point of view, then the situation is pretty clear: ATI fans say "we rule", NV fans cry. But it's not my job to supply one side or the other with new ammo. It's time to talk about solutions, and if there are none, then we have to talk about the reasons why they are not possible. And that's the state we're currently in.

Shaders are a very complex story, and there are ways to hit NV's or ATI's hardware depending on how you code the shaders. ATI's architecture is less complicated in this respect, but that does not mean you can't hurt its architecture: (http://mirror.ati.com/developer/dx9/ATI-DX9_Optimization.pdf)

best regards

Lars
 
Hey Borsti/Lars, how's it hanging? Guess you've been a little busy, huh? :cool:

Are you guys going to get a chance with the Det 50s & HL2 in the near future? Or can you not talk about it? :wink: Is there an Aquamark review in the future? I know you gave us that GREAT video of your TV appearance. Thanks again for that.

Do you think it might be possible to add a Parhelia to some of your reviews, just for reference's sake? I think it would be of interest to many (especially with some of their DX9 features). Just a thought, and I'd love to see how they do vs the current nV setup. You did have some benchies for the Parhelia in other segments before.


Thanks as always! And I'll resurrect an old 'sig' from the old 'cheats' days just in honour of one of your statements in your reply.

ATI SUX, nVidia BLOWS, and Matrox gives good Multi-head! :cool:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

Flinx

Distinguished
Jun 8, 2001
1,910
0
19,780
<A HREF="http://www.techreport.com/etc/2003q3/valve/index.x?pg=1
" target="_new">http://www.techreport.com/etc/2003q3/valve/index.x?pg=1
</A>
<A HREF="http://www.extremetech.com/article2/0,3973,1265063,00.asp" target="_new">http://www.extremetech.com/article2/0,3973,1265063,00.asp</A>

The loving are the daring!
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Very good points. I would agree with 90% of them. But we are forgetting one thing. For this I will remember the almighty English Lit class I took back in college and invoke (drum roll please) a comparison :wink: . Here we go: Everyone in major league baseball uses wooden bats. There's no getting around it...there is a weight...a height...dimensions that are set for each bat to follow. Tangibles that set a <i>standard</i> *nudge nudge, wink wink*. Mighty Casey comes along and begins to belt the balls right over deep left center with every swing. Mighty Casey becomes a legendary hitter that performs *wink wink, nudge nudge* better than all of the other big leaguers. Then one day, along come the umpires *wink wink, nudge nudge*, and they decide they need to inspect/benchmark all of the bats in the league because there is a rumor of cheating/optimizations. Casey is caught with a cork in his bat. This, my friends, is where Mighty Casey struck out. Especially when Mighty Casey admits no wrongdoing.

Does it take away from the game? Damn straight it does. In fact, the games that are being developed will take longer and longer to develop the more Nvidia/ATI/Matrox/Trident/whoever decides to make things complicated by introducing vendor-specific optimizations. What Nvidia doesn't understand right now is that even if they pissed Valve off, as long as they stick to the standards and use a regular bat (DX9 standards) like everyone else...they'll perform just fine. But they keep coming to the plate with a corked bat and then start saying "What? What corked bat? Forgetaboudit...you dint see no freakin cork."

I feel quite a bit like I did after the baseball strike in the '90s...I felt cheated...like I didn't matter and the game didn't matter...But I'm older and wiser now and I realize that I do matter. But the giants of the game won't recognize this until someone stands up to them and doesn't budge one bit. I applaud Valve for doing what they have done and not budging. Kudos for having some cojones. I have a sneaking suspicion that they've set the precedent on this one. Look for more sproutings of nutsaq's all across the game developer board.

<font color=blue>other people's lives
seem more interesting
cuz they aint mine
</font color=blue>
<font color=green>Modest Mouse</font color=green>

TKS
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
Not a good comparison. The DX stuff is laid out but doesn't need to be followed perfectly; if it did, then everyone would have the exact same performance in DX stuff. It's all about the engineering; take that away and it's like watching a game of baseball *zZzZzZzZz*.

You also seem to be missing the point that a lot of FX users want addressed. Is there a way or possibility that our DX9.0 performance is somewhat hindered by PS 2.0 code, drivers, hardware, or current compilers? This is somewhat not nVidia's usual style of business. But you could be right; it could be that nVidia's engineers had poor foresight when designing the core. Either way, I want to know what the deal is and not get [-peep-] mocked, because I can throw cash at anything I want but in the current case threw it at the wrong thing.

-Jeremy

:evil: <A HREF="http://service.futuremark.com/compare?2k1=6940439" target="_new">Busting Sh@t Up!!!</A> :evil:
:evil: <A HREF="http://service.futuremark.com/compare?2k3=1228088" target="_new">Busting More Sh@t Up!!!</A> :evil:
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Lars continued... <A HREF="http://www.beyond3d.com/forum/viewtopic.php?t=7884&postdays=0&postorder=asc&start=0" target="_new">http://www.beyond3d.com/forum/viewtopic.php?t=7884&postdays=0&postorder=asc&start=0</A>
Scroll down.

I help because you suck.
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
That's not true, spud, you have to have standards. That's why there is NIST, the National Institute of Standards and Technology. In electronics...especially communications, you have to have standards...IEEE 488, 1553, 1394, 802.11b-g...these things are set up so that when vendors come in to support them, they follow things to a T, and their performance is based strictly on hardware configuration and what OS you run. Software performance for these mentioned standards is insignificant.

This is what DX9 should be as well. Will this limit things? Most likely it will. I think it should be limited, though, so we do not get falsified information and optimized code that makes it <i>seem</i> as though things are going better for your card than they really are. I mean, put it this way...say you have a USB 2.0 modem...the software mod you installed tells you that you are connected at 10Mb, and the performance tests report that you are connected at close to 10Mb...but in actuality you are connected at .1Mb...wouldn't that piss you off? Especially if you paid an arseload of money for the modem/optimization? It would me...that's why I think standards are a good thing. While they hold things back a bit...in the long run they keep companies truthful and in check.

-------------------------------------
Nvidia's marketing tip: <font color=blue>The victor will never be asked if he told the truth. </font color=blue>
<font color=red>Adolf Hitler</font color=red>
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
"That's not true, spud, you have to have standards."

Exactly, otherwise you have Company A sponsoring a game developer and causing the game to not work on Company B's video cards.



like the new Tiger Woods game... where the water shaders only work on cards with an Nvidia device ID, even though ATI/others can run them just as well.
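
For illustration only, here's a rough sketch in C of the kind of device-ID gating being described. The PCI vendor IDs are the real ones, but the function and the check are hypothetical, not the actual game code:

#include <stdio.h>

/* Real PCI vendor IDs; everything else here is a made-up illustration. */
#define VENDOR_NVIDIA 0x10DEu
#define VENDOR_ATI    0x1002u

/* The feature is keyed to the vendor ID rather than to an actual capability
   check (e.g. "does this card support the needed pixel shaders?"), so a
   perfectly capable non-Nvidia card gets the downgraded water anyway. */
static int water_shaders_enabled(unsigned int vendor_id)
{
    return vendor_id == VENDOR_NVIDIA;
}

int main(void)
{
    printf("Nvidia card (0x%04X): shader water %s\n", VENDOR_NVIDIA,
           water_shaders_enabled(VENDOR_NVIDIA) ? "on" : "off");
    printf("ATI card    (0x%04X): shader water %s\n", VENDOR_ATI,
           water_shaders_enabled(VENDOR_ATI) ? "on" : "off");
    return 0;
}

A capability check (or at least a user override) instead of a vendor check would avoid exactly the problem being complained about.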

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>