NVidia's response to HL2...

Ganache

Distinguished
Dec 31, 2007
NVIDIA's Response:

Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel. 50 drivers - of which reviewers currently have a beta version. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly programmable GeForce FX architecture and delivers feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation. It is not a cheat or an over-optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half Life 2.

We are committed to working with Gabe to fully understand his concerns and with Valve to ensure that 100+ million NVIDIA consumers get the best possible experience with Half Life 2 on NVIDIA hardware.

Derek Perez
Director of Public Relations
NVIDIA Corp.

<A HREF="http://gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/003.htm" target="_new">http://gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/003.htm</A>

Draw your own conclusions.
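For anyone wondering what the "32-bit to 16-bit floating point" part of NVIDIA's statement actually means in practice, here is a toy sketch (my own illustration, not Valve's or NVIDIA's actual shader code; the specular function, exponent and input value are made up) of how much a single Blinn-Phong-style specular term moves when you evaluate it at half precision instead of full precision:

```python
import numpy as np

def specular(n_dot_h, shininess, dtype):
    """Toy Blinn-Phong specular term evaluated at a chosen float precision."""
    # Both inputs are first rounded to the chosen precision, then the power
    # is taken; with np.float16 the result is rounded to half precision too.
    return dtype(n_dot_h) ** dtype(shininess)

n_dot_h = 0.97      # cosine term a pixel shader might compute per pixel
shininess = 32.0    # a typical specular exponent

fp32 = specular(n_dot_h, shininess, np.float32)
fp16 = specular(n_dot_h, shininess, np.float16)

print(f"FP32 result: {float(fp32):.6f}")
print(f"FP16 result: {float(fp16):.6f}")
print(f"difference:  {abs(float(fp32) - float(fp16)):.6f}")
```

For a single operation like this the error lands around the third decimal place, which is why NVIDIA can argue there is no visible quality loss; the catch is that the error grows as a shader chains more operations together, which is presumably what Valve is worried about.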
 

sargeduck

Distinguished
Aug 27, 2002
Yeah, I already posted this a little while ago in a thread called "Nvidia speaks!"



As each day goes by, I hug my 9600Pro just a little tighter.
 

eden

Champion
Given your recent track record, you probably went for it like a mouse for the cheese on a trap, eh?

--
Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you! (http://www.lochel.com/THGC/album.html)
 

eden

Champion
That's odd: in Ganache's quoted text, Derek Perez is the writer, but Tech Report's news carries the identical statement with this byline:
Brian Burke, NVIDIA Corp.

Oh my, now they are stealing their own writers' intellectual property?!

--
Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you! (http://www.lochel.com/THGC/album.html)
 

eden

Champion
Chief Eden, P.I.

--
Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you! (http://www.lochel.com/THGC/album.html)
 

lhgpoobaa

Illustrious
Dec 31, 2007
Hehehe, what's the bet their "optimised" 50 drivers are optimized to buggery and beyond?

I am not an AMD fanboy.
I am not a Via fanboy.
I am not an ATI fanboy.
I AM a performance fanboy.
And a low price fanboy. :smile:
Regards,
Mr no integrity coward.
 

jmecor

Distinguished
Jul 7, 2003
I only heard about this today.
Nvidia is trying to prove that their cards can play HL2 better than the Radeons.

If your nose RUNS and your feet SMELL,
then you must have been born UPSIDE DOWN.
 

lhgpoobaa

Illustrious
Dec 31, 2007
Of course they had to make some excuse.
They couldn't just say "Yep, our product sucks."

The Detonator 50s must be fantastically better, though, to get them out of this rut.

I am not an AMD fanboy.
I am not a Via fanboy.
I am not an ATI fanboy.
I AM a performance fanboy.
And a low price fanboy. :smile:
Regards,
Mr no integrity coward.
 

rain_king_uk

Distinguished
Nov 5, 2002
If the 50 drivers really are good enough to improve GeForce FX shader performance by 100%, you have to wonder how the product has been out for months with totally crappy drivers.

Stranger things have happened, but I smell a few cheats and/or game/map-specific "optimizations" on their way.
 

ufo_warviper

Distinguished
Dec 30, 2001
Hehehe, what's the bet their "optimised" 50 drivers are optimized to buggery and beyond?
I don't even like to call it "optimization" any more. I prefer to call it "cheaptimization": a three-way pun on "optimization", "cheat", and "cheap".

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

ufo_warviper

Distinguished
Dec 30, 2001
I really hope the GF4 Ti series gets like a 5 or 10% bump. That would be really nice, but probably just wishful thinking on my part.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

eden

Champion
You being an HL maniac, I have to wonder: if your Quadro isn't gonna cut it (assuming HL2 is Direct3D and not OpenGL) and you want max quality, what would you do - switch or upgrade?

--
Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you! (http://www.lochel.com/THGC/album.html)
Edited by Eden on 09/11/03 11:23 PM.
 

spud

Distinguished
Feb 17, 2001
It is nice to see their PS 2.0 compiler being integrated in the drivers this time. There should be some performance increases for sure. Keeps looking up for me.

-Jeremy

:evil: Busting Sh@t Up!!! (http://service.futuremark.com/compare?2k1=6940439) :evil:
:evil: Busting More Sh@t Up!!! (http://service.futuremark.com/compare?2k3=1228088) :evil:
 
Keeps looking up for me.
Well I guess you have to keep looking up if you're always getting knocked down! :lol:



- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

lhgpoobaa

Illustrious
Dec 31, 2007
Hahaha, good one!
You'd better copyright that one quick!

I get "3dcockstroke", you get "cheaptimization(s)".

I am not an AMD fanboy.
I am not a Via fanboy.
I am not an ATI fanboy.
I AM a performance fanboy.
And a low price fanboy. :smile:
Regards,
Mr no integrity coward.
 

spud

Distinguished
Feb 17, 2001
Why I still have a 9700 on RMA replacement from ATI either or I am in good water.

-Jeremy

:evil: Busting Sh@t Up!!! (http://service.futuremark.com/compare?2k1=6940439) :evil:
:evil: Busting More Sh@t Up!!! (http://service.futuremark.com/compare?2k3=1228088) :evil:
 
Right Spud, whatever you say. Whatever it was that you said.
Need SAP, subtitles or something. Maybe I can get that woman to 'SIGN' in a little window on my desktop.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

spud

Distinguished
Feb 17, 2001
What do you mean? I initially bought a 9700 when I saw the shite performance of the 5800. Got the card and went about my business, then the card and the mobo both shat out on me. The RMA on the board went alright; the video card I'm still waiting on... Not buying online again, that's for damned sure (bought from cendirect.com).

But either way I'm not so screwed; I can recover from this. I can sell the 5900 for like 200-250 bucks and get a third back on it, then try to trade off the 9700 when I get it back, maybe for 200-250 as well, then go buy a brand-spanking-new 9800 since they don't suck ass. Still gonna wait till the Det 50s are out to be sure.

-Jeremy

:evil: Busting Sh@t Up!!! (http://service.futuremark.com/compare?2k1=6940439) :evil:
:evil: Busting More Sh@t Up!!! (http://service.futuremark.com/compare?2k3=1228088) :evil:
 

spud

Distinguished
Feb 17, 2001
Where did I say that??? Another thread???

-Jeremy

:evil: Busting Sh@t Up!!! (http://service.futuremark.com/compare?2k1=6940439) :evil:
:evil: Busting More Sh@t Up!!! (http://service.futuremark.com/compare?2k3=1228088) :evil:
 
Look up: you say the R9800 doesn't suck ass, but you returned the R9700, hence the conclusion one would draw is that the R9700 sucks ass in your op-onion.

BTW, the Det 50s are starting to make appearances; check my thread about AmdMb's review.

Personally I'd still say to everyone: WAIT! Regardless of Det 50s and preview benchmarks, there will be a lot of 'new' architecture out soon. Perhaps the R9800XT will be more to everyone's liking, and perhaps the R9600XT/PLUS/MEGA/PMSNBC/whatever will perform fine, and perhaps the FX5700 will have some fixes (it seemed to me it was closer to release than the FX5950).
If people are building for the day HL2 comes out, then I think there are really only 2 choices for the most part, unless they are contemplating DX8 products. But even then, wait till HL2 hits the shelves so we know the final-version benchmarks and people's initial reactions. But if you wanna play the first second it's released, then you guesstimate what you need.

Anywhoo we shall see.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
Which 2 choices? I can think of several ATi cards that I imagine will run HL2 fine in DX9: R9500, R9500 Pro, R9600, R9600 Pro, R9700, R9700 Pro, R9800, R9800 Pro, and some of the untested new ones should work too. That's 8 cards there.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!