
J. Carmack's answer about GeForce FX

Last response: in Graphics & Displays
September 18, 2003 3:12:07 AM

I found this on AnandTech.com; thought you folks might like to see it.

"Thanks to GTaudiophile for this tidbit from John Carmack.

Hi John,

No doubt you heard about the GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative of future DX9 games (including Doom III), or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

John Carmack"
This should prove Valve's point. Or maybe some don't trust Carmack and think this is part of the conspiracy against Nvidia.
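For the curious, Carmack's point about precision can be sketched outside of shader code. The following is a hypothetical illustration (not anything from the Doom or HL2 renderers), using Python's `struct` IEEE half- and single-precision formats as stand-ins for the FX's fp16 path and a full-precision fragment program, to show how rounding error piles up when many small per-pixel contributions are accumulated at fp16:

```python
import struct

def f16(x):
    """Round x to the nearest IEEE half-precision (fp16) value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def f32(x):
    """Round x to the nearest IEEE single-precision (fp32) value."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

def accumulate(rnd, terms=2000, step=0.001):
    """Sum many small lighting-style contributions, rounding at every step."""
    total = 0.0
    for _ in range(terms):
        total = rnd(total + rnd(step))
    return total

# The exact sum is 2.0: the fp16 run drifts visibly off it,
# while the fp32 run stays essentially on target.
print(accumulate(f16), accumulate(f32))
```

Whether that drift is visible on screen depends on what the engine does with the numbers, which is why "the precision doesn't really matter" can be true for Doom's math and false for a game designed around DX9 as a minimum spec.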

<font color=red>listen to me or wait for the next patch!</font color=red>


September 18, 2003 3:19:27 AM

IMO
once games start using 32bit precision, that's when nVidia will shine~~~~~~ of course I'm wrong though :D 

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
September 18, 2003 3:29:13 AM

If Gabe sounded less than credible to some, they'd better not expect the same of Carmack.

Carmack is the single most respected and most knowledgeable guy in the graphics industry; that's a given. He really drives the industry. If he says so, then it's almost definitive.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
September 18, 2003 4:03:18 AM

32bit precision isn't going to happen until DX10 at the earliest, and by then you're going to need a new card anyway to take advantage of the other DX10 features.
September 18, 2003 4:47:11 AM

And that card is <A HREF="http://www.xbitlabs.com/articles/editorial/display/idf-..." target="_new">THIS CARD</A>

Sorry, I just had to. As GW says, it gives me a <A HREF="http://flakmag.com/misc/chubby.html" target="_new">Chubby</A> (or well, it will get one from the fridge for me [PCI-EX does that, right? Isn't that one of its amazing features, something about fetch or pre-fetch?]). I'd link to their site but FOX bought it. I tried to go there and got FOX.

KIDSCLUB.COM takes you to FOX BOSTON!

Anywhoo, yeah it doesn't look good for nV.

The NV40 is a LONG way away.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil: 
September 18, 2003 6:48:33 AM

This weakens Nvidia's credibility about fixing the problem with a driver update.

It proves that the problem is at the hardware level, and that det.50 can improve DX9.0 performance mostly by further sacrificing IQ.

The FX's failure in DX9.0 will have a bad effect on the quality of upcoming games, since developers will be more conservative about using the DX9.0 capabilities of graphics cards.

Hope Nvidia wakes up before it's too late.

<font color=red>listen to me or wait for the next patch!</font color=red>
September 18, 2003 7:09:16 AM

Quote:
once games start using 32bit precision, that's when nVidia will shine

What the [-peep-] are you talking about?
Games <b>do</b> use 32bit precision right now, but the Nvidia cards puke whenever it's enabled.
When Gabe said they had to make a special path for the FX, it was a combined fp32/fp16 path (mostly fp16).
Valve tried to code the game in pure DX9 (which means at least fp24), but NV cards can't run fp32 at any playable speed.
Now read your post again and you'll understand why I found it silly, squirtle :) 

I help because you suck.
September 18, 2003 7:25:54 AM

Quote:
32bit precision isn't going to happen until DX10

Lol...you guys are killing me.
In order for a piece of hardware to meet the DX9 spec, it has to support at least fp24 render targets.
Nvidia announced that their cards would support fp16 <b>and</b> fp32, knowing damn well that fp32 (in the case of desktop graphics) would be useless. It's a marketing gimmick.
Developers have tried to squeeze the best precision they can out of Nvidia's cards, but have found fp32 not doable in most situations.
As far as DX10 goes, I don't think you really understand that no one knows what the DX10 spec will call for, and we aren't going to see it for a long time (which makes me happy).
I heard from a friend at B3D that fp32 was instituted by Nvidia mostly for their professional (Quadro) graphics cards, and plays a key role in making Quadro the best solution (though I have no idea really) for professionals because of its high precision.
I must be tired because I just forgot what I was talking about.
That's all.
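As a rough sketch of what those precision grades actually buy you, here is a hypothetical stdlib-Python example that measures the smallest representable step above 1.0 ("machine epsilon") for fp16 and fp32 via `struct`. fp24 is not a CPU type, so its value is stated from its 16-bit mantissa (the R300's s16e7 layout) rather than measured:

```python
import struct

def f16(x):
    """Round x to the nearest IEEE half-precision value (10 mantissa bits)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def f32(x):
    """Round x to the nearest IEEE single-precision value (23 mantissa bits)."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

def eps(rnd):
    """Halve e until 1.0 + e/2 rounds back down to 1.0; return the last survivor."""
    e = 1.0
    while rnd(1.0 + e / 2) > 1.0:
        e /= 2
    return e

print(eps(f16))   # fp16: 2**-10, roughly 0.001
print(2.0 ** -16) # fp24 (16 mantissa bits): roughly 0.000015
print(eps(f32))   # fp32: 2**-23, roughly 0.0000001
```

The gap between fp16 and fp24 is about a factor of 64, which fits the observation above that fp24 and fp32 shots look identical while fp16 is sometimes arguable.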

I help because you suck.
September 18, 2003 7:31:09 AM

Clearly I meant 32bit precision isn't going to be required as standard in games until DX10 "AT THE EARLIEST" (the part of my quote you missed), therefore supporting 32bit precision now is useless.
September 18, 2003 7:46:29 AM

Right on:) 
Mike C from NV News posted a few shots taken in all three modes we're discussing: fp16, fp24, and fp32. I honestly couldn't tell the difference between fp24 and fp32. I don't think anything over fp24 is needed. Some argued that fp16 looked great, and I half-heartedly agree. When Nvidia introduced the FX 5800, they claimed superiority because their hardware supported 32bit render targets. I actually fell for it; a lot of people did. Funny how things turn out.
Maybe Nvidia's next architecture will bring some speed to 32bit rendering.

I help because you suck.
September 18, 2003 8:28:33 PM

Hey Spud, you're a decent guy. Any chance of you returning your GeForce FX card?

_________________________________________
<A HREF="http://skulls.sytes.net/tom/" target="_new">12 bit... The way games are meant to be played!</A>
September 19, 2003 3:24:43 AM

At this point it's a tough one. I can sell it easy; a buddy will take it off my hands for 350 clean. But I broke my bank account with my new P4C800-E Deluxe and 1 gig of 3200, so I'll have to wait till next payday to get a 9800. Unless the 53.45s clean things up for me.

-Jeremy

:evil:  <A HREF="http://service.futuremark.com/compare?2k1=6940439" target="_new">Busting Sh@t Up!!!</A> :evil: 
:evil:  <A HREF="http://service.futuremark.com/compare?2k3=1228088" target="_new">Busting More Sh@t Up!!!</A> :evil: 