HL2 = Microsoft's official DX9 Testing Platform

ufo_warviper

Distinguished
Dec 30, 2001
HL2 will be Microsoft's official platform for testing DX9


Click here to read more and find out:

<A HREF="http://www.megagames.com/news/html/pc/h-l2newbenchmarkingstandard.shtml" target="_new">http://www.megagames.com/news/html/pc/h-l2newbenchmarkingstandard.shtml</A>


My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

coolsquirtle

Distinguished
Jan 4, 2003
not necessarily; if the NV38 or NV40 pwn ATi at HL2 then it would be bad for ATi

No one knows for sure

Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 
if the NV38 or NV40 pwn ATi at HL2 then it would be bad for ATi
Yeah, but remember that those cards don't come out until long after the release of HL2, so it will be too late by then by most accounts. And since the NV38 is pretty much the same basic design as the NV35, I don't see it improving that much compared to the R360.
Yes, we shall see, but don't get your hopes up. nV's main concern now must be DOOM ]|[, as they are already gone on HL2 (using reduced IQ to make it even playable). I don't see nV releasing another major product between now and the time HL2 receives its first markdown (just in time for the start of the holiday shopping season).

Be loyal, but also beware. I would REALLY LOVE to see the performance of the FXs against the Parhelia NOW. I'm serious, we know they are p'zone'd by the ATIs, but do they even impress when put up against the Parhelia?


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

eden

Champion
I am wondering whether John Carmack decided to give up vendor-specific codepaths after Doom III just because of nVidia's hectic design?
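For anyone not following the codepath talk: a vendor-specific path just means the engine checks which chip it is running on and picks a shader backend tuned for it. A rough C++ sketch of the idea (the names and paths below are made up for illustration, not taken from the actual Doom III source):

// Pick a render backend from the GL vendor string.
#include <cstdio>
#include <cstring>

enum RenderPath { PATH_ARB2, PATH_NV30, PATH_R300 };

RenderPath choosePath(const char* vendor) {
    // NV3x wants partial-precision (FP16) hints to reach full speed,
    // so it gets its own path; everything else takes the generic ARB2 path.
    if (std::strstr(vendor, "NVIDIA")) return PATH_NV30;
    if (std::strstr(vendor, "ATI"))    return PATH_R300;
    return PATH_ARB2;
}

int main() {
    // In a real engine the string would come from glGetString(GL_VENDOR).
    std::printf("chosen path: %d\n", choosePath("NVIDIA Corporation"));
    return 0;
}

If the vendor paths went away, everything would run the generic branch, which is where the NV3x seems to hurt most.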

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 

davepermen

Distinguished
Sep 7, 2002
nope, he hasn't given up yet. it would be nvidia's death if he did. so they for sure work hard (shift much money towards id :D) to keep him doing an NV path

btw.. the biggest advantage of the gfFX was the higher precision. to get comparable speed on the gfFX to the radeon, this higher precision gets completely disabled by nvidia in the drivers.. and like that it's lower than ati's precision, and by quite some amount.. in percent, that is.
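to put a number on "quite some amount": a quick back-of-envelope sketch, assuming the usual mantissa widths (FP16 = 10 bits, ati's FP24 = 16 bits, FP32 = 23 bits). worst-case relative rounding error is roughly 2^-(bits+1) per operation, so FP16 sits near 0.05% while FP24 is near 0.0008%, about 64x finer:

// Print the approximate relative rounding error for each shader format.
#include <cmath>
#include <cstdio>

int main() {
    const struct { const char* name; int mantissaBits; } formats[] = {
        { "FP16 (NV partial precision)", 10 },
        { "FP24 (ATI R3x0 shader core)", 16 },
        { "FP32 (gfFX full precision)",  23 },
    };
    for (const auto& f : formats) {
        double err = std::pow(2.0, -(f.mantissaBits + 1));
        std::printf("%-30s ~%.6f%% relative error\n", f.name, err * 100.0);
    }
    return 0;
}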

oh well.. they keep shooting themselves in the foot

"take a look around" - limp bizkit

www.google.com
 

phial

Splendid
Oct 29, 2002
yep


and also, ATI's 24bit precision is only partial 24bit, and in the end there's little difference between it and 32bit

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 

davepermen

Distinguished
Sep 7, 2002
partial 24bit? what do you mean by that? the shaders are full 24bit all the time..

nvidia's det50 shaders you can now call "full 16bit all the time, possibly more", which makes it essentially 16bit-precise hw. possibly they even drop to 12bit fixed point, which makes it.. uhm.. 3 bits better than the gf4 :D

if you mean that the ati hw has parts in 32bit, yeah, but that's only storage, not calculation.

scientists love the gfFX for being full 32bit precise. now they're gonna hate that the drivers decide for themselves whether to use that precision.. which makes the cards useless for scientific calculations.
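a toy illustration of why that matters for long calculations: add 0.0005 ten thousand times, once in real FP32 and once with the running sum squashed back to a 10-bit (FP16-like) mantissa after every add. the emulation below is my own simplification, not how the det50 drivers actually behave:

// FP32 lands close to 5.0; the FP16-like sum stalls around 2.0 because
// 0.0005 falls below half a rounding step once the total passes 2.
#include <cmath>
#include <cstdio>

// Round x to 'bits' explicit mantissa bits (crude float-format emulation).
float roundToMantissa(float x, int bits) {
    if (x == 0.0f) return 0.0f;
    int e;
    float m = std::frexp(x, &e);          // x = m * 2^e, with m in [0.5, 1)
    const float steps = float(1 << (bits + 1));
    m = std::round(m * steps) / steps;
    return std::ldexp(m, e);
}

int main() {
    float fullSum = 0.0f, lowSum = 0.0f;
    for (int i = 0; i < 10000; ++i) {
        fullSum += 0.0005f;
        lowSum = roundToMantissa(lowSum + 0.0005f, 10);
    }
    std::printf("FP32 sum:      %f\n", fullSum);
    std::printf("FP16-like sum: %f\n", lowSum);
    return 0;
}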

anyways. gfFX is dead. finally. now we only have to get rid of all the sold dead units.. :D

"take a look around" - limp bizkit

www.google.com