Doom III, how fast will your card be?

cleeve

Illustrious
Not to worry!

Just like the Radeons mopped the floor with the GeForce FX in the Half-Life 2 previews, by the time both games go retail they should offer acceptable performance on cards from both manufacturers.

A lot of driver and coding work has been done since then.

Heck, my 9500 PRO runs the Doom 3 alpha GREAT with the new Catalyst drivers. Very sweet, especially since it's an alpha release.
The final version should run smooth as silk on the Radeon 9500 and higher.

________________
<b>Radeon <font color=red>9500 PRO</font></b> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</font></b> <i>(o/c 2400+ w/143MHz FSB)</i>
<b>3dMark03: <font color=red>4,055</font></b>
 

petrolhead2003

Distinguished
Dec 29, 2003
99
0
18,630
Yeah, I agree, but the Nvidia cards have been slaughtered because of their HL2 performance.

Just needed to make SOME people realise that Nvidia FX cards are not shite at running the latest "must have" games.
 

cleeve

Illustrious
We'll see.

Regardless, the framerate differences in Doom 3 were minimal between vendors, while the differences in Half-Life 2 were horrible.

ATI's shaders are superior, there's no doubt about it. That doesn't make Nvidia's cards bad, it just means that they are disadvantaged in true DirectX 9 titles because the Radeons have such a technology lead.

In Doom 3 (an OpenGL title), Nvidia does have a small lead, though. But looking at the industry, I think DirectX is where the bulk of game development work is happening.

________________
<b>Radeon <font color=red>9500 PRO</font></b> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</font></b> <i>(o/c 2400+ w/143MHz FSB)</i>
<b>3dMark03: <font color=red>4,055</font></b>
 

addiarmadar

Distinguished
May 26, 2003
2,558
0
20,780
Doom 3 will look a hell of a lot better on the Radeons than on the FX cards. The FX chips will probably win on FPS by a small amount, but it won't be enough to notice a difference.

Barton 2500+ @ 2200MHz (10x220, vcore @ 1.8)
Asus A7N8X Dlx, 440 FSB
1GB Geil GD PC3500 Dual Channel (2-3-3-6)
Seagate 80GB SATA, 8.5ms seek
ATI Radeon 9800 Pro (420/720)
 

petrolhead2003

Distinguished
Dec 29, 2003
99
0
18,630
"Ati's shaders are superior, there's no doubt about it. That doesn't make Nvidia's cards bad, it just means that they are disadvantaged in true DirectX 9 titles because Radeon's have such a technology lead".

This is not true.

DX9 specifies 24-bit shader precision, and that is what ATI uses.
Nvidia is ahead of the field because they have 16-bit and 32-bit shaders.
This means that in a direct comparison the ATI cards are faster, but the FX cards have more advanced technology.

The FX cards go beyond the specifications of DX9 and will HOPEFULLY come good in the near future.
 

petrolhead2003

Distinguished
Dec 29, 2003
99
0
18,630
"Doom 3 will look a hell of a lot better on the radeons than the FX cards. FX chips would probably get the FPS by a small amount but it will not be enough to notice a difference".

My question is:

Why will a 24-bit precision image look better than a 32-bit precision image?



<P ID="edit"><FONT SIZE=-1><EM>Edited by petrolhead2003 on 01/12/04 04:46 PM.</EM></FONT></P>
 

cleeve

Illustrious
Both images are in 32-bit color. You are grossly mistaken/misinformed.

The precision argument is not about available color; every video card since the original GeForce has had 32-bit color available.

The Radeon has 24-bit shader precision. The GeForce FX has 16/32-bit shader precision.
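
To put rough numbers on that, here's a back-of-the-envelope sketch in C. The mantissa widths (FP16 = 10 bits, FP24 = 16 bits, FP32 = 23 bits) are the commonly cited figures for these formats, not anything from the Doom 3 code:

    /* Approximate smallest representable step near 1.0 for each shader precision.
     * Assumed mantissa widths: FP16 = 10 bits, FP24 = 16 bits, FP32 = 23 bits. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        printf("FP16 step near 1.0: %g\n", pow(2.0, -10)); /* ~0.001      */
        printf("FP24 step near 1.0: %g\n", pow(2.0, -16)); /* ~0.000015   */
        printf("FP32 step near 1.0: %g\n", pow(2.0, -23)); /* ~0.00000012 */
        return 0;
    }

The point is that these numbers describe rounding error inside the shader math, not the 32-bit color of the final framebuffer.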

In fact, the fragment programs run on the GeForce FX will default to 16-bit precision in many places, because the FX hardware is not as powerful with fragment programs as the Radeon 3xx architecture.

Carmack had to write a specific codepath to get more performance out of the FX cards, because with the default ARB codepath their performance wasn't nearly as good as the Radeons'. One of the tradeoffs was lowering precision to 16-bit in many cases.
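
For what a "specific codepath" means in practice, here's a minimal sketch in C. The path names (ARB, R200, ARB2, NV30) are the ones Carmack has mentioned for Doom 3, but the detection logic below is my own guess, not id's actual code:

    /* Hypothetical render-path selection, loosely modelled on Doom 3's
     * vendor paths. Detection logic is illustrative only. */
    #include <string.h>
    #include <GL/gl.h>

    typedef enum { PATH_ARB, PATH_R200, PATH_ARB2, PATH_NV30 } renderPath_t;

    renderPath_t SelectRenderPath(void)
    {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (!ext)
            return PATH_ARB;

        /* GeForce FX: use NV fragment programs so precision can drop to 16-bit
         * where it doesn't hurt, trading accuracy for speed. */
        if (strstr(ext, "GL_NV_fragment_program"))
            return PATH_NV30;

        /* R3xx and other DX9-class cards: standard ARB fragment programs,
         * always at the hardware's full precision (24-bit on the Radeon). */
        if (strstr(ext, "GL_ARB_fragment_program"))
            return PATH_ARB2;

        /* Older hardware falls back to simpler paths (R200 selection
         * omitted here for brevity). */
        return PATH_ARB;
    }

The key point is that the NV30 path exists only because running the FX through the generic ARB2 path, at full precision, left it behind the Radeons.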

If you do not understand the arguments, please do not share your misinformation on a message board.

________________
<b>Radeon <font color=red>9500 PRO</font></b> <i>(hardmodded 9500, o/c 322/322)</i>
<b>AthlonXP <font color=red>2600+</font></b> <i>(o/c 2400+ w/143MHz FSB)</i>
<b>3dMark03: <font color=red>4,055</font></b>
 
G

Guest

Guest
Ya, I disagree with you. DIII is one of those "The Way It's Meant To Be Played" titles, so it will have an optimized Nvidia code path (ATI also, but I think with DIII the GeForce FX cards are the first concern, as opposed to HL2, which should be somewhat more optimized for ATI).
Nvidia's architecture isn't bad at all if the game has been programmed/optimized for it.
As others mentioned, both cards should work fine, but I wouldn't be surprised to see ATI having the upper hand in HL2 and Nvidia in D3...
 
Seriously, you need to do A LOT more research before you post this old news. It's OK to be that ignorant at the time these nV-centric benchies came out, but by now you should know better (like the rest of us).

Running the default ARB2 path, the FXs will actually run SLOWER than their Radeon counterparts; the FX cards require their own specially coded path in order to get a SMALL gain over the ATI cards.

Carmack even commented on it at the time of the HL2 benchies:

<font color=blue>"Unfortunately, it will probably be representative of most DX9 games. Doom has a custom backend that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec"</font>

So I guess for the latest must-have games there are some issues.

Thank god nV didn't focus on the crappy benchies above and instead got their run-time compiler to work better; otherwise that'd be the only 'DX9 game' you'd see with any positive results for the FX line (save the FX5700U, whose 3 vertex engines help a lot).


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i>! - <font color=green>RED</font> <font color=red>GREEN</font> GA to SK :evil:
 
"So it will have an optimized Nvidia code path (ATI also, but I think with DIII the GeForce FX cards are the first concern, as opposed to HL2, which should be somewhat more optimized for ATI)."
Actually, they aren't 'optimized for ATI'; the ARB2 path in D]|[ is there for all DX9-class (HLSL-capable) cards to use, same with the basic HL2 path. Only nV gets all the optimizations in both games, to make them competitive it seems. Originally Carmack said that the Parhelia would use the SAME ARB2 path as the R300 series; that may have changed, but I doubt it.
The only card 'optimized' for in either HL2 or D]|[ is the FX series, and even then it doesn't seem to help much, since it requires lower precision. The only advantage that the FX cards HAD was the UltraShadow technology, and ATI addressed that with the slight hardware tweaks in the R350/RV350 and later editions to allow for longer shader instructions.

All in all, it's highly likely that unless there are effects only present in the FX path, the ATI cards will actually LOOK better, but the FX will be faster with its lower precision.

Too bad that Carmack specifically dropped support for geometric amplification methods like TruForm because of the stencil shadow volumes. With the power of the new cards, that would likely have offered GREAT image enhancement at little performance cost, especially since D]|[ has been CAPPED at 60 FPS; so it might not make much of a difference, depending on whether or not these nV-centric paths do anything to improve the low-end minimum fps, if that's even an issue.

I'm certain that by the time either game ships, the current cards will have no trouble running it. I'm sure the NV40 will handle HL2 better than the current line (it almost has to).


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i>! - <font color=green>RED</font> <font color=red>GREEN</font> GA to SK :evil:
 

Vapor

Distinguished
Jun 11, 2001
2,206
0
19,780
Dude, that article is 8 months old!! Who knows how the real version will play?? First of all, drivers from both companies have changed, pretty significantly, since that time. Second of all, the game will be different, more optimized to play on current and future systems.

The only thing that holds any validity is Carmack's favoritism of Intel and nVidia. I don't even trust the HL2 benchmarks yet; you want to know why? It's not the game that's going to be released in _ months!

Maxtor disgraces the six letters that make Matrox.
 
G

Guest

Guest
Yes, that makes sense, thanks for clarifying. I knew that ATI stuck to the DX9 API and that nVidia tried to change the DX9 code to suit their engine better, and that it didn't quite work out. But I thought that ATI had more than their name tagged on the box and that some tweaks might exist to make it perform better...
And I second you on the NV40 issue.
 
Exactly. Cg's adoption didn't really happen, so no benefit in the architecture. The only person who might have favoured Cg favours OpenGL even more. Ohh, the irony! :wink:

Personally, I'm going to pick whichever PCI-EX card works best for my needs (regardless of maker, although a new MATROX card would be nice :cool: ). If D]|[ or HL2 runs only 95% as well on one of those cards as on another, who cares? It should be more than good enough for something that is supposed to play on everything above a GF3/R8500 (GF4MX not included).

Now the real question: who will feel more compelled to buy a NEW card, the GF3 user with DX8 or the R8500 user with DX8.1 at his disposal? If anything, I think the optimizations will have the biggest impact there (BTW, those would be different optimizations from those for the FX [there are like 4 different code paths, IIRC]).
I wish I still had my older cards, just to see what one would see on them. I would like to see if the TNT or Rage Fury Pro would even load the game.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i>! - <font color=green>RED</font> <font color=red>GREEN</font> GA to SK :evil:
 

Vapor

Distinguished
Jun 11, 2001
2,206
0
19,780
"I wish I still had my older cards, just to see what one would see on them."
I know I'm going to try it on my GeForce2 GTS! Too bad I don't have anything older. I'll be shocked if it runs better than a PowerPoint presentation.

Maxtor disgraces the six letters that make Matrox.
 

Vapor

Distinguished
Jun 11, 2001
2,206
0
19,780
Anyway, my Ti4400 should handle the task pretty well--if I still have it by then. Wanna put it in the stables.

Maxtor disgraces the six letters that make Matrox.
 

pauldh

Illustrious
I still have a Rage Fury Pro in the box and will give it a try. Actually, I have an All-In-Wonder 128 Pro in this computer (my main home non-gaming machine), so that will be even easier to try.

ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9500 Pro, Santa Cruz, Antec 1000AMG, TruePower 430watt
 
Funny, those are two of the cards I had.

There's even a pic of the AIW in the THGC album.

Funny how others follow similar upgrade patterns. I moved to an AIW 8500DV and an R9000, and now an R9600 Pro. I went from the Rage to the R9600 Pro, though; the others are in my editing rig.
I see you appear to have gone from the Rage to the R9500 Pro.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i>! - <font color=green>RED</font> <font color=red>GREEN</font> GA to SK :evil:
 

pauldh

Illustrious
Yup, very similar. I went from an AIW Pro 8MB and Xpert 98@Play, to a Rage Fury Pro and AIW 128 Pro, to a Radeon 8500, and finally a Radeon 9500 Pro. I was tempted to do an AIW 8500DV too, but never upgraded the AIW 128 Pro because I stopped playing all games on my main home computer. That way I could still do basic video editing, watch TV, and keep the system from getting cluttered. I later bought a TV Wonder Pro card and also a Leadtek WinFast 2000 Deluxe(?) for video editing in another system. I'm a PC build addict. Luckily I don't smoke or drink, so I only have one habit that wastes my money. Well, two; home theater also. I would love an AIW 9600 Pro for that system, though. But this AIW 128 Pro will be used for as long as it works. I get 1300 3DMarks in 2001SE, but hey, that's plenty fast for my wife to play Solitaire or Tetris, which are about the only games on this thing.

ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9500 Pro, Santa Cruz, Antec 1000AMG, TruePower 430watt