The FX's poor perf. again in DX9.0 @ Xbit

Xbit Labs does another <A HREF="http://www.xbitlabs.com/articles/editorial/display/hl2.html" target="_new">benchmark of the FX5600U/5900U and R9800P/R9600P</A> with their HL2 based benchmark.
Not much that's too surprising, just further 'evidence', and they're likely correct about the potential future and release date. The mention of the R8500 and UT2K3 was interesting, and nV likely WILL come out with drivers that don't significantly reduce IQ, but I wouldn't want to lose the effects even if the IQ is similar (not counting the effects), and considering they'll have a lot of time to refine those drivers, I'm not sure it will be big news by then.

<A HREF="http://www.xbitlabs.com/articles/editorial/display/hl2.html" target="_new">http://www.xbitlabs.com/articles/editorial/display/hl2.html</A>

Also, you guys might want to check out the 'extreme' overclocking done by Xbit a few days ago (I forgot to post it from work), on both the R9800 Pro and FX5900 Ultra. Pretty good article too.
<A HREF="http://www.xbitlabs.com/articles/video/display/extreme.html" target="_new">http://www.xbitlabs.com/articles/video/display/extreme.html</A>


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
Poor DX9 performance of FX cards is an established fact. It's not interesting anymore. It would be interesting to see DX10 performance of the FX cards...

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Yeah, sorry, Ape. We need some new news in the Graphics Community.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
There's a really interesting thread in the CPU forum at the moment, <b>"Tejas postponed to 2005, prescott >120W"</b>

BTW, this is not a useless popey thread

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 
Seriously, if you guys have more news, then please feel free to provide it; I don't see you doing that.
However, I think it's a nice addition to the nails in the current FX line's coffin.

Anywhoo, if you're worried about the CPUs, that's fine, but considering no one has a Z51 or is planning on getting one 'til prices drop dramatically, I don't see what the 1-2 years in the future is going to do in that respect. Likely the R500 will be out before the Tejas. And the Prescott consuming a lot of power is not surprising if you read the other article I posted a link to from Xbit: Intel expects a LOT of power consumption later on, including the massive power consumption of the CPUs and the 75W+ power consumption of PCI-Express graphics cards. But since you already read/knew that, the power requirements of the Prescott shouldn't come as a surprise. Right? :tongue:

Jeez, maybe I shoulda stuck with bumping and sexual organ jokes.

Seriously, you guys are both wet blankets. Watch out or you'll both get moldy.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 

One

Distinguished
Oct 13, 2003
36
0
18,530
You don't really believe that, do you? I mean, come on, the 9600 Pro is not going to beat the 5900. Of course Valve will make the pre-release benchmarks friendly toward ATI, but the 5600 beating the 5900 is ludicrous. I am shocked you put any faith in this benchmark.
 

speeduk

Distinguished
Feb 20, 2003
1,476
0
19,280
<i>"I mean come on the 9600 pro is not going to beat the 5900."</i>

It's not just in this benchmark that the 9600 Pro beats the 5900U. A few DX9 games out at the moment show the same thing.....

<A HREF="http://service.futuremark.com/compare?2k1=7000747" target="_new"> 3D-2001 </A>
<A HREF="http://service.futuremark.com/compare?2k3=1284380" target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
If the 5600 beats the 5900, that's because the 5600 is running in DX8.1 mode while the 5900 is running in DX9 mode.


P.S. Improve your knowledge before you make posts like this; it will only give you a bad name. :wink:
 

nickd

Distinguished
Oct 9, 2003
32
0
18,530
Surely, from your perspective, it would be worse if he 'improved his knowledge', as you put it, and *then* made posts like that.
 

The_MaguS

Distinguished
Mar 25, 2002
269
0
18,780
"Obviously, if a specially developed code-path for NVIDIA GeForce FX-series as well as seriously optimised drivers are used, the latest products from the legendary NVIDIA Corporation will also perform fast in the next generation DirectX 9.0 game."

Just thought I would point that out.


<font color=blue> There's no such thing as hell, but you can make it if you try.</font color=blue>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Ho hum, more nVidia bashing.

"The FX's poor perf. again in DX9.0"

I thought you were going to claim something about the AMD64 FX, not the GeForce FX.

<b>56K, slow and steady does not win the race on internet!</b>
 

One

Distinguished
Oct 13, 2003
36
0
18,530
"Unfortunately, we did not succeed in using a special rendering mode for NVIDIA GeForce FX graphics cards, but I am sure we will publish those results in future when we finally learn more about that preset."

"The present benchmarking experience was done by our Alexei Stepin using full-precision Pixel Shaders 2.0 in DirectX 9.0 mode. Later he will try to use lower-quality modes as well."
"Furthermore, you see that the GeForce FX 5600 Ultra performs even better than the more powerful GeForce FX 5900. This is result of a strange behaviour of the game that needs additional development."

That is quoted from the article that you clearly did not read. Before you say I am unknowledgeable, maybe you should consider reading the article first.

P.S. Thanks for making me look good :)

<P ID="edit"><FONT SIZE=-1><EM>Edited by one on 10/13/03 05:39 PM.</EM></FONT></P>
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
You guys who are pissing on this thread are [-peep-] lame. I guess talking about a graphics card review isn't as exciting as a thread celebrating a member getting a new title.

<b>I help because you suck</b>
 

dhlucke

Polypheme
I read that. Just to confirm, though: is the AnandTech review of the new Detonator drivers the ONLY positive review of the FX cards to date?

I am led to believe that the FX series is improving, but only if the games are optimized specifically for the Nvidia card. This pretty much means that HL2 and Doom3 will be "playable", but that everything else is going to blow.

Am I right? If so, I think we should continue to recommend ATI cards over FX cards until Nvidia comes out with a new line.

_________________________________________
<font color=red>12 bit... The way games are meant to be played!</font color=red>
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
The FX5200/5600 doesn't have a special code path in HL2; get that into your head first. Both cards are set to run in DX8.1 mode by default.
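For what it's worth, here's a minimal sketch of the sort of caps check a DX9 title can use to sort cards into render paths. This is purely illustrative, not Valve's actual detection code, and the helper name is made up; it just shows the D3D9 mechanism involved.

<pre>
#include <d3d9.h>

// Hypothetical helper, NOT Valve's code: take the full DX9 shader path only
// when the driver reports pixel shader 2.0 or better; anything lower
// (e.g. ps_1_4) falls back to a DX8.1-class path.
bool UseDX9Path(IDirect3D9* pD3D)
{
    D3DCAPS9 caps;
    if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false; // no HAL caps available, play it safe
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
</pre>

Note that a raw caps check like this would actually report the FX 5200/5600 as PS 2.0 parts, which is presumably why Valve drops them to the DX8.1 path by explicit card detection rather than by caps.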
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
HL2 sucks, and Gabe Newell is a gay mercenary for hire. It's damn funny that their game got ripped off after they teamed up with ATi to scar Nvidia the way they did.
ATi paid Valve a ton of cash to be chosen as 'the company' to run HL2. Shader Day wasn't at all a demonstration to show off or discuss the game; it was an evilly orchestrated PR move to kick Nvidia in the nuts publicly, under the guise of being a sneak peek at an upcoming title.
Did Nvidia deserve it?...lol, hell yea, but ATi is showing their true colors through all of this, and I see it.....I'm watching those bastards....those sonuvabitches.

<b>I help because you suck</b>
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
The main thing you forget is that this article, and any article benchmarking ANY FX series card in DX9, is bullsh!t. The reason is twofold. First and foremost, it is a DX9 <b>STANDARD</b> that video cards must render scenes at 24-bit floating-point precision. Nvidia dips down to 12-bit and 16-bit, and up to 32-bit, with their 'mixed mode' rendering. In other words, they take the DX9 code and change it so that their cards run with a loss of quality for an increase in speed.
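To make 'mixed mode' concrete, here's a hedged sketch of how a DX9 title can ask for reduced precision when compiling a pixel shader through D3DX. The file name and entry point below are hypothetical; the flag is the real mechanism that emits the _pp (partial precision) hints FX hardware runs at FP16.

<pre>
#include <d3dx9.h>

// Illustrative only: compile a ps_2_0 shader with partial precision forced.
// D3DXSHADER_PARTIALPRECISION emits _pp modifiers, letting NV3x hardware run
// at FP16 instead of the FP24 minimum that full DX9 precision calls for.
HRESULT CompilePartialPrecision(LPD3DXBUFFER* ppCode, LPD3DXBUFFER* ppErrors)
{
    return D3DXCompileShaderFromFile(
        "water.psh",                 // hypothetical shader file
        NULL, NULL,                  // no macros, no custom includes
        "main", "ps_2_0",            // hypothetical entry point, PS 2.0 target
        D3DXSHADER_PARTIALPRECISION, // force _pp on every instruction
        ppCode, ppErrors, NULL);     // compiled code out / error messages out
}
</pre>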

Secondly, on their website, Nvidia claims that their cards are PS 2.0+ compatible. They're not. They cannot do PS 2.0. Once again, the DX9 standard states that PS 2.0 will be on all video cards...and once again Nvidia ignored it to sacrifice quality for speed.

Bottom line: you cannot compare the FX with ATI's DX9 cards because it <b>IS NOT DX9 COMPATIBLE</b>. If Nvidia can sprout a sack and step up to the playing field, then the benchmarks will be valid. Until then, any DX9 benchmark should hold <b>NO</b> weight, because Nvidia is not DX9 compatible. I am sick and tired of explaining this over and over. Please, please, please tell all of your friends this.

----------
<b>I'm not normally a religious man, but if you're up there, save me, Superman! </b> <i>Homer Simpson</i>
 

eden

Champion
You didn't by any chance drink like 20 beers before you wrote that?

OR

Was it a good sarcastic moment brought to us by GW?

(2 Choices really, it's all logical :tongue: )

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>This just in, over 56 no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol:
 

eden

Champion
Am I the only one who noticed that the places where these cards started shining were also the games most frequently benchmarked?

SimCity still runs badly on the FXs. So does Tron.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>This just in, over 56 no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol:
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
Tron 2 runs badly on the FX? I thought the programmers made sure Nvidia cards would run well on it; they even put in the effort to make a special glow effect for Nvidia's cards. :frown:
 

eden

Champion
It's kinda funny, since the game is almost entirely made out of glow effects. :lol:

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>This just in, over 56 no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol: