Official FX 5900 Discussion Thread

flamethrower205

Illustrious
Jun 26, 2001
Ok, too many posts for people to read through: let's keep all the battles, flame wars, comments, and ideas in here. I sure do like the FX5900, and yes, my prophecy came true: the 5800 does well too with newer drivers.

Hilbert space is a big place.
 

CasualCat2001

Distinguished
Apr 16, 2003
looks impressive...too bad I can't spend that kind of money

My wife would probably leave me if I spent that much on a video card.

Another thing I noticed from the benchmarks is that I will definitely have to overclock my 9600pro now for Doom III seeing the framerates they got. That is of course if my card is ever released from customs :mad: ...

Overall though, good job nvidia, glad to see they didn't repeat their mistakes.

On a side note, I wonder what THG does with the various video cards after they are done benchmarking them? I wish they had a contest/drawing to give them away to their readers...
 

Sarke

Distinguished
May 4, 2003
I'm sure they keep the cards for themselves (bastards, wish I was a reviewer... I'd be getting paid to play Doom ]|[ on an NV35 right now). And they'd have to keep them anyway, because of new driver releases and new benchmarks...

and OC'ing. Btw, I know it's a little early, but how well do you guys think the NV35 will OC, given that it (supposedly) runs at around 50-60°C under heavy stress and is rated for use up to 100°C? Think it could come back up to the 500/1000 of the NV30?
 

GeneticWeapon

Splendid
Jan 13, 2003
You're not going to have to overclock your 9600 to play Doom3. The game will detect your card, like any other game, and set the details accordingly... it will be up to you which details you can turn up or down. Everyone will have to tweak the game to their liking.

Pictures of my PC & Me!<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
<font color=red>Melb_Angel</font color=red>=<font color=blue>The other white meat!</font color=blue>
 

Sarke

Distinguished
May 4, 2003
Here are a few of the articles on the new 5900u:

<A HREF="http://www.3dgpu.com/reviews/5900u.php" target="_new">3DGPU</A> (5 pages)
<A HREF="http://www.avault.com/hardware/getreview.asp?review=geforcefx5900ul" target="_new">Adrenaline Vault</A> (15 pages)
<A HREF="http://www.anandtech.com/video/showdoc.html?i=1821" target="_new">AnandTech</A> (30 pages, a bunch on IQ and drivers, and benchies of course)
<A HREF="http://www.bjorn3d.com/_preview.php?articleID=273" target="_new">Björn3d</A> (one looong page)
<A HREF="http://www.extremetech.com/article2/0,3973,1074270,00.asp" target="_new">ExtremeTech</A> (13 pages)
<A HREF="http://www.hardocp.com/article.html?art=NDcy" target="_new">HardOCP</A> (14 pages, also on the r9800pro 256mb)
<A HREF="http://www.hexus.net/review.php?review=554" target="_new">Hexus</A> (22 pages, half are benchmarks and some good pix too)
<A HREF="http://www.hothardware.com/hh_files/S&V/gffx5900u.shtml" target="_new">Hot Hardware</A> (8 pages, pix)
<A HREF="http://www.neoseeker.com/Articles/Hardware/Reviews/nvidiafx5900ultra/" target="_new">Neoseeker</A> (10 pages)
<A HREF="http://www.sudhian.com/showdocs.cfm?aid=378" target="_new">Sudhian Media</A> (10 pages)
<A HREF="http://www.tomshardware.com/graphic/20030512/index.html" target="_new">THG</A> (30 pages, but 20 are benchmarks)
<A HREF="http://www.xbitlabs.com/articles/video/display/geforcefx-5900ultra.html" target="_new">X-Bit Labs</A> (27 pages, about 15 are benchmarks and a few good ones on IQ)

Love it, but I'm not gonna buy it. I love it cuz it's gonna drop the price of what I want to buy!<P ID="edit"><FONT SIZE=-1><EM>Edited by Sarke on 05/12/03 04:32 PM.</EM></FONT></P>
 
Casual man, I feel your pain, as one great American once said (John Holmes, wasn't it?). Anywhoo, yeah, I mentioned it in my D3 post. I just hope drivers specific to the 9600P help in D3.

As for Tom's cards, it wouldn't surprise me if they found their way into a rig there, especially for future tests. Many reviewers/sites/etc. have to return the cards after the official bench, release, article, etc. I doubt that THG has to do that, but then again I might be wrong. They may also prefer to do that if they get the cards on 'loan', and then go out and buy a random 'retail' board for their long-term tests, to reflect what we mortals 'should' expect from a randomly chosen or sampled board.
Since there are supposed to be THG employees in our midst, maybe they can shed some light on what happens to them.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=green>RED</font color=green> <font color=red>GREEN</font color=red> :tongue: GA to SK
 
So you mean I shouldn't have sold the RageFury? I coulda played D]|[ on it? Dang, and I wanted to use the money for a Parakeet and some magic beans! :lol:

I can see it now: D3 running at 320x240x16-bit with nothing 'on', 7 fps. Hey, let's go 256 colour! Cool, 9 FPS! :eek:

Well, let's see if Ati can help their cards overcome "the way it's meant to be played". I sure hope so cause I'd like to see some things turned on.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=green>RED</font color=green> <font color=red>GREEN</font color=red> :tongue: GA to SK
 

GeneticWeapon

Splendid
Jan 13, 2003
This whole shindig might have something to do with Cat.3.3 not being released.

Pictures of my PC & Me!<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
<font color=red>Melb_Angel</font color=red>=<font color=blue>The other white meat!</font color=blue>
 

eden

Champion
Flamey my boy, I never doubted you and nVidia when it came to drivers heheh, but I always had my doubts about the FSAA improving, as its weakness seemed inherent to the architecture.

All I can say, though, is... amazing.
No one has commented in this thread in detail yet, so I will.

First of all, the cooler is somewhat different from the pics other sites had in their previews, unless I am wrong on that. Second, I wish THG would publish GPU temperature and power consumption charts. The card is likely still overly hot and power-hungry.
Glad to know it isn't a dustbuster anymore, though the fan itself looks awfully similar to the FX Flow. At least in the end nVidia had the guts to admit the NV30 wasn't successful. Not that they deserve much credit for it, though: since they're releasing a product that will overshadow the FX5800 and even replace it on the market, they can afford to poke fun at it without any problem.

We've yet to hear about the FP performance, which was supposedly doubled, but I am excited.
The Intellisample advancements show once more how nVidia can truly improve. I thought the NV3x series itself would have utterly horrible FSAA, but the drivers have done their job. If you ask me, though, there is no architectural enhancement in this domain; the entire extra performance comes from the bandwidth, which I am willing to bet is not even entirely used (consider having 10GB/s more and only achieving a moderate 30% maximum over your brother).

Detonator-XP: Now this is what power is all about. There is no denying it, nVidia has probably rehired the real programmers and pulled out a surprise boost significant enough to put the FX5800 Ultra in a totally different light. One has to wonder why they have not tried the FX5600 yet, though I am hoping it does well. They also mention 44.05 as the version; what about the 50.xx? Do you think there is that much more to be expected?

What surprised me most is that drivers actually removed the misery the NV30 had in FSAA and Aniso, and so well that it sometimes beats the competition by nearly 50%. I just don't know why it was so bad back then, yet a few months of programming yielded so much in return. The Pentium 4 could do the same if SSE2 were used that well, so I'm glad to see some real programming.
Regardless, I hope the Detonator FX does some justice to even the NV2x series.

Benchmark-wise, I'll go through in order:
Medium Quality: Ok, so who actually claimed all that time that ATi was best suited for Doom III? It doesn't make sense at all; the NV30s are so well coded here, they dominate.
High Quality: Ok, something got a bit weird here and the FX5900 Ultra lost. It is a bit odd that it loses this much performance compared to Medium.
The FX5600 Ultra even has a lovely time against the 9600PRO, which btw was running on a 3.06GHz Pentium 4 with i850, the second highest-performing chipset for that CPU, under Medium Detail. Can you imagine users on slower computers? I hope THG tests CPU scaling, FAST, because it leaves us with a dry mouth wondering how Doom III would run on 1.4GHz-2GHz machines without any word on CPU scaling. One thing is clear though: the DX9 hardware is doing its job for the FX5600 Ultra; I didn't expect it to shine this much over the Ti4200.
As for the FX5200 Ultra, DX9 for $79 works, but only for Dells and gamers who prefer Medium or Low quality; otherwise, imagine the High Quality crap.
Still, overall, the DOOM III results could've been better IMO. They're definitely optimized though, much more than the Alpha ever was. It just worries me sick how much the CPU will be the bottleneck.

I would, however, like to direct the viewer's attention to these two odd results: <A HREF="http://www.tomshardware.com/graphic/20030512/geforce_fx_5900-12.html" target="_new">EXHIBIT A</A>
In Exhibit A, we see the regular performance of the FX5900 Ultra under High Quality.
<A HREF="http://www.tomshardware.com/graphic/20030512/geforce_fx_5900-13.html" target="_new">EXHIBIT B</A>
In Exhibit B, we see the FSAA results for HIGH QUALITY, as labelled by THG.
The prosecution rests; the defendants must now answer this: how do you achieve higher performance with FSAA than without, if you claim the High Quality setting was used?

UT2003: Nothing new, except that the high res results of the FX5800 Ultra are simply jaw-dropping.

Serious Sam 2: Under 8X Aniso, the FX5800 Ultra rises to a point of unbelievable driver strength, managing to outdo the best of the competition by a clear 50%. Nothing short of amazing; I simply have a hard time believing it.
Under full FSAA+Aniso, it and the FX5900 Ultra continue to enjoy a rather healthy and lovable dinnertime eating the competition away.

Splinter Cell: And now we move on to what I like to call the hall of crap.
In comes the third proof I can offer, once more, that Splinter Cell is badly optimized: it doesn't take advantage of the pipelines, or even of IPC enhancements. It simply relies on clock speed, PERIOD. The FX5900 Ultra loses, that much is clear. This is demonstrated once more by the game's lack of optimization, and it proves that not everyone has learned from the Comanche 4 disaster. There are still monkeys out there, FIRE THEM.

While at first I was surprised by the 3DMark01 results, I was later convinced of the overall result because of the shader speed, which was once again rather odd. There is also still no explanation for the abnormally high Poly 8-Light test result, another test that doesn't exercise the card's architecture, only raw clock speed.

<b>And here comes what I am most skeptical about, and what may raise some eyebrows. A few days ago 3DMark03 was previewed with the FX5900 Ultra and scored way over 6000 pts. That was with the Detonator 50. What does this mean? Does this mean there is an extra 15% to be expected from the Detonator FX, on all FX cards, especially the FX5900 Ultra, which so far has NOT proven itself to its family?</b>

Now to analyze and review a few things -OBSERVATIONS-: (this list could get updated hehe)
-The FX5900 Ultra is a disappointment to its siblings. Why am I saying that?
At the moment, the FX5800 Ultra stands most of the time only 5-10% behind it. How does having much better clock performance and 10GB/s more bandwidth not help?
Which leads me to stay skeptical and await the Detonator 50s. BTW, Catalyst 3.4 on the new R9800 256MB is a failure, so I have to ask myself: what about the rest of the family? How can the 9700s gain much more from it, when they would then trample the 9800s, which barely hold a lead over them?
-The card's clock is still too high. While nVidia's drivers are excellent and the card is performing solidly, way beyond my expectations, if it were downclocked to 380MHz core and 320MHz memory it likely wouldn't be as competitive. It may have reached the point of equal FSAA and Aniso performance per clock, if not better. But standard performance? Doubtful.
-One has to wonder just how much slower the FX5900 Value is, and how much power it draws. I'd love a contender to the 9700PRO, with performance rivaling the FX5800 or slightly less, at such a price. But I am worried about the power consumption and heat above all, so THG had better try to get some info on this.
-Will we ever get to hear the new fan? Sound is subjective, as we know.
-The 9800PRO 256MB is the new wanker of the century, costing us Canucks a horrid $750 CDN for an extra, NON-ENHANCING 128MB which often even lowers performance due to higher memory latency.
-The price above all will make or break this NV35. I am hoping to god retailers will opt for lower prices, or that nVidia will release value-oriented versions which maintain a CLEAR lead over the 9700 non-PRO at least. After seeing the Doom III results, I am convinced I have to go for the 9700, or else I'll suffer like Ape, who had thought otherwise, will... :frown:
-The FX5800 Ultra uses 16GB/s of bandwidth and the same color compression, yet as we witnessed, it crushed the competition despite the usual claim that bandwidth is everything for image-quality enhancements. In contrast, the 9800 has 21.8GB/s and probably doesn't come close to using it all effectively, and the 5900 has 11GB/s more than its younger brother yet is still not 50% better, more like 20% at most. That is a bit sad, but like I said, drivers might change the situation even more, as if this weren't good enough already!
-From what I just read at [H]OCP, the FX5600, for the very first time, is beating the 9500PRO soundly, killing the 9600PRO (VERY DISAPPOINTING UNDERPERFORMANCE there), and contending far better than I ever imagined, especially since its DX9 support is supposedly crappier. Again I am confused by these late performance improvements and hope a future article devoted solely to the new drivers will clarify all the new FX cards' performance.
Well, that ends this inevitably long, opinionated post, which I know many view the same way I do.
This should be more than enough opinion to match the equally long posts Grape Ape or Twitchy-boy would write. :tongue:

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: <P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 05/12/03 09:00 PM.</EM></FONT></P>
 

Sarke

Distinguished
May 4, 2003
long read...

<i>-The FX5900 Ultra is a disappointment to its siblings. Why am I saying that?
At the moment, the FX5800 Ultra stands most of the time only 5-10% behind it. How does having much better clock performance and 10GB/s more bandwidth not help?</i>
I must correct you there: the FX5900U has a SLOWER clock than the FX5800U (slower memory too: DDR-I, not DDR-II), but it does have the extra 10GB/s of bandwidth. That's why we see the FX5800U beat the FX5900U in some tests.
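
Rough napkin math on where those bandwidth numbers come from (bandwidth = effective memory clock x bus width). The clocks and bus widths below are the commonly quoted specs, not figures from this thread, so treat them as my assumptions:

# back-of-envelope memory bandwidth (Python), using assumed specs
def bandwidth_gb_s(effective_mem_clock_mhz, bus_width_bits):
    # bytes per second = clock (Hz) * bus width (bytes); divide by 1e9 for GB/s
    return effective_mem_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(1000, 128))  # FX5800 Ultra: ~16.0 GB/s
print(bandwidth_gb_s(850, 256))   # FX5900 Ultra: ~27.2 GB/s
print(bandwidth_gb_s(680, 256))   # Radeon 9800 Pro: ~21.8 GB/s

So the 5900U gets its ~11GB/s extra from the wider 256-bit bus even though its memory is clocked lower, which squares with the 5800U (if I have the clocks right, 500/1000 vs 450/850) still winning the odd clock-bound test.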

And supposedly the nv35's architecture is way more efficient and runs much cooler than the nv30 (one of the reasons they don't need the jet turbine anymore).

I didn't notice the benchmark differences until you pointed them out (sharp eyes), and since they only occur at 1024x768 I would think it's just a driver inefficiency.
 

eden

Champion
Are you referring to the DOOM III results in 1024?
Since HardOCP has similar results to THG in DOOM III, I think it shows that THG just mislabelled the 4xAA results as High Quality rather than Medium.

As for what I said, no, I meant better clock performance, meaning better PER-CLOCK, not a higher clock speed.
I also did not say this just because the FX5900 lost a few times, but because OVERALL it is barely inching ahead, sometimes only by a small margin. Next to its predecessor it is a disappointment, period. Against the competition it is the new king, and by a serious jump if I may say so, far more than what most recent "new cards" manage. However, we should keep in mind that the FX5800 Ultra is not far behind and exhibits similar performance, so we shouldn't be blind to its existence and pretend the FX5900 did it all alone.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

Willamette_sucks

Distinguished
Feb 24, 2002
Eden, will you marry me?

Unless the Cat 3.4s kick ass A LOT, which I don't expect they will, I'm sad to say I will consider returning my 9800 Pro (I eat the cost of shipping) and getting an FX5900U.
That's only if the 5900U can actually be had for the expected $399, which I find simply amazing!
I figured it'd have a $499 price tag.

In the Serious Sam test, the ATI cards are using higher quality texture settings, which influences performance, esp. when AA/aniso is used.

Also, in Doom3 they run the ARB2 path. This is just confusing to me. nVidia cards get their 1337 optimized paths; I'm SURE id could make a kick-ass R300/R350 path!
COME ON!! WTF!
Course, nVidia said the 50.xx's will bring the ARB2 path to nearly the same level as their NV30/NV35 path, so maybe the optimizations wouldn't make such a big difference...

Another thing... I personally cannot tell a difference between 16x quality and 16x performance (and probably can't tell a difference up from 8x either) when playing games, therefore I ALWAYS use the performance mode.
The quality mode seems to hit ATI hard, but in Performance mode (comparable to nVidia's performance mode, not ultra performance mode) the gap is much smaller.
So we'll see what the Cat 3.4s do, how much a 5900 will cost and when I can get one, and then we'll see if a swap is worth it. I'll at least have to pay shipping to return mine, and if the 5900 doesn't come out soon enough, I'll probably have to eBay mine, in which case I'd take a big hit :(

9800 Pro TOTALLY KICKS MAJOR ASS THOUGH!!!
So maybe I'm just wanting UBER UBER PERFORMANCE over UBER PERFORMANCE too much.

I could probably easily get 50-100 dollars to do the 'upgrade' (kinda) in a few months or whenever it comes out though, so we'll see if it's worth it to me.

Boy, I really need a job though!

I'm going to hold off on getting aftermarket cooling for my 9800 because of the 5900. If I decide to do the switch, the cooling will do me no good and I will have lost 40 bux (or more).

I just don't get it though... I mean, 9800 is WAY bigger than 5900, by a whole 3900!
The 9800 should be way better, mathematically speaking.
Damn those confusing numbers!
lol.

Eden, please marry me!?

Oh wait, I'm already committed to Gen. Dang.
Oh well, I won't mind serving him beer all day long while he sits his fat ass on the couch and gets fatter.

Maybe we could have an affair though?

Shh! Don't let Gen hear though!

Long live ATI.
 

eden

Champion
Seriously though, are you saying nVidia's results in SS2 did not use max texture settings?!

Carmack, last I checked, used ARB2 just for R300 optimization. I am not sure though, but it is nonetheless weaker than I thought.

<i>I'm going to hold off on getting aftermarket cooling for my 9800 because of the 5900. If I decide to do the switch, the cooling will do me no good and I will have lost 40 bux (or more).</i>
I didn't get that.

Funny to see how even those who praised ATi to death are now quite thrilled at nVidia's new offering. Nice to see, but I hope ATi has awakened, as this is a clear example of them prematurely enjoying victory when they should've stayed more awake.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

eden

Champion
At least I'm back on the throne for the longest post, finally beating Twitch and Ape, hehe. :smile:

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

Sarke

Distinguished
May 4, 2003
It would be longer (and an easier read) if you left a line between paragraphs and made them shorter... or just used 50 small posts.
 

dhlucke

Polypheme
The reviews from today have almost too much information. What I really pulled from the reviews was this, as quoted from Anandtech:

<i>Those looking for the best Doom3 performance may want to wait until right after the game comes out to upgrade their graphics cards, with hopes that even faster (and cheaper) hardware will be out by then.</i>
Although it's kinda early to judge for sure, I think it's fair to say that anyone who wants to play Doom III at high quality will need to follow this advice. That's me, so I think I'll hold off on that big purchase until Doom III is out.

I'll just sit back and wait. November is the release date, right?

<font color=green>Everyone should be like the Dutch. They're perfect.</font color=green>
 

reever2

Distinguished
Apr 13, 2003
<i>Funny to see how even those who praised ATi to death are now quite thrilled at nVidia's new offering. Nice to see, but I hope ATi has awakened, as this is a clear example of them prematurely enjoying victory when they should've stayed more awake.</i>
Of course they are thrilled by nVidia's new offering; everything needs competition. And ATI prematurely enjoying victory? They have been enjoying sweet victory ever since the 9700 Pro came out, and they ARE awake. As has been noted (and as should have been obvious to a lot of people, since it's a no-brainer), ATI is coming out with another product beyond the 256MB R350, presumed by everyone to be the R350 on crack... I mean on 0.13. And ATI didn't release anything other than the 9800 during the 9700-5900 period because it simply wasn't necessary; not even Intel feels the need to bring out new products when they already have the lead.
 

RCPilot

Champion
Good post, Eden. You look at these cards deeply, a whole lot deeper than I do. Keep it up, it makes for a good read.

Like dhlucke said, if you want a card to play Doom 3 at high res, wait until the game is out. I'm going to take that bit of advice myself.

If it ain't broke, take it apart & see why not!
 

Guest

Guest
Doom III is of course still at an early stage, so the scores will be a lot higher when we actually see the retail version, I imagine. id Software is making it fully scalable to a range of configurations, much like Half-Life 2.

I'm getting the FX5900 once the price is reasonable.

--Why do blondes have bruises on their belly buttons? Because blonde GUYS can be dumb too! Ha ha ha ha ha ha ha!!!!!--
 

flamethrower205

Illustrious
Jun 26, 2001
It's so messed up: one can type something big like that, have it worded not terribly, and still convey lots of info, yet the same length written formally (and it will end up sounding the same) takes like 10x the time.

Hilbert space is a big place.
 

eden

Champion
Err, was that a compliment? I'm a bit light-headed at understanding sometimes. :tongue:

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

eden

Champion
The R9800PRO has not been a worthy performance jump. The performance increase is barely proportional to the clock increase, and as demonstrated, it sometimes even ties its younger brother, which is not a good look. It has improved color compression (6:1), and yet it falls short overall.
It simply lacks performance, and needs the drivers to push it, fast.

Not only that, but ATi then proceeded to actually play the performance-drop game, dumbing down their mainstream part to compete with the FX5600 Ultra, so that they can improve mainstream performance gradually from a lower level than the current one, since the competition offers such low performance (weird, huh?). They forgot, though, that the FX line will get better drivers, and the FX5600 has proven to dominate in DOOM III.

ATi was doing great at first and earned the right to enjoy their victory, but it has not even been a year since then, and simply releasing dumbed-down products and calling THAT competition does not help.


--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: <P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 05/12/03 11:01 PM.</EM></FONT></P>