New benchmarks for ATI/Nvidia

August 25, 2003 1:27:15 PM

Beyond3D has a new benchmark out, using Tomb Raider: Angel of Darkness. Since this is one of the first games to use DX9, they decided to test ATI's and Nvidia's newest video cards. The article can be found <A HREF="http://www.beyond3d.com/misc/traod_dx9perf/" target="_new"> here </A>.

It looks like 3DMark 2003 was right all along....


August 25, 2003 3:17:55 PM

Excellent article. It just proves what I have suspected all along...that the GeForce FX series is a big joke. They made them 'DX9' compliant...cuz they'll run DX9 stuff. The only problem is that they won't run it any faster than a three-toed sloth with Down syndrome. Perhaps id is just waiting until Nvidia can release a video card that can actually handle Doom3 before they release the game? I know Nvidia is working 'very closely' with id Software on it...perhaps they've realized that they don't have a card to handle it and have persuaded id to hold off on it a bit? Just speculation of course.

Anyways, if these benchmarks are any indication as to what to expect as normal for DX9 games...I'd say that Nvidia is about to go down in flames bigtime. I'll get the marshmallows....people who can't admit they are wrong WHEN THEY ARE WRONG friggen should die a horrible death....burn Nvidia burn.

Oh..I really liked my GF 2 GTS Ti card though...so I guess they were ok up until then. :smile:

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 25, 2003 3:34:15 PM

a few people have been saying this forever


we got flamed, we were called fanboys.


i SPIT in your face! go buy those sub par video cards! DONT listen to me!

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
August 25, 2003 3:35:01 PM

not directed at you TKS

just at those nvidiots who were too STUPID to see the truth of the situation

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
August 25, 2003 10:50:36 PM

Hey Phial,

I figured by now we'd have a bunch of posts from fanbois about how this was BS and whatnot. So far, the masses have been quieted. Interesting. But I guess when you see the clean decisiveness with which the FX 5XXX was dismissed by the ol' 9700 and 9800 Pros in the DX9 games in that review...whoever counters it will probably come off sounding like a moron-in-denial. It'd be like looking at a horse and calling it a cow. Or maybe sitting on the TV and watching the couch. Dunno...but I know whoever does try and counter this would have to have no brain.

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 26, 2003 1:51:17 AM

The truth hurts.
I imagine they are all at home crying........
August 26, 2003 2:12:18 AM

WHAT NOW NVIDIOTS???? HAHAHA!!!!!! TRUTH BE TOLD, GOD BE PRAISED!!!!!!!!

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
August 26, 2003 2:17:16 AM

Nvidia Drools while ATI Rules!


I love the smell of Napalm in the morning.



<b><font color=purple>Details, Details, Its all in the Details, If you need help, Don't leave out the Details.</font color=purple></b>
August 26, 2003 2:17:19 AM

Beyond3D has been complaining about weak shader performance in the FX line for months. Although I kinda think this is an extreme example, and shouldn't be taken as an end-all situation, it is definitely a kick in the nuts for Nvidia.

I <b>help</b> because <b>you</b> suck.
<b>Play Raven Shield</b>
August 26, 2003 2:23:31 AM

I was just waiting to see yer reaction to this one Phi, lol!

Man, this is just...... words cannot describe (and that's ignoring the runs where they lowered quality settings out of desperation, like the cheats) the slaughtering these cards took. This is manslaughter, no competition, far worse than ass-raping!

Amazing, and they used high-end hardware with the latest drivers from each!

I can't wait for Dave to come see this ROFL.

The only thing that still worries me about concluding as you guys did is the fact that the Gunmetal benchmark is DX9 through and through, saturated with it. So why is the FX series so good and ahead there? Same in Doom III. Is it because D3 is OGL?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
August 26, 2003 2:53:13 AM

Yeah, where's Dave when you need him??

As for Gmetal and D3 being OGL, hasn't nVidia always done well with OpenGL?

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
August 26, 2003 3:07:41 AM

it is an extreme example.. but one that will matter more as time goes on


Nvidia didn't have to rely on shader performance until now..

Gf4Ti series? pure raw fillrate and static T&L function power... just super-fast GeForce2s with mediocre pixel shader support

even up till now, no games have really used pixel shaders.. UT2003 maybe.. but water pixel shading and simple shadowing have been around for quite a while and seem to be among the simplest of shading routines..

i can't wait to see GFFX performance in HL2, which relies on shaders for MORE than just realistic water and reflections.

=)

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
August 26, 2003 3:11:43 AM

Quote:
I was just waiting to see yer reaction to this one Phi, lol!



well.. man ! lol

we've been saying this FOREVER. with proof more often than not, and STILL people would claim false facts

it's just frustrating sometimes, when you have the truth before you, trying to HELP people because these cards ain't cheap.. and yet some IDIOT will come in spreading false information just because they are brand-loyalist morons. it's like they feel they OWE their brand a favor or something. IT'S A VIDEO CARD COMPANY FFS. get over it! (yes i'm talking about video card fanboys, either side)

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
August 26, 2003 3:15:18 AM

FFS = For F*ck's sake? My sister says that all the time!

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
August 26, 2003 10:40:30 AM

see what?

oh.. THAT?!

uhm.. yeah.. i would be happy now i guess, but instead all i can think of is my gf which i miss so much .. so i'm not happy.. but.. i feel.. right.

doom3 is not out, so forget about those nv-cheated benches you have seen. :D  and yes, doom3 is opengl, and yes, in opengl you can code directly for the nv30 with nv30 extensions, and yes, that's the way doom3 can get acceptable performance on nvidia boards. there is no way with standard opengl to get anything good for doom3 on any nvidia card (carmack's .plan states that). but he can use the extension mechanism to code nv-proprietary stuff and voilà, the cards look like they have good hw, which is just way off.
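
For reference, here is a minimal sketch of the extension mechanism being described (the extension and entry-point names are real OpenGL; the surrounding renderer code is purely illustrative and assumes a current GL context on Windows):

    // Vendor-specific paths like the NV30 one are detected and plugged in the
    // same way as any other OpenGL extension: check the extension string, then
    // fetch the extra entry points at run time.
    #include <windows.h>
    #include <GL/gl.h>
    #include <cstring>

    // locally defined pointer type matching glProgramStringARB's signature
    typedef void (APIENTRY *ProgramStringARBFn)(GLenum, GLenum, GLsizei, const void*);

    bool HasExtension(const char* name) {
        const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return all && std::strstr(all, name) != 0;
    }

    // A renderer only takes the NV30 fast path if the driver advertises it:
    //   if (HasExtension("GL_NV_fragment_program")) { /* NV30-specific codepath */ }
    // Non-core entry points are resolved the same way, e.g.:
    //   ProgramStringARBFn programStringARB =
    //       (ProgramStringARBFn)wglGetProcAddress("glProgramStringARB");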


what you all want to hear:
i've said it right from the beginning! why don't you listen to me?!

"take a look around" - limp bizkit

www.google.com
August 26, 2003 10:57:23 AM

WOW! I can't believe it! The FX really loses face! W.H.Y?
August 26, 2003 11:09:43 AM

Ouch! Those figures have really gotta hurt Nvidia. I was expecting the settings to be equal, but even with more demanding settings enabled the FX cards get utterly whipped...Now where can I find the cheapest 9800Pro? PML!

By the way, there doesn't seem to be much performance difference between the 256MB 9800Pro and the 128MB 9700Pro...further proof that the extra 128MB isn't worth it???

<i>Mmmm Dawn AND Eve at the same time...Drroooooll
-------------------------------------------------
<b>XP2100+, 2x512Mb PC2700, ASUS A7N8X, PNY 64Mb Ti4200. :cool:
August 26, 2003 12:13:30 PM

Wow!

Great benchmarks!

This clearly points out that ATI has the best DX 9.0 support right now!

But I would have liked to see an image quality comparison at different settings. It would have been great to see the differences between ATI and nVidia.

But nVidia hasn't lost yet! They probably have a great chip coming that will push performance further... But so does ATI!

On the other hand, nVidia still owns the crown in the AMD chipset market!

--
Would you buy a GPS enabled soap bar?
August 26, 2003 1:45:13 PM

Now just wait till Nvidia "optimizes" the driver for this game. "cough, cough"

One point sticks out: PS 2.0 is disabled for the 5200 by default. Wonder why!!! :tongue:
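
For context, a DX9 title normally makes this kind of decision from the Direct3D caps at startup; a minimal sketch follows (the API calls are real Direct3D 9, the fallback logic is illustrative; and since the 5200 does report PS 2.0 in its caps, disabling it by default is presumably a game-side profile choice rather than a caps limit):

    // Illustrative sketch: deciding at startup whether to enable a PS 2.0 path
    // (assumes the Direct3D 9 SDK headers and libraries).
    #include <windows.h>
    #include <d3d9.h>

    bool SupportsPixelShader20(IDirect3D9* d3d) {
        D3DCAPS9 caps;
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return false;
        // Cards that only report PS 1.x get routed to a DX8-class shader path.
        return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
    }

    // Hypothetical usage:
    //   IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    //   bool usePS20 = (d3d != 0) && SupportsPixelShader20(d3d);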
August 26, 2003 2:10:43 PM

B3D is gonna put up a new review in a couple of hours that has an IQ comparison, can't wait...
August 26, 2003 4:29:36 PM

Assuming they even had to go for 16-bit cubemaps and whatnot to get decent performance, one must assume the quality is horrible.

On top of all this, even if the FX had better quality (say 32-bit FP, IF it actually made a visual difference), that simply wouldn't excuse slowing it down twofold.

It's a bigtime shader problem. I dunno if drivers can possibly fix this.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
August 26, 2003 4:38:42 PM

You are absolutely right on that. 16-bit cubemaps for Nvidia and 32-bit for ATI...I was looking at a comparison yesterday in the latest CPU magazine. I didn't even realize that Nvidia used 16-bit technology in this department. It amazed me that despite this advantage (it would take far less time to render scenes using 16-bit) Nvidia cannot overtake the 9800 Pro often. It seems that it should kick the crap outta it...but alas...it isn't so. This PROVES that ATI is generally better....I mean, if you look at this review...the facts of the matter (16 vs. 32) and the cheating...if you are still an Nvidia fan, you should have your head hit repeatedly with a tack hammer.

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 26, 2003 5:20:40 PM

32-bit/16-bit floating point doesn't matter much; only 12-bit fixed point is 2x as fast on the gfFX. the other two are both slow. 32-bit floating point has to access twice-as-big registers, and drops even more. but even the 16-bit FPU is rather slow compared to its fixed-point unit.. we can say the FPU runs at half the clock of the fixed-point unit.. so for DX9 games those ultra-high-clocked gfFX cards are effectively running at half speed.. and, what a wonder, they then end up merely comparable in performance :D 
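
To put rough numbers on the "twice as big registers" point, an illustrative back-of-the-envelope calculation (the byte sizes follow from the formats themselves; the per-fragment register budget is a made-up figure, not an NV30 spec):

    // Illustrative arithmetic only: the 256-byte budget is hypothetical;
    // the 2:1 storage ratio between FP32 and FP16 temporaries is the point.
    #include <cstdio>

    int main() {
        const int budgetBytes = 256;   // hypothetical per-fragment register file
        const int fp32Vec4 = 4 * 4;    // 4 components x 32-bit float = 16 bytes
        const int fp16Vec4 = 4 * 2;    // 4 components x 16-bit half  =  8 bytes
        std::printf("full-precision temporaries that fit: %d\n", budgetBytes / fp32Vec4);
        std::printf("half-precision temporaries that fit: %d\n", budgetBytes / fp16Vec4);
        return 0;
    }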

"take a look around" - limp bizkit

www.google.com
August 26, 2003 6:03:58 PM

I'm so upset, I just bought a 5900 Ultra, damn. I knew I should have upgraded my 14-inch monitor instead. Oh, by the way, if I trade my card down to an FX5600 will I be able to play Doom at average settings?

If he doesn't die, he'll get help!!!
August 26, 2003 8:02:09 PM

I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big. This could very well be some downtime for Nvidia unless the NV40 does miracles for them in the fall (it's scheduled for fall, isn't it?). Boy, I'm glad I purchased those cheapo Ti 4200 cards instead of paying $100 for each of the 3 FX 5600 cards I was considering. When I saw benchmarks of the FX5600 compared to the GeForce 4 Ti series I was REALLY disappointed. I would have been more than willing to double my purchase amount for an 80% (maybe even a 60%) boost in speed, but it just didn't look like it was really warranted because the FX 5600 didn't even touch those figures. I really expected Nvidia's successor to their middle-class card to really kick butt, but it sucked butt instead, since the good ol' Ti 4600 beat it in just about every benchmark in the Anandtech article.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 26, 2003 8:14:04 PM

I wonder if I should keep my ol' Radeon 8500LE running instead of a GF4 Ti. How does the R200 do with this type of game compared to an FX?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 26, 2003 9:39:17 PM

speechless

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 26, 2003 11:45:26 PM

I find it really aggravating when you guys put that humongous gap in your posts like squirtle just did.
I don't enjoy scrolling down a hundred times just to participate in a thread.

I help because you suck.
August 27, 2003 12:00:31 AM

Me neither, it's really getting old.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
August 27, 2003 1:52:44 AM

Yeah me too.

My official stance on this whole thing is: SCREW NVIDIA.

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
August 27, 2003 3:12:37 AM

Agreed. Heck, what goal does Speechless even have with his post?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
August 27, 2003 3:14:48 AM

Yeah, I agree. And gee, I thought that pulling the GeForce 4 MX stunt was really lame of Nvidia. Next thing you know, a year later Nvidia is still trying to get away with even more than before.

The only mainstream component that Nvidia ever made that rocked was the GeForce 2 MX. No, it was never the fastest card in town, but back in its day it could run the latest titles at good framerates.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 27, 2003 4:18:51 AM

Quote:
I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big.


they AREN'T. opengl itself runs very badly on nvidia cards as well; it uses about exactly the same features/hw as dx9.

BUT

opengl allows the addition of "vendor-specific extensions". and with those, they can add features. once you use them you can gain new performance, or features, or whatever. and in nvidia's case, you get access to the nv30. you use one of their extensions and you have to use about all of them, as they are all intertwined. then, for the first time, you can get good performance on the nv30. but this is stupid, as opengl and dx are there to unify hw.

nvidia messes opengl up all the time. it was the same with about every card from the gf1 up. they had RC, TS, VAR, NVFP, NVVP, and more, and they never cared to try to expose their hw to opengl in a good, well-designed way instead.

coding for nvidia is like downloading the intel processor specs. ugly, complicated, proprietary.

and with today's schedules for game devs, nobody can pay for an additional codepath programmed specifically for proprietary gf1+ hw, one for proprietary gf3+ hw, one for proprietary gf5+ hw. the exception is carmack, who takes the time to do that himself. but he doesn't like it either. he has said this will be the last time he optimizes for specific gpus.
after doom3, nvidia WILL have to make hw that simply runs WELL in standard situations, or they will fall.
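
To picture what "one extra codepath per hardware generation" costs an engine, here is a hypothetical sketch (the path names echo the back ends Carmack's .plan describes for Doom3; the extension checks and selection order are invented for illustration):

    // Hypothetical sketch of per-generation renderer back ends. Every
    // vendor-specific path below is extra code to write, test and maintain.
    #include <string>

    enum RenderPath { PATH_ARB, PATH_ARB2, PATH_NV10, PATH_NV20, PATH_NV30, PATH_R200 };

    // 'extensions' is the string returned by glGetString(GL_EXTENSIONS).
    RenderPath SelectPath(const std::string& extensions) {
        if (extensions.find("GL_NV_fragment_program") != std::string::npos)
            return PATH_NV30;    // FX-specific fast path
        if (extensions.find("GL_ARB_fragment_program") != std::string::npos)
            return PATH_ARB2;    // vendor-neutral DX9-class path (R300 runs this well)
        if (extensions.find("GL_ATI_fragment_shader") != std::string::npos)
            return PATH_R200;    // Radeon 8500-class path
        if (extensions.find("GL_NV_register_combiners2") != std::string::npos)
            return PATH_NV20;    // GeForce3/4 Ti-class path
        if (extensions.find("GL_NV_register_combiners") != std::string::npos)
            return PATH_NV10;    // GeForce 1/2-class path
        return PATH_ARB;         // lowest common denominator
    }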


oh, and currently i see tons of people complaining about how slow their 5200 is.. from 50fps on my radeon9700pro down to 2-3fps on their 5200.. ps2.0, and because of that the main feature of dx9, is about UNUSABLE on those cards (seen in those benches too, as it got disabled.. :D ).

nvidia messed up much worse than i thought.. i'm disappointed.

"take a look around" - limp bizkit

www.google.com
August 28, 2003 3:06:52 AM

I just found it interesting in the review how well the R9600Pro did. I mean, WTF, pretty impressive, and really a surprise, even to me (well, I have been under a rock for a week [Thanks Superman!]).
Yeah, it beat an R9500NP (big F'in deal), but against the FX5600/5800/5900? Weird! Seriously makes me wonder about 'the way it's meant to be played'. CG? Faged' aboudit!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil: 
August 28, 2003 3:22:35 AM

I see what you mean. It's astounding that one would be more futureproof with a stripped-down Radeon 9500 or 9600 than with Nvidia's beefy FX5900 Ultra. Do I sound delusional? An R9500 a better card than a 5900 Ultra? I can barely fathom that. This makes the Radeon 9500 look like a really attractive card at its low price, plus you can even softmod many of them to a 9700 or 9500 Pro. I'm thinking about sending back the 3 GF4 Ti 4200s I received in exchange for 9500s or 9700s.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 28, 2003 3:50:19 AM

Quote:
Quote:
I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big.

they AREN'T. opengl itself runs very badly on nvidia cards as well; it uses about exactly the same features/hw as dx9.

I believe the hundreds of benchmarks all over the web.

I help because you suck.
August 28, 2003 7:31:44 PM

Actually, Nvidia relies more on their own proprietary OpenGL extensions for performance boosts than ATI does. That should be a known fact across the board. Don't believe me? Try the <A HREF="http://OpenGL.org" target="_new">http://OpenGL.org</A> discussion boards. You'll find more people with Nvidia problems (see the 'hot' topics that have the icon of a folder on fire). Check <A HREF="http://www.opengl.org/discussion_boards/ubb/Forum1/HTML..." target="_new">here</A> and <A HREF="http://www.opengl.org/discussion_boards/ubb/Forum1/HTML..." target="_new">here</A> for specific posts. Might as well go <A HREF="http://www.opengl.org/discussion_boards/ubb/Forum1/HTML..." target="_new">to this one</A> and <A HREF="http://www.opengl.org/discussion_boards/ubb/Forum1/HTML..." target="_new"> this one</A> too. The fact is, on opengl.org's entire gaming discussion board, all the hot topics except one are about Nvidia cards not working with OpenGL. If that doesn't say something, I don't know what will.

And the interesting thing is that Nvidia has invested more time and money into OpenGL than ATI has, because they've been around a lot longer (as a large manufacturer) and therefore have more extensions of their own to use. I mean, their extension .pdf is over 1k pages and ATI's is only 500+, so that in itself should clue you in to how involved Nvidia is in OpenGL.

If you compare the number of extensions from <A HREF="http://developer.nvidia.com/attach/5439" target="_new">Nvidia</A> to <A HREF="http://ati.com/developer/atiopengl.pdf" target="_new">the ones from ATI</A>, you get 33 to 14 (not counting the 4 ATIX extensions from ATI, as those are only used experimentally by ATI). ATI even uses 3 of Nvidia's extensions. So an actual count of 33 to 17 (+4 experimental). What does this tell us? That Nvidia relies more on their own extension development and not on the accepted standard extensions. That's why IMHO DX9 is a better standard to benchmark in: Microsoft controls the standard and only THEY can make changes/optimizations...it isn't an open process that can be altered and 'optimized' by anyone. Granted, with OpenGL both Nvidia and ATI are optimizing for performance here. Exactly what each of those extensions does for each card I'm not sure...you'll have to get someone more technologically advanced in rendering/graphics/programming involved to give an opinion.

If you read Dave's post...this is exactly what he is saying...
Quote:
opengl allows the addition of "vendor-specific extensions". and with those, they can add features. once you use them you can gain new performance, or features, or whatever.

He goes on to say that with the advent of each new card Nvidia puts out, they simply add a new extension to optimize for that card. If you look at the revision history for the extensions in Nvidia's .pdf, you'll see this. This simple fact, that Nvidia is using their own extensions and revising them for each card, means they are setting themselves up for failure. To quote Dave again,
Quote:
and with today's schedules for game devs, nobody can pay for an additional codepath programmed specifically for proprietary gf1+ hw, one for proprietary gf3+ hw, one for proprietary gf5+ hw. the exception is carmack, who takes the time to do that himself. but he doesn't like it either. he has said this will be the last time he optimizes for specific gpus.

If Nvidia does not conform to OpenGL standards and rely less on their own extensions, they're going to run into games that aren't programmed for them (is this sounding like Tomb Raider??? eh? It should, because it wasn't programmed card-specific either...in fact, Nvidia was on the partner list along with ATI on the Tomb Raider website. And considering that the 'glow' effect is included in the game as a default setting and adds to the 'feel and experience' of the game, why benchmark anything without it, as in the Albatron 5900 Pro review using Tomb Raider? Furthermore, this makes me wonder what the heck Nvidia was thinking while the game was under development. These things point to the fact that this whole SNAFU is going to be quite common with new games utilizing DX9 or OpenGL and Nvidia cards...<b>be warned</b>...Nvidia needs to realize that THEY DON'T SET THE STANDARDS and that people DON'T have to conform to Nvidia).

Truth be told, my friends...ATI will always beat Nvidia in OpenGL. Why? Because it relies more on STANDARD EXTENSIONS and less on its own VENDOR-SPECIFIC ones. It has NOTHING to do with Dave or his knowledge of graphics standards..it's a known fact. And also because when you speak of OpenGL you should be speaking of ONLY OpenGL...not vendor-specific variations of it. Of course, this is in a perfect world where everything makes sense and no one cheats the system :tongue:
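
For what the vendor-neutral route actually looks like in code, a minimal sketch (the calls are the real ARB_fragment_program extension; it assumes an extension loader such as GLEW has already resolved the entry points, and the same sequence runs unchanged on R300 and NV30 drivers):

    // Minimal sketch: creating a fragment program through the standard ARB
    // extension, with no vendor-specific entry points involved.
    #include <GL/glew.h>
    #include <cstring>

    GLuint LoadFragmentProgram(const char* asmText) {
        GLuint prog = 0;
        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)std::strlen(asmText), asmText);
        return prog;   // identical call sequence on either vendor's driver
    }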


<font color=blue>I've got a better idea. Let's go play "swallow the stuff under the sink." </font color=blue>
<font color=green>Stewie Griffin</font color=green> from <i>The Family Guy</i>

TKS
August 28, 2003 7:41:16 PM

Glad to see the original thread revived as well.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 28, 2003 8:16:51 PM

TKS,
I like your post; you took your time and covered the points you thought needed to be covered.
We are likely to run around in circles on this subject, and I admit that I'm losing interest. So briefly, let me try to get <b>my own</b> point across.
Using vendor-specific extensions to program a game, or what have you, in the OpenGL API is completely normal. The more extensions a library contains, the better.
Yes, Nvidia has a lot more extensions under their belt; yes, they have been at it quite a while longer.
We agree.
Saying that developers have a harder time getting the OpenGL API to run on Nvidia hardware, as opposed to ATi's,
I disagree.
You took the time to point out some discussions where problems were occurring on Nvidia hardware, when you could have done the same where ATi is concerned. Problems are to be had when coding for either IHV....let's not be silly here.
ATi is currently working on their own library of extensions, just as Nvidia did.
Why?
Because they want their hardware to run the best it can.
Saying Nvidia is doomed as far as OpenGL is concerned because they rely on a lot of their own extensions is silly.
ATi does it too. And they're only going to do it more and more.
I feel like I am defending Nvidia here, but I'm really not.
I will likely buy ATi hardware for years to come. What I am trying to do is just be honest about the way things are going in the community.
Thanks again for a great post, TKS; you're pretty good at this stuff :) 

I help because you suck.
August 29, 2003 2:12:04 AM

I agree with what you said...we can go around in circles...I just hope Nvidia continues to develop openGL cuz they are some of the only ones that actually do. :p 

<font color=blue>I've got a better idea. Let's go play "swallow the stuff under the sink." </font color=blue>
<font color=green>Stewie Griffin</font color=green> from <i>The Family Guy</i>

TKS
August 29, 2003 3:24:21 AM

I agree, they have helped out a lot in the evolution of OpenGL.
Thanx:) 

I help because you suck.
August 29, 2003 9:24:23 AM

Quote:
I like your post, you took your time and covered the points you thought needed to be covered.

i liked it, too.
Quote:
We are likely to run around in circles on this subject, and I admit that I'm losing interest. So briefly, let me try to get my own point across

let's go :D 
Quote:
Using vendor-specific extensions to program a game, or what have you, in the OpenGL API is completely normal. The more extensions a library contains, the better.

not really. more != better. what's important is that the core, the unextended core, is still the main point of the api.
Quote:
Yes, Nvidia has a lot more extensions under their belt; yes, they have been at it quite a while longer.
We agree.

add me onto the list:D 
Quote:
Saying that developers have a harder time getting the OpenGL API to run on Nvidia hardware, as opposed to ATi's,
I disagree.

then you're wrong. read up on the extension documentation i kindly wrote up for you. pixel shading on nv hw was and is hell. other stuff, too..
it resulted in big copy-paste jobs from everyone who wanted to use the stuff.. i've worked with it. you CAN believe me :D  that's why cg IS there: because they weren't able to find a good solution without another layer of indirection.
Quote:
You took the time to point out some discussions where problems were occurring on Nvidia hardware, when you could have done the same where ATi is concerned. Problems are to be had when coding for either IHV....let's not be silly here.

actually, statistics work against you. sorry
Quote:
ATi is currently working on their own library of extensions, just as Nvidia did.
Why?

actually, no. they work ONLY on designing gl1.5, ARB_superbuffers, and gl2. and they do that to DROP their own extensions again. the only ati extensions for the r300 currently are float textures, and that's where superbuffers come in: they can drop them afterwards. in short, there will then be NO extensions anymore; the r300 will be gl1.5 + superbuffers compliant, and not anything more.
Quote:
Saying Nvidia is doomed as far as OpenGL is concerned because they rely on alot of their own extensions is silly.

it's not. working with that stuff is HELL. believe me. it IS. it is about IMPOSSIBLE for a game dev not to work with nvidia to code even simple stuff for their hw. nvidia "kindly" helps you code your stuff into your game. nice. but i don't want to be dependent on nvidia to code the graphics of my game..
Quote:
ATi does it too. And they're only going to do it more and more.

no. see 2 questions above..
Quote:
I feel like I am defending Nvidia here, but I'm really not.

sounds like it. actually, you just don't know the gamedev/programming side and its situation and history. that's why you defend the wrong things. you only know the marketing side of nvidia development
Quote:
I will likely buy ATi hardware for years to come. What I am trying to do is just be honest about the way things are going in the community.

then believe me. that's why i'm in this forum: to bring the development side a bit closer to you. nvidia is too overhyped.. nobody knows what they have done wrong behind the scenes. i know a bit of it.. which i try to show
Quote:
Thanks again for a great post, TKS; you're pretty good at this stuff :) 

yeah, he is. you, too.. you just have to learn much more..

"take a look around" - limp bizkit

www.google.com
August 29, 2003 5:12:38 PM

Oh oh.....
I found this <A HREF="http://www.driverheaven.net/index.php?action=view&artic..." target="_new"> post </A> over at DriverHeaven. Apparently a gamer emailed Gabe Newell and asked a question about the performance of FX cards and DX9. I'll let you read the <A HREF="http://www.3dgpu.com/modules/news/article.php?storyid=3..." target="_new"> question/answer </A> yourself.


Keep in mind that this is not completely official, and is most likely a rumor, but.....
August 29, 2003 5:35:59 PM

seen that, loved that :D 

no, really. all i want to see is that i was right in saying 3dmark03 WAS a good bench.. nvidia stated it's not, and i still believe it is. all the other demos i've seen so far perform similarly. doesn't matter what it is: a game, a demo, a bench. it's a 3d app and it uses the same 3d hw (if no cheat is there :D ). and the nv30+ cards suck for dx9. i mean, it's a hw-design question. they did it wrong. you can read up on beyond3d how the gfFX is designed internally, through analysis and tests. and it DOES run the dx9 parts at half the speed of the dx8 parts. that IS fact. so what do you expect? :D 

doom3 will be different. still, even carmack states the same: in the standard codepath, the r300 performs very well. the same path on the gfFX sucks. but the gfFX path specially optimized for it can get the nv30 back to about r300 performance.
but this is not possible in general for each and every app => each and every normal dx9 or opengl app will perform BADLY by default on gfFX cards.

biggest fun is the humus demo running at 50fps at normal settings on my 9700. it runs at 3-4 fps on a gfFX 5200 (ultra? dunno :D ). i mean.. OUCH! :D 

"take a look around" - limp bizkit

www.google.com
August 29, 2003 5:43:08 PM

Can I download that Humus demo dave? LINK ME!

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd