The FINAL word on Nvidia's CRAP...

August 26, 2003 8:44:10 PM

First and foremost… I am not here to flame anyone specifically. If you feel flamed by this post, it is because of your own insecurity about yourself or about something you associate yourself with. That having been said, I will continue on with my post.

I recently converted to what some would call a FanATIc. I <i>used</i> to be a die-hard Nvidia fan. I owned nothing but Nvidia cards right up until the last few months; I have previously owned and loved a GF2 GTS Ti, a GF3 Ti 200 128MB, and a GF4 Ti 4400. My conversion wasn't a slow one… I pretty much switched after the 3DMark SNAFU and whatnot. However, after reading a review today, my conversion felt even more justified. To find out which review, read on.

Now, if <b>ANYONE</b> still has faith in Nvidia after the three things I am going to list here, <i>please</i>, <b>PLEASE</b> let me know. I will fly/drive/run/walk/trot/stroll/sprint/pedal/flat-out-run-like-hell over to your place of residence and repeatedly hit you on the head with a sledge hammer because you are a retard. Let the fun times begin...

1. Nvidia "cheated" the system in two places: UT2K3 and 3DMark2003. Optimizations or not, they increased the benchmark score while sacrificing image quality. It's like putting nitrous oxide on a Pinto and calling it a Ferrari. It cheapened the graphics card market as a whole and ruined my faith in them, along with that of countless others. I don't need to link any support for this here; there are enough links on this site to keep a Jimmy Dean distributor happy. Enough said.

2. I have a problem with Nvidia only having half of the pixel shaders per clock pass that ATI does. The <A HREF="http://nvidia.com/page/fx_5900.html" target="_new">FX 5900 Ultra and non-Ultra</A> page states 8 pixels per clock pass for a maximum of 16 textures per pixel. The ATI <A HREF="http://ati.com/products/radeon9800/radeon9800pro/specs...." target="_new">9800 Pro and non-Pro</A>, meanwhile, list 16 textures per pass for a maximum of 32 textures per pixel. Why buy a card for the same price with only half the performance in an area? If this isn't enough for you, I also have a problem with Nvidia using 16-bit floating-point colors vs. ATI using 32-bit. Now, maybe you like to buy cards for 400 bucks that will only perform at half the level of another card for the same price. Still not enough? Let me invite you to read further...

3. DX9-compliant games will be here soon. GF FX cards are DX9 compliant. ATI's top-end cards are also DX9 compliant. Wouldn't it be cool if we had a DX9 game right now to benchmark both of these top-end cards and see how they compare? <b>We do.</b> It has been done already. I'll give you a hint at what happened: see #2 of my post and ask yourself, is half better? Now go <A HREF="http://www.beyond3d.com/misc/traod_dx9perf/" target="_new">here</A> and read the review of the first DX9 benchmark that compares both ATI and Nvidia top-end cards. This was the nail in the coffin for me. It is the review I was talking about at the very beginning. One cannot deny that it at least shakes one's confidence in Nvidia. I can't take credit for finding this one; Sargeduck posted it <A HREF="http://forumz.tomshardware.com/hardware/modules.php?nam..." target="_new">here</A> first.


So what does all of this tell you? Are you still in Egypt floating in a basket down the same old river that Moses floated down? I hope not. I’ll leave you with a few quotes that pretty much sum up my feelings if you still choose to ignore these facts posted here:

<font color=blue>"Nothing in the world is more dangerous than sincere ignorance and conscientious stupidity."</font color=blue> <i>Dr. Martin Luther King, Jr.</i>

<font color=blue>“The dead and the stupid never change their opinions.”</font color=blue> <i> S. Gilmary Beagle</i>

Lastly, simply because I HAVE changed my mind and am STILL labeled a FanATIc:
<font color=blue>“A fanatic is one who can't change his mind and won't change the subject.”</font color=blue> <i>Winston Churchill</i>


------------------------------------------------------
<font color=green><b>And of course, my default quote that has nothing to do with squat:</b></font color=green>


<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS


August 26, 2003 9:17:02 PM

"You Freakin' fanATic! Nvidia rules because ATi sucks!!!" - The NVidiot

I've had it up to HERE with nVidia now. I saw those benchmark results, and I am extremely disappointed with nVidia! I feel like rebelling and just keeping my old 8500LE, the card I was going to replace with a Ti 4200. What do y'all think?

Oh yeah, almost forgot: Coolsquirtle & Spitfire - <b> YOU MUST UPDATE THE BUYERS GUIDE IMMEDIATELY AFTER THIS SHATTERING, MIND-WARPING NEWS! </b>

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 26, 2003 9:34:24 PM

well..... what can i say...... hmmmmm

you fanATic, YOU'RE ****ing WRONG, LONG LIVE NVIDIA~!!!!!! WAHAHAHAHAHHAHAHAHAHAHAHAH

(coolsquirtle goes temp insane)
DESTROY!!!!
AHHHHHHHHHHHH!!!!
(rips teddy bear)

okay, I'm calmer now........... I dunno what to say...... nVidia failed me once again...... I mean, the Dustbuster was good for throwing at noobs, but now you can't even throw the 5900 at noobs. URG!!!!! HOW COULD YOU, nVidia!!!
ARG!!!!
(trashes things around me)
AHHHHH!
(takes out old Radeon 7000 and cracks it)

I'll respond to this properly after I put some sense into myself

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 26, 2003 11:38:18 PM

The Tomb Raider AOD benchmark that B3D is currently using has a 'glow' post-processing effect in it that doesn't like the FX line of cards. CORE, the makers of the game, were not happy with the visual quality/performance hit of the effect, and are removing it from the game/benchmark.
With the glow effect removed, the Albatron FX5900 that B3D just reviewed hauls ass, and has 'virtually free AA' in the game. All of this backs up the theory I had, that you really shouldn't judge the FX's PS2.0/floating-point performance on just one DX9 title. I know it's hard not to when there are only a couple of DX9 titles out there to choose from, though.
You should really check out the new FX5900 review from B3D, it's excellent. Definitely how a review should be done:) 
<A HREF="http://www.beyond3d.com/reviews/albatron/gffx5900pv/" target="_new">http://www.beyond3d.com/reviews/albatron/gffx5900pv/</A>

I <b>help</b> because <b>you</b> suck.
August 27, 2003 12:17:11 AM

Well, you can try another demo to test the cards. I think XIII and TRON 2.0 use DX9, so see what the benchmarks are with those demos. I remember when 3dfx was the rule of thumb and paved the way for today's standards, but they loudmouthed themselves to death over their own stupidity. Now Nvidia owns the properties that 3dfx had. ATI used to be the laughing stock of the town for bad video products; when one company goes down, the brains of that company have to find jobs, and most of them (I'm guessing, mind you) went to ATI.

My question is: why does Nvidia need such a big-ass heatsink to cool their chips when ATI has such a small one? I have had both the FX5900 Ultra and the ATI 9800 Pro, and seeing them side by side, it is mind-boggling how different the heatsinks are on cards that do the same thing at basically the same speed. Now, of course, the ATI 256MB version has heatsinks on the memory chips that the 128MB doesn't have, and my understanding is that the 256MB boards don't actually need them, because ATI doesn't use DDR-II's faster and hotter memory on any retail cards. I read on another web page that the DDR-II versions of the cards were only given to testers for benchmarking. You can call ATI and ask them, or e-mail and see what they say. But with the heatsinks glued onto the memory, it's hard to check the ns rating.

I'm sure, as I'm sitting here, that Nvidia will come out on top after the storm has blown over. It is a cat and bird game, and Tweety Bird is in for a very long flight. The only way I will stop using Nvidia cards is if they fold. I don't see a road map for ATI to get ahead of Nvidia after they use up the current chip technology they have. Etc., etc....

TREAD SLOWLY IN DANGEROUS WATERS
August 27, 2003 12:29:57 AM

Both ATI and Nvidia have smaller-micron processes coming out in the upcoming months and years...

The thing I would be interested in seeing, though, is whether or not Nvidia can keep the cards cool as the process gets smaller... so far they haven't been able to, because they've had to push the clock speeds up just to compete. We all know what happens when the clocks get too high, right? We get a case heater that performs like crap. I would really like to see Nvidia kick it up a notch; better competition means better prices. But I see ATI coming out on top of this one. They've got the better shader performance and image quality, and if it comes down to just raw power like the 5900 Ultra sports, they could just throw out a card, slap a larger heatsink on it, kick up the clock speeds, and call it good, most likely soundly beating Nvidia in the process. But I think they are holding off in order to turn out a better product, or perhaps to put more money back into R&D. Mind you, be careful thinking that Nvidia will eventually pull this one out. I think ATI has something waiting in the wings; they haven't released anything 'new' for a little over a year now. I wonder what they have up their sleeve?

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 27, 2003 12:36:35 AM

First off, Sharkmeat, the two games you mentioned <b>do not have benchmarking utilities</b> built into them. Using <b>Fraps</b> is the only way to bench with those games, and that particular methodology leaves the reader at the mercy of the reviewer. Even then, a framerate recording and some miscellaneous screenshots for IQ are about all that can be achieved... IMO.
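For what it's worth, a frame counter like Fraps isn't doing anything magical: it timestamps every presented frame and averages the frame times, which is exactly why two reviewers can produce numbers that aren't comparable unless they publish the raw timings. A bare-bones sketch of the idea in C++ (renderFrame() here is a made-up stand-in for the game's own render/present call, not anything Fraps actually exposes):

#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical stand-in for the game rendering and presenting one frame.
static void renderFrame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

int main() {
    using clock = std::chrono::steady_clock;
    std::vector<double> frameTimes;                    // seconds per frame
    auto prev = clock::now();
    for (int i = 0; i < 120; ++i) {                    // sample 120 frames
        renderFrame();
        auto now = clock::now();
        frameTimes.push_back(std::chrono::duration<double>(now - prev).count());
        prev = now;
    }
    double total = 0.0;
    for (double t : frameTimes) total += t;
    std::printf("average fps over %zu frames: %.1f\n",
                frameTimes.size(), frameTimes.size() / total);
    return 0;
}

Min/max and per-frame numbers would be just as easy to log, which is exactly the kind of detail a Fraps-only review usually leaves out.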
The rest of your post had nothing to do with the topic we're discussing, so I won't even touch on it.
One thing I will comment on is your claim that you've owned both a 9800 Pro and an FX5900 Ultra. I find this highly unlikely, judging by the technical knowledge you've displayed here on this forum.
You could, of course, prove me wrong by producing some 3DMark scores from each of those cards with your name on them.
Please do.

I help because you suck.
August 27, 2003 12:43:42 AM

Two different people reviewed the two cards, I think... They should have some sort of uniformity between the tests, because I don't have enough time to actually compare the two right now. It is actually kind of confusing, because they didn't use the same nomenclature for the settings/benchmarks on the two cards. I'll attempt to decipher them later tomorrow or this weekend. (Didn't want you to think I was a lame a$$ that wouldn't check out someone's alternate info/opinion) :p 

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 27, 2003 12:51:30 AM

:smile:

I help because you suck.
August 27, 2003 12:52:14 AM

I'm not getting anything from that link... is it spelled correctly???

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
August 27, 2003 12:59:23 AM

Sorry....fixed:) 

I help because you suck.
August 27, 2003 1:27:05 AM

I just checked the benches. Amazing: even at that lower clock, and with the help of Cg and the glow effect turned off, it still rather sucks. Wouldncha say so?

Then again, the reviewer actually used a low-end Athlon XP 2000+ system with a KT266A chipset. Jeez, professional.

I dunno though, I still want a full test with all the cards, and with that glow effect off. (was it a good effect?)

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
August 27, 2003 1:34:49 AM

WTF! That can't be right, can it? I'll have to see more DX9 benchmarks before I can believe the FX series is REALLY THAT BAD!

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
August 27, 2003 2:44:44 AM

I love you GW :) 

suck on that u stupid fanATics LONG LIVE nVIDIA WAHAHAHAHHAHAHA

and GW, Sharkmeat is just a troll like ddrsdram, don't mind him^^, and he's not that bad, not nearly as bad as ddrsdram! lol

guys, seriously, you really think nVidia is run by crazy horny monkeys? (they're not Squaresoft..... stupid monkeys! you made Yuna wear hot pants!!) so they're not going to produce a product that'll get wasted by the 9800 Pro. They have computers too, and they know what benchmarking is. So

LONG LIVE nVIDIA WHAHAHHAHAHAHA

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 27, 2003 3:02:02 AM

The irony is, you make believe that you love nVidia just to joke around, but in reality your jokes are your opinions. You DO love nVidia; you're just using jokes to fool people into thinking they ARE jokes, for fun.

Damn.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
August 27, 2003 3:20:57 AM

shhhhhhhh.... they're not supposed to know that!!
if an ATi card comes with huge bundles and a bling-bling HSF then I'll switch to ATi, no questions asked ;) 

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 27, 2003 4:30:53 AM

Quote:
All of this backs up the theory I had, that you really shouldn't judge the FX's PS2.0/floating-point performance on just one DX9 title. I know it's hard not to when there are only a couple of DX9 titles out there to choose from, though.

there are enough dx9 apps out there you can test: from humus, from ati, from nvidia, from nitroGL, on delphi3d, etc. and they all show about the same thing. my radeon beats about every gfFX card in pixel shading, and in floating-point tasks in general (the glow, for example).

those ARE all important future tasks. image post-processing will be the most important one, and fast floating-point glow is just one way to measure how well a card performs at it. nv cards don't work well with it at all: on one side because their floating-point support runs at half the speed of the fixed-point support, on the other side because they left out very important floating-point units in the texture samplers for various texture formats.

that makes a lot of demos unrunnable on nv cards, and the rest just run slow.
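to make the glow thing concrete: it's an image post-process, and the intermediate buffers hold values above 1.0, which is why floating-point targets matter. a tiny cpu-side sketch of the three passes (threshold, blur, add back) -- sizes and names are made up, and real games run these passes on the gpu with floating-point render targets, which is exactly the part in question:

#include <algorithm>
#include <cstdio>
#include <vector>

// CPU sketch of a typical "glow" post-process on a single-channel float image:
// 1) bright-pass (keep only what is above a threshold), 2) blur it, 3) add it back.
using Image = std::vector<float>;   // W*H floats, row-major
const int W = 8, H = 8;

float at(const Image& img, int x, int y) {
    x = std::clamp(x, 0, W - 1);
    y = std::clamp(y, 0, H - 1);
    return img[y * W + x];
}

int main() {
    Image scene(W * H, 0.1f);        // dim background...
    scene[3 * W + 4] = 4.0f;         // ...with one very bright pixel (an HDR value > 1.0)

    // 1) bright-pass: keep only the part of each pixel above the threshold
    const float threshold = 1.0f;
    Image bright(W * H);
    for (int i = 0; i < W * H; ++i)
        bright[i] = std::max(scene[i] - threshold, 0.0f);

    // 2) cheap 3x3 box blur standing in for the usual separable gaussian
    Image blurred(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            float sum = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += at(bright, x + dx, y + dy);
            blurred[y * W + x] = sum / 9.0f;
        }

    // 3) add the blurred glow back on top of the original scene
    Image result(W * H);
    for (int i = 0; i < W * H; ++i)
        result[i] = scene[i] + blurred[i];

    std::printf("center pixel: scene=%.2f glow=%.2f result=%.2f\n",
                scene[3 * W + 4], blurred[3 * W + 4], result[3 * W + 4]);
    return 0;
}

run it and the bright pixel bleeds into its neighbours -- that bleed is the glow, and every one of those intermediate values wants more range and precision than 8-bit fixed point gives you.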

"take a look around" - limp bizkit

www.google.com
August 27, 2003 5:03:31 AM

well, just because the FX5900 sucks at PS2.0 doesn't mean it sucks overall

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 27, 2003 7:05:11 AM

Dave, a quote from John Carmack...
Quote:
Nvidia's openGL drivers are my gold standard

Curious how OpenGL apps run faster on NV hardware than on ATi's... a lot more solid too. Nvidia does OpenGL better than ATi; they've been doing it longer....

I help because you suck.
August 27, 2003 7:45:33 AM

yeah! don't know why?

<b><font color=red>
"Remain at stock speed"</b> - The Overclocker.
</font color=red>
August 27, 2003 7:50:56 AM

read my message again. in standard opengl it does NOT perform well. but they overloaded opengl with tons of their own extensions, with which you can regain the nv30 performance.

Quote:

GL_NV_light_max_exponent
GL_NV_vertex_array_range
GL_NV_register_combiners
GL_NV_fog_distance
GL_NV_texgen_emboss
GL_NV_blend_square
GL_NV_texture_env_combine4
GL_NV_fence
GL_NV_evaluators
GL_NV_packed_depth_stencil
GL_NV_register_combiners2
GL_NV_texture_compression_vtc
GL_NV_texture_rectangle
GL_NV_texture_shader
GL_NV_texture_shader2
GL_NV_vertex_array_range2
GL_NV_vertex_program
GL_NV_multisample_filter_hint
GL_NV_depth_clamp
GL_NV_occlusion_query
GL_NV_point_sprite
WGL_NV_render_depth_texture
WGL_NV_render_texture_rectangle
GL_NV_texture_shader3
GL_NV_vertex_program1_1
GL_NV_float_buffer
GL_NV_fragment_program
GL_NV_half_float
GL_NV_pixel_data_range
GL_NV_primitive_restart
GL_NV_texture_expand_normal
GL_NV_vertex_program2


list long enough?

if you don't use the above-listed extensions, nvidia performance is about as bad as in dx9. if you use some of them, you have to use lots of them. and if you know them, you know the faults in the nv30 hw (lack of real floating-point storage support, for example).
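for those wondering what 'using the extensions' looks like on the app side: you read the driver's extension string and pick a code path. a minimal sketch in C++, with the extension string hard-coded so it stands alone (a real app would get it from glGetString(GL_EXTENSIONS) once a context exists):

#include <cstdio>
#include <string>

// Returns true if `name` appears as a whole token in the space-separated extension string.
bool hasExtension(const std::string& extensions, const char* name) {
    std::string padded = " " + extensions + " ";
    std::string needle = std::string(" ") + name + " ";
    return padded.find(needle) != std::string::npos;
}

int main() {
    // Hard-coded sample; in a real app this comes from glGetString(GL_EXTENSIONS).
    std::string ext = "GL_ARB_multitexture GL_NV_fragment_program GL_NV_vertex_program2";

    if (hasExtension(ext, "GL_NV_fragment_program"))
        std::printf("use the NV-specific fragment path (what NV3x needs to run at full speed)\n");
    else if (hasExtension(ext, "GL_ARB_fragment_program"))
        std::printf("use the generic ARB fragment path\n");
    else
        std::printf("fall back to fixed-function / dx8-class shading\n");
    return 0;
}

that preference order is the whole argument: on the plain arb path the nv30 is slow, so the dev has to write and maintain the nv path on top of it.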

carmack does HIS job well. he optimizes for every sort of gpu.

he could just as well lean back and say "look, dude, i use opengl (or dx..), and i don't care if your card performs well in it. make it perform well if you want to sell your cards". he doesn't. most gamedevs do. result: most games will look rather bad on nv hw.


learn what you're talking about, dude. especially big-word-shouter GenericWeapon needs to.

don't forget, the doom3 demo was on a closed system which testers were not able to check, to see what was really in it and how it was set up. i remember having seen closed-system n64 consoles showing jurassic park dinosaurs in realtime.
wait for the game.

"take a look around" - limp bizkit

www.google.com
August 27, 2003 7:56:17 AM

Quote:
well, just because the FX5900 sucks at PS2.0 doesn't mean it sucks overall

there are two features in dx9 that make an immense difference for graphics compared to dx8:
ps2.0
floating-point render targets/textures

ps2.0 performs very badly on nvidia hw.
floating-point render targets/textures are only partially implemented on nvidia hw.

result: any real dx9 game will run into quite some problems. even the dx9 sdk has examples that are not runnable on gfFX cards, due to the lack of floating-point render targets.

if you want better graphics than on a gf1-gf4, then yes, ps2.0 and floating-point textures are essential. ati has seen this and dropped everything else out of their hw, emulating the rest with ps2.0 directly instead.
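as a sketch of how a dx9 game might probe for exactly those two features at startup -- caps query only, no device creation; it needs the d3d9 headers and d3d9.lib, and it's my own illustration, not code pulled from any shipping game:

#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("no d3d9 runtime\n"); return 1; }

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    bool ps20 = caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    // can we create a 16-bit-per-channel floating-point texture and render into it?
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET,
                                        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);
    bool fpTarget = SUCCEEDED(hr);

    std::printf("ps 2.0: %s, fp render target: %s\n",
                ps20 ? "yes" : "no", fpTarget ? "yes" : "no");
    std::printf("%s\n", (ps20 && fpTarget) ? "full dx9 path" : "fall back to a dx8-style path");

    d3d->Release();
    return 0;
}

if the second check fails, the 'dx9' path quietly degenerates into a dx8 one, which is the point i'm making about the gfFX.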

it does suck at everything where it differs from a gf4. and.. that sucks. because i can get a gf4 for free from a dude who buys a gfFX instead. yes, i don't get dx9, but no, he won't really get it either.. :D 

"take a look around" - limp bizkit

www.google.com
August 27, 2003 8:00:02 AM

Quote:
Nvidia's openGL drivers are my gold standard

old quote, from before the nv30, at the start of the r300

he thinks differently now, about the cheating scandal, about the lack of real gl support, and all that.

after he said that, ati worked their asses off to get much better. and they've done it well.


check humus' opengl page, download the demos and bench them, and then see if opengl apps really run faster on nv hw. they don't. none of the demos run faster on any gfFX than on an ati.

doom3 is optimized for EVERY card. that's why it performs at the limit of each card. the theoretical gfFX limit is high, but it's not usable in standard apps, not for standard gamedevs.

"take a look around" - limp bizkit

www.google.com
August 27, 2003 10:51:27 AM

Squirtle,

I believe the interesting part regarding the slowdown on the 5900 Ultra (in the benchmarks being discussed) is not necessarily poor raw performance. I would say it's due to the fact that they didn't plan ahead. They figured raw power would plow its way through the DX9 phase; what they didn't count on was programmers utilizing EVERY aspect of DX9 in their first game releases. I mean, who would have thought Tomb Raider would use the glow effect right out of the gate? And if using the 'glow' effect in Tomb Raider makes the 5900 suck, then what about DOOM3 or HL2? Will there be a glow effect in those? If so, will Nvidia users have to turn it off? Can they turn it off? Is it even worth it? From what I saw when I scanned over the two reviews, even with the glow disabled, the 5900 was only about on par at resolutions of 1024x768 or higher with 4X AA. And what happens when OpenGL games suck on the Nvidia cards as well? (Don't think they do? Ask davepermen.)

I dunno man, I'd still have to go with ATI, <i>even if only to be safe</i> here. Do people really want to be stuck with a 400-dollar card that <b>might</b> have performance gaps, or one that people <i>know</i> doesn't have any? Also, check out #2 of my original post... AND you have posted in <A HREF="http://forumz.tomshardware.com/hardware/modules.php?nam..." target="_new">this thread</A> as well, but there are some more posts, especially from davepermen, that discuss a lot of the issues around our topic.

<b>It would behoove everyone to check out <A HREF="http://forumz.tomshardware.com/hardware/modules.php?nam..." target="_new">the aforementioned thread</A></b>. My three treatises regarding this matter STILL stand. I have been shown nothing to counter them. As I said, I will look further into GW's post regarding the Albatron 5900 with the glow removed and perhaps revise my original discourse... but honestly, do you really want an effect that makes the game look better removed JUST so you can play it faster? Sacrifice quality for speed? Dunno, that just isn't an option for me. It would be like strapping a jet engine to a soap box derby car...

Makes me feel all warm and squishy inside... :smile:

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
August 27, 2003 11:37:27 AM

I seem to recall reading somewhere that Madden 2004 was written in DX9?? (not 100% on this) Can anyone confirm? If it is, then that could be benched too - it's not an intensely graphically demanding game, but it will produce some figures at least.

<i>Mmmm Dawn AND Eve at the same time...Drroooooll
-------------------------------------------------
<b>XP2100+, 2x512Mb PC2700, ASUS A7N8X, PNY 64Mb Ti4200. :cool:
August 27, 2003 12:22:37 PM

most games will get written with dx9, as it is not much different from dx8, so even a nearly finished dx8 game can easily be ported (and doesn't it sound better to say your game has dx9 support? :D ). that was different between dx7 and dx8, between which there were worlds of difference in the programming interface.

i don't think madden uses many dx9-only features, though.. i could be wrong. i'll check that.

"take a look around" - limp bizkit

www.google.com
August 27, 2003 1:25:40 PM

You're probably right about it not using that many features, as it runs very well on my system. I just could have sworn I'd heard it somewhere...

<i>Mmmm Dawn AND Eve at the same time...Drroooooll
-------------------------------------------------
<b>XP2100+, 2x512Mb PC2700, ASUS A7N8X, PNY 64Mb Ti4200. :cool:
August 27, 2003 1:36:42 PM

i've looked around a bit (on ea sports' own page, etc), and yes, it's indeed done with dx9. but there is no mention at all of it using any pixel shader or vertex shader or floating-point texture or anything. i guess they just used dx9, but the required features are at about a dx7 or dx8 level..

but on the other hand, i can imagine how madden would look with hdr image-based lighting, and shaders for the skin, the grass, the sky, and all.. it could look brilliant indeed.. hehe

"take a look around" - limp bizkit

www.google.com
August 27, 2003 2:11:15 PM

It still looks good now, but you're right, using the DX9 features would make it look rather s£xy indeed :smile:

<i>Mmmm Dawn AND Eve at the same time...Drroooooll
-------------------------------------------------
<b>XP2100+, 2x512Mb PC2700, ASUS A7N8X, PNY 64Mb Ti4200. :cool:
August 27, 2003 6:59:52 PM

Hey Dave, I bought a Radeon 9800 NP based on your recommendations... it hasn't gotten here yet, but it *should* be on the way... stupid Pricewatch vendors!

Though I now see the 5900 non-pro up on Pricewatch for the same price as a 9800 NP; regardless, I would rather have the Radeon.
The 9800 NP is cheap enough that I will buy two if I like the first one.. too bad SLI mode doesn't work with these!

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
August 27, 2003 7:21:09 PM

Why doesn't Nvidia double the performance of their crappy DX9 chip with SLI mode? I realize I say this in sarcasm, but they DID buy 3dfx after all, and should have direct access to the benefits of SLI technology. I would like to see Nvidia and ATi video cards based upon tile-based rendering as well. This technology is simply amazing! It did wonders for a non-T&L chip like the Kyro II, which held its own against a GeForce 2 GTS & Pro back in its day, even with *cough* SD-RAM. Maybe Dave, Phial, or Genetic Weapon would know why nobody uses tile-based rendering anymore - and what's become of the Kyro III?
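For anyone who didn't follow the Kyro line: the core idea of tile-based rendering is to bin the scene's geometry into small screen tiles first, then shade each tile entirely inside a small on-chip buffer, so overdrawn pixels never cost external memory bandwidth. A toy C++ sketch of just the binning step (made-up numbers, nothing to do with the Kyro's actual hardware, purely to show the concept):

#include <algorithm>
#include <cstdio>
#include <vector>

// Binning step of a tile-based renderer, massively simplified: each triangle is
// dropped into every 32x32 screen tile its bounding box overlaps; a real chip then
// rasterizes and shades one tile at a time from on-chip memory.
struct Tri { float minX, minY, maxX, maxY; };   // just a 2D bounding box here

const int SCREEN_W = 640, SCREEN_H = 480, TILE = 32;
const int TILES_X = (SCREEN_W + TILE - 1) / TILE;
const int TILES_Y = (SCREEN_H + TILE - 1) / TILE;

int main() {
    std::vector<Tri> tris = { {10, 10, 100, 80}, {300, 200, 340, 260} };
    std::vector<std::vector<int>> bins(TILES_X * TILES_Y);   // triangle indices per tile

    for (int i = 0; i < (int)tris.size(); ++i) {
        int tx0 = std::max(0, (int)(tris[i].minX) / TILE);
        int ty0 = std::max(0, (int)(tris[i].minY) / TILE);
        int tx1 = std::min(TILES_X - 1, (int)(tris[i].maxX) / TILE);
        int ty1 = std::min(TILES_Y - 1, (int)(tris[i].maxY) / TILE);
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                bins[ty * TILES_X + tx].push_back(i);
    }

    int busyTiles = 0;
    for (const auto& bin : bins)
        if (!bin.empty()) ++busyTiles;
    std::printf("%d of %d tiles have any work at all\n", busyTiles, TILES_X * TILES_Y);
    return 0;
}

The catch is that the whole scene has to be captured and sorted before a single pixel is drawn, which gets awkward as scenes and shader state balloon; that's often cited as a reason the big two stuck with brute-force immediate-mode renderers.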

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 27, 2003 9:31:26 PM

I asked this question a while back, and pretty much I assume everyone has implemented some form of tile-based rendering into their drivers, though of course it's not an entirely hardware tile-based rendering system.
I was a big fan of the Kyro II chipset after the demise of 3dfx... hoping to jump on that bandwagon rather than nvidia or ati.
Yet STMicroelectronics went under, or at least abandoned the Kyro III project.

The Kyro III would've been much more interesting than the 3dfx Rampage or the NV30 that was supposed to be such a big deal.

My only guess as to why everyone isn't using tile-based rendering, other than that it's difficult to implement, is that in the current race for supremacy ATI/NV are more concerned with creating something that will be faster than the competition in the next 6 months to a year than with something like the Kyro.
Just my 2 cents..

You raise a good point. I just ordered a Radeon 9800 and will likely be ordering another one for a different machine.
Since in reality there's little difference between the 5900 and the 9800, if one of them supported SLI mode I would have picked that one.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
August 28, 2003 12:01:02 AM

I find it interesting that Microsoft worked with nVidia on DX8, and when the GeForce4 Tis came out nothing could touch them. Then Microsoft worked with ATi on DX9, and lo and behold, the new Radeons are superior. My plan is to watch which manufacturer teams up with Microsoft on DX10 and buy a card from them. :cool:
August 28, 2003 12:05:05 AM

Just adding some fresh oil:

1. You can't tell the difference between full trilinear filtering and half trilinear filtering. ATI is just stupid not to do it as well.

2. ATI does not have 32-bit floating-point precision, only 24-bit. Nvidia leaves programmers the choice of 16- or 32-bit precision. Arguably that is the better approach, since a lot of situations don't need more than 16-bit precision (rough numbers in the sketch after this list).

3. Tomb Raider is sponsored by ATI...
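Quick back-of-the-envelope on point 2: for shader math, the mantissa width matters more than the headline bit count. FP16 is commonly s10e5 and FP32 is s23e8; treating the 24-bit format as s16e7 is an assumption in this sketch:

#include <cmath>
#include <cstdio>

// Relative precision of each shader float format, estimated from its mantissa width.
int main() {
    struct Fmt { const char* name; int mantissaBits; };
    const Fmt fmts[] = { {"FP16 (s10e5)", 10}, {"FP24 (s16e7)", 16}, {"FP32 (s23e8)", 23} };

    for (const Fmt& f : fmts) {
        double eps = std::pow(2.0, -f.mantissaBits);            // relative rounding step
        double digits = (f.mantissaBits + 1) * std::log10(2.0); // significant decimal digits
        std::printf("%-13s ~%2d significant bits, relative precision ~%.1e, ~%.1f decimal digits\n",
                    f.name, f.mantissaBits + 1, eps, digits);
    }
    return 0;
}

Roughly 3 decimal digits for FP16 versus 5 for FP24 and 7 for FP32, which is why long chains of dependent texture math are where the 16-bit path tends to show banding.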

:lol: 
August 28, 2003 12:56:02 AM

I am sure glad that I got a 9700 pro. It was cheap and it works great.

<font color=blue>"You know, that my backstab attack does double the damage. I can make an off button for him." </font color=blue> :cool:
August 28, 2003 1:47:25 AM

A 9700 Pro was cheap? What's your version of cheap, a week's wages?

If he doesn't die, he'll get help!!!
August 28, 2003 1:49:55 AM

1. You can distinguish between full and half trilinear.
2. Check the specs again. Nvidia drops to 16-bit when it decides it doesn't need the precision, in order to gain speed... which sacrifices quality.
3. Nvidia is a partner on Tomb Raider as well. Go to the official site and click on the partners tab. Must have been a kick to the nuts for them to score so crappy on something they sponsored.

Nvidia is on its way to becoming synonymous with 1) cheating 'optimizations', 2) sacrificing quality for speed, 3) crappy OpenGL support, and 4) crappy AF and AA.

Go back to the beginning and read my first post again. It is still standing... the truth can always stand alone.

<font color=blue>I've got a better idea. Let's go play "swallow the stuff under the sink." </font color=blue>
<font color=green>Stewie Griffin</font color=green> from <i>The Family Guy</i>

TKS
August 28, 2003 1:57:03 AM

I got a great deal. I got it like 4-5 months ago, and I got it for just over $200 (new). If that is a week's pay for you, I am sorry; you should look for a job that pays more.

<font color=blue>"You know, that my backstab attack does double the damage. I can make an off button for him." </font color=blue> :cool:
August 28, 2003 2:06:09 AM

I'm in Australia they retail from 600 - 800 at the moment.

If he doesn't die, he'll get help!!!
August 28, 2003 2:35:47 AM

That sucks for you. That seems kinda high for a video card that old. You can pick them up for around 250 here. I would hate to see how much a 5900 or 9800 is for you.

<font color=blue>"You know, that my backstab attack does double the damage. I can make an off button for him." </font color=blue> :cool:
August 28, 2003 2:52:00 AM

Hercules 5900 Ultra: $1100.
The 9800 Pro is not much more than the 9700 Pro, less than 100 bucks difference. They are slowly coming down; if ya go with a crapper brand you can get a 9800 NP for under 600.

Gotta remember our dollar is about 60-odd cents US, but the average wage is about 700 bucks here, and minimum wage is about 400.

If he doesn't die, he'll get help!!!
August 28, 2003 3:41:45 AM

Quote:
read my message again. in standard opengl it does NOT perform well. but they overloaded opengl with tons of their own extensions, with which you can regain the nv30 performance.


In reply to:
--------------------------------------------------------------------------------


GL_NV_light_max_exponent
GL_NV_vertex_array_range
GL_NV_register_combiners
GL_NV_fog_distance
GL_NV_texgen_emboss
GL_NV_blend_square
GL_NV_texture_env_combine4
GL_NV_fence
GL_NV_evaluators
GL_NV_packed_depth_stencil
GL_NV_register_combiners2
GL_NV_texture_compression_vtc
GL_NV_texture_rectangle
GL_NV_texture_shader
GL_NV_texture_shader2
GL_NV_vertex_array_range2
GL_NV_vertex_program
GL_NV_multisample_filter_hint
GL_NV_depth_clamp
GL_NV_occlusion_query
GL_NV_point_sprite
WGL_NV_render_depth_texture
WGL_NV_render_texture_rectangle
GL_NV_texture_shader3
GL_NV_vertex_program1_1
GL_NV_float_buffer
GL_NV_fragment_program
GL_NV_half_float
GL_NV_pixel_data_range
GL_NV_primitive_restart
GL_NV_texture_expand_normal
GL_NV_vertex_program2




--------------------------------------------------------------------------------


list long enough?

if you don't use the above-listed extensions, nvidia performance is about as bad as in dx9. if you use some of them, you have to use lots of them. and if you know them, you know the faults in the nv30 hw (lack of real floating-point storage support, for example).

Nvidia loaded OpenGL with a bunch of extensions, you say?
Lol.... as if ATi doesn't have their own list of extensions for devs to work with.
Here ya go!

GL_ARB_transpose_matrix
GL_ARB_vertex_blend
GL_ARB_vertex_buffer_object
GL_ARB_vertex_program
GL_ARB_window_pos
<b>GL_ATI_clip_volume_hint</b>
<b>GL_ATI_compiled_vertex_array</b>
GL_EXT_draw_range_elements
<b>GL_ATI_multi_draw_arrays</b>
GL_EXT_rescale_normal
GL_EXT_vertex_array
<b>GL_ATI_vertex_shader</b>
<b>GL_ATI_element_array</b>
<b>GL_ATI_map_object_buffer</b>
<b>GL_ATI_pn_triangles</b>
<b>GL_ATI_vertex_array_object</b>
<b>GL_ATI_vertex_attrib_array_object</b>
<b>GL_ATI_vertex_streams</b>


GL_ARB_multisample
<b>GL_ATI_blend_color</b>
<b>GL_ATI_blend_func_separate</b>
GL_EXT_blend_minmax
GL_EXT_blend_subtract
<b>GL_ATI_fog_coord</b>
GL_EXT_secondary_color
GL_EXT_separate_specular_color
GL_EXT_stencil_wrap
<b>GL_ATI_separate_stencil</b>
GL_NV_blend_square <b><----They're using Nvidia code?...Oh No!!</b>

GL_ARB_shadow
GL_ARB_shadow_ambient
GL_ARB_depth_texture
GL_ARB_fragment_program
GL_ARB_multitexture
GL_ARB_texture_border_clamp
GL_ARB_texture_compression
GL_ARB_texture_cube_map
GL_ARB_texture_env_add
GL_ARB_texture_env_combine
GL_ARB_texture_env_crossbar
GL_ARB_texture_env_dot3
GL_ARB_texture_mirrored_repeat
<b>GL_ATI_texgen_reflection</b>
GL_EXT_texture_compression_s3tc
<b>GL_ATI_texture_edge_clamp</b>
GL_EXT_texture_filter_anisotropic
<b>GL_ATI_texture_lod_bias</b>
GL_EXT_texture_object
GL_EXT_texture_rectangle
GL_EXT_texture3D
<b>GL_ATI_draw_buffers</b>
<b>GL_ATI_envmap_bumpmap</b>
<b>GL_ATI_fragment_shader</b>
<b>GL_ATI_texture_env_combine3</b>
<b>GL_ATI_texture_float</b>
<b>GL_ATI_texture_mirror_once</b>
<b>GL_ATIX_texture_env_route</b>
GL_SGIS_generate_mipmap
GL_SGIS_texture_lod
GL_NV_texgen_reflection <b><----Oh No!</b>
GL_S3_S3TC

GL_EXT_packed_pixels
GL_EXT_abgr
GL_EXT_bgra
Miscellaneous
GL_ARB_point_parameters
GL_NV_occlusion_query
GL_HP_occlusion_test
GL_SGI_color_matrix
Windowing / WGL
GL_WIN_swap_hint
WGL_ARB_extension_string
WGL_ARB_make_current_read
WGL_ARB_pbuffer
<b>WGL_ATI_pixel_format</b>
WGL_ARB_render_texture
<b>WGL_ATI_pixel_format_float</b>
WGL_EXT_swap_control

So I suppose this list is perfectly OK with you, Dave Peppermint?
The truth of it, hotshot, is that OpenGL is about being open to the community: creating libraries of extensions, making development easier and less troublesome, and <b>getting the most out of each IHV's hardware</b>. You're making it seem as though Nvidia is burdening the developer community and polluting OpenGL as we know it.
Save me your rhetoric, <b>please</b>.


I have to go now and spend some quality time with my girlfriend <wink><wink>

You're not as smart as you seemed at first, Dave......

I help because you suck.
August 28, 2003 4:13:04 AM

ding ding

GW VS DaveP

bets starting at $10

ding ding

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
August 28, 2003 4:35:09 AM

Boy, I'm going to enjoy Daveperman's reply to this. :lol: 

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 28, 2003 5:14:08 AM

<b>Here's a snippet of game code I pulled; it seems to be full of ATi extensions!.... Oh no!</b>

EXT_texture_env_combine3, ATIX_texture_env_route, EXT_texture_env_dot3, ATI_texture_mirror_once, ARB_texture_cube_map, and EXT_texture3D.
<b>ATi must have problems with real, pure OpenGL! (Oh my)</b>
Please, nobody be fooled by Dave's BS. Vendor extensions that optimize code for one particular IHV are as normal and as pure as OpenGL itself.




I help because you suck.
August 28, 2003 7:35:18 AM

I won



I help because you suck.
August 28, 2003 8:10:19 AM

Up late too?

I can't seem to get a lick of sleep tonight no matter how hard I try :frown: . It's probably because I'm excited about school starting tomorrow.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 28, 2003 6:56:11 PM

YAY~!!!

long live nVIDIA!!!! YAY!! (not really)

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF