The FINAL word on Nvidia's CRAP...

TKS

Distinguished
Mar 4, 2003
747
0
18,980
First and foremost…I am not here to flame anyone specifically. If you feel flamed by this post, it is because of your own insecurity about yourself or something you associate yourself with. That said, I will continue on with my post.

I recently converted to what some would call a FanATIc. I <i>used</i> to be a die-hard Nvidia fan. I owned nothing but Nvidia cards clear up until the last few months. Previously I have owned and loved a GF2 GTS Ti, a GF3 Ti 200 128MB, and a GF4 Ti 4400. My conversion wasn't a slow one…I pretty much switched after the 3DMark SNAFU and whatnot. However, after reading a review today, my conversion was even more justified. To find out which review, read on.

Now, if <b>ANYONE</b> still has faith in Nvidia after the three things I am going to list here, <i>please</i>, <b>PLEASE</b> let me know. I will fly/drive/run/walk/trot/stroll/sprint/pedal/flat-out-run-like-hell over to your place of residence and repeatedly hit you on the head with a sledge hammer because you are a retard. Let the fun times begin...

1. Nvidia "cheated" the system in two places: UT2K3 and 3DMark03. Optimizations or not, they increased the benchmark score while sacrificing image quality. It's like putting nitrous oxide on a Pinto and calling it a Ferrari. It cheapened the graphics card market as a whole and ruined my faith in them, along with that of countless others. I don't need to link any support for this here…there are enough links on this site to keep a Jimmy Dean distributor happy. Enough said.

2. I have a problem with Nvidia only offering half of what ATI does per pass. The <A HREF="http://nvidia.com/page/fx_5900.html" target="_new">FX 5900 Ultra and non-Ultra</A> page states 8 pixels per clock for a maximum of 16 textures per pixel. The ATI <A HREF="http://ati.com/products/radeon9800/radeon9800pro/specs.html" target="_new">9800 Pro and non-Pro</A>, meanwhile, list 16 textures per pass for a maximum of 32 textures per pixel. Why buy a card for the same price with only half the throughput in that area? If this isn't enough for you, I also have a problem with Nvidia dropping to 16-bit floating-point precision in its shaders while ATI runs everything at its full 24-bit precision throughout. Now, maybe you like to buy cards for 400 bucks that will only perform at half the level of another card for the same price. Still not enough? Let me invite you to read further...
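Just to make the per-pass arithmetic concrete, here is a rough back-of-the-envelope sketch. The per-pass numbers are the ones quoted from the spec pages above; the core clocks are the approximate retail figures for these boards, and real-world throughput obviously depends on drivers, precision, and the workload rather than on these two numbers alone.

# Back-of-the-envelope look at the per-pass figures quoted above.
# Clock speeds are approximate retail core clocks; treat the output as
# a theoretical ceiling, not a benchmark.

cards = {
    "GeForce FX 5900 Ultra": {"per_pass": 8,  "core_mhz": 450},
    "Radeon 9800 Pro":       {"per_pass": 16, "core_mhz": 380},
}

for name, spec in cards.items():
    per_second = spec["per_pass"] * spec["core_mhz"] * 1_000_000
    print(f"{name}: {spec['per_pass']} per pass x {spec['core_mhz']} MHz "
          f"= {per_second / 1e9:.2f} billion per second (theoretical)")

Even granting Nvidia its clock-speed advantage, twice the work per pass is hard to make up.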

3. DX9-compliant games will be here soon. GeForce FX cards are DX9 compliant. ATI's top-end cards are also DX9 compliant. Wouldn't it be cool if we had a DX9 game right now to benchmark both of these top-end cards and see how they compare? <b>We do.</b> It was done already. I'll give you a hint at what happened…see #2 of my post and ask yourself this question: is half better? Now go <A HREF="http://www.beyond3d.com/misc/traod_dx9perf/" target="_new">here</A> and read the review of the first DX9 benchmark that compares both ATI's and Nvidia's top-end cards. This was the nail in the coffin for me. It is the review I was talking about at the very beginning. One cannot deny that it at least shakes one's confidence in Nvidia. I can't take credit for finding this one…Sargeduck posted it <A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=364695#364695" target="_new">here</A> first.


So what does all of this tell you? Are you still in Egypt floating in a basket down the same old river that Moses floated down? I hope not. I’ll leave you with a few quotes that pretty much sum up my feelings if you still choose to ignore these facts posted here:

<font color=blue>"Nothing in the world is more dangerous than sincere ignorance and conscientious stupidity."</font color=blue> <i>Dr. Martin Luther King, Jr.</i>

<font color=blue>“The dead and the stupid never change their opinions.”</font color=blue> <i> S. Gilmary Beagle</i>

Lastly, simply due to the fact that I HAVE changed my mind and am STILL labeled a FanATIc:
<font color=blue>“A fanatic is one who can't change his mind and won't change the subject.”</font color=blue> <i>Winston Churchill</i>


------------------------------------------------------
<font color=green><b>And of course, my default quote that has nothing to do with squat:</b></font color=green>


<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
"You Freakin' fanATic! Nvidia rules because ATi sucks!!!" - The NVidiot

I've had it up to HERE with nVidia now. I saw those benchmark results, and I am extremely disappointed with nVidia! I feel like rebelling and just continuing to use my old 8500LE, the card I was going to replace with a Ti 4200. What do y'all think?

Oh yeah, almost forgot: Coolsquirtle & Spitfire - <b> YOU MUST UPDATE THE BUYERS GUIDE IMMEDIATELY AFTER THIS SHATTERING, MIND-WARPING NEWS! </b>

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
well..... what can i say...... hmmmmm

you fanATic, YOU'RE ****ing WRONG, LONG LIVE NVIDIA~!!!!!! WAHAHAHAHAHHAHAHAHAHAHAHAH

(coolsquirtle goes temp insane)
DESTROY!!!!
AHHHHHHHHHHHH!!!!
(rips teddy bear)

okay, i'm calmer now........... i dunno what to say......nVidia failed me once again......i mean, the Dustbuster was good for throwing at noobs, but now you can't even throw the 5900 at noobs. URG!!!!! HOW COULD YOU, nVidia!!!
ARG!!!!
(trashes thing around me)
AHHHHH!
(takes out old Radeon 7000 and cracks it)

i'll respond to this properly after i knock some sense into myself

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

speeduk

Distinguished
Feb 20, 2003
1,476
0
19,280
Hehe, my ONE YEAR OLD 9700 Pro is outperforming the top Nvidia cards, ROFL. I need more DX9 benches to reach an unbiased conclusion. Still sweet though!

<A HREF="http://service.futuremark.com/compare?2k1=6884109" target="_new"> MY RIG </A>
<A HREF="http://service.futuremark.com/compare?2k3=1150155" target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
The Tomb Raider AOD benchmark that B3D is currently using has a 'glow' post-processing effect in it that doesn't like the FX line of cards. Core, the makers of the game, were not happy with the visual quality/performance hit of the effect and are removing it from the game/benchmark.
With the glow effect removed, the Albatron FX5900 that B3D just reviewed hauls ass and has 'virtually free AA' in the game. All of this backs up the theory I had that you really shouldn't judge the FX's PS2.0/floating-point performance on just one DX9 title. I know it's hard not to when there are only a couple of DX9 titles out there to choose from, though.
You should really check out the new FX5900 review from B3D, it's excellent. Definitely how a review should be done :)
<A HREF="http://www.beyond3d.com/reviews/albatron/gffx5900pv/" target="_new">http://www.beyond3d.com/reviews/albatron/gffx5900pv/</A>
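For anyone wondering what a 'glow' post-process actually does, here is a toy CPU-side sketch of the usual idea: keep the bright parts of the frame, blur them, and add them back over the image. It's purely illustrative, written in Python/NumPy against a plain array standing in for the framebuffer; a real game does these steps on the GPU against an off-screen render target, preferably a floating-point one, which is exactly where the FX line takes its hit.

import numpy as np

def glow(frame, threshold=0.8, blur_passes=4, intensity=0.6):
    """Toy CPU version of a glow post-process.
    frame: HxWx3 float array in [0, 1], standing in for the rendered scene."""
    # 1. Keep only the bright parts of the image.
    bright = np.where(frame > threshold, frame, 0.0)
    # 2. Blur them with a cheap repeated box filter.
    blurred = bright.copy()
    for _ in range(blur_passes):
        blurred = (blurred
                   + np.roll(blurred, 1, axis=0) + np.roll(blurred, -1, axis=0)
                   + np.roll(blurred, 1, axis=1) + np.roll(blurred, -1, axis=1)) / 5.0
    # 3. Add the blurred highlights back over the original frame.
    return np.clip(frame + intensity * blurred, 0.0, 1.0)

# Example with a made-up random "scene":
scene = np.random.rand(480, 640, 3).astype(np.float32)
print(glow(scene).shape)

The expensive part on the card is that the off-screen buffer and the blur passes want floating-point texture reads and writes, which is the weak spot being argued about in this thread.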

I <b>help</b> because <b>you</b> suck.
 

SHARKMEAT

Distinguished
Jul 16, 2003
122
0
18,680
Well, you can try another demo to test the cards. I think XIII and TRON 2.0 use DX9; see what the benchmarks are with those demos. I remember when 3DFX was the rule of thumb and paved the way for today's standards, but they loudmouthed themselves to death on their own stupidity. Now Nvidia owns the properties that 3DFX had. ATI used to be the laughing stock of the town for bad video products. When one company goes down, the brains of that company have to find jobs, and most of them (I'm guessing, mind you) went to ATI.

My question is why Nvidia needs such a big heatsink to cool its chips when ATI has such a small one. I've had both the FX 5900 Ultra and the ATI 9800 Pro, and seeing them side by side, the difference between the heatsinks is mind-boggling for cards that do the same thing at basically the same speed. Of course, the ATI 256MB version has heatsinks on the memory chips that the 128MB version doesn't, and my understanding is that the 256MB boards don't actually need them, because no ATI retail card uses DDR-II's faster, hotter memory. I read on another web page that the DDR-II versions of the cards were only given to testers for benchmarking. You can call or e-mail ATI and see what they say, but with the heatsinks glued onto the memory it's hard to check the ns rating.

I'm sure as I'm sitting here that Nvidia will come out on top after the storm has blown over. It's a cat-and-bird game, and Tweety Bird is in for a very long flight. The only way I'll stop using Nvidia cards is if they fold. I don't see a road map for ATI to get ahead of Nvidia after they use up the current chip technology they have. Etc., etc.

TREAD SLOWLY IN DANGEROUS WATERS
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Both ATI and Nvidia have smaller micron processes coming out in the upcoming months and years...

The thing I would be interested in seeing, though, is whether or not Nvidia can keep the cards cool as the process gets smaller...so far they haven't been able to, because they've had to raise the clock speeds just to compete. We all know what happens when they get up too high, right? We get a case heater that performs like crap. I would really like to see Nvidia kick it up a notch...better competition means better prices. But I see ATI coming out on top of this one. They've got the better shader performance and image quality, and if it comes down to just raw power like the 5900 Ultra sports...they could simply throw out a card, slap a larger heatsink on it, kick up the clock speeds, and call it good...most likely soundly beating Nvidia in the process. But I think they are holding off in order to turn out a better product...or perhaps to put more money back into R&D. Mind you, be careful about thinking Nvidia will eventually pull this one out. I think ATI has something waiting in the wings...they haven't released anything 'new' for a little over a year now. I wonder what they have up their sleeve?

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
First off, Sharkmeat, the two games you mentioned <b>do not have benchmarking utilities</b> built into them. Using <b>Fraps</b> is the only way to bench with those games. That particular methodology leaves the reader at the mercy of the reviewer. Even then, a framerate recording and a few possible screenshots for IQ are about all that can be achieved...IMO.
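For what it's worth, if you do bench that way, at least publish the raw numbers so readers aren't entirely at the reviewer's mercy. A minimal sketch of what I mean, assuming a plain text log with one frame time in milliseconds per line (whatever tool produced it; the filename below is made up):

def summarize_frametimes(path):
    """Average/min/max FPS from a log of per-frame times in milliseconds."""
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]
    fps = [1000.0 / t for t in times_ms if t > 0]
    return {
        "frames": len(fps),
        "avg_fps": sum(fps) / len(fps),
        "min_fps": min(fps),
        "max_fps": max(fps),
    }

# Hypothetical usage -- the filename is just an example:
# print(summarize_frametimes("frametimes.txt"))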
The rest of your post had nothing to do with the topic we're discussing, so I won't even touch on that.
One thing I will comment on is your claim that you've owned both a 9800 Pro and an FX 5900 Ultra. I find that highly unlikely, judging by the technical knowledge you've displayed here on this forum.
You could of course prove me wrong by producing some 3DMark scores from each of those cards with your name on them.
Please do.

I help because you suck.
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Two different people reviewed the two cards, I think...they should have some sort of uniformity across both tests, because I don't have enough time to actually compare the two right now...it's actually kind of confusing because they didn't use the same nomenclature for the settings/benchmarks on the two cards. I'll attempt to decipher them tomorrow or this weekend. (Didn't want you to think I was a lame a$$ who wouldn't check out someone's alternate info/opinion) :p

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 

eden

Champion
I just checked the benches. Amazing: even at that lower clock, and with the help of Cg and the glow effect turned off, it still rather sucks. Wouldncha say so?

Then again the reviewer actually used a low end AthlonXP 2000+ system with a KT266A chipset. Jeez, professional.

I dunno though, I still want a full test with all the cards, and with that glow effect off. (was it a good effect?)

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

Crashman

Polypheme
Former Staff
WTF! That can't be right, can it? I'll have to see more DX9 benchmarks before I can believe the FX series is REALLY THAT BAD!

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
I love you GW :)

suck on that, u stupid fanATics! LONG LIVE nVIDIA WAHAHAHAHHAHAHA

and GW, Sharkmeat is just a troll like ddrsdram, don't mind him ^^, and he's not that bad, not nearly as bad as ddrsdram! lol

guys, seriously, do u really think nVidia is run by crazy horny monkeys? (they're not Squaresoft.....stupid monkeys! you made Yuna wear hot pants!!) They're not going to produce a product that'll get wasted by the 9800 Pro. They have computers too, and they know what benchmarking is. So

LONG LIVE nVIDIA WHAHAHHAHAHAHA

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

eden

Champion
The irony is that you make believe you love nVidia just to joke around, but in reality your jokes are your opinions: you DO love nVidia, and you're just using the jokes to fool people into thinking they ARE jokes, for fun.

Damn.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
shhhhhhhh.... they're not supposed to know that!!
if an ATi card comes with a huge bundle and a bling-bling HSF, then i'll switch to ATi, no questions asked ;)

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
<i>All of this backs up the theory I had that you really shouldn't judge the FX's PS2.0/floating-point performance on just one DX9 title. I know it's hard not to when there are only a couple of DX9 titles out there to choose from, though.</i>
There are enough DX9 apps out there you can test: from Humus, from ATI, from Nvidia, from NitroGL, on Delphi3D, etc., and they all show about the same thing. My Radeon beats just about every GeForce FX card in pixel shading, and in floating-point tasks in general (the glow, for example).

Those ARE all important tasks for the future. Image post-processing will be the most important one, and a fast floating-point glow is just one way to measure how well a card performs at it. NV cards don't work well with it at all: on one side because their floating-point support runs at half the speed of their fixed-point support, on the other because they left very important floating-point units out of the texture samplers for various texture formats.

That makes a lot of demos unrunnable on NV cards, and the ones that do run are just slow.

"take a look around" - limp bizkit

www.google.com
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
well, just because the FX5900 sucks at PS2.0 doesn't mean it sucks overall

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Dave, a quote from John Carmack...
<i>"Nvidia's OpenGL drivers are my gold standard."</i>
Curious how OpenGL apps run faster on NV hardware than on ATi's...a lot more solid, too. Nvidia does OpenGL better than ATi; they've been doing it longer....

I help because you suck.
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
Read my message again. In standard OpenGL it does NOT perform well. But they overloaded OpenGL with tons of their own extensions, with which you can regain the NV30 performance.

GL_NV_light_max_exponent
GL_NV_vertex_array_range
GL_NV_register_combiners
GL_NV_fog_distance
GL_NV_texgen_emboss
GL_NV_blend_square
GL_NV_texture_env_combine4
GL_NV_fence
GL_NV_evaluators
GL_NV_packed_depth_stencil
GL_NV_register_combiners2
GL_NV_texture_compression_vtc
GL_NV_texture_rectangle
GL_NV_texture_shader
GL_NV_texture_shader2
GL_NV_vertex_array_range2
GL_NV_vertex_program
GL_NV_multisample_filter_hint
GL_NV_depth_clamp
GL_NV_occlusion_query
GL_NV_point_sprite
WGL_NV_render_depth_texture
WGL_NV_render_texture_rectangle
GL_NV_texture_shader3
GL_NV_vertex_program1_1
GL_NV_float_buffer
GL_NV_fragment_program
GL_NV_half_float
GL_NV_pixel_data_range
GL_NV_primitive_restart
GL_NV_texture_expand_normal
GL_NV_vertex_program2

list long enough?

If you don't use the extensions listed above, Nvidia performance is about as bad as in DX9. If you use some of them, you have to use lots of them. And if you know them, you know the faults in the NV30 hardware (the lack of real floating-point storage support, for example).
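To make that concrete: an application typically decides between the plain path and a vendor path by parsing the extension string the driver reports. A minimal sketch, in Python just for illustration; it assumes you have already grabbed the string returned by glGetString(GL_EXTENSIONS) from a live GL context, and the example string at the bottom is made up.

NV_PATH_EXTENSIONS = {
    "GL_NV_vertex_program",
    "GL_NV_fragment_program",
    "GL_NV_float_buffer",
    "GL_NV_register_combiners",
}

def pick_render_path(extension_string):
    """Choose a code path from the space-separated GL extension string."""
    available = set(extension_string.split())
    if NV_PATH_EXTENSIONS <= available:
        return "NV30 vendor-extension path"
    if "GL_ARB_fragment_program" in available:
        return "standard ARB fragment program path"
    return "fixed-function fallback"

# Made-up example string from a non-NV card:
print(pick_render_path("GL_ARB_fragment_program GL_ARB_multitexture GL_EXT_texture3D"))

Every one of those branches is extra work for the developer, which is exactly the point: the card only looks fast if someone writes the NV-specific path.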

Carmack does HIS job well. He optimizes for every sort of GPU.

He could just as well sit back and say, "Look, dude, I use OpenGL (or DX), and I don't care whether your card performs well in it. Make it perform well if you want to sell your cards." He doesn't. Most game developers do. Result: most games will look rather bad on NV hardware.


Learn what you're talking about, dude. Especially big-word-shouter GenericWeapon needs to.

Don't forget, the Doom 3 demo was on a closed system, where testers weren't able to check what was really in it or how it was set up. I remember having seen closed-system N64 consoles showing Jurassic Park dinosaurs in real time.
Wait for the game.

"take a look around" - limp bizkit

www.google.com
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
<i>well, just because the FX5900 sucks at PS2.0 doesn't mean it sucks overall</i>
There are two features in DX9 that make an immense difference for graphics compared to DX8:
PS2.0
floating-point render targets/textures

PS2.0 performs very badly on Nvidia hardware.
Floating-point render targets/textures are only partially implemented on Nvidia hardware.

Result: any real DX9 game is going to run into quite a few problems. Even the DX9 SDK has examples that won't run on GeForce FX cards, for lack of floating-point render targets.

If you want better graphics than on a GF1-GF4, then yes, PS2.0 and floating-point textures are essential. ATI saw this and dropped everything else out of their hardware, emulating it directly with PS2.0 instead.

The FX does suck at everything where it differs from a GF4. And that sucks, because I can get a GF4 for free from a dude who's buying a GeForce FX instead. Yes, I don't get DX9, but no, he won't really get it either.. :D

"take a look around" - limp bizkit

www.google.com
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
<i>Nvidia's openGL drivers are my gold standard</i>
That's an old quote, from before the NV30, at the start of the R300.

He thinks differently now about the cheating scandal, about the lack of real GL support, and all that.

After he said that, ATI worked their asses off to get much better, and they've done it well.


Check Humus's OpenGL page, download the demos, bench them, and then see whether OpenGL apps really run faster on NV hardware. They don't. None of the demos run faster on any GeForce FX than on an ATI card.

Doom 3 is optimized for EVERY card; that's why it performs at the limit of each one. The theoretical GeForce FX limit is high, but it's not reachable in standard apps, not for standard game development.

"take a look around" - limp bizkit

www.google.com
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Squirtle,

I believe the interesting part regarding the slowdown on the 5900 Ultra (in the benchmarks being discussed) is not necessarily poor raw performance. I would say it is because Nvidia didn't plan ahead. They figured raw power would plow its way through the DX9 phase...what they didn't count on was programmers using EVERY aspect of DX9 in their first game releases. I mean, who would have thought that Tomb Raider would use the glow effect right out of the gate? And if using the 'glow' effect in Tomb Raider makes the 5900 suck, then what about Doom 3 or HL2? Will there be a glow effect in those? If so, will Nvidia users have to turn it off? Can they turn it off? Is it even worth it? From what I saw when I scanned over the two reviews, even with the glow disabled the 5900 was only about even at resolutions of 1024x768 or higher with 4X AA. And what happens when OpenGL games suck on the Nvidia cards just like the DX9 ones do? (Don't think they will? Ask davepermen.)

I dunno man, I'd still have to go with ATI, <i>even if only to be safe</i>, here. Do people really want to be stuck with a 400-dollar card that <b>might</b> have performance gaps when they could have one that people <i>know</i> doesn't? Also, check out #2 of my original post...AND you have already posted in <A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=364695#364695" target="_new">this thread</A>, but there are some more posts there, especially from davepermen, that discuss a lot of the issues around our topic.

<b>It would behoove everyone to check out <A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=364695#364695" target="_new">the aforementioned thread</A></b>. My three treatises regarding this matter STILL stand; I have been shown nothing to counter them. As I said, I will look further into GW's post regarding the Albatron 5900 with the glow removed and perhaps revise my original discourse...but honestly, do you really want an effect that makes the game look better removed JUST so you can play it faster? Sacrifice quality for speed? I dunno, that just isn't an option for me. It would be like strapping a jet engine to a soapbox derby car...

Makes me feel all warm and squishy inside... :smile:

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS