
Nvidia lying about specs of FX-5900?

July 19, 2003 10:03:24 AM

hi all,

just invested in a nice (and expensive) MicroStar Nvidia GeForce FX 5900 (128mb / non-ultra version).

The problem is, I think Nvidia are playing stupid games with the tech specs for the chip (or the drivers). I was wondering if anyone round here has heard anything about this sort of thing..?

Being a graphics programmer, I was quite looking forward to using some of the new features on this particular card. One set of features is the HDR (High Dynamic Range) textures - 64-bit and 128-bit floating-point textures. Very cool.

However, a few sample programs I have say that my card doesn't support D3D9 HDR textures. The programming tools in the DX9-SDK back this up.
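
For anyone curious, the check those tools run boils down to something like this (a minimal sketch of my own in C++/D3D9, not the SDK sample itself - I'm just probing the default adapter and HAL device here):

// build against d3d9.lib (DX9 SDK)
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DDISPLAYMODE mode;
    if (FAILED(d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode))) return 1;

    const D3DFORMAT fmts[] = { D3DFMT_A16B16G16R16F, D3DFMT_A32B32G32R32F };
    const char*    names[] = { "64-bit FP (A16B16G16R16F)", "128-bit FP (A32B32G32R32F)" };

    for (int i = 0; i < 2; ++i)
    {
        // can the HAL device create a plain 2D texture in this format?
        HRESULT asTexture = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, mode.Format,
            0, D3DRTYPE_TEXTURE, fmts[i]);

        // and can we render into one (what most HDR techniques need)?
        HRESULT asTarget = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, mode.Format,
            D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, fmts[i]);

        std::printf("%s: texture %s, render target %s\n", names[i],
                    SUCCEEDED(asTexture) ? "yes" : "NO",
                    SUCCEEDED(asTarget)  ? "yes" : "NO");
    }

    d3d->Release();
    return 0;
}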

I have the latest drivers (44.03), so I'm thinking that either Nvidia have lied on the box, or the latest drivers have disabled support for such things.

I can't find anyone who has a logical explanation for this - has anyone here heard anything like it??

Cheers,
Jack
July 19, 2003 10:36:23 AM

>> a guy called "Daverperman" might know

I'll look out for him :) 

I've contacted Nvidia developer relations to see if they can clarify the matter.

But still, if anyone else round here has any clues I'm all-ears :) 

Jack
July 19, 2003 1:03:56 PM

I'm sorry to hear that. Do you mind if I ask you a question though? I'm considering the same card, and I'll be using it for playing games - would you recommend it?
July 19, 2003 5:08:31 PM

Install the just released DX9.0b SDK and update your drivers to 44.67 (WHQL) or even 44.90 (non-WHQL)... maybe this will help.
July 19, 2003 7:55:52 PM

I would suggest the Radeon 9800 Pro right now, for I'm hearing about way too many issues with the FX cards in the upcoming games.
July 19, 2003 9:00:42 PM

Dude, do you read the posts??? He's a graphics programmer, he's not having problems playing games -_-" You fanATIcs go home and play with your precious 9800 Pro; people here are having big boy talks. (Coolsquirtle walks away)

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
July 19, 2003 10:10:28 PM

LOL, can I try with the 9600 Pro? Likely fewer burns, since it runs cooler. :wink:

JHoxley, yeah Dave will likely be able to give you some insight. He's sorta the resident programming guru.

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
July 20, 2003 2:27:37 AM

Quote:
I would suggest the Radeon 9800 Pro right now, for I'm hearing about way too many issues with the FX cards in the upcoming games.

Right away, your blind fanboyism has played a trick on you - but maybe you're just a bot which automatically answers posts where one of the following words is used: nvidia, geforce, fx or detonator :)

Anyway, the only bug the FXes have in an upcoming game is the AA issue in HL2; currently ATI has the same issue, and it's not really clear whether ATI can or will fix it or not...
July 20, 2003 2:32:44 AM

Relax, man. Jeez, the word fanboy is way overused. If you look closely at the situation, they said that they could fix the problem with ATi, and for now they were leaving it up to nVidia to find a solution. AA is important to some people. And if it's reason enough for this guy to buy the 9800 Pro, you really have no business calling him a fanboy. Just breathe, man...

These days, no matter what company you like, be it nVidia, ATi, or whatever, no matter how logical your reasons, you're labeled an idiot or a fanboy, or both.
July 20, 2003 2:37:45 AM

Like some of the others said, davepermen might answer this; I don't know if he uses the FX series though.
My guess would be a driver issue - it'd be much too serious an issue (and discovered long ago) if they'd lied about the features. Dave has confirmed, though, that a few of its features (128-bit precision colour) are, as he put it, math marketing.

Athlon 1700+, Epox 8RDA (nForce2), Maxtor DiamondMax Plus 9 80GB 8MB cache, 2x256MB Crucial PC2100 in dual DDR, GeForce 3, Audigy, Z560s, MX500
July 20, 2003 3:50:27 AM

Dude, you forgot the TNTs! :wink:

Actually there is a BF1942 issue (http://www.hardocp.com/article.html?art=NDk2LDEy) but I'm not sure if it's been resolved yet.

Anywhoo, whatever. I'm waiting for PCI-EX cards. MMMmm toasty! :smile:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
July 20, 2003 4:46:16 AM

I eat PCI-EX cereals for breakfast LOL



Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
July 20, 2003 6:57:14 PM

Quote:
as he put it, math marketing.

I'm fairly familiar with the play-with-numbers-to-look-better marketing, and whilst in some cases it can be downright annoying, it's one of those things I (we?) get used to.

To my knowledge it was the ATIs that were "cheating" with the colour precision; to be D3D9 compliant it needs 24-bit FP precision, so that's what ATI used (and got speed for it), whereas Nvidia used "true" 32-bit precision.

I'm far more annoyed if there's absolutely no (visible) support for a feature they say is actually there :)

Jack
July 20, 2003 7:01:28 PM

Quote:
he's not having problems playing games

yup, that's true :) however, I'd be lying if I said I didn't play games as well as make them!

Quote:
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF

hmm, I don't see much of a problem with the HSF; it's more the length of the board that's the issue :) mine's got 5mm of clearance from the 2 HDDs in my system - and I have a big server-grade case!!

Jack
July 20, 2003 7:05:47 PM

where are you finding these 44.xx drivers?

The best I can find is 44.03, nothing higher?!

Jack
July 20, 2003 7:14:18 PM

For usually the most RECENT drivers, go to Station Drivers (http://www.station-drivers.com/page/nvidia drv.htm) - they have the 44.71 WHQL and 44.90 BETA.

Also, you can soon go back to Omega Drive's Little Corner (http://www.omegacorner.com/) as NVidia has decided not to mess with their fans anymore and is letting OmegaDrive adapt his NV drivers again. He currently has just tweaked 44.03s (I think), but expect something soon now that he can tweak again.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
July 20, 2003 7:23:19 PM

thanks for the links

I'm trying to download the files, but not getting any connection - although that might be my top-of-the-line 56k modem playing up ;) 

Jack
July 20, 2003 7:32:12 PM

Ok, I tried too, and it appears the link goes to PNY (makers of the Quadros), so it may have high traffic. The 44.90s gave me a 'forbidden' message, so maybe that link has been discontinued.
Try DriverHeaven (http://www.driverheaven.net/downloads/index3.htm) instead; they have the 44.65 & 44.61 & lower. They should work - I downloaded a WHOLE bunch of benchmarks today and tried the 44.65 just now with no problem (Eeww, almost got Detonators on my Cat! :tongue: )


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
July 21, 2003 7:06:00 AM

HERE I AM HERE I AM!!! just had a nice weekend with my gf, so no time for you, dudes:D 

uhm.. yes, the gfFX cards are not capable of supporting everything in dx9 (and they emulate e.g. the pixel shaders with their own thing to quite some extent..).

the most important thing in dx9, the floating-point support, which you were eagerly waiting for, is the biggest problem for nvidia, because they don't really have it.

i'm working more in opengl, but i think the dx caps will report the equivalent picture:

gfFX does have floating point texture rectangles
gfFX does not have floating point 2d textures (the ones with width and height == power of 2, the good old textures, simply)
gfFX does not have floating point 1d, 3d, or cubemap textures
all it has are texture rectangles.
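
roughly how i check that on the gl side (a rough sketch of my own, not production code - the extension names are the ones i believe were current at the time, NV_float_buffer tying float data to texture rectangles and ATI_texture_float giving float 1d/2d/3d/cube textures, so treat them as assumptions and read your own driver's string):

// needs a current GL context (GLUT/SDL/wgl - whatever you normally use);
// on windows include <windows.h> before <GL/gl.h>.
#include <GL/gl.h>
#include <cstring>
#include <cstdio>

static bool hasExtension(const char* name)
{
    // crude substring test against the extension string - fine for a sketch
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all != 0 && std::strstr(all, name) != 0;
}

void reportFloatTextureSupport()
{
    std::printf("GL_NV_texture_rectangle : %d\n", hasExtension("GL_NV_texture_rectangle"));
    std::printf("GL_NV_float_buffer      : %d\n", hasExtension("GL_NV_float_buffer"));
    std::printf("GL_ATI_texture_float    : %d\n", hasExtension("GL_ATI_texture_float"));
}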

they have other problems with floating-point integration into standard opengl, and solve it with a lot of useless extensions. dunno how well those fit into dx9, as they can't extend it..


same for render targets, where they currently have fun limitations too (but nobody really knows what is limited where, how and why.. just.. some work, some don't :D )


yes, the gfFX hw is definitely flawed.. pixel shaders emulated with ps2.0+, but no direct ps2.0 support, and the floating-point support is.. laughable imho..

and they put "we have 128bit floating point hw" on the box.. yes, they have it in the vertex and pixel shaders.. but about nowhere else :|

oh, and if you don't want to use pixel shaders then, as far as i know, you can forget about even using the floating-point textures..




for you, a radeon9800pro would be a much better choice, i think.. i haven't seen any HDR demo that i (owning a radeon9700pro) can't run..


i'm sorry that you fell for the false marketing..

me, as an opengl coder, i could at least switch to the NV_ proprietary extensions (though i would not like to..), and manually emulate the features i would have liked in gl..

you, as a dx coder, don't have that opportunity..


yes, imho, nvidia cheats a bit when they write dx9 on their card.. it can run dx9 vertex and pixel shaders.. but that's not all of dx9 yet..

"take a look around" - limp bizkit

www.google.com
July 21, 2003 2:27:48 PM

Quote:
uhm.. yes, the gfFX cards are not capable of supporting everything in dx9 (and they emulate e.g. the pixel shaders with their own thing to quite some extent..).

thanks davepermen, great to get confirmation of these things (even though it wasn't what I wanted to hear!).

I emailed Nvidia and ATI developer relations; got a nice email back from ATI, but nothing from Nvidia. hmm, wonder why that is :)

Quote:
you, as a dx coder, don't have that opportunity..

true, I'm not well versed in OpenGL - and the whole extension mechanism has put me off learning it, even though that does seem to shoot me in the foot when it comes to these issues...!

Again, thanks very much for the informed reply... now I'll be off to see what I can do about it, which may require a phone call to the Trading Standards office (UK) :(

Jack
July 21, 2003 4:10:31 PM

well, i don't bother about the extension mechanism, i just use extgl which wraps this for me, so i don't have to bother.. i can use every ext just as if it were standard.. (with a check up front like if(!extgl_Extension.ARB_fragment_program) { throw Exception("ARB_fragment_program required but not supported.. get a radeon r300 based card!!! :D "); })

opengl is quite simple then.. using its new features..


else.. hm.. radeons are quite cheap.. ... :D 

oh, and, yes, it's annoying, as it means i can support radeons without problems, but i have troubles with the fx cards.. though they have dx9, they don't have what i need from them..

"take a look around" - limp bizkit

www.google.com
July 21, 2003 6:14:30 PM

FanATIc FanATIc!!!!
Fanboy Fanboy


LOL joking ;) 

Wooba Wooba
July 21, 2003 7:36:08 PM

Quote:
i just use extgl which wraps this for me

interesting, I didn't know such a thing existed. The extra checking ain't too bad really; you have to do the same in D3D - just checking against the device caps that are built in...
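
For what it's worth, the D3D side of that check looks roughly like this (a minimal sketch of my own with an assumed function name, not code from the SDK) - grab the device caps once and test the shader versions before touching any ps_2_0 features:

#include <d3d9.h>

// true if the default HAL device advertises vs_2_0 / ps_2_0 support
bool supportsShaderModel2(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // D3DVS_VERSION / D3DPS_VERSION build the packed version numbers the caps use
    return caps.VertexShaderVersion >= D3DVS_VERSION(2, 0)
        && caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
}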

I've always liked the GeForce cards, but I have to say I've been a little worried by Nvidia's latest moves - the CineFX and Cg specifications. On their own they're pretty cool - but they don't mesh so well with the other standards.

Maybe it's just me, but it seems that they're going off on their own path - making their own (similar) standards, much as 3dfx did with the Glide API.

Jack
July 21, 2003 10:20:37 PM

Quote:
Maybe it's just me

definitely not just you :D i've followed them since the release of the geforce1 and they've always tried that, just a little.. it was called innovation then and i liked it.. but currently it's way too much, and they even need to cheat to hide how far off their hw is today.. and their software as well..

it's not that their stuff is bad, but it's way off. they've still got their glide, it's just hidden in way-off extensions, glued together with a way-off cg api, and all the other tools only fitting their stuff.. just download their sdk, it's huge.. and all just own stuff so proprietary that you nearly can't use it without using all of it.. that IS like glide.

"take a look around" - limp bizkit

www.google.com
July 22, 2003 11:49:29 AM

Quote:
all just own stuff so proprietary that you nearly can't use it without using all of it

that's my problem really... The idea behind DirectX in general (if my memory serves me correctly!) is to abstract the interfaces to the various forms of hardware - to get away from the ancient DOS days of needing an almost entirely different graphics engine for each chipset, etc...

I like sticking with D3D because, for example, the shader specification is pretty much standard - a shader should work pretty much the same on both a Radeon and a GeForce (+ any other cards that magically appear!)

Jack
July 22, 2003 12:57:07 PM

fully agreed. both dx and gl expose a generic interface for graphics programmers.. ati designed hw that perfectly fits those requirements - what dx9 has, and about nothing more. and they design their opengl additions as extensions that can get used more or less directly in the core in a future release.

it's just nvidia always being different, always being something "special".. this is so utterly useless and stupid..

"take a look around" - limp bizkit

www.google.com
July 22, 2003 1:01:55 PM

yeah, it's why I'm suddenly thinking the ATI R-9800 would have been a far better option... I thought I did my homework, but obviously not well enough!

Now I've got the problem that the online shop I bought the card from don't really want to let me return it. Fair play - they did nothing wrong, and there's nothing wrong with the card itself...

I really feel like kicking Nvidia now, as I've got a card that might be good for playing games, but not very good for developing them! :)

Jack
July 22, 2003 1:49:30 PM

There's a lot of homework that you have to do that you probably wouldn't ever think of. Of course, that's where Dave, forum genius, comes in.

If you can't return the card, sell it. I'm sure you could get a good price.

These days, no matter what company you like, be it <b>nVidia, ATi, or whatever,</b> no matter how logical your reasons, you're labeled an <b>idiot</b> or a <b>fanboy</b>, or <b>both.</b>
July 22, 2003 5:37:48 PM

Quote:
probably wouldn't ever think of.

and that's the best part ;) I don't claim to be a know-it-all, but given my line of work/hobby etc.. I thought I knew more than my fair share about the current 3D chipsets. Oh well!

Quote:
If you can't return the card, sell it

yeah, that's what I'm planning... it looks like they might let me return the card and refund my money, but they aren't happy about it (doesn't surprise me!).

If I did sell it, I'm not likely to get more than 75-80% of the price on eBay, which means I'd still lose £50-£60...

needless to say, my trust in Nvidia has taken a bit of a bashing!

Jack
July 22, 2003 6:45:29 PM

"its just nvidia always beeing different, always beeing something "special".. this is so utterly useless and stupid.."

I'm just speculating, but if nVidia does become successful with their own little thing, and if it ever becomes more popular than DX, then the idea of a graphics industry monopoly for nVidia will become reality. Think about it: ATi and all the other companies build their cards for DX and OpenGL, so if nVidia's Cg stuff ever overtakes DX and GL, then ATi will go down along with Matrox, won't it???
nVidia- Our Cg technology has successfully taken over the entire market.
ATi/Matrox- Can you license the technology to us?
nVidia- Go kill yourselves, Cg is mine, all mine!!!!!!
ATi/Matrox- AHHHHHHH (dies)
nVidia- Now we'll make our new GeForce FX 6100 Ultra cost $1000 and gamers will have no other choice!
Now I'm just guessing, so correct me if I'm wrong.

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
July 22, 2003 8:04:47 PM

If you can't get a refund, why not exchange it?
The 128MB version should have a similar price to the 9800 Pro, so at least your losses will not be huge.
Also, about the 24-bit from ATi, I doubt this is cheating. The DX9 spec calls for AT LEAST 24-bit FP precision, which btw I believe equals the 96-bit (24 bits x 4 channels) ATi claims. I don't know if they claimed 128-bit, but I think Dave can clarify it. However, ATi did nothing wrong in using 24-bit FP; they were right on, respecting the DX9 spec. nVidia chose 32-bit, which is not bad at all, but to claim it's unfair that the competition uses 24-bit is kind of like an airplane race where you enter a biplane against a cargo plane - the cargo plane carries lots more, but has to travel slower. To say the biplane is unfair when the race only called for a minimum of x kg of payload is not right at all. Both the biplane and the cargo plane followed the rules; one just chose to carry more and go slower.

Hope you can get an exchange for an ATi and enjoy programming. It's a sad thing that the extended programmer's-dream features on the nV cards aren't actually accessible or very helpful!

--
The official Tom's Hardware Guide Forums Photo Album, click here to contribute! (http://www.lochel.com/THGC/html/news.html)
July 22, 2003 11:58:40 PM

if nvidia could, yes..

but they cannot. opengl and directx are two too well-known and well-supported standards..

dx: do you think microsoft would let dx get removed / replaced by some nvidia shit?

opengl: it's basically _THE_ graphics backend of the last 10 years across all platforms, all x86 processors, all render farms, everything. it was, and it will be.

nvidia's cg is dead btw..

"take a look around" - limp bizkit

www.google.com
July 23, 2003 5:59:08 AM

You learn something new every day lol~~~ wonder if nVidia ever learns....... but the Quadro FXs still eat ATi for breakfast, right?

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
July 23, 2003 10:57:28 AM

hi,

Quote:
If you can't get a refund, why not exchange it?

can't do either. Government trading rules here state that because the part is used, the online store cannot re-sell it (well, they can, but it has to be reduced / sold with a notice). Because they can't sell it, they won't accept a return/refund.

Off to EBay it is then, unless anyone here wants an FX-5900 for a bit less than it costs in the shop!?!?! :) 

Quote:
ATi did nothing wrong

yup, I know :) the 24-bit FP precision is part of the DX9 API spec. In fact, I don't think anything bad was said about ATI in this thread?! The only slight disadvantage of the 24-bit mode is that there must be an intermediate step where the Radeon converts the 16- and 32-bit pixels up/down... but at least it does it :)

Jack
July 23, 2003 1:34:48 PM

uhm, it's a hw intermediate step.. really not much hurting actually, if you know how fp numbers are stored.. namely the 24bits from ati are just the last 8bits cut away, i guess, so it's s1e8m15 instead of the ieee s1e8m23.. so reading is actually direct, and writing just sets the last 8 bits to 0..

and that is possibly even faster than copying full 32-bit ieee :D (at least the reading-in :D ), and definitely not slower.. as it's directly in hw anyway..

and the visual difference is not really visible in any pixelshader2.0.. there are too few instructions to make a real difference to the 8-bit values in the end
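
in code, the truncation i'm guessing at looks something like this (just my own illustration, *assuming* fp24 really is an ieee single with the low 8 mantissa bits dropped - don't take it as a statement about the r300 silicon):

#include <cstdint>
#include <cstring>
#include <cstdio>

// zero the 8 least significant mantissa bits of a 32-bit ieee float
float truncateToFp24(float f)
{
    std::uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);   // grab the raw bit pattern
    bits &= ~std::uint32_t(0xFF);          // clear the low 8 mantissa bits
    std::memcpy(&f, &bits, sizeof f);
    return f;
}

int main()
{
    float x = 0.123456789f;
    // any difference shows up only in the low-order digits (around 1e-6 or less here)
    std::printf("%.9f -> %.9f\n", x, truncateToFp24(x));
    return 0;
}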

"take a look around" - limp bizkit

www.google.com
July 23, 2003 1:57:05 PM

Quote:
Off to EBay it is then, unless anyone here wants an FX-5900 for a bit less than it costs in the shop!?!?! :) 

Give you $20 for it. :smile:

These days, no matter what company you like, be it <b>nVidia, ATi, or whatever,</b> no matter how logical your reasons, you're labeled an <b>idiot</b> or a <b>fanboy</b>, or <b>both.</b>
July 23, 2003 2:47:23 PM

Quote:
and the visual difference is not really visible in any pixelshader2.0.. there are too few instructions to make a real difference to the 8-bit values in the end

Indeed, but Nvidia supports longer shaders :wink: . I've heard of a Mandelbrot algorithm where every single bit of precision counts...
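
Something like this toy version of the inner loop (my own CPU sketch, not the shader I heard about) - each iteration of z = z*z + c compounds rounding error, so once you zoom in far enough the available mantissa bits decide whether neighbouring pixels can be told apart at all:

#include <cstdio>

// escape-time count for c = (cx, cy); each pass squares z and adds c,
// so rounding error compounds with every iteration
int mandelbrotIterations(float cx, float cy, int maxIter)
{
    float zx = 0.0f, zy = 0.0f;
    int i = 0;
    while (i < maxIter && zx * zx + zy * zy < 4.0f)
    {
        float tmp = zx * zx - zy * zy + cx;  // real part of z*z + c
        zy = 2.0f * zx * zy + cy;            // imaginary part of z*z + c
        zx = tmp;
        ++i;
    }
    return i;
}

int main()
{
    // two points one "deep zoom" pixel apart: in 32-bit float they already
    // round to the same value, so both calls give the same count - with
    // 16/24-bit shader floats that collapse happens far sooner
    float cx1 = -0.743643887f, cx2 = -0.743643888f, cy = 0.131825904f;
    std::printf("%d vs %d iterations\n",
                mandelbrotIterations(cx1, cy, 1000),
                mandelbrotIterations(cx2, cy, 1000));
    return 0;
}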

Nick
July 23, 2003 6:21:16 PM

Quote:
the 24bits from ati are just the last 8bits cut away

hmm, didn't know it was that simple :) no wonder it's pretty fast.

Quote:
the visual difference is not really visible in any pixelshader2.0

guess not - I'll bet there are a few places where purists could argue it shows up... but your average bump-specular + fancy BRDF is unlikely to show many problems.

Jack
July 23, 2003 6:22:41 PM

Quote:
Give you $20 for it

hm, don't feel bad - but I'm gonna have to gracefully decline selling it for 1/20th of the price I bought it for :-)

Jack
July 23, 2003 6:52:39 PM

Dude, the 9800 Pro ALSO supports long shaders!!!

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
July 23, 2003 8:00:38 PM

Dude, davepermen was obviously talking about the Radeon 9700, otherwise he wouldn't have mentioned it!!!
Dude, Nvidia still has higher precision, which is useful in rare situations!!!
July 23, 2003 8:02:11 PM

Name one.

These days, no matter what company you like, be it <b>nVidia, ATi, or whatever,</b> no matter how logical your reasons, you're labeled an <b>idiot</b> or a <b>fanboy</b>, or <b>both.</b>
July 23, 2003 8:04:27 PM

Quote:
To my knowledge it was the ATIs that were "cheating" with the colour precision; to be D3D9 compliant it needs 24-bit FP precision, so that's what ATI used (and got speed for it), whereas Nvidia used "true" 32-bit precision.

But why'd you say ATi was "cheating" by being compliant?

--
The official Tom's Hardware Guide Forums Photo Album, click here to contribute! (http://www.lochel.com/THGC/html/news.html)
July 23, 2003 8:15:02 PM

Quote:
But why'd you say ATi was "cheating" by being compliant?

ah, gotcha... I didn't mean to say ATI was cheating; the quote marks were there to refer to what I'd read. When there was all that hoo-ha over Nvidia and Futuremark, one of the arguments against the ATIs (presumably from Nvidia) was that ATI was using a lower precision and that Nvidia was doing it "properly"...

I was merely paraphrasing the media coverage I'd seen :)

Jack