What I'd like to know is just what the difference is between the GF4 Ti 4600 and the Quadro4 in terms of what features they have, and just what kind of driver enhancements are worth a price increase of $600 to $1,200, because as far as I can see it's the Quadro4 900 XGL that's identical to the Ti 4600 in speed etc... I can't seem to find a list anywhere on NVIDIA's site, but maybe if they published the differences, more people would be willing to take a look at the high-end card. Better yet, people like me might stop thinking really nasty thoughts about why they disable that stuff on the $400 cards...
On second thought...
Even with the disabled features, how can they rightly keep driver optimizations from Ti users? Shouldn't NVIDIA want all their nice cards running as well as they can?
Edited by williamc on 03/20/02 02:27 PM.
It's not driver issues in general, it's specific driver development for specific applications like AutoCAD, Adobe's tools, and other graphics programs. Basically, a driver specifically set up to enhance that single program on that card.
Given there are 50+ programs out there, that is a large task. Ti users gain the benefit of any generic speed upgrades that come out of developing these drivers, just as the 8500 has likely picked up some OpenGL tweaks from the FireGL people.
I do not like it Tom you see,
I do not like green PCB.
The Quadros are different from the GeForces in a few ways. Depending on which Quadro (1, 2, DCC, or 4), there were certain differences. Typically, though, the Quadros are optimized for OpenGL and allocate more resources to it rather than to Direct3D. That is why NV doesn't give the Quadro4 the same drivers as the GeForce. Also, you are paying for the research behind those drivers, as research is the most expensive aspect of a product. Plus, it isn't professional if it isn't over $500 :tongue:
Sig of the week.
Did nobody else notice that the SPECapc 3DSMax Graphics Mean "Quality" score for the FireGL 8800 (3.91) was exactly the same as it scored for the "Speed" setting? To me, that suggests the quality would be the same too. And in fact the Overall Geometric Mean score for "Quality" was HIGHER than for "Speed"!
Either they're taking quality shortcuts in the Quality setting as well as Speed (ATI have done this before), or they really are that fast at high-quality rendering and don't bother with the speed shortcuts at all. Or, given the geometric mean results, they're just a bit muddled.
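For what it's worth, the SPEC composites are geometric means of the individual subtest scores, so identical graphics subtests give an identical Graphics Mean, while the Overall mean can still come out higher if the Quality run wins the non-graphics subtests. A minimal sketch with made-up scores (NOT the real SPECapc numbers):

```python
import math

def geometric_mean(scores):
    # nth root of the product of n scores
    return math.prod(scores) ** (1 / len(scores))

# Hypothetical subtest scores, for illustration only.
graphics = [2.0, 4.0, 8.0]            # same graphics scores in both runs
speed_overall = graphics + [3.0]      # plus one non-graphics subtest
quality_overall = graphics + [5.0]    # Quality run wins the non-graphics test

print(round(geometric_mean(graphics), 6))  # 4.0 -- identical for both runs
print(geometric_mean(quality_overall) > geometric_mean(speed_overall))  # True
```

So a tied Graphics Mean plus a higher Overall mean isn't necessarily muddled; it just means the difference lives outside the graphics subtests.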
Given the fairly convincing lead of the Quadro4 in all the SPECviewperf tests, I'm inclined to think it's not nVidia/ELSA's drivers that need work. But since the article's author decided screenshots of the driver settings pages were more important than rendering-quality shots (his background is more in consumer gfx, I assume), we can't say for sure.
Hey, you know what, that's correct! Odd! That would suck if they're taking shortcuts again, because this is a card you're not supposed to mess with! High quality will have a (negative) impact on FPS. Also, OGL is different from MAXtreme: MAXtreme gives better (crisper) image quality, but sadly in some cases performs worse than OGL (this was seen on the Quadro DCC, and who knows, it may be accentuated on the Q4).
The Quadro chips do have some significant features (to pro OpenGL apps) that the GeForce chips don't:
- hardware anti-aliased lines
- two-sided polygon lighting
- unified depth/back buffers across multiple windows
These can make a big difference to CAD & 3D modelling/visualisation apps, particularly the anti-aliased lines, but they won't do a lot for gaming. For a pro user, these features are indeed worth the extra money (which is why nVidia/ELSA charge them more); gamers need not feel like they're missing out.
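For concreteness, the first two of those features are standard OpenGL state that any app can request; the difference is whether the chip accelerates them in hardware. A minimal sketch of the relevant state setup (assumes a GL context already exists, so it isn't runnable on its own; function name is mine):

```c
#include <GL/gl.h>

/* Request the pro features discussed above. A Quadro runs these in
   hardware; on a GeForce they are reportedly handled by a slower path. */
void enable_pro_gl_state(void)
{
    /* hardware anti-aliased lines (wireframe CAD views) */
    glEnable(GL_LINE_SMOOTH);
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* two-sided polygon lighting, so open surfaces are lit
       correctly from both front and back faces */
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);
}
```

A game never touches this state, which is why gamers don't miss it.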
Previous generation chips (GF3 and earlier) were actually just Quadro chips with a different board ID, and could be "upgraded" to gain the extra features. However, Quadro4 chips are physically different. I don't know if you can gain anything by changing a GeForce4's ID, but it won't be the same as a Quadro4 anymore. For example, there's no GeForce equivalent to the Quadro4 400 NVS, which has two GPUs and four display outputs on a PCI card, and the Quadro4's AA line algorithm is significantly improved (resulting in its record-breaking ProCDRS score).
Actually, they are still the same. NV tried to be tricky this time and rearranged the location of the resistors, but in actuality it can still be converted. Believe me, this comes after extensive study by a few guys I know. FYI, the Quadro DCC didn't have hardware anti-aliased lines.
Paying a grand for a video card is outrageous. I paid $140 for my R8500, and I figured that was a good enough price. If you work with specialized applications and your company pays for the card, then I understand, but mainstream users should stay away from $1,000 cards. If anyone buys one, send me your address and leave the house open, I'm about to take it from you.