Two upgrade paths. Which one?

November 19, 2004 5:03:17 AM

Greetings to the forum once again.

So, I want to upgrade so that I can play HL2, Far Cry, Doom 3 and all, because my current gaming system can't handle them too well (to the tune of playing at 640x480 and still getting choppy graphics). Of these two possible upgrade scenarios, which do you guys think would be better for playing games? (And note the fine points at the end, please.)

Option 1:
XP Mobile 2500, OC'd maybe up to 2.4GHz
1GB PC3200 RAM
nv 6800 GT
plain ol' 200GB hard drive.
Total UPGRADE cost, $255.

Option 2:
Dual 3.06 Xeons, the old 533 FSB kind (i.e. PC2700 RAM, not PC3200!)
Quadro FX 4000
2GB PC2700 ECC RAM
sweet RAID setup, with 4x Raptor RAID-0 and a 558GB RAID-5.
Total UPGRADE cost, $190-$200.


By now everybody will be saying "holy crap those prices are impossible" - I have two systems, one for games and one for pro graphics, and my roommate wants me to build him a $1,000 or less pro graphics machine and I want to upgrade my game machine. So, either I sell him my entire game machine and my old Quadro4 and use that money to buy a QuadroFX 4000, or I build his computer separately from new parts and buy a 6800 GT and a new motherboard so that I can OC my XP Mobile a bit more. I've got it all laid out in Excel, and those are actual prices. I also have an A64 price scenario that pans out to around $580, so that's not an option unless it's the only way to get games performance.
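
For the curious, here is the gist of that Excel math as a minimal Python sketch, using only figures quoted in this thread; the resale value is back-derived from the stated net cost rather than quoted anywhere, so treat it as an assumption:

```python
# Rough sketch of the Excel math described above, using only figures
# quoted in this thread. The resale value is back-derived from the
# stated net cost, so treat it as a placeholder, not a real quote.

quadro_fx_4000 = 1470   # low end of the $1,470-$1,500 range quoted later
option2_net = 190       # stated net upgrade cost for option 2
implied_resale = quadro_fx_4000 - option2_net

option1_net = 255       # stated: 6800 GT + new motherboard, minus parts sold
a64_scenario = 580      # stated Athlon 64 scenario

print(f"Option 2 only works if the old system fetches about ${implied_resale}")
print(f"Nets: option 1 ${option1_net}, option 2 ${option2_net}, A64 ${a64_scenario}")
```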

What I wonder about is, I don't want to end up with this awesome Quadro 4000, and end up being limited by the output of my 3.06ghz, 133MHZ FSB Xeons. Granted there are two of them, but most games will not benefit from dual proc power, so it just might be better for me to OC my XP Mobile a bit more instead. My main concern is that my game system currently has that XP-Mobile running at 2.18ghz (limited by mobo) and a Radeon 9800 Pro, and it SUCKS in Far Cry, Doom 3, and even Morrowind with fan packs. I think it might be processor speed, so I don't want that to happen with my new graphics card. I also have no idea how the Quadro FX 4000's performance compares to the Geforce 6800 line, or which card it's supposed to be similar to.

So what do you guys think? If I assume my Xeons will be about like having a 3GHz P4, will I end up having low FPS because of them? And, do you think that Quadro FX 4000 would perform in games at least as well as the 6800 GT? Or do I basically have to get a new Athlon 64 to get good games performance, because those CPUs are going to limit me?

November 19, 2004 11:35:57 AM

Your first option sounds kind of like my computer, and it's playing all three games fine. I'm not familiar with AMD processors, so I'm sure someone else can speak to that. Games seem like they're just as processor-reliant as video-card-reliant these days, so as long as you have reasonably kick-ass components in both slots, you should have no problem playing the current games.

If money's an issue, you could probably even squeak by with the 9800 Pro instead of the 6800 GT.

Asus p4c800 Deluxe,1 Gig Mushkin PC3200 Dual Channel Level II V2,Pentium 4 3.0 512k 800fsb HT, Thermaltake Xaser III, Thermaltake Spark 7+, Sound Blaster Audigy2 ZS Platinum Pro, eVGA GeForce 6800 GT
November 19, 2004 2:30:36 PM

[EDITED: RETHINK]

1. Software has to be programmed to take advantage of dual CPUs, and games are not programmed that way.
Having said that, a single 3GHz Xeon is a good deal faster than an AthlonXP 3200+, which is what your 2500+ will be at a 200MHz bus speed (which is what I'm running).

Either way, both CPUs are good for gaming (I've got a 2500+ o/c'd to 3200+ and it does very well with the 9700 PRO), but the Xeon will be faster and a bit better in newer games, methinks.

2. The 6800GT is probably a better gaming card than the QuadroFX 4000.

The QuadroFX 4000 is a 12-pipe card and the 6800GT is a 16-pipe card. So if you're doing some professional CAD, get option 2: the Quadro will still make a good gaming machine, but it's much better for CAD. Not sure how the Quadro drivers will work for games, though.

Why not sell him your old system, buy a 6800GT, and softmod it to a quadro when you need to do CAD?

If the softmod isn't out yet, it probably won't be long before someone makes it. Besides, a plain-jane 6800GT might even perform better in CAD than your old Quadro4...

________________
Radeon 9700 PRO (o/c 332/345)
AthlonXP 3200+ (Barton 2500+ o/c 400 FSB)
3dMark03: 5,354
November 19, 2004 2:47:41 PM

Considering you won't have $$$ issues:

Option 1.5:
A64 FX55/3500-4000+ or P4 EE 3.4...
2GB PC3200 RAM
nv 6800 GT - or better, X800XT PE
sweet RAID setup, with 4x Raptor RAID-0 and a 558GB RAID-5.
Total UPGRADE cost: I don't know :tongue:

:tongue: Very funny, Scotty. Now beam down my clothes. (http://www.geocities.com/priyajeet/fing.jpg) :tongue:
November 19, 2004 3:04:06 PM

It's been proven RAID 0 doesn't make games run or load any faster.

Asus p4c800 Deluxe,1 Gig Mushkin PC3200 Dual Channel Level II V2,Pentium 4 3.0 512k 800fsb HT, Thermaltake Xaser III, Thermaltake Spark 7+, Sound Blaster Audigy2 ZS Platinum Pro, eVGA GeForce 6800 GT
November 19, 2004 6:48:18 PM

But still cool to have: installing, burning those DVDs, ripping, etc.

:tongue: Very funny, Scotty. Now beam down my clothes. (http://www.geocities.com/priyajeet/fing.jpg) :tongue:
November 19, 2004 7:32:06 PM

I would prefer option 2, no reservations at all. The dual processors are awesome for multitasking, and slowly even games are becoming multithreaded. Besides, a single Xeon is plenty for the latest games, especially with 2 gigs of RAM.
The Quadro FX 4000 is an awesome card, good for games also.

\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 19, 2004 8:14:48 PM

Go with option 1. Xeon rigs do not make good gaming machines.

Only an overclocker can make a computer into a convection oven.
November 19, 2004 9:03:07 PM

Why do you say that, addiarmadar?

A Xeon is just a P4 with lots of cache... at 3 GHz, it should be a fine gaming processor.

________________
Radeon 9700 PRO (o/c 332/345)
AthlonXP 3200+ (Barton 2500+ o/c 400 FSB)
3dMark03: 5,354
November 20, 2004 2:33:17 AM

OK, thanks for the input here, it's been helpful so far. So, here are the points that I'm interested in...

First, softmod worked for the Geforce3 -> Quadro3, and that's the last time I've specifically heard of it working. I thought the later Quadros were actually different chips? That's just what I thought I'd picked up, so fill me in if you have more info about it.

I'm interested in why people say RAID-0 doesn't make game levels load faster. I could only see that being the case if the system was spending 100% of its processor time decoding and sifting the loaded data into the actual in-game memory structures, which seems unlikely to me, but who knows. My RAID setup is there for my video editing and 3D apps, with which it does an amazing job. I can view a RAW 1920x1080 AVI video at almost full framerate. Plus, all apps installed to the RAID drive load insanely faster than they used to on a comparable single-drive system. For example, Photoshop 6 pops up in 5 seconds. I'm a RAID enthusiast, so if you know where you got that info about games not loading faster, I'd be interested in reading into it. (Oh, and obviously they won't RUN faster... hopefully the HDD should not be accessed at all during gameplay. That's what all the RAM is for! :smile: )
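
For a rough sense of the bandwidth that RAW video claim implies, here is a quick back-of-the-envelope sketch; the 24fps and 3 bytes/pixel figures are assumptions, since the post doesn't state them:

```python
# Sustained throughput needed for uncompressed 1920x1080 playback.
# Assumptions: 24fps and 3 bytes/pixel (24-bit RGB); the post states neither.
width, height = 1920, 1080
bytes_per_pixel = 3
fps = 24

mb_per_sec = width * height * bytes_per_pixel * fps / 1e6
print(f"~{mb_per_sec:.0f} MB/sec sustained")  # ~149 MB/sec

# A single circa-2004 PATA drive sustains maybe 40-60 MB/sec,
# so near-full-framerate playback plausibly needs a striped array.
```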

Athlon XP 3200+ is 2.2GHz, right? I currently can't reach that because of my motherboard, but from what I've read, 2.4GHz is about the average capability, and my proc has a 12.5 multiplier (I think it does, at least... if I remember 8 months ago when I first messed with it), which makes 2.5GHz the ceiling with my RAM. So, we're talking a 2.3GHz to 2.4GHz XP for option 1, vs. 3GHz Xeons. (I'm aware multiproc won't help much - it's there for rendering.) I figure 2.4GHz is only about 10% faster than what I have, so if my current problem is CPU, I'm worried it wouldn't be solved with option 1. If I remember the benchmarks I've seen correctly, a R9800 Pro should NOT have to play Doom 3 and Far Cry at 800x600 or below to stay over 15fps.
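
A quick sketch of the multiplier math behind that ceiling (core clock = multiplier x FSB), using the 12.5 multiplier and DDR400 bus from the post above:

```python
# Core clock = multiplier x FSB, using the figures from the post above.
multiplier = 12.5
fsb_mhz = 200  # PC3200/DDR400 memory bus

ceiling_ghz = multiplier * fsb_mhz / 1000
print(f"Ceiling: {ceiling_ghz} GHz")  # 2.5 GHz

current_ghz, target_ghz = 2.18, 2.4
print(f"Gain over current: {(target_ghz / current_ghz - 1) * 100:.0f}%")  # ~10%
```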

Addiarmadar says Xeons don't do as well in games. If their low 133MHz FSB becomes an issue, I could believe that, but they are crazy fast for most other things, so I'm wondering if there are some benchmarks I should search for? (Also, I paid big $$ for those Xeons. I could see them losing pretty badly if compared against equally priced processors. But I already have the Xeons, hence I don't need to decide between them and an Athlon64 FX55 or something.)


OK, now about the Quadro vs. Geforce issue, which is REALLY at the heart of this. Most important issue for me. Cleeve says the Quadro 4000 is a 12-pipe card and the 6800 GT is a 16-pipe card. Do you mean the number of pixel shader pipes, or the number of pipeline stages in the GPU datapath (similar to the Athlon having fewer pipeline stages than the P4)? I'm going to start scouring Anandtech (Tom's doesn't seem to cover pro cards much) and the nVidia site for info about it, though I didn't find much before. I lean towards the Quadro option because I do a lot of Maya work. That's what I hope to do professionally in a year or two, and the Quadro would help a lot with my learning process, in addition to (ironically) being the cheaper option, thanks to my roommate.


So what do you guys think? 2.4GHz Athlon XP Mobile: good enough for games, or at least better than 3GHz Xeons? And, more about Quadro FX 4000 vs. Geforce 6800 GT?
November 20, 2004 2:38:49 AM

One more thing... A (potentially pricey) motherboard upgrade in my workstation might allow me to overclock those Xeons quite a bit, since their multipliers are unlocked in the downward direction and their socket is compatible with the new 200MHz-FSB Xeon boards. (So: 200MHz FSB, lower multiplier = much faster Xeons in the future, maybe.) It might be something to think about. Also, I won't be bothering with PCI Express yet. No decent cards are out, so there's no reason to.
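
The same multiplier x FSB arithmetic sketches what that motherboard upgrade could buy; the stock multiplier of 23 is inferred from 3.06GHz / 133MHz, and the lowered multipliers are hypothetical examples:

```python
# Xeon overclock sketch: core clock = multiplier x FSB.
# Stock multiplier inferred from 3.06 GHz / 133 MHz = 23; the lowered
# multipliers below are hypothetical examples, not tested settings.
new_fsb_mhz = 200

for mult in (15, 16, 18):
    print(f"x{mult} at {new_fsb_mhz} MHz FSB -> {mult * new_fsb_mhz / 1000:.1f} GHz")
# x15 -> 3.0 GHz, x16 -> 3.2 GHz, x18 -> 3.6 GHz
```
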
November 20, 2004 3:04:29 AM

I can't link you, but I specifically remember reading a review with 2 raptors in RAID stripe vs. 1 normal, and there was not a significant difference when it came to load time in games. Google it.
November 20, 2004 3:09:49 AM

But with dual chips, onboard RAID would perform better.

\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 20, 2004 3:32:42 AM

I'll look for that review. They may have made the mistake of maxing out the PCI bus, which could happen with Raptors. Though, I did install UT2004 to my RAID drive, and noticed that it did not load UT maps much faster than my single PATA drive game machine, so I'm still curious what their findings were. (that's the only game I've installed to my workstation, and it was always the server, so it's not conclusive.)

Having dual procs does make RAID faster, because with them comes PCI-X, and with it roughly a gigabyte per second of transfer rate to the hard drive system, instead of a maximum of 100-ish MB/sec with plain PCI. Maybe we'll see PCI Express RAID cards come out soon.
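
The bus arithmetic behind that claim, as a sketch; the ~60MB/sec per-Raptor rate is an assumption taken from the single-drive score cited later in this thread:

```python
# Theoretical bus ceilings: bandwidth = bus width (bytes) x clock (MHz).
pci_mb_s  = (32 // 8) * 33    # plain PCI: 32-bit, 33 MHz  -> ~133 MB/sec
pcix_mb_s = (64 // 8) * 133   # PCI-X: 64-bit, 133 MHz     -> ~1066 MB/sec
print(pci_mb_s, pcix_mb_s)

# Assuming ~60 MB/sec sustained per Raptor (the single-drive score cited later):
aggregate = 4 * 60
print(f"4x Raptor stripe ~{aggregate} MB/sec: saturates plain PCI, fits in PCI-X")
```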

EDIT: Is this the one? http://www.overclockers.com/articles1063/

I'm surprised there wasn't more of a difference. Maybe the hard drive isn't the limiting factor when loading game levels anymore. I do notice that he's maxing out his PCI bus a little, with some of those results coming in close to 120MB/sec, but they're still higher than his single Raptor's 60/36/50 scores, so if hard drive performance were a factor he'd have seen improvement. Thanks for pointing me to that, it was interesting.
Edited by grafixmonkey on 11/20/04 00:04 AM.
November 20, 2004 4:15:48 AM

"but still cool to have. installing, burning those dvds. Ripping, etc."


Agreed, but if you have to cut some corners, leaving out the 4x Raptor array can save a little bit of money.

Asus p4c800 Deluxe,1 Gig Mushkin PC3200 Dual Channel Level II V2,Pentium 4 3.0 512k 800fsb HT, Thermaltake Xaser III, Thermaltake Spark 7+, Sound Blaster Audigy2 ZS Platinum Pro, eVGA GeForce 6800 GT
November 20, 2004 6:59:04 AM

I already have the 4x Raptor array. In fact, I have everything in that dual Xeon system except for the new pro card I'm debating.

I ran across something that makes me think the Quadro FX 4000 might only have 8 pixel shader pipes. I really wish I could find this info on the nVidia site; this is driving me nuts. I know having only 8 pixel shader pipes would hurt games performance. 12 might be fine, since it's not too far below 16, but 8 I don't like the sound of. Is it possible you were looking at the line that says "12-bit subpixel precision" and thinking 12 pixel pipes?
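
For rough intuition on why the pipe count matters, pixel fill rate scales roughly as pipes x core clock. A sketch with hedged numbers: the 6800 GT/Ultra figures (16 pipes at 350/400MHz) are the commonly published ones, and the Quadro FX 4000 clock of ~375MHz is an assumption from contemporary reports, not from nVidia:

```python
# Pixel fill rate ~= pixel pipes x core clock. GT/Ultra figures are the
# commonly published ones; the Quadro FX 4000 clock (~375 MHz) is an
# assumption from contemporary reports, and its pipe count is the open question.
cards = {
    "6800 GT (16 @ 350 MHz)":     (16, 350),
    "6800 Ultra (16 @ 400 MHz)":  (16, 400),
    "Quadro FX 4000 if 12 pipes": (12, 375),
    "Quadro FX 4000 if 8 pipes":  (8, 375),
}

for name, (pipes, mhz) in cards.items():
    print(f"{name}: {pipes * mhz / 1000:.1f} Gpixels/sec")
# 8 pipes would land well below the GT; 12 pipes comes much closer.
```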

I just don't want to spend a lot of money, end up gimping myself on games, and still have to run Far Cry etc. below 1024x768. Traditionally, it hasn't been easy to find a pro GL card that is also good at playing the most modern games. Maybe it's still not easy.

Incidentally, this whole deal with the Quadro card possibly having a different number of pixel shader pipes than the Geforce cards is starting to reinforce my thinking that softmods aren't a good idea. Maybe there are real differences in the cards' architectures, and the softmods cover them up with software emulation?

I think I'll just send a support request to nVidia asking how many pixel shader pipes there are and what the clock speeds are for the Quadro 4000. That should give me the info I need.
November 20, 2004 7:20:47 AM

You might have told this story before... but might I ask why you need a Quadro over a standard GeForce?

What workstation apps do you use? Why do you need to play games on a workstation PC?

You must of course know that games crash computers, and risking the integrity of the computer that your professional work is on makes absolutely zero sense, right?



3DMark03: http://service.futuremark.com/compare?2k3=2216718
3DMark05: http://service.futuremark.com/compare?3dm05=127163
November 20, 2004 7:21:06 AM

Yes, that's the page I saw. Closed it before I thought to link to it. What's weird is it says it's a Leadtek Quadro, and I have only seen PNY. (i.e. I can't see that the product shown actually exists! Probably why I closed it before...)

I did find this: http://www.xbitlabs.com/news/video/display/200404281453... (xbitlabs press release)
It seems to confirm that the 4000 is a different architecture than the 6800, and not something you could flash to or achieve through softmod.
November 20, 2004 7:42:16 AM

Quote:
You might have told this story before... but might I ask why you need a Quadro over a standard GeForce?

What workstation apps do you use? Why do you need to play games on a workstation PC?

You must of course know that games crash computers, and risking the integrity of the computer that your professional work is on makes absolutely zero sense, right?

I use Maya, Premiere Pro, and After Effects on the workstation. My professional work is backed up on a RAID-5 array, separate from the OS. I don't know why you say that games crash PCs; I can't remember my gaming PC ever being notably unstable unless it's from overclocking. Sure, I won't be firing up a round of Doom 3 while I have a Maya scene open, but having games installed isn't going to screw up my OS by their very presence. Maybe some other things that tend to happen on a gaming PC might, but I don't tend to use file sharing programs other than Azureus, and I don't go after crackz and warez, so it really should be just fine.

My current setup is a dual-screen workstation with a dated Quadro4, one screen of which is on a KVM switch with my gaming PC. The problem is my gaming PC can't seem to perform up to par, and I'm not really sure why. It's an XP-Mobile 2500 running at 2.18GHz, with a Radeon 9800 Pro and 1GB of RAM, but it can't seem to pull more than 20fps at 800x600 in Doom 3, or 10-15fps at 640x480 in Far Cry. High quality settings, of course; I like the detail too much to turn it down.

So upgrade scenario 1 is I buy a new graphics card and motherboard for the game PC, hopefully getting it up to par, but the 9800 Pro is supposed to perform MUCH better than it does, and I don't want to end up with a 6800 GT that still cranks out the same speed because it's CPU limited. Upgrade scenario 2 gets me a new pro card for my workstation, sells the game machine to my roommate to pay for the pro card, and hopes that the pro card is halfway decent at games. I would miss having separate PCs, if only for being able to look up web pages and type in Gaim without having to leave my game. That, and trying to run a dual-screen gaming system doesn't work very well, because lots of games don't behave and I end up having to switch between single display and horizontal span all the time.

(Edit: added quote because it looked confusing what I was responding to.) Edited by grafixmonkey on 11/21/04 04:35 PM.
November 20, 2004 7:52:41 AM

Quote:
There still might be a quadro flash created by somebody soon for that card. Beats paying $2,195 dollars for it

The Quadro FX 4000 I can get for $1,470 to $1,500. It costs me $300 or so total if I sell my game system to my roommate instead of building him a new system parts-wise off of Newegg. I could also get a 6800 GT and stick it in the Athlon Mobile system, and keep using my outdated Quadro4 for workstation apps; hey, it's not so bad a card. But the Quadro 4000 would give me much better workstation performance, so I lean towards it, because I often find myself dealing with 2fps navigation in complex scenes and having to spend time setting up display layers to limit the geometry on screen. I just don't want to end up paying lots of money here and ending up with a gimped system in the end, because that's exactly what happened with the 9800 Pro, and it was a huge disappointment.

I'm going to look over my Athlon64 upgrade scenario a bit. I have a hard time believing I couldn't upgrade to one of those and a 6800 GT for $1,500, but that's what Excel is telling me.


(EDIT): Scratch that; what I mean is I can't sell as many parts to my roommate that way, so I can't make up for the cost of the new stuff as much as I can with the pro card (because with that, I can sell the whole other system).
Edited by grafixmonkey on 11/20/04 03:55 AM.
November 21, 2004 8:33:22 PM

OK, an update. This is why I don't trust the "8 pixel pipes" information for the Quadro 4000. First off, nVidia doesn't seem to have that info in any of their product or technical briefs, and manufacturers usually just copy their technical info from those brochures. Also, the product brochure for the FX 4000 from PNY is a "cover-all" brochure that details the entire Quadro FX line, some of which is the Geforce FX core, and some of which is some sort of hybrid 6800-plus-something-else core referred to as the NV40GL. There's no way I can trust a paragraph that says "8 pixel pipes" when it was written by PNY sales staff to cover all their Quadro FX cards, when there are different cores that nVidia simply names the same, with no footnotes used to differentiate features. Leadtek doesn't even list workstation cards on any of their sites except the Taiwan one, doesn't seem to have cards actually available yet, and could very well have either just spit out the same information they read from the PNY brochure, or done the same thing PNY may have done: using older Quadro FX information to cover the new and different NV40GL core.

So I have a question out to nVidia support, and I'll post what they say back here for completeness once they respond. Thanks again for the helpful responses!
November 21, 2004 8:57:03 PM

NVIDIA Quadro FX 4000 Features
Ultra-High-End Features & Benefits


12-Bit Subpixel Precision
3x that of the nearest competitive workstation graphics, 12-bit sub-pixel precision delivers high geometric accuracy, eliminating sparkles, cracks, and other rasterization anomalies.


256-Bit Memory Interface
Delivers the industry’s highest memory bandwidth (38.4GB/sec.) for blistering data transfer. Support for the world’s fastest GDDR3 memory with lower power consumption than previous generation systems. Only supported in the NVIDIA Quadro FX 4400, 4000 and 3400 GPUs.


32-Bit Floating Point Precision
Sets new standards for image clarity and quality through 32-bit floating point capabilities in shading, filtering, texturing, and blending. Enables unprecedented rendered image quality for visual effects processing. Only supported in the NVIDIA Quadro FX 4400, 4000 and 3400 GPUs.


Advanced Color Compression, Early Z-Cull
Improved pipeline color compression and early z-culling increase effective bandwidth and improve rendering efficiency and performance. Only supported in the NVIDIA Quadro FX 4400, 4000 and 3400 GPUs.


AGP 8X
Provides double the bandwidth of AGP 4X—2.1GB/sec. vs. 1.1GB/sec. AGP 8X enables more complex models and detailed textures, creating richer and more lifelike environments. Uninterrupted data flow allows for smoother video streaming and faster, more seamless gameplay.


Cg High-Level Graphics Shader Language
Cg—“C” for graphics—is a high-level, open-standard programming language for OpenGL that takes advantage of the power of programmable GPUs. NVIDIA Quadro FX programmable graphics pipelines leverage high-level shading languages to enable the creation and integration of real-time photorealistic effects into 3D models, scenes, and designs. This represents a major leap forward in ease and speed for the creation of real-time, realistic graphics within MCAD, DCC, and scientific applications.


Dedicated Video Processing Engine (VPE)
The high-definition enabled VPE provides the highest quality video with record low CPU utilization. Video playback is smooth, the images clear and without artifacts, and the delivery at breathtaking frame rates.


Full 128-Bit Precision Graphics Pipeline
Enables sophisticated mathematical computations to maintain high accuracy, resulting in unmatched visual quality. Full IEEE 32-bit floating-point precision per color component (RGBA) delivers millions of color variations with the broadest dynamic range.


Full-Scene Antialiasing (FSAA)
Up to 16x FSAA dramatically reduces visual aliasing artifacts or “jaggies” at resolutions up to 1920x1200, resulting in highly realistic scenes.


Hardware 3D Window Clipping
Hardware accelerated clip regions (data transfer mechanism between a window and the frame buffer) improves overall graphics performance by increasing transfer speed between color buffer and frame buffer.


Hardware-Accelerated Pixel Read-Back
Greater than 1.0GB/sec. pixel read-back performance delivers massive host throughput, more than 5x the performance of previous generation graphics systems. Only supported in the NVIDIA Quadro FX 4400, 4000 and 3400 GPUs.


Highest Workstation Application Performance
Next-generation architecture enables over 2x improvement in geometry and fill rates with the industry’s highest performance for professional CAD, DCC, and scientific applications. Only supported in the NVIDIA Quadro FX 4400, 4000, and 3400 GPUs.


High-Performance Display Outputs
400MHz RAMDACs and dual DVI digital connectors drive the highest resolution digital displays available on the market. The Quadro FX 4000 SDI has one DVI connector.


Next-Generation Vertex & Pixel Programmability
The NVIDIA Quadro FX ultra-high-end and high-end GPUs introduce infinite length vertex programs and dynamic flow control, removing the previous limits on complexity and structure of shader programs. With full support for Vertex and Shader Model 3.0, NVIDIA Quadro FX 4400, 4000, and 3400 GPUs deliver sophisticated effects never before imagined for real-time graphics systems.


NVIDIA High-Precision Dynamic-Range (HPDR) Technology
Sets new standards for image clarity and quality through floating point capabilities in shading, filtering, texturing, and blending. Enables unprecedented rendered image quality for visual effects processing. Only supported in the NVIDIA Quadro FX 4400, 4000, and 3400 GPUs.


NVIDIA Quadro Unified Memory Architecture
Allows for superior memory management, which efficiently allocates and shares memory resources between concurrent graphics windows and applications.


nView Multi-Display Technology
The nView hardware and software technology combination delivers maximum flexibility for multi-display options, and provides unprecedented end-user control of the desktop experience. NVIDIA GPUs are enabled to support multi-displays, but graphics cards vary. Please verify multi-display support in the graphics card before purchasing.


Powerwall
NVIDIA's patented single-system powerwall technology allows any application to be projected on a dual-channel powerwall with sophisticated edge blending in order to achieve uniform luminosity. Powerwall works transparently with any application. This feature is supported in the NVIDIA Quadro FX 4400G, 4000, 3400, 3000G, 3000, and 1100 models.


PCI Express Certified
Designed to run perfectly with the next-generation PCI Express bus architecture. This new bus doubles the bandwidth of AGP 8X delivering over 4GB/s in both upstream and downstream data transfers. Only supported in some NVIDIA GPUs. Please check product details for bus information.


SLI Technology
The NVIDIA Scalable Link Interface technology enables intelligent and transparent scaling of professional application performance. Designed for PCI Express, and supported in the NVIDIA Quadro FX 4400, 3400, and 1400 GPUs.


Proven Workstation Graphics Architecture
The NVIDIA Quadro FX architecture takes application performance to new levels by featuring parallel vertex engines, a radically new line engine, the industry’s first on-chip vertex cache, and fully programmable pixel pipelines coupled to a high-speed graphics DDR DRAM bus.


Quad Buffered Stereo
Offers enhanced visual experience for professional applications that demand stereo viewing capability.


Rotated Grid Full-Scene Antialiasing (RG FSAA)
The rotated grid FSAA sampling algorithm introduces far greater sophistication in the sampling pattern, significantly increasing color accuracy and visual quality for edges and lines, reducing “jaggies” while maintaining performance. Only supported in the NVIDIA Quadro FX 4400, 4000, and 3400 GPUs.


Dual Dual-Link Digital Display Connectors
Dual-link TMDS transmitters support ultra-high-resolution panels, which results in amazing image quality, producing detailed photorealistic images.


Unified Driver Architecture (UDA)
Part of the NVIDIA Forceware unified software environment (USE). The NVIDIA UDA guarantees forward and backward compatibility with software drivers. Simplifies upgrading to a new NVIDIA product because all NVIDIA products work with the same driver software.
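
As a sanity check on the 38.4GB/sec memory bandwidth figure in that spec sheet, bandwidth = bus width x effective memory clock; the 600MHz GDDR3 clock below is inferred from the quoted number, not stated in the brochure:

```python
# Memory bandwidth = bus width (bytes) x effective memory clock.
# The 600 MHz GDDR3 clock (1200 MHz effective, double data rate) is
# inferred from the quoted 38.4 GB/sec, not stated in the brochure.
bus_bits = 256
effective_mhz = 600 * 2  # DDR: two transfers per clock

gb_per_sec = (bus_bits / 8) * effective_mhz / 1000
print(f"{gb_per_sec:.1f} GB/sec")  # 38.4 GB/sec, matching the spec
```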

\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 21, 2004 9:25:34 PM

I've read through that five times already. Where does it say how many pixel shader pipes are in the FX 4000? That's one of the most important factors in the huge performance increase from GeforceFX to Geforce 6, correct?

Unless I'm completely and repeatedly missing something every time I read the specs, the info is not there. I'm guessing you're seeing "12-bit subpixel precision," which is something else. (That would be the feature that would hopefully stop the edges I select, when working on SubD surfaces in shaded mode, from popping between selected and deselected colors when I move the camera. And hopefully also stop wireframes displayed on top of some kinds of surfaces from speckling and disappearing under certain circumstances.)
November 21, 2004 9:45:17 PM

How does the 4000 compare to the Fire GL X3 or the V7100? These ATI pro cards are around $900.

ATI FIRE GL X3-256 and FireGL V7100 Specs:

Outstanding high-end workstation performance and quality utilizing 16 pixel pipelines and 6 geometry engines
256 MB GDDR3 unified graphics memory
Dual display support via two DVI outputs
Dual link support for ultra-high resolution 9 Mpixel displays
Stereo 3D connector with quad-buffered support
Optimized and certified for professional workstation applications based on OpenGL® and Microsoft® DirectX® 9.0
Windows® and Linux® support
Three year warranty with toll-free advanced technical support


\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 22, 2004 3:47:18 AM

It sounds promising performance-wise, but that option's out because it's ATI. Their support for dual-screen is horrible (actually, in fact, nonexistent; it defaults back to the Win XP dual-screen capability, which is horrible). I've tried dual screen with ATI on a friend's computer, and the nView software gives so much more and lets me do things so much better and faster because of its special options that even a big gain in card performance would be completely ruined by having to deal with XP's built-in dual screen. My friend actually ended up returning the ATI card he bought alongside his second monitor, and ate the 15% restocking fee on a R9600 to get an FX 5900XT.

Plus, Windows in general has a serious flaw in its dual-screen capability, where OpenGL can't latch on to both screens for some reason. It was an issue a couple of years ago; I'm not sure if it still is, but it'd be a $900 gamble.

I appreciate the suggestion though, it's good to have options! I hadn't investigated the FireGL line at all (because of the dual screen issue).
November 22, 2004 3:50:45 AM

What is the dual screen problem you're referring to? I am working in AutoCAD right now, and have TV running on a second 21" screen.
My card is an ATI X800XT PE, with dual DVIs.
I use ATI's HydraVision to manage the displays, and I find it rather intelligent. It definitely is extremely customizable.

\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 22, 2004 5:34:40 AM

And hence, so begins another skirmish in the great brand wars....

Asus p4c800 Deluxe,1 Gig Mushkin PC3200 Dual Channel Level II V2,Pentium 4 3.0 512k 800fsb HT, Thermaltake Xaser III, Thermaltake Spark 7+, Sound Blaster Audigy2 ZS Platinum Pro, eVGA GeForce 6800 GT
November 22, 2004 7:43:20 AM

Quote:
And hence, so begins another skirmish in the great brand wars....

No skirmish. ATI just hasn't bothered with good dual screen support, didn't think it was important or something. At least, in the past. That may have changed.

So, here's the deal. nVidia has nView, which tricks Windows into thinking all displays combined are a single display of the combined resolution (like 2560x960 = 1280x2 x 960). ATI just lets Windows see that there are two display devices (so there isn't really any "ATI DualView"; DualView is just WinXP's native handling of dual screens). In the past, whenever I let Windows provide the dual-screen functionality, Maya would be unable to use OpenGL on the screen it didn't initialize to. I.e., start Maya, and it pops up on the left screen. Put a child window like Hypershade on the right-hand screen, and it will no longer update. It accepts interactive mouse and keyboard commands, and updates once put back on the left screen, but will not draw anything on the right screen. I've had this happen in Win2K and WinXP Pro, both with a "1xAGP + 1xPCI" video card configuration and with the Quadro4 I now use. The Quadro4 worked in Horizontal Span mode, but not in normal WinXP dual monitor mode.

Back then, I read up on the issue and found that Windows had a limitation where OpenGL was incapable of behaving properly with dual monitors; something about it only being able to address one display device internally. It worked on Linux etc. but not Windows, so I think it was a Windows limitation, or maybe MS refused to modify their display architecture to make it compatible with the way OpenGL worked, i.e. a design conflict rather than a bug. So, a couple of years ago, nView was necessary for dual-screen Maya work, if not all dual-screen OpenGL work. Basic sliders and GUI elements could go on the second screen, but that's it.

So, to make sure my info is accurate, I just did a test with Maya 6.0. I set Windows back into DualView mode instead of nView, and made a quick scene that brought the Quadro4 down to 7fps in high quality rendering mode and 28fps in smooth shaded mode.

Apparently Maya WILL now update its windows when they cross the DualView screen border, but with major problems. Those fps stayed consistent when the Maya window was on one display or the other, but went down to 3fps and 7fps when the OpenGL window spanned the border. Also, whichever display Maya opened on would do the lighting correctly, and the other one would show the objects as all white. (This happened consistently several times, and then stopped happening, so it's apparently intermittent; at first I could get it to happen every time Maya started, and it wouldn't always go away.) Here is a jpg of the problem, which I'll leave up for a few weeks and then take down: http://bkgrafix.net/filepile/dualviewmaya.jpg

Third problem I noticed: Maya wanted to crash every time it started until I turned DualView off, opened Maya, used alt+space -> Move to reposition it on the screen, and then switched back to DualView. (From then on it opened fine, so maybe this wouldn't happen to someone who had never used Maya in nView.)

None of those problems happen at all in nView's Horizontal Span mode. I tested a 3D scene spanning both displays, and it got the same fps as when not spanning the gap, without any anomalies (shading was correct no matter where the window was placed). And those fps reassured me that I'll be able to use the new Quadro if I get it. I should also test whether fps are higher for the same scene in nView or in DualView, just out of curiosity. Maybe tomorrow.


Anyway: serious technical problems with Windows XP DualView, which I think is what ATI uses. ATI drivers have a clone mode and a mode that displays the hardware video overlay on the second screen, but I am not aware of anything like nView's Horizontal Span mode. (Have they added one? Because I haven't seen one yet.)

From there I could get into all the awesome efficiency improvements you get with nView, like keyboard combos for sending windows to the next display or next desktop, the magnifier, and (for non-Maya users) stuff like transparent window dragging, multiple desktops, and z-toggling with the middle mouse button, which I find invaluable.

I should mention, I think there are third-party programs that try to do what nView does. My friend tried several of them back when he bought his R9600 and never found one that really worked well, and that's when he sent the card back. So maybe you can make an ATI card work with 3D graphics apps in dual screen, but my info says no. If that ever changed, I'd certainly be willing to consider FireGL. Also, back when I was having the "no updating" problem in DualView, I was taking an OpenGL programming class, and my class projects also experienced the "no update" issue, so I'm inclined to say it was not just Maya.


Anyway, back to the Quadro gaming thingy: still waiting for a response, don't really expect it until Monday night or Tuesday afternoon at best. Feel like filling me in on any new features ATI has added to HydraVision?
Edited by grafixmonkey on 11/22/04 03:49 AM.
November 22, 2004 10:51:20 AM

HydraVision from ATI controls dual displays, and you get awesome control.
You can span across two monitors as one large image, or use them as two separate displays.
They can each have different resolutions and refresh rates, to boot.
You can have child menus come up on either monitor.
You have profiles with different settings, so you can save multiple setups and switch depending on your application with just one click.

I think ATI has the ultimate dual display card and s/w. You really should check it out.

\\//__________________________________
And the sign says "You got to have a membership card to get inside" Huh
So I got me a pen and paper And I made up my own little sign
November 22, 2004 5:47:15 PM

So they did add spanning? Awesome! I have ATI on my gaming machine right now (no dual display necessary, so it made sense) so I can use it to check out whatever new stuff has been added for dualscreen.

Now, is HydraVision built in to the drivers? Or is it something extra that you have to download and I just haven't noticed it? Because at one point I was actually trying to get ATI dualscreen to work, and maybe assuming HydraVision was inside the drivers was the mistake.
November 22, 2004 5:48:38 PM

HydraVision has been around for years... I imagine the program he used with that R9600 card was the same. I've had... heh... dual 12s on an ATI machine years ago.
Anyway, he's talking specifically about rendering in 3D environments with OpenGL across multiple screens (which I know nothing about, but I assume is the basis for his problems, and which hasn't been addressed).
November 22, 2004 6:27:01 PM

The basis of the problem is the desktop spanning thing. Windows has to think there is only one monitor, draw to it as if it were one screen, and let the graphics card divide the image between its separate RAMDACs. If Windows thinks there are two monitors active, then anything OpenGL starts having problems. So apparently HydraVision now has a spanning mode similar to nView's, which fixes that and makes ATI dualscreen viable. Whether spanning mode was always there and I just never got HydraVision installed correctly, or it was added in the last six months, I don't know, but it doesn't matter much because it's available now and that's what counts.
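
One way to see what spanning changes is to ask Windows how many monitors it thinks exist: in span mode the driver reports one wide display, in DualView two. A minimal Windows-only sketch using the Win32 EnumDisplayMonitors call (the resolutions in the comments are just this thread's 1280x960 example):

```python
# Windows-only sketch: count the monitors Windows itself sees.
# In horizontal-span mode the driver reports ONE wide display
# (e.g. 2560x960); in DualView it reports TWO (e.g. 1280x960 each).
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

MonitorEnumProc = ctypes.WINFUNCTYPE(
    wintypes.BOOL,        # return TRUE to continue enumerating
    wintypes.HANDLE,      # HMONITOR
    wintypes.HANDLE,      # HDC
    ctypes.POINTER(wintypes.RECT),
    wintypes.LPARAM,
)

monitors = []

def record(hmonitor, hdc, rect_ptr, lparam):
    r = rect_ptr.contents
    monitors.append((r.right - r.left, r.bottom - r.top))
    return True

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(record), 0)
print(f"Windows sees {len(monitors)} monitor(s): {monitors}")
```
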
November 22, 2004 11:07:44 PM

Anyone seriously considering or using multi-monitor should avoid both ATI's and nV's integrated software; Matrox does multi-mon far better.
There are lots of third-party apps out there that offer a LOT more functionality, even things like spanning to other computers and multiple desktops on one computer.
Both nView and HydraVision are getting better, but neither is great for someone who relies on multi-monitor flexibility.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
November 24, 2004 12:18:56 AM

I'd be interested in checking out some of these programs. Do you have a favorite I should try first? (Can't compute at all without multi-screen anymore!)


I have a response from nVidia:
Quote:
QuadroFX has the same shader architectures GeForce 6800. FX4000 reference design is in between 6800GT and 6800Ultra in game performance. Most game level/map designers use QuadroFX graphics for development and QA on GeForce family products.
Please check with PNY for final card specifications.

This indicates to me that I should be getting the Quadro, and all I will lose is (possibly) the ability to overclock the card, because I know I won't be pumping up the clock speed on a $1,500 graphics board. But if it's between GT and Ultra performance, I'm already getting slightly more than a GT, and overclocking a GT all the way to Ultra speeds is a bit of a gamble anyway, right?

So decision made! I will miss my nice quiet Antec Sonata case. My workstation is not a quiet machine.
November 24, 2004 2:20:19 AM

Well, the one I used to use, and still hear great things about, is 'Ultra-Mon'.

Also, MaxVista is one that allows you to use another computer. However, I doubt it's good for tougher apps; it's more of a specialty solution.

A lot of applications (like PowerPoint and Photoshop) have add-ons that extend support beyond the original concept, and beyond basic Windows support.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil: