We all know that gaming and workstation graphics cards employ the same hardware, differentiated by slight tweaks, drivers, and validation. We also know desktop cards usually perform awfully in professional apps. Does the reverse hold true as well?
A while back, our German team benchmarked a total of 40 graphics cards (12 professional boards and 28 gaming-oriented cards) at the request of our readers. We already covered some of the specific features that separate workstation hardware from the stuff most of us use on the desktop in AMD FirePro W8000 And W9000 Review: GCN Goes Pro, so we won't rehash all of that. As you probably already know, though, the more expensive workstation products ship with drivers specifically optimized for certain applications. The result is typically better performance in those workloads than anything a GeForce or Radeon could achieve. Moreover, Quadro and FirePro boards are dutifully validated in the software important to professionals, assuring not just compatibility, but also reliability in always-on environments.
With that said, a number of our readers have asked us what happens when you turn things around and use graphics hardware designed for very high-end tasks to play games. So, we set out to evaluate the current state of affairs using a number of synthetic and real-world tests.
Despite DirectX's technical limitations in professional applications, it continues to grow more popular in certain segments. Autodesk's Inventor is a good example of this. We thought it'd be interesting to compare how nearly-identical GPUs perform, complemented by their respective drivers. The real question is: are the drivers you download for Nvidia's Quadro cards, along with AMD's Catalyst Pro package for the FirePro boards, optimized only for workstation tasks, or can they handle gaming, too?
Today's experiment involves measuring the performance of workstation-oriented graphics cards in applications you wouldn't normally associate with them. Not only is it interesting to see where these cards fall in relation to each other, but also in comparison to their corresponding desktop-class products.
Personally, I was surprised by the results from one particular product...
Our Graphics Card Test Bed
Once again, we're using the game titles from last year's test bed for the charts section (2012 Graphics Card Charts). You won't find either Nvidia's Quadro 600 or 400 here because they're even slower than the old GeForce GT 440. Even if they served up similar performance to that mainstream card, they'd be unusable for gaming.
| Benchmark System: Hardware and Software | |
|---|---|
| CPU | Intel Core i7-2600K (Sandy Bridge), Overclocked to 4.5 GHz, Shared 6 MB L3 Cache, Hyper-Threading Enabled |
| CPU Cooler | Prolimatech SuperMega + Noiseblocker Multiframe |
| Motherboard | Gigabyte Z68X-UD7-B3, Intel Z68 Express |
| RAM | 2 x 4 GB Kingston HyperX DDR3-1600 |
| System Drive | Kingston V100+ 256 GB SSD |
| Power Supply | Corsair AX1200i, 1,200 W, 80 PLUS Platinum |
| Operating System | Windows 7 x64 SP1 |
| Driver | Catalyst Pro 9.003.3 (FirePro), Catalyst 12.11 Beta (Radeon), GeForce 307.45 WHQL (Quadro), GeForce 310.70 WHQL (GeForce) |
- Can Workstation Graphics Cards Play Games?
- Results: 3DMark 11
- Results: Unigine Heaven
- Results: Unigine Sanctuary
- DirectX 9 Results: Mafia II
- DirectX 9 Results: Crysis 2
- DirectX 11 Results: Aliens Vs. Predator
- DirectX 11 Results: Metro 2033
- DirectX 11 Results: Crysis 2
- DirectX 11 Results: Batman: Arkham City
- DirectX 11 Results: DiRT 3
- DirectX 11 Results: StarCraft II
- DirectX 11 Results: Battlefield 3
- Cumulative Performance Index
- FirePro W9000 And W7000 Do Well; FirePro W8000 Disappoints

Did you even read the article?
I think the last paragraph, (especially the last sentence) more than adequately answers your question.
Probably the former plus they can get away with charging more as business customers need them.
Same with enterprise hard drives. They are pretty much the same as regular hard drives. The only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data, causing the drive to stop responding for a while and the RAID controller to think it went bad, potentially taking down the array when trying to rebuild. An enterprise drive just notes the error and keeps chugging along, asking the array for the corrupted data.
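The behavior described above is often called time-limited error recovery (TLER). A toy Python sketch of the interaction, with the timeout values assumed for illustration (they are not from the comment), might look like this:

```python
# Toy model of the consumer-vs-enterprise HDD behavior described above.
# A consumer drive keeps retrying a bad sector until it recovers the data
# or gives up; an enterprise drive caps its recovery time (TLER) and
# reports the error so the RAID controller can rebuild from parity.

CONTROLLER_TIMEOUT_S = 8.0  # assumed RAID controller patience, in seconds

def read_bad_sector(recovery_limit_s: float) -> str:
    """Simulate a read of an unrecoverable sector on a drive whose
    firmware spends up to recovery_limit_s seconds on error recovery."""
    if recovery_limit_s > CONTROLLER_TIMEOUT_S:
        # The drive is still retrying when the controller gives up on it.
        return "controller marks drive failed, array degrades"
    # The drive reports the error promptly; the array supplies the data.
    return "drive reports error, controller rebuilds data from parity"

print(read_bad_sector(recovery_limit_s=60.0))  # consumer: deep recovery
print(read_bad_sector(recovery_limit_s=7.0))   # enterprise: TLER-capped
```

The point of the cap is not better error correction; it is keeping the drive responsive so the array, not the drive, handles the bad sector.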
Now, while an enterprise hard drive is little more than a firmware change away from a consumer model, making its price appalling, at least these workstation cards actually have some different chips and designs requiring their own manufacturing runs. So their higher price is more justified, since they have to make changes to their line for a relatively small number of cards.
If demand were as high as for gaming cards, their prices would probably be pretty close to those of their gaming counterparts. I'm sort of surprised one of them hasn't just unified its gaming and workstation lines to dominate the workstation market.
Ask Nvidia, and take a look at the NDA. Try to buy one.
Just goes to show how under-utilized the high-end gaming hardware is. If that kind of driver tweaking went into gaming cards, you could probably max out Metro 2033 on an 8800 GTX, eh?
No kidding. I saw the article title and thought "Finally, a good TH article has arrived!"
The errors would be driver-related and aren't bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.
It's very simple, and I've been in this situation before. You have people here complain about $1000 processors, and of course ultra-expensive video cards, because it's not practical for them, or most people.
But, in a work environment, $4000 for something that improves productivity is a bargain. If you can shave time off the development, that saves money, allows for happier customers, and happier employees since they don't wait so long. The cost of the device is insignificant measured against the time it saves.
That's why Intel's $1,000 processors are a bargain for many. You're going to waste $150 an hour having your engineers wait on lowly $300 processors? You'd be a moron. The same goes for ultra-expensive video cards. When you're paying people, time is money, and you shouldn't want to waste either one.
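The break-even arithmetic behind that argument is quick to check. Using the commenter's own figures (a $1,000 versus $300 processor and a $150/hour engineer; the exact numbers are hypothetical):

```python
# Break-even calculation for the pricier processor, using the figures
# from the comment above (hypothetical, for illustration only).
engineer_cost_per_hour = 150.0    # fully loaded hourly rate
cpu_premium = 1000.0 - 300.0      # high-end CPU minus the "lowly" one

# Hours of engineer waiting the faster part must save to pay for itself.
break_even_hours = cpu_premium / engineer_cost_per_hour
print(round(break_even_hours, 1))  # → 4.7
```

In other words, if the faster chip saves less than a single workday of cumulative waiting over its service life, it has already paid for itself.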
And you need it validated. There's no way someone can work with something they don't trust on detailed designs. So, some of it is the cost AMD and NVIDIA need to put into the cards, some of it is simply because it's worth it for a lot of people in the market.
There are plenty of examples. Look at IBM's POWER7+, which annihilates anything Intel makes but costs many times more. Yet, it sells.
If a driver error occurs during a gaming session, at minimum you get an artifact in one of the 30-60 frames drawn in a given second; at worst, the game and/or computer crashes. You lose some time and gain some aggravation, but there's little real-world impact.
With a workstation card in a business environment, a driver error that crashes an app or the machine has a very real cost associated with replicating the lost work. Moreover, while gamers tolerate occasional crashes in exchange for better overall performance, businesses do not. That premium is paid to ensure that your card is certified to work error-free in a specific configuration. That form of testing and driver development is expensive, to be sure. Although I don't know for certain, I suspect the workstation cards have superior warranty coverage, too.
In the case of HDDs, as another commenter pointed out, the difference between desktop and enterprise drives is usually a label and some slightly altered firmware. While that doesn't really justify the increased price, the extra warranty period does. If you use an HDD in a 24/7 server under at least some constant load, that will undoubtedly shorten its life. To afford the longer warranty period on those drives, the manufacturer must charge more for them. You can't increase the warranty, increase the duty cycle of the drive, and then lower the price; you'll just lose money and go out of business. Besides, if HDD manufacturers are making thicker margins on enterprise drives, it allows them to lower prices on consumer drives. If the trade-off is that I can't use a ~$100 HDD with a $600+ RAID card, I'll take that. Software RAID 4 and 5 have both worked great for me.
Hardware failures can cause errors too, you know. Even perfect software and drivers can't save you from those, but at least the workstation card will tell you about the error rather than silently ignore it. Also, gaming cards may not be as accurate as workstation cards, since gaming cards don't usually come with double precision enabled in hardware. Even if the driver supports double precision, it won't do you any good if the hardware doesn't.
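The precision gap that comment refers to is easy to demonstrate. A short Python sketch, using the standard `struct` module to round a 64-bit float down to 32-bit precision (Python's native floats are double precision):

```python
import struct

def to_float32(x: float) -> float:
    """Round a Python float (64-bit) through 32-bit precision and back."""
    return struct.unpack('f', struct.pack('f', x))[0]

# 1e-8 is far below float32's machine epsilon (~1.2e-7) but well within
# float64's (~2.2e-16), so single precision simply drops it.
eps = 1e-8
print(to_float32(1.0 + eps) == 1.0)  # True: lost in single precision
print((1.0 + eps) == 1.0)            # False: kept in double precision
```

Small per-operation losses like this accumulate across millions of operations, which is why double-precision support matters for simulation and CAD workloads far more than for rendering game frames.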