
How Well Do Workstation Graphics Cards Play Games?

We all know that gaming and workstation graphics cards employ the same hardware, differentiated by slight tweaks, drivers, and validation. We also know desktop cards usually perform awfully in professional apps. Does the reverse hold true as well?

A while back, our German team benchmarked a total of 40 graphics cards (12 professional boards and 28 gaming-oriented cards) at the request of our readers. We already covered some of the specific features that separate workstation hardware from the stuff most of us use on the desktop in AMD FirePro W8000 And W9000 Review: GCN Goes Pro, so we won't rehash all of that. As you probably already know, though, the more expensive workstation products ship with drivers specifically optimized for certain applications. The result is typically better performance in those workloads than anything a GeForce or Radeon could achieve. Moreover, the Quadro and FirePro boards are dutifully validated in the software important to professionals, assuring not just compatibility, but also reliability in always-on environments.

With that said, a number of our readers have asked us what happens when you turn things around and use graphics hardware designed for very high-end tasks to play games. So, we set out to evaluate the current state of affairs using a number of synthetic and real-world tests.

Despite DirectX's technical limitations in professional applications, it continues to grow more popular in certain segments; Autodesk's Inventor is a good example. We thought it'd be interesting to compare how nearly-identical GPUs perform when complemented by their respective drivers. The real question: are the drivers Nvidia supplies for its Quadro cards, and AMD's Catalyst Pro package for the FirePro boards, optimized only for workstation tasks, or can they handle gaming, too?
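As an aside: if you repeat this kind of experiment, it's worth confirming which driver each card actually loaded before benchmarking. A minimal sketch for a Windows test bed like ours, calling the stock wmic utility from Python (the script layout and function name are our own):

    # query_gpu_driver.py -- print each GPU's name and driver version on Windows.
    import subprocess

    def gpu_driver_info():
        """Return (name, driver version) pairs for every video controller."""
        out = subprocess.check_output(
            ["wmic", "path", "win32_VideoController",
             "get", "DriverVersion,Name", "/format:csv"],
            universal_newlines=True)
        rows = [line.split(",") for line in out.splitlines() if line.strip()]
        # wmic's CSV columns come back as Node,DriverVersion,Name; skip the header.
        return [(r[2], r[1]) for r in rows[1:] if len(r) >= 3]

    if __name__ == "__main__":
        for name, version in gpu_driver_info():
            print("%s -> driver %s" % (name, version))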

Today's experiment involves measuring the performance of workstation-oriented graphics cards in applications you wouldn't normally associate with them. Not only is it interesting to see where these cards fall in relation to each other, but also in comparison to their corresponding desktop-class products.

Personally, I was surprised by the results from one particular product...

Our Graphics Card Test Bed

Once again, we're using the game titles from last year's test bed for the charts section (2012 Graphics Card Charts). You won't find Nvidia's Quadro 600 or 400 here because they're even slower than the old GeForce GT 440. Even if they served up performance similar to that mainstream card, they'd be unusable for gaming.

Benchmark System: Hardware and Software
CPU: Intel Core i7-2600K (Sandy Bridge), overclocked to 4.5 GHz, shared 8 MB L3 cache, Hyper-Threading enabled
CPU Cooler: Prolimatech SuperMega + Noiseblocker Multiframe
Motherboard: Gigabyte Z68X-UD7-B3, Intel Z68 Express
RAM: 2 x 4 GB Kingston HyperX DDR3-1600
System Drive: Kingston V100+ 256 GB SSD
Power Supply: Corsair AX1200i, 1,200 W, 80 PLUS Platinum
Operating System: Windows 7 x64 SP1
Drivers: Catalyst Pro 9.003.3 (FirePro); Catalyst 12.11 Beta (Radeon); GeForce 307.45 WHQL (Quadro); GeForce 310.70 WHQL (GeForce)

Comments
  • 4 Hide
    MyUsername2 , March 4, 2013 3:16 AM
    Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?
  • 14 Hide
    ipwn3r456 , March 4, 2013 3:19 AM
    Umm, why isn't the newest Quadro K5000 being benchmarked, when the newest FirePro W9000 is being tested here?
  • 13 Hide
    velocityg4 , March 4, 2013 3:31 AM
    MyUsername2: "Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?"

    Probably the former, plus they can get away with charging more since business customers need them.

    Same with enterprise hard drives. They are pretty much the same as regular hard drives; the only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data, causing the drive to stop responding for a while and the RAID controller to think it went bad, potentially taking down the array when trying to rebuild. An enterprise drive just notes the error and keeps chugging along, asking the array for the corrupted data.

    Now, while the enterprise hard drive is little more than a firmware change, making its price appalling, at least these workstation cards actually have some different chips and designs requiring their own manufacturing equipment. So their higher price is more justified, as they have to make changes to their line for a relatively small number of cards.

    If they had demand as high as the gaming cards, their prices would probably be pretty close to their gaming counterparts'. I'm sort of surprised one of them hasn't just unified its gaming and workstation lines and dominated the workstation market.
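To make the failure mode velocityg4 describes concrete, here is a toy model of the timing interaction. Every number in it (the controller's drop threshold, the per-retry cost, and both recovery caps) is an assumption for illustration, not a vendor specification:

    # raid_timeout_model.py -- toy model of desktop vs. enterprise error recovery.
    # All timings below are invented for illustration; real values vary by vendor.

    CONTROLLER_TIMEOUT_S = 8.0  # assumed: controller drops an unresponsive drive

    def drive_busy_time(recovery_cap_s, retries):
        """Seconds the drive stalls retrying a bad sector before replying."""
        return min(recovery_cap_s, retries * 1.5)  # assumed 1.5 s per retry

    def raid_outcome(label, recovery_cap_s):
        busy = drive_busy_time(recovery_cap_s, retries=20)
        if busy > CONTROLLER_TIMEOUT_S:
            return "%s: busy %.0f s -> controller marks the drive FAILED" % (label, busy)
        return "%s: busy %.0f s -> error reported; array repairs the sector" % (label, busy)

    print(raid_outcome("Desktop drive (deep recovery, ~120 s cap)", 120.0))
    print(raid_outcome("Enterprise drive (time-limited recovery, ~7 s cap)", 7.0))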
  • 35 Hide
    k1114 , March 4, 2013 3:34 AM
    Best article topic I've seen all year.
  • 16 Hide
    FormatC , March 4, 2013 3:38 AM
    Quote:
    Umm, why isn't the newest Quadro K5000 being benchmarked, when the newest FirePro W9000 is being tested here?

    Ask Nvidia, and take a look at the NDA. Try to buy one ;) 

  • 19 Hide
    moneymoneymoney , March 4, 2013 4:03 AM
    @anxiousinfusion I would say they're saying that if you want professional performance in CAD and 3D rendering software, but also game on the same machine, these cards let you do just that, instead of buying two machines (one for work and one for gaming).
  • 25 Hide
    e56imfg , March 4, 2013 4:11 AM
    Now do workstation CPUs :) 
  • -4 Hide
    guvnaguy , March 4, 2013 4:15 AM
    Do companies use these cards for any sort of video game design? If so, I could see why they'd need to be optimized for both applications.

    Just goes to show how under-utilized high-end gaming hardware is. If that kind of driver tweaking went into gaming cards, you could probably max out Metro 2033 on an 8800 GTX, eh?
  • 6 Hide
    rmpumper , March 4, 2013 4:28 AM
    I had a laptop with a Quadro FX 3600M three years ago, and from personal experience I know it was identical to the 8800M GT at gaming.
  • 22 Hide
    merikafyeah , March 4, 2013 4:33 AM
    MyUsername2: "Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?"

    Did you even read the article?

    Quote:
    Once again, the lesson here is that, in the workstation graphics segment, you don’t pay that massive premium for better hardware so much as you pay for the drivers and validation. This isn't something that should be held against AMD or Nvidia, even though we know they sell the same silicon into cards that cost a fraction as much. Driver development and optimization takes a lot of expensive time and work. Games are fun and all, but when you step aboard that new 787, you need to trust that the workstations responsible for every piece of it were 100% accurate.


    I think the last paragraph (especially the last sentence) more than adequately answers your question.
  • 14 Hide
    s3anister , March 4, 2013 4:42 AM
    k1114: "Best article topic I've seen all year."

    No kidding. I saw the article title and thought "Finally, a good TH article has arrived!"
  • -2 Hide
    slomo4sho , March 4, 2013 6:31 AM
    merikafyeah: "Did you even read the article? I think the last paragraph (especially the last sentence) more than adequately answers your question."


    The errors would be driver related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary executions.
  • -3 Hide
    ta152h , March 4, 2013 7:19 AM
    MyUsername2: "Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?"

    It's very simple, and I've been in this situation before. You have people here complaining about $1,000 processors, and of course ultra-expensive video cards, because they're not practical for them, or for most people.

    But in a work environment, $4,000 for something that improves productivity is a bargain. If you can shave time off development, that saves money, makes for happier customers, and makes for happier employees, since they don't wait as long. The cost of the device is insignificant measured against the time it saves.

    That's why Intel's $1,000 processors are a bargain for many. You're going to waste $150 an hour for your engineers to wait on lowly $300 processors? You'd be a moron. The same goes for ultra-expensive video cards. When you're paying people, time is money, and you shouldn't want to waste either one.

    And you need it validated. There's no way someone can work with something they don't trust on detailed designs. So, some of it is the cost AMD and Nvidia need to put into the cards, and some of it is simply because it's worth it for a lot of people in the market.

    There are plenty of examples. Look at IBM's POWER7+, which annihilates anything Intel makes but costs many times more. Yet it sells.
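ta152h's point is straightforward break-even arithmetic. A minimal sketch with illustrative numbers (the hourly rate, the hardware premium, and the time saved are all assumptions, not measured figures):

    # breakeven.py -- when does faster hardware pay for itself?
    # Every input is an illustrative assumption, not a measured figure.

    hourly_cost = 150.0         # assumed fully-loaded engineer cost, $/hour
    hardware_premium = 700.0    # assumed: $1,000 CPU instead of a $300 CPU
    minutes_saved_per_day = 30  # assumed waiting time eliminated per day

    daily_saving = hourly_cost * minutes_saved_per_day / 60.0
    breakeven_days = hardware_premium / daily_saving
    print("Saves $%.2f per day; the premium is repaid in %.1f work days"
          % (daily_saving, breakeven_days))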
  • 7 Hide
    dark_knight33 , March 4, 2013 7:27 AM
    slomo4sho: "The errors would be driver related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary executions."


    If a driver error occurs during a gaming session, at minimum you get an artifact in one of the 30-60 frames drawn in a given second; at worst, the game and/or computer crashes. You lose some time and gain some aggravation, but there's little real-world impact.

    With a workstation card, in a business environment, a driver error that causes an app or machine crash has a very real cost associated with replicating the lost work. Moreover, while gamers tolerate occasional crashes in exchange for better overall performance, businesses do not. That premium is paid to ensure that your card is certified to work error-free in a specific configuration. That form of testing and driver development is expensive, to be sure. Although I don't know for certain, I suspect that workstation cards have superior warranty coverage, too.

    In the case of HDDs, as another commenter pointed out, the difference between desktop and enterprise HDDs is usually a label and some slightly altered firmware. While that doesn't really justify the increased price, the extra warranty period does. If you were to use an HDD in a 24/7 server, with at least some constant load, that would undoubtedly shorten the life of said HDD. To afford the longer warranty period on those drives, the manufacturer must charge more for them. You can't increase the warranty, increase the duty cycle of the drive, and then lower the price; you'd just lose money and go out of business. Besides, if HDD manufacturers are making thicker margins on enterprise drives, it allows them to lower prices on consumer drives. If the exchange is that I can't use a ~$100 HDD with a $600+ RAID card, I'll take that. Software RAID 4 and 5 have both worked great for me.
  • 10 Hide
    crazypotato , March 4, 2013 7:36 AM
    Holy crap, been waiting for this question for a year now. THANK YOU.
  • 2 Hide
    abbadon_34 , March 4, 2013 7:46 AM
    It would have been interesting to note how easy or hard it is to convert current workstation/gaming cards. In the past it was a simple flash; then they locked things down to make it near impossible.
  • -9 Hide
    johnvonmacz , March 4, 2013 7:47 AM
    Workstation GPUs aren't meant for gaming, period.
  • 0 Hide
    merikafyeah , March 4, 2013 8:26 AM
    slomo4sho: "The errors would be driver related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary executions."

    Hardware failures can cause errors too, you know. Even perfect software/drivers can't save you from that, but at least they'll tell you about the error rather than silently ignoring it. Also, gaming cards may not be as accurate as workstation cards, since gaming cards don't usually come with double precision enabled in hardware. Even if the driver supports double precision, it won't do you any good if it's not present on the hardware itself.
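For anyone who wants to check merikafyeah's point on their own hardware, OpenCL reports whether a device exposes double precision through the cl_khr_fp64 extension string. A minimal sketch using the third-party pyopencl package (this only tells you whether the capability is present; many gaming cards do expose FP64, just at a fraction of their FP32 rate):

    # fp64_check.py -- list GPUs and whether they expose double precision.
    # Requires the third-party pyopencl package and a working OpenCL runtime.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for dev in platform.get_devices(device_type=cl.device_type.GPU):
            has_fp64 = "cl_khr_fp64" in dev.extensions
            print("%s (%s): FP64 %s" % (dev.name, platform.name,
                                        "exposed" if has_fp64 else "not exposed"))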
  • 0 Hide
    iam2thecrowe , March 4, 2013 8:36 AM
    Why are you using such old drivers, and beta drivers for the AMD cards? Also, did you actually re-test all those cards, or did you use existing results and just add the workstation cards?