How Well Do Workstation Graphics Cards Play Games?

Results: Unigine Sanctuary

Unigine Sanctuary is the last synthetic benchmark in our suite. The professional cards (from both AMD and Nvidia) get off to a slow start relative to the desktop boards, but manage a stronger showing once resolutions and settings become more demanding. At that point, we see the same unusual picture: AMD's FirePro W9000 solidly outperforms the other workstation cards, landing on about the same level as the company's high-clocked Radeon HD 7970 GHz Edition. The FirePro W8000’s performance is just sad.

90 comments
    Top Comments
  • k1114
    Best article topic I've seen all year.
    37
  • e56imfg
    Now do workstation CPUs :)
    26
  • merikafyeah
    MyUsername2: Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?

    Did you even read the article?

    Quote:
    Once again, the lesson here is that, in the workstation graphics segment, you don’t pay that massive premium for better hardware so much as you pay for the drivers and validation. This isn't something that should be held against AMD or Nvidia, even though we know they sell the same silicon into cards that cost a fraction as much. Driver development and optimization takes a lot of expensive time and work. Games are fun and all, but when you step aboard that new 787, you need to trust that the workstations responsible for every piece of it were 100% accurate.


    I think the last paragraph (especially the last sentence) more than adequately answers your question.
    21
  • Other Comments
  • MyUsername2
    Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?
    6
  • ipwn3r456
    Umm, why isn't the newest Quadro K5000 being benchmarked, when the newest FirePro W9000 is being tested here?
    14
  • velocityg4
    MyUsername2Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?

    Probably the former plus they can get away with charging more as business customers need them.

    Same with enterprise hard drives. They are pretty much the same as regular hard drives; the only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data itself, leaving the drive unresponsive long enough for the RAID controller to think it went bad, potentially taking down the array when it tries to rebuild. An enterprise drive just notes the error and keeps chugging along, asking the array for the corrupted data.

    Now, while the enterprise hard drive is little more than a firmware change, which makes its price appalling, at least these workstation cards actually have some different chips and designs requiring their own manufacturing equipment. So their higher price is more justified, since the vendors have to make changes to their line for a relatively small number of cards.

    If demand were as high as it is for gaming cards, their prices would probably be pretty close to their gaming counterparts. I'm sort of surprised one of them hasn't just unified its gaming and workstation lines and dominated the workstation market.
    13
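The error-recovery difference velocityg4 describes can be sketched in a few lines. All timings and names below are illustrative assumptions for demonstration, not measured drive or controller behavior:

```python
# Illustrative sketch of consumer vs. enterprise (TLER-style) error recovery.
# All timings are made-up assumptions, not real drive specs.
RAID_TIMEOUT_S = 8  # how long a typical controller waits before dropping a drive

def read_bad_sector(recovery_cap_s, time_needed_s=30):
    """Return (outcome, seconds the drive was unresponsive)."""
    if time_needed_s <= recovery_cap_s:
        return "recovered-internally", time_needed_s  # consumer-style deep retry
    return "error-reported", recovery_cap_s           # enterprise: give up fast

consumer = read_bad_sector(recovery_cap_s=120)  # retries far past the RAID timeout
enterprise = read_bad_sector(recovery_cap_s=7)  # reports quickly; parity fixes it

# The consumer drive stalls longer than the controller is willing to wait,
# so the array marks a healthy drive as failed; the enterprise drive does not.
print(consumer, consumer[1] > RAID_TIMEOUT_S)
print(enterprise, enterprise[1] < RAID_TIMEOUT_S)
```

Under these assumed numbers, the consumer drive's deep retry outlasts the controller's patience while the enterprise drive hands the error back in time for the array to repair it from parity.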
  • FormatC
    Quote:
    Umm, why isn't the newest Quadro K5000 being benchmarked, when the newest FirePro W9000 is being tested here?

    Ask Nvidia, and take a look at the NDA. Try to buy one ;)
    16
  • anxiousinfusion
    So is Tom's suggesting that enthusiasts who want bleeding-edge performance start building gaming machines with W9000 cards?
    -27
  • moneymoneymoney
    @anxiousinfusion I would say they're saying that if you want professional performance in CAD and 3D rendering software, but also game on the same machine, then these cards can do just that, instead of buying two machines (one for work and one for gaming).
    19
  • guvnaguy
    Do companies use these cards for any sort of video game design? If so, I could see why they need to be optimized for both applications.

    Just goes to show how under-utilized high-end gaming hardware is. If that kind of driver tweaking went into gaming cards, you could probably max out Metro 2033 on an 8800 GTX, eh?
    -4
  • rmpumper
    I had a laptop with a Quadro FX 3600M three years ago, and from personal experience I know that it was identical to the 8800GTM at gaming.
    6
  • s3anister
    k1114: Best article topic I've seen all year.

    No kidding. I saw the article title and thought "Finally, a good TH article has arrived!"
    14
  • slomo4sho
    merikafyeah: Did you even read the article? I think the last paragraph (especially the last sentence) more than adequately answers your question.


    The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.
    -2
  • ta152h
    MyUsername2: Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?


    It's very simple, and I've been in this situation before. You have people here complaining about $1000 processors, and of course ultra-expensive video cards, because they're not practical for them, or for most people.

    But, in a work environment, $4000 for something that improves productivity is a bargain. If you can shave time off the development, that saves money, allows for happier customers, and happier employees since they don't wait so long. The cost of the device is insignificant measured against the time it saves.

    That's why Intel's $1000 processors are a bargain for many. You're going to pay $150 an hour for your engineers to sit and wait, just to save money with the lowly $300 processors? You'd be a moron. The same goes for ultra-expensive video cards. When you're paying people, time is money, and you shouldn't want to waste either one.

    And you need it validated. There's no way someone can work with something they don't trust on detailed designs. So, some of it is the cost AMD and NVIDIA need to put into the cards, some of it is simply because it's worth it for a lot of people in the market.

    There are plenty of examples. Look at IBM's POWER7+, which annihilates anything Intel makes but costs many times more. Yet it sells.
    -3
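ta152h's time-is-money argument is easy to put into numbers. The rates, hours, and prices below are assumptions picked for illustration, not figures from the article:

```python
# Hypothetical back-of-envelope payback calculation (all figures assumed).
engineer_rate = 150.0        # $/hour, fully loaded cost of an engineer
hours_saved_per_week = 2.0   # assumed productivity gain from the faster card
weeks_per_year = 48
premium = 4000.0 - 300.0     # assumed workstation price minus consumer price

yearly_saving = engineer_rate * hours_saved_per_week * weeks_per_year
payback_weeks = premium / (engineer_rate * hours_saved_per_week)

print(f"yearly saving: ${yearly_saving:,.0f}")      # $14,400 under these assumptions
print(f"payback after: {payback_weeks:.1f} weeks")  # ~12.3 weeks
```

Even with a modest two hours saved per week, the assumed $3700 premium pays for itself in about a quarter, which is the commenter's point.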
  • dark_knight33
    slomo4sho: The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.


    If a driver error occurs during a gaming session, then at minimum you get an artifact in one of the 30-60 frames drawn in a given second; at worst, it crashes the game and/or computer. You lose some time and put up with some aggravation, but there's little real-world impact.

    With a workstation card in a business environment, a driver error that causes an app or machine crash has a very real cost associated with replicating the lost work. Moreover, while gamers tolerate occasional crashes in exchange for better overall performance, businesses do not. That premium is paid to ensure that your card is certified to work error-free in a specific configuration. That form of testing and driver development is expensive, to be sure. Although I don't know for certain, I suspect that the workstation cards have superior warranty coverage too.

    In the case of HDDs, as another commenter pointed out, the difference between desktop and enterprise drives is usually a label and some slightly altered firmware. While that doesn't really justify the increased price, the extra warranty period does. If you use an HDD in a 24/7 server with at least some constant load, that will undoubtedly shorten the life of the drive. To afford the longer warranty period on those drives, the manufacturer must charge more for them. You can't increase the warranty, increase the duty cycle of the drive, and then lower the price; you'll just lose money and go out of business. Besides, if HDD manufacturers are making thicker margins on enterprise drives, it allows them to lower prices on consumer drives. If the exchange is that I can't use a ~$100 HDD with a $600+ RAID card, I'll take that. Soft R4 & R5 have both worked great for me.
    7
  • crazypotato
    Holy crap, been waiting for this question for a year now. THANK YOU.
    10
  • abbadon_34
    Would have been interesting to note how easy/hard it is to convert the current workstation/gaming cards. In the past it was a simple flash, then they locked it down to make it near impossible.
    2
  • johnvonmacz
    Workstation GPUs aren't meant for gaming, period.
    -9
  • merikafyeah
    slomo4sho: The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.

    Hardware failures can cause errors too, you know. Even perfect software/drivers can't save you from that, but at least they'll tell you about the error rather than just silently ignoring it. Also, gaming cards may not be as accurate as workstation cards, since gaming cards don't usually ship with full double precision enabled in hardware. Even if the driver supports double precision, it won't do you any good if it's not present on the hardware itself.
    0
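merikafyeah's point about precision can be demonstrated without a GPU at all. The sketch below (illustrative only, plain CPU code) shows how a naive single-precision accumulation drifts away from the true sum while double precision stays accurate:

```python
import numpy as np

# Naive running sum, forcing every intermediate result into the given precision.
def naive_sum(values, dtype):
    acc = dtype(0.0)
    for v in values:
        acc = dtype(acc + dtype(v))
    return float(acc)

vals = [0.1] * 100_000   # the exact total would be 10,000
s32 = naive_sum(vals, np.float32)
s64 = naive_sum(vals, np.float64)

# float32 drifts visibly; float64 stays within a tiny fraction of the true sum.
print(f"float32 error: {abs(s32 - 10000.0):.4f}")
print(f"float64 error: {abs(s64 - 10000.0):.10f}")
```

The drift comes from rounding each intermediate result to 24 bits of mantissa; in long engineering computations that kind of accumulated error is exactly what hardware double precision guards against.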
  • iam2thecrowe
    Why are you using such old drivers, and beta drivers for the AMD cards? Also, did you actually re-test all those cards, or did you use existing results and just add the workstation cards?
    0