iBuyPower P500X And P900DX Workstations, Reviewed

Armed with updated workstation benchmarks, we have two systems from iBuyPower in the lab today: a $2,000 quad-core entry-level rig, and an $8,000 sixteen-core behemoth. With $6,000 separating the two, is the performance spread really what you'd expect?

The workstation market doesn't garner quite the attention that the enthusiast space does. There's generally far less fanfare over the CPU releases, and the GPU launches are fewer and further between. However, this year we did see the introduction of Ivy Bridge-based Xeons from Intel (Intel Xeon E3-1280 v2 Review: Ivy Bridge Goes Professional), the Kepler-based Quadro K5000 from Nvidia, and Graphics Core Next-based cards from AMD (AMD FirePro W8000 And W9000 Review: GCN Goes Pro).

Today, we're putting all of that together and looking at a pair of complete workstation systems from iBuyPower.

Los Angeles-based system builder iBuyPower offers a full line of PCs, from HTPCs to gaming laptops to workstations. The company provided us with two machines from its workstation line-up for this review: the mid-range P500X and the powerhouse P900DX. At the heart of the $2,000 P500X is an Ivy Bridge-based quad-core Xeon processor able to operate on eight threads concurrently. Meanwhile, the nearly-$8,000 P900DX sports a pair of Sandy Bridge-EP-based octo-core Xeons, for a grand total of 16 cores and 32 threads!

So how does a $2,000 workstation stack up to an $8,000 system? Can there really be $6,000 worth of extra performance in the P900DX?

To find out, we're overhauling our massive workstation benchmark suite and standardizing the results against a new baseline test system. Starting with this review, the modest P500X becomes our reference workstation, giving us a constant comparison point. Every workstation review in the near future will pit it against at least one other machine, using the exact same test suite. That way, you'll see a growing library of workstation-oriented performance numbers as we ramp up coverage.

But before we get to testing, let's take a closer look at these two professional-grade builds.

    Top Comments
  • sprucegroose
    The P900DX would be about $6500 for the parts alone. It also comes with warranty, and if you are the type of person using it, the time building it and repairing it might offset the price difference. On the other hand, you could put in better components for the same price.
    12
  • Other Comments
  • Draven35
    Pretty much right on all counts, there
    5
  • manitoublack
We've got the Quadro 4000's at work and they're junk. The GTX 280 is faster, and it was released in 2008. I pulled mine and installed my old GTX 295, which made a huge difference in the mine-modelling software.

    The Quadro 4000 was all stutters; the GTX 295 is buttery smooth.
    0
  • samuelspark
    Is it the new H60i or the old H60?
    2
  • Draven35
    It's the old H60.
    2
  • csf60
    Quote:
    manitoublack: We've got the Quadro 4000's at work and they're junk. GTX 280 is faster and they were released in 2008. I pulled mine and installed my old GTX 295, made a huge difference using the mine modelling software. Quadro 4000 was all stutters, GTX 295 is buttery smooth.

    That's because workstation cards are not meant to be fast at rendering frames. They are fast at doing many simple batch calculations, like ray tracing, fluid simulation, or video editing.
    6
  • j2j663
    Quote:
    manitoublack: We've got the Quadro 4000's at work and they're junk. GTX 280 is faster and they were released in 2008. I pulled mine and installed my old GTX 295, made a huge difference using the mine modelling software. Quadro 4000 was all stutters, GTX 295 is buttery smooth.

    This is like someone complaining that a screwdriver is really bad at pounding in nails. Learn to use the right tools for the job at hand.
    7
  • Anonymous
    I'm curious about the After Effects performance. What were your memory settings when rendering multiple frames simultaneously?
    0
  • Draven35
    They varied; I had to set them between 3 GB and the minimum in order to use the maximum number of cores. I have a working theory on the AE problem that I will test at the next opportunity.
    0
  • Anonymous
    Wow, odd. Anywhere I could get an update on your progress once you test your hypothesis? I'd love to figure out what is causing that result. It should be destroying that benchmark.
    0
  • Draven35
    Yes, I'll post it somewhere, likely here...
    0
  • WyomingKnott
    Quote:
    The P900DX demonstrates a 3.17x increase over the P500X.

    The numbers are 5.06 and 16.09. That's a 2.17x increase, unless you are going to argue that 5.06 is a 1x increase over 5.06. Three times more, four times as much.

    Good heavens, the language has gotten sloppy.
    0
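    The distinction the commenter is drawing can be checked with a quick calculation. This sketch uses the two scores quoted in the comment above (5.06 and 16.09); the tiny difference from the commenter's 2.17x figure just comes from rounding in the published scores.

    ```python
    # "Times as much" vs. "times more", using the scores quoted above:
    # 5.06 for the P500X (baseline) and 16.09 for the P900DX.
    p500x = 5.06
    p900dx = 16.09

    ratio = p900dx / p500x    # "times as much" (the multiple of the baseline)
    increase = ratio - 1      # "times more" (the increase over the baseline)

    print(f"{ratio:.2f}x as much")      # prints "3.18x as much"
    print(f"{increase:.2f}x increase")  # prints "2.18x increase"
    ```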
  • xelliz
    I have to say that my experience with my brother's two iBuyPower systems has been less than inspiring. I personally wouldn't recommend their systems for home use, much less in the business world. Of course, this is just my opinion, based on my experience... but since I'm talking about me, my personal experience is what's most important.
    0
  • Anonymous
    Please, please tell us about Lightwave's VPR (Virtual Progressive Render) and its screen-res times. You've got to ask Newtek for a reconstruction test. The grail is real-time reconstruction when eyeballed by a human.
    0
  • Draven35
    Quote:
    VPR runner: Please Please Tell us About Lightwaves VPR (virtual progressive render) a its screen res times, you gotta ask Newtek for a reconstruction test. The grail is human real time reconstruction when eyeballed.

    VPR is totally CPU-based, so adding it would just add yet another CPU-based renderer to the collection we already have. Keep in mind that the Lightwave tests we're using were not created or supplied by Newtek; I made the tests myself. I've been using Lightwave since version 4.0 on the Amiga.
    0