Workstation Storage: Modeling, CAD, Programming, And Virtualization

VMware: Operating System Installation

Overall Statistics

Elapsed Time: 11:35
Read Operations: 72,057
Write Operations: 219,313
Data Read: 2.07 GB
Data Written: 8.82 GB
Disk Busy Time: 36.560 s
Average Data Rate: 305.26 MB/s


Many workstation users exploit the flexibility of virtualization to run multiple operating environments on a single hardware platform. For this trace, we installed 64-bit Windows 7 in VMware Workstation using its default settings (a 20 GB virtual disk). The resulting performance profile reflects a write-heavy task. About 50% of the operations occur at a queue depth of one, and another 40% occur between two and four. Interestingly, only 59% of the data transferred is sequential. Finally, as expected, only about a quarter of the operations are 4 KB blocks; the rest are larger.

I/O Trends:

  • 49% of all operations occur at queue depth of one
  • 40% of all operations occur at queue depth between two and four
  • 59% of all data transferred is sequential
  • 24% of all operations are 4 KB in transfer size
  • 21% of all operations are 64 KB in transfer size
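Percentages like the ones above come from a straightforward tally over the raw trace. A minimal sketch in Python, assuming trace records of (queue depth, transfer size in bytes, sequential flag); the sample records here are made up for illustration, while a real capture would supply tens of thousands of entries:

```python
from collections import Counter

# Hypothetical trace records: (queue_depth, transfer_bytes, sequential_flag).
trace = [
    (1, 4096, False), (1, 65536, True), (2, 4096, False),
    (3, 131072, True), (1, 65536, True), (4, 4096, False),
    (1, 1048576, True), (2, 65536, True), (1, 4096, False), (8, 262144, True),
]

def summarize(records):
    n = len(records)
    # Bucket queue depths the way the article reports them: 1, 2-4, >4.
    qd = Counter("1" if q == 1 else "2-4" if q <= 4 else ">4"
                 for q, _, _ in records)
    seq_bytes = sum(b for _, b, s in records if s)
    total_bytes = sum(b for _, b, _ in records)
    four_k = sum(1 for _, b, _ in records if b == 4096)
    return {
        "qd1_pct": 100 * qd["1"] / n,
        "qd2_4_pct": 100 * qd["2-4"] / n,
        "seq_data_pct": 100 * seq_bytes / total_bytes,
        "4k_ops_pct": 100 * four_k / n,
    }

stats = summarize(trace)
print(stats)
```

Note that sequentiality is weighted by bytes transferred, while the queue-depth and block-size figures are weighted by operation count, which is why the two kinds of percentages in the list above are not directly comparable.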



19 comments
  • Thanks for the workstation analysis. I'd really like to see some tests comparing performance while utilizing multiple programs and lots of disk caching, i.e. having many complementary programs open (Photoshop, Illustrator, After Effects, and Premiere Pro), with many gigs' worth of projects opened and cached and multiple background renders. Something like this would be a worst-case scenario for me, and finding the right balance between SSDs, RAIDed disks, and properly configured memory would be interesting.

    I currently run my OS and production software from an SSD, have 24 GB of system memory, a page file set to write to the SSD, and user files on striped 1 TB drives. I'd be interested to see the benefits of installing a separate small SSD only to handle a large page file, and different configurations with swap drives. Basically, there are a lot of different drive configuration options with all of the hardware available at the moment, and it would be nice to know the most streamlined/cost-effective setup.
  • Quote (clownbaby):
    Thanks for the workstation analysis. […]


    We'll look into that!

    Cheers,
    Andrew Ku
    TomsHardware.com
  • As an applications developer working on a brand-new Dell M4600 mobile workstation with a slow 250 GB mechanical hard drive, it is very interesting to see tests like this, and it makes me wonder how much improvement I would see if my machine were equipped with an SSD.

    I would really like to see more multitasking as well, including application startups and shutdowns. Throughout the day I am constantly opening and closing applications like Remote Desktop, SQL Server Management Studio, 1-4 instances at a time of Visual Studio 2010, Word, Excel, Outlook, Visio, a Windows XP virtual machine, etc.
  • Is having to wait for a task really that much of a deal-breaker? I tend to use that time to hit the restroom, get a coffee, discuss with co-workers, or work on another task. Besides, if computers get to be too fast, then we'll be expected to get more done. ;^)
  • Quote:
    Consequently, compiling code isn't a usage scenario where SSDs provide a clear lead.

    I disagree. Try the test again with a distributed build system.

    I work on a project with around 3M lines of code, which is actually smaller than Firefox. To get compile times down, we use a distributed build system across about a dozen computers (all the developers and testers pool their resources for builds). Even though we all use 10k RPM drives in RAID 0 and put our OS on a separate drive, disk I/O is still the limiting factor in build speed.

    I'll agree that building on a single computer, an SSD has little benefit. But I'd imagine that most groups working on very large projects will probably try to leverage the power of more than one computer to save developer resources. Time spent building is time lost, so hour long builds are very, very expensive.
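The fan-out this commenter describes can be sketched in a few lines. A minimal illustration, assuming a hypothetical pool of build hosts (the host names and functions below are made up); real tools such as distcc additionally ship preprocessed source over the network, and the final link still hits the local disk, which is where the I/O bottleneck he mentions remains:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical pool of machines contributed by the team.
BUILD_HOSTS = ["dev-01", "dev-02", "dev-03", "dev-04"]

def compile_on(host, source):
    # Stand-in for "preprocess locally, compile remotely, link locally".
    # A real implementation would invoke the remote compiler here.
    return f"{source}.o (built on {host})"

def distributed_build(sources):
    # Fan compile jobs out across the pool, round-robin per source file.
    with ThreadPoolExecutor(max_workers=len(BUILD_HOSTS)) as pool:
        futures = [
            pool.submit(compile_on, BUILD_HOSTS[i % len(BUILD_HOSTS)], src)
            for i, src in enumerate(sources)
        ]
        return [f.result() for f in futures]

objects = distributed_build([f"module_{i}.cpp" for i in range(8)])
print(len(objects), "object files produced")
```

Because every remote compile still returns an object file that must be written and then read back by the linker on one machine, adding hosts raises CPU throughput faster than it relieves local disk I/O.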
  • Quote (acku):
    We'll look into that! Cheers, Andrew Ku, TomsHardware.com



    On top of the SSD cache, I would like to know where these performance gains plateau (e.g., whether a 16 GB SSD cache performs the same as a 32 GB or 64+ GB one).

    I'd like to see these put up against some SAS drives in RAID 0, RAID 1, and RAID 10 at 10,000 and 15,000 RPM. I'm currently running a dual-socket Xeon board with 48 GB of RAM on a 120 GB Vertex 2 SSD and a four-pack of 300 GB 10,000 RPM SAS disks in RAID 10.

    I think I'd LOVE to see something along the lines of the Momentus XT in a commercial 10,000/15,000 RPM SAS disk with 32 GB of SSD, which could be the sweet spot for the extremely large CAD/3D-modeling files out there.
  • PLEASE!

    Add VMware benchmarks on normal desktop CPUs reviews!
  • It's nice that you test "workstation software"; however, you don't test any compositing software such as Eyeon Fusion or Adobe After Effects. Testing 3D rendering seems pretty silly. Compositing and video editing are a LOT more demanding on storage.
  • SSD 1TB for $200 right now !!!!
  • Very nice article, and thanks! A picture, or in this case a video, is all you needed to make the point ;)

    Andrew - the reference to the 'Xeon E5-2600 Workstation' completely screwed me up, the benchmarks made no sense until I looked at the 'Test Hardware' and then noticed an i5-2500K??!! Please, swap-out the image, it's misleading at best.

    Try doing this on a RAM drive, and better yet on the dual-processor E5-2600 with 64-128 GB; 128 GB might be a hard one. I've been trying to experiment with SQL on a RAM drive (my X79 is out for an RMA visit). However, the few times I've tried it with smaller databases, it's been remarkable. Like the jump from HDDs to SSDs, it's the same and then some going to a RAM drive. I've also been playing with RAM caching on SSDs, but I'm stuck until the RMA is done.
  • Quote (willard):
    Time spent building is time lost, so hour-long builds are very, very expensive.


    And if the coding needs to be fixed and replaced, well, even more time is lost.
  • I'm really delighted programming was one of the chosen workstation disciplines. Some comments:

    - The choice of a Core i5 as the host CPU is a bad one. Hyperthreading in Core i7 makes a lot of sense since it enables higher parallelism during compilation - 8 files compile in parallel instead of 4. Incidentally that would increase the I/O load as well.

    - There's nothing surprising in the mixture of random and sequential transfers. While source code files are small, the produced binary object files are not, not to mention the final libraries and executables. For a single source file you'd typically get 50 to 500 KB of object code. Precompiled headers run to 30-40 MB as well. Some of our libraries' builds exceed 4 GB in size. True, these include both debug and release builds, but they don't include the intermediate object files, only the final libraries. The main reason for these large sizes is the debug symbols.

    Small SSDs don't make much sense for development. On a complex project you can work with a 120GB drive, but you may end up frequently deleting old builds (of dependency libraries) from your cache due to running out of disk space. I have a 240 GB Vertex 2 SSD on my laptop (it's a secondary machine) dedicated for development (e.g. it's not even a boot drive) and that works ok for now, meaning I still haven't had to clean it up from obsolete builds...
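The disk-space pressure this commenter describes is easy to sanity-check with the figures he quotes. A back-of-the-envelope sketch, where the source-file count is an assumed round number (the comment doesn't state one) and the per-object size is the midpoint of his 50-500 KB range:

```python
# Rough estimate of intermediate build output for a large C++ project.
SOURCE_FILES = 10_000       # assumed project size (hypothetical)
AVG_OBJECT_KB = 275         # midpoint of the quoted 50-500 KB per-object range
CONFIGURATIONS = 2          # debug + release kept on disk simultaneously

objects_gb = SOURCE_FILES * AVG_OBJECT_KB * CONFIGURATIONS / (1024 * 1024)
print(f"Intermediate object files alone: ~{objects_gb:.1f} GB")
```

Add multi-GB final libraries per dependency, plus old builds kept in a cache, and a 120 GB drive fills quickly, which matches the clean-up pressure described above.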
  • I find the result from compiling quite interesting, as I've always thought of compiling as a largely disk-I/O-bottlenecked process. I would have figured an SSD provided significantly more benefit than a two-disk RAID 0 would...

    I think agnickolov is onto something with his comment, though.
  • Quote (teddymines):
    Is having to wait for a task really that much of a deal-breaker? […]


    There are things called deadlines and having a life outside of work. :) The more time spent waiting for a project to finish, the more time wasted, the more money lost, and the unhappier the client.

    Do you think they could have rendered Transformers (or any other CGI heavy movie) with a Pentium 4? Probably not. :P
  • Thanks, useful test for non-kids here.

    One question: did you compile Firefox in Release or Debug? Release builds tend to load the processor more (optimizations take a lot of time), while Debug builds load the processor less and the disks more. In a programmer's work, Debug builds are far more common, by the way.
    And of course you should have used a system with an i7-3930K for this test, or better yet a pair of Xeons. An i7-2500? It is not a workstation.
  • I'd like to see a test using the javac compiler instead of VS.
    Compiling with Java creates a .class file for every .java file, and even without counting the construction of JAR and WAR files, it is very disk intensive.
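The one-class-file-per-source fan-out is actually a floor: javac also emits extra files like Handler$1.class for nested and anonymous classes, multiplying the small-file count further. A sketch of the mapping, with made-up file names and without invoking javac:

```python
from pathlib import Path

def class_files_for(java_file, inner_classes=0):
    # Every .java yields at least one .class; nested/anonymous classes
    # each add a Foo$N.class alongside it.
    stem = Path(java_file).stem
    names = [f"{stem}.class"]
    names += [f"{stem}${i}.class" for i in range(1, inner_classes + 1)]
    return names

# Hypothetical source tree: file name -> number of nested/anonymous classes.
sources = {"Main.java": 0, "Handler.java": 2, "Util.java": 1}
outputs = [c for src, inner in sources.items()
           for c in class_files_for(src, inner)]
print(outputs)
```

Thousands of small .class writes per build is exactly the random-small-write pattern that separates SSDs from mechanical drives, which is why this workload would be a good addition to the test suite.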
  • I'd like to see real CAD programs tested, e.g. solid modelers like Autodesk Inventor, SolidWorks, Creo (formerly Pro/E Wildfire) not just a line drawing program (AutoCAD). Throw in some analysis tools like ANSYS to round it out for mechanical design workstations.
  • A corollary to the SSD analyses should be determining which SSDs lend themselves well to the real-world utilization patterns shown in this article. Some SSDs shine in different areas. Given what's shown here, which SSDs actually make the most difference in each of the categories analyzed?
  • Thanks for this test.

    Some of the most demanding workstation tasks are for FEA -- Ansys, Abaqus, Cosmos, Creo Simulate (Pro/Mechanica). A single model often takes hours or days to solve, especially if RAM is not sufficient (common) and the solver turns to swap space on a drive. An SSD can cut solution times by 50% or even 80+% -- see this article:
    http://www.ansys.com/staticassets/ANSYS/staticassets/resourcelibrary/article/AA-V4-I1-Boosting-Memory-Capacity-with-SSDs.pdf

    These programs write reams of incompressible data; my two-week-old SSD has had 7,000 GB written to it (yes, hammered). At this rate it will last 1-2 years, which is fine. But as a SandForce DuraClass drive, it has throttled to ~80 MB/s writes, which slows the solution. Whether at 80 or 500 MB/s, the SSD will get exactly the same amount of data written to it. So I don't see how the throttle helps its life, except at the expense of human wait times, a poor bargain.

    So for workstations, it would be really helpful to find an inexpensive SSD that doesn't throttle, or a way to defeat it on a SF.
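The endurance arithmetic in this comment checks out. A quick sketch, with the drive's rated write endurance as an assumed figure, since the comment doesn't state one:

```python
# Figures from the comment above.
written_gb = 7000           # GB written in the first two weeks
days_elapsed = 14
# Hypothetical rated endurance (TBW) for an early SandForce consumer drive;
# chosen only to illustrate the calculation.
assumed_endurance_tb = 250

gb_per_day = written_gb / days_elapsed                    # 500 GB/day
lifetime_days = assumed_endurance_tb * 1000 / gb_per_day  # days to rated TBW
print(f"{gb_per_day:.0f} GB/day -> ~{lifetime_days / 365:.1f} years to rated endurance")
```

At 500 GB/day, an assumed 250 TB endurance rating is reached in about 500 days, roughly 1.4 years, consistent with the commenter's 1-2 year estimate. It also supports his point: the throttle changes the write rate, not the total bytes written, so it only stretches the same wear over more human wait time.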