Workstation Storage: Modeling, CAD, Programming, And Virtualization

Visual Studio (Programming): Compiling Code

Overall Statistics

Elapsed Time: 49:54
Read Operations: 1,071
Write Operations: 61,070
Data Read: 64.31 MB
Data Written: 2.08 GB
Disk Busy Time: 10.966 s
Average Data Rate: 199.62 MB/s
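
For context, the reported average data rate appears to correspond to total data transferred divided by disk busy time rather than elapsed time; that is an assumption about how the trace tool derives the figure, but the quick Python check below lands close to the reported 199.62 MB/s.

```python
# Rough sanity check on the table above.
# Assumption (not confirmed by the trace tool): average data rate
# = (data read + data written) / disk busy time, with MB/GB as binary units.
data_read_mb = 64.31
data_written_mb = 2.08 * 1024   # 2.08 GB expressed in MB
busy_time_s = 10.966

rate_mb_s = (data_read_mb + data_written_mb) / busy_time_s
print(f"{rate_mb_s:.2f} MB/s")  # ~200 MB/s, in line with the reported 199.62 MB/s
```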

Even though programmers spend much of their time editing code, that task doesn't demand a fast CPU or SSD. Where performance really matters is compiling.

There are a variety of examples we could have used to demonstrate a compiling workload, but we wanted something common and familiar. That's why we downloaded the source code for Firefox and built it with Visual Studio 2010's compiler. The Mozilla Foundation largely prepares the source code for you, which makes it easy to compile.

According to the trace, a majority of operations occur at a queue depth higher than one. Because the compiler accesses multiple files in quick succession, operations quickly stack up. Interestingly, there's also a fairly even balance of 4 KB random and 128 KB sequential transfers. This is a little unexpected, because most of the source code files are less than 10 KB in size.
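
To make the queue-depth observation concrete, here is a minimal sketch of how a depth histogram and transfer-size mix could be pulled out of a trace. The record format (submit time, completion time, size) is hypothetical and purely for illustration; real trace tools use their own schemas.

```python
from collections import Counter

# Hypothetical per-I/O records: (submit_time_s, complete_time_s, size_bytes).
ios = [
    (0.000, 0.004, 4096),
    (0.001, 0.006, 131072),
    (0.002, 0.007, 4096),
    (0.010, 0.012, 131072),
]

def queue_depth_histogram(ios):
    """Count, for each I/O, how many requests are outstanding when it is issued."""
    depths = []
    for i, (submit, _, _) in enumerate(ios):
        in_flight = sum(
            1 for j, (s, c, _) in enumerate(ios)
            if j != i and s <= submit < c
        )
        depths.append(in_flight + 1)  # include the I/O itself
    return Counter(depths)

print(queue_depth_histogram(ios))           # Counter({1: 2, 2: 1, 3: 1})
print(Counter(size for _, _, size in ios))  # transfer-size mix: 4 KB vs. 128 KB
```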

Consequently, compiling code isn't a usage scenario where SSDs provide a clear lead. Remember, reading and writing random data is where hard drives fall behind SSDs most dramatically; here, 41% of the data moves sequentially and the disk is busy for only about 11 seconds of the nearly 50-minute run, so there isn't much for faster storage to accelerate. This difference (or lack thereof) is demonstrated when we compare our single Vertex 3 to a pair of Caviar Green 1 TB drives in RAID 0. The SSD-based system takes 49 minutes to compile the whole code base, whereas the hard drive-based system finishes the job in 55 minutes.
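
Put another way, the gap between the two configurations works out to roughly a 10% time saving, as the trivial calculation below shows.

```python
# Relative difference between the measured compile times (49 vs. 55 minutes).
ssd_minutes, hdd_minutes = 49, 55
print(f"SSD finishes in {(hdd_minutes - ssd_minutes) / hdd_minutes:.1%} less time")  # ~10.9%
print(f"HDD RAID takes {(hdd_minutes - ssd_minutes) / ssd_minutes:.1%} longer")      # ~12.2%
```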

I/O Trends:

  • 43% of all operations occur at a queue depth of one
  • 30% of all operations occur at queue depth between two and four
  • 41% of all data transferred is sequential (see the sketch after this list for one way such a figure can be estimated)
  • 29% of all operations are 4 KB in transfer size
  • 21% of all operations are 128 KB in transfer size
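
The sequential share above depends on how consecutive accesses are classified. As a rough illustration (not the trace tool's actual method), the sketch below counts an I/O as sequential when it begins at the logical block address immediately following the previous request's end.

```python
# Hypothetical trace records: (start_lba, size_bytes). Values are invented for illustration.
SECTOR_BYTES = 512

trace = [(1000, 131072), (1256, 131072), (9000, 4096), (9008, 4096), (50000, 4096)]

sequential_bytes = 0
total_bytes = 0
prev_end_lba = None
for lba, size in trace:
    total_bytes += size
    # Count the transfer as sequential if it starts where the previous one ended.
    if prev_end_lba is not None and lba == prev_end_lba:
        sequential_bytes += size
    prev_end_lba = lba + size // SECTOR_BYTES

print(f"{sequential_bytes / total_bytes:.0%} of data transferred sequentially")
```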