A Better Benchmark
Why You Need a Reliable Storage Benchmark
“Why is all of this important?” asks Seagate’s Craig Parris, who chairs the SPC-C group. “There was no definitive benchmark in the world. With SPC, it’s a vendor-neutral organization, and it came together to develop what we as vendors felt was a typical enterprise workload at that point in time. The SPC took real live traces, discussed them with the membership, and merged them into a complex workload, primarily online transaction processing-oriented.”
Throughout our talks with Parris, we kept returning to one core question: The storage industry seems to live and die by Iometer testing, which measures IOPS, so why do we need SPC testing? According to Parris, Iometer does deliver classical read and write performance data, and it’s common to set up Iometer with a 70/30 read/write mix to approximate an OLTP load. But Iometer’s synthetic operations lack the complexity of real-world workloads.
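To make the 70/30 idea concrete, here is a minimal sketch of how such a read/write mix might be generated. This is a hypothetical illustration, not Iometer's actual implementation; the function name and parameters are our own.

```python
import random

def synth_ops(n_ops, read_fraction=0.70, seed=42):
    """Generate a synthetic op stream approximating an OLTP mix:
    roughly 70% reads / 30% writes, the split commonly configured
    in Iometer for OLTP-style testing."""
    rng = random.Random(seed)
    return ["read" if rng.random() < read_fraction else "write"
            for _ in range(n_ops)]

ops = synth_ops(100_000)
print(round(ops.count("read") / len(ops), 2))  # close to 0.70
```

Note that a mix like this says nothing about *where* on the disk the I/Os land, which is exactly the gap Parris describes next.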
“The SPC developed what we call spatial locality,” says Parris. “You have hot spots, just like in the real world when you have write logs and you go back to revisit. So that’s one thing that makes it quite unique. The other thing is it covers a whole spectrum of technology all the way from the individual device—the SSD or spinning media—all the way up to petabytes. So it’s scalable.”
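The hot-spot behavior Parris describes can be sketched with a simple skewed address generator: most accesses land in a small, repeatedly revisited region (like a write log), with the rest spread across the disk. This is a toy model under our own assumed parameters, not the SPC workload definition.

```python
import random

def lba_with_hotspot(capacity_blocks, hot_fraction=0.2, hot_prob=0.8, rng=None):
    """Pick a logical block address with spatial locality: hot_prob of
    accesses land in the first hot_fraction of the address space (a
    frequently revisited hot region), the rest are uniform elsewhere."""
    rng = rng or random.Random()
    hot_end = int(capacity_blocks * hot_fraction)
    if rng.random() < hot_prob:
        return rng.randrange(0, hot_end)            # hot region
    return rng.randrange(hot_end, capacity_blocks)  # cold region

rng = random.Random(1)
hits = sum(lba_with_hotspot(1_000_000, rng=rng) < 200_000
           for _ in range(50_000))
print(round(hits / 50_000, 2))  # roughly 0.8 of accesses hit 20% of the space
```

A purely uniform random test, by contrast, spreads accesses evenly and never exercises cache and placement behavior the way hot spots do.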

The dominant concern is time. Iometer tests often run one to three minutes per pass, whereas the standard SPC tests run for four hours, and SPC is currently evaluating whether to extend that to eight. Certainly, Iometer remains useful for quick in-house testing when a general sense of performance is needed. But when a business’s storage is on the line, a more thorough and relevant analysis is prudent.
Also keep in mind that Iometer uses a fixed queue depth. In contrast, SPC uses a fixed workload with varying queue depths, which mirrors the real world more closely. Moreover, there are no industry-standard scripts for Iometer. Many businesses create their own custom scripts, and that’s fine, but it makes correlating results across the industry difficult if not impossible. There’s also the issue of which Iometer version is being used.

Iometer has its uses, and leveraging both benchmarking tools (and others) is usually the best approach. Don’t be surprised if different tools provide varying and perhaps conflicting results. When in doubt, though, SPC’s extensive methodology should be taken as the authoritative standard.
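The queue-depth distinction can be illustrated with a toy simulation: a closed-loop tester (Iometer-style) always keeps a fixed number of I/Os outstanding, while a workload-driven tester (SPC-style) lets outstanding I/O rise and fall with arrivals and completions. The rates and tick model here are our own simplifying assumptions.

```python
import random

def closed_loop_qd(duration, qd):
    """Fixed queue depth (Iometer-style): always qd I/Os outstanding."""
    return [qd] * duration

def open_loop_qd(duration, arrival_rate, service_rate, seed=7):
    """Varying queue depth (SPC-style): the workload issues I/Os at its
    own pace, so outstanding I/O fluctuates with bursts and service time."""
    rng = random.Random(seed)
    q, trace = 0, []
    for _ in range(duration):
        q += sum(rng.random() < arrival_rate for _ in range(4))   # bursty arrivals
        q -= min(q, sum(rng.random() < service_rate for _ in range(4)))  # completions
        trace.append(q)
    return trace

fixed = closed_loop_qd(1000, qd=8)
varying = open_loop_qd(1000, arrival_rate=0.5, service_rate=0.5)
print(len(set(fixed)))       # one constant value
print(len(set(varying)) > 1)  # queue depth genuinely varies
```

The fixed-depth trace is a flat line; the workload-driven trace has the peaks and lulls a real storage system actually sees, which is why latency measured under the two regimes can differ so much.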