
Iometer test scripts

Hi,

I'm going to do some tests on various disk subsystems and will be using Iometer to get my results. However, I can't locate what seem to be the "standard" sets of Iometer tests that are run on many sites (particularly here @ Tom's): the webserver, fileserver, workstation, and database tests as seen on this page: http://www.tomshardware.com/reviews/ssd-hdd-flash,2127-8.html

I presume these are standard tests supplied by Intel (since they started the Iometer project) or by the Iometer project itself, and not left up to everyone to define their own, which would give non-standard/non-comparable results. Can anyone point me in the right direction?
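
For reference, the pattern most often cited as Iometer's "File Server" access specification (attributed to Intel's original distribution) is roughly 80% reads, 100% random, across a mix of block sizes. The percentages below are the commonly quoted values rather than numbers taken from an official .icf file, so verify them against the original specification before publishing comparable results. A minimal Python sketch of that pattern:

```python
import random

# Commonly quoted "File Server" access specification (assumed values,
# attributed to Intel's original Iometer distribution -- verify before use).
FILE_SERVER_PATTERN = {
    # block size (bytes): percent of accesses
    512: 10,
    1024: 5,
    2048: 5,
    4096: 60,
    8192: 2,
    16384: 4,
    32768: 4,
    65536: 10,
}
READ_PERCENT = 80     # 80% reads / 20% writes
RANDOM_PERCENT = 100  # fully random access

def sample_io():
    """Draw one (block_size, operation) pair according to the pattern."""
    sizes = list(FILE_SERVER_PATTERN)
    weights = list(FILE_SERVER_PATTERN.values())
    size = random.choices(sizes, weights=weights, k=1)[0]
    op = "read" if random.random() * 100 < READ_PERCENT else "write"
    return size, op

# Sanity check: the block-size percentages must cover all accesses.
assert sum(FILE_SERVER_PATTERN.values()) == 100
```

In Iometer itself these parameters live in an access specification (.icf file), not in code; the sketch just makes the workload mix explicit.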

Thanks!
  1. Iometer is still artificial; the 'patterns' you can load are not derived from actual application access patterns, but are inserted manually to more or less simulate a given workload. So it's not really the best kind of benchmark; that would be application-tracing and replay benchmarks. Intel's iPeak suite has this ability on the Windows platform, and I believe several review sites use it or their own tracing patterns. Either way, these should be the most realistic benchmarks available for I/O-related performance.
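
A trace-replay benchmark of the kind described above can be sketched as follows. The trace format here is hypothetical (a list of operation/offset/length tuples captured from an application run, not iPeak's actual format), and os.pread/os.pwrite require a Unix-like platform:

```python
import os
import tempfile

# Hypothetical captured I/O trace: (operation, byte offset, length).
trace = [
    ("write", 0, 4096),
    ("read", 0, 4096),
    ("write", 8192, 512),
    ("read", 8192, 512),
]

def replay(path, trace):
    """Replay the captured I/O trace against a target file."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT)
    try:
        for op, offset, length in trace:
            if op == "write":
                os.pwrite(fd, b"\x00" * length, offset)
            else:
                os.pread(fd, length, offset)
    finally:
        os.close(fd)

# Replay against a throwaway target file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    target = f.name
replay(target, trace)
os.remove(target)
```

A real replay tool would also preserve the timing between requests and measure per-request latency; this sketch only shows the core record-and-replay idea.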
  2. sub mesa said:
    Iometer is still artificial; the 'patterns' you can load are not derived from actual application access patterns, but are inserted manually to more or less simulate a given workload. So it's not really the best kind of benchmark; that would be application-tracing and replay benchmarks. Intel's iPeak suite has this ability on the Windows platform, and I believe several review sites use it or their own tracing patterns. Either way, these should be the most realistic benchmarks available for I/O-related performance.

    I'll see what iPeak has to offer, though I'm still looking for apples-to-apples comparisons with reviews done here at Tom's and on other sites that use Iometer.
  3. There's no such thing as apples-to-apples: different reviews use different setups, and there is no valid 'ceteris paribus' guaranteeing that all factors except the ones you're comparing are held equal, which is what proper conclusions require. A different motherboard, controller, software, drivers, BIOS settings, etc. can all translate into variances in performance, even when running the same test on the same hard drive.

    If you want to compare, you probably have to do that with the products reviewed together in one batch, using the exact same setup. Bad benchmarking discipline causes wrong conclusions in the end.
  4. sub mesa said:
    There's no such thing as apples-to-apples: different reviews use different setups, and there is no valid 'ceteris paribus' guaranteeing that all factors except the ones you're comparing are held equal, which is what proper conclusions require. A different motherboard, controller, software, drivers, BIOS settings, etc. can all translate into variances in performance, even when running the same test on the same hard drive.

    If you want to compare, you probably have to do that with the products reviewed together in one batch, using the exact same setup. Bad benchmarking discipline causes wrong conclusions in the end.

    But at least by using the same software and test pattern I can say, "hey, mine sux and I need to redo what I have," or I can say, "mine's on par and I'm doing it right."
  5. Hey Gents!
    I'm also interested in stress-testing the new iron I had built recently, so Tom's Hardware is probably the best place to ask.

    Homemade 48TB Enterprise Storage System - http://log.momentics.ru/homemade-48tb-enterprise-storage-system

    I would like to stress-test it to look for performance degradation. I suspect that for this I need to build a special test farm and create test access patterns.

    The article is written in Russian, but it contains lots of photos and graphs, so it might still be helpful.
    It isn't complete and I'm still running tests, but any comments by mail or tweet are very much appreciated.

    momentics on gmail where root domain is com
  6. This topic has been closed by Mousemonkey