SIPEW 2008: All About Benchmarking

Many people are familiar with the Standard Performance Evaluation Corporation (SPEC), a non-profit organization with the goal of providing relevant and realistic standard benchmarks. SPEC is based in Warrenton, Virginia (online at http://www.spec.org) and currently has more than 60 members. These include renowned universities and many big players in the hardware industry who want to help shape the benchmark software that will eventually be used to evaluate their products. The organization does not make product recommendations, but it does publish performance results for the public.

SPEC Divisions

SPEC is divided into three divisions covering system-level benchmarks, high-performance computing, and graphics benchmarking. The portfolio includes benchmarking tools for processors (SPEC CPU2006, CPU2000), professional graphics performance (SPECviewperf and the SPECapc workloads), high-performance computing (SPEC MPI2007), Java-based client/server workloads (SPECjAppServer, SPECjbb, SPECjms, SPECjvm), various server workloads, and a benchmark for determining power efficiency (SPECpower_ssj2008). None of these are consumer benchmarks; they are aimed at comparing performance in professional and enterprise environments.

SPEC provides tools that enable a meaningful differentiation of system solutions on a fair basis, while allowing evaluators to focus on individual system characteristics. The benchmarks are therefore based on popular applications and industry-standard software that has already been ported to the common platforms. Prior to running the benchmarks, the source code has to be compiled on the target system, using optimizing compilers and specific settings to reach the best performance for each target environment.
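To make that concrete, here is a minimal sketch of what such a build configuration can look like, loosely modeled on the config files SPEC CPU2006 uses. The file name, labels, and compiler flags below are illustrative assumptions, not an official SPEC configuration:

    # gcc-example.cfg -- hypothetical SPEC CPU2006-style config (illustrative only)
    # "ext" labels the compiled binaries; "tune = base" requires identical flags
    # for every benchmark in the suite
    ext           = lnx-x86_64-gcc
    tune          = base
    output_format = asc

    # settings applied to every benchmark in the suite
    default=default=default=default:
    CC            = gcc
    CXX           = g++
    FC            = gfortran
    COPTIMIZE     = -O2
    CXXOPTIMIZE   = -O2
    FOPTIMIZE     = -O2

The suite is then compiled and executed on the target machine with the runspec tool (for example, runspec --config=gcc-example.cfg int), which builds the benchmarks with exactly these settings before timing them.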

The SPECapc and SPECviewperf graphics benchmarks can be downloaded for free (we've been using SPECviewperf to evaluate OpenGL graphics cards), but the CPU and system benchmarking suites have to be purchased. Prices range from $50 to $2,000, depending on the development effort SPEC put into each suite; non-profit and educational organizations get substantial discounts, bringing the range down to $50 to $900.

SPEC Results Are the Charts for Pros

Members are encouraged to submit their benchmark results to SPEC, which reviews the submissions and publishes the results on its website (see the SPEC CPU2006 results as an example, and the results submission page for details). Since SPEC is the most renowned institution for professional benchmarking, companies such as AMD and Intel are eager to publish the results of their best-performing solutions with SPEC. As a result, these numbers can be considered a public and highly recognized leaderboard for decision makers.

We visited the SIPEW workshop (SPEC International Performance Evaluation Workshop) in Darmstadt, Germany, where members discussed topics such as scheduling in server farms and the power-efficiency benchmark SPECpower_ssj2008.

Comments
  • rhysee, July 23, 2008 9:31 PM
    Yawn .. what a boring article.
  • cangelini, July 23, 2008 9:42 PM
    Quote (rhysee): Yawn .. what a boring article.
    Sounds like a good place to talk about what you'd like to be reading from the Tom's crew. We're all ears =)
  • pogsnet, July 24, 2008 4:30 AM
    I'm expecting you to use that software to compare the 9800 GTX and HD 4850; both are good contenders.
  • cangelini, July 24, 2008 4:34 AM
    Quote (pogsnet): I'm expecting you to use that software to compare the 9800 GTX and HD 4850; both are good contenders.
    Unfortunately, that's probably not going to happen ;-)
  • eodeo, August 13, 2008 3:57 AM
    Wouldn't it be MORE fair to say that SPECviewperf is more like a cheat test, one that claims severely crippled workstation cards are still faster than the vastly superior "gaming" hardware, purely due to driver restrictions and, more importantly, software optimizations?

    If anything, SPEC is one big cheat tester whose results you might as well pour down the drain, since you aren't going to get any useful info out of them.

    To top that off, the test still uses OpenGL; just bury the darn thing. Mac users can complain all they want, but no self-respecting professional application has recommended OpenGL for anything but legacy use for quite some time now. OpenGL has been outdated for several years; it's both noticeably slower and has far lower visual quality compared to DirectX 9.0c implementations.

    I'm not sure about the rest of the SPEC family, but if SPECviewperf is any indication, it's not looking good for them either.

    I get that "SPECheatTest" can exist since many ignorant people still use "professional" cards and OpenGL, but why don't you at least mention this in your article? Or are you happy Quadro and Mac users as well? Testing in a bell-jar environment that proves "professional" cards with 1/20th the power of a current "gaming" card are still faster is just a self-fulfilling prophecy. Who needs to see this propaganda? Who are the workstation card manufacturers trying to fool? The ignorant. How about you? Question directed at THG.

    Both the "SPECheatTest" and "Macs are not 200% more expensive, honest" articles have been an insult to the reader's intelligence.

    Thank you for reading.
  • eodeo, August 13, 2008 4:00 AM
    Quote: "(SPEC), a non-profit organization with the goal of providing relevant and realistic standard benchmarks."
    Could anything be further from the truth? How about "black is actually white"? Yeah, that about does it; barely.