Meet The 2012 Graphics Charts: How We're Testing This Year

Temperature Measurements

We test all cards at all load levels in a 72°F (22°C) room. All cards are moved to the testing room at least one hour before the test to ensure they start at room temperature. After powering on the test rig, we let it idle for 15 minutes until temperatures stabilize, and only then start measuring. We log the temperatures using MSI Afterburner.
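
As a point of reference, the core temperature MSI Afterburner logs is exposed by the driver and can also be polled directly. Here's a minimal sketch, assuming an NVIDIA card and the NVML library (purely illustrative; this is not part of our actual test setup, and AMD cards expose the sensor through different interfaces):

```cuda
// temp_log.cu -- minimal GPU temperature logger (illustrative sketch only).
// Build: nvcc temp_log.cu -o temp_log -lnvidia-ml
#include <cstdio>
#include <unistd.h>
#include <nvml.h>

int main() {
    nvmlDevice_t dev;
    unsigned int temp;

    if (nvmlInit() != NVML_SUCCESS) {         // attach to the NVIDIA driver
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDeviceGetHandleByIndex(0, &dev);      // first GPU in the system

    for (int s = 0; s < 15 * 60; ++s) {       // 15 minutes, one sample per second
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp);
        printf("%4d s: %u C\n", s, temp);     // core temperature in degrees Celsius
        fflush(stdout);
        sleep(1);
    }

    nvmlShutdown();
    return 0;
}
```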

In order to measure temperature under load, we use a Bitcoin mining application (a GPGPU workload) or, if the card is incompatible, a pre-programmed Perlin noise loop from 3DMark Vantage. Either one generates a very high, FurMark-like load, but, unlike FurMark, it isn't throttled by the drivers (which would indirectly lower power draw as well). The temperatures we record are therefore worst-case, non-throttled temperatures.
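
The core of such a worst-case GPGPU load is simply sustained arithmetic that keeps the shader ALUs saturated. Here's a minimal sketch of the idea, assuming an NVIDIA card and the CUDA toolkit (again, purely illustrative; the tools we actually run are the Bitcoin miner and 3DMark Vantage's Perlin noise loop, not this kernel):

```cuda
// burn.cu -- illustrative worst-case arithmetic load (sketch only, not our test tool).
// Build: nvcc burn.cu -o burn
#include <cstdio>

__global__ void burn(float *out, int iters) {
    // Per-thread seeds prevent the compiler from folding the loop at compile time.
    float a = 1.0f + threadIdx.x * 1e-3f;
    float b = 2.0f + blockIdx.x  * 1e-3f;
    for (int i = 0; i < iters; ++i) {
        a = fmaf(a, b, 0.5f);                 // back-to-back fused multiply-adds...
        b = fmaf(b, a, 0.25f);                // ...keep the ALUs fully busy
    }
    out[blockIdx.x * blockDim.x + threadIdx.x] = a + b;  // store so the work isn't optimized away
}

int main() {
    const int blocks = 1024, threads = 256;
    float *out;
    cudaMalloc(&out, blocks * threads * sizeof(float));

    // Relaunch until killed; temperatures plateau after several minutes of this.
    for (;;) {
        burn<<<blocks, threads>>>(out, 1 << 20);
        cudaDeviceSynchronize();              // block so the launch queue stays bounded
    }
}
```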

Why don’t we just measure card temperature in a demanding game? The answer is simple: even a demanding game produces an inconsistent load unless we're diligent about recording the result at the same point every time. There are also games that generate extremely high frame rates while displaying menus, and the resulting power draw can not only match, but actually exceed, the draw seen in a Metro 2033 loop, even approaching that of a Perlin noise loop. With a GPGPU workload, you know how hot the card will get in a worst-case scenario. Also, the number of real-world GPGPU applications is growing, so these extreme load numbers will only become more relevant in the future.

We conduct the temperature measurements at the same time as the power measurements, which are discussed on the next page.

  • johnny_utah
    While I love the new techniques, using BITCOIN to bench GPGPU performance instead of Folding @ Home? Um, okay.
    Reply
  • Still with the bar charts? Would *love* to see scatter plots with price/score on the axes... So much more useful in picking out a card.
    Reply
  • AznCracker
    Man the charts are dying to be updated. Too bad it isn't done more often since it takes a lot of work.
    Reply
  • You haven't added how many cheese wheels it can run in Skyrim as a benchmark... wth?
    Reply
  • DjEaZy
    ... i like the pile of cards @ the end of the article.... a beautiful pile...
    Reply
  • pharoahhalfdead
    johnny_utah: "While I love the new techniques, using BITCOIN to bench GPGPU performance instead of Folding @ Home? Um, okay."
    I agree. I know Tom's spends a lot of time benchmarking, but Folding@home is a bit more common. I would love to see F@H in some articles.

    BTW, I appreciate all the work you guys do.
    Reply
  • randomkid
    Where's the 5760x1080? Where I come from, three 22" 1920x1080 monitors cost about the same as, or even less than, a single 27" 2560x1440/1600 monitor, so this is a more likely configuration among gamers.

    The 5760x1080 resolution will also push GPUs harder than 2560x1440/1600 could, so why stop at the lower resolution?
    Reply
  • "We'll add up to 20 new boards each month until the lower end of the performance range is filled out, too."
    How far back in GPU generations are you going to test, if at all? The power consumption charts only show the GTX 500/600 and Radeon 6000/7000 series. I've had an EVGA GTX 480 SC for two years and would like to know how it compares to the newer GPUs. Much appreciated.
    Reply
  • Yargnit
    MMO Fan: "Yup no surprise here typical Nvidia benchmark suite fuck sakes."
    So what would YOU like to see used, then? If they were trying to push Nvidia, wouldn't Hawx 2 be in the suite?
    Reply
  • shinym
    For Starcraft II you say "This game doesn't stress the CPU, and is thus well-suited for GPU benchmarking." Looks like you got CPU and GPU mixed up there.
    Reply