Is 80 PLUS Broken? How To Make It A More Trustworthy Certification

A Call For Better Testing

We admit we're being hard on the 80 PLUS organization. But we don't want to see a company started with the best intentions nudged astray by profit. The methodology this program is based on is far from perfect, in part because it was conceived more than a decade ago. To keep up with the increasing demands of modern technology, whether you're evaluating PSUs, CPUs, or GPUs, it's imperative to adapt. We encounter this ourselves on a daily basis. We come up with new ideas on our own, with the help of manufacturers, and with input from our readers. Then we integrate those ideas into our reviews, which changes the conclusions we draw. Sometimes this renders an entire database of work useless. But there's no way to get better other than starting over from scratch.

Here's a personal example. I had my own extensive collection of results from PSU testing when I started working on reviews for Tom's Hardware. Because my database only included numbers taken with 230V input, though, I had to start over; the audience here is largely based in the U.S., where 115V input is used. For me it wasn't so hard to start from scratch. But I bet that's not the case for an organization paid lots of money by vendors, who are bound to frown on methodology changes.

The biggest problem we have with the 80 PLUS organization's methodology is how few load levels it takes into account. You simply cannot provide a reliable efficiency certification based on three or four measurements. To make matters worse, it doesn't consider the 5VSB rail's efficiency, or the vampire power that each PSU consumes. Moreover, 80 PLUS only adds a 10% load level test at the Titanium tier, and even that is too high for PSUs with more than 1kW of capacity. Most contemporary PCs need less than 100W at idle, which is why we start our efficiency measurements from 20W in the normal tests and even lower in the cross-load tests. Finally, 80 PLUS doesn't meticulously search for fake certifications, and we've also heard of cases where the submitted sample differed from what was actually sold.
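To illustrate how sparse the certification really is, here's a minimal sketch in Python using the published 80 PLUS efficiency thresholds for 115V internal (non-redundant) power supplies. The PSU sample data is hypothetical; the point is that most tiers only check three load fractions, so a unit's behavior at the low loads an idle desktop actually draws goes entirely unmeasured.

```python
# Published 80 PLUS minimum efficiencies at 115V (internal, non-redundant),
# keyed by load fraction of rated capacity. Note how few points are checked.
THRESHOLDS_115V = {
    "Bronze":   {0.20: 0.82, 0.50: 0.85, 1.00: 0.82},
    "Gold":     {0.20: 0.87, 0.50: 0.90, 1.00: 0.87},
    "Platinum": {0.20: 0.90, 0.50: 0.92, 1.00: 0.89},
    # Titanium is the only tier that adds a 10% load point.
    "Titanium": {0.10: 0.90, 0.20: 0.92, 0.50: 0.94, 1.00: 0.90},
}

def passes_tier(measured: dict, tier: str) -> bool:
    """measured maps load fraction -> efficiency; a missing point fails."""
    return all(measured.get(load, 0.0) >= limit
               for load, limit in THRESHOLDS_115V[tier].items())

# A hypothetical 1kW unit measured only at the three Gold checkpoints:
unit = {0.20: 0.88, 0.50: 0.905, 1.00: 0.875}
print(passes_tier(unit, "Gold"))      # True
print(passes_tier(unit, "Platinum"))  # False (88% < 90% at 20% load)
```

Nothing in the Gold pass above says anything about efficiency below 200W, and even Titanium's lowest checkpoint on a 1kW unit is 100W — right at the top of a typical idle draw.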

Certainly, an 80 PLUS certification is a big deal for those of us who want to know more about efficiency before buying a power supply. But as we've pointed out, the most popular certification program is far from perfect. You have to be extra careful when paying a premium for a Titanium-rated unit, and hope that it really is more efficient than a good (and probably much less expensive) Platinum-class competitor.

We're understandably biased here, but we think our efficiency scores are much more precise, since they include somewhere around 26,000 different measurements covering the entire operational range of each PSU we test. Our state-of-the-art equipment provides the highest possible accuracy in this price range, and we don't shy away from mentioning it on our methodology page. We also intend to invest in a 0.01% accuracy power analyzer in the near future, which will allow for even better results (about four times more accurate than the lab 80 PLUS hires for its evaluations).

MORE: Who's Who In Power Supplies

MORE: All Power Supply Content

Contributing Editor

Aris Mpitziopoulos is a Contributing Editor at Tom's Hardware US, covering PSUs.

  • loki1944
    I could not possibly care less how efficient a PSU is; what I care about is how reliable it is.
  • Sakkura
    I think you're being unreasonable when it comes to how many load levels to test. A review site like Tom's only looks at a handful of PSUs every year. Ecova runs the 80 Plus test on the majority of PSUs on the market. That necessitates simplified testing.

    It could still be updated/improved, but it's never going to be as in-depth as the very few reviews a site like Tom's does.
  • waltsmith
    I can't agree. Until 80 PLUS became common, blue screen errors due to dirty power being delivered to components were the norm. Even so-called premium name-brand PSUs suffered from this problem. Diagnosing a malfunctioning computer often involved trying up to 3 or 4 PSUs to see if it fixed the problem before even looking for anything else wrong. People that have been into computer hardware for a long time will know exactly what I'm talking about. We've come a long way, but progress is what it's all about. I applaud this article!
  • Chettone
    At least it's something. Those that don't even have 80 PLUS can fry your PC.
    Personally I go for trusted manufacturers (based on user and tech reviews). Seasonic for example gives like 5 year warranty, that says a lot about quality.
  • laststop311
    EVGA makes a really good PSU, the G1. For 80-90 dollars you get a 650-watt G1 with a 10-year warranty. Nice to see a company truly standing behind a product. And it's 80+ Gold, which is more than good enough.
  • chumly
    @aris Why don't you send emails out to johnnyguru, guru3d, techpowerup, realhardtechx, etc, and create a standard you guys can all agree on? It's just a matter of doing it. All you guys are doing independent testing anyways. I don't think it will hurt your time budget to add a few emails and try to get some people on board. Hell, you might make some money in the long run. Standardized testing methodology for computer hardware. Set minimums for what should be necessary for proper operation, and what is considered a failure. Then start to force the hardware companies to conform. You have a huge, reputable website behind you, you can accomplish whatever you want to. I'm interested in this as well, as probably are a lot of people.
  • PRabahy
    What would it take for you guys to start a "Toms hardware certified" division? I would pay extra for a powersupply that had that logo and I knew had passed the list of tests that you mentioned in this article.
  • I
    It's almost as though you are inventing things we don't need or care about. Ideals about that next step, and next step, and so on, come at ever increasing burdens to manufacturers, shoppers, and build costs.

    Like LOKI1944, I care more about reliability. To some extent the two go hand in hand, in that a more efficient design produces less heat, which has a direct relation to how long the two arguably shortest-lived components, capacitors and fans, last. And yet when a design has greater complexity to arrive at higher efficiency, there's more to go wrong, and reverse engineering for repair becomes much more of a hassle.

    Yes, I repair PSUs that are worth the bother, though that's starting to split hairs, since most worth the bother don't fail in the first place unless they saw a power surge that fried the switching transistors.

    The other problem with complexity is in cutting corners to arrive at attractive price points. "Most" PCs don't need much more than a median-quality 300W PSU, but those are not very common at retail these days as opposed to OEM systems, so you end up paying more to get quality, and end up with a higher wattage than you need for anything but your gaming system. Increase complexity and we're paying that much more still.

    Anyway, PSU efficiency doesn't matter as much to me as it did in the past, like around the Athlon XP era where many motherboards had HALT disabled, and your PC was a space heater even sitting around idle. Ironically the build I'm typing on right now, uses more power for the big 4K monitor than the PC itself uses.

    Maybe we need an efficiency rating system for monitor PSU!
  • Aris_Mp
    A proper series of tests, beyond efficiency, can also evaluate (to a degree at least) a PSU's reliability. For example, any of the firecracker PSUs on the market today won't survive full load at an increased operating temperature.

    Moreover, efficiency testing doesn't mean that you can't observe other parameters of a PSU's operation as well, like ripple for example.
  • Aris_Mp
    @CHUMLY I know the guy at TPU very well, so this isn't a problem :) The actual problem is that every reviewer has their own methodology and equipment, so there cannot be one standard for all of us.

    In order to create a standard that all reviewers can follow, you have to make sure that each of them uses exactly the same equipment and methodology. And not all reviewers can afford Chroma setups and super-expensive power meters, since most of them do this as a hobby and don't actually make any serious profit.

    It would also be boring if the same methodology applied to all PSU (and not only PSU) reviewers. In my opinion it's nice to have variation, since this way one reviewer can cover areas that another doesn't.