In the power supply roundups I have read, I noticed that you now test power supply efficiency. I think this is an excellent thing to do, but I do not think you are going far enough.
A power supply that is not very efficient, but has a good power factor may actually perform better than a power supply that is very efficient, but has a poor power factor. For me, one measure of performance is the lack of an excessive bill from the electric company.
The following formula describes what I am thinking of (of course, I could be oversimplifying the problem):
Power Bill = (Power used by PC) / (efficiency * power factor)
A "perfect" power supply would have a power factor of 1. That value is unobtainable for any practical device except a purely resistive load such as an electric heater, so the expected range of values for a power factor is between 0 and 1, exclusive. Power factor meters are available, though they are expensive.
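The letter's formula can be sketched in a few lines of Python. The numbers below are hypothetical, chosen only for illustration, and the helper name `wall_power_va` is mine, not the author's. Note that dividing by power factor yields apparent power in volt-amperes, while the real power a residential meter typically bills is the load divided by efficiency alone; the sketch shows both quantities so the difference is visible.

```python
def wall_power_va(dc_load_w, efficiency, power_factor):
    """Apparent power drawn from the wall (VA) under the letter's model.

    Dividing by power factor converts real power (W) to apparent
    power (VA); it does not change the watts a home meter records.
    """
    return dc_load_w / (efficiency * power_factor)


def real_power_w(dc_load_w, efficiency):
    """Real power drawn from the wall (W), independent of power factor."""
    return dc_load_w / efficiency


# Hypothetical example: a 300 W DC load, 80% efficient PSU, 0.65 power factor.
real = real_power_w(300, 0.80)            # 375.0 W of real power
apparent = wall_power_va(300, 0.80, 0.65) # about 576.9 VA of apparent power
```

Under this (self-admittedly oversimplified) model, a low power factor inflates the apparent power substantially even though the real power, 375 W here, is what a typical residential meter records.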
I did not see any reference to using a power factor meter to measure the power factor of the power supplies, and power factor is something most people are not aware of. I think it would be worthwhile to make this measurement.
If I have accidentally missed references to power factor in your power supply roundups, I apologize in advance.
I agree with the description of power factor given in the article. It is reasonably consistent with my understanding of power factor, and I suspect I would have to crack open my school books to find major inconsistencies, if there are any. That being said, I am naturally suspicious of the gap between how things should work and how they actually do. Does a home power meter do a reasonable job of discriminating the phase angle of the current relative to the voltage? Does this discrimination work well at 60 Hz (or 50 Hz) and become much less effective at higher frequencies (from losses related to eddy currents)? I do not have the answers to these questions and others, and I doubt very much that I can get solid answers from the utility company. However, the article has caused me to reconsider the amount that power factor may affect my bill. As a consumer, I would still want to know the power factor of a power supply as well as its efficiency, with the understanding that the power factor MAY not significantly affect my electric bill. Until I have information from what I consider a reliable source (if I ever get it), the formula explaining my point of view should be treated as invalid.
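The phase-angle question above can be made concrete with a short sketch. For an ideal linear (sinusoidal) load, the power factor equals cos(phi), where phi is the phase angle between voltage and current; real PC power supplies are non-linear, so their measured power factor also reflects harmonic distortion, not just phase shift. The function names and example values here are mine, for illustration only.

```python
import math


def power_factor_from_phase(phi_degrees):
    """Displacement power factor of an ideal linear load: cos(phi)."""
    return math.cos(math.radians(phi_degrees))


def real_power_watts(v_rms, i_rms, phi_degrees):
    """Real power for a linear load; v_rms * i_rms alone is apparent power (VA)."""
    return v_rms * i_rms * power_factor_from_phase(phi_degrees)


# Hypothetical example: 120 V RMS, 5 A RMS, current lagging voltage by 45 degrees.
# Apparent power is 600 VA, but the real power a meter should bill is lower.
apparent = 120 * 5                    # 600 VA
real = real_power_watts(120, 5, 45)   # about 424.3 W
```

Whether a given home meter actually resolves this phase angle accurately at line frequency, and how it handles the harmonic-rich current of a switching supply, is exactly the open question raised above.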
:wink: PFC was actually a topic of discussion in a recent thread. Bill_Bright and I went back and forth for a while...do a search for that thread and you can see our debate on the topic. I stand by my statement that it will have no impact on your electric bill. What we shouldn't do is buy the hype/sales pitch of PSU mfrs on PFC. The PSU industry has high profit margins and there are a lot of players in the market. Right now the mfrs are trying to differentiate their products by touting PFC.
By no means am I saying that PFC is a bad thing. I'm just saying that the direct benefit is to the utility company in the form of cleaner power and fewer harmonic losses on power lines. As more and more people get one or more PCs in their homes, those losses can/will become significant to utility companies. We both know that $ losses - call them expenditures if you want - will be passed on to the consumer. Using that as a basis for argument, it is a reasonable assumption that PFC will become a mandatory feature in PSUs via legislation - look at the EU. As that technology proliferates, utility companies will experience less line loss due to harmonics. Will that garner a reduction in power rates? Not likely. Will it slow down the rise in rates? Possibly. What it will do is conserve energy, and that is a good thing however you look at it.