I don't know anything about the server PSU market, but generally speaking, a power supply's efficiency changes with its load level.
Buying a high-efficiency, high-wattage PSU but running it at mid-range or low loads may end up reducing its overall efficiency, depending entirely on its design.
An example (all these numbers are made up):
A 500 W PSU supplying 350 W to the internal components may draw something on the order of 400 W of real AC power from the wall, converting it to DC at an efficiency of 87.5%.
A 2000 W PSU supplying 1400 W to a different set of components may draw 1600 W of real AC power from the wall, also converting at an efficiency of 87.5%.
However, that same 2000 W PSU driving the original 350 W of components may draw 500 W from the wall, for an efficiency of only 70%: significantly worse than its 500 W counterpart at that load level, but not to a ridiculous degree.
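The arithmetic above is just DC output divided by AC input. A minimal sketch, using only the made-up wattage figures from the example (not real measurements of any actual PSU):

```python
# PSU efficiency = DC power delivered to components / AC power drawn from the wall.
# All wattage figures below are the illustrative, made-up numbers from the text.

def efficiency(dc_out_w: float, ac_in_w: float) -> float:
    """Fraction of AC wall power delivered as DC to the components."""
    return dc_out_w / ac_in_w

# 500 W PSU at a 350 W load: 350 / 400
print(f"{efficiency(350, 400):.1%}")    # 87.5%

# 2000 W PSU at a 1400 W load: 1400 / 1600
print(f"{efficiency(1400, 1600):.1%}")  # 87.5%

# 2000 W PSU at the same 350 W load: 350 / 500
print(f"{efficiency(350, 500):.1%}")    # 70.0%
```

Note the wasted power also shows up as heat: the oversized PSU at the light load dissipates 150 W instead of 50 W for the same useful output.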
The reason is essentially that a PSU is designed for efficiency at the load levels it is most likely to run at; usually some efficiency is designed in at low power levels and at levels near the rated output.
In general, if you are running one or two servers with overkill PSUs, you probably won't see a gigantic difference (probably not enough to justify buying a smaller PSU now and a larger one later). However, if you are building a whole farm of servers, you should use PSUs that are efficient at your expected load level.