Alright, so here goes. The governing formula is watts = volts × amps (equivalently, amps = watts / volts). A PSU is rated to supply so many amps on each of its three voltage rails: 3.3V, 5V, and 12V. In a modern system nearly all of the load sits on the 12V rail, while most of the capacity on the 3.3V and 5V rails goes unused. A cheap PSU pads the low-voltage rails to inflate the wattage on the label: 30 amps on the 3.3V rail is 99W that will never be drawn, but the maker still counts it toward the total.
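Here's a quick sketch of that arithmetic (the rail names and amp figures are just the illustrative numbers from above):

```python
# Per-rail capacity: watts = volts * amps.
def rail_watts(volts: float, amps: float) -> float:
    """Power a rail is rated to deliver, in watts."""
    return volts * amps

# A padded 3.3V rail rated for 30 A: counts toward the label, rarely drawn.
print(rail_watts(3.3, 30.0))   # 99.0 W
# The same amperage on the 12V rail would be real, usable capacity.
print(rail_watts(12.0, 25.0))  # 300.0 W
```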
When they do this, the amps actually supplied on the 12V rail end up low, because the label wattage was padded onto the low-voltage rails you won't be using. For instance, 25 amps at 12V is 300W, while 40 amps at 12V is 480W. Both of these may appear on "500W" PSUs, but the 40-amp unit actually supplies far more usable power to the system.
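To make the comparison concrete, here's a minimal sketch with made-up per-rail ratings for the two hypothetical "500W" units (real PSUs also cap combined output across rails, which this ignores):

```python
# Rated amps per rail, keyed by rail voltage. Numbers are illustrative:
# both labels sum to roughly 500 W, but the 12V capacity differs a lot.
cheap   = {3.3: 30.0, 5.0: 20.0, 12.0: 25.0}  # padded low-voltage rails
quality = {3.3: 4.0,  5.0: 2.0,  12.0: 40.0}  # capacity where the load is

def label_watts(rails: dict[float, float]) -> float:
    """Sticker wattage: sum of volts * amps over every rail."""
    return sum(volts * amps for volts, amps in rails.items())

def usable_12v_watts(rails: dict[float, float]) -> float:
    """What the main system load can actually draw from the 12V rail."""
    return 12.0 * rails[12.0]

for name, psu in [("cheap", cheap), ("quality", quality)]:
    print(f"{name}: label ~{label_watts(psu):.0f} W, "
          f"usable 12V {usable_12v_watts(psu):.0f} W")
# cheap: label ~499 W, usable 12V 300 W
# quality: label ~503 W, usable 12V 480 W
```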
Hardware manufacturers know this, so they overestimate the required wattage to compensate. They may say "800W minimum" so that even a crap 800W PSU has enough power, despite the fact that a quality 600W PSU does the job fine.
An analogy: one car has a 700 hp engine but only gets 450 hp to the tires; another has 600 hp and gets 580 hp to the tires. Which is the better performing car? Same goes for watts. It's all about what reaches the system, not what sits unused in the power supply.