18ga wire is good to carry up to about 10A at 12V DC, up to 12" in length. Any higher amperage and you degrade the wire and it burns. Any longer and the resistance is too high, the voltage drop rises, the load pulls more current to make up for it, and it burns. 150W was set as the 'standard' for the PCIe 8-pin because of the abundance of cheap PSUs that use 20ga instead of 18ga, which will not carry 10A; they burn out around 7A. So it was the 'safe' wattage, not an actual limit.
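To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The copper resistance figures (~6.4mΩ/ft for 18ga, ~10.2mΩ/ft for 20ga) and the assumption that an 8-pin carries its load across three +12V conductors are mine, not quoted from a spec sheet, so treat the output as illustrative only.

```python
# Rough sketch of the wire-heating arithmetic behind the 150 W figure.
# Assumed values (not from the post): ~6.385 mOhm/ft for 18 AWG copper,
# ~10.15 mOhm/ft for 20 AWG, and a PCIe 8-pin spreading its load over
# three +12 V conductors.

AWG_RESISTANCE_PER_FT = {18: 0.006385, 20: 0.01015}  # ohms per foot

def wire_loss(current_a: float, gauge: int, length_ft: float = 1.0) -> tuple[float, float]:
    """Return (voltage drop, heat dissipated) for one conductor."""
    r = AWG_RESISTANCE_PER_FT[gauge] * length_ft
    v_drop = current_a * r            # V = I * R
    heat_w = current_a ** 2 * r       # P = I^2 * R
    return v_drop, heat_w

# 150 W over a PCIe 8-pin: three +12 V wires share the load.
total_current = 150 / 12              # 12.5 A total
per_wire = total_current / 3          # ~4.2 A per conductor

for gauge in (18, 20):
    v, p = wire_loss(per_wire, gauge)
    print(f"{gauge} AWG @ {per_wire:.1f} A over 1 ft: "
          f"{v*1000:.0f} mV drop, {p:.2f} W of heat per wire")

# Push a single 18 AWG wire to 10 A over 1 ft and the heat climbs fast:
v, p = wire_loss(10, 18)
print(f"18 AWG @ 10 A over 1 ft: {v*1000:.0f} mV drop, {p:.2f} W of heat")
```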
There are two reasons why test benches use monster PSUs.
1. The monster PSUs are more commonly the higher-quality units (an AXi 1500, etc.), so the massive overkill not only means absolutely no chance of underpowering, it also means clean, reliable power no matter what GPU is on the bench.
2. SLI/CF, the R9 295X2, Titans, etc. all come with high demands, and those monster PSUs are expensive. So reviewers buy (or get donated) one big PSU that gets reused constantly, along with high-end mobos, CPUs, etc., all in an effort to keep any single component from bottlenecking GPU performance in any way (rough numbers sketched below).
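For a rough idea of why the shared bench unit ends up so oversized, here's an illustrative worst-case budget. The per-component wattages are assumed ballpark figures, not measurements from the post:

```python
# Illustrative worst-case power budget for a shared test bench. The board-power
# figures below are rough, assumed values just to show why one oversized PSU
# gets reused for every review.

BENCH_LOADS_W = {
    "R9 295X2 (dual-GPU card)":   500,
    "overclocked high-end CPU":   200,
    "motherboard / RAM / drives":  75,
    "fans, pumps, peripherals":    50,
}

worst_case = sum(BENCH_LOADS_W.values())
headroom = 1500 - worst_case  # e.g. an AXi 1500 on the bench

print(f"worst-case simultaneous draw: ~{worst_case} W")
print(f"headroom left on a 1500 W unit: ~{headroom} W")
```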
Yes, paranoid. I see it every day. People with barely a PC, running a 750 Ti, on 750W PSUs, because the salesman convinced the buyer that he has a big GPU, so he needs a big PSU, and the $40 550W isn't enough, better be safe and get the $50 750W. Reality is that the buyer's PC couldn't possibly out-draw the $30 400W the salesman was desperately trying to hide.
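A quick budget with assumed, typical numbers for that kind of build shows why even the 400W has plenty of margin:

```python
# Rough, assumed numbers for the kind of build described above (750 Ti, stock
# CPU, basic board); none of these are measured figures.

BUILD_LOADS_W = {
    "GTX 750 Ti": 60,                  # low-power card, no PCIe plug needed
    "stock quad-core CPU": 85,
    "motherboard / RAM / SSD / fans": 55,
}

total = sum(BUILD_LOADS_W.values())
print(f"realistic full-load draw: ~{total} W")
for psu in (400, 550, 750):
    print(f"  {psu} W PSU load at that draw: {total / psu:.0%}")
```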
I've got an Asus GTX 970 at a 123% user OC, with an i7-3770K at 4.9GHz, on an EVGA G2 550W, and it doesn't blink. That's an 8-pin pushing a 225W GPU, an overclocked CPU that can pull over 200W, plus whatever the mobo and accessories are using, which can easily total 100W. And I'd need a 650W+ PSU why? Just in case? Never happens. Those are maximum wattages that cannot ever be reached simultaneously. That would be like running a full 100%-load CPU stress test, while direct-burning a DVD between the optical drives, while pushing a GPU through a 100% burn-in, while listening to Pandora with the A/V running in the background. Never happens.
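Here's that simultaneity argument as a quick sketch. The per-component maximums are the ones stated above; the "typical gaming" utilization factors are assumptions I've picked purely for illustration:

```python
# Sketch of the simultaneity argument. Maximum draws come from the post above;
# the utilization factors are assumed values for a typical gaming load.

MAX_DRAW_W = {
    "GTX 970 @ 123% OC":  225,
    "i7-3770K @ 4.9 GHz": 200,
    "mobo/accessories":   100,
}

# What a spec-sheet sum suggests you need:
paper_total = sum(MAX_DRAW_W.values())  # 525 W

# What actually happens in a game: GPU pegged, CPU nowhere near a stress test.
TYPICAL_UTIL = {
    "GTX 970 @ 123% OC":  1.0,
    "i7-3770K @ 4.9 GHz": 0.6,
    "mobo/accessories":   0.5,
}
realistic = sum(MAX_DRAW_W[k] * TYPICAL_UTIL[k] for k in MAX_DRAW_W)

print(f"sum of maximums:      {paper_total} W")
print(f"realistic worst case: {realistic:.0f} W on a 550 W unit")
```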