the only reason for a GPU to exceed 300 watts would be if it were heavily overclocked and the limiters on the PCB bypassed.
that being said, if ATI decides to make a 5990 (the 5890 is going to be a high-binned 5870), it would end up being something like a pair of 5870 chips, and it would need 2x 8-pin or 3x 6-pin power connectors and a massive cooler. but that's just speculation based on their past cards.
now if you were talking about a 4890 exceeding 300 watts, which is possible with a heavy overclock and LN2 cooling, you would have to bypass the limiters, run thicker-gauge cabling to the power connectors, and use a single +12V rail PSU powering ONLY the card. even then, the entire PCB heats up significantly, and the chances of arcing, melting connectors, and massive artifacts forming on the display go way up.
look into the Japan overclocking competition that used GTX 260s
I'd like all of you to look at your PSU specs.
Then look at the PCIe specs.
Then look at ATI wanting people to OC this card, which is already at 294W or thereabouts.
PSUs say things like "up to".
It will/does go over 300 watts.
But it also has to be PCIe compliant, so, at stock, it comes in right under the limit.
A 5890 would likely just have better-binned cores and better electronics (caps, etc.) for an even higher OC.
Nothing really. There is nothing magical about the 300W limit; it is just the spec that was chosen to keep things, well, consistent. When any spec is written, there is a factor of safety built in so you are not constantly running at the verge of failure. So 300W is fine, possibly up to 400W is fine, but if you start putting much more than that through the board/cables you are asking for problems.
Also, keep in mind the conductors/wires in a power supply have a certain amperage rating. For example, a 14Ga Cu conductor has an amperage rating of 15A, and 18Ga is rated at 10A. I'm not exactly certain what size conductor is used in PSUs, but take each conductor, find its size and amperage rating, and multiply that rating by the voltage on that conductor. This gives you the maximum wattage that can be safely supplied through the wire while staying under the current rating. That should give you a better idea of the theoretical limit the GPU can draw without causing larger problems.
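The arithmetic above is easy to sketch. A minimal example, using the ampacity figures quoted in the post (14 AWG at 15A, 18 AWG at 10A; real ratings vary with insulation and bundling) and assuming the usual count of three +12V wires per PCIe auxiliary connector:

```python
# Rough sketch: max wattage a set of +12V conductors can deliver
# before exceeding their current ratings. Ampacity values are the
# ones quoted above (assumptions, not a safety reference).
AMPACITY = {14: 15.0, 18: 10.0}  # AWG -> amps
RAIL_VOLTAGE = 12.0              # PCIe aux power runs on +12V

def max_watts(gauge_awg, live_wires):
    """Max power through `live_wires` +12V conductors of a given gauge."""
    return AMPACITY[gauge_awg] * RAIL_VOLTAGE * live_wires

# Assuming three +12V wires on a connector, with 18 AWG wire:
print(max_watts(18, 3))  # 360.0 W of copper headroom
```

So even cheap 18-gauge wiring has headroom beyond the 150W the spec allows through an 8-pin plug, which is why the 300W figure is a spec limit rather than a physical one.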
Whoops, sorry for the mistype in my post, I meant 5970. So even though the board and PCIe 6-pin are supposedly rated at 75W each and the 8-pin at 150W, it'll just go over? Would this be bad for the PSU because it's providing more power through those connectors than it's supposed to? O_o. Would it theoretically be bad for the GPU to have an extremely overclocked 5970? I'm not planning on buying one or anything, just wondering lol.
Quality PSU = Designed beyond spec, so it's all good / Poor PSU = Massive singularity that engulfs us all.... errr.... some instability and potential fire hazard... and maybe a singularity... or Gremlins!
notice the total system power of 642 watts when OC'ed with a single 5970.
and when you have 2 of them in your comp, they will throttle more because of the heat rising from the bottom card into the top one.
theoretically it can surpass 300 watts. on closer inspection of the 5970 PCB's power connections you will notice that right next to the 6-pin power connector there is space for another +2-pin plug, which means ATI could produce a card needing two 8-pins: 150+150+75=375 watts