Going by load ratings only: 130w for the CPU per Intel's spec, 300w max for the 5970, and 105w for the 8800GT. Those three add up to 535w, and once you account for the rest of the system, while allowing for the fact that neither card actually draws its full rating at stock clocks, you are looking at roughly 575w under load with no overclocking.
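A quick back-of-the-envelope sketch of that budget in Python, using the wattage figures quoted above; the "typical" 8800GT number is the lower stock-clock draw mentioned later in the thread, and keeping the 5970 at its full rating is just a conservative placeholder:

```python
# Rough PSU load estimate for the parts discussed above.
# The rated figures are the label/TDP numbers quoted in the post.

RATED_W = {
    "cpu": 130,      # Intel TDP spec
    "hd5970": 300,   # board power maximum
    "8800gt": 105,   # maximum when overclocked
}

TYPICAL_W = {
    "cpu": 130,
    "hd5970": 300,   # actual stock draw is somewhat lower; kept as a conservative placeholder
    "8800gt": 85,    # typical draw at stock 600/1500 clocks
}

rated = sum(RATED_W.values())      # worst-case sum of label ratings
typical = sum(TYPICAL_W.values())  # closer to what a stock system pulls

print(f"rated sum:   {rated}w")    # prints "rated sum:   535w"
print(f"typical sum: {typical}w")  # prints "typical sum: 515w"
```

This only covers the CPU and the two cards; the motherboard, drives, and fans add on top of it, which is why the estimate for the whole system ends up above the "typical" sum.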
What is the brand name and model of the unit in question? As for the card, at stock clocks its power consumption is normally around 85w, while overclocked the max is 105w. Downclocking it won't do much good, and the same applies to the 5970.
Demonn -> Chieftec is quality, FYI. It is built on the same platform as Enermax, Corsair, etc.
nforce4max -> Are you referring to the 8800? It is a 512MB Sparkle 8800GT.
Same chip, the 65nm G92. As for the figure given by the power calculator, it is wrong by more than 25w. I own two of these cards and know them like the back of my hand, including how much power they draw; I did quite a bit of research back when I was interested in vmodding. Normal load values are typically 85w at the stock 600/1500 clocks (same for the 1GB edition), and 105w max at stock volts at 700/1750~1800.
Each cap is different, except for Black Gates: mellow in the morning, dead in the afternoon. Only really crappy caps degrade that quickly, or ones that have seen lengthy use beyond their rated voltage/current. A very high quality cap can and often does last for decades, with some lasting in excess of 50 years. My school has one of those original kit computers from the late 70s and it still works.
You are correct, Nvidia did release such drivers and continues to do so. They did this to stop people from using their G80 and G92 cards alongside the more popular ATI cards of the 4xx0 and 5xx0 generations. The driver can be hacked, however, so that users can run a mixed Nvidia/ATI multi-card setup, but not all PhysX games work properly.