I just put a HD5450 on a 300-watt Bestec PSU in an HP OEM. It's a 12V 19A unit (so its true watts are 12V x 19A = 228 watts). Then take into account, from reading the manufacturer's website, that it runs at 70% efficiency, so 70% of 228 = 160 watts.
Doesn't work that way, lazy. If it says it can output 228W, then it has to, or else the company faces mega lawsuits. It's 70% efficient at converting AC electricity to DC. If it's 70% efficient and outputting 228W, then it's pulling 326W from the wall. If you had an 80% unit, it would be pulling only 285W.
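To make the direction of the efficiency math above concrete, here's a minimal sketch (using the thread's numbers; nothing here is an official Bestec spec): the rated wattage is DC output, and efficiency only determines how much AC you pull from the wall, so you divide by efficiency rather than multiply.

```python
def wall_draw(dc_output_watts, efficiency):
    """AC power pulled from the wall for a given DC output and efficiency."""
    return dc_output_watts / efficiency

# 12 V rail at 19 A = 228 W of DC output (the PSU's actual capacity)
rail_watts = 12 * 19

print(round(wall_draw(rail_watts, 0.70)))  # ~326 W from the wall at 70% efficiency
print(round(wall_draw(rail_watts, 0.80)))  # ~285 W from the wall at 80% efficiency
```

Note the original poster's `0.70 * 228 = 160` calculation goes the wrong way: efficiency never reduces the rated DC output, it only inflates the wall draw.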
Then it's an older machine from 2009, so its real capacity is likely even lower given its age, since PSUs lose output capability over time.
Unless you have a really junk unit, it will take longer than 2 years for the caps to age.
The manufacturer of the GPU recommends a 350-watt quality PSU minimum.
Of course it runs fine, though, because even under 100% load the machine would be lucky to hit, say, 150 watts.
So what's the point of all that stuff you posted if it will run fine? If the computer needs ~150W and his unit delivers ~200ish, then it will be fine. He's at the upper edge of his power budget, but it's still workable.
I'm not sure why he doesn't want to change the PSU. It's not a warranty-voiding issue if he's already cracking the case; it's probably a money thing, and he should replace it if he wants a real gaming card. I strongly suggest at LEAST a 5750 for any serious gaming, more if you have a higher-resolution screen.