delluser1 :
8 to 11 watts difference depending on which version of the card
http://mark.zoomcities.com/images/gfx/GFXpowerchartbybrandgen.png
For the 240 I used the specific power usage measured by THG in their review of the specific card I mentioned, which performed better than the reference card. My point is valid and the difference is significant.
delluser1 :
Newegg specs mean squat; read the manufacturer's "System Requirements" tab in the XFX link
I quoted from the specifications tab in the XFX link I posted, and it states "minimum power requirement".
And there are the other similar examples of "requirements" posted for other vendors' 4670 cards.
Sure, it's true they cannot force a customer, who can put in anything he wants; they can only recommend. So perforce their requirement has to be a recommendation. There are no graphics card police going around checking and throwing customers who violate it in jail. So because they cannot force, only recommend, you consider their guidance less important?
Again I ask, why do you think you know more about the system requirements than the vendors who design, test, manufacture, and repair the cards, and have a long history of support and customer interactions to further inform them?
delluser1 :
Here's an 8800 GTS (with its 425 watt "power recommendation" and 106 watt actual consumption) running on the Dell 305w.
First, please stop misleading. Here is the power requirement from the EVGA site (that is an EVGA card in the picture) listing the various versions of the 8800 GTS - ALL with a power requirement of 400w NOT 425w.
http://www.evga.com/support/faq/afmviewfaq.aspx?faqid=58051
Second, that is just a photo.
Third, anecdotal information about one case carries little weight compared to multiple manufacturers' stated requirements.
delluser1 :
It has long been known that video card manufacturers inflate their wattage "recommendations" in order to make up for the fact that there are crappy PSUs out there that may have the wattage but not the amperage.
Again - very misleading. Watts = Volts x Amps. If they have the watts and are running within spec on voltage - as most do - then by the laws of physics they have the amperage. What you might be thinking of is that too little of the total power goes to the 12v rail - which was a real problem when graphics card demands for 12v power grew quickly and many PSU manufacturers were slow to reallocate power to that rail. It is not as much of a problem today, but one does need to watch for it.
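To put numbers on the watts/amps point - the wattage figures below are made up purely for illustration, not taken from any vendor's spec sheet:

```python
# Power law: P (watts) = V (volts) * I (amps).
# A rail's wattage rating and its voltage fix the current it can deliver.
def amps_available(rail_watts, rail_volts=12.0):
    """Current (amps) a rail can deliver at its rated wattage."""
    return rail_watts / rail_volts

# Hypothetical PSUs: both claim plenty of total watts, but the one
# that allocates only 180w to the 12v rail delivers far fewer amps
# where a modern graphics card actually draws its power.
print(amps_available(180))  # 15.0 A on the 12v rail
print(amps_available(300))  # 25.0 A on the 12v rail
```

That is the whole story: total wattage plus in-spec voltage implies the amperage; the only catch is how the wattage is split across rails.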
Still, it does not change the basic fact that the vendors are stating SYSTEM REQUIREMENTS of 400w, or 100w more than the poster has. And as you get close to PSU capacity, heat increases, noise levels increase, and efficiency decreases. I had a 6600 GT with a minimum requirement of 350w in my Dell 8400 with its 350w PSU and was frequently annoyed by the noise level when the fan on the video card would speed up. Oops - now I am giving anecdotal information. Forget that, poster - stick with the experts.
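The headroom gap can be made concrete with the numbers already in this thread (400w stated system requirement, 305w Dell PSU); this is simple arithmetic, not any vendor's formula:

```python
# How far a PSU falls short of a vendor's stated system requirement.
def shortfall(requirement_watts, psu_watts):
    """Positive result = PSU is under the stated requirement by that many watts."""
    return requirement_watts - psu_watts

# The thread's case: 400w requirement vs. the Dell 305w supply.
print(shortfall(400, 305))  # 95 - roughly the "100w more" gap noted above
```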