I am currently using the integrated GPU on my motherboard (ATI Radeon HD 4200), but I have an Nvidia 9800 GT, along with a suitable PSU, lying around. I'm not much of a gamer, and the integrated GPU seems sufficient for most of my HD video needs, but since I got this fancy GPU from a friend for free, I of course want to pop it in and see what it can do (don't we all want our computers to be as awesome as financially possible?). The only thing holding me back is my concern that, since 99% of what I do can be handled by the on-board GPU, I would just be wasting electricity by powering this fancy graphics card.
I am perfectly aware that the Nvidia card can draw much more power than the on-board GPU, and certainly would if I were playing a graphics-intensive game, but I'm wondering how efficient these things are across the full spectrum of tasks. If the card were performing a task that the on-board chip could easily handle (e.g. just displaying a static, empty desktop), would the power consumption actually be that much different? I fully understand that, the more powerful graphics chip aside, a dedicated card carries more overhead than an integrated chip in and of itself. I just want a handle on how different the power consumption would be for everyday computing tasks. What about when the display is turned off (e.g. if I leave the desktop powered on all day to act as something of a server)? Would the dedicated GPU still continue to draw power?
Again, my concern is power consumption in the sense that I'm worried about my electric bill. As far as the system's PSU is concerned, I've got that covered either way.
There would definitely be an increase in power consumption. The 9800 GT requires a 6-pin PCI-E power connector, and even at idle the integrated GPU would most likely consume on the order of 50 W less than the card, regardless of the workload. You may also run into cooling issues if you're not prepared for the extra heat.
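Since the original concern was the electric bill, here's a rough back-of-envelope calculation of what a constant ~50 W of extra draw costs per year. The 24/7 uptime and the $0.12/kWh rate are assumptions for illustration; plug in your own utility's rate.

```python
def annual_cost(extra_watts, dollars_per_kwh=0.12, hours_per_year=24 * 365):
    """Yearly cost of a constant extra electrical load.

    Assumes the machine runs 24/7; the default rate ($0.12/kWh) is
    a placeholder, not anyone's actual tariff.
    """
    kwh_per_year = extra_watts * hours_per_year / 1000.0
    return kwh_per_year * dollars_per_kwh

# ~50 W extra, always on: 50 W * 8760 h = 438 kWh/year
print(f"${annual_cost(50):.2f} per year")  # prints "$52.56 per year"
```

So at that rate, the idle difference alone is on the order of $4-5 a month if the machine never sleeps; less if it's only on part of the day.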