I have a question and I'm curious about the answer. I don't have the hardware to test this myself, so I'm asking you to share your knowledge on this issue.
" let us assume if a high power consumption GPU like the top end of power hungry nvidia/ati graphic card which uses the 2X(6-pin PCI-E) power cables
to get into action, i want to know that what will happen if power supply cables of gpu are removed and the display cable is switched to the vga/dvi of onboard proccesor graphics while the card still remains on the pcie 2.0 slot of the motherboard,will it show low power consumption "
Please comment with your ideas, or if someone can actually test this and tell me the result, I would be very happy.
I saw a new technology online called Lucid Virtu, now being implemented on motherboards for Intel 2nd-generation processors with LGA 1155 sockets. I saw it on the Z68 chipset; this Virtu technology switches between the integrated and discrete GPU. So I wanted to clarify the above question of removing the GPU power cables, on boards without the Lucid technology, to save power.
All cards released in the last several years have a low-power 2D mode that they drop into, where their voltage and clock speeds are reduced significantly. So you don't even need to remove the PCI-E cables: if you are running a single screen and no 3D application, the card will downclock itself into its low-power mode.
If you were to do as you suggested, a few things could happen: the system might not POST, because the video card checks for power on those cables at startup to prove they are there; the video card might sit in its low-power idle mode; or the video card might shut off most of its subsystems (this depends on the card). I believe the HD 5xxx series would power down the second card in a CrossFire configuration completely when not in use, and that may hold true for any card that is not driving a monitor.
A PCIe 2.0 slot provides up to 75 W to the card installed in it, I believe. But if a GPU is installed in a PCIe slot, most boards I'm aware of will default to outputting through that card and won't output video through the onboard graphics. Some boards might default to whichever chip is connected to a display, but I'm not sure. If the board were smart enough to output through the chip connected to a display, in this case the onboard GPU, the PCIe card might draw just enough power to stay on and spin its fans; since it wouldn't be rendering anything, it might draw hardly any power simply because it has nothing to do. Or the board will still route all graphics through the PCIe card, and it will consume about the same power as rendering 2D graphics in its low-power mode, as hunter315 said.
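To put numbers on the power limits discussed above: the PCIe spec allows up to 75 W through the slot itself, and each 6-pin PCI-E connector adds up to another 75 W. A minimal Python sketch of that arithmetic (the function name and connector counts are just for illustration):

```python
# Maximum power available to a graphics card from each source,
# per the PCIe electromechanical spec.
PCIE_SLOT_W = 75   # delivered through the PCIe x16 slot itself
SIX_PIN_W = 75     # per 6-pin PCI-E auxiliary connector

def max_board_power(six_pin_connectors: int) -> int:
    """Upper bound on the power a card can draw: slot plus cables."""
    return PCIE_SLOT_W + six_pin_connectors * SIX_PIN_W

# The card in the question, with two 6-pin connectors:
print(max_board_power(2))  # -> 225
# With both cables unplugged, only the slot's budget remains:
print(max_board_power(0))  # -> 75
```

So a card that needs two 6-pin connectors is designed to draw well beyond the 75 W the slot alone can supply, which is why it may refuse to POST or stay locked in a low-power state with the cables removed.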
I'd imagine it depends on which board you use, as I think different boards would behave differently.