Can a dual GPU setup save energy?

Zeus arsenal

Reputable
Jun 7, 2014
18
0
4,510
I plan to buy an FX-8350 and return my 4770K. Then I would buy a low-end GPU to pair with the FX-8350 in order to save energy and lower heat.

The question is: can it switch to the low-end GPU when I'm not gaming or doing 3D modelling? I mean with the discrete card, not on-board graphics. Because the 4770K is about $316 while the 8350 is just $187 plus a low-end GPU at about $40.

I plan to use it for about 2-5 years. I will take 3D animation classes at beginner and intermediate level, so I could say 30-40% of my use would be 3D work, and 60% would be 2D, Adobe stuff, MS Office, and the game Dota 2.
 

Zeus arsenal



I forgot to state that I have an ATI V7900 GPU. So in case more power is needed, can it switch automatically to the high-end one? Then I wouldn't need the iGPU?
 

Zeus arsenal



So that's why I'm asking whether I can buy the low-end card to handle light workloads and pair it with the high-end one, the ATI V7900. Can it switch automatically, or can only the on-board graphics do that?
 

LookItsRain

Distinguished
I don't think you can switch between GPUs on command like that with that type of setup. Also, the FX 8350 uses a lot more energy than an i7, plus the i7 has integrated graphics that can handle most general tasks while using very little energy compared to another dedicated card.
 
Dynamically switching between a discrete GPU and the integrated graphics to save power is really only an option on laptops, where doing that offers increased battery life. The technology to switch between discrete and integrated graphics (e.g. Nvidia Optimus or AMD Enduro) does not exist on desktop systems, as power consumption really isn't that big an issue when you don't have to worry about a battery.

The big problem with turning off your discrete card on the fly to save power is that if you have your monitor plugged into it, your monitor would lose display signal when the GPU turned off, and you would have to unplug your monitor from the graphics card, and then plug it into your motherboard to use the integrated graphics. Laptops don't have this problem as the display is wired into both GPUs internally. Your discrete GPU doesn't use that much power when doing lighter tasks anyway, so the power savings in turning it off and using the integrated would be rather small in the grand scheme of things.
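To put "rather small in the grand scheme of things" into rough numbers, here is a back-of-envelope sketch. The wattages, usage hours, and electricity price below are all assumptions for illustration, not measured figures for the V7900 or any specific card:

```python
# Back-of-envelope estimate of what switching to integrated graphics
# for light work might save per year. All numbers are assumptions.
discrete_idle_w = 20   # assumed light-load draw of a discrete card
igpu_idle_w = 5        # assumed extra draw of integrated graphics
hours_per_day = 8      # assumed daily light-work usage
price_per_kwh = 0.12   # assumed electricity price in USD

saved_w = discrete_idle_w - igpu_idle_w
saved_kwh_per_year = saved_w * hours_per_day * 365 / 1000
cost_saved = saved_kwh_per_year * price_per_kwh
print(f"~{saved_kwh_per_year:.0f} kWh/year, about ${cost_saved:.2f}/year")
# → ~44 kWh/year, about $5.26/year
```

Even with generous assumptions, the saving works out to a few dollars a year, which is why the complexity of switching GPUs isn't worth it on a desktop.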

As said above, stick with the i7: the FX 8350 uses more power and is slower than the 4770K, so you would lose performance and would not gain anything in power efficiency or temperatures.
 
Solution

Zeus arsenal



Thanks for the long explanation. I have 3 questions:
1) The FX 8350 has more cores and a higher clock speed, and is faster with turbo boost, than the 4770K, which should be better for 3D modelling, 3D animation, rendering, and high-spec games. So why did you say it is slower?

2) You said the dedicated GPU uses little energy when doing light work. So do I even need the integrated one?

The ATI V7900 would do both light and heavy work for me, along with the higher-clocked FX 8350.

Feel free to correct any of my misunderstandings ^^
 
AMD's cores are a lot slower than Intel's. Clock speed and core count aren't everything: Intel's CPUs can get a lot more work done per clock cycle than AMD's current design. The i7 also has Hyper-Threading, which provides a 'virtual core' for each physical core and allows more work to get done. Calling the FX 8350 an 8-core CPU also comes with a bit of an asterisk: it's not really 8 cores, it's 4 modules that contain two integer cores each, and those two integer cores have to share a single floating point unit and other resources, which can slow things down a bit. Despite the FX 8350 having a higher core count and higher clock speeds, it is slower than the 4770K. The FX 8350 only starts to become competitive with the 4770K with a pretty hefty overclock, which consumes a lot more power and generates a lot more heat. The 8350's only real advantage is that it's a lot cheaper than the 4770K (provided you don't intend to overclock and need to invest in a better motherboard and aftermarket CPU cooler, in which case the cost advantage diminishes a bit).
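The "clock speed and core count aren't everything" point can be sketched with the usual crude model: per-thread performance is roughly IPC × clock, and extra threads don't scale perfectly when they share resources. The IPC values and scaling factor below are made-up illustrative numbers, not benchmark results:

```python
# Illustrative only: invented IPC figures and a crude scaling model,
# showing why more cores and higher clocks don't guarantee more speed.
def relative_throughput(ipc, ghz, threads, scaling=0.7):
    # First thread runs at full speed; each extra thread contributes
    # only `scaling` of a thread (shared FPUs, Hyper-Threading, etc.)
    return ipc * ghz * (1 + scaling * (threads - 1))

fx8350 = relative_throughput(ipc=1.0, ghz=4.0, threads=8)      # 8 'cores'
i7_4770k = relative_throughput(ipc=1.6, ghz=3.5, threads=8)    # 4 cores + HT
print(f"FX 8350: {fx8350:.1f}, i7 4770K: {i7_4770k:.1f}")
```

With these assumed numbers, the chip with fewer physical cores and a lower clock still comes out ahead, because its per-clock work (IPC) is higher.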

You don't really need an integrated GPU if you have a discrete one. An integrated graphics chip can be nice to have as a backup if your discrete card fails, and some video encoding software can make use of the integrated GPU to accelerate encoding.
 

Zeus arsenal



Why did the FX 8350 get much better ratings (5/5 stars and 5/5 eggs; if it weren't good or very good, it would be difficult to get ratings like that) than the 4770K on Amazon and Newegg? Are those real-product users not smart??

On the other hand, looking at the 4770K's ratings, the proportion of 1 and 2 stars is much higher than for the FX.

I'm not a fan of Intel or AMD, just curious how it got such a great rating if it is not a good one, while the 4770K got rated like this. Or are most of those not real users but just a marketing strategy??

I bought the 4770K, so of course I want to hear real, fair reasons supporting my purchase.
 

xweetok59

Honorable
Jun 23, 2012
102
0
10,680
I wouldn't even bother with this guy; looks like an alt account just to stir up drama against Intel, or someone who just bought an 8350 and is trying to make himself feel good about the purchase.
 
User reviews are not exactly the best indicator of how a product will perform. It can be an indicator of product reliability, if a lot of the low ratings come with comments to the effect of 'dead on arrival' or 'died within a month'. People also have a tendency to give disproportionately low scores to something in a user review over some minor issue that may or may not have anything to do with the product itself.

I wouldn't be surprised if a lot of the lower ratings for the 4770K came from people who weren't happy with the overclocks they could achieve with it. Haswell CPUs generally cannot overclock as high as their predecessors, though overclocking generally isn't necessary with a 4770K right now, and even with the lower overclocking headroom it still outperforms the FX 8350. Some people may also have upgraded to Haswell from Sandy Bridge or Ivy Bridge, not noticed a significant performance increase, and given it a low rating, without having done their research before buying: CPU performance, particularly on Intel's side, has not increased very much in the past 3 years, so they should not have bothered upgrading.

Bottom line: the i7 4770K outperforms the FX 8350 in every benchmark, and uses less power while doing so. Since you've already bought the CPU, there isn't much point in returning it and exchanging it for an FX 8350 unless you have gone way over your budget in buying the 4770K and need to take the cheaper option.