What do you think of AMD's Fusion technology?

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
AMD can pull this off since it makes its own processors and motherboards. The question is whether it will actually work and change the industry.

Besides GPGPUs, one can easily imagine the power of a CPU and GPU working in tandem for a more efficient computing and gaming experience. Could AMD even come out with a physics processor, or build one into part of the die on their GPU?

Is this just a marketing gimmick, or will it turn out to be a dead end technology?
 

4745454b

Titan
Moderator
Like Intel's first GPU + CPU on a chip, it won't be anything special at first. Unlike Intel, however, AMD has the ability to make it special if they figure out how. The advantage of doing this isn't eliminating the need to buy a GPU, but getting the on-die GPU to do the work it's better at than the CPU. Once AMD gets this down, you can/will start to see some amazing things again. Intel can't really follow at this time as their GPUs suck. They might have been better off investing even more in Larrabee so they could have a better GPU to add to the CPU.
 
Yeah, it will probably be on par with current integrated graphics. I doubt it would be powerful enough to be all that useful as a dedicated physics processor but I could be wrong. I would think the main functionality might come from using it with something similar to Nvidia's Optimus where the computer can switch between the two graphics adapters depending on what is necessary. It would save power and be especially useful for notebooks but considering the low idle power of the more recent cards I guess it wouldn't be that big of a deal.
I definitely wouldn't say it is dead end technology though. The perspective on this message board is mostly in terms of gaming but the vast majority of computers are not intended for gaming and for those this may be more than adequate. It may also allow AMD to remove integrated graphics from all of their chipsets in the future, perhaps decreasing slightly the cost to produce a motherboard.
 
Once they have it down it will kill off the low-end graphics card industry. I personally think it could easily be as good as current onboard graphics when it gets here (properly on-die), and that's plenty for HTPCs and office PCs, as well as plain home general computing where gaming isn't an issue.


Mactronix
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
Yes, in reference to Nvidia's Optimus: it lets us switch power on the GPU in order to save battery life. It detects whether a program is GPU-intensive or basic, which makes it similar to Hybrid SLI. Switching from the motherboard graphics to the discrete GPU was fantastic, only it didn't work automatically; one had to restart the computer or use the BIOS.
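The switching logic described above (pick the integrated GPU unless the app is known to be GPU-intensive, and favor battery life when unplugged) can be sketched roughly like this. This is a hypothetical illustration, not any real Optimus or Hybrid SLI driver API; the profile list and function name are made up for the example.

```python
# Hypothetical sketch of an Optimus-style adapter-switching policy.
# Real drivers use per-application profiles; this assumed set stands in for one.
DISCRETE_APPS = {"game.exe", "render.exe"}

def pick_adapter(process_name: str, on_battery: bool) -> str:
    """Decide which GPU should render a given process."""
    if on_battery:
        return "integrated"   # favor battery life when unplugged
    if process_name in DISCRETE_APPS:
        return "discrete"     # GPU-intensive app: hand it to the dedicated card
    return "integrated"       # everything else stays on the low-power GPU

print(pick_adapter("game.exe", on_battery=False))   # discrete
print(pick_adapter("notepad.exe", on_battery=False))  # integrated
```

The point the posts above make is that Optimus does this per-frame and automatically, whereas Hybrid SLI needed a restart or a BIOS change to flip the equivalent of that one function's answer.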
 

rofl_my_waffle

Distinguished
Feb 20, 2010
972
0
19,160
I don't think you would be able to dynamically switch from GPU to onboard in a desktop environment, since the display ports are on the card itself. Unless, of course, they make a new set of cards with new standards for it.

I would really appreciate being able to switch from dedicated to onboard graphics. Not many people appreciate low power, but once you have a really high-end system, you realise how much heat computers can make.
 
Solution