
AMD's Fusion Cloud Could Hurt GPU Biz

By - Source: Tom's Hardware US | 31 comments

How often do you upgrade your graphics card to play the latest first person shooter games? Every six months? Every year?

Graphics cards are the focal point of a gamer's system: the GPU is the make-or-break component that determines whether a high-end game is playable. This, in essence, is what AMD wants to address with its Fusion supercomputer. AMD wants to be able to deliver a game to users on any computer. Because of the way Fusion delivers graphics--essentially pre-rendered on the server and then streamed over the Internet--gamers will be able to enjoy all the latest games, no matter what GPU is in their system.
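For perspective, here's some back-of-the-envelope arithmetic on what such a stream would demand. All figures below--resolution, frame rate, compression ratio--are our own illustrative assumptions, not AMD's numbers:

```python
# Rough bandwidth estimate for streaming server-rendered game video.
# All figures are illustrative assumptions, not AMD specifications.

def raw_bitrate_mbps(width: int, height: int, fps: int = 30,
                     bits_per_pixel: int = 24) -> float:
    """Bitrate of uncompressed video in megabits per second."""
    return width * height * bits_per_pixel * fps / 1_000_000

def streamed_bitrate_mbps(width: int, height: int, fps: int = 30,
                          compression_ratio: int = 100) -> float:
    """Assume a hypothetical ~100:1 codec (H.264-class) at game quality."""
    return raw_bitrate_mbps(width, height, fps) / compression_ratio

# 720p at 30 fps: ~663 Mbps raw, so compression is mandatory; even
# compressed, the stream still needs a multi-megabit connection.
raw = raw_bitrate_mbps(1280, 720)            # ~663.6 Mbps
streamed = streamed_bitrate_mbps(1280, 720)  # ~6.6 Mbps
```

Even with aggressive compression, a single 720p stream sits right around what many broadband connections of the day could sustain.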

While this makes perfect sense for the consumer, where does AMD's graphics card business stand in this situation?

With a really good Internet connection--and net connections in the U.S. are horribly behind--you can enjoy the latest games, even on a GPU that can't handle 3D rendering at all. Time to whip out that old ATI Mach 64 card.

But Fusion is more than pre-rendered games on a supercomputer; it's also about integrating the GPU and the CPU into one chip. This too presents an issue for the gamer: why would a gamer want a GPU tied to a CPU, when a CPU can outlast a GPU by many months or even years?

We spoke to an AMD board partner today, who wanted to stay anonymous, and they indicated to us that the future for AMD GPUs may be very limited for board partners. Once Fusion puts the GPU and CPU on one chip, what will the board partners do? Will there be Fusion add-in boards on motherboards instead of CPU sockets? That could very well be the outcome. Even more interesting, Zotac, an Nvidia partner, told us that Fusion is actually great for the Nvidia camp because "no one wants to have to upgrade both every time; as a gamer, I want to upgrade my graphics card separately, especially when I already have a great CPU."

Here's the real kicker though: both Zotac and the AMD board partner said that AMD could kill 50 percent of its business with Fusion. Obviously, we think AMD has thought about this already. When asked whether it plans to exit the discrete GPU business, AMD declined to comment on future initiatives.

As Nvidia continues to push on the strengths of its discrete GPU solutions, AMD seems to be taking a totally different route. Only time will tell how these strategies will play out.

One more thing: if AMD does make its Fusion supercomputer a success and gamers can always play the latest games no matter what their platform is, who would buy a GPU at all--from AMD or Nvidia?

  • 0 Hide
    Anonymous , January 10, 2009 1:35 AM
    Move over consoles!
  • 0 Hide
    Blessedman , January 10, 2009 1:46 AM
    lol this honestly is a pipe dream (literally)
  • -1 Hide
    knightmike , January 10, 2009 1:54 AM
    I think Fusion Cloud is a great idea.
  • 1 Hide
    falchard , January 10, 2009 3:07 AM
    I don't think it will hurt graphics cards. The application is limited. Right now you will achieve a better looking game using client-side rendering. At worst if this idea catches on, you will have companies buying hundreds of video cards every 6 months to render for the cloud.
  • 0 Hide
    Anonymous , January 10, 2009 3:13 AM
    The future course of this may be more obvious than it seems. This problem has been solved before. Evolutionary history has faced these exact same parameters in the development of nervous tissue: neurons evolved first, then neural nets, and finally specialized neurons and neural nets for each problem causing evolutionary pressure. If transistors have hit a speed limit, then multiple processors are the first obvious solution. The next real step will be dedicated processors and groups of processors for specific problems. We already know that custom silicon can produce solutions several orders of magnitude faster than general purpose processors like CPUs, even at the same frequency. The best known example of this is graphics processors. It is also clear that both Microsoft and Intel have conflicts of interest with improving the efficiency of software and hardware. Microsoft needs new hardware to justify new versions of software, and Intel needs new, fancier, more demanding software to drive hardware upgrades. Netbooks such as the one I am currently using scare them both silly. There is little or no incentive for either to really innovate since they are already at the top of the heap. The real question is who will be the next disruptive innovator? How will we recognize them when they appear on the scene? Look for the first one to push heterogeneous processors designed to take advantage of problem-specific designs. Maybe a hungry AMD, Apple or some new startup will bring us what we are looking for.
  • 0 Hide
    Anonymous , January 10, 2009 3:17 AM
    Maybe I'm being stupid here, but could a GPU hand graphics processing back to the CPU(s)?

    I'm thinking gamers, here - nobody plays WOW and uses MS office at the same time, right?

    You DO? Does your BOSS know?

    J/K!

    But in current quad-core systems, what if the video drivers offloaded processing to idling cores, transparently to the game software?
  • 3 Hide
    WR , January 10, 2009 4:17 AM
    First, by offloading GPU work to a cloud supercomputer, someone still has to pay for the supercomputer and now the network bandwidth as well. Essentially you may be renting a GPU remotely, and this may cost more or less than owning one.

    Second, notwithstanding the mediocre net connections in some countries, even the best connections will lag behind a real GPU. Responses won't be as instant, and quality won't be as good as a high-end GPU.

    Third, this will impact low end GPUs the most, especially integrated solutions. But does it fully rectify Intel's horrible integrated performance?

    Fourth, GPUs cost both in hardware outlay and power usage. If your computer is part of the cloud, what measures will be taken to prevent leeching? People might really skimp on graphics, or have nothing but 2D and try to fake 3d capability. I'm talking about the majority who just might not be using AMD platforms. Will the system hold up or defend against such abuse?
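WR's responsiveness point can be put into rough numbers. A toy input-to-photon latency budget, with every figure an assumption chosen purely for illustration:

```python
# Toy input-to-photon latency budget for cloud-rendered gaming.
# All figures are assumptions chosen for illustration only.

def cloud_latency_ms(rtt_ms: float = 60.0, render_ms: float = 16.0,
                     encode_ms: float = 10.0, decode_ms: float = 5.0) -> float:
    """Input travels upstream, the server renders and encodes a frame,
    the video travels downstream, and the client decodes it."""
    return rtt_ms + render_ms + encode_ms + decode_ms

LOCAL_FRAME_MS = 1000 / 60   # ~16.7 ms for a local GPU at 60 fps
cloud = cloud_latency_ms()   # 91.0 ms with the assumed figures
# The cloud path is several local frames behind before display
# lag is even counted.
```

With these assumed figures, the streamed picture trails a local GPU by five or six frames, which is why competitive gamers would notice even on a fast link.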
  • -1 Hide
    zodiacfml , January 10, 2009 4:22 AM
    this is nonsense.
    how could you hurt the discrete GPU business with that amd fusion supercomputer? it's as though amd will be providing discrete gpu performance to all users of that service, probably a million of them! that even excludes the problem of the broadband speed required for an average gaming desktop resolution, probably at least 10 Mbps.

    what amd's fusion will do is provide a more cost and space efficient way to produce a display than integrated graphics. So it is probably the integrated graphics business that will lose, since it will sit between the fusion and discrete graphics businesses, which is quite a crowd.

    the last paragraph is just annoying. you don't buy a GPU for a handheld gaming device.
  • 1 Hide
    tuannguyen , January 10, 2009 5:20 AM
    Not sure you really understand what AMD is going for. With the Fusion supercomputer, they are not adding "some" graphics power to discrete solutions--they want to render the entire game server side, compress the video, and stream it to the client. Basically, the client has no client-side game install, just a terminal, where you provide control feedback.

    This hurts discrete, not integrated solutions, because if Fusion is successful you will not need a super fast 3D card to play the latest games--provided that your net connection is fast enough to stream the video feed at 720p resolution or beyond.

    / Tuan
  • 1 Hide
    enewmen , January 10, 2009 5:43 AM
    I remember reading a while ago that PCs are reaching the limits of graphics. It's like making a car engine more powerful: you start needing better wheels, brakes, etc., then a whole new frame. Finally, the car goes so fast it can't stay on the ground, and so on. So the basic idea of what a car is needs to be re-examined.
    So, basically, the Fusion project is as much about CPU/GPU inter-communication as it is about putting both on one die. I also think Intel is investing heavily in this area.
    2 cents worth..
  • -1 Hide
    zodiacfml , January 10, 2009 7:21 AM
    actually, this article itself is confusing, which is why i mentioned integrated graphics. amd FUSION is a tech totally different from the amd FUSION RENDER CLOUD; i just stated my opinion regarding FUSION, which is off the topic/headline.

    Regarding the amd fusion render cloud in your headline, i think WR, the guy who posted above me, explains my points well.
    also quoting from the link of the amd fusion render cloud news:

    So what's under the hood of this beast? According to the company, the AMD Fusion Render Cloud will include AMD Phenom II processors, AMD 790 chipsets, and ATI Radeon HD 4870 graphic processors.

    how does amd go against its own business when the hardware used in the supercomputer is what they sell to consumers?
    i think you have to read the render cloud news again to know its applications and who it's for.
    it could be used first as a supercomputer which harnesses the floating-point capabilities of graphics cards; then, if OTOY produces the right software, it could be used as a service for consumers, for which the author suggested gaming on handheld devices.


  • 0 Hide
    NuclearShadow , January 10, 2009 8:12 AM
    I don't think it would be possible to stream modern games over our current internet connections in the manner they make it sound. Maybe if you were sitting near the server streaming to you it might provide decent results, but for most people that's simply not possible.

    It's not like it's just a set amount of data that has to be transferred, either, like a normal download. With every single action taken within the game, data would have to be both sent and received. I simply can't see this working; perhaps it's plausible in the future, but not at present.

    I think their goal is simply going one step too far. Rather than trying to stream games like a video, they should have the game download to a certain percentage and then allow the player to play while the game continues to download and install itself on the computer. Granted, this means the gamer would still need a decent video card, but at least it's realistic.
  • 0 Hide
    tuannguyen , January 10, 2009 8:42 AM
    See here:

    http://www.maximumpc.com/article/news/amd_hopes_bring_gaming_cloud_with_fusion_render_cloud_supercomputer

    http://www.amd.com/us/fusion/Pages/index.aspx

    AMD clearly says that Fusion is more than what people "might have thought."
    First strategy: combine GPU+CPU: odd for gamers
    Second strategy: take processing of intensive games/apps to the cloud and process/render them

    Great, so I guess I won't need to buy expensive graphics cards anymore to play the latest games. Sure, you mention that the Fusion Render Cloud will use AMD GPUs, but how does that help its bottom line? AMD will sell GPUs to itself? If games are rendered server side and only the video feed is streamed to users, clients don't need much hardware just to play back video.


    / Tuan
  • 1 Hide
    sparky2010 , January 10, 2009 10:18 AM
    it's going to be weird for a lot of gamers.. it's like suddenly switching from normal car engines to electric; basically there's going to be a lot of hesitation and doubt.. also, not to mention, the gigabytes of bandwidth needed per person per day just to play a few hours of games.. a super computer might be able to render them all, but then streaming them to the people? i'm thinking expensive.. very expensive.. and the proper connections have to be there too, which a very high percentage of this planet doesn't have ATM...

    i'm not saying it's a bad idea, on the contrary, it seems to present gamers a solution that'll allow them to finally install a game and switch graphic options on to ultra high everything and just play.. i just think that it can't exactly be feasible in this year or two.. at least not in the way we've interpreted it from the article.. there's probably something they're not telling us..

    on the other hand, the fusion cpu/gpu sounds like a good idea.. i think with this (and proper drivers..) you won't need bulky (and pricey) crossfire/sli systems anymore.. we'll just have to wait and see!
  • 0 Hide
    zodiacfml , January 10, 2009 12:18 PM
    thanks for the link to the maximumpc.com article.
    i was right in speculating that the render cloud would be used through some sort of web browser or small application to let pc gamers play with another handheld or smartphone device.
    still, i don't see how it can hurt the gpu business, unless graphics chips are in limited supply, which is very unlikely.

    yes, the cloud will use amd chips and it will help the bottom line. it's as though amd is selling their hardware to small devices, which for now is exclusive to PCs. though the profit will be spread among the several users of the service, maybe 10 or 20 gamers, it still helps sell more.

    i think your confusion comes from the perception that the cloud will allow gaming at more than 1024x768 resolution or in fast games, which it can never do, because you haven't read any specific detail of how this will be done or implemented.
    i see the big potential for this in MMO titles since they won't require fast or intricate graphics.

  • 0 Hide
    Anonymous , January 10, 2009 12:19 PM
    From what I understand this sounds good on paper, but I see 2 problems:

    1. Internet speed - unless the connection is fast enough, you may as well enjoy a screenshot slide show. Games are essentially real-time user-computer interactions, so any lag between inputs and outputs will really kill the experience.

    2. Costs - both the cost of the service and the cost of bandwidth. I think this could turn into a pay-per-play kind of service, where you pay a certain amount of money to play for a certain amount of time. On top of that, all the ISPs where I live have ridiculously low monthly data caps - you either pay for more or get thrown back to dial-up speeds once you've used yours up - so this will increase the cost of using the service as well.

    In other words, it could be affordable for light users, but I don't see how it will work out for a heavy gamer like me.

    But, you never know... the world is full of surprises. One thing is for sure, though: people won't stop upgrading their GPUs to play the latest games any time soon.
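The data-cap worry above is easy to quantify. A quick sketch, assuming the stream runs at a constant 6 Mbps (an assumed figure for illustration, not a published one):

```python
# Data consumed by a streamed gaming session at a fixed bitrate.
# The 6 Mbps bitrate is an assumption for illustration only.

def session_gigabytes(bitrate_mbps: float, hours: float) -> float:
    """Decimal gigabytes transferred over a session of the given length."""
    megabytes_per_hour = bitrate_mbps / 8 * 3600  # Mbps -> MB/s -> MB/h
    return megabytes_per_hour * hours / 1000

nightly = session_gigabytes(6, 4)   # 10.8 GB for a four-hour evening
# A handful of evenings like this would exhaust many 2009-era caps.
```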
  • 0 Hide
    ntflite , January 10, 2009 1:39 PM
    I think the Fusion cloud sounds pretty interesting. I don't believe they are trying to compete against high-end graphics cards (where the big profit margins are) and gamers who demand instantaneous feedback at maximum frame rates; rather, they are trying to enable a higher level of graphics capability for devices that don't have the hardware.

    And what that means is anything with an MPEG2/H.264/VC1 video decoding core. Hypothetically, take a digital set-top box for cable or FiOS. It will never have the capability to play WoW. But if (hypothetically) the game is hosted in the cloud, rendered in the cloud, and the resultant output is just another video stream sent to the cable box, then that user will be able to enjoy a fairly good experience--once again assuming that the cable service implements a client to access the cloud and send user input to the game hosted in the cloud (or anywhere else).

    And thinking some more, it doesn't necessarily have to be targeted towards games as we know it. Imagine again the settop scenario where one of the applications is a virtual tour of say the Louvre. The cloud just dutifully spits out a video stream rendering of wherever the person is running around. All the terabytes (petabytes) of data needed to reconstruct every detail of every piece of art in the Louvre can be hosted on servers in a central location and rendered from there.

    Or... scientists navigating models of proteins and molecules determining how they interact. Instead of each computer needing a beastly graphics card, the complex models can all be rendered in the cloud and a video stream sent to the client.

    Definitely interesting in my opinion.
  • 0 Hide
    da bahstid , January 10, 2009 4:13 PM
    Let's not overreact, people.

    The Cloud idea may or may not be successful, but if it is, it will not be providing free and easy 4870X2 graphics power for hundreds of millions. In practice, you'll likely see casual consumers being able to "rent" gameplay online at resolutions most serious gamers would consider unacceptable. It would be Crysis on low/medium settings rendered at 300x200 on a cell phone, or perhaps medium/high settings rendered at 600x400 for people who want to play games the way they surf YouTube. The graphical power required would actually be fairly small, a single 4870 being powerful enough to render several games at the same time. Tackling the problem from the CPU side would be the bigger challenge, but that would mean a lot of CPU power being rented from AMD, wouldn't it?

    On the side of having integrated graphics on the CPU... again, this is not about placing the highest-power GPUs directly on top of the CPU. It'd cost way too much in the first place for most consumers, and there would never be an adequate cooling solution for such a beast. The closer current-day analogy would be AMD fitting 3650-equivalent parallel processing power into their CPUs--not ridiculously huge, yet think about all that could be done with it. Everybody knows Intel is a mediocre GPU manufacturer, yet they actually own most of the graphics market on account of ho-hum integrated graphics. Fusion would make HD-playback-level graphics power the standard for AMD CPUs, potentially a major selling point for consumers not interested in the fuss and expense of added graphics cards and inadequate integrated graphics, finally eating into a segment of the market that Intel has been allowed to dominate without any real effort.

    Finally, the parallel processing that would become available on AMD's CPUs would be accessible to developers for any application. Media conversion would no longer be the architectural weakness for AMD chips the way it has been the last two years. Developers could expect to have sufficient CPU power to include more physics in their games, without having to gamble on whether or not enough gamers have recycled 8800GTs to run PhysX. Played correctly, this could help as a transitional technology towards outboard physics cards, because the software will finally become available enough to make it worthwhile for high-end gamers.

    When you consider that Intel's architecture is presently robust enough that their i7 can almost match the parallel processing of a lot of GPUs anyway, Fusion may not just be what AMD needs to do to compete. It may be necessary for them just to survive.
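The claim above that one card could serve several low-resolution sessions checks out arithmetically, if you assume rendering cost scales linearly with pixels per second. That is a big simplification, and all figures below are assumptions rather than measured 4870 numbers:

```python
# Naive estimate of concurrent low-res sessions per GPU, assuming
# rendering cost scales linearly with pixels pushed per second
# (a big simplification) and using illustrative, not measured, figures.

def sessions_per_gpu(budget_pixels_per_sec: int, width: int,
                     height: int, fps: int = 30) -> int:
    """Independent streams a fixed pixel-throughput budget covers."""
    return budget_pixels_per_sec // (width * height * fps)

# Suppose one card comfortably sustains 1920x1200 at 60 fps locally:
budget = 1920 * 1200 * 60                    # 138,240,000 pixels/sec
phone = sessions_per_gpu(budget, 300, 200)   # 76 cell-phone-sized sessions
casual = sessions_per_gpu(budget, 600, 400)  # 19 sessions at 600x400
```

Under these assumptions, the 300x200 and 600x400 scenarios from the comment above really do fit dozens of concurrent players onto a single card.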
  • 0 Hide
    roofus , January 10, 2009 4:26 PM
    ISPs are putting bandwidth caps in place, and you'd need a rocking, stable connection. Yea... this is a great idea at a great time.
  • 0 Hide
    Anonymous , January 10, 2009 6:51 PM
    +1 on what's been said.
    There are many security issues, and internet providers are already cutting back on bandwidth (bringing monthly speed and volume limitations).
    Seeing that, a night of playing could easily cost you a few gigabytes of network transit.

    Second, there might be applications where you will want privacy, and not for everyone to see.

    The extra cost of subscription is another thing. Many users, including myself, did not play WOW because it costs a monthly subscription fee, and I am not keen on sharing my banking info with any company online.
    Let alone paying $50 per year, per game, where other games just cost the CD ($25) and rely on free local servers, if they support multiplayer at all.

    Plus, you can always play even when you don't have internet.

    I also agree with the lag issue.
    We do everything in our power to shave 5 ms off our LCD monitors.
    They now go down to 2 and even 1 ms, yet we would play games over an internet connection with 105 ms of lag?

    The games that would work are often older games that can easily run in software render mode (no powerful graphics card needed); those will probably be playable.

    I am not for this idea; instead, AMD would do better designing their motherboards to have a plug-and-playable GPU (just like a CPU) right next to the CPU on the mobo, with a few hundred lanes.
    That would void the need for PCIe and separate graphics cards.
    Just plug in a CPU, a GPU core, and some VRAM (like you plug in DDR RAM).
    I think that'd be a much better solution, if CPU/GPU on one die is not possible.