
AMD's Fusion Cloud Could Hurt GPU Biz

How often do you upgrade your graphics card to play the latest first-person shooters? Every six months? Every year?

Graphics cards are the focal point of a gamer's system. The GPU is the make-or-break component that determines whether a high-end game is playable. This, in essence, is what AMD wants to address with its Fusion supercomputer: AMD wants to be able to deliver a game to users on any computer. Because of the way Fusion delivers graphics--essentially pre-rendered server-side and then streamed over the Internet--gamers will be able to enjoy all the latest games, no matter what GPU they have in their system.

While this makes perfect sense for the consumer, where does AMD's graphics card business stand in this situation?

With a really good Internet connection--and net connections in the U.S. are horribly behind--you can enjoy the latest games even on a GPU that can't handle 3D rendering at all. Time to whip out that old ATI Mach 64 card.
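For a rough sense of what "a really good Internet connection" would have to carry, here is a back-of-the-envelope estimate. The frame rate and compression budget below are illustrative assumptions, not figures from AMD:

```python
# Rough bandwidth estimate for streaming a pre-rendered 720p game feed.
# FPS and BITS_PER_PIXEL are illustrative assumptions, not AMD specs.

WIDTH, HEIGHT = 1280, 720   # 720p frame
FPS = 30                    # assumed stream frame rate
BITS_PER_PIXEL = 0.1        # rough H.264-class compression budget

raw_mbps = WIDTH * HEIGHT * 24 * FPS / 1e6                     # uncompressed 24-bit RGB
compressed_mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e6  # after compression

print(f"uncompressed: {raw_mbps:.0f} Mbps")    # ~664 Mbps
print(f"compressed:   {compressed_mbps:.1f} Mbps")
```

Even with aggressive compression, that is a sustained multi-megabit stream per player, before counting the round trip your control inputs make back to the server.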

But Fusion is more than pre-rendered games on a supercomputer; it's also about integrating the GPU and the CPU into one chip. This too presents an issue for the gamer: why would a gamer want their GPU tied to a CPU, when a CPU can outlast a GPU by many months and even years?

We spoke to an AMD board partner today, who wanted to stay anonymous, and who indicated to us that the future for AMD GPUs may be very limited for board partners. Once Fusion has the GPU and CPU in one chip, what will the board partners do? Will there be Fusion add-in boards on motherboards instead of CPU sockets? That could very well be the outcome. Even more interesting, Zotac, an Nvidia partner, told us that Fusion is actually great for the Nvidia camp because "no one wants to have to upgrade both every time; as a gamer, I want to upgrade my graphics card separately, especially when I already have a great CPU."

Here's the real kicker, though: both Zotac and the AMD board partner said that AMD could really kill 50 percent of its business with Fusion. Obviously, we think AMD has thought about this already. When asked if AMD plans to exit the discrete GPU business, the company declined to comment on future initiatives.

As Nvidia continues to push on the strengths of its discrete GPU solutions, AMD seems to be taking a totally different route. Only time will tell how these strategies will play out.

One more thing: if AMD does make its Fusion supercomputer a success, and gamers can always play the latest games no matter what their platform is, who would buy a GPU at all--from AMD or Nvidia?

  • Move over consoles!
    Reply
  • Blessedman
    lol this honestly is a pipe dream (literally)
    Reply
  • knightmike
    I think Fusion Cloud is a great idea.
    Reply
  • falchard
    I don't think it will hurt graphics cards. The application is limited. Right now you will achieve a better looking game using client-side rendering. At worst if this idea catches on, you will have companies buying hundreds of video cards every 6 months to render for the cloud.
    Reply
  • The future course of this may be more obvious than it seems. This problem has been solved before: evolutionary history faced these exact same parameters in the development of nervous tissue. Neurons evolved first, then neural nets, and finally specialized neurons and neural nets for each problem causing evolutionary pressure. If transistors have hit a speed limit, then multiple processors are the first obvious solution. The next real step will be dedicated processors and groups of processors for specific problems. We already know that custom silicon can produce solutions several orders of magnitude faster than general-purpose processors like CPUs, even at the same frequency. The best-known example of this is graphics processors. It is also clear that both Microsoft and Intel have conflicts of interest with improving the efficiency of software and hardware: Microsoft needs new hardware to justify new versions of software, and Intel needs fancier, more demanding software to drive hardware upgrades. Netbooks such as the one I am currently using scare them both silly. There is little or no incentive for either to really innovate since they are already at the top of the heap. The real question is: who will be the next disruptive innovator? How will we recognize them when they appear on the scene? Look for the first one to push heterogeneous processors designed to take advantage of problem-specific workloads. Maybe a hungry AMD, Apple, or some new startup will bring us what we are looking for.
    Reply
  • Maybe I'm being stupid here, but could a GPU hand graphics processing back to the CPU(s)?

    I'm thinking gamers, here - nobody plays WOW and uses MS office at the same time, right?

    You DO? Does your BOSS know?

    J/K!

    But in current quad-core systems, what if the video drivers offloaded processing needs to idling cores? Transparent to the game software?
    Reply
  • WR
    First, by offloading a GPU to a cloud supercomputer, someone still has to pay for the supercomputer and now the network bandwidth. Essentially you may be renting a GPU remotely, and this may cost more or less than owning one.

    Second, notwithstanding the mediocre net connections in some countries, even the best connections will lag behind a real GPU. Responses won't be as instant, and quality won't be as good as a high-end GPU.

    Third, this will impact low end GPUs the most, especially integrated solutions. But does it fully rectify Intel's horrible integrated performance?

    Fourth, GPUs cost both in hardware outlay and power usage. If your computer is part of the cloud, what measures will be taken to prevent leeching? People might really skimp on graphics, or have nothing but 2D and try to fake 3D capability. I'm talking about the majority who just might not be using AMD platforms. Will the system hold up or defend against such abuse?
    Reply
  • zodiacfml
    this is nonsense.
    how could you hurt the discrete GPU business with that amd fusion supercomputer? it's as though amd will be providing discrete gpu performance to all users of that service, probably a million users! that even excludes the problem of the broadband speed required for an average gaming desktop resolution, probably at least 10mbps.

    what amd's fusion will do is provide a more cost- and space-efficient way to drive a display than integrated graphics. so it is probably the integrated graphics business that will lose, since it will sit between the fusion and discrete graphics businesses, which is quite a crowd.

    the last paragraph is just annoying. you don't buy a GPU for a handheld gaming device.
    Reply
  • tuannguyen
    Not sure you really understand what AMD is going for. With the Fusion supercomputer, they are not adding "some" graphics power to discrete solutions--they want to render the entire game server-side, compress the video, and stream it to the client. Basically, the client has no client-side game install, just a terminal where you provide control feedback.

    This hurts discrete, not integrated solutions. If Fusion is successful, you will not need a super-fast 3D card to play the latest games--provided that your net connection is fast enough to stream the video feed at 720p resolution or beyond.

    / Tuan
    Reply
  • enewmen
    I remember reading a while ago that PCs are reaching the limits of graphics. It's like a car: you keep making the engine more powerful, then you start needing better wheels, brakes, etc., then a whole new frame. Finally, the car starts going so fast it can't stay on the ground. So the basic idea of what a car is needs to be re-examined.
    So, basically, the Fusion project is as much about CPU/GPU intercommunication as it is about putting both on one die. I also think Intel is investing heavily in this area.
    2 cents worth..
    Reply