AMD/ATI GPUs Accelerating Flash Player 10.1 Too
That Radeon HD 5800 series card will be VERY good at making Flash go.
Yesterday, when Adobe announced the upcoming Flash Player 10.1, Nvidia was among the first to chime in, saying that its GeForce and Ion GPUs will be able to accelerate Flash video.
AMD has informed us that it plans to support Adobe Flash Player 10.1 at the same level, in step with the beta release later this year.
"AMD is committed to making the video usage scenario -- playback, editing and transcoding -- a focal point for AMD platform innovation, smartly using the full CPU + GPU assets of our platform to enrich and accelerate the experience," an AMD representative told us.
In fact, at the Adobe MAX event in Los Angeles, Adobe demonstrated a private alpha of Flash Player 10.1 that is supported by the ATI Radeon HD 5800 (the only GPU on the market right now to fully support DirectX 11).
Both Adobe and AMD worked with the DirectX 11 API's compute shader and ATI Stream technology to accelerate Flash performance with the GPU.
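The article doesn't say what the shaders actually compute, but video playback is dominated by embarrassingly parallel per-pixel math, which is exactly what a compute shader offloads. As a rough, illustrative sketch (not Adobe's implementation), here is the classic limited-range BT.601 YUV-to-RGB conversion for a single pixel; the real thing would be an HLSL compute shader running one GPU thread per pixel:

```python
# Illustrative only: per-pixel YUV -> RGB conversion (limited-range BT.601),
# the kind of data-parallel math a GPU compute shader can do per frame.
def yuv_to_rgb(y, u, v):
    """Convert one YUV pixel (0-255 integer components) to an (r, g, b) tuple."""
    c, d, e = y - 16, u - 128, v - 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    r = clamp(1.164 * c + 1.596 * e)
    g = clamp(1.164 * c - 0.392 * d - 0.813 * e)
    b = clamp(1.164 * c + 2.017 * d)
    return r, g, b

# A CPU loops over every pixel of every frame serially; a compute shader
# instead dispatches one lightweight GPU thread per pixel.
print(yuv_to_rgb(235, 128, 128))  # peak white -> (255, 255, 255)
```

At 1080p that loop runs roughly two million times per frame, which is why moving it off the CPU matters so much on netbook-class hardware.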
Hooray for the growing utility of the GPU beyond just playing 3D games!

For real, this is going to make netbooks actually worth it. Flash performance has always been the part I couldn't stand.
Let's hope Intel gets on board as well.
Now, that would be a nice treat!
After all, the main focal point for this technology would be laptops and netbooks, and those are, as a general rule, equipped with far less capable GPU solutions, though still definitely enough to handle Flash.
I'd wager the 58x0 cards were simply used for the demo because it's the first implementation and a way to show off the new technology.
Death to Flash.
Okay, so I said the same thing as these guys but got two negatives. The article said DirectX 11. All I am trying to say is: how can Nvidia do it on a mobile, low-watt 9400 (Ion) that is not a DirectX 11 GPU, but ATI needs a 5870? Also, why would I need Flash acceleration for anything but a netbook or mobile device? Most laptops and desktops can handle Flash fine. The only time this becomes relevant is if ATI released a mobile GPU like Ion.
What GPU uses three times more power than a processor does? A high-end GPU tops out around 225 watts (75 W from the PCIe slot plus 75 W from each of two 6-pin connectors), and pairing a GPU that requires extra power connectors with even a dual-core won't get you the magical "3x power consumption." Both parts already consume electricity at idle; might as well put them to use.
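To put numbers on that commenter's arithmetic, the 225 W figure falls out of the PCI Express power-delivery limits; note it is a connector-defined ceiling, not what the card actually draws while decoding video:

```python
# Worst-case board power for a card with two 6-pin connectors,
# per PCI Express power-delivery limits.
PCIE_SLOT_W = 75  # deliverable through the x16 slot itself
SIX_PIN_W = 75    # per auxiliary 6-pin PCIe connector

gpu_max_w = PCIE_SLOT_W + 2 * SIX_PIN_W
print(gpu_max_w)  # 225 -- an upper bound, not typical draw
```

Actual draw during light compute work like video acceleration sits far below this cap, closer to idle power, which is the commenter's point.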
I don't see what you're complaining about.
No, you're wrong. This doesn't mean that you will NEED a GPU to play flash games.
It means that IF you ALREADY have one, you can play flash games FASTER and BETTER, consuming LESS CPU RESOURCES, allowing you to MULTITASK BETTER.
I used caps selectively so you can get the core concept better.
Of course, that would be a short-sighted remark. It would all make more sense when DX11 integrated graphics comes to under-powered platforms. Until then, this is probably academic.
"Both Adobe and AMD worked with the DirectX 11 API's compute shader and ATI Stream technology to accelerate Flash performance with the GPU."
I wonder why they didn't opt to use OpenCL instead, so it would be more portable across OSs and would actually run on more (older) hardware.
Dude we're not kids - no need to CAPITALIZE your STRONG POINTS! It's so damned annoying.
In all reality, who didn't see this coming? I just hope that GPU support can be backward compatible with older video cards, if for no other reason than that older laptops can't handle full-screen Hulu and other Flash apps. :-p
When people stop acting like kids, I will start treating them like adults.
Also, you are mistaken about the power usage of these cards. When accelerating something like Flash using compute shaders, the chips will probably run at a considerably lower power level than max, likely near their idle wattage, which on the 5800 series is extremely low. The future lower-end 5xxx cards should be even better. For the record, Nvidia cards are currently a lot more power hungry at any given performance level.
GT300 is going to be very large, so it may also suck up a lot of power like its predecessor. I don't think GT300 will scale down quickly, so a redesign of the GT200 on a 40nm process might be a good idea for Nvidia, especially for the low-cost and low-power markets, and would complement GT300 well.