
AMD/ATI Accelerating GPU Flash Player 10.1 Too

October 6, 2009 8:26:24 PM

Maybe one day, my GeForce 4 MX can do that too :p 

For real, this is going to make netbooks actually worth it. Flash performance has always been the part I couldn't stand.

Let's hope Intel gets on board as well.
October 6, 2009 8:35:51 PM

Why can't they do this entirely through Stream? Why would I have to buy a DirectX 11 card for this? My 4870 is still good enough, and I don't see ATI making a chipset for netbooks, so how does this really help anyone? So instead of maxing out a 65-watt part I can max out a 265-watt part. I'm not an Nvidia fanboy, I have ATI, but I just can't see where this is relevant.
October 6, 2009 8:40:13 PM

ATI-accelerated physics?

Now, that would be a nice treat!
October 6, 2009 9:00:23 PM

It's called Havok.
October 6, 2009 9:02:40 PM

They are doing it through DX11? Well, that's a bummer for those with ATI cards that don't have that, unless I misunderstood.
October 6, 2009 9:04:43 PM

I thought Flash was already accelerated by graphics cards? Or was that only OpenGL?
Anonymous
October 6, 2009 9:05:39 PM

I very much doubt that the 58x0 series of cards would be the only AMD GPUs to support Flash acceleration.

After all, the main focal point for this technology would be laptops and netbooks, and those are, as a general rule, equipped with far less capable GPU solutions, though definitely enough to handle Flash.

I'd wager the 58x0 was simply used for the demo since it's the first implementation and a way to show off the new technology.
October 6, 2009 9:07:42 PM

Does this mean that Flash 10.1 can waste even more power than before? Instead of using a very efficient C2D, let's use a GPU that uses 3x as much power.

Death to Flash.
October 6, 2009 9:23:59 PM

geoffs: Does this mean that Flash 10.1 can waste even more power than before? Instead of using a very efficient C2D, let's use a GPU that uses 3x as much power. Death to Flash.

IzzyCraft: They are doing it through DX11? Well, that's a bummer for those with ATI cards that don't have that, unless I misunderstood.


Okay, so I said the same thing as these guys but get 2 negatives. The article said DirectX 11. All I am trying to say is: how can Nvidia do it on a mobile, low-watt 9400 (Ion) that is not a DirectX 11 GPU, but ATI needs a 5870? Also, why would I need Flash acceleration for anything but a netbook or mobile device? Most laptops and desktops can handle Flash fine. The only time this becomes relevant is if ATI released a mobile GPU like Ion.
October 6, 2009 9:24:12 PM

geoffs: Does this mean that Flash 10.1 can waste even more power than before? Instead of using a very efficient C2D, let's use a GPU that uses 3x as much power. Death to Flash.


What GPU uses three times more power than a processor does? A high-end GPU can draw up to 225 watts (75 W from the PCIe slot plus two 75 W 6-pin connectors), and pairing a GPU that requires extra power connectors with even a dual-core won't get you the magical "3x power consumption". Both parts already consume electricity at idle, so you might as well put them to use.

I don't see what you're complaining about.
Anonymous
October 6, 2009 9:27:41 PM

Hmm, sounds like a recipe for making the most universally playable games (Flash) require specific hardware. Yay!!!

:( 
October 6, 2009 9:43:00 PM

arg@argcom: Hmm, sounds like a recipe for making the most universally playable games (Flash) require specific hardware. Yay!!!


No, you're wrong. This doesn't mean that you will NEED a GPU to play flash games.
It means that IF you ALREADY have one, you can play flash games FASTER and BETTER, consuming LESS CPU RESOURCES, allowing you to MULTITASK BETTER.
I used caps selectively so you can get the core concept better.
October 6, 2009 10:38:33 PM

Wow, Atom systems will be able to use Flash without stuttering when paired with a 5870! That would be an uber system config! :|

Of course, that would be a short-sighted remark. It would all make more sense when DX11 integrated graphics comes to under-powered platforms. Until then, this would probably be academic.

"Both Adobe and AMD worked with the DirectX 11 API's compute shader and ATI Stream technology to accelerate Flash performance with the GPU."

I wonder why they didn't opt to use OpenCL instead so it would be more portable across OSs and actually run on more (older) hardware.
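Just to illustrate the portability angle (this is only my hedged sketch, not anything Adobe has shown): the OpenCL calls below are the standard ones from CL/cl.h and behave the same on Windows, OS X, and Linux, across GPU vendors, which is exactly what a DirectX-11-only compute path can't offer.

// Minimal sketch: enumerate every OpenCL-capable GPU the same way on any OS.
// Assumes an OpenCL driver/SDK (ATI Stream, Nvidia, Apple) is installed.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms)
    {
        cl_uint numDevices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices);
        if (numDevices == 0)
            continue;

        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id d : devices)
        {
            char name[256] = {};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("OpenCL GPU found: %s\n", name); // same call on every OS and vendor
        }
    }
    return 0;
}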
October 6, 2009 11:42:56 PM

the_krasno: No, you're wrong. This doesn't mean that you will NEED a GPU to play flash games. It means that IF you ALREADY have one, you can play flash games FASTER and BETTER, consuming LESS CPU RESOURCES, allowing you to MULTITASK BETTER. I used caps selectively so you can get the core concept better.

Dude we're not kids - no need to CAPITALIZE your STRONG POINTS! It's so damned annoying.
October 7, 2009 1:39:17 AM

but it emphasizes when my BALLS are ITCHING like CRAZY because of that STUPID GIRL from the beach...

In all reality, who didn't see this coming? I just hope that GPU support can be backwards compatible with older video cards, if for no other reason than that older laptops can't handle full-screen Hulu and other Flash apps. :-p
October 7, 2009 3:11:29 AM

omnimodis78: Dude we're not kids - no need to CAPITALIZE your STRONG POINTS! It's so damned annoying.


When people stop acting like kids, I will start treating them like adults.
October 7, 2009 4:04:34 AM

thackstonns: Why can't they do this entirely through Stream? Why would I have to buy a DirectX 11 card for this? My 4870 is still good enough, and I don't see ATI making a chipset for netbooks, so how does this really help anyone? So instead of maxing out a 65-watt part I can max out a 265-watt part. I'm not an Nvidia fanboy, I have ATI, but I just can't see where this is relevant.
You may not be a fanboy, but you are poorly informed. DirectX 11 (software) will include Compute Shaders for DX 10+ cards (hardware). DirectX 11 cards will be able to run CS 5.0, DX 10.1 cards will run CS 4.1, and DX 10 cards will run CS 4.0.
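A quick sketch of that point, strictly my own illustration rather than anything from Adobe: the DirectX 11 runtime will happily create a device on DX10/10.1-class hardware and report whether Compute Shader 4.x is exposed, so "DirectX 11" here means the API, not necessarily a 5800-series card.

// Hedged sketch: check which compute shader tier the installed GPU exposes
// through the DirectX 11 runtime (works on DX10-class hardware too).
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;

    // Let the runtime pick the highest feature level the hardware supports.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return 1;

    if (level >= D3D_FEATURE_LEVEL_11_0)
    {
        std::printf("Feature level 11_0: Compute Shader 5.0 available\n");
    }
    else
    {
        // On 10.0/10.1 hardware, CS 4.x support is optional; the driver reports it here.
        D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
        device->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                    &opts, sizeof(opts));
        std::printf("Compute Shader 4.x available: %s\n",
            opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x ? "yes" : "no");
    }

    device->Release();
    return 0;
}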

Also you are mistaken about power usage of these cards. When accelerating something like Flash using CS, the chips will probably run at a considerably lower power level than max. Probably near their idle wattage, which on the 5800 series is extremely low. The future lower end 5xxx cards should be even better. For the record, Nvidia cards are a lot more power hungry at any given performance level, currently.

GT300 is going to be very large, so it may also suck up a lot of power like its predecessor. I don't think GT300 will scale down quickly, so a redesign of the GT200 on a 40nm process might be a good idea for Nvidia, especially for the low-cost and low-power markets, and would complement GT300 well.
October 7, 2009 6:10:18 AM

... and Intel tries to save x86... but in time, the CPU will be just a powerful router... the heavy loads will be done with raw computing power provided by the GPU and maybe some other hardware inventions in the future...
Anonymous
October 7, 2009 6:27:19 AM

Can't wait for "Quake III Arena"-level 3D graphics in my porn popups.
October 7, 2009 8:00:37 AM

Everything just needs to use more GPU. Give my new Radeon 5870 a workout.
October 7, 2009 12:30:08 PM

Actually, the_krasno, you are the one who is mistaken.

If the GPU acceleration lets you play Flash games "better and faster" and offloads resources from the CPU then more and more complex games will be created that for all intense purposes require acceleration to be playable.

Once you make a resource available, it is going to be used. As history in any computer game field has demonstrated over and over again, programmers will always attempt to wring every last drop of performance they can and often overshoot the current limits on hardware and reach for the next level that is yet to come.

Flash is already a resource heavy program, and with this there will surely be games developed that will bog down older systems without the hardware support.
October 7, 2009 4:29:47 PM

It's also possible that some GPUs may receive driver updates for DirectX 11 compatibility. But of course planned obsolescence from hardware manufacturers will ensure that there's always something missing.
October 7, 2009 5:15:32 PM

I have mixed feelings about this. I like the idea for some things, I have the hardware to pull it off, and some 3D Flash does sound cool. At the same time, it seems like there will be little support for this and in practice it will be rarely seen, so it's hard to get all hyped up about it. Maybe this is a reaction to that 3D-accelerated browser thing; I think it was called WebGL.
Anonymous
October 7, 2009 5:34:47 PM

Marcus still doesn't have a handle on English.

"Yesterday when Adobe announced its Flash Player 10.1 that'll be coming down the pipe"

That would be "pike" not "pipe."
Anonymous
October 7, 2009 5:55:31 PM

Flash videos suck, and are not comparable to MP4/AVI videos.
October 7, 2009 6:17:01 PM

C'MON INTEL! The GMA 950 sucks as a graphics adapter, but it would be great if Flash could use it SOMEWHAT so that my netbook's CPU doesn't get the shit kicked out of it when watching Hulu videos.
October 7, 2009 6:18:41 PM

zingam: Why not? They could just disable some number of CUDA cores and you have a cheaper GPU. I bet they'll do it. With that 3-billion-transistor GPU they'll have a lot of harvesting to do. You will definitely see a whole bunch of crippled cards from NVIDIA next year.


Yup, the only market I see is Atom/Ion HTPCs for watching Hulu and YouTube fullscreen. Still, that is a pretty good niche market.
October 7, 2009 6:22:19 PM

If you have a system that's using a 5870, then you most likely have other high-end parts, which means you won't notice any improvement through hardware acceleration.

What's needed is support for low-end systems, e.g. a single-core 1.6 GHz with a GeForce 7300 or something else low end; these systems will not handle full-screen Flash unless Adobe comes out with a Flash player that's fully GPU accelerated.

October 7, 2009 6:32:17 PM

dheadley: ... more and more complex games will be created that for all intense purposes require acceleration to be playable ...


Seriously, I just had to sign up to correct this! This is a real howler! The phrase is "all intents and purposes", as in "intentions and purposes"; it has nothing to do with intensity. I can't believe I'm having to write this...
October 7, 2009 7:03:25 PM

zingam: I don't get it! If you have a 4870 then you should probably have a powerful CPU too. Why do you need acceleration then?
So you're saying that producing a 40nm wafer of 1.5 billion transistor chips will produce the same number of chips (working or otherwise), as a 40nm wafer of 3 billion transistor chips? Using the same wafer size and manufacturer? The answer is no. Yes, they will be binning the GT300 - all chips are binned, my friend. But the only thing binning helps with is poor yield %, it doesn't change the actual cost to make the damned things. It allows them to sell partially-broken or underperforming parts. But this will only scale down so far.

Look no further than the GT200 for an example. The chips can scale UP as yields improve, but scaling DOWN is impossible unless you like losing money. Did they ever sell a GT200 below the 260? No. They are too expensive to sell in the cheap sector, so they used a rebadged 9800 GTX+ (G92b) and named it GTS 250 rather than selling GT200 chips that were even more gimped than the GTX 260. G92 is smaller, and therefore a hell of a lot cheaper to produce on the same process. Everything GTS 250 and below is G92 currently.

They also have trouble getting heat and power consumption issues under control with large chips. Again, take the GT200 as an example. Now look at the laptop "GTX 260M" and "GTX 280M". Guess what? They're not GT200 either! That's right, the mobile "GTX260/280" chips are G92, again.

Conclusion: the huge GT200 die was just not appropriate for the mobile and cheap markets, at 55nm. At 40nm, the huge GT300 will have problems scaling for these same markets. So for Nvidia's sake, I hope they release a GT2xx (upgraded GT200, maybe with DX 10.1 and CS 4.1), to cover these important markets.
October 7, 2009 7:05:55 PM

Sorry, I meant to quote this Zingam's other comment (below). I was explaining why it's not always better to just sell your huge fancy new chip in all market sectors.

zingam: Why not? They could just disable some number of CUDA cores and you have a cheaper GPU. I bet they'll do it. With that 3-billion-transistor GPU they'll have a lot of harvesting to do. You will definitely see a whole bunch of crippled cards from NVIDIA next year.

October 7, 2009 8:33:38 PM

Why are so many people complaining? This is a good thing that Adobe is doing, especially for Flash video sites on next-gen netbooks or HTPCs.
October 7, 2009 9:19:52 PM

mlcloud: What GPU uses three times more power than a processor does? A high-end GPU can draw up to 225 watts (75 W from the PCIe slot plus two 75 W 6-pin connectors), and pairing a GPU that requires extra power connectors with even a dual-core won't get you the magical "3x power consumption". Both parts already consume electricity at idle, so you might as well put them to use. I don't see what you're complaining about.
Intel's mainstream desktop C2Ds use a maximum of 65W. As you pointed out, GPUs can use upwards of 200W, which is at least 3x.
October 8, 2009 11:59:22 PM

geoffs: Intel's mainstream desktop C2Ds use a maximum of 65W. As you pointed out, GPUs can use upwards of 200W, which is at least 3x.

That's at full tilt. A GPU with 3x the power consumption but many times the raw compute power means it won't be running at full tilt. At full tilt a 5870 puts out over 2 TFLOPS. How about that C2D? So if the GPU is mostly idle while accelerating something easy, it may not eat as much power as you think, and more importantly, it's going to eat some of that power ANYWAY even if you don't use it.

Look at the latest integrated GPUs, which are capable of accelerating Blu-ray streams (including H.264). If you took away that acceleration and tried to decode it entirely in software (i.e. on the CPU), you'd need a fast CPU and it would eat a lot of CPU cycles. In the end, it would probably eat up more power than letting the GPU do most of the work.
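For what it's worth, here's a rough sketch of how an application can ask the driver whether that kind of decode acceleration exists. It's only my illustration (Flash itself doesn't necessarily work this way), and it assumes a Direct3D 9 device has already been created elsewhere.

// Hedged sketch: ask DXVA2 whether the GPU advertises full H.264 (VLD) decode.
// 'd3d9Device' is assumed to be an IDirect3DDevice9* created elsewhere.
#include <windows.h>
#include <initguid.h>   // so the DXVA2 profile GUIDs get defined
#include <d3d9.h>
#include <dxva2api.h>
#pragma comment(lib, "dxva2.lib")

bool GpuCanDecodeH264(IDirect3DDevice9* d3d9Device)
{
    IDirectXVideoDecoderService* service = nullptr;
    if (FAILED(DXVA2CreateVideoService(d3d9Device, IID_IDirectXVideoDecoderService,
                                       reinterpret_cast<void**>(&service))))
        return false;

    bool found = false;
    UINT count = 0;
    GUID* guids = nullptr;
    if (SUCCEEDED(service->GetDecoderDeviceGuids(&count, &guids)))
    {
        for (UINT i = 0; i < count; ++i)
            if (IsEqualGUID(guids[i], DXVA2_ModeH264_VLD_NoFGT)) // common full-decode profile
                found = true;
        CoTaskMemFree(guids);
    }
    service->Release();
    return found;
}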
October 9, 2009 1:56:27 AM

Interesting, seems they're playing the "we has that 2!" game with Nvidia.
Anonymous
November 4, 2009 1:17:09 PM

So Adobe would utilize the M$ compute API to decode the H.264 bitstream, just like what CoreAVC does (using CUDA)?
Anonymous
March 2, 2010 6:35:44 PM

The purpose of Flash is to have moving colors and graphics that require the CPU to build, render, and animate vectors, using very little data to be served by the content-hosting website. This was the alternative to streaming video, which maxed out bandwidth at the time of its invention, for providing rich content on websites.

Since it caught on and works with most browsers via a plug-in, people have clicked ads and linked from one website to another, and because of its success, games, simulations, and even programs have been written for it.

However, games like FarmVille or Cafe World on Facebook chew the hell out of all my PCs and kill battery life when logging on while mobile.

But to be fair to everyone who logs in to websites: they use Flash, which is only accelerated to a point and uses a single CPU core to the maximum, no matter the quality level, when rendering vector graphics.

The point of the technology is to have a successful install base using a uniform plug-in acquired from one common source, and to exploit this avenue for maximum advertisements and potential profits.

If they wanted to, they could program the games to run using whatever acceleration you have on your PC, but not everyone has the same chip or chipset, and the cost of true support for it would kill any potential profits.

Remember when DVD first came to the PC: there were systems that could handle it fine and others that would stutter and skip and look terrible. The standard was the disc and MPEG-2 compression, but the systems that played that media type were all different. So the push was to standardize the requirements, and that's where Intel and ATI and Nvidia all added acceleration for that particular data type for performance and power optimization. It was also driven by the media market, so the sales of DVDs would increase.