ARM And AMD Partner On OpenCL
I have to admit that I was hoping for much more news from ARM's participation in AMD's Fusion software developer conference, currently under way in Seattle.
There is a persistent rumor that AMD may license the ARM architecture to make its way into the smartphone and tablet space, but there was no confirmation of such a move at the conference. Instead, ARM and AMD are partnering in the OpenCL space to promote the creation of GPU-accelerated apps.
At its conference, AMD announced a set of new OpenCL development tools that cater specifically to its Fusion APUs. The most interesting part of this announcement is that it was made by Manju Hegde, corporate vice president of AMD's Fusion experience program. Some readers may remember Hegde as the founder and CEO of Ageia, the company that invented the PhysX chip. Ageia was acquired by Nvidia in early 2008, and Hegde is now at AMD pitching OpenCL support, which competes directly with Nvidia's CUDA.
ARM's Jem Davies delivered a keynote at AMD's Fusion event, and while there are obvious competitive tensions between ARM and x86 products, the executive stressed that ARM and x86 are the only remaining "relevant" CPU architectures. Davies also pitched a hybrid processor approach that combines CPU cores, parallel arrays and circuits dedicated to very specific functions, which would obviously favor highly parallel software written in, for example, OpenCL. It is a somewhat surreal experience to see ARM speaking at AMD's (x86) developer event, and AMD could have simply invited ARM to annoy Intel. To see the partnership evolve is interesting, but the benefit to developers at the event was very limited.
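The "highly parallel software" Davies described is easiest to picture in OpenCL itself. As a hedged illustration (this is a generic OpenCL C device kernel sketch, not code shown at the event), each work-item handles a single array element, and the runtime fans thousands of work-items out across the GPU's parallel array:

```
// Illustrative OpenCL C kernel (device-side fragment, assumed names):
// computes y = a*x + y, one element per work-item.
__kernel void saxpy(__global const float *x,
                    __global float *y,
                    const float a)
{
    size_t i = get_global_id(0);  // this work-item's global index
    y[i] = a * x[i] + y[i];
}
```

The host program (omitted here) compiles this source at run time and enqueues it with one work-item per element, which is what lets the same code scale from a small GPU to a large one.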
I still believe that there is much more to come and that those AMD-ARM rumors have some substance.
Didn't AMD, in the past, drive to a press event meant for Intel and whisk the press off to AMD's own showing right under Intel's nose? I think they were lured away because AMD had actual working hardware. Might have been the early battles over 64-bit architecture... I can't recall.
I just want to say I completely condone and support this type of shenanigans.
Open Computing Language vs. Open Graphics Language: the former is for doing general processing on a graphics card, the latter is a graphics standard that used to compete with DirectX. Different standards for different purposes.
time to play ball.
OpenGL was great despite its competition.
OpenGL would still be widely used (I have a few games that use it, and they are far better than their DX counterparts), but because Microsoft ran a FUD campaign, they stole the show.
As of now, OpenGL is at about DX10- and DX11-quality graphics, but works on EVERY SYSTEM, while DX10 and DX11 are Windows 7 only (Vista is no more... not worth mentioning).
Eventually another group took over maintaining and modifying the OpenGL standard, and they had some very good ideas. But the original industrial group had veto power over any changes, and they threatened to use it liberally. So the fight over the "next" version of OpenGL went back and forth for a few years, all the while MS was perfecting DirectX and software developers had all switched to DirectX. When the OpenGL group was finally able to release OpenGL 3.0, it was lacking many of the originally desired features, and DirectX 9 was dominating everything. Now OpenGL has mostly been rendered irrelevant, and its most common use is as a wrapper for Wine's DirectX emulation or as the engine behind various GUIs in Linux. The industrial developers are still using their old OpenGL libraries, and they have ~zero~ incentive to move beyond that.
The "rule by committee" method doesn't work in a rapidly changing world. You can't have a council of representatives deciding the future of a standard, especially when they're at cross purposes and in competition with each other. That would be like having Intel, AMD and Via all on a council trying to decide the next ISA. It would be a total and utter failure, as each (especially Intel) would try to shape the standard in such a way as to give itself an advantage over the others.
Whatever ARM supports will be the de facto mainstream, which is why AMD is wise to partner up with them for the development of OpenCL. ARM processors account for around 90% of all processors sold in the world, so their influence is not to be taken lightly.
Yes, it is true that it is open and that it is evolving, but so is DirectX. I doubt that the industry has much incentive to roll back to OpenGL at the moment. Most PC games are still on the MS platform, and those that are on the Mac show significantly less quality, with the difference being OpenGL.
The fact is OpenGL works great for workloads, but it does not work so great for games. MS has already developed a flexible platform that supports a lot of features and can easily be used to port over to their own consoles, and from my understanding most other consoles use proprietary APIs, meaning the open standard of OpenGL gives them no significant advantage, so companies come up with something that might.
The fact is there is not much advantage to using OpenGL, unless they come out with a way to port applications from the PC directly to smartphones, which is the only mass consumer item that I actually spot in the wild that can use OpenGL. Even with this feature you have problems with scaling from a large monitor to a small one, and with compatibility across the board, since you move to an entirely different form factor.
OpenCL stands a chance: its only real competition is locked down to a single hardware manufacturer, so I don't see it getting ported across the board to support its competitors. The project is supported by industry, and the industry is still developing. Not to mention AMD has a lot more to gain from moving more of the workload to the GPU than anyone else out there, so you can bet they're going to do everything they can to draw developers.
Since there is no game-industry equivalent to drive OpenCL at this point, they still have wiggle room to fumble a little bit; it is more understandable to have delays on an up-and-coming technology than on something that is perceived as mastered by the respective companies. We don't expect Airbus or Boeing to have years of delays on a refresh of one of their designs, but tell them to make a lightsaber and we think... "A six-month to a year delay? Only that? That's really good."
Same here: people will see games, and companies won't want to experiment with what already works well for them. However, if you introduce something that allows them to shoot more of that code at the GPU, then you might spark their interest.
You can also bet that the new Llano chips are turning heads. They support OpenCL, and they last a long time with that GPU under load. If this chip takes off, the use of OpenCL can potentially do a lot to lower system requirements on what would otherwise be hard-to-run software.
Yes, OpenGL was created for CAD and professional visualization. And yes, Microsoft was able to quickly improve Direct3D (DirectX is not just graphics). However, Microsoft seemingly improved by light-years because Direct3D was SO BAD to begin with. Microsoft had NO IDEA what game developers wanted from 3D graphics; that is why many in the mid-90s went with OpenGL. And if you actually look at the way the Direct3D API has evolved, you will notice one thing: each version became more and more like OpenGL!
Also, you did not mention that Microsoft has tried to cripple OpenGL at every chance. For example: backing away from Fahrenheit, which would have united OpenGL and Direct3D; reversing their commitment to provide DirectDraw bindings for OpenGL; and removing OpenGL support from their basic drivers.
Sorry, but anyone who has actually used OpenGL for games knows that is a false statement.
Also, I guess all those id Software based games since Quake II are not so good on the graphics? Have you seen the latest demo of idTech 5 engine?
One more thing: the OpenGL API is available natively on Windows, Linux and other Unix-like OSs, iOS (iPad, iPhone, etc.), OS X, Android, and even the PlayStation 3. And with third-party libraries you can use it on the Xbox, Wii, and Windows CE/Mobile/Phone. With the push for 3D games and apps on portable devices, Direct3D is at a BIG disadvantage.
Just making a multicore CPU x86-compatible does not mean existing code will utilise all cores. Programmers will still need to rewrite and optimise their code, just not as much as a complete overhaul. Another aspect is that not all situations are ideal for the x86 instruction set. Also, it would be nice to broaden the industry in high-performance computing; allowing the 40+ chip companies that make non-x86 chips to compete against the two that do could really benefit everyone in the long run.