AMD: DirectX Holding Back Game Performance
AMD claims that game developers actually want the API to go away.
With all the hype surrounding DirectX 11 and its promise of mind-blowing eye candy for PC gaming, Richard Huddy, AMD's worldwide developer relations manager for its GPU division, claims that developers actually want the API to go away, arguing that it's getting in the way of creating some truly amazing graphics.
"I certainly hear this in my conversations with games developers," he told Bit-Tech in an interview. "And I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all."
Aside from a few developers who have announced that PC versions will take priority over console versions, a good chunk of the gaming industry develops titles for the Xbox 360 and PlayStation 3 first and then ports them over to the PC. The result is that PC versions are only slightly superior to their console counterparts in visual terms, even though a high-end graphics card has at least ten times the horsepower of the Xbox 360's Xenos GPU or the PlayStation 3's GeForce 7-series-based GPU.
Although PC graphics are better than their console versions, developers can't tap into the PC's true potential because they can't program the hardware directly at a low level; they're forced to work through DirectX instead. There are benefits to working with an API, though, including the ability to develop a game that will run on a wide range of hardware. Developers also get access to the latest shader technologies without having to write low-level code.
But according to Huddy, the performance overhead of DirectX is a frustrating concern for developers. "Wrapping it up in a software layer gives you safety and security," he said. "But it unfortunately tends to rob you of quite a lot of the performance, and most importantly, it robs you of the opportunity to innovate."
He added that shaders, introduced back in 2002, were designed to let developers be more innovative and create more visual variety in games. But now many PC games have the same kind of look and feel because developers are using shaders "to converge visually."
"If we drop the API, then people really can render everything they can imagine, not what they can see – and we'll probably see more visual innovation in that kind of situation."
The interview goes on to discuss the performance overhead of DirectX, explaining that the actual amount depends on the type of game in development. Huddy also talks about the potential problems of programming multiple GPU architectures at a low level if the API is bypassed.
"The problem with the PC is that you ideally want a PC that doesn't crash too much, and if a games developer is over-enthusiastic about the way they program direct to the metal, they can produce all sorts of difficulties for us as a hardware company trying to keep the PC stable," he said.
The interview is definitely an awesome read, so head here to get the full scoop.
I also find it very hard to believe that there are any huge restrictions on what a developer can create with a high-level API versus low-level coding. And with graphics engines like CryEngine and Unreal, the case this guy's making seems weak.
Didn't know that.
At least Johan Andersson (DICE dev) is heavily pushing for this.
http://forum.beyond3d.com/showpost.php?p=1535975&postcount=8
So because you can't program graphics at a low level, professionals shouldn't either? Plus, you answered your own question: working with graphics at that low a level would give massive performance improvements.
We aren't talking about chucking the API altogether... just allowing developers the choice not to use it. Forcing them to use the API is rather... Apple-esque.
Jokes aside, isn't that what compilers are for? Turning high-level code into optimized low-level code, preferably as low as possible.
What? He didn't say anybody forces you to use the said API. He said it would take forever (sorta...) to program at a low level. He also didn't say you can't do it. He was pointing out that the guy here is blabbering the obvious: low level > high level languages. BUT, reality check, people: it would be absurd to program a massive game at a low level. If Blizzard did this, it would take at least 50 years to develop StarCraft 3. By then, the "visual innovation" you came up with would look like crap next to DirectX XX (twenty).
I think many of the developers asking for direct control have been spending too much time with consoles (where everyone has the same GPU). The logistical nightmare of supporting so many different GPUs on the market would make development costs and times skyrocket if they wanted to extract that performance through low-level optimization for each one. Even if you limit the targets, optimizing heavily for (for example) the Radeon 5870 and 6970 and the GeForce GTX 480 and 580 will be FAR more difficult than just targeting Xenos in the X360. Not to mention throwing in Crossfire/SLI and GPUs that launch after the game...
It is true that even a small and portable PSP Go has great graphics for its size and power consumption, but a PC has the ability to run many of those games at much, much higher resolutions (especially when using a triple- or penta-screen setup).
Microsoft has ignored Alex St. John's methods and message, which are what made DX a standard for game developers and made them happy to use it. They've made Windows more a part of how it works, not less, and that's a bad thing.
Here's an Alex St. John interview that I think pretty much sums things up, from a few years ago:
http://rampantgames.com/blog/2007/03/interview-with-father-of-directx-alex.html
All that being said though, what is holding game developers back is the developers themselves, not any API. That's ridiculous. DX is just a tool to make life easier for you, it's not a limitation. Step out on the wild side, my friends, and expand what can be done instead of gripe about what can't be.
Just get rid of DirectX and go with OpenGL/CL.
Being less beholden to MS and DirectX will encourage innovation because all MS does is stifle it.
The gap between consoles and modern video cards is already narrower than it should be, because devs don't want to (or can't afford to) take the time to optimize a game for the superior capabilities of the PC. So how on earth is it supposed to benefit anyone if we split the PC market into multiple segments, each of which they would have to optimize for?
They'd be better off developing for the PC first and then scaling it down to work on a console. The low-level optimization that is possible on a console could then be used to squeeze the most performance out of the ported engine.