Kaunain :
I just recently read about the Mantle API which will be released by AMD. It is said to optimise EA/Frostbite 3 games like BF4, and will exploit all the features of the GPU.
So how exactly does this work? Is it some sort of driver that we will install for our GPUs?
Another thing is that it will only be available to GPUs having the GCN (Graphics Core Next) architecture.
I have an HD 7750. It has GCN I guess, so will I be able to get this feature?
Can some expert shed any light on this matter?
Back in the days of yore (think 1980s and early to mid 1990s) games had to interface with the hardware devices themselves.
Modern game setups typically involve tweaking graphical settings such as resolution, geometry quality, Anti-Aliasing, texture filters, light and shadow reflection, etc... It's all very quality oriented.
Old game setups involved selecting the sound and graphics adapters from a list of supported devices and then fiddling with them until they worked. There's an old saying from that era: "getting the game to run is half the fun".
The reason for this is that, at the time, concepts we now take for granted, such as device drivers, APIs, and virtual memory, were either nonexistent or too demanding for the hardware of the day.
In the late 1990s a variety of new technologies emerged which allowed applications to speak one unified language. Devices would then interpret that language in a device/driver specific fashion. On the PC, the best known of these technologies is the DirectX API suite. In a perfect world, a game interfacing with DirectX 11 should not care if it's interfacing with an AMD HD 6000 series GPU (VLIW4/5), AMD HD 7000 series GPU (GCN), GeForce 400/500 series GPU (Fermi), GeForce 600/700 GPU (Kepler), or Intel HD Graphics 4000/5000 (IGP).
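The idea that "devices interpret that language in a device/driver specific fashion" can be sketched as a toy program. The class and method names below are hypothetical illustrations, not real DirectX or driver code: the game calls one generic API, and each vendor's driver translates that call for its own hardware.

```python
# Toy illustration of a device-agnostic graphics API.
# The game only ever speaks the generic "language" (draw_triangles);
# each driver translates it into device-specific work.

class Driver:
    """Interface that every vendor's driver implements."""
    def draw_triangles(self, count):
        raise NotImplementedError

class GCNDriver(Driver):
    def draw_triangles(self, count):
        return f"GCN: dispatched {count} triangles to compute units"

class KeplerDriver(Driver):
    def draw_triangles(self, count):
        return f"Kepler: dispatched {count} triangles to SMX units"

def render_frame(driver):
    # The game code is identical regardless of which GPU sits underneath.
    return driver.draw_triangles(1000)

print(render_frame(GCNDriver()))     # same game code on an AMD GPU
print(render_frame(KeplerDriver()))  # same game code on an NVidia GPU
```

This is exactly the portability trade-off the rest of the answer discusses: the game gains the ability to run anywhere, but gives up direct access to any one device's special features.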
However, DirectX and OpenGL are not the only languages that a game can speak; they are simply the ones that work with the majority of hardware configurations without much fuss. In the late 1990s a company called 3dfx (which would later be acquired by NVidia) created an API called Glide for its Voodoo series of 3D accelerators that exposed advanced functions not found in other devices. Any application interfacing with the Glide API could only work with a device compatible with the Voodoo series of 3D accelerators. It was not designed to be agnostic in the same way that DirectX is agnostic.
The DirectX and OpenGL approach favored by ATI (now part of AMD) and NVidia led many developers to overlook Glide and forego its advanced functions in favor of portability.
Now, one would expect the same thing to happen to Mantle: developers would flock to DirectX/OpenGL for productivity reasons alone. However, AMD is providing the APUs for both of the next-generation consoles.

Unlike PCs, which moved to agnostic APIs years ago, consoles still expose much of their hardware to applications directly, in a fashion very similar to the 1990s, and developers have been taking advantage of this ever since. This is what allows games like The Last of Us to look pretty on hardware from 2006. Console developers are used to speaking to the hardware directly to squeeze every last drop of performance out of it. This does not happen on PCs, where games are bound by the limitations of the DirectX and OpenGL APIs.

By extending Mantle to the PC as well, AMD will enable developers to port the same highly optimized codepaths used to squeeze performance out of the consoles to the PC, as long as the PC has a GCN-based GPU. If it doesn't, the game will have to fall back on the less efficient but more agnostic DirectX/OpenGL APIs.
The end result is that console developers will target whatever API offers the best return on investment, which will almost certainly be Mantle on the consoles. Then, they'll port it quickly to the PC where DirectX will get a quick and dirty implementation for NVidia GPUs and older AMD GPUs, while Mantle will be used for GCN GPUs.
I hope that this answers your question.