
What about Direct3D 10.1?

Nvidia GeForce GTX 260/280 Review
By Florian Charpentier

Given the campaign Nvidia has been waging for some time about its uselessness, it's no real surprise to learn that the GTX 200 series won't support this version of the Microsoft API. Unsurprising, but still a disappointment. According to Nvidia, support for the API was considered initially, but the developers it queried said they felt it "wasn't important." It's true that Direct3D 10.1 doesn't add anything revolutionary: as we pointed out when the Radeon HD 38x0 came out, it's mostly a matter of correcting gaps left in the Direct3D 10 specification. Yet it does bring some interesting new capabilities that could prove useful to rendering engines, such as better support for deferred shading, which is increasingly popular, and for algorithms that render transparent surfaces without sorting.
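To make the "transparency without sorting" point concrete: classic alpha blending requires drawing transparent surfaces back to front, whereas order-independent techniques accumulate a weighted average so the result no longer depends on draw order. Below is a minimal CPU-side Python sketch of that weighted-average idea (the function name and the per-fragment weights are illustrative, not actual Direct3D 10.1 API code):

```python
def composite_unsorted(fragments):
    """Order-independent transparency via a weighted average.

    fragments: list of (r, g, b, alpha, weight) tuples for one pixel.
    Returns (r, g, b, transmittance). Because the accumulators are
    plain sums and products, the result is the same whatever order
    the fragments arrive in -- no depth sorting needed.
    """
    acc_r = acc_g = acc_b = 0.0   # weighted, alpha-premultiplied color
    acc_w = 0.0                   # total weight
    transmittance = 1.0           # how much background shows through
    for r, g, b, a, w in fragments:
        acc_r += r * a * w
        acc_g += g * a * w
        acc_b += b * a * w
        acc_w += a * w
        transmittance *= (1.0 - a)
    if acc_w == 0.0:
        return (0.0, 0.0, 0.0, transmittance)
    return (acc_r / acc_w, acc_g / acc_w, acc_b / acc_w, transmittance)
```

The trade-off is that the weighted average is only an approximation of true sorted blending, but it runs in a single unordered pass, which is exactly why engine developers find the idea attractive.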

So yes, it might all seem a little superfluous at a time when Direct3D 10 still hasn't shown clear superiority over version 9, but it still smells a little like an easy excuse on Nvidia's part. Saying that Direct3D 10.1 is of no use at present is not totally false (though Assassin's Creed proves the contrary), but it creates a kind of vicious circle: without support from Nvidia, the API clearly can't be used seriously by developers. We've seen this kind of situation before, though in reverse: when the NV40 came out, how many developers were using Shader Model 3? Hardly any, especially since on the first GeForce 6 cards, key features like Vertex Texture Fetch and dynamic branching in the shaders weren't up to par. Yet at the time, the company was claiming to be in the avant-garde of 3D APIs.

Our opinion hasn't changed since then. Even if a technology isn't useful immediately, we favor including the latest features in new 3D chips so that developers can familiarize themselves with them. We knocked ATI for falling short at the time, and this time we're allowing ourselves a little rant against Nvidia.
