The performance differences between ATI and Nvidia graphics cards are currently fairly tight and well-segmented—in other words, it’s easy for an enthusiast to hit a site like newegg.com with a budget in mind and, all else being equal, find the best card for the money. Case in point: you tell me you have $160, I’ll tell you to get a GeForce GTX 260 Core 216 and cash in on the rebates. You tell me you can spend $180, I’d tell you to grab a Radeon HD 4870 with 1 GB.
But all else is not equal, so recommendations aren’t quite that cut and dried. Nvidia preaches the gospels of PhysX and CUDA, with an occasional verse about GeForce 3D Vision. ATI sings the hymns of DirectX 10.1 and Stream. Depending on which company you put your faith in, one of those two messages is going to sound a little sweeter.
Right now, the two are in full-scale conversion mode, trying to get everyone they can to pitch in their tithing for a little gaming salvation. Just as PhysX and CUDA are starting to take hold with high-profile titles enabling support, so too are software developers paying more attention to DirectX 10.1.
As a superset of DirectX 10, DirectX 10.1 includes a handful of quality-enhancing features that, in some cases, will run on DirectX 10 hardware, but at a performance hit. For instance, the Gather4 function fetches four samples (a 2x2 block) in a single texture operation, where a DirectX 10 part could fetch only one per operation. The result should be more realistic shadow maps and better performance. The Stalker: Clear Sky demo lets us test that theory with a toggled check-box to enable or disable DirectX 10.1.
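To illustrate why that matters for shadows, here’s a minimal Python sketch (not actual shader code—the function names are ours, not DirectX API calls) of percentage-closer filtering on a shadow map. The four depth comparisons need the full 2x2 footprint: a DirectX 10-style path spends four fetch instructions getting it, while a Gather4-style path pulls all four texels in one.

```python
# Hypothetical sketch of Gather4 vs. single fetches, using a shadow map
# modeled as a 2D grid of depth values. Names (fetch, gather4, shadow_pcf)
# are illustrative, not real DirectX/HLSL identifiers.

def fetch(shadow_map, x, y):
    """DirectX 10-style path: one depth sample per texture instruction."""
    return shadow_map[y][x]

def gather4(shadow_map, x, y):
    """DirectX 10.1-style Gather4: the whole 2x2 footprint in one fetch."""
    return (shadow_map[y][x],     shadow_map[y][x + 1],
            shadow_map[y + 1][x], shadow_map[y + 1][x + 1])

def shadow_pcf(shadow_map, x, y, pixel_depth):
    """Percentage-closer filtering: compare the pixel's depth against all
    four texels of the footprint and average the pass/fail results,
    yielding a soft shadow edge instead of a hard 0-or-1 answer."""
    samples = gather4(shadow_map, x, y)
    return sum(pixel_depth <= s for s in samples) / 4.0

# Toy shadow map: a lit region (far depth 0.9) next to an occluder (0.2).
shadow_map = [
    [0.9, 0.9, 0.2, 0.2],
    [0.9, 0.9, 0.2, 0.2],
    [0.9, 0.9, 0.2, 0.2],
]

# A pixel at depth 0.5 whose footprint straddles the boundary gets a
# partial occlusion value -- the softer shadow edges described above.
print(shadow_pcf(shadow_map, 1, 0, 0.5))  # → 0.5 (half-shadowed)
print(shadow_pcf(shadow_map, 0, 0, 0.5))  # → 1.0 (fully lit)
```

The performance claim follows directly: the same four-tap filter costs one texture operation with Gather4 instead of four separate fetches.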
Our first test pits all of the Radeon HD 4800-series cards in this story against each other, without MSAA for alpha-tested objects enabled. At 1920x1200, performance is fairly similar across the board, with all test cases except the Radeon HD 4870 X2 demonstrating small gains by moving to DirectX 10.1.
We then turned on 4x MSAA for alpha-tested objects and re-ran the numbers, this time at 1680x1050, in an attempt to maintain somewhat reasonable frame rates. This time, a majority of the test cases show DirectX 10.1 incurring a small performance hit. Is the performance tradeoff worthwhile? For that, we’ll need to make an image quality comparison from within the game itself.
Up top you’ll find the screen capture provided by ATI, right up against a wall, demonstrating that the DirectX 10.1 shadows are softer and arguably more realistic. I ran all over the opening area of the game trying to find a clear example of the difference made by DirectX 10.1 shadows and just couldn’t come up with an indisputable best-case scenario. Even by reloading saved points, it was impossible to generate the exact same scene twice. Nevertheless, given the option of enabling DirectX 10.1 and not seeing a significant performance hit, you might as well turn the feature on.
With the Game Developers Conference recently past, ATI had a handful of DX 10.1 titles to discuss during its briefing, besides Stalker. Tom Clancy’s HAWX looks like a fun title with DX 10.1 screen space ambient occlusion and accelerated Gaussian shadows. There were two other lesser-known titles, plus the UNiGiNE game engine, on which several upcoming titles are purportedly based.
As of right now, DirectX 10.1 isn’t making a huge impact, but because it will become a subset of DirectX 11, you can expect the extra features being enabled right now to work moving forward in Windows 7. The same couldn’t be said for the disruptive shift from DirectX 9/XP to DirectX 10/Vista.