Introduction
ATI currently enjoys a substantial lead in the DirectX 11 space, blazing a trail where, in the past, it has often followed. “But Chris, ATI gave us DirectX 10.1!” And look how pervasive or impactful that turned out to be.
This time it’s different, though. ATI and Nvidia both agree that DirectX 11 is the API that’ll change the figurative game. Has it yet? Has that ship already sailed? Decidedly not. ATI’s own schedule of DirectX 11-compatible software lists three titles currently shipping with support, three more slated for Q1, and two more expected later in 2010. The undeniable reality is that ATI is out in front and Nvidia brings up the rear this time around. But that tardiness means very little in the big picture (so long as you’re an enthusiast, and not a shareholder).
Playing Catch-Up
As it stands today, the real value in ATI’s Radeon HD 5800-series lineup lies in gaming performance in the more pervasive DirectX 9 and DirectX 10 titles, in Eyefinity, and in the family’s handling of high-def audio/video content.
And while today’s preview of Nvidia’s GF100 graphics processor (the first model based on the company’s Fermi architecture) is full of detail covering the chip’s building blocks and architectural philosophies, we’re left to make educated guesses on how a productized GF100 will stand up to its competition.
We know that, through Nvidia Surround, the company will enable multi-monitor gaming across more than two displays. But GF100 only drives two outputs per card, so three displays will require an SLI configuration. That sounds like it could get expensive, especially since I'm running a setup like that today on a single $290 Radeon HD 5850.
We also now know a lot more about the resources dedicated to gaming within GF100. Given this, it’s quite reasonable to assume that a graphics card based on GF100 will be significantly faster than one built around ATI’s Cypress. Nvidia also admitted to us that it’d be more power-hungry than GT200, the ASIC driving GeForce GTX 200-series boards.
What we don’t yet know is when GF100-based cards will go on sale. Nvidia says Q1, so you could be waiting another two months if its projections come to pass. We don’t know GF100’s clock rates, nor do we have GDDR5 data rates. With those variables up in the air, price targets for the planned flagship and its derivatives are naturally anyone’s guess. And the issue of high-def audio/video is largely irrelevant at this stage; given the GPU’s known transistor count and anticipated heat, GF100 is no home-theater part. Still, Nvidia says GF100 carries the same A/V suite as the GeForce GT 240. That means you get Blu-ray playback acceleration and eight-channel LPCM output, but no bitstreaming, a feature ATI and Intel both offer.
Bigger Fish To Fry
So, I’ve now called out the elephants in the room, which will almost surely be hot topics of discussion in the comments section anyway: no cards yet, no board specs, and not even preliminary performance numbers. GF100 will run hot. It won’t bitstream high-def audio (right out of the gate, at least; subsequent derivatives could). And it’ll require two (likely expensive) boards to drive three displays, a feature ATI supports across its entire 5000-series lineup.
Grim though that might sound, Nvidia’s recent preview of the GF100 architecture left us with the impression that the company has a bigger-picture plan of action for DirectX 11. Not only is the hardware in production, representatives claim, but third-party software developers are also being armed with the tools to create more compelling content.
Sure, some of this is done for selfish reasons. The PhysX plug-ins for 3ds Max, Maya, and Softimage, for example, are all provided to further the company’s agenda for its proprietary API. However, the Nexus toolkit, which integrates into Visual Studio, supports debugging of CUDA C (of course), as well as DirectX 10/11 and OpenGL. To the uninitiated gamer who might not know what it takes to bring a popular title to market, the important point is that Nvidia’s efforts should, ideally, enable more efficient development and better on-screen effects. And because DirectX 11 is fairly specific in the way it dictates compatibility, Nvidia is confident that debugging on a GF100-based graphics card will allow developers to optimize for ATI’s DX11 hardware, too.
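For the curious, here’s a rough sense of what “CUDA C” means in practice. The snippet below is a purely illustrative sketch of the kind of GPU code a developer could set a breakpoint in and step through under a Visual Studio-integrated debugger like Nexus; the kernel, its names, and its values are our own invention, not anything shipped by Nvidia.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Hypothetical example kernel: multiply every element of an array by a factor.
    // Each GPU thread handles one element.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)          // guard against threads past the end of the array
            data[i] *= factor;
    }

    int main()
    {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i) host[i] = 1.0f;

        // Allocate GPU memory and copy the input over.
        float *dev;
        cudaMalloc(&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        // A breakpoint inside scale() would hit on the GPU itself.
        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

        // Copy the result back and verify.
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);
        printf("host[0] = %f\n", host[0]);  // expect 2.0
        return 0;
    }

The draw of a tool like Nexus is that code like this, which runs across thousands of GPU threads at once, can be inspected with the same workflow developers already use for CPU code.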
But enough prognosticating; we’re writing this story because we know some new facts about what GF100 will enable when it finally does emerge sometime in the next two months.