Nvidia’s GF100: Graphics Architecture Previewed, Still No Benchmarks

Introduction

ATI is currently rocking a substantial lead in the DirectX 11 space—it’s blazing a trail where, in the past, it has more often followed. “But Chris, ATI gave us DirectX 10.1!” And look how pervasive or impactful that turned out to be.

This time it’s different, though. ATI and Nvidia both agree that DirectX 11 is the API that’ll change the figurative game. Has it yet? Has that ship already sailed? Decidedly not. ATI’s own schedule of DirectX 11-compatible software lists three titles currently shipping with support, three more slated for Q1, and two more expected later in 2010. The undeniable reality is that ATI is out in front and Nvidia brings up the rear this time around. But its tardiness means very little in the big picture (so long as you’re an enthusiast, and not a shareholder).

Playing Catch-Up

The real values in ATI’s Radeon HD 5800-series lineup, as it stands today, are gaming performance in the more pervasive DirectX 9 and DirectX 10 titles, Eyefinity, and the family’s handling of high-def audio/video content.

And while today’s preview of Nvidia’s GF100 graphics processor (the first model based on the company’s Fermi architecture) is full of detail covering the chip’s building blocks and architectural philosophies, we’re left to make educated guesses on how a productized GF100 will stand up to its competition.

Ray tracing: possible in real time using GF100's compute strengths.

We know that, through Nvidia Surround, the company will enable multi-monitor (more than two displays) gaming. But it’s only enabling two outputs per card, so driving three screens will require an SLI configuration. That sounds like it could get expensive, especially since I'm running a configuration like that on a single $290 Radeon HD 5850.

We also now know a lot more about the resources dedicated to gaming within GF100. Given this, it’s quite reasonable to assume that a graphics card based on GF100 will be significantly faster than a card centered on ATI’s Cypress. Nvidia also admitted to us that it’d be more power-hungry than GT200—the ASIC driving GeForce GTX 200-series boards.

What we don’t yet know is when GF100-based cards will be available for sale. Nvidia says Q1; you could be waiting another two months, if its projections come to pass. We don’t know the GF100’s clock rates, nor do we have GDDR5 data rates. With those variables up in the air, price targets for the planned flagship and its derivatives are naturally right out the window. And the issue of high-def audio/video is largely irrelevant at this stage; given the GPU’s known transistor count and anticipated heat, this isn’t a chip destined for home theater PCs. But Nvidia says GF100 has the same A/V suite as the GeForce GT 240. That means you get Blu-ray playback acceleration and eight-channel LPCM output, but no bitstreaming—a feature ATI and Intel both offer.

Bigger Fish To Fry

So, I’ve now shone the spotlight on a few of the elephants in the room that will almost surely be hot topics of discussion in the comments section anyway: no cards yet, no board specs yet, not even preliminary performance numbers. GF100 will run hot. It won’t bitstream high-def audio (right out of the gate, at least—subsequent derivatives could). And it’ll require two (likely expensive) boards to drive triple display outputs, a feature ATI supports across its 5000-series lineup.

Grim though that might sound, Nvidia’s recent preview of the GF100 architecture left us with the impression that the company has a bigger-picture plan of action for DirectX 11. Not only is the hardware in production, representatives claim, but third-party software developers are also being armed with the tools to create more compelling content.

Sure, some of this is done for selfish reasons—the PhysX plug-ins for 3ds Max, Maya, and Softimage, for example, are all provided to further the company’s agenda for its proprietary API. However, the Nexus toolkit, which integrates into Visual Studio, supports CUDA C calls (of course), and also DirectX 10/11 and OpenGL. To the uninitiated gamer who might not know what it takes to bring a popular title to market, it’s really just important to know that Nvidia’s efforts will, ideally, enable more efficient development and better on-screen effects. And because DirectX 11 is fairly specific in the way it dictates compatibility, Nvidia is confident that debugging on a GF100-based graphics card will allow developers to optimize for ATI’s DX11 hardware, too.
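For readers wondering what “CUDA C calls” actually look like, here’s a minimal sketch of the kind of kernel-plus-host code a developer would step through in a tool like Nexus. It’s purely illustrative, not drawn from Nvidia’s own samples; the kernel name, array size, and scale factor are all arbitrary.

    // scale.cu: a trivial CUDA C example that multiplies every array element by a factor.
    // Build with: nvcc scale.cu -o scale
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
        if (i < n)
            data[i] *= factor;                          // one element per thread
    }

    int main()
    {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i)
            host[i] = (float)i;

        float *dev = 0;
        cudaMalloc((void **)&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);  // four blocks of 256 threads

        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);

        printf("host[10] = %f\n", host[10]);            // expect 20.0 (10 doubled)
        return 0;
    }

The draw of Nexus, as pitched, is that breakpoints can be set inside the __global__ kernel itself from the familiar Visual Studio environment, rather than debugging the GPU blind from the host side.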

But enough about predictions of the future—we’re writing this story because we know some new facts about what GF100 will enable when it finally does emerge sometime in the next two months.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • randomizer
GF100 is entering the ranks of Duke Nukem Forever. We keep seeing little glimpses, but the real thing might as well not exist.
    Reply
  • dingumf
    Oh look, no benchmarks.
    Reply
  • duckmanx88
dingumf: "Oh look, no benchmarks."
WTH is he supposed to benchmark? Nothing has been released; it's just an article giving us details on what we can expect within the next two months.
    Reply
  • decembermouse
I feel like you left some info out, whether you just never read it or didn't mention it for fear of casting doubts on GF100... I've heard (and this isn't proven) that they had to remove some shaders and weren't able to reach their target clocks even with this revision (heard the last one didn't cut the mustard, which is why they're hurrying the new one along and why we have to wait till March). Also, be careful about sounding too partisan with Nvidia before we have more concrete info on this.

And yes, it does matter that AMD got DX11 hardware out the gate first. Somehow, when Nvidia wins at something, whether that's being first with a technology, having the fastest card on the market, or a neato feature like PhysX, it's a huge deal, but when AMD has a win, it's 'calm down people, let's not get excited, it's no big deal.' The market, public opinion, and I believe even the worth of the company have all been significantly boosted by their DX11 hardware. It is a big deal. And it'll be a big deal when GF100 is faster than the 5970 too, but they are late. I believe it'll be April before we'll realistically be able to buy these without having to F5 Newegg every 10 seconds for a week, and in these months that AMD has been the only DX11 player, well, a lot of people don't want to wait that long for what might be the next best thing... All I'm trying to say is let's try not to spin things so one company sounds better. It makes me sad when I see fanboyism, whether for AMD, Intel, Nvidia, whoever, on such a high-profile review site.
    Reply
  • megamanx00
Well, not much new here. I wouldn't really be surprised if the 2x performance increase over the GTX 285 turned out to be real. Still, the question is whether this new card will be able to maintain as sizable a performance lead in DX11 games when developers have been working with ATI hardware. If this GPU is as expensive to produce as rumored, will Nvidia be able to cope with an AMD price drop to counter them?

    I hope that 5850s on shorter PCBs come out around the time of the GF100 so they can drop to a price where I can afford to buy one ^_^
    Reply
  • cangelini
dingumf: "Oh look, no benchmarks."
    *Specifically* mentioned in the title of the story, just to avoid that comment =)
    Reply
  • randomizer
cangelini: "*Specifically* mentioned in the title of the story, just to avoid that comment =)"
You just can't win :lol:
    Reply
  • WINTERLORD
Great article. I'm wondering, though, just for clarification: Nvidia is going to look better than ATI?
    Reply
  • sabot00
    Finally some solid info on GF100.
    Reply
  • tacoslave
Even though I'm a RED fan, I'm excited because it's a win-win for me either way. If AMD wins, then I'm proud of them, but if Nvidia wins, that means price drops!!! And since they usually charge more than ATI for a little performance increase, I'll probably get a 5970 for $500 or less (hopefully). Anyone remember the GTX 280 launch?
    Reply