Nvidia’s GF100: Graphics Architecture Previewed, Still No Benchmarks

Why Re-Organize GF100? One Word: Geometry

Successful architectures don’t get re-worked just to impress the ladies. There’s a rhyme and reason behind Nvidia’s decision to arm each GPC with its own raster engine and each SM with what it calls a PolyMorph engine (no, WoW players, we’re not talking about sheeping the stream processors here…).

First things first: the PolyMorph engine is a five-stage piece of fixed-function logic that works in conjunction with the rest of the SM, handling vertex fetch, tessellation, viewport transformation, attribute setup, and stream output to memory. Between each stage, the SM handles vertex/hull shading and domain/geometry shading. From each PolyMorph engine, primitives are sent to one of the raster engines, each capable of eight pixels per clock (totaling 32 pixels per clock across the chip's four raster engines).
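Those totals work out as simple arithmetic. Here's a quick sanity check; note that the four-SMs-per-GPC figure is inferred from the 16 PolyMorph engines and four GPCs Nvidia quotes, so treat it as an assumption rather than a spec sheet:

```python
# Back-of-the-envelope check on GF100's front-end unit counts.
GPCS = 4                      # each GPC carries one raster engine
SMS_PER_GPC = 4               # inferred: 16 SMs / 4 GPCs
PIXELS_PER_RASTER_CLOCK = 8   # per raster engine, per clock

polymorph_engines = GPCS * SMS_PER_GPC              # one per SM
pixels_per_clock = GPCS * PIXELS_PER_RASTER_CLOCK   # chip-wide total
```
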

Now, why was it necessary to get more granular about the way geometry was being handled when a monolithic fixed-function front-end has worked so well in the past? After all, hasn’t ATI enabled tessellation units in something like six generations of its GPUs (as far back as TruForm in 2001)? Ah, yes. But how many games actually took advantage of tessellation between then and now? That’s the point.

Scalable LoD, low

Dynamic tessellation LoD turned all the way up

Ever since the days of Nvidia’s GeForce 2 architecture, we’ve been hearing about programmable pixel and then vertex shading. Now we’re getting some very impressive shaders able to add tremendous detail to the latest DirectX 9 and 10 games (Nvidia claims a 150x increase in shading performance from the GeForce FX 5800-series to GT200). But I know we’ve all seen some of the terri-bad geometry that totally ruins the guise of realism in our favorite games. Purportedly, the next frontier in augmenting graphics realism involves cranking the dial on geometry.

DirectX 11 aims to fix this via three new stages in the rendering pipeline: the hull shader, which computes control point transforms and tessellation factors; the fixed-function tessellator, which takes in those factors and outputs domain points; and the domain shader, which computes the final vertex for each of those points.
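To make the division of labor concrete, here's a toy sketch of how those three stages fit together for a single quad patch. The fixed tessellation factor and the constant displacement function are illustrative only; real hardware runs the programmable stages as HLSL shaders, not Python:

```python
def hull_shader(control_points):
    # Passes control points through and chooses a tessellation factor
    # (a real hull shader might derive it from screen-space patch size).
    return control_points, 4

def tessellator(tess_factor):
    # Fixed-function stage: emit (u, v) domain points on a regular grid.
    n = tess_factor
    return [(i / n, j / n) for i in range(n + 1) for j in range(n + 1)]

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def domain_shader(control_points, uv, displace):
    # Bilinearly interpolate the quad, then displace the vertex
    # (displacement mapping; +z stands in for the surface normal).
    u, v = uv
    p00, p10, p01, p11 = control_points
    p = lerp(lerp(p00, p10, u), lerp(p01, p11, u), v)
    return (p[0], p[1], p[2] + displace(u, v))

quad = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
cps, factor = hull_shader(quad)
verts = [domain_shader(cps, uv, lambda u, v: 0.1)
         for uv in tessellator(factor)]
# factor 4 -> a 5x5 grid: 25 output vertices from 4 control points
```

The point to notice is the amplification: four control points in, 25 vertices out, with only the tiny hull and domain programs running per-patch and per-point. That's the workload GF100's 16 PolyMorph engines are built to parallelize.
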

From left to right: quad mesh, tessellated, and with a displacement map applied. id Software, 2008

But in order to deliver the performance needed to make tessellation feasible, Nvidia had to shift away from that monolithic front-end toward a more parallel design. Hence, the four raster and 16 PolyMorph engines. The company naturally has its own demos showing how much more efficient GF100 is than the Cypress architecture, which employs the "bottlenecked" monolithic design. However, we'll want to run a title like Aliens Vs. Predator from Rebellion Developments with tessellation on and off to get a more balanced comparison. Up front, though, Nvidia claims that GF100 delivers up to 8x the performance of GT200 in geometry-bound environments.

120 comments
    Top Comments
  • decembermouse
    I feel like you left some info out, whether you just never read it or didn't mention it for fear of casting doubts on GF100... I've heard (and this isn't proven) that they had to remove some shaders and weren't able to reach their target clocks even with this revision (heard the last one didn't cut the mustard which is why they're hurrying the new one along and why we have to wait till March). Also, be careful about sounding too partisan with Nvidia before we have more concrete info on this.

    And yes, it does matter that AMD got DX11 hardware out the gate first. Somehow, when Nvidia wins at something, whether that's being first with a technology, having the fastest card on the market, or a neato feature like Physx, it's a huge deal, but when AMD has a win, it's 'calm down people, let's not get excited, it's no big deal.' The market and public opinion, and I believe even worth of the company have all been significantly boosted by their DX11 hardware. It is a big deal. And it'll be a big deal when GF100 is faster than the 5970 too, but they are late. I believe it'll be April before we'll realistically be able to buy these without having to F5 Newegg every 10 seconds for a week, and in these months that AMD has been the only DX11 player, well, a lot of people don't want to wait that long for what might be the next best thing... all I'm trying to say is let's try not to spin things so one company sounds better. It makes me sad when I see fanboyism, whether for AMD, Intel, Nvidia, whoever, on such a high-profile review site.
    26
  • cangelini
dingumf: Oh look, no benchmarks.


    *Specifically* mentioned in the title of the story, just to avoid that comment =)
    25
  • randomizer
    GF100 is entering the ranks of Duke Nukem Forever. We keep seeing little glimpses but the real thing might as well not exist.
    24
  • Other Comments
  • dingumf
    Oh look, no benchmarks.
    -20
  • duckmanx88
dingumf: Oh look, no benchmarks.


    wth is he supposed to benchmark? nothing has been released it's just an article giving us details on what we can expect within the next two months.
    23
  • megamanx00
    Well, not much new here. I wouldn't really be surprised if the 2x performance increase over the GTX285 was a reality. Still, the question is if this new card will be able to maintain as sizable a performance lead in DX11 games when Developers have been working with ATI hardware. If this GPU is as expensive to produce as rumored will nVidia be able to cope with an AMD price drop to counter them?

    I hope that 5850s on shorter PCBs come out around the time of the GF100 so they can drop to a price where I can afford to buy one ^_^
    8
  • randomizer
cangelini: *Specifically* mentioned in the title of the story, just to avoid that comment =)

    You just can't win :lol:
    20
  • WINTERLORD
great article. im wondering though, just for clarification: nvidia is going to look better than ati?
    -10
  • sabot00
    Finally some solid info on GF100.
    -5
  • tacoslave
    Even though im a RED fan im excited because its a win win for me either way. If amd wins than im proud of them but if nvidia wins than that means price drops!!! And since they usually charge more than ati for a little performance increase than ill probably get a 5970 for 500 or less (hopefully). Anyone remember the gtx280 launch?
    5
  • dingumf
sabot00: Finally some solid info on GF100.


    Solid info?

    No price, no benchmarks, no specs.

    What's wrong with you
    -22
  • dingumf
duckmanx88: wth is he supposed to benchmark? nothing has been released it's just an article giving us details on what we can expect within the next two months.


    This is the end of the NDA. Do you even know what NDA is kid?
    -21
  • Reynod
    Chris your review was unusually kind.

I'd rank it up there with Anand's on the first Phenom iteration - he had ES well before the others and there was mounting pressure to at least publish something ... and the AMD fanbois should consider that article very fair.

I had heard Nvidia were booting some silicon and the clocks were low ... and in order to get within the power envelope it was likely some SP's would have to be shaved ... that's about all anyone can say.

    I imagine NVidia will also be concentrating on ensuring the die is securely attached to the substrate.

    They won't want to cheese off the OEM's like last time:

    http://www.theinquirer.net/inquirer/news/1050052/nvidia-chips-underfill
    7
  • falchard
    One thing I wonder is if nVidia finally pushes forward with this architecture, does this mean developers will finally start utilizing some of the tech ATI has had in its cards for generations? For instance, will they utilize more efficient poly rendering effectively making ATI cards perform 300% faster in drawing polies and make every consumer nVidia card before the GF100 moot?

    Also will they adopt a naming convention that finally makes sense? Up to 9000, reset, skip double digits and 100, go straight to 200. Now go back to 100. I mean seriously who comes up with these names?
    G80, G92, G200, GF100..
    5
  • Kelavarus
    One thing you didn't mention about the Supersonic Sled Tech Demo there is that it took three GF100s in a triple-SLI configuration to do that.
    12
  • TheGreatGrapeApe
Chris, some 'leaked' 'internal' nV slides recently appeared with THG results from the HD5970 review. Since I can't ask the question I would like to about that (there's no way you could answer if true), I'll simply ask: were you aware of this? [:grahamlv:3]

    http://news.softpedia.com/newsImage/Alleged-GeForce-GTX-360-and-380-benchmarks-Surface-3.jpg/

    Slight tweaking of the RE:5 results (likely because they didn't point in the right direction for the existing cards) :evil:

    And Charlie's recent 'Pro-nVidia' article is somewhat telling about the possibility of scaling downward, what's your opinion on it if you can say, other than "Charlie's just being Charlie". ;)

    http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable
    2
  • aggrressor
    umm, Guys, If you want benches - They are "kind of" available at guru3d. I have just read their article, and while it's a bit too technical for my taste, they've recorded a Far Cry 2 bench at Nvidia conference on a crappy camera. The end result was 50 FPS on GTX285 vs 84 FPS on GF100 based product. Now I know it's not raw numbers or charts or anything like that, but at least that gives me a rough idea of what GT300 stuff will be like.
    1
  • randomizer
dingumf: This is the end of the NDA. Do you even know what NDA is kid?

    Do you? The end of an NDA does not mean every detail has to be divulged. You can still only provide the details that have been given to you. If NVIDIA don't hand out the review samples, you can't benchmark them. It's not rocket science!
    7
  • masterjaw
    How long should we wait before we actually see an article like "Alas! Fermi has arrived (late?)".

    If they claim that it is "significantly faster" then better it would be or else..
    2
  • notty22
    WoW this sounds like a real "Game Changer". Seems like the Fanbois are already making excuses and their bias b/s is already starting. Sad.
    -11