Nvidia’s GF100: Graphics Architecture Previewed, Still No Benchmarks

Going Surround

I still remember when Matrox launched its Parhelia. The single-slot card included provisions for connecting up to three displays via an adapter. As its name implies, the Surround Gaming feature let you play across the trio (the GPU even supported adaptive polygon tessellation way back in 2002). Unfortunately, Parhelia wasn't fast enough to make gaming on one screen all that enjoyable, so three was pretty much out of the question.

The launch of ATI's Radeon HD 5870 gave us enough 3D performance that, even at 2560x1600, we had graphics muscle to spare. Although gaming at 5760x1200 in Eyefinity mode places a substantial load on the Cypress GPU, it's still completely feasible, and in fact enjoyable.

Nvidia finds itself in a similar situation with GF100, a GPU expected to wield significant graphics horsepower. Turning on GeForce 3D Vision helps spend some of that budget, since rendering stereoscopically effectively halves frame rates. But Nvidia undoubtedly felt pressure to counter Eyefinity as well, and the result is 3D Vision Surround. Supporting displays with resolutions of up to 1920x1080, the technology facilitates stereoscopic rendering across up to three 120 Hz LCDs. Bezel correction accounts for the fairly sizable gaps between displays, hiding that part of the game world behind the bezel to provide an experience Nvidia describes as similar to looking through the frame of a cockpit window.
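To put that load in perspective, here's a quick back-of-the-envelope sketch. The arithmetic is ours, not Nvidia's, and the 2x factor simply reflects rendering a separate view for each eye in stereo.

```python
# Back-of-the-envelope pixel math (our own arithmetic, not Nvidia's figures).
single_30_inch = 2560 * 1600          # one 30" panel: ~4.1 million pixels
surround = 3 * 1920 * 1080            # three 1080p panels: ~6.2 million pixels

print(surround / single_30_inch)      # ~1.52x the pixels of one 2560x1600 screen
print(2 * surround / single_30_inch)  # ~3.04x once stereo renders a view per eye
```

In other words, stereo Surround asks the GPU to push roughly three times the pixels of a single 30" panel, which is exactly the sort of workload a chip with performance to spare can absorb.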

If you aren’t prepared to spend money on a trio of 120 Hz displays, Nvidia will also enable vanilla Surround—the same technology across a trio of up to 2560x1600 LCDs, similar to Eyefinity. But there are two important distinctions here. First, both capabilities are being exposed through “a future driver” that will reportedly be made available by the time GF100-based hardware ships (and not only on GF100; GT200-based boards will pick up 3D Vision Surround as well). Second, all three-display configurations will require SLI, since GT200- and GF100-based GPUs only include two display outputs each.
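That output limitation reduces to simple arithmetic; here's a trivial sketch (our framing, not anything from Nvidia's driver):

```python
import math

# Minimum board count for a Surround setup, assuming the two display
# outputs per GT200/GF100 card mentioned above.
def boards_needed(displays: int, outputs_per_board: int = 2) -> int:
    return math.ceil(displays / outputs_per_board)

print(boards_needed(3))  # 2 -- hence SLI for any three-screen configuration
```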

Pro: You're looking at solid performance potential in applications pushing three 1920x1080 displays (5760x1080 total) in stereo.

Con: If you're eyeing Nvidia Surround, then even at known GeForce GTX 285 prices, you're looking at $780 worth of graphics cards to achieve what a $650 Radeon HD 5970 can beat. And if you want to go stereoscopic, you'll probably want to wait for Acer's upcoming G245; today, you're largely limited to smaller 1680x1050 panels from ViewSonic and Samsung.

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • randomizer
GF100 is entering the ranks of Duke Nukem Forever. We keep seeing little glimpses, but the real thing might as well not exist.
  • dingumf
    Oh look, no benchmarks.
  • duckmanx88
dingumf: Oh look, no benchmarks.
wth is he supposed to benchmark? Nothing has been released; it's just an article giving us details on what we can expect within the next two months.
  • decembermouse
I feel like you left some info out, whether you just never read it or didn't mention it for fear of casting doubt on GF100... I've heard (and this isn't proven) that they had to remove some shaders and weren't able to reach their target clocks even with this revision (I heard the last one didn't cut the mustard, which is why they're hurrying the new one along and why we have to wait till March). Also, be careful about sounding too partisan toward Nvidia before we have more concrete info on this.

And yes, it does matter that AMD got DX11 hardware out of the gate first. Somehow, when Nvidia wins at something, whether that's being first with a technology, having the fastest card on the market, or a neato feature like PhysX, it's a huge deal, but when AMD has a win, it's 'calm down people, let's not get excited, it's no big deal.' The market, public opinion, and I believe even the company's worth have all been significantly boosted by its DX11 hardware. It is a big deal. And it'll be a big deal when GF100 is faster than the 5970, too, but they are late. I believe it'll be April before we'll realistically be able to buy these without having to F5 Newegg every 10 seconds for a week, and in these months that AMD has been the only DX11 player, well, a lot of people don't want to wait that long for what might be the next best thing... All I'm trying to say is let's try not to spin things so one company sounds better. It makes me sad when I see fanboyism, whether for AMD, Intel, Nvidia, or whoever, on such a high-profile review site.
  • megamanx00
Well, not much new here. I wouldn't really be surprised if the 2x performance increase over the GTX 285 was a reality. Still, the question is whether this new card will be able to maintain as sizable a performance lead in DX11 games when developers have been working with ATI hardware. If this GPU is as expensive to produce as rumored, will Nvidia be able to cope with an AMD price drop to counter it?

    I hope that 5850s on shorter PCBs come out around the time of the GF100 so they can drop to a price where I can afford to buy one ^_^
  • cangelini
dingumf: Oh look, no benchmarks.
    *Specifically* mentioned in the title of the story, just to avoid that comment =)
  • randomizer
cangelini: *Specifically* mentioned in the title of the story, just to avoid that comment =)
You just can't win :lol:
  • WINTERLORD
Great article. I'm wondering though, just for clarification: is Nvidia going to look better than ATI?
  • sabot00
Finally, some solid info on GF100.
  • tacoslave
Even though I'm a RED fan, I'm excited because it's a win-win for me either way. If AMD wins, then I'm proud of them, but if Nvidia wins, then that means price drops!!! And since they usually charge more than ATI for a little performance increase, I'll probably get a 5970 for $500 or less (hopefully). Anyone remember the GTX 280 launch?