NVIDIA Takes a Walk on the SLI Side with Double Graphics Processing

Hardware Requirements, Continued

Too Expensive? Who spends that kind of money on a graphics solution? This is a justified question, and the answer is very simple: enthusiasts, of course.

The same question was asked back in the days of the Voodoo 2 as well, and at the time, a similar proposition was even more expensive. A V2 SLI setup required a 2D graphics card ($200-300) and two Voodoo2 3D add-in cards ($500 each). Yet despite these horrendous prices, SLI systems were very popular in the gaming community. Today, the entry-level price point for SLI systems from NVIDIA is set at two times $299 for two GeForce 6800 (standard) cards, and this investment will net you the rendering power of 24 pixel pipelines... We can only hope that NVIDIA will also offer SLI support on its mainstream cards. Of course, right now the prices for motherboards with two x16 PCI Express slots are still an unknown in this equation. The prohibitively expensive Tumwater paired with the equally pricey Xeons is certainly not an attractive option.

Benchmarks: NVIDIA has yet to release any concrete numbers or benchmark results. So far, the company has quoted a performance improvement factor of 1.87 based on 3DMark03 running at 1600x1200 with 4x AA / 8x AF (Game Tests 2, 3 and 4) and Unreal Engine 3 running at 1024x768. In some cases, a factor of 2.00 was almost reached.

Possible Problems: Will there be a performance penalty when complex shaders are used (dynamic branching)? How will NVIDIA solve the problem of a shader requiring pixel values that lie in the other card's rendering field? At present, it is too early to answer such questions.

Power Requirements: The power consumption of such an SLI system will be immensely high. In addition to the CPU, the power supply will have to be powerful enough to feed the two x16 slots with 75 Watts each, in addition to the GF 6800 cards' auxiliary power connectors. On top of that, there are of course the remaining components such as hard drives, optical drives etc. that also draw power.

A power draw of 250 Watts for the 6800 Ultra SLI solution is very realistic. NVIDIA is confident that a PC equipped with sundry drives and a 6800U SLI configuration should be able to run with a power supply rated at 550 Watts. Consequently, smaller power supplies should be sufficient for a 6800 GT or 6800 (standard) SLI configuration.
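The power figures above can be combined into a rough back-of-the-envelope budget. The 75 Watts per x16 slot, the ~250 Watt estimate for the 6800 Ultra SLI pair, and the 550 Watt supply come from the text; the per-component estimates for the CPU, motherboard, and drives are our own assumptions, chosen only to illustrate how quickly the total adds up:

```python
# Rough power-budget sketch for a 6800 Ultra SLI system.
# The 250 W for the SLI pair and the 550 W PSU rating come from the
# article; the other component figures are illustrative assumptions.

budget = {
    "6800 Ultra SLI pair (x16 slots + aux connectors)": 250,
    "CPU (assumed high-end desktop chip)": 100,
    "Motherboard, RAM, fans (assumed)": 60,
    "Hard drives and optical drives (assumed)": 40,
}

total = sum(budget.values())
for part, watts in budget.items():
    print(f"{part}: {watts} W")
print(f"Estimated total draw: {total} W")

# Headroom left over with the 550 W supply NVIDIA recommends:
headroom = 550 - total
print(f"Headroom with a 550 W PSU: {headroom} W")
```

Under these assumed numbers the system draws about 450 Watts, leaving roughly 100 Watts of headroom on a 550 Watt supply, which is consistent with NVIDIA's recommendation.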

Workstation / Quadro: NVIDIA will also offer SLI for its Quadro line of 3D workstation boards. NVIDIA should be able to make an especially strong impact in this market, since more performance automatically translates into shorter rendering times, which thus justifies every additional dollar spent.

SLI configurations are also possible with NVIDIA's Quadro line of 3D workstation cards.

When? Currently, NVIDIA plans to launch SLI in the August / September timeframe.

Conclusion

SLI, that magical formula from bygone days, is back. In the future, when you find your system lacking the necessary 3D punch, simply stick in a second card. This formula already worked for the Voodoo 2, making it a bestseller. As a feature, it is aimed at the enthusiast, at least where high-end cards are concerned.

Hypothetically, you would shell out $499 for a GeForce 6800 Ultra, or spend an extra $100 and get two standard 6800 cards with SLI. Of course, you could just as well buy only the motherboard and one standard 6800, and then buy the second card later, after your credit card is no longer maxed out.

Things will get even more interesting if NVIDIA should decide to offer SLI in its mainstream cards as well. That would make us wonder whether such a step wouldn't make the top models rather unattractive. In the end, neither NVIDIA nor the card makers will really care how the consumer decides, as either choice will be good for their bottom line.

The only missing ingredients at this point are dual x16 PCIe capable mainstream motherboards and chipsets, which SLI requires. Intel's Tumwater workstation chipset is not a real option, as we explained above. However, as we said before, PCI Express is still young, and who knows what solutions and implementations lie ahead. VIA and SiS haven't even officially introduced their PCIe capable chipsets yet. It is also likely that NVIDIA will present its own chipset solution with two x16 slots in time for the SLI launch in August / September. Whatever the case may be, SLI is the first really reasonable argument for PCI Express that we have heard so far, at least where graphics cards are concerned.

It will be interesting to see what will happen to Alienware's recently announced graphics array technology. One possible solution would be to team up with ATi, since Alienware's Array works with cards from any company. Then again, NVIDIA holds several patents relating to SLI, meaning that ATi won't just be able to copy this approach. Alienware's solution, on the other hand, is based on patents held by MetaByte, alias Wicked3D. ATi will definitely have to react to this challenge. If NVIDIA's claims should turn out to be true and SLI setups really do improve performance by a factor of nearly 1.9, ATi will suffer in the prestigious battle for benchmark supremacy.

SLI could turn out to be less appealing for the mass market. However, knowing that you could easily upgrade your graphics performance by using NVIDIA's SLI might indeed sway many buyers' decisions in favor of NVIDIA.