Gigabyte's 3D1: Are Two Engines Better Than One?

3D1 Package

At the heart of the 3D1 lie two NVIDIA GeForce 6600 GT processors, each with its own 128 MB frame buffer on a 128-bit memory bus. In its marketing brochures, Gigabyte happily adds these numbers up, quoting 256 MB of memory and a 256-bit bus - not unlike the way XGI promoted its own dual-chip solution. In reality, however, only 128 MB on a 128-bit bus is available per chip, since each GPU requires its own memory and can't "see" that of the other chip.
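The arithmetic behind the marketing claim is easy to sketch. The following is a minimal illustration (the function name is our own, and the numbers are the ones quoted above) of marketed versus effectively usable specs when each GPU holds a duplicate copy of the scene data in its local memory:

```python
def effective_specs(gpus: int, mb_per_gpu: int, bus_bits_per_gpu: int):
    """Marketing sums the per-GPU numbers; effectively, each GPU can
    only address its own local frame buffer over its own bus."""
    marketed = {"memory_mb": gpus * mb_per_gpu,
                "bus_bits": gpus * bus_bits_per_gpu}
    effective = {"memory_mb": mb_per_gpu,      # textures are duplicated per GPU
                 "bus_bits": bus_bits_per_gpu}  # buses are not ganged together
    return marketed, effective

marketed, effective = effective_specs(gpus=2, mb_per_gpu=128, bus_bits_per_gpu=128)
print(marketed)   # {'memory_mb': 256, 'bus_bits': 256}
print(effective)  # {'memory_mb': 128, 'bus_bits': 128}
```

The same reasoning applied to XGI's Volari Duo then, and applies to any multi-GPU card whose chips keep separate copies of the working set.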

Single-card SLI: both GeForce 6600 GT graphics processors share the same x16 PCI Express connector. Each chip has its own 128 MB frame buffer on a 128-bit-wide memory bus.

The two graphics processors communicate through NVIDIA's SLI technology. In a classic SLI configuration, two separate graphics cards, each in its own x16 PCIe slot, are linked by a small SLI bridge board; the rest of their communication is handled over the PCI Express bus, via the motherboard's Northbridge. During SLI operation, the bandwidth of the two x16 PCIe slots is reduced to x8 each.

With the 3D1, Gigabyte has put a complete SLI system on a single card. When the 3D1 is used, the first x16 PCIe slot runs as two x8 links - the slot's PCIe lanes are simply divided between the two chips on the card. For this to work, the BIOS of the SLI-capable motherboard must also explicitly support this mode of operation. Currently, the 3D1's special mode only works with the Gigabyte K8NXP-SLI motherboard, which is based on NVIDIA's nForce 4 chipset. In any other motherboard, the card acts like a single x8 PCIe GeForce 6600 GT. Whether Gigabyte will create other motherboards compatible with the 3D1 remains to be seen, and the same goes for support from other companies. Since the card is only available bundled with the very motherboard that supports it, the question of whether this limitation is a drawback is moot. And, in case you were wondering - no, a dual 3D1 setup will not work.
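The lane split described above can be sketched as follows. This is an illustrative model, not firmware logic; the function name and the boolean flag are our own shorthand for whether the motherboard BIOS supports the 3D1's special mode:

```python
def lane_allocation(board_supports_3d1_mode: bool, slot_lanes: int = 16):
    """On a supporting board (currently only the Gigabyte K8NXP-SLI),
    the x16 slot's lanes are divided evenly between the two GPUs
    (2 x x8). On any other board, only one GPU gets a link, and the
    card behaves like a single x8 GeForce 6600 GT."""
    if board_supports_3d1_mode:
        return {"gpu0": slot_lanes // 2, "gpu1": slot_lanes // 2}
    return {"gpu0": slot_lanes // 2, "gpu1": 0}

print(lane_allocation(True))   # {'gpu0': 8, 'gpu1': 8}
print(lane_allocation(False))  # {'gpu0': 8, 'gpu1': 0}
```

Note that even in the supported mode, each GPU ends up with the same x8 link it would have in a classic two-card SLI setup; the single-card design saves a slot, not bandwidth.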
