- It Began With A Bang!
- MSI’s Big Bang Fuzion Motherboard
- The Many Heads Of Hydra
- Hydra 200: An Evolved ASIC
- Test Setup And Benchmarks
- Benchmark Results: 3DMark Vantage
- Benchmark Results: S.T.A.L.K.E.R.: Call Of Pripyat
- Benchmark Results: Crysis
- Benchmark Results: Left 4 Dead 2
- Benchmark Results: DiRT2
- Benchmark Results: Call Of Duty: Modern Warfare 2
- Benchmark Results: Batman: Arkham Asylum
Hydra 200: An Evolved ASIC
When Lucid first began showing off what it was working on, the company was using Hydra 100-series hardware—manufactured on a 130nm process, limited to PCI Express 1.1 signaling rates, and rated for 3.5W power consumption.
The LT24102—Lucid’s highest-end ASIC (in a family of three Hydra 200-series SoCs)—is a second-gen part compatible with PCI Express 2.0, manufactured at 65nm, and rated for up to 5.5W. The ASIC’s 48 PCIe lanes allow it one x16 upstream port and two x16 downstream ports, one x16 and two x8s, or a quartet of x8 connections. An embedded 300 MHz RISC processor with a 64KB instruction cache and a 32KB data cache manages the device’s switch port.
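The lane math works out the same in each case: one upstream port plus a set of downstream ports consumes exactly the ASIC’s 48-lane budget. A quick sketch of that arithmetic (our own illustration; the configuration labels are hypothetical, not Lucid’s):

```python
# Our own illustration of the LT24102's 48-lane budget; the labels
# below are hypothetical names, not anything from Lucid's documentation.
TOTAL_LANES = 48

# (upstream width, downstream widths) for the three layouts described above
configs = {
    "two x16 downstream": (16, [16, 16]),
    "one x16 + two x8 downstream": (16, [16, 8, 8]),
    "four x8 downstream": (16, [8, 8, 8, 8]),
}

for name, (up, down) in configs.items():
    used = up + sum(down)
    assert used == TOTAL_LANES, f"{name} exceeds the lane budget"
    lanes = " + ".join(f"x{w}" for w in down)
    print(f"{name}: x{up} up, {lanes} down = {used} lanes")
```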
Hardware, Meet Software
Before you can use the Hydra engine, you have to install Lucid’s driver software, which is currently evolving in a notable way. It used to be that every time ATI or Nvidia updated their drivers, Lucid had to qualify them and iron out any new glitches introduced by either vendor. Clearly, this would have been an ongoing (and compounding) support nightmare for Lucid’s engineers.
Enabling Hydra is ridiculously easy through the driver's control panel.
But with the company’s most recent driver drop, version 1.4, a reshuffling of where the Hydra engine exists in software means it’s no longer necessary to sweat the Catalyst or GeForce version you’re using.
That’s not to say the game, API, and operating system compatibility stories have simplified at all:
- You’re still limited to DirectX 9 and DirectX 10.
- You’re still limited to Windows Vista (32- and 64-bit) and Windows 7 (32- and 64-bit). Moreover, X-mode (running an ATI and an Nvidia card in the same machine) is limited to Windows 7, which lets you install multiple graphics drivers concurrently.
- You’re still subject to Lucid’s own game testing. According to the company, many titles work right out of the box. Others require specific optimization in its driver. This is perhaps the biggest challenge facing Lucid in making Hydra a transparent technology for gamers to enjoy. Not only do the hardware vendors have to work out the kinks when a new title is launched, but then Lucid has to do the same thing.
Depending on the graphics card configuration you’re running, there are different lists of games qualified to pass QA. For example, in driver 1.4.1, Lucid presents a list of 42 different games validated on all five of its available hardware combinations. An additional 22 are supported by N- and A-modes (not X-). Nine others work in N-mode, and five work in A-mode. One of the things we’ll be testing today is Hydra’s compatibility. We’ve recently upgraded our benchmark suite with newer games, so it’ll be a challenge for Lucid, to be sure.
What if your favorite new game isn’t one of the ones qualified to run acceptably? Does that mean you’re out of luck? Not necessarily. You can manually add the game to the driver control panel, which will turn Hydra on for that title. Here’s the breakdown:
- If Hydra is disabled on the Fuzion board (through the control panel or system tray icon), any game you play will run on a single GPU.
- If Hydra is enabled and the game is not on the control panel’s list of validated/manually-added titles, it’ll run on a single GPU.
- If Hydra is enabled and the game is on the list, it’ll run on multiple GPUs and (hopefully) realize a speed-up. If you added the game manually, it could encounter problems given that it wasn’t validated.
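The breakdown above amounts to a simple decision rule. Here’s a minimal sketch of that logic (our own pseudologic for illustration, not Lucid’s actual driver code):

```python
# Our own sketch of the dispatch rules described above; this is an
# illustration, not Lucid's driver implementation.
def gpus_used(hydra_enabled: bool, game_on_list: bool) -> str:
    """Return how a game renders under the three rules above."""
    if not hydra_enabled:
        return "single GPU"    # Hydra off: everything runs on one card
    if not game_on_list:
        return "single GPU"    # not validated and not manually added
    return "multiple GPUs"     # on the list: Hydra load-balances the frame

# The three cases from the breakdown:
assert gpus_used(hydra_enabled=False, game_on_list=True) == "single GPU"
assert gpus_used(hydra_enabled=True, game_on_list=False) == "single GPU"
assert gpus_used(hydra_enabled=True, game_on_list=True) == "multiple GPUs"
```

Note that a manually-added title passes the same check as a validated one, which is why it gets multi-GPU rendering but no guarantee of stability.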
Perhaps Hydra’s sexiest selling point is the ability to augment your once-fastest Nvidia-based graphics card with something faster from ATI. Sure beats hawking that $500 GeForce GTX 285 on eBay for $250 used, right? Well, there are a few things you’ll need to keep in mind before assuming Radeons and GeForces get along.
Most important is the obvious: you’re using dissimilar architectures from competitors who use differentiation to sell more GPUs. Mixing them gets you the lowest common denominator. Lucid employs plenty of smart engineers, but they’re not magicians. They can’t make a Radeon HD 5870 accelerate PhysX or a GeForce GTX 260 support DirectX 11; instead, you give up both. You’ll see in the benchmarks that we weren’t able to achieve PhysX acceleration as long as an ATI card was installed, and we weren’t able to run the latest S.T.A.L.K.E.R.: Call of Pripyat test with DX11 lighting with an Nvidia board present.
There’s another caveat here that might temper your enthusiasm a bit: Lucid recommends mixing non-identical cards with performance profiles as close as possible in order to maximize scaling. Match too fast a board with something too slow and you’ll see minimal gain, if any. That might be a tough pill to swallow for upgraders who aren’t necessarily looking to jump sideways from a GeForce GTX 260 to, say, a Radeon HD 4890.