Radeon R9 290X Review: AMD's Back In Ultra-High-End Gaming

CrossFire: Farewell Bridge Connector; Hello DMA

Before now, adding a second, third, or fourth Radeon card in CrossFire was a matter of picking a compatible motherboard (with the right PCI Express slot spacing), dropping in the additional hardware, and linking the cards up with a bridge connector draped over the top. That connector shuttled the secondary card’s frames to the first, where an on-die compositing engine put the stream together for output.

This approach worked well in a world of 2560x1600 and below. It became problematic above four megapixels, though, where the bridge ran out of bandwidth and frame data had to move across PCI Express instead, dragging down practical frame rates at 5760x1080 and Ultra HD.
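To put rough numbers on that, consider the display traffic in a two-card alternate-frame-rendering (AFR) setup, where the secondary GPU renders every other frame and ships it to the primary card. The sketch below is a back-of-the-envelope estimate, not AMD data: it assumes 32-bit pixels, a 60 FPS target, and the roughly 900 MB/s commonly attributed to the legacy CrossFire bridge.

```python
# Estimate the frame traffic a secondary GPU pushes to the primary card
# in two-way AFR. All constants are illustrative assumptions.
BYTES_PER_PIXEL = 4        # assumed 8-bit RGBA frame buffer
TARGET_FPS = 60            # assumed combined frame-rate target
BRIDGE_BW = 900e6          # rough legacy CrossFire bridge throughput, bytes/s

resolutions = {
    "2560x1600": (2560, 1600),   # ~4.1 megapixels
    "5760x1080": (5760, 1080),   # triple 1080p
    "3840x2160": (3840, 2160),   # Ultra HD
}

for name, (w, h) in resolutions.items():
    frame_bytes = w * h * BYTES_PER_PIXEL
    # The secondary card renders every other frame, so it transfers
    # half of the combined frame rate across the link.
    link_traffic = frame_bytes * (TARGET_FPS / 2)
    print(f"{name}: {link_traffic / 1e6:.0f} MB/s "
          f"({link_traffic / BRIDGE_BW:.0%} of bridge capacity)")
```

Under those assumptions, 2560x1600 consumes roughly half of the bridge's capacity, while 3840x2160 slightly exceeds it, which is why the overflow spilled onto the PCI Express bus.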

So, AMD built a DMA engine into its compositing block, enabling direct communication between GPUs over PCI Express with enough throughput for the triple-screen and 4K configurations that performed so poorly before. This is not inconsequential: moving display data is a real-time operation, necessitating bandwidth provisioning, buffering, and prioritization.

The big benefit is that there's no longer any need for an external bridge. Whereas the interplay between the CrossFire connector and the PCIe bus continues to stymie the rest of AMD's cards (except for the Bonaire-based R7 260X, which also has the xDMA feature), a pair of Radeon R9 290Xes in CrossFire supports the company's frame pacing technology at 3840x2160 right out of the gate.

[Image: Windows display properties alongside the Catalyst Control Center, showing Frame Pacing enabled at 3840x2160]

You don't even need PCI Express 3.0 connectivity; the xDMA engine doesn't rely on any feature specific to the third-gen standard, and AMD says it will work on platforms limited to older versions of PCIe, too. With that said, if you're shopping for two R9 290Xes and you're still using a motherboard with PCI Express 2.0, it might be time to upgrade (or think about a combination of hardware that won't bottleneck performance).
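For context, here's the same style of rough math applied to the bus itself. The peak figures below (about 8 GB/s for PCIe 2.0 x16 and about 15.75 GB/s for PCIe 3.0 x16, per direction) are approximate, and the frame traffic is the 4K estimate from the earlier sketch:

```python
# Compare estimated 4K AFR frame traffic against approximate peak
# per-direction PCIe x16 throughput. Figures are rough approximations.
FRAME_TRAFFIC = 3840 * 2160 * 4 * 30     # ~1 GB/s, from the earlier estimate

links = {
    "PCIe 2.0 x16": 8.0e9,     # ~8 GB/s peak, per direction
    "PCIe 3.0 x16": 15.75e9,   # ~15.75 GB/s peak, per direction
}

for name, bw in links.items():
    print(f"{name}: frame traffic occupies {FRAME_TRAFFIC / bw:.0%} of peak")
```

Even on second-gen PCIe, xDMA's frame traffic only claims somewhere around an eighth of the link's peak throughput; the caveat is that the same bus also carries the rendering workload's normal traffic, which is where an older platform can start to pinch.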

AMD's timing is ideal. When I wrote Gaming At 3840x2160: Is Your PC Ready For A 4K Display?, there wasn't much point in testing Radeon cards; without frame pacing, we fully expected half of a dual-GPU configuration's frames to be dropped. But now the company has a new flagship seemingly built for high-resolution gaming. And, given that we already considered two GeForce GTX 780s the sweet spot for smooth frame rates on a 4K screen, it seems probable that you'd want at least two 290Xes, too.

  • beta212
    That's incredible. Especially at high res, I wonder how they do it. But the low price alone is enough to blow the competition away. Seriously think about it, it's around half the price for higher performance!
    - AMD: We're not aiming for the ultra high end.
    I think Nvidia just got trolled.
    Reply
  • slomo4sho
    Great price point. This card has already broken world records just a few hours after release!
    Reply
  • esrever
    2 of these for 4K looks amazing, but I'm a little disappointed by the power consumption when you crank up performance.
    Reply
  • aznguy0028
    I was thinking about hopping on the 7970ghz when it's on sale, but after seeing this, it's time to break apart the piggy bank for the 290x, what value!
    Reply
  • Benthon
    Like the conclusion said, you just can't argue about aesthetics and thermals at this price point/performance. Well done AMD, let's see team green's response! Go consumer!
    Reply
  • tuklap
    This is awesome for us ^_^
    Reply
  • Shankovich
    Wow, and it's pegged at 73% too. Even if nVidia's "780ti" beats the 290X, it probably won't beat a 290X running at full power. And if Mantle does make some big performance boosts, nVidia is going to be in a really tight spot. Looking forward to what they'll do. In the meantime, loving this competition! We all win in the end.
    Reply
  • julianbautista87
    daaaaayyyyyuuuummmm
    Reply
  • anxiousinfusion
    Wait the 290 X... X? is going to be $550?! Forgive me, padre for I have sinned.
    Reply
  • Darkerson
    Good job, AMD!
    Reply