Radeon RX 6400 Suffers 14% Performance Loss Over PCIe 3.0

GPU (Image credit: Shutterstock)

TechPowerUp recently tested AMD's new Radeon RX 6400 GPU on the slower PCIe 3.0 interface to measure how much performance it loses compared to PCIe 4.0, and the drop turned out to be substantial thanks to the GPU's unorthodox configuration of just four PCIe lanes. On average, the performance deficit was around 14%, and it grew larger at higher resolutions.
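A back-of-the-envelope calculation shows why the lane count matters so much. The sketch below uses the standard per-lane transfer rates and 128b/130b line encoding; protocol overhead is ignored, so real-world throughput is somewhat lower than these figures:

```python
# Rough per-direction PCIe link bandwidth from lane count and transfer rate.
# PCIe 3.0 and newer use 128b/130b encoding, so 128/130 of raw bits are data.
def pcie_bandwidth_gb_s(lanes: int, gt_per_s: float) -> float:
    """Usable bandwidth in GB/s, one direction, ignoring protocol overhead."""
    return lanes * gt_per_s * (128 / 130) / 8

gen3_x4 = pcie_bandwidth_gb_s(4, 8.0)     # RX 6400 in a PCIe 3.0 slot
gen4_x4 = pcie_bandwidth_gb_s(4, 16.0)    # RX 6400 in a PCIe 4.0 slot
gen3_x16 = pcie_bandwidth_gb_s(16, 8.0)   # GTX 1650's PCIe 3.0 x16 link

print(f"PCIe 3.0 x4:  {gen3_x4:.2f} GB/s")   # ~3.94 GB/s
print(f"PCIe 4.0 x4:  {gen4_x4:.2f} GB/s")   # ~7.88 GB/s
print(f"PCIe 3.0 x16: {gen3_x16:.2f} GB/s")  # ~15.75 GB/s
```

In other words, dropping the four-lane card into a PCIe 3.0 slot halves its already-narrow link, leaving it with a quarter of the bandwidth a conventional x16 card enjoys on the same platform.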

AMD's Radeon RX 6400 is the company's newest entry-level GPU for the desktop market, featuring AMD's super tiny Navi 24 die, which was originally designed for laptops. Like its bigger brother, the Radeon RX 6500 XT, this GPU packs only four PCIe 4.0 lanes as a cost-cutting measure. Compared to the Radeon RX 6500 XT, the Radeon RX 6400 carries a 25% reduction in core count and an 11% reduction in memory bandwidth: the card has just 768 cores (down from 1,024) and 128 GB/s of memory bandwidth, thanks to a cut in GDDR6 speeds to 16 Gbps. Memory capacity, however, stays the same at 4GB. The Radeon RX 6400's main strength is its power consumption: at 53W, it draws half as much as the Radeon RX 6500 XT, making it very useful in small or budget systems that cannot power higher-end GPUs requiring auxiliary power.
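The 128 GB/s figure follows directly from the memory configuration: GDDR6 bandwidth is the per-pin data rate multiplied by the bus width, and Navi 24 uses a 64-bit bus. A quick sketch of the arithmetic (the 18 Gbps Radeon RX 6500 XT figure is taken from AMD's published specs):

```python
# Peak GDDR6 memory bandwidth from per-pin data rate and bus width.
def gddr6_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin rate (Gbps) times bus width, over 8."""
    return data_rate_gbps * bus_width_bits / 8

rx_6400 = gddr6_bandwidth_gb_s(16, 64)     # 16 Gbps GDDR6, 64-bit bus
rx_6500_xt = gddr6_bandwidth_gb_s(18, 64)  # 18 Gbps GDDR6, 64-bit bus

print(f"RX 6400:    {rx_6400:.0f} GB/s")     # 128 GB/s
print(f"RX 6500 XT: {rx_6500_xt:.0f} GB/s")  # 144 GB/s
```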

Across the 24 games TechPowerUp tested, the Radeon RX 6400 running on PCIe 3.0 averaged a 14% performance deficit at both 1080p and 1440p compared to PCIe 4.0. The 4K results were even worse, with a 23% deficit. However, we doubt that matters much, since the Radeon RX 6400 delivers unplayable frame rates at 4K on either PCIe generation.

Some of the worst offenders were F1 2021 and Doom Eternal, which saw 79% and 43% performance drops at 1080p, respectively. Thankfully, almost all of the other titles tested fell in the 14% range or lower, but these two games show just how PCIe bandwidth-intensive some game engines can be.

TechPowerUp's earlier testing of the Radeon RX 6500 XT produced nearly the same results, with a 13% performance deficit for PCIe 3.0 versus PCIe 4.0. The Radeon RX 6400 can't escape Navi 24's four-lane limitation, even if the GPU itself is slower. On PCIe 3.0, the Radeon RX 6400 was almost 20% slower than the GeForce GTX 1650, which runs on the same PCIe 3.0 interface but with a much healthier x16 lane configuration. When installed in a PCIe 4.0 slot, however, the Radeon RX 6400 performed similarly to the GeForce GTX 1650.

Consumers eyeing the Radeon RX 6400 need a PCIe 4.0 platform to avoid the severe performance drop-off on PCIe 3.0. The biggest issue is that most systems are still on PCIe 3.0, and the consumers most likely to buy a Radeon RX 6400 are on older systems. If you don't have access to PCIe 4.0, a better alternative is Nvidia's GeForce GTX 1650, or even the older Radeon RX 570, which is faster than the Radeon RX 6400 and doesn't require PCIe 4.0 to unlock its full performance.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • danbob9992
Would it really be much more expensive to put in a PCIe 3.0 x16 interface (or even x8)? Sounds like it would have been the better option for compatibility with older PCs while not crippling the performance.
  • magbarn
    danbob9992 said:
Would it really be much more expensive to put in a PCIe 3.0 x16 interface (or even x8)? Sounds like it would have been the better option for compatibility with older PCs while not crippling the performance.
These were originally supposed to be used in laptops, which usually have a limited number of PCIe lanes.
  • Alvar "Miles" Udell
    It has to be said that this test does carry a big asterisk, that the lowest resolution tested, 1920x1080, is -barely- on the playability side of games with this card, with the average FPS figure at 38fps on PCIe 4.0.

Granted, there would likely still be a measurable difference; this is really a 1280x720 card.
  • KananX
It doesn’t lose much performance in games where it has solid fps to begin with, like 1-5% max. In games where it has atrocious fps of lower than 60 or lower than 30, yes, there it loses a lot of performance with PCIe 3.0. Again TPU with terrible settings that overburden the card extremely and skew the results. A reasonable benchmark would’ve been to test almost all but the esports titles with only medium settings, because that is what you will ultimately use with this GPU.
  • cryoburner
    danbob9992 said:
Would it really be much more expensive to put in a PCIe 3.0 x16 interface (or even x8)? Sounds like it would have been the better option for compatibility with older PCs while not crippling the performance.
    If this card were positioned at the price point they likely planned it at, the x4 interface might have made more sense. Most likely, this card was intended to be positioned closer to $100 USD, not $160, but ended up where it is as a result of crypto mining's influence on the market, combined with limited production capacity. Even on a PCIe 4.0 connection, the RX 6400 generally isn't quite as fast as an RX 470, a card that was readily available for around this price back in 2016, more than 5 years ago. You could even find a number of RX 570s for around $120 bundled with free new game releases a few years back, or RX 580s with games for $160. This card certainly doesn't belong at this price point.

    Alvar Miles Udell said:
    It has to be said that this test does carry a big asterisk, that the lowest resolution tested, 1920x1080, is -barely- on the playability side of games with this card, with the average FPS figure at 38fps on PCIe 4.0.

    Granted there would likely still be a measurable difference, this is really a 1280x720 card.
    It's not exactly a 720p card, it's just not a 1080p "Ultra" card. And really, ultra settings typically don't bring much in the way of visual improvement even compared to medium, in exchange for a significant hit to performance, and are arguably not worth using unless one has the performance to spare.

    Though I agree that they probably should have adjusted the settings to more realistic levels for the card. Even on a 4.0 connection at 1080p, a number of these games were averaging frame rates in the twenties or below, which is not likely how most will be using these cards. For more meaningful results, they should have adjusted settings to average at least 30fps on the 4.0 connection.