AMD Radeon RX 6600 XT Review: The Memory Compromise

Navi 23 only has 8GB GDDR6 on a 128-bit interface



Unlike previous RX 6000 series launches, the Radeon RX 6600 XT will only be available from AMD's add-in board (AIB) partners. AMD sent us the ASRock RX 6600 XT Phantom Gaming D OC for this review, which of course doesn't use reference clocks. It sports a 2.4-slot cooler with triple fans and a slightly boosted TDP (based on our testing), which means it's probably running about 5% faster than a reference-clocked RX 6600 XT.

That's an important distinction, because nearly all of the other GPUs in our testing are reference AMD and Nvidia cards without factory overclocks. Factory-overclocked models that run 3–5% faster should be easy enough to find (well, as easy as it is to find any GPU in stock right now). The one exception is the RTX 3060, which also has no reference model. Instead, we used an EVGA RTX 3060 XC, which comes factory overclocked, though its compact dual-fan cooling solution is admittedly far less robust.

It's also worth noting that ASRock's suggested price for the Phantom Gaming D comes in at $499.99, over 30% higher than AMD's SEP (Suggested Etail Pricing) of $379.99. ASRock does offer other RX 6600 XT cards, like the Challenger, at the $379.99 MSRP, but without all the extras the Phantom offers. Whether the cards will actually sell at that price, in reasonable quantities, remains to be seen.

ASRock's card uses three traditional-style axial fans, meaning there's no integrated rim like we see on higher-end GPUs. The heatsink features a copper base that makes direct contact with the GPU, plus "premium" thermal pads that contact the memory. The Phantom also has a 0dB silent cooling mode for light workloads: the fans don't spin at all while the GPU stays below ~50C. Given the 160W (maybe 180W) TDP, cooling should be more than sufficient, and it is, as we'll detail later in our power, temperature, and fan speed testing.
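To make the 0dB behavior concrete, here's a minimal sketch of a semi-passive fan curve in Python. The ~50C cutoff matches the card's behavior as described; the 90C full-speed point and the linear ramp between the two are illustrative assumptions, not ASRock's actual firmware curve, which would also add hysteresis so the fans don't rapidly toggle around the threshold.

```python
def fan_duty(gpu_temp_c: float) -> float:
    """Map GPU temperature (C) to a fan duty cycle from 0.0 to 1.0.

    Hypothetical 0dB-style curve: fans stay fully stopped below the
    cutoff, then ramp linearly to 100% at the assumed upper bound.
    """
    zero_db_cutoff = 50.0  # fans off below this temperature
    full_speed_at = 90.0   # assumed 100% duty at or above this

    if gpu_temp_c < zero_db_cutoff:
        return 0.0
    if gpu_temp_c >= full_speed_at:
        return 1.0
    # Linear ramp between the two thresholds.
    return (gpu_temp_c - zero_db_cutoff) / (full_speed_at - zero_db_cutoff)

# Example: at 70C, halfway up the ramp, the fans run at 50% duty.
print(f"{fan_duty(70.0):.0%}")
```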

The card measures 306x131x47mm and weighs 898g (my measurements), a comparative lightweight next to cards like the RX 6800 XT. It has a single 8-pin power connector, a metal backplate, and RGB lighting on both the Phantom Gaming logo along the top of the card and the clear center fan. Connectivity consists of the now-typical single HDMI 2.1 port plus three DisplayPort 1.4 outputs (with DSC).

Test Setup for Radeon RX 6600 XT 


Our basic test hardware carries over from previous reviews, with a few updates. First, we're now running the latest version of Windows 10 (21H1, build 19043.1151). We're also using motherboard BIOS version 7B12v1B1 on the Intel Core i9-9900K system, which adds beta Resizable BAR support (aka 'ReBAR' or Smart Access Memory). At the bottom of our parts list, you'll also see the Ryzen 9 5900X, MSI X570 Godlike, and Thermaltake GF1. Those make up a second test PC, because we wanted a system with PCIe Gen4 support and an AMD CPU to see how much that changes the performance story (also with ReBAR / Smart Access Memory enabled).

Considering this is a more modest GPU, CPU bottlenecks aren't likely to be much of a problem, even on the relatively old Core i9-9900K. We previously looked at CPU scaling on the latest GPUs for the RTX 3060 Ti launch, with a focus on the top-performing solutions at the time (Ryzen 9 5900X, Core i9-10900K, and Core i9-9900K). While there were some differences, the overall net gain from swapping to a different CPU was only 1–2 percent. The Core i9-11900K has since launched, but with Alder Lake and Zen 4 in the works, we'll hold off on further testbed upgrades (which would necessitate retesting everything) until a later date.

We're sticking with the same 13 games we've been using since the RTX 3080 launch, all with DXR (DirectX Raytracing) disabled. We have a second test suite that includes DXR in ten games for those curious about how the RX 6600 XT holds up with maxed-out graphics settings and ray tracing. We provided a more extensive look at ray tracing and DLSS performance recently, after the RTX 3060 launch, and found that AMD's RX 6700 XT matched the RTX 3060 (without DLSS running). We'll use those same tests and test results here.

One thing we're not testing for this review is FidelityFX Super Resolution performance. Actually, I did run a couple of tests (in Godfall and Terminator: Resistance), but the blessing and curse of AMD FSR is that it works with everything — at least on the hardware front — and the gains are generally pretty similar across the latest AMD and Nvidia GPUs. That means if a card is faster without FSR, it's generally faster with FSR as well. Plus, none of the games in our standard test suite currently support FSR. 
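For reference, FSR 1.0 is a spatial upscaler: the game renders at a reduced internal resolution and FSR scales the result up to the target, so the speedup mostly tracks the reduced pixel count rather than anything vendor-specific. A minimal sketch of the resolution math, using AMD's published per-axis scale factors for the four quality modes:

```python
# AMD's published per-axis scale factors for FSR 1.0's quality modes.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal resolution FSR upscales from in a given mode."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# Example: a 2560x1440 target in Quality mode renders at roughly 1707x960,
# only about 44% of the output pixel count.
print(fsr_render_resolution(2560, 1440, "Quality"))
```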



  • Zarax
I know this is a lot to ask, but given the ridiculous MSRP, you might want to design a benchmark suite of discounted games (you could use isthereanydeal to see which ones have been at least 50% off at some point) that would be a good fit for lower-end cards, or for cards available used at acceptable prices.

    Something like "Budget gaming: how do the cheapest cards on ebay perform?" could be a very interesting read, especially given your high standards in writing and testing.
  • salgado18
    I like the decision to lower memory bus width to 128 bits. It lowers mining performance without affecting gaming performance, and can't be undone like Nvidia's software-based solution.
  • ottonis
AMD's main problem is that production capacity constraints mean they can't make nearly as many GPUs as they would like, and they're thus being far outsold by Nvidia.

It's pretty obvious that AMD had one goal in mind with Navi 23: increase production output as much as possible by shrinking the die size while maintaining competitive 1080p gaming performance.
Apparently, they accomplished that task. Whether the MSRP will have to be adjusted remains to be seen, but I'd guess not as long as the global GPU shortage lasts.
  • InvalidError
    salgado18 said:
    I like the decision to lower memory bus width to 128 bits. It lowers mining performance without affecting gaming performance, and can't be undone like Nvidia's software-based solution.
Still, having a 128-bit bus on a $400 GPU is outrageous, especially if a VRAM bandwidth bottleneck is a major contributor to the 6600 (XT)'s collapse at higher resolutions and in DXR.

With only 8GB of VRAM, the GPU can only work on one ETH DAG at a time anyway, so narrowing the bus to 128 bits shouldn't hurt too much. A good chunk of the reason 12GB GPUs have a significant hash rate advantage is that they can work on two DAGs at a time while 16GB ones can do three, and extra memory channels help with that concurrency.
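A quick back-of-the-envelope check of both points above: peak memory bandwidth follows directly from bus width and data rate (the 6600 XT pairs its 128-bit bus with 16 Gbps GDDR6), and the DAG count is just VRAM divided by DAG size. The ~4.3GB figure below assumes the Ethash DAG size around the time of this launch; the DAG grows every epoch.

```python
import math

# Peak memory bandwidth (GB/s) = bus width (bits) x data rate (Gbps) / 8.
bus_width_bits = 128
data_rate_gbps = 16  # the RX 6600 XT's GDDR6 runs at 16 Gbps
print(f"Peak bandwidth: {bus_width_bits * data_rate_gbps / 8:.0f} GB/s")  # 256 GB/s

# DAG concurrency: how many Ethash DAGs fit in a card's VRAM,
# assuming a mid-2021 DAG size of roughly 4.3GB.
dag_size_gb = 4.3
for vram_gb in (8, 12, 16):
    print(f"{vram_gb}GB VRAM holds {math.floor(vram_gb / dag_size_gb)} DAG(s)")
# 8GB -> 1, 12GB -> 2, 16GB -> 3, matching the concurrency argument above.
```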
  • -Fran-
    salgado18 said:
    I like the decision to lower memory bus width to 128 bits. It lowers mining performance without affecting gaming performance, and can't be undone like Nvidia's software-based solution.
Sorry, but you're not entirely correct there. It does affect performance. It's very much an "at this moment in time" situation where you don't see a severe bottleneck yet, but crank the resolution up to 1440p and it falls behind the 5700 XT almost consistently; that's not a good sign for this card's future, even at 1080p. There's also the PCIe 3.0 x8 link, which takes away about 5% performance. HUB already tested it, and the biggest drop was DOOM Eternal, with a whopping 20% loss. That's massive and shameful.

I have no idea why AMD made this card this way, but they're definitely trying to anger a lot of people with it... me included. This card cannot be over $300, and that's the hill I will die on.

    Regards.
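For context on the x8 link mentioned above: the 6600 XT exposes only eight PCIe lanes, so on a PCIe 3.0 platform it gets half the host bandwidth of a Gen4 x8 (or a Gen3 x16) slot. A minimal sketch of the link arithmetic, assuming the standard 128b/130b encoding that both generations use:

```python
# Usable PCIe bandwidth (GB/s) = lanes x transfer rate (GT/s) x encoding / 8.
def pcie_bandwidth_gbs(lanes: int, transfer_rate_gts: float) -> float:
    return lanes * transfer_rate_gts * (128 / 130) / 8

print(f"PCIe 3.0 x8:  {pcie_bandwidth_gbs(8, 8):.2f} GB/s")   # ~7.88
print(f"PCIe 4.0 x8:  {pcie_bandwidth_gbs(8, 16):.2f} GB/s")  # ~15.75
print(f"PCIe 3.0 x16: {pcie_bandwidth_gbs(16, 8):.2f} GB/s")  # ~15.75
```

Halving the host link mainly hurts titles that stream a lot of data across the bus, which is consistent with the outsized drop reported for DOOM Eternal.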
  • ezst036
    The 6600 XT looks like a good Linux gaming card for Steam.
  • InvalidError
    Yuka said:
I have no idea why AMD made this card this way, but they're definitely trying to anger a lot of people with it... me included. This card cannot be over $300, and that's the hill I will die on.
Were it not for the GPU market going nuts over the last four years, plus increases in raw material and logistics costs, this would have been a $200-250 part.
  • -Fran-
    InvalidError said:
Were it not for the GPU market going nuts over the last four years, plus increases in raw material and logistics costs, this would have been a $200-250 part.
I would buy that argument if it weren't for the fact that both AMD and nVidia are raking in cash like fishermen on a school of a million fish.

Those are just excuses to screw people. I was definitely giving them the benefit of the doubt at the start, but not so much anymore. Their earnings reports are the damning evidence that they are just taking advantage of the situation, and their excuses are just that: excuses. They can lower prices, period.

    Regards.
  • ottonis
    Yuka said:
I would buy that argument if it weren't for the fact that both AMD and nVidia are raking in cash like fishermen on a school of a million fish.

Those are just excuses to screw people. I was definitely giving them the benefit of the doubt at the start, but not so much anymore. Their earnings reports are the damning evidence that they are just taking advantage of the situation, and their excuses are just that: excuses. They can lower prices, period.

    Regards.

The market has its own rules. As long as demand exceeds the number of GPUs AMD can produce, they will keep prices high. That's just how (free) markets work.
You can't blame a company for maximizing its profits within the margins the market provides.
For a bottle of water, you usually pay less than a dollar. Now, in the desert, with the next station 500 miles away, you would pay even $10 (or $100?) for a bottle of water if you were thirsty.
This will not change as long as the global GPU shortage lasts.
  • -Fran-
    ottonis said:
The market has its own rules. As long as demand exceeds the number of GPUs AMD can produce, they will keep prices high. That's just how (free) markets work.
You can't blame a company for maximizing its profits within the margins the market provides.
For a bottle of water, you usually pay less than a dollar. Now, in the desert, with the next station 500 miles away, you would pay even $10 (or $100?) for a bottle of water if you were thirsty.
This will not change as long as the global GPU shortage lasts.
You're misunderstanding the argument: I don't care about their profit over my own spending. I understand perfectly well that they're companies and that their only purpose is maximizing profit for their shareholders.

So sure, you can defend the free market and their behaviour all you want, but why? Are you looking after their well-being? Are you a stakeholder? Do you have a vested interest in their market value? Are you getting paid to defend their scummy behaviour towards consumers? Do you want to pay more and more each generation for no performance increase per tier? Do you want to pay a car's worth for a video card at some point? Maybe a house's worth?

Do not misunderstand the arguments about AMD and nVidia being scummy. You should be aware that you have to complain, and not buy products at bad price points, or they'll just keep pushing the limit, because that's what they do.

    Regards.