Our basic test hardware remains unchanged from the initial RX 6600 XT launch, except we're using the public 21.8.1 AMD drivers now. You can refer to the launch review for additional details, but we're using the same 13-game test suite. We've also enabled Trixx Boost and tested it at 1440p (2176x1224) as an extra data point. All of the games worked with Trixx Boost, though Dirt 5 required setting the desktop resolution to 2176x1224 and then running at "100% Native" scaling. We'll start with 1080p and then look at 1440p and 4K ultra results.
Pretty exciting stuff, right? At 1080p, the three RX 6600 XT cards all land within 0.5% of each other. The only real difference came from our Ryzen 9 5900X testing, where performance improved by about 2% overall, thanks mostly to a couple of games that were hitting CPU limits on the i9-9900K. There are a few cases where the Sapphire card came out slightly ahead, but mostly these are margin-of-error differences. Let's move on.
The results at 1440p are again super close. This time, however, we've got Trixx Boost enabled. That improved performance by 24% on average, though a few games showed smaller or larger gains. Red Dead Redemption 2 was the outlier, with 52% higher performance. It's not clear if the card was running out of memory, or memory bandwidth (maybe both), but the lower resolution clearly helped a lot. Forza Horizon 4 showed the lowest gain of 13% because it was already hitting CPU limits. Everything else landed in the 20–32% range of improvement.
Trixx Boost is a nice extra feature, though it's possible to get similar results by manually creating a lower resolution as well — just without Radeon Image Sharpening doing the upscaling. We wouldn't necessarily spend a lot of extra money on a Sapphire card just to get access to Trixx Boost, but it's there if you happen to encounter a game that can't quite hit a steady 60 fps or more at your desired resolution.
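For reference, the 2176x1224 figure is simply 85% scaling applied to each axis of 2560x1440, which cuts the rendered pixel count by roughly 28%. A quick sketch of the arithmetic (the function names here are ours for illustration, not anything from Sapphire's software):

```python
def boosted_resolution(width: int, height: int, scale: float = 0.85) -> tuple:
    """Scale both axes by `scale`; Trixx Boost's default is 85%."""
    return round(width * scale), round(height * scale)

def pixel_savings(scale: float) -> float:
    """Fraction of pixels that no longer need to be rendered."""
    return 1.0 - scale * scale

def ideal_speedup(scale: float) -> float:
    """Upper-bound speedup if performance scaled perfectly with pixel count."""
    return 1.0 / (scale * scale)

w, h = boosted_resolution(2560, 1440)  # the 2176x1224 mode used in testing
print(f"{w}x{h}: {pixel_savings(0.85):.1%} fewer pixels, "
      f"up to {ideal_speedup(0.85):.2f}x in fully GPU-bound games")
```

The perfectly GPU-bound ceiling works out to about a 38% uplift, which is why the observed 24% average gain (dragged down by CPU-limited titles like Forza Horizon 4) is plausible, and why memory-constrained outliers like Red Dead Redemption 2 can exceed it.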
Last, we have 4K, a resolution that's generally a better fit for faster GPUs — unless you don't mind 30 fps gaming. Forza Horizon 4 was the only game to clear 60 fps at 4K, and several games are right at the 30 fps threshold. We also saw the biggest difference between the ASRock and Sapphire cards at 4K: just 3%, in Red Dead Redemption 2. Despite its substantially larger cooler and third fan, plus a higher TDP and modest overclock, the Sapphire Pulse is only a hair slower than the ASRock Phantom Gaming.
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
logainofhades
No bling and few extras aren't exactly cons, and neither is its performance at higher resolutions, as it was marketed as a 1080p card.
-Fran-
One super important point about Sapphire cards: they have upsampling built into Trixx. It may not be FSR or DLSS, but you can upscale ANY game you want through it using the GPU. I have no idea how it does it, but you can.
Other than that, this card looks clean and tidy. Not the best-looking Sapphire in history; that honor goes to the original RX 480 Nitro+, IMO. What a gorgeous design it was. I wish they'd use it for everything and get rid of the fake-glossy plastic garbo they've been using as of late.
Regards.
JarredWaltonGPU
Yuka said: One super important point about Sapphire cards: they have upsampling built into Trixx. It may not be FSR or DLSS, but you can upscale ANY game you want through it using the GPU. I have no idea how it does it, but you can.
Wow! It's almost like you ... didn't read the review. LOL
I talk quite a bit about Trixx Boost and even ran benchmarks with it enabled at 1440p, FYI.
-Fran-
JarredWaltonGPU said: Wow! It's almost like you ... didn't read the review. LOL I talk quite a bit about Trixx Boost and even ran benchmarks with it enabled at 1440p, FYI.
I did read it; I must have omitted it from my mind :P
Apologies.
TheAlmightyProo
Yuka said: One super important point about Sapphire cards: they have upsampling built into Trixx. It may not be FSR or DLSS, but you can upscale ANY game you want through it using the GPU. I have no idea how it does it, but you can. Other than that, this card looks clean and tidy. Not the best-looking Sapphire in history; that honor goes to the original RX 480 Nitro+, IMO.
iirc that RX 480 Sapphire Nitro (did they do this with the 580 too?) is the one with all the little holes in it and otherwise straight lines etc. I also seem to recall swappable fans, but could be wrong...
But yeah, it was a beaut, my fave design at the time, and I'd have so gone for one if I hadn't decided on a 1070 (Gigabyte Xtreme Gaming) as a safer bet for holding 2560x1080 longer before needing to drop to 1080p. That said, I've always liked Sapphires and eventually got one this year (a 6800 XT Sapphire Nitro+ SE at 3440x1440), and I'm absolutely not disappointed in it or in my first full AMD CPU in 16 years (5800X). Sure, it's not so great at RT, and FSR needs to catch up and catch on, but I have maybe 2-3 games out of the 50 I'd play that would make use of either; no great loss yet until they become more refined and ubiquitous, IMO. It runs like a dream, and cool too. Assuming AMD keeps up or overtakes in the next gen but one, I'd be happy to buy Sapphire again.
TheAlmightyProo
JarredWaltonGPU said: Wow! It's almost like you ... didn't read the review. LOL I talk quite a bit about Trixx Boost and even ran benchmarks with it enabled at 1440p, FYI.
Trixx looks like a damn good app, tbh. Having a Sapphire 6800 XT Nitro+ SE, I could be using it but haven't... I dunno, maybe because it's already good enough at 3440x1440?
However, I do have a good gaming UHD 120Hz TV (Samsung Q80T) waiting for couch gaming (after an upcoming house move), which might do well with a little boost going forward, as I'm not even thinking of upgrading for at least 3-5 years, after the first iterations of the 'big new things' have been refined somewhat.
So thanks for spending some time on that info and testing with it on. I might've ignored or forgotten it, but knowing it's there as a tried and tested option is good.
JarredWaltonGPU
TheAlmightyProo said: Trixx looks like a damn good app, tbh. Having a Sapphire 6800 XT Nitro+ SE, I could be using it but haven't... I dunno, maybe because it's already good enough at 3440x1440?
FWIW, you can just create a custom resolution in the AMD or Nvidia control panel as an alternative if you don't have a Sapphire card. Image quality is difficult to judge, and in some cases I think it does make a difference. However, I'm not quite sure how Trixx Boost outputs a different resolution via RIS. If you do a screen capture, it's still at the Trixx Boost resolution, as though it's simply rendering at a lower resolution and using the display scaler to stretch the output. Potentially it happens internal to the card's output, so that 85% scaling gets bumped up to native for the DisplayPort signal, but then how does that use RIS, since that would be a hardware/firmware feature?
Bottom line: rendering fewer pixels requires less GPU effort. How you stretch those pixels to the desired output is the question. DLSS and FSR definitely scale to the desired resolution, so Windows+PrtScrn captures images at the native resolution. Trixx Boost doesn't seem to function the same way. ¯\_(ツ)_/¯
InvalidError
logainofhades said: No bling and few extras aren't exactly cons, and neither is its performance at higher resolutions, as it was marketed as a 1080p card.
Performance at higher resolutions is definitely a con, since in a sane GPU market nobody would be willing to pay anywhere near $400 for a "1080p" gaming GPU with a gimped PCIe 4.0 x8 interface and a 128-bit VRAM bus. This is the sort of penny-pinching you'd only expect to see on sub-$150 GPUs. On Nvidia's side, you don't see the PCIe interface get cut down until you get into sub-$100 SKUs like the GT 1030.
As some techtubers put it, all GPUs are turd sandwiches. The 6600 XT isn't good for the price; it's just the least-worst turd sandwich at the moment if you absolutely must buy a GPU now.
logainofhades
Price aside, the card was advertised as a 1080p card, and the 6600 XT does 1080p quite well. I don't understand the gimped interface either, but AMD promised 1080p and delivered. Prices are stupid and will be for quite some time; many are saying 2023 before this chip shortage ends.
InvalidError
logainofhades said: Price aside, the card was advertised as a 1080p card, and the 6600 XT does 1080p quite well.
$400 GPUs have been doing 1080p quite well with contemporary titles for over a decade. I personally find it insulting that AMD would brag about that in 2021.