We noted at the start that the rated TBP comes in higher than we'd normally expect, likely thanks to the higher GPU clocks. We're not going to take the time (right now) to see how much power use drops at lower clocks, but we'll run our normal suite of Powenetics testing and check the GPU power consumption and other aspects. We run Metro Exodus at 1440p ultra and FurMark at 900p to collect these results.
It's good to see the RX 6700 XT come in a bit lower than the rated TBP, sitting at around 215W. As usual, FurMark required a bit more power, but it's only 2W in this case — nothing to worry about. It looks as though AMD was a bit conservative with its power ratings. That's better than the alternative, which we've seen quite a bit of lately, of cards using 5W-10W more than the rated TBP. Of course, third-party cards are free to increase the power limits, and clearly, the Sapphire card did.
Power scales with voltage and clock speed, and the 6700 XT has the highest reference clocks of any GPU to date by about 175MHz. Interestingly, the Sapphire and reference cards are basically tied on average clocks in Metro Exodus, while the Nitro+ makes use of its additional power in FurMark. Maximum clocks during gaming tend to average roughly 2.5GHz, depending on the game, while the RX 6700 XT still cruises along at 2.35GHz in FurMark.
Fan speeds directly affect temperatures, and here we see the reduced cooling capacity of AMD's reference design. It's not loud, but it does hit higher temperatures, though 72C is hardly alarming. The larger fans help make up for having fewer of them, but the triple-fan cards all achieve lower temperatures. Meanwhile, Sapphire's RX 6700 XT posts some of the lowest average fan speeds we've ever seen.
Lower fan speeds naturally mean lower noise levels. The noise floor of our test environment and equipment measures 34 dB(A) at a distance of 15cm from the side of the GPU. We put the SPL (Sound Pressure Level) meter close to the GPU fans to focus on their noise, rather than case fans or other noise sources. The reference RX 6700 XT measured 40.4 dB(A), while Sapphire's card was only a touch above the noise floor at 36.0 dB(A).
Radeon RX 6700 XT Mining Performance
Unlike Nvidia's RTX 3060 12GB, AMD isn't even trying to stop miners from using its cards. On the one hand, that might seem like a poor decision, but we also saw how that played out when Nvidia accidentally posted a development driver that didn't fully implement the Ethereum mining speed limiter. Anyway, for better or worse, cryptocurrency mining is a thing right now, so we checked the hashing performance of the RX 6700 XT using NiceHashMiner.
Before running the built-in benchmark in precision mode, we tuned for optimal Ethereum mining performance. Basically, that means finding the highest stable memory clocks and then dropping the GPU clocks (or power limit, depending on the card) until we find a good balance.
In the case of the RX 6700 XT, we settled on 50% maximum GPU clocks (around 1300MHz) with a 150MHz GDDR6 overclock (17.2Gbps effective) and fast memory timings enabled. We also set the power limit to the maximum — that appears to help AMD's RDNA2 cards make better use of the memory bandwidth even if GPU clocks don't improve. Actual power consumption was only about 120W with these settings.
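The 17.2Gbps figure falls straight out of the memory clock: GDDR6 transfers 8 bits per pin per memory clock, so a 150MHz bump over the stock 2000MHz (16Gbps) clock lands at 17.2Gbps. A quick sketch of that arithmetic (the 8x multiplier and the stock clock are standard GDDR6 figures, nothing specific to our testing):

```python
def gddr6_effective_gbps(mem_clock_mhz: float) -> float:
    """GDDR6 moves 8 bits per pin per memory clock,
    so effective per-pin rate (Gbps) = clock (MHz) * 8 / 1000."""
    return mem_clock_mhz * 8 / 1000

stock = gddr6_effective_gbps(2000)        # 16.0 Gbps (stock RX 6700 XT)
tuned = gddr6_effective_gbps(2000 + 150)  # 17.2 Gbps with the +150MHz OC
print(f"stock {stock}Gbps, tuned {tuned}Gbps")
```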
Considering Ethash tends to favor memory bandwidth over other factors, and the RX 6700 XT has a 192-bit bus instead of the 256-bit bus on the Navi 21 6800/6900 cards, the drop in hash rate pretty much matches the drop in bandwidth.
With tuning, we're able to get around 65MH/s out of the Navi 21 GPUs, while the RX 6700 XT maxed out at around 47.5MH/s: 25% less bandwidth, roughly 25% lower hash rate. Maybe additional tuning would improve hash rates a bit more, but it's unlikely the 6700 XT will get much above 48MH/s with current mining software.
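To show how closely the measured hash rates track memory bandwidth, here's a back-of-the-envelope check (the bus widths and 16Gbps per-pin rates are the published specs; the MH/s figures are our tuned results from above):

```python
def bandwidth_gbs(bus_width_bits: int, per_pin_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * per_pin_gbps / 8

navi21 = bandwidth_gbs(256, 16)  # RX 6800/6900 series: 512 GB/s
navi22 = bandwidth_gbs(192, 16)  # RX 6700 XT: 384 GB/s

bw_ratio = navi22 / navi21   # 0.75, i.e. 25% less bandwidth
hr_ratio = 47.5 / 65         # ~0.73, the measured Ethash scaling
print(f"bandwidth ratio: {bw_ratio:.2f}, hash-rate ratio: {hr_ratio:.2f}")
```

The two ratios land within a couple of percentage points of each other, which is why we call the hash rate bandwidth-limited.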
It's worth pointing out that 48MH/s is actually lower than the previous-gen RX 5700/5700 XT, again, thanks to the narrower memory bus width. It's slightly faster than the RX 5600 XT due to the higher memory clocks and also matches the "oops, we accidentally unlocked it" RTX 3060 12GB in Ethash rates.
What does that mean for gamers? Not much, except that the other cards capable of around 48MH/s currently sell for $800 or more, which means you can pretty much count on the RX 6700 XT selling out, with a very limited supply of cards priced anywhere close to the official MSRP.
MORE: Best Graphics Cards
MORE: GPU Benchmarks and Hierarchy
MORE: All Graphics Content
I think the 6700 XT could be a good replacement for my 1070. However, do I really want to waste even more time fighting bots, adding to cart only to be unable to check out, or being led to believe I have a shot at getting a GPU when I really never did? No. I'm not waking up early to watch a page that instantly flips from "coming soon" to "sold out" again.
Power efficiency is poor as expected. Meh. That tiny chainsaw whacked off a few too many CUs.
But I did make an error: I was looking at the memory clock, not the GPU clock, when I thought the GPU was still running at 2.13GHz. I've done a bit more investigating now that I'm more awake (it was a late night, again — typical GPU launch). With the slider at 40% (which gives an MHz value of something like 1048MHz), I got nearly the same mining performance as with the slider at 65% (1702MHz). Here are three screenshots showing the 40%, 50%, and 65% Max Frequency settings (with advanced control ticked on so you can see the MHz values). This is with the Sapphire Nitro+, so the clocks are slightly higher than the reference card's, but the performance is pretty similar (actually, the reference card was perhaps slightly faster at mining for some reason, only by about 0.3MH/s, but still).
I've updated the text to remove the note about the max frequency not appearing to work properly. It does, my bad, the description of the tuned settings was and is still correct: 50% Max Freq, 112% power, 2150MHz GDDR6, slightly steeper fan curve, 115-120W.
My 1070 is now worth double what I paid for it. I'm going to sell it this weekend and forget about AAA gaming for the rest of the year. Plenty of indie games get by happily with lower-end GPUs or iGPs.
Yeah, that would be swell.
What I seriously want to do is just checkout with 1 GPU without the GPU being ripped out of my cart mid-checkout or the vendor cancelling the order after it's been placed.