AMD Radeon R9 380X Nitro Launch Review
Tonga’s been around for more than a year now, and it’s taken all this time for the fully enabled GPU to finally reach regular desktop PCs. Until now, this configuration was offered exclusively on a different platform, Apple's iMac. But does it still make sense today?
Introduction & Specifications
The question is a fair one. After all, a year is a long time in the graphics game. Back in 2014, the uncut version of Tonga (with 2048 Stream processors, rather than the 1792-shader version that became Radeon R9 285) might have been an interesting complement to Hawaii, which lives on in the Radeon R9 390 and 390X. But the landscape doesn't look the way it did last fall.
To begin, AMD launched its Fiji GPU in no less than three different flavors: the Radeon R9 Fury, Fury X and Nano. And Nvidia didn't sleep through any of that; it has a portfolio stuffed with Maxwell-based processors. Still, AMD’s new Radeon R9 380X does seem to hit a gap in Nvidia’s line-up, right between the price and performance of GeForce GTX 960 and 970. This could make the 380X a more interesting option than it might otherwise appear.
Antigua Pro, a relabeled version of Tonga Pro, offers four shader engines, each of which has access to seven compute units (CUs). The 380X, armed with Antigua XT, increases this to eight CUs per engine. Just like the previous generation’s GCN-based GPUs, every CU houses 64 shader units and four texture units, resulting in a total of 2048 (instead of 1792) Stream processors and 128 (instead of 112) texture units.
Tonga and Antigua have fewer render back-ends than Hawaii and Grenada: their four shader engines carry only two each, compared to the 390(X)'s four per engine. Each individual back-end can render four full-color pixels per clock, which amounts to a total of 32 per cycle, only half of what Hawaii and Grenada can do.
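As a quick sanity check (our tally, not something AMD publishes in this form), the headline unit counts follow directly from the GCN layout described above:

```python
# Antigua XT (Radeon R9 380X) unit-count tally, per the GCN layout above.
shader_engines = 4
cus_per_engine = 8        # Antigua XT; Antigua Pro enables only 7
shaders_per_cu = 64
tex_units_per_cu = 4
backends_per_engine = 2   # Hawaii/Grenada have 4 per engine
pixels_per_backend = 4    # full-color pixels per clock

stream_processors = shader_engines * cus_per_engine * shaders_per_cu
texture_units = shader_engines * cus_per_engine * tex_units_per_cu
pixels_per_clock = shader_engines * backends_per_engine * pixels_per_backend

print(stream_processors)  # 2048
print(texture_units)      # 128
print(pixels_per_clock)   # 32
```

Swapping `cus_per_engine` to 7 reproduces Antigua Pro's 1792 shaders and 112 texture units.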
The aggregate memory bus is also narrower than Hawaii/Grenada's. It's 256 bits wide, sacrificing as much as 27 percent of its throughput compared to the higher-end cards. AMD's mitigation is the same one it used for the Radeon R9 380: color data in the frame buffer can be written and read in a lossless compressed format.
A number of other features are also integrated. Parallel execution of instructions between two SIMD lanes, new and improved algorithms to plan compute tasks and new 16-bit floating-point and integer instructions for compute and media processing tasks all get rolled into the latest implementation of GCN. TrueAudio and FreeSync (an alternative to Nvidia’s G-Sync technology) both find their places in the hardware as well. And, finally, there are the Unified Video Decoder (UVD) and updated Video Coding Engine (VCE).
Specifications
AMD isn’t making a reference board for its 380X, instead tasking its partners with finding the best possible configuration. Consequently, we’re using a partner’s card that AMD provided to us.
| Specification | Radeon R9 380X |
|---|---|
| Process | 28nm |
| Transistors | 5 billion |
| GPU Clock Frequency | 1040MHz |
| Shader Units | 2048 |
| Texture Units | 128 |
| Texture Fill Rate | 133.1 GT/s |
| ROPs | 32 |
| Pixel Fill Rate | 33.3 GP/s |
| Memory Interface | 256-bit |
| Memory | 4GB GDDR5 |
| Memory Clock Frequency | 1500MHz |
| Memory Bandwidth | 192 GB/s |
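The derived figures in the table fall out of the clocks and unit counts. A minimal sketch of that arithmetic (our reconstruction; GDDR5 transfers four data words per memory clock, which is how 1500MHz yields 6 GT/s per pin):

```python
# Reconstruct the spec table's derived numbers from its base clocks/counts.
gpu_clock_mhz = 1040
texture_units = 128
rops = 32
mem_clock_mhz = 1500      # GDDR5: 4 transfers per clock -> 6000 MT/s per pin
bus_width_bits = 256

texture_fill_gts = gpu_clock_mhz * texture_units / 1000           # GTexels/s
pixel_fill_gps = gpu_clock_mhz * rops / 1000                      # GPixels/s
bandwidth_gbs = mem_clock_mhz * 4 * (bus_width_bits // 8) / 1000  # GB/s

print(texture_fill_gts)  # 133.12
print(pixel_fill_gps)    # 33.28
print(bandwidth_gbs)     # 192.0
```

The table rounds 133.12 GT/s and 33.28 GP/s to one decimal place.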
Comments

ingtar33: So, full Tonga, release date 2015; matches full Tahiti, release date 2011. So why did they retire the Tahiti 7970/280X for this? Three generations of GPUs with the same rough number scheme and same performance is sorta sad.
Eggz: Seems underwhelming until you read the price. Pretty good for only $230! It's not that much slower than the 970, but it's still about $60 cheaper. Well placed.
chaosmassive: Been waiting for this card review. I saw photographer fingers in the silicon reflection, btw!
Onus: Once again, it appears that the relevance of a card is determined by its price (i.e. price/performance, not just performance). There are no bad cards, only bad prices. That it needs two 6-pin PCIe power connections rather than the 8-pin plus 6-pin needed by the HD7970 is, however, a step in the right direction.
FormatC (quoting chaosmassive, "I saw photographer fingers on silicon"): I know, these are my fingers and my wedding ring. :P Call it a unique watermark. ;)
psycher1: Honestly, I'm getting a bit tired of people getting so over-enthusiastic about which resolutions their cards can handle. I barely think the 970 is good enough for 1080p. With my 2560x1080 (to be fair, 33% more pixels) panel and a 970, I can almost never pull off ultimate graphics settings in modern games, with The Witcher 3 only averaging about 35fps at medium-high according to GeForce Experience. If this is already the case, give it a year or two. Future proofing does not mean you should need to consider SLI after only 6 months and a minor display upgrade.
Eggz (replying to psycher1):
Yeah, I definitely think that the 980 ti, Titan X, FuryX, and Fury Nano are the first cards that adequately exceed 1080p. No cards before those really allow the user to forget about graphics bottlenecks at a higher standard resolution. But even with those, 1440p is about the most you can do when your standards are that high. I consider my 780 ti almost perfect for 1080p, though it does bottleneck here and there in 1080p games. Using 4K without graphics bottlenecks is a lot further out than people realize.
ByteManiak: Everyone is playing GTA V and The Witcher 3 in 4K at 30 fps and I'm just sitting here struggling to get a TNT2 to run Descent 3 at 60 fps in 800x600 on a Pentium 3 machine.