Should I get the R9 380 or 390?

William Schnatz

Hi. I've recently begun building my new computer and I'm not sure which of the two cards to buy. I was running an HD 7850, but it's started to slow down in more recent titles like Fallout 4, where it usually gets 45-60 FPS at medium. I only play at 1080p and don't plan on going higher; I don't see a real benefit above that resolution. So which of these two would be better for me?
 
The 380 is a solid choice with a good price / performance ratio versus the competition. I can't say the same for the 390... the 970 is just so much better in every respect. The 380 does start to strain a little in today's more demanding games... and newer games, as may be expected, will tax the system even more.

http://www.guru3d.com/articles_pages/asus_radeon_r9_380_strix_review,12.html

If you can afford the 390, I'd get a 970. It's faster out of the box and has about 2-3 times the overclocking headroom.

 

George Phillips

It's better to pick between the 390 and the 970; the 380 is good but not nearly as fast as the other two. The R9 390 is the more advanced card and supports a more complete DX12 feature set.

GTX 970:
Pros:
-Lower power consumption
-Less noise
-Runs cooler
-Nvidia's drivers are generally better than AMD's
-PhysX, GameWorks, and other effects
-More games favor Nvidia
Cons:
-3.5 GB of RAM
-Less powerful
-Not as future-proof as the R9 390
-Not as fast above 1080p
-Less powerful than the R9 390 in games that use a lot of video card memory (e.g. BF4)

R9 390
Pros:
-Better performance
-4.5 GB more memory
-More future-proof
-Performs better at resolutions above 1080p
-Better than the 970 in games that need more VRAM
Cons:
-AMD's driver updates and game support aren't as good as Nvidia's
-Higher power consumption
-Hotter and noisier than the 970

 
The 970 and 390 are basically the same. Benchmarks show them trading blows; it depends entirely on where you look and what settings were used. The 970 is not quicker out of the box. OC'd, the 970 will beat the 390, but only at a very high OC. My OC'd 390 will match a reasonably OC'd 970 in benchmarks - game-specific performance favours the 970 in GameWorks titles; in other games it is more even.
YES, the 390 does consume about 70 watts more when both are OC'd, but mine runs at 65C and quietly.

The 390 is much quicker than the 380.

I'd actually recommend most people go with the 970. But I quite liked the idea of the 390: it was £40 cheaper, and OC'd it is an absolute beast.
 
To be factual:

1. The R9 390 does not have better performance than the 970 at 1080p... or 1440p. While some games favor one or the other, averaged out the 970 leads.
[TechPowerUp relative performance chart, 1920x1080]


2. According to TPU, the 970 overclocks 17%, the 390 just 8%, widening the above gap. In fact, again using TPU's numbers, the 970 tops the 390X at 1080p and 1440p when both are overclocked, though to be fair by < 1% in the latter case.

3. There is absolutely no benefit whatsoever to anything over 4 GB at 1080p or 1440p.
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/5

4. The 970 does not have 3.5 GB of VRAM; it has 4 GB of RAM divided into 3.5 GB and 0.5 GB segments. Dozens of sites have tried to replicate the alleged "performance issues" without success.

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

5. The above data flips the "future-proof" comment.

6. 4K resolution isn't on the table.

7. There are NO games whatsoever that *need* more than 4 GB of VRAM, as the ExtremeTech article conclusively established. People need to understand that allocated RAM (which is what GPU-Z reports) has no bearing on use or need. For every game that can use more than 4 GB, the settings required to make that happen involved a) 4K resolution and b) settings that make the game completely unplayable.

GPU-Z: An imperfect tool

GPU-Z claims to report how much VRAM the GPU actually uses, but there's a significant caveat to this metric. GPU-Z doesn't actually report how much VRAM the GPU is actually using — instead, it reports the amount of VRAM that a game has requested. We spoke to Nvidia's Brandon Bell on this topic, who told us the following: "None of the GPU tools on the market report memory usage correctly, whether it's GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn't mean that they actually use it. They simply request it because the memory is available."

When we started this process, I assumed that a number of high-end titles could readily be provoked into using more than 4GB of VRAM. In reality, this proved a tough nut to crack. Plenty of titles top out around 4GB, but most don’t exceed it. Given the lack of precision in VRAM testing, we needed games that could unambiguously break the 4GB limit.

We tested Assassin’s Creed Unity, Battlefield 4, BioShock Infinite, Civilization: Beyond Earth, Company of Heroes 2, Crysis 3, Dragon Age: Inquisition, The Evil Within, Far Cry 4, Grand Theft Auto V, Metro Last Light (original), Rome: Total War 2, Shadow of Mordor, Tomb Raider, and The Witcher 3: Wild Hunt. Out of those 15 titles, just four of them could be coaxed into significantly exceeding the 4GB limit: Shadow of Mordor, Assassin’s Creed: Unity, Far Cry 4, and Grand Theft Auto V. Even in these games, we had to use extremely high detail settings [@ 4k] to ensure that the GPUs would regularly report well over 4GB of RAM in use.

While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU. It's reasonable to ask why we didn't fine-tune the results, attempting to measure the impact of just going over the 4GB threshold with the GTX 980 Ti or Titan X, and then test with those settings. Unfortunately, GPU-Z simply doesn't measure accurately enough to make this possible.

The most we can say of a specific 4GB issue at 4K is that gamers who want to play at 4K will have to do some fine-tuning to keep frame rates and resolutions balanced, but that’s not unique to any vendor. If you’re a gamer who wants 4K and ultra-high quality visual settings, none of the current GPUs on the market are going to suit you. HBM2 and 14/16nm GPUs may change that, but for now playing in 4K is intrinsically a balancing act. The Fury X may require a bit more fine-tuning than the 980 Ti or Titan X, but that’s not grounds for declaring 4GB an unsuitable amount of VRAM in today’s games.

In short, by the time you get to any setting where 4 GB might be needed, the GPU is totally incapable of providing playable frame rates. In other words, absolutely useless.
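If you want to see the exact number these utilities surface, here's a minimal sketch using the pynvml bindings for Nvidia's NVML library. Per the quote above, treat the "used" figure as requested/allocated memory, not actual need:

```python
# Minimal sketch: read the VRAM counter that monitoring tools report.
# Requires the pynvml bindings (pip install nvidia-ml-py) and an Nvidia GPU.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
info = nvmlDeviceGetMemoryInfo(handle)    # .total / .free / .used, in bytes
# Per the article: this "used" value reflects what applications have
# requested (allocated), not what the GPU actually needs.
print(f"reported: {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB")
nvmlShutdown()
```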

8. There are more 970s in use today than AMD's entire lineup, including all 2xx and 3xx series cards.

I don't see the power and heat issues as significant, in that I don't expect they would enter the decision-making process unless performance were equal. But what AMD has done with the 2xx and 3xx series is simply to put a very aggressive clock on the cards in the box. Still, in this price niche, the 970 has the overall edge... and at 3%, the additional heat / power issues and associated costs should be part of the decision-making process.

But when both are pushed up near the maximum temperature / voltage limits, the 970's advantage breaks 12%. At lower price points, as with the 960 / 380, nVidia doesn't have a horse in the race... at least until the 960 Ti drops.
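For what it's worth, the two claims above hang together arithmetically. A quick back-of-the-envelope check, treating the ~3% stock lead and TPU's 17% / 8% average OC headroom from point 2 as given:

```python
# Rough sanity check of the stock vs. overclocked gap claimed above.
stock_lead = 1.03             # 970 ~3% ahead of the 390 out of the box
oc_970, oc_390 = 1.17, 1.08   # TPU's average overclocking gains

oc_lead = stock_lead * oc_970 / oc_390
print(f"overclocked advantage: {(oc_lead - 1) * 100:.1f}%")  # ~11.6%, i.e. ~12%
```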
 

chedda87

The 3.5 GB is real... The Division beta at 1080p on a 970 with ultra settings uses 3.5 GB of VRAM... a 980 Ti at 1080p with ultra uses 4.5 GB... So this shows the 970 is avoiding the last 0.5 GB because of how much slower it is.
 
^ Agree with all that. As I said, the 970 is the clearer choice. However, sometimes a challenge is good. I got my 390 for £50 less than the 970 at the time, and only decently overclocked 970s can beat it in Fire Strike, for example, so on value for money I am doing quite well.

Really, it's personal choice at the time. I'll never buy into the power argument (outside of PSU choice) as it's irrelevant compared to other life costs / cost of the system etc., and the 390 runs just as cool, so for me, quite liking AMD, it was a clear choice!
 


Again, read the link and the dozens of referenced sites reporting zero impact at 1080p / 1440p. The first 3.5 GB is used as normal VRAM; the last 0.5 GB is used for frame buffering. If I set aside 8 GB of system RAM as a RAM drive, have I somehow lost one of my 4 x 8 GB sticks of RAM? Or am I just using some of it in a different way?
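To make the segmented layout concrete, here's a purely conceptual sketch (my own toy model, not Nvidia's actual driver logic) of a pool where allocations land in the fast 3.5 GB partition first and only spill into the slower 0.5 GB partition once the fast one is full:

```python
# Toy model of a segmented 3.5 GB + 0.5 GB VRAM pool. A real driver also
# splits and migrates allocations; this only illustrates the fill order.
FAST_MB = 3584   # 3.5 GB full-speed segment
SLOW_MB = 512    # 0.5 GB slower segment

class SegmentedVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def allocate(self, size_mb):
        """Prefer the fast segment; spill to the slow one only when full."""
        if size_mb <= FAST_MB - self.fast_used:
            self.fast_used += size_mb
            return "fast"
        if size_mb <= SLOW_MB - self.slow_used:
            self.slow_used += size_mb
            return "slow"
        raise MemoryError("out of VRAM")

pool = SegmentedVram()
print(pool.allocate(3400))   # fast  -- a typical 1080p working set
print(pool.allocate(400))    # slow  -- only once past the 3.5 GB mark
```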

nVidia had a problem with the 970's performance being too close to the 980, so they did two things:

a) Changed the architecture to bring it down a bit.
b) When it was still too close to the 980, they lowered the throttling point to 80C. The 980 and 980 Ti (max 92C) throttle at 85C, even though they have lower max temps than the 970 (98C).

1. Just because your monthly statement reports a credit limit of, say, $5k, that doesn't mean you owe anyone $5k. It only means they have "allocated" you $5k. And that's what GPU-Z and every other utility reports: the allocation, not the usage. See the ExtremeTech quote above.
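The same allocated-vs-used distinction is easy to demonstrate with ordinary system RAM. A small sketch (Linux-flavoured; RSS behaviour varies by OS, and psutil is assumed to be installed) that reserves a big buffer whose resident footprint only grows once pages are actually written:

```python
# Reserving memory is not using it: an anonymous mmap claims address
# space, but resident memory (RSS) only grows once pages are written.
# GPU tools report the equivalent of the reservation, not the residency.
import mmap
import psutil   # pip install psutil

proc = psutil.Process()
rss_mb = lambda: proc.memory_info().rss / 2**20

baseline = rss_mb()
buf = mmap.mmap(-1, 2**30)                 # "allocate" 1 GB; nothing used yet
print(f"after reserving 1 GB: +{rss_mb() - baseline:.0f} MB resident")

chunk = 256 * 2**20
buf[:chunk] = b"\x01" * chunk              # actually touch 256 MB of it
print(f"after writing 256 MB: +{rss_mb() - baseline:.0f} MB resident")
buf.close()
```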

2. Numerous websites have tested this with the 2 GB and 4 GB 960s, again finding no performance difference outside of one or two terrible console ports. Alienbabeltech did it with 2 GB and 4 GB 770s, testing something like 50 games, and found no difference at 5760 x 1080. Most importantly, Max Payne would not install with the 2 GB card installed. So they fooled it by installing the 4 GB card and testing it, then swapping in the 2 GB... same fps, same quality, same experience. You can see the results on YouTube now, as the site is down.

https://www.youtube.com/watch?v=o_fBCvFXi0g

Need more ...

http://www.legitreviews.com/gigabyte-geforce-gtx-760-4gb-video-card-review-2gb-4gb_129062/4
https://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_g1_gaming_4gb_review,12.html

3. http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

Thing is, the quantifying fact is that nobody really has massive issues; dozens and dozens of media outlets have tested the card with in-depth reviews like the ones here on my site. Replicating the stutters and stuff you see in some of the videos, well, to date I have not been able to reproduce them unless you do crazy stuff, and I've been on this all weekend.

Let me clearly state this, the GTX 970 is not an Ultra HD card, it has never been marketed as such and we never recommended even a GTX 980 for Ultra HD gaming either. So if you start looking at that resolution [4K] and zoom in, then of course you are bound to run into performance issues, but so does the [4 GB] GTX 980. These cards are still too weak for such a resolution combined with proper image quality settings. Remember, Ultra HD = 4x 1080P. Let me quote myself from my GTX 970 conclusions: "it is a little beast for Full HD and WHQD gaming combined with the best image quality settings", and within that context I really think it is valid to stick to a maximum of 2560x1440, as 1080P and 1440P are the real domain for these cards. Face it, if you planned to game at Ultra HD, you would not buy a GeForce GTX 970.

Overall you will have a hard time pushing any card over 3.5 GB of graphics memory usage with any game unless you do some freaky stuff. The ones that do pass 3.5 GB are mostly poor console ports or situations where you game in Ultra HD or DSR Ultra HD rendering. In that situation I cannot guarantee that your overall experience will be trouble-free; however, we have a hard time detecting and replicating the stuttering issues some people have mentioned.

Utilizing graphics memory past 3.5 GB can result in performance issues, as the card needs to manage some really weird stuff in memory; it's nearly load-balancing. But the fact remains it seems to be handling that well; it's hard to detect and replicate oddities.