R9 280x v R9 290 MSI for gaming and mining?

skyscraper7

Hi, I'm going to build a new PC for gaming and mining Dogecoin or Litecoin (when not in use). Would a 280X or 290 be better for gaming and mining? From what I've read the 290 is worth it for gaming but not mining. Any other opinions? I will be getting MSI to match my mobo.

R9 280x: http://www.overclockers.co.uk/showproduct.php?prodid=GX-225-MS&groupid=701&catid=56&subcat=1842

R9 290: http://www.box.co.uk/MSI_Radeon_R9_290_Gaming_Video_Graphics__1536054.html?gclid=CI_ym_i4prwCFRDLtAod8W4A4w

Rest of build:
i5 4670k
Toshiba DT01ACA100
Seasonic M12II-620 Semi-Modular 620W
Cooler Master Hyper 212 EVO
Either MSI Radeon R9 280X or MSI Radeon R9 290
TeamGroup Xtreem LV 8GB (2x4GB) DDR3 2400MHz
Zalman Z11 Plus case
Any suggestions or compatibility errors?
 
Solution
The 290 is faster in both gaming AND mining; I'm not sure who told you otherwise.
The only differences between the two are that the 290 is faster, hotter, uses more power and is more expensive. It is just a more powerful card in the lineup. Also, your parts are all compatible :)
Just make sure the motherboard you get is Z87 (matches your processor).

skyscraper7


Yeah, I'm planning on using the MSI Z87-G45 GAMING. Thanks for the quick answer.
EDIT: I've decided to go with the 280X because of the £90 price difference :)
 
A couple of points:

1) The 290 not being "worth it" for mining applies to all modern gaming cards, not just this model. The problem is that mining difficulty keeps increasing, so your statistical chance of making money versus the cost of electricity keeps decreasing.

It seems mining will continue to evolve, so dedicated machines MAY be the only viable way. They may also require new hardware from time to time, so the goal may simply be to have them pay for themselves before becoming obsolete.
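To put very rough numbers on the electricity-versus-revenue trade-off, here's a back-of-envelope sketch. Every figure in it (hashrate, coin yield, coin price, power draw, electricity rate) is a placeholder assumption, not real market data, so plug in your own numbers:

```python
# Rough daily mining profit estimate for one card.
# Every number below is an assumed placeholder, NOT real market data.

def daily_profit(hashrate_khs, coins_per_day_per_khs, coin_price_gbp,
                 power_watts, electricity_gbp_per_kwh):
    """Net profit per day = coin revenue minus electricity cost."""
    revenue = hashrate_khs * coins_per_day_per_khs * coin_price_gbp
    energy_kwh = (power_watts / 1000.0) * 24        # kWh burned per day
    cost = energy_kwh * electricity_gbp_per_kwh
    return revenue - cost

# Assumed figures: ~700 kH/s scrypt for a 280X-class card, a guessed
# coin yield (this shrinks as difficulty rises), and a typical UK tariff.
print(daily_profit(hashrate_khs=700,
                   coins_per_day_per_khs=0.002,
                   coin_price_gbp=0.10,
                   power_watts=250,
                   electricity_gbp_per_kwh=0.14))
```

As difficulty rises, the coins-per-day figure falls while the electricity cost stays the same, which is exactly why the margin keeps shrinking.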

2) Which card?
The R9-290 on paper is slightly superior to the R9-280X. You must be very careful to research which models have good cooling solutions and don't lose performance under load. MOST DO.

I don't have the time to research now, but I believe there was a particular SAPPHIRE card that did fine, but some of the others lost 13% of their performance once the THERMAL throttling kicked in.

Even if some do NOT throttle under load, make sure the NOISE isn't excessive from the fans. There are many REVIEWS you can look at. Find one that says MINIMAL throttling with acceptable NOISE and buy that EXACT model.

Other:
Unless you have an interest in games with MANTLE, you may also wish to consider the GTX 780, specifically the EVGA model with a 967MHz base / 1020MHz boost clock and the ACX cooler. It's about $510 USD after mail-in rebate. It has no thermal throttling and runs very quietly in comparison.

*I wouldn't limit your choice of cards to MSI just to match your motherboard. There's absolutely NO advantage in performance, and there may well be (probably is) a better card from Sapphire, Asus, Gigabyte, or EVGA, though again I don't have time to read a lot of reviews.
 

skyscraper7


Thanks for the answer, I think I'll go with the 280X.
Can I just ask what kind of OC I can get with that card?
 


OC?
Look up some reviews. I already told you that many of the cards are losing performance under load due to overheating which causes the GPU frequency to be reduced to avoid crashing. Overclocking might provide NO benefit or even make things worse.

I also can't say because it varies depending on how good the COOLER is. Thus, I'm back to telling you to read reviews for the R9-280X from every vendor and find this information out, specifically:
- thermal throttling?
- overclocking potential
- NOISE (idle, load, and when overclocked)

Here's a start, but I recommend finding more: http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655.html
 
The 280X is not losing performance with its aftermarket coolers.
The same goes for the 290 and 290X (although they don't OC very well).

The stock coolers were junk, but almost all the 280X variants have great coolers and don't suffer from thermal throttling (even the 290 and 290X do not suffer from thermal throttling with aftermarket coolers).
 
And I haven't seen a 280X yet that suffers from thermal throttling (as none of them have stock coolers anyway).
The MSI Gaming is about the only one that suffers from thermal throttling according to the review the other guy put up, so just stay away from that one.
 


As I said though, some of the cards are throttling performance. THIS card from MSI was, but apparently a BIOS update helped. So this needs to be researched carefully: http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/31.html

There is roughly a 20% difference from the best to lowest performing cards, and some with very noisy fans as well.

SAPPHIRE seems to have one of the best R9-280X cards (Sapphire R9-280X Toxic OC).
http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-12.html

I believe it was tested against a few other cards and was the only one not to throttle (some throttled by as much as 13%). I forget the exact details.

That Sapphire, again possibly the best R9-280X, got a 7.8% overclock by one reviewer.
http://www.hardocp.com/article/2013/12/02/sapphire_toxic_r9_280x_video_card_review/10

*Also, not to be a jerk, but you can't gain a 23% improvement by overclocking 10%. At BEST you would get a 10% improvement, and only if the GPU were the sole bottleneck. At 7.8% (call it 8%) that would be 54FPS instead of 50FPS for example, but probably closer to 52 or 53FPS in reality. If fan noise goes up much, it's probably best to stay at STOCK clocks.
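If it helps, here's a tiny sketch of that scaling logic. The gpu_bound_fraction knob is my own illustrative parameter (not something from a review) for how much of the frame time is actually limited by the GPU:

```python
# Back-of-envelope FPS estimate after a GPU overclock.
# Assumes FPS scales linearly with clock speed only for the GPU-bound
# share of the frame time; real results vary per game.

def fps_after_oc(base_fps, oc_percent, gpu_bound_fraction=1.0):
    """Scale FPS by the clock bump, weighted by how GPU-bound the game is."""
    return base_fps * (1 + (oc_percent / 100.0) * gpu_bound_fraction)

print(fps_after_oc(50, 7.8))        # fully GPU-bound: ~53.9 FPS
print(fps_after_oc(50, 7.8, 0.6))   # partly CPU-bound: ~52.3 FPS
```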
 

skyscraper7



Did a bit of digging around and found this:
http://www.guru3d.com/articles_pages/msi_radeon_r9_280x_twinfrozr_gaming_oc_review,10.html
http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/31.html

So the VRM can get very hot. Does that matter too much? Should I go with Gigabyte or Asus then? I found an even cheaper Gigabyte card (if it ever comes back into stock):
http://www.amazon.co.uk/Gigabyte-GDDR5-PCI-E-DVI-I-DisplayPort/dp/B00FSC5BNG

 

skyscraper7



OK, thanks for the answers, I'll look into the Toxic.
 
Some BF4 Mantle info: http://battlelog.battlefield.com/bf4/news/view/bf4-mantle-live/

The BF4 update to support Mantle is out, but you need the AMD Catalyst 14.1 beta driver, I believe, which I don't think is out yet.

You need a modern AMD graphics card, but it appears the largest benefits come from the CPU-side improvements, so it will depend on what CPU you have as well. More data will come out once people can test it themselves.

Very interested to see results comparing the same card on the following CPUs: i5-4670K (4 cores), i7-4770K (4 cores, Hyper-Threaded), FX-6300 (6 cores), and FX-8350 (8 cores). The FX-6300 in particular is a relatively inexpensive CPU. We're still only talking about ONE GAME, however, but it's interesting nonetheless, and in two years this may have a large impact on gaming.

The FX-8350 + HD7970 setup got a 25% improvement.

I suspect that an i5-4670K + R9-280X would get similar improvements. The i5-4670K might only be 4-core, but some of that capacity is still unused; using more THREADS doesn't necessarily require more CORES, so the i5-4670K might still see similar gains. A 25% improvement needs roughly that much CPU headroom, so if the CPU wasn't already using more than about 80% of its capacity (100/80 = 1.25), it might manage it.
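Here's that headroom argument as a quick sketch (the utilisation figures are hypothetical, just to show the arithmetic):

```python
# Upper bound on the FPS gain you can get from freeing up CPU time,
# assuming the CPU is the bottleneck. Utilisation values are made up.

def max_gain_from_cpu_headroom(cpu_utilisation):
    """Percent FPS gain possible if the CPU was the limiting factor."""
    return (1.0 / cpu_utilisation - 1.0) * 100

print(max_gain_from_cpu_headroom(0.80))   # 25.0% -- the 100/80 case above
print(max_gain_from_cpu_headroom(0.95))   # ~5.3% -- almost no room left
```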

It's obviously far more complicated than this, but we'll have benchmarks soonish. I'm willing to bet that the OP's rig (i5-4670K + R9-280X) gets at least a 20% improvement in BF4 if he plays that.

Most importantly, the LOW FPS scores might see an even LARGER percentage improvement, so it would be awesome if people jumped from, say, 20FPS to 30FPS on the low end.
 
The Intel CPUs get less of a boost from Mantle than the AMD ones do. It really just cuts the CPU out of the equation. Since the Intel CPUs are better in pure gaming (it's a fact, it's been tested, blah blah blah), they see less of a performance increase with any card than the AMD ones do. Does this close the ~5% gap in performance versus an 8350? The answer is maybe. So basically Mantle has just made the AMD side more appealing if you're gaming on a lower-end CPU.

Mantle also makes huge improvements to CrossFire no matter what CPU you are using (obviously something in DirectX was making the CPU hold back multi-card setups).

So basically, if you read the reviews on a GPU and the tester used a 4770K or 4960X or something crazy, the relative performance of the cards still holds pretty true (only a 1-2% increase for single AMD cards with these processors), so we will still be able to tell which cards are more powerful.

Nvidia will likely adopt Mantle eventually (probably not until late this year or early next year). Intel won't like this at all, as it will reduce the need for powerful processors for gaming.