misfitkid86

Distinguished
May 23, 2012
Hi all! AMD has just launched their Trinity line of CPUs, and they look pretty good to me, but what I'm still a bit lost on is this: for high-end gaming, is the new A10-5800K overall a better chip for me than the i5-3570K? I care less about onboard graphics because I plan on getting a somewhat high-end GPU (7950), but I wonder what, if anything, I'd lose compared to the Intel chip. I'll be using it for games like Skyrim, Metro 2033, The Witcher 2, and Borderlands 2. With those games and others, plus the usual YouTube and other sites, is the A10 a good choice? Thanks for your input!
 

NoUserBar

Distinguished
Nov 1, 2011
You will lose nothing, other than the money. (Which is a lot of money)

A 7950 should be enough. The i5-3570K is still better in general, I think. I've heard from people, and seen in some reviews, that the 5800K can match it in some aspects, but matching isn't better, so yeah.
 
Oh, it's this thread again.

1] An APU is not a high-end part. You don't get any gains from pairing it with higher-end Radeon cards; you essentially strip the APU of its purpose. Consider the APU a separate project running alongside FX, the two working hand in hand until AMD is fully HSA. So no, I don't suggest you pair a 5800K with an HD 7950; from the AMD side I would suggest an FX processor, preferably an FX 8000-series chip.

2] Gaming performance differentials are tiny, yet they get harped on like a fishwives' tale.

http://www.techpowerup.com/reviews/AMD/FX8150/11.html I'm struggling to find benchmarks that give max, min and average frame rates; the ones that do severely discredit the "Intel is the one and only gaming processor" lollery that does the rounds.

Simply put, only a handful of broken or yet-to-be-patched titles ever show a significant FPS advantage that would make you want to spend $350 on an i7 to get roughly 3-5 FPS more than an FX-8150 at $160 (never mind the cost of the rest of the setup). Witcher 2, BF3, Skyrim, Metro, Dirt 3, any Codemasters game barely, and I mean barely, show a difference between AMD chips and their respective price-point competitors. I do accept that a minor group of games show around a 10% difference, but those are all CPU-dependent titles.

The other aspect that is hidden by many reviewers is MAX, MIN and AVG: they happily parade Intel's max FPS at low and high res but hide the dismal fall-off in AVG and MIN FPS, notably at higher res. This is odd, as I don't fully understand why Intel is affected more than AMD in this regard.
 
Here is Anandtech's rather limited array of results: basically, in low-res games that are not GPU-dependent, Intel stretches its legs; add the GPU/CPU interplay and the lead disappears. High res with maxed textures again hurts Intel setups more, while AMD stays rather locked in place.

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/8


The crux: where a game is CPU-dependent, in most instances Intel will do better, and by a margin; where games are GPU/CPU-dependent, the margin is non-existent.
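
Very roughly, you can picture the CPU-bound vs GPU-bound split like this (a toy sketch with made-up timings, ignoring that CPU and GPU work overlap in real engines): each frame is gated by whichever of the CPU or GPU takes longer, so a faster CPU only shows up when the CPU is the slow side.

```python
# Toy bottleneck model; all timings are hypothetical, not from any benchmark.
def fps(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)  # the slower component dictates the frame time
    return 1000.0 / frame_ms

# CPU-bound (low res / low detail): a faster CPU shows up directly in FPS.
print(fps(cpu_ms=8.0, gpu_ms=5.0))    # ~125 FPS
print(fps(cpu_ms=11.0, gpu_ms=5.0))   # ~91 FPS - the CPU gap is clearly visible

# GPU-bound (high res / maxed textures): the same CPU gap barely matters.
print(fps(cpu_ms=8.0, gpu_ms=20.0))   # 50 FPS
print(fps(cpu_ms=11.0, gpu_ms=20.0))  # still 50 FPS
```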

Of the list of titles you play:

BL2 - This game seems to be broken on AMD setups and should be patched soon enough, but the game is Nvidia/Intel partnered, so it's unlikely that this title will ever be any good for AMD components.

Witcher 2 - A CPU-bound game; the graphics element is limited, so Intel is better, but it's actually a margin-of-error differential.

Skyrim - A direct port; CPU-dependent, with an almost completely ported graphics engine that is rather unimpressive. An Intel system is generally better in this title, but it's not a discernible difference.

Metro 2033 - AMD and Intel are toe to toe; it favors neither and gives literally similar results to the percentage point. This is about the only CPU/GPU-interactive game on your list.
 


Ya mean like this one? http://techreport.com/review/23246/inside-the-second-gaming-performance-with-today-cpus/3

[Image: skyrim-fps.gif]


More importantly, the Intel chips spend far less time 'stuck' rendering difficult frames (the '99th percentile frames') and hence game more smoothly than the AMD chips tested:

[Image: skyim-99th.gif]


[Image: skyrim-beyond-16.gif]


However, if we crank down the tolerance to 16.7 milliseconds, the equivalent of 60 FPS, then the differences become apparent. The FX processors again fare poorly, relatively speaking. If you covet glassy smoothness, where the system pumps out frames consistently at low latencies close to your display's refresh rate, then you'll want a newer Intel processor. In this scenario, no entry in the FX lineup comes as close to delivering that experience as a Phenom II X4 980 or a Core i5-655K.
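
For what it's worth, here is roughly what those frame-time metrics mean as I understand them (a minimal sketch with made-up frame times, not data from the review): the 99th-percentile number is the frame time that 99% of frames come in under, and "time beyond 16.7 ms" adds up how far the slow frames overshoot the 60 FPS budget.

```python
# Rough sketch of frame-time metrics; per-frame render times in ms are hypothetical.
frame_times = [14.2, 15.1, 16.9, 14.8, 33.4, 15.5, 16.1, 22.7, 15.0, 14.9]

# 99th-percentile frame time: (approximately) the time 99% of frames come in under.
ordered = sorted(frame_times)
p99 = ordered[int(0.99 * (len(ordered) - 1))]

# Time spent beyond the 16.7 ms (60 FPS) budget: how badly the slow frames overshoot.
budget_ms = 16.7
time_beyond = sum(t - budget_ms for t in frame_times if t > budget_ms)

# Average FPS, which can look healthy even when a few frames are very slow.
avg_fps = 1000.0 / (sum(frame_times) / len(frame_times))

print(f"99th percentile frame time: {p99:.1f} ms")
print(f"Time beyond {budget_ms} ms: {time_beyond:.1f} ms")
print(f"Average FPS: {avg_fps:.1f}")
```

Two rigs can post nearly the same average FPS while one spends far more time stuck on those slow frames, which is exactly the gap the charts above show.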

Of course, Trinity (or Piledriver) wasn't reviewed in the above, so there's probably some improvement in those numbers.
 

misfitkid86

Distinguished
May 23, 2012
Hmmm, OK, I'm leaning towards just sticking with my original 3570K setup, but that of course may change once the higher-end Trinity chips are released. I just want the best gaming rig possible for a rather cheap price. Thus far I've not paid much mind to AMD for CPUs, but I'm thinking it might be a good budget choice for gaming.
 


Of your list, apart from Metro, the others favor Intel cores more; you will get between 5-10 FPS more on average. If you crank up the details then that becomes useful; if you're just an FPS pusher then it's rather pointless.
 

LastVampyer

Distinguished
Jul 30, 2010
I used to be an AMD fanboy. My old Phenom X4 was so awesome, it never put a foot wrong and lasted me a good few years. Now I use an Intel 3930K because I have been so disappointed by the performance of AMD's CPUs lately. I thought Bulldozer was going to blow Intel out of the water so to speak... I was dead wrong. :(

If you're going for the "best of the best" PC - Intel all the way.

If you're on a budget but still want to play all the latest games and stuff like that - AMD is your best bet.
 


If you take the integrated graphics out of the equation and all you care about is sheer CPU power, the Ivy Bridge i5 is the better CPU. The Trinity APUs are better than the old Llanos, but they still can't compete with Ivy Bridge in CPU power. The i5 has better IPC, making it faster and more efficient at processing information than the APUs.