Gaming At 1920x1080: AMD's Trinity Takes On Intel's HD Graphics
Tags:
- AMD
- Trinity
- Graphics
Anonymous
September 27, 2012 3:46:07 AM
Think you're pretty snazzy because your integrated graphics core plays mainstream games at 1280x720? We're on to bigger and better things, like modern titles at 1920x1080. Can AMD's Trinity architecture push high-enough frame rates to make this possible?
Gaming At 1920x1080: AMD's Trinity Takes On Intel's HD Graphics : Read more
mayankleoboy1
September 27, 2012 5:39:13 AM
Consoles set the bar for game developers. These iGPUs are comparable to the consoles, and that's why games will run smoothly here.
With next-gen consoles coming out next year, game devs will target them. Hence the minimum standard for games will rise, making next-gen games much slower on these iGPUs. So both AMD and Intel will have to increase performance much more over the next 1-2 years.
tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
mousseng
September 27, 2012 5:50:19 AM
mayankleoboy1 said:
tl;dr : next gen games will run poorly on these igpu's as next gen consoles will set the minimum performance standard.

Keep in mind, though, that that's exactly what's going to allow AMD and Intel to advance their hardware faster than games will, as they discussed in the article (first page of the interview). Look how far Fusion and HD Graphics have come over the past three years, and look how long the previous console generation lasted. If that trend is anything to go by, I'm sure integrated graphics could easily become a viable budget gaming option in the next few years.
luciferano
September 27, 2012 5:54:14 AM
mayankleoboy1 said:
Consoles set the bar for game developers. These iGPU's are comparable to the consoles and thats why games will run smooth here. With next gen consoles coming out next year, game devs will target them. Hence the minimum standard for games will rise, making the next gen games much slower on the iGPU's. So both AMD and Intel will have to increase performance much more in the next 1-2 years. tl;dr : next gen games will run poorly on these igpu's as next gen consoles will set the minimum performance standard.

Actually, the A10 and A8 have somewhat superior graphics compared to current consoles. Current consoles can't even handle 720p as well as these AMD IGPs handled 1080p, despite being a more optimized platform, so this seems fairly obvious to me. Also, new games would simply mean dropping the resolution on these APUs. They wouldn't be unable to play new games; they just probably couldn't keep to 1080p and would fall back to something like 1600x900 instead.
Intel probably isn't very motivated by gaming performance for their IGPs, but they're supposedly making roughly 100% performance gains per generation with their top-end IGPs anyway, so IGP performance is growing regardless. AMD also gets to use GCN in their next APU, and I don't think I need to explain the implications there, especially if they go the extra mile and use their high-density library tech too.
Menigmand
September 27, 2012 6:45:17 AM
If Intel/AMD can convince most mainstream buyers that this is "good enough", and the next generation of consoles runs for 10+ years, could this be the end of dedicated graphics?
With market share going down, there could be less economy of scale and less investment, leading to stagnation and very high prices.
For some time you would still be able to buy a dedicated GPU, but it would be a niche product that costs an arm and a leg, and soon hardware support would dwindle as producers move to smaller form factors.
army_ant7
September 27, 2012 6:57:44 AM
mayankleoboy1 said:
Consoles set the bar for game developers. These iGPU's are comparable to the consoles and thats why games will run smooth here.With next gen consoles coming out next year, game devs will target them. Hence the minimum standard for games will rise, making the next gen games much slower on the iGPU's. So both AMD and Intel will have to increase performance much more in the next 1-2 years.
tl;dr : next gen games will run poorly on these igpu's as next gen consoles will set the minimum performance standard.
I'm not sure it's accurate to say that consoles play on a game's absolute minimum settings, disregarding resolution. With that in mind, the PC versions would still have graphics options to turn down compared to how the console versions are configured, I would think.
I do wonder how well these Trinity APUs typically overclock, and how they'd perform that way, with their RAM overclocked to a reasonable level to feed the extra graphics processing power.
More so, I'm wondering if the PSCheck method, where you manipulate core P-states, would have a substantial effect in mainly dual-threaded titles.
I'd also like to see whether Dual Graphics scales better and has a wider compatibility range than Llano's.
luciferano
September 27, 2012 7:03:30 AM
EzioAs said:
I like the performance improvement in graphics, but I wish it was a little better. Maybe 20% more, but hey, at least it's improving.

They did what they could on the 32nm process node they had to stick to. Kaveri, assuming it really does have GCN, will undoubtedly make much bigger improvements over Trinity than Trinity did over Llano.
americanbrian
September 27, 2012 10:15:29 AM
I have to call out the review on its choice of RAM as well. DDR3-1866 does not incur a significant price premium and would show the "TRUE" performance available to adopters of the AMD solution.
It seems unfair that you would make this choice and bias the results to favor Intel. When you tested the first i-series chips with triple-channel memory, you enabled that feature (correctly), as it is a feature of the hardware you are testing.
Here there is a feature of the AMD hardware you have chosen to ignore. Not cool...
chesteracorgi
September 27, 2012 12:46:10 PM
I doubt that any serious gaming enthusiast would constrain himself to an APU/IGP for the foreseeable future. The advance of integrated graphics is impressive, but it still can't touch even low-to-mid-range discrete GPUs. My recent build for my grandson is limited to the IGP on the i3-3225 until he shows an interest in progressing to higher-end gaming. If and when my grandkids start playing the likes of WoW, Skyrim, or Crysis, I'll take the plunge on an AMD 7000 or Nvidia 600 series card. But even before we get there, we'll cannibalize my GTX 470 SLI setup to see if that provides sufficient horsepower.
Nevertheless, kudos to AMD on its Trinity platform, but I am interested to see how it competes with the IB platform when you add a discrete GPU.
Wisecracker
+10 to THG for using the MP Holy Grail
What is amazing about the APU graphics is how they perform on older titles. Not all folks are interested in purchasing $50-$60 new games but are easily persuaded by $10-$20 bargains on the discount rack.
And as noted, not everyone is motivated by 1920x1080 at super detail. Dropping back to any lower 16:9 can greatly improve game play -- even for those who like higher settings.
And, not that I want to create extra work for you guys (!), but it would be interesting to see how Trinity stacks up against the *old* integrated Radeon HD4250 IGP -- just for snits and giggles on a few titles.
It's hard to judge how far we have come in a few short years without seeing where we were ...
Reynod
A good article.
I'd like to see how the AMD A10s and the Intel CPUs with HD 4000 go with a decent overclock.
If you could get about a 30% increase in framerate in the more taxing games, then I'd consider making up a SFF gaming box (well, a few actually, for network gaming) as an entry-level gaming machine.
Currently I use 3 x M405 Toshibas for this sort of thing ... but we are restricted to games that are pretty basic ... we are Freelancer addicts!!!
Our E-450 notebook is better ... but something like one of these might fit the bill a lot better.
I'd rather make up a SFF box with a decent monitor than use notebooks ... though the price on notebooks has dropped a lot.
Anyway ... AMD's superior graphics will surely make Intel do some work on their replacement for the HD 4000 ... I expect a healthy performance boost in their next offering shortly.
luciferano
September 27, 2012 1:13:30 PM
americanbrian said:
I have to call out the review on choice of RAM as well. 1866 does not incur a significant price premium and would show the "TRUE" performance available to adopters of the AMD solution. It seems unfair that you would make this choice to bias the results to favor intel. When you tested the first i-series chips with triple channel memory you enabled that feature (correctly) as it is a feature of the hardware you are testing. Here there is a feature of the AMD hardware you have chosen to ignore. Not cool...

To be fair, when Tom's did that for Intel's X58 platform, it was the very highest-end consumer platform available. This is AMD's lowest-end desktop platform, where an entire computer based on it can be cheaper than a motherboard plus the cheapest CPU X58 supported at launch (although, to be fair, that's also related to prices on a lot of other components coming down), and far cheaper than the high-end six-core CPUs for that platform.
Also, DDR3-1866 is not always as cheap as even DDR3-1600. You make a good point that it should be cheap enough to have been used, but maybe it wasn't when Tom's got the hardware for these tests, or maybe they simply didn't have anything else available. Tom's did run tests with RAM up to DDR3-1866 in some Trinity reviews, so you can at least get a good idea of how much it would help. It's a roughly 17% jump in frequency and bandwidth, and real-world performance probably improves by a little under half of that, so maybe 6 to 8%; it's unlikely to be much less or much more. It would have been nice, but I don't think it'd be a game-changer. Going from 1600 up to 2133 would probably be more substantial.
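That scaling estimate can be sketched as a quick back-of-envelope calculation. The 45% sensitivity factor is an assumption taken from the "a little under half of the bandwidth gain" reasoning in the comment, not a measured number:

```python
# Back-of-envelope sketch of the memory-scaling estimate discussed above.
# Assumption (not a benchmark): iGPU frame rates gain roughly 45% of the
# memory-bandwidth increase.

def estimated_gain(old_mts: int, new_mts: int, sensitivity: float = 0.45) -> float:
    """Estimated % frame-rate gain from a DDR3 transfer-rate bump."""
    bandwidth_gain = new_mts / old_mts - 1.0  # bandwidth scales linearly with MT/s
    return bandwidth_gain * sensitivity * 100

print(f"DDR3-1600 -> 1866: ~{estimated_gain(1600, 1866):.1f}% estimated gain")
print(f"DDR3-1600 -> 2133: ~{estimated_gain(1600, 2133):.1f}% estimated gain")
```

This lands at roughly 7.5% for the jump to 1866, in line with the "6 to 8%" ballpark, and about double that for 2133, which is why the bigger jump looks more worthwhile.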
luciferano
September 27, 2012 1:17:49 PM
technoholic said:
i'm not impressed by this performance in a desktop machine although these trinities can be good for a laptop. AMD needs more steps forward to make these reasonable for desktop user, a couple of generations more

Trinity uses Piledriver and is about 15% faster than Bulldozer per clock, while also hitting higher frequencies at a given amount of power consumption than Bulldozer, an architecture that also happens to have an 8MiB L3 cache (even if a slow one) whereas Trinity has none. The graphics made a good improvement over Llano. With good memory, Trinity's A10s can probably match a Radeon 6670 DDR3 in gaming performance quite well, whereas Llano could only hope to get close. It's not a huge leap in CPU or GPU, but it's a pretty good improvement, and the actual desktop CPUs (Vishera) are (hopefully) coming out soon enough and should be faster than Trinity.
belardo
September 27, 2012 1:52:34 PM
Wisecracker said:
...What is amazing about the APU graphics is how they perform on older titles. Not all folks are interested in purchasing $50-$60 new games but are easily persuaded by $10-$20 bargains on the discount rack. And as noted, not everyone is motivated by 1920x1080 at super detail. Dropping back to any lower 16:9 can greatly improve game play -- even for those who like higher settings...

This. Sure, this doesn't impress the "enthusiast" market, but we're a very small percentage in absolute numbers, and therefore also in total dollars available, even if we prefer $300+ graphics cards (and always will). Even though I have no desire to buy one, this kind of thing makes AMD's plan to focus on the APU look like a great business decision every time it comes up.
bawchicawawa
September 27, 2012 2:03:16 PM
mousseng said:
Keep in mind, though, that that's exactly what's going to allow AMD and Intel to advance their hardware faster than games will, as they were discussing in the article (first page of the interview). Look how far Fusion and HD Graphics have come over the past 3 years, and look how long the previous console generation lasted - if that trend is anything to go by, I'm sure integrated graphics could easily become a viable budget gaming option in the next few years.

Of course, and look how far integrated graphics have already come. 23 fps in BF3 at 1080p? On integrated? Sounds pretty beastly. I'm sure it will be a LOT better in 2014.
oomjcv
September 27, 2012 2:46:24 PM
According to this article: , AMD has had a say in this 'review'...
Is this in fact the case? To what extent was this article influenced, if so? If articles like this are to be published, I think it should be clearly mentioned.
I'm an AMD and Tom's Hardware fan and would prefer things like this rather not happen. I - and I assume other readers - expect reviews to be honest, transparent and independent.
bystander
americanbrian said:
I have to call out the review on choice of RAM as well. 1866 does not incur a significant price premium and would show the "TRUE" performance available to adopters of the AMD solution. It seems unfair that you would make this choice to bias the results to favor intel. When you tested the first i-series chips with triple channel memory you enabled that feature (correctly) as it is a feature of the hardware you are testing. Here there is a feature of the AMD hardware you have chosen to ignore. Not cool...

They've talked about this in the past, and they basically said they choose not to use premium memory because these chips are not meant for enthusiasts. These are budget systems, and budget buyers use budget parts.
luciferano
September 27, 2012 2:59:54 PM
oomjcv said:
According to this article: , AMD has had a say in this 'review'... Is this in fact the case? To what extent was this article influenced, if so? If articles like this are to be published I think it should be clearly mentioned. I'm an AMD and Toms Hardware fan and would prefer things like this rather not happen, I - and I assume others readers - expect reviews to be honest, transperant and independant.

I doubt that AMD has the power to make APUs perform better than they can, unless they used a new driver without telling us. I see your point, but I don't see what AMD could have done to influence this except to encourage Tom's to test at 1080p or something like that, and if Tom's didn't, they could simply increase settings at the lower resolutions and the results would be pretty much the same, just without being able to say that AMD can perform at 1080p in many games even on their IGPs.
The settings have to be set too low to use anything that could overstate the performance differences. None of these IGPs has the performance to work with settings that change the picture the way a comparison of high-end AMD and Nvidia cards would, say one run with heavy tessellation and one with heavy AA, to show the different performance characteristics. These IGPs simply look too weak for something like that.
americanbrian
September 27, 2012 3:10:13 PM
luciferano said:
To be fair, when Tom's did that for Intel's X58 platform, it was the very highest end consumer platform available. This is AMD's lowest end desktop platform where an entire computer based on it can be cheaper than a motherboard plus the lowest end CPU that was supported by X58 at launch (although to be fair, that's also related to prices on a lot of other components going down too, but still) and far cheaper than the high-end six-core CPUs for that platform. Also, 1866 memory is not always as cheap as even 1600 memory. I think that you make a good point in that it should be cheap enough to have been used, but maybe it wasn't when Tom's got the hardware for these tests and maybe they simply didn't have anything else available. Tom's did other tests with RAM up to DDR3-1866 in some Trinity reviews, so maybe you can at least get a good guess of how it would help. It's a roughly 17% frequency jump and it probably increases bandwidth by a little under half of that, so maybe 6 to 8%, and performance is unlikely to increase by much less or much more than that. It would have been nice, but I don't think that it'd be a game-changer in performance. Maybe going up to 2133 would be more substantial coming from 1600.

I was maybe a little harsh, in that they do not entirely ignore the fact that more performance is available. However, if you bother to read the link posted in the review (which most people WON'T, which is why I called them out), you see that they have RAM available in their lab which they clock to 1866. It is there, sitting right in the lab.
And I don't know about where you are, but the cost difference for 8GB of DDR3-1866 vs. 1600 is about $5 US or £3 GBP. Nothing, really.
In the linked benchmarks we see performance scaling in only one game (WoW). I take issue with them choosing 1600 for the rest of that review too. It is disingenuous in that an inexperienced person looking here for guidance may duplicate that choice when, for 1% of the total cost of the system ($5 of $500), they could realise an 8% gain in gaming performance.
This is not really made clear, and it is not fair to the average reader...
oomjcv
September 27, 2012 7:52:00 PM
luciferano said:
I doubt that AMD has the power to make APUs perform better than they can unless they used a new driver without telling us. I see your point, but I don't see what AMD could have done to influence this except to encourage Tom's to do the tests at 1080p or something like that and if Tom's didn't, then they could simply increase settings at the lower resolutions and results would be pretty much the same, just without being able to say that AMD can perform in 1080p in many games even on their IGPs. The settings have to be set too low to use any settings that can overstate the performance differences because none of these IGPs have the performance to work with settings that change things around like if you were to do a set of comparisons of AMD and Nvidia high end cards, one with huge tessellation and one with huge AA, to show the performance characteristic differences. These IGPs look like they're simply too weak to do something like that.
Something went wrong with my link, here's the article I'm referring to: techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info
luciferano
September 27, 2012 8:04:17 PM
americanbrian said:
I was maybe a little harsh, in that they do not entirely ignore the fact that there is more performance available. However, if you bother to read the link posted in the review (which most people WON'T, which is what makes me call them out) you see that they have RAM available in there lab which they clock to 1866. It is there, sitting right in the lab. And I don't know about where you are but the cost for 8GB of 1866 vs 1600 ddr3 is about $5 US or £3 GBP. Nothing really. From the linked to benchmarks we see scaling in performance only in one game (WoW). I take issue with them choosing the 1600 for the rest of that review too. It is disingenuous in that an inexperienced person looking here for guidance may choose to duplicate that choice when for 1% of the total cost of the system ($5 of $500) they can realise an 8% total gain in gaming performance. This is not really made clear and it is not fair to the average reader...

Hmm... Perhaps you were a little harsh, but maybe I wasn't harsh enough either, now that I've read and thought about this post.
Also, about pricing on 1866 memory:
http://pcpartpicker.com/parts/memory/#v=1500&z=8192&t=1...
The cheapest quality 8GB DDR3-1600 9-9-9-24 kit is $31 (after MIR, but still).
The cheapest quality 8GB DDR3-1866 kit has 9-10-9-27 timings and is $42 (also after MIR).
Also, I only counted 1.5V kits and didn't check whether there are better prices on higher-voltage kits. I wouldn't want anything above 1.5V, at least not with these APUs. One thing going for the 1866 kits is that they have a good chance of overclocking to DDR3-2133 without unreasonable timings, maybe even at the stock 1.5V or at least below 1.6V. The difference in performance between 1600 and 1866 isn't huge, but 1600 to 2133 is probably a greater boost. I've had better luck overclocking most 1866 kits to 2133 than getting 1600 kits to 1866 without voltage hikes or bad timings, and that could be more incentive.
Still, as a share of a whole system's cost, that price difference is well under the performance difference, so despite my semantic ramblings, you do seem to be correct.
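To put that trade-off in rough numbers (the $500 system total is a hypothetical taken from americanbrian's example, and the kit prices are the ones quoted in this comment):

```python
# Hypothetical figures from the thread: $31 vs $42 for an 8GB kit,
# a $500 budget build, and a ~6-8% iGPU gaming gain from DDR3-1600 -> 1866.
kit_1600, kit_1866, system_cost = 31, 42, 500

premium = kit_1866 - kit_1600
premium_share = premium / system_cost * 100

print(f"RAM premium: ${premium} (~{premium_share:.1f}% of the whole build)")
print("Estimated gaming gain: ~6-8%, several times that cost share")
```

So even at the $11 kit-to-kit premium rather than americanbrian's $5, the extra cost is around 2% of the build for a mid-single-digit performance gain.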
luciferano
September 27, 2012 8:21:06 PM
oomjcv said:
Something went wrong with my link, here's the article I'm referring to: techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info

I read the article, but all it said was that AMD asked some sites to refrain from posting non-gaming benchmarks. It didn't say anything about slanting the benchmarks that were used, and Tom's ran a review about four months ago that was full of non-gaming benchmarks anyway. This was just a follow-up with a gaming focus, as AMD asked for, and it shouldn't be held against Tom's, because Tom's had already done a CPU performance review of these desktop Trinity APUs.