The Elder Scrolls V: Skyrim employs DirectX 11 support to enhance performance. The Creation Engine used by Bethesda is the same one used for The Elder Scrolls IV: Oblivion, and there's been little to no visual improvement between generations. However, the faster implementation pays dividends for our purposes.

In AMD Desktop Trinity Update: Now With Core i3 And A8-3870K, we benchmarked with The Elder Scrolls V: Skyrim and discovered that the Medium quality preset simply was not playable on any of the processors we tested at 1920x1080 (the A10 and A8 were arguably pretty fluid at 1280x720, though).
Dropping quality all the way to the Low preset does give AMD's new APUs a lot more room to breathe at our target resolution. It's just a bummer that the graphics quality is so terrible down there. Our benchmarks suggest this game is accessible to Trinity, but we'd strongly recommend an upgrade to discrete graphics and a step up to at least the Medium quality preset.
Intel's Core i3s, on the other hand, cannot muster playable performance, even at this game's most entry-level settings.

As we've described in the past, Deus Ex: Human Revolution excels in two key areas: amazing story quality and excellent use of the anti-aliasing modes made available through DirectX 11. Fortunately, Eidos Montreal made sure the game was GPU-friendly, even without DX 11 assistance. Our earlier testing with discrete graphics showed that Human Revolution is playable at 1920x1080 with medium details and 8x AF enabled across a broad spectrum of last year's graphics cards. So, it comes as no surprise that three of our four integrated engines turn in average frame rates in excess of 30 FPS.
The AMD A10-5800K averages close to 50 FPS, and its minimum dips to 37 FPS. That might leave a little bit of headroom for more demanding settings, but we wouldn't push this title much further. Intel's Core i3-3225 is barely able to deliver an average in excess of 30 FPS, and its 23 FPS minimum is a little low for our liking. You can expect frequent stutters, even with all detail settings as low as they go.
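For readers curious how the average and minimum FPS figures quoted throughout these results are typically derived, here is a minimal sketch of the usual reduction over a per-frame render-time log. The frame-time samples below are hypothetical, not data from our benchmark runs:

```python
# Hypothetical per-frame render times, in milliseconds.
frame_times_ms = [20.0, 21.5, 19.8, 27.0, 43.5, 22.1]

# Average FPS: total frames rendered divided by total elapsed seconds.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Minimum FPS: the single slowest frame sets the floor,
# which is why one long stutter drags this number down.
min_fps = 1000.0 / max(frame_times_ms)

print(f"avg: {avg_fps:.1f} FPS, min: {min_fps:.1f} FPS")
```

The minimum matters because a healthy average can hide a few very slow frames, which is exactly why we flag the Core i3's low minimum despite its passable average.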
Can The Latest Integrated Graphics Engines Game At 1080p?


AMD really delivers stinging jabs at Intel with its APUs. I hope the pricing will be OK.
With next-gen consoles coming out next year, game devs will target them. Hence, the minimum standard for games will rise, making next-gen games much slower on these iGPUs. So both AMD and Intel will have to increase performance much more over the next 1-2 years.
tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
Keep in mind, though, that that's exactly what's going to allow AMD and Intel to advance their hardware faster than games will, as they were discussing in the article (first page of the interview). Look how far Fusion and HD Graphics have come over the past 3 years, and look how long the previous console generation lasted - if that trend is anything to go by, I'm sure integrated graphics could easily become a viable budget gaming option in the next few years.
Actually, the A10 and A8 have somewhat superior graphics compared to current consoles. Current consoles can't even run games at 720p as well as these AMD IGPs handled 1080p, despite being a more optimized platform, so this seems kinda obvious IMO. Also, new games would simply mean dropping resolution for these APUs. They wouldn't be unable to play new games, just probably not at 1080p; more like 1600x900 or 1680x1050.
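As a rough back-of-the-envelope illustration of the resolution argument above (shading and fill-rate cost scale roughly with pixel count, so dropping resolution frees a roughly proportional amount of GPU headroom; the resolutions are the ones mentioned in this thread):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {
    "1920x1080": 1920 * 1080,
    "1680x1050": 1680 * 1050,
    "1600x900": 1600 * 900,
    "1280x720": 1280 * 720,
}

base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    # Fraction of the 1080p per-frame pixel workload.
    print(f"{name}: {pixels / base:.0%} of the 1080p pixel load")
```

Stepping down from 1920x1080 to 1600x900, for instance, cuts the pixel workload by roughly 30%, which is why a resolution drop is usually the cheapest way to claw back playable frame rates on an IGP.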
Intel probably isn't very motivated by gaming performance for its IGPs, yet it's supposedly making roughly 100% performance gains per generation with its top-end IGPs anyway, so IGP performance is clearly still growing. AMD also gets to use GCN in its next APU, and I don't think I need to explain the implications there, especially if they go the extra mile and use their high-density library tech too.
How about one more article with Ivy Bridge i3s and the 6570 on both setups? I want to see how much better gaming performance will be with AMD's hybrid cards.
> AMD really delivers stinging jabs at Intel with its APUs. I hope the pricing will be OK.
With market share going down, there could be less economy of scale and less investment, leading to stagnation and very high prices.
For some time, you will still be able to buy a dedicated GPU, but it will be a niche product that costs you an arm and a leg, and soon hardware support will dwindle as producers move to smaller form factors.
Server -> Intel
Mobile -> ARM
Console -> ??? (AMD should play in this area)
> With next-gen consoles coming out next year, game devs will target them. Hence, the minimum standard for games will rise, making next-gen games much slower on these iGPUs. So both AMD and Intel will have to increase performance much more over the next 1-2 years.
> tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
I'm not sure it's accurate to say that consoles play on a game's absolute minimum settings, disregarding resolution. With that in mind, the PC versions would still have graphics options to tune down below what the console versions are configured to use, I would think.
I do wonder how well these Trinity APUs could typically overclock, and how they'd perform there, along with their RAM overclocked to a reasonable level to keep up with the extra graphics processing power.
More so, I'm wondering if the PSCheck method of manipulating core P-states would have a substantial effect in mainly dual-threaded titles.
Also, I'd like to see whether Dual Graphics scales better and has a wider compatibility range than Llano's.
They did what they could on the 32nm process node they had to stick with. Kaveri, assuming it's true that it has GCN, will undoubtedly make much bigger improvements over Trinity than Trinity did over Llano.