Gaming At 1920x1080: AMD's Trinity Takes On Intel HD Graphics
Think you're pretty snazzy because your integrated graphics core plays mainstream games at 1280x720? We're on to bigger and better things, like modern titles at 1920x1080. Can AMD's Trinity architecture push high-enough frame rates to make this possible?
Can The Latest Integrated Graphics Engines Game At 1080p?
Seemingly, the only number that matters in the living room today is 1080. Practically every TV sold now must handle 1080p. Blu-ray content is obviously 1080p. So are the HD facets of Vudu, Netflix Watch Instantly, and YouTube. The Xbox 360 and even Apple TV output at 1080p.
We love the way content looks running at a native 1080p from the couch. But video is only half of the living room entertainment equation. What about gaming? That becomes a surprisingly tricky question.
When we talk about an enjoyable gaming experience at 1920x1080, be it in a home theater environment or on a 24" desktop monitor, we have to discuss the titles themselves and the hardware running them. In the past, achieving playable frame rates in a modern game at 1920x1080 almost necessitated discrete graphics. But when you start veering off into the realm of HTPCs and other more compact form factors, an add-in upgrade might not be an option, particularly if you're also looking for quiet acoustics.
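To put that jump in perspective, a quick back-of-the-envelope calculation shows why 1920x1080 is so much more demanding than 1280x720. The short Python sketch below works out the raw pixel counts and the per-frame time budget; the 30 and 60 FPS targets are purely illustrative playability thresholds we've assumed, not figures from our testing.

```python
# Back-of-the-envelope look at why 1920x1080 is harder to drive than 1280x720.
# The 30/60 FPS targets are illustrative assumptions, not benchmark results.

RES_720P = (1280, 720)
RES_1080P = (1920, 1080)

pixels_720p = RES_720P[0] * RES_720P[1]      # 921,600 pixels per frame
pixels_1080p = RES_1080P[0] * RES_1080P[1]   # 2,073,600 pixels per frame

print(f"1080p pushes {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")  # 2.25x

for fps in (30, 60):
    budget_ms = 1000 / fps  # time available to render each frame
    print(f"At {fps} FPS, the GPU has {budget_ms:.1f} ms to draw {pixels_1080p:,} pixels")
```

The takeaway: 1080p asks the graphics engine to fill 2.25 times as many pixels per frame within the same time budget, which is exactly why integrated parts that handle 720p comfortably have historically struggled one notch higher.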
If we continue adding on to our wish list of what a capable machine should be able to do, excelling at video playback and accelerating the encode pipeline have to be included as well. We've simply been spoiled by Intel's Quick Sync technology. But there's no question we need to see compelling gaming performance at 1920x1080. And that's in modern, best-selling games, not Bejeweled or mahjong.
Would it be too much to expect all of that for less than $500? In the past, it would have at least been challenging. Paul Henningsen demonstrated to us that it was possible to construct a competent gaming machine in our most recent System Builder Marathon (System Builder Marathon, August 2012: $500 Gaming PC). But that certainly wasn't something we'd want sitting in an entertainment center, nor was it meant to be.
But what if it was possible to get decent results from an on-die graphics engine? AMD was close with its Llano-based platform (AMD A8-3850 Review: Llano Rocks Entry-Level Desktops), though we've had to conclude on multiple occasions that it's just not fast enough for most mainstream games. Chris Angelini delivered an exclusive preview of the company's next-gen APUs in AMD Trinity On The Desktop: A10, A8, And A6 Get Benchmarked! that showed gaming performance improving notably.
In a follow-up (AMD Desktop Trinity Update: Now With Core i3 And A8-3870K), Chris added benchmark results from Intel's HD Graphics 3000 and 2000 engines, which trailed hopelessly in every gaming workload he threw at them. Clearly, the Sandy Bridge generation simply wasn't cut out for the quality settings and resolutions we were looking for. Unfortunately, Intel wasn't yet selling price-comparable CPUs with either HD Graphics 4000 or 2500, so Chris couldn't gauge the progress of its Ivy Bridge architecture. That situation changed recently, though, when Core i3 CPUs started surfacing with both new graphics configurations.
With both AMD and Intel beefing up the graphics horsepower on their respective processors, are we finally at a point where it's possible to game at 1920x1080 in the living room without needing a chunky chassis with room for an add-in card?
Sorting Through Chip Choices
Let’s be clear. We know that it’s possible to get amazing 1080p performance from discrete graphics, and you don't even necessarily need to add fan noise. A card like Sapphire’s Ultimate Radeon HD 6670 1 GB can do the job passively. But it also tacks an additional $85 or so onto your system price. Today's story asks if the latest built-in graphics engines deliver playable performance at 1920x1080 without the help of a PCI Express-based upgrade.
In one corner, we have Intel’s Ivy Bridge-based Core i3 CPUs. As of this writing, the only desktop Core i3 shipping with the HD Graphics 4000 engine is the company's -3225 with 3 MB of shared L3 cache, a 3.3 GHz clock rate, a 55 W TDP, and a $144 price tag. The Core i3-3220, which sells for $125, is essentially the same chip, except that it has HD Graphics 2500 instead. Both processors are manufactured using 22 nm lithography and feature two physical cores with Hyper-Threading enabled. Reflecting Intel’s segmentation strategy, these mainstream Core i3s lack features like Turbo Boost and AES-NI.
Make | Model | Cores / Threads | CPU Freq. | Max. Turbo | Graphics | Memory Support | TDP
---|---|---|---|---|---|---|---
AMD | A8-5600K | 4 / 4 | 3.6 GHz | 3.9 GHz | Radeon HD 7560D | DDR3-1866 | 100 W
AMD | A10-5800K | 4 / 4 | 3.8 GHz | 4.2 GHz | Radeon HD 7660D | DDR3-1866 | 100 W
Intel | Core i3-3220 | 2 / 4 | 3.3 GHz | N/A | HD Graphics 2500 | DDR3-1600 | 55 W
Intel | Core i3-3225 | 2 / 4 | 3.3 GHz | N/A | HD Graphics 4000 | DDR3-1600 | 55 W
AMD’s A-series APUs are priced similarly, aiming at the same entry-level demographic. What AMD lacks in manufacturing technology (Trinity continues to leverage the company's 32 nm node) it remedies with a much more potent graphics architecture.
Simply put, our emphasis here is on the highest-end models in each product family, hoping that we're able to achieve playable performance at our desired resolution using nothing but a CPU with integrated graphics.
The image above illustrates AMD's gaming message as it prepares to roll out its Trinity-based APUs. Those look like pretty incredible gains, don't they? A quick glance at the fine print, however, reveals that these internally-run benchmarks put the A10-5800K against Intel's last-gen Core i3-2120 with HD Graphics 2000, which amounts to a worst-case scenario for Intel. Not only are there Sandy Bridge-based Core i3s with HD Graphics 3000 available, but also Ivy Bridge-based chips with HD Graphics 2500 and 4000.
Our goal is to set up a fairer fight. And just as we elected to pit the new HD Graphics 4000-equipped Core i3-3225 against its less-endowed -3220 cousin, we also decided to throw the A10 in the ring with AMD's A8-5600K, another Trinity-based part clocked at 3.6 GHz and armed with fewer shader units running at a slightly lower frequency.
We do have to point out, though, that both Intel chips feature a 55 W TDP. AMD's Trinity-based parts maintain Llano's status quo with 100 W thermal ceilings. Under load, AMD's reference cooler is noticeably louder, a factor we have to take into account when we consider putting this hardware in the living room.
azathoth Seems like a perfect combination for a casual PC gamer; I'm just curious as to the price of the Trinity APUs.
luciferano They both have graphics that have HD in their name, but AMD's HD graphics are more *HD*, lol.
Nintendo Maniac 64 Err... did we really need both the A10-5800K and the A8-5600K? Seeing how both are already 100 W unlocked CPUs, surely something like an A10-5800K vs. a 65 W A10-5700 would have been more interesting for an HTPC environment...
mayankleoboy1 Consoles set the bar for game developers. These iGPUs are comparable to the consoles, and that's why games will run smoothly here.
With next-gen consoles coming out next year, game devs will target them. Hence the minimum standard for games will rise, making next-gen games much slower on these iGPUs. So both AMD and Intel will have to increase performance much more in the next 1-2 years.
tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
mousseng
9537609 said: tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
Keep in mind, though, that that's exactly what's going to allow AMD and Intel to advance their hardware faster than games will, as they were discussing in the article (first page of the interview). Look how far Fusion and HD Graphics have come over the past 3 years, and look how long the previous console generation lasted - if that trend is anything to go by, I'm sure integrated graphics could easily become a viable budget gaming option in the next few years.
falchard Since when has AMD or nVidia actually taken on Intel graphics? That's a bit insulting considering the disproportionate results time and time again.
luciferano
mayankleoboy1 said: Consoles set the bar for game developers. These iGPUs are comparable to the consoles, and that's why games will run smoothly here. With next-gen consoles coming out next year, game devs will target them. Hence the minimum standard for games will rise, making next-gen games much slower on these iGPUs. So both AMD and Intel will have to increase performance much more in the next 1-2 years. tl;dr: next-gen games will run poorly on these iGPUs, as next-gen consoles will set the minimum performance standard.
Actually, the A10 and A8 have somewhat superior graphics compared to current consoles. Current consoles can't even play at 720p as well as these AMD IGPs played at 1080p, despite being more optimized platforms, so it seems kinda obvious to me that this is true. Also, new games would simply mean dropping resolution for these APUs. They wouldn't be unable to play new games; they'd probably just drop from 1080p to 16xx by 900/10xx resolutions.
Intel probably isn't very motivated by gaming performance for their IGPs and they're supposedly making roughly 100% performance gains per generation with their top-end IGPs anyway, so they're working on growing IGP performance. AMD also gets to use GCN in their next APU and I don't think that I need to explain the implications there, especially if they go the extra mile with using their high-density library tech too.