Can The Latest Integrated Graphics Engines Game At 1080p?
Seemingly, the only number that matters in the living room today is 1080. Practically every TV sold now must handle 1080p. Blu-ray content is obviously 1080p. So are the HD offerings of Vudu, Netflix Watch Instantly, and YouTube. The Xbox 360 and even Apple TV output at 1080p.
We love the way content looks running at a native 1080p from the couch. But video is only half of the living room entertainment equation. What about gaming? That becomes a surprisingly tricky question.
When we talk about an enjoyable gaming experience at 1920x1080, be it in a home theater environment or on a 24" desktop monitor, we have to discuss the titles themselves and the hardware running them. In the past, achieving playable frame rates in a modern game at 1920x1080 almost necessitated discrete graphics. But when you start veering off into the realm of HTPCs and other more compact form factors, an add-in upgrade might not be an option, particularly if you're also looking for quiet acoustics.
If we continue adding on to our wish list of what a capable machine should be able to do, excelling at video playback and accelerating the encode pipeline have to be included as well. We've simply been spoiled by Intel's Quick Sync technology. But there's no question we need to see compelling gaming performance at 1920x1080. And that's in modern, best-selling games, not Bejeweled or mahjong.
Would it be too much to expect all of that for less than $500? In the past, it would have at least been challenging. Paul Henningsen demonstrated to us that it was possible to construct a competent gaming machine in our most recent System Builder Marathon (System Builder Marathon, August 2012: $500 Gaming PC). But that certainly wasn't something we'd want sitting in an entertainment center, nor was it meant to be.
But what if it was possible to get decent results from an on-die graphics engine? AMD was close with its Llano-based platform (AMD A8-3850 Review: Llano Rocks Entry-Level Desktops), though we've had to conclude on multiple occasions that it's just not fast enough for most mainstream games. Chris Angelini delivered an exclusive preview of the company's next-gen APUs in AMD Trinity On The Desktop: A10, A8, And A6 Get Benchmarked! that showed gaming performance improving notably.
In a follow-up (AMD Desktop Trinity Update: Now With Core i3 And A8-3870K), Chris added benchmark results from Intel's HD Graphics 3000 and 2000 engines, which trailed hopelessly in every gaming workload he threw at them. Clearly, the Sandy Bridge generation simply wasn't cut out for the quality settings and resolutions we were looking for. Unfortunately, Intel wasn't yet selling price-comparable CPUs with either HD Graphics 4000 or 2500, so Chris couldn't gauge the progress of its Ivy Bridge architecture. That situation changed recently, though, when Core i3 CPUs started surfacing with both new graphics configurations.
With both AMD and Intel beefing up the graphics horsepower on their respective processors, are we finally at a point where it's possible to game at 1920x1080 in the living room without needing a chunky chassis with room for an add-in card?
Sorting Through Chip Choices
Let’s be clear. We know that it’s possible to get amazing 1080p performance from discrete graphics, and you don't even necessarily need to add fan noise. A card like Sapphire’s Ultimate Radeon HD 6670 1 GB can do the job passively. But it also tacks an additional $85 or so onto your system price. Today's story asks if the latest built-in graphics engines deliver playable performance at 1920x1080 without the help of a PCI Express-based upgrade.
In one corner, we have Intel’s Ivy Bridge-based Core i3 CPUs. As of this writing, the only desktop Core i3 shipping with the HD Graphics 4000 engine is the company's -3225 with 3 MB of shared L3 cache, a 3.3 GHz clock rate, a 55 W TDP, and a $144 price tag. The Core i3-3220, which sells for $125, is essentially the same chip, except that it has HD Graphics 2500 instead. Both processors are manufactured using 22 nm lithography and feature two physical cores with Hyper-Threading enabled. Reflecting Intel’s segmentation strategy, these mainstream Core i3s lack features like Turbo Boost and AES-NI.
| Make | Model | Cores / Threads | CPU Freq. | Max. Turbo | GPU | Memory Support | TDP |
|------|-------|-----------------|-----------|------------|-----|----------------|-----|
| AMD | A8-5600K | 4 / 4 | 3.6 GHz | 3.9 GHz | Radeon HD 7560D | DDR3-1866 | 100 W |
| AMD | A10-5800K | 4 / 4 | 3.8 GHz | 4.2 GHz | Radeon HD 7660D | DDR3-1866 | 100 W |
| Intel | Core i3-3220 | 2 / 4 | 3.3 GHz | N/A | HD Graphics 2500 | DDR3-1600 | 55 W |
| Intel | Core i3-3225 | 2 / 4 | 3.3 GHz | N/A | HD Graphics 4000 | DDR3-1600 | 55 W |
AMD’s A-series APUs are priced similarly, aiming at the same entry-level demographic. What AMD lacks in manufacturing technology (Trinity continues to leverage the company's 32 nm node) it remedies with a much more potent graphics architecture.
Simply put, our emphasis here is on the highest-end models in each product family, hoping that we're able to achieve playable performance at our desired resolution using nothing but a CPU with integrated graphics.
The image above illustrates AMD's gaming message as it prepares to roll out its Trinity-based APUs. Those look like pretty incredible gains, don't they? A quick glance at the fine print, however, reveals that these internally run benchmarks pit the A10-5800K against Intel's last-gen Core i3-2120 with HD Graphics 2000, which amounts to a worst-case scenario for Intel. Not only are there Sandy Bridge-based Core i3s with HD Graphics 3000 available, but also Ivy Bridge-based chips with HD Graphics 2500 and 4000.
Our goal is to set up a fairer fight. And just as we elected to pit the new HD Graphics 4000-equipped Core i3-3225 against its less-endowed -3220 cousin, we also decided to throw the A10 into the ring with AMD's A8-5600K, another Trinity-based part running at 3.6 GHz and armed with fewer shader units operating at a slightly lower clock rate.
We do have to point out, though, that both Intel chips feature a 55 W TDP. AMD's Trinity-based parts maintain Llano's status quo with 100 W thermal ceilings. Under load, AMD's reference cooler is noticeably louder, a factor we have to take into account when we consider putting this hardware in the living room.