Tegra 4: The Brains
As with Nvidia’s past SoCs, Tegra 4 still does not employ a unified shader architecture. Its 72 cores are split between 24 vertex and 48 pixel shaders, evolving from Tegra 3’s four vertex shaders and eight pixel shaders. The company says that, at this point, it simply makes the most sense to keep them separate, yielding better power efficiency than it could achieve with a unified design. Nvidia also points out that developing for its immediate-mode renderer is easier for ISVs already accustomed to working with the company’s architecture on the desktop.
The webcast introducing Tegra 4 made it clear that Nvidia is using four Cortex-A15 cores, but it wasn’t clarified until later that the fifth battery-saver core is also a Cortex-A15. In its previous generation, Nvidia stated that the fifth core was transparent to the operating system and applications. However, the companion core isn’t supported in Windows RT. We’ve heard claims that Microsoft doesn’t properly support the asymmetry between Tegra’s four processing cores (ramping up to 1.9 GHz) and its single power-saver core running at a slower maximum clock rate.
Nvidia was at least able to get around this in Android, so it won’t be an issue for Shield; the fifth core will play a key role in keeping platform power consumption down around that 1 W estimate at idle.
Though the cores themselves are standard Cortex-A15s, the memory controller is Nvidia’s own design. And whereas Tegra 3 (T30) employed a single-channel 32-bit pathway with support for up to DDR3L, Tegra 4 features a dual-channel interface supporting DDR3L and LPDDR3.
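Doubling the channel count doubles peak theoretical bandwidth at a given data rate. A quick sketch of that arithmetic, assuming a hypothetical 1600 MT/s LPDDR3 transfer rate (Nvidia hadn’t confirmed Tegra 4’s memory speeds at the time), looks like this:

```python
# Peak theoretical memory bandwidth: channels * bus width (bytes) * transfer rate
def peak_bandwidth_gbps(channels, bus_width_bits, transfers_per_s):
    return channels * (bus_width_bits / 8) * transfers_per_s / 1e9

# Tegra 3: single 32-bit channel; Tegra 4: dual 32-bit channels.
# The 1600 MT/s data rate is an assumption for illustration, not a
# confirmed Tegra 4 specification.
tegra3 = peak_bandwidth_gbps(1, 32, 1.6e9)
tegra4 = peak_bandwidth_gbps(2, 32, 1.6e9)
print(f"Tegra 3: {tegra3:.1f} GB/s, Tegra 4: {tegra4:.1f} GB/s")
```

Whatever the final clocks turn out to be, the move to dual channels doubles the interface’s ceiling relative to T30 at the same data rate.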
We’re hoping to go into greater depth with Nvidia on Tegra 4, specifically, in the weeks to come. As it pertains to Shield, the 28 nm HPL (low-power with high-k metal gates) SoC is balanced to minimize leakage at the expense of maximum performance, which we’re counting on to enable the battery life claims Jen-Hsun made on stage.
Now, Who’s Going To Use It?
As I talked to Nvidia’s team at CES, I was very clear that I’m probably not Shield’s ideal customer. I live in a single-story house, I sit in a nice Herman Miller chair in front of three 27” screens driven by a GeForce GTX 680, and when I want to play games, I close the door to my office and I play.
I don’t find myself on the couch wishing I could beam Battlefield onto my television. I don’t travel enough to need a handheld in my bag to pass the time. And I don’t use my phone for gaming.
Yet, as a technologist, I have to admire the amount of Nvidia IP that came together to make Shield possible. Simply nailing the PC gaming component would be amazing. Offloading rendering to a desktop system and enjoying the benefit of accelerated video decode to make 38 Wh of capacity last almost all day is brilliant.
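The runtime math behind that claim is simple: hours of use are just capacity divided by average draw. A back-of-envelope sketch, using Nvidia’s ~1 W idle estimate from above plus two purely hypothetical draw figures for heavier workloads:

```python
# Battery runtime in hours = capacity (Wh) / average power draw (W)
def runtime_hours(capacity_wh, avg_draw_w):
    return capacity_wh / avg_draw_w

CAPACITY_WH = 38  # Shield's stated battery capacity

# The ~1 W idle figure comes from Nvidia's estimate; the video and
# gaming draws are assumptions for illustration only.
scenarios = [
    ("idle (~1 W, per Nvidia)", 1.0),
    ("video playback (assumed 4 W)", 4.0),
    ("local 3D gaming (assumed 8 W)", 8.0),
]
for name, watts in scenarios:
    print(f"{name}: {runtime_hours(CAPACITY_WH, watts):.1f} hours")
```

Even at the assumed gaming draw, 38 Wh buys several hours of play; at the idle estimate, the battery is good for more than a day on paper, which is what makes the accelerated video decode angle so compelling.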
The unanswered question is: how much will it cost? Nvidia preemptively threw out that Shield won’t be subsidized, and it does want to make money on the device. But prospective customers are going to compare it to the 3DS, Xperia Play, and Vita, rather than Atom/Cortex-A15/Krait-based tablets.
To that end, I’d like to see Shield selling in the $250 range, though the shot across the bow of console companies suggests this might not be the case. Nvidia does need to lean heavily on its ISV relationships to bolster Android gaming’s appeal beyond where it sits today, though, and it absolutely must knock the PC gaming experience out of the park. That’s the one capability most appealing to me. I just need to find a few titles I can play without embarrassing myself using those joysticks.