
Can Nvidia Strike While The Iron Is Hot?

Nvidia Tegra K1 In-Depth: The Power Of An Xbox In A Mobile SoC?

On paper, Tegra K1 fixes the issues that most clearly put Tegra 4 at a competitive disadvantage. Some of the questions we still have can’t be answered until we get our hands on a derived device to test. Others won’t be resolved until game developers either bring premium content over from the console space or create newer titles using advanced APIs.

We had hoped to benchmark one of Nvidia’s reference platforms in time for the Tegra K1 announcement. They’re still so rare, though, that real performance data will need to come later. Production of the SoC purportedly started in December, and company representatives claim devices based on Tegra K1 will ship in the first half of 2014. However, specific announcements aren’t Nvidia’s to make, so it can’t comment on the form factors we’ll see or the regions they’ll be available in (nor did it mention any product of its own based on K1). But our own research in the early hours before CES suggests that at least one Tegra K1-based product is already on display. We consider this to be promising news.

What the company did say was that Tegra K1 is definitely a tablet play, and that it will also be available for premium superphones (think big screens loaded with new technology). Tegra 4i, the more smartphone-oriented SoC with Nvidia's i500 LTE modem built in, is reportedly still on track, and we're told to expect more information, again, in the first half of 2014.

But today’s discussion clearly isn’t about Tegra’s recent track record or Tegra K1’s ultimate destiny. Rather, knowing how amped our audience gets about speeds and feeds, Nvidia wanted to share more information about the SoC's inner workings. Together with Intel, Nvidia is one of the most forthcoming vendors in the mobile segment, revealing far more about its hardware than Apple or Qualcomm. Small gaps in the spec sheet (like a final GPU clock rate) remain; however, given what we already know about the Kepler architecture, this is the Tegra we were hoping for in 2013.

Approx. Comparison To Both Last-Gen Consoles

                                    | Tegra K1 | PlayStation 3 | Xbox 360
Peak Shader (GFLOPS)                | 365      | 192           | 240
Texturing (GTex/s)                  | 7.6      | 12            | 8
Memory Bandwidth (GB/s)             | 17       | 28.8          | 22.4
Feature Set (DirectX)               | DX 11.2  | DX 9          | DX 9
CPU Performance (SPECint, per core) | 1403     | 1200          | 1200

One of the slides Nvidia presented in its briefing compared Tegra K1 to the Xbox 360 and PS3. Although the console specifications aren’t 100% on-target, the math suggests that a single SMX running at what we presume to be about 950 MHz offers substantially more shader horsepower and almost as much peak texture fillrate. At least in theory, Tegra K1 could be on par with those previous-generation systems that continue to occupy shelf space today. Could you imagine the gaming performance of your old Xbox in a tablet form factor, perhaps with a Bluetooth-connected controller to solve the I/O issue?
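
The back-of-the-envelope math is easy to reproduce. Here's a minimal sketch in Python; note that the ~950 MHz clock and the eight-texture-unit count are our assumptions, inferred from the published 365 GFLOPS and 7.6 GTex/s figures rather than confirmed by Nvidia.

```python
# Back-of-the-envelope math behind Nvidia's comparison slide.
# Assumptions (not confirmed specs): ~950 MHz GPU clock, 8 texture units.

CUDA_CORES = 192       # one Kepler SMX
FLOPS_PER_CORE = 2     # a fused multiply-add counts as 2 FLOPs per clock
TEXTURE_UNITS = 8      # inferred from the 7.6 GTex/s figure
GPU_CLOCK_GHZ = 0.95   # presumed, not announced

peak_gflops = CUDA_CORES * FLOPS_PER_CORE * GPU_CLOCK_GHZ  # ~365 GFLOPS
peak_gtex_s = TEXTURE_UNITS * GPU_CLOCK_GHZ                # ~7.6 GTex/s

print(f"Peak shader throughput: {peak_gflops:.0f} GFLOPS")
print(f"Peak texture fillrate:  {peak_gtex_s:.1f} GTex/s")
```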

We’d only be missing the games—and that’s not an altogether bad position for Nvidia to be in, given the developer relationships it maintains on the PC side. The company appears to have its hardware ducks in a row. Let’s see if it can redefine the mobile gaming experience beyond Tegra-optimized titles with a few extra effects in them.

My challenge to Nvidia: do whatever it takes to bring games to tablets that enthusiasts want to play, instead of that superficial content we only bother with because we’re bored somewhere else. Show us that your mobile hardware has the same goodness as Kepler on the desktop and that those same developers will follow your lead. When Android gaming is as compelling as it is on the PC, there will be a long line of Tom’s Hardware readers ready to buy new tablets.

  • Marcopolo123, January 6, 2014 6:50 AM (+0)
    funny
  • azzazel_99, January 6, 2014 6:56 AM (+0)
    .
  • renz496, January 6, 2014 8:12 AM (+0)
    Quote:
    Where is the Maxwell stuff? Did they not even address it yesterday? WTF

    Most likely they will talk about it when the actual GPU is close to release.
  • Wisecracker, January 6, 2014 8:38 AM (+8)
    Quote:
    At least in theory, Tegra K1 could be on par with those previous-generation systems ...


    AMD does it. Intel, too. But ...

    When it comes to over-the-top hype, embellishment and hyperbole, nVidia is the king.

  • ZolaIII, January 6, 2014 10:04 AM (+2)
    @RupertJr

    Well, multiply it by 3x & with higher memory bandwidth you are there.
    1.5x Maxwell & 2x cores (28nm to 16-14nm).
    Hybrid Memory Cube.
    All this in 2015.
  • RupertJr, January 6, 2014 10:11 AM (-2)
    @ZolaIII
    But what you are assuming does not exist today, and maybe won't for years...
    Where will other chips be at that time?
  • dragonsqrrl, January 6, 2014 10:48 AM (+2)
    "in fact, the first Maxwell-powered discrete GPUs are expected in the next few weeks"

    Excellent.
  • InvalidError, January 6, 2014 11:07 AM (+5)
    Quote:
    It looks like NVidia will lose another battle...

    Well, they are right on at least one thing: 2014's SoCs are now about on par with high-end components from ~7 years ago or mid-range gaming PCs from ~5 years ago and are managing to do so on a 2-3W power budget, which is rightfully impressive IMO.

    If SoCs continue improving at this pace while PCs remain mostly stagnant, the performance gap between SoCs and mainstream PCs will be mostly gone by the time 2015 is over.
  • Shankovich, January 6, 2014 11:27 AM (+0)
    I'm assuming this SoC will run at around 15 watts. With 192 CUDA cores and LPDDR3, your max output will probably be somewhere around 180 GFLOPS. Yes, yes, it CAN do the same effects as consoles and PCs, so what? If the GameCube had the DX 11.2 API running on it, it could as well, but obviously it couldn't put too much on the screen. It will be the same deal with the K1.

    It's cool, I like it, don't get me wrong, but the stuff they're saying, though technically correct, is so misleading to about 90% of the market. People are going to think they have PS4s in their hands when in reality they have half an Xbox 360 with up-to-date APIs running on it.
  • Shankovich, January 6, 2014 11:48 AM (+5)
    Ah crap, that's what I get for posting before reading everything -_-. Well, in retrospect, I'll say I'm very surprised! 365 GFLOPS peak is amazing for what it's running off of! However, it really doesn't come close to competing with any current-gen console or base gaming PC. I'd love to have an ultrabook running around 800 GFLOPS in the near future though :)
  • photonboy, January 6, 2014 11:49 AM (+0)
    Factor of 10??

    Would it not be a factor of 100?
    (200W/2W)
  • esrever, January 6, 2014 12:49 PM (+2)
    Nvidia forgot it rated the PS3's GPU at 2 TFLOPS to market it to Sony.
  • redeemer, January 6, 2014 1:54 PM (+0)
    Quote:
    Where is the Maxwell stuff? Did they not even address it yesterday? WTF

    That says a lot right there, Maxwell is far away!
  • redeemer, January 6, 2014 1:56 PM (+0)
    Quote:
    Where is the Maxwell stuff? Did they not even address it yesterday? WTF

    True, but K1 at the right price can be a force in the SoC arena.
  • redeemer, January 6, 2014 2:15 PM (+0)
    So another thing, guys: no mention of current products, Tegra 4/4i... lol, all I see is Tegra 5/6. Nvidia has to stop this nonsense, they are all over the place, so I guess Tegra 4 is dead in the water, not that it was being adopted well anyways. Nvidia is good at building Tegra hype but can never deliver. Hopefully they do not mess up K1!
  • apache_lives, January 6, 2014 2:16 PM (+0)
    http://www.news.com.au/technology/california-crop-circle-was-a-marketing-stunt-by-nvidia-corp/story-e6frfrnr-1226796160776
  • Durandul, January 6, 2014 3:15 PM (+0)
    Quote:
    Factor of 10??

    Would it not be a factor of 100?
    (200W/2W)

    It would be, except they take into account that there is only one SMX rather than eight. So it would really be (25W/2W).

    On a side note, if a company like Epic Games were to go back and do a thorough port of UDK 3, I believe it would be reasonable for a publishing company like 2K to either have its developers go back and recompile the games, or hire new ones to do so, and we might see a market emerge for the K1.

    The dream scenario for me would include having Sony add drivers for the DualShock to Android, and being able to play Borderlands on a tablet with it. Now if only there were someone to throw money into their faces...
  • Durandul, January 6, 2014 3:18 PM (+2)
    Quote:
    So another thing, guys: no mention of current products, Tegra 4/4i... lol, all I see is Tegra 5/6. Nvidia has to stop this nonsense, they are all over the place, so I guess Tegra 4 is dead in the water, not that it was being adopted well anyways. Nvidia is good at building Tegra hype but can never deliver. Hopefully they do not mess up K1!

    Actually, the article mentioned that the 4i is still going to be released in the first half of 2014 as a solution for phones. Makes you wonder why they bother, though, after reading the article.
  • InvalidError, January 6, 2014 10:39 PM (+0)
    Quote:
    Nvidia has to stop this nonsense, they are all over the place, so I guess Tegra 4 is dead in the water, not that it was being adopted well anyways.

    Tegra 3 launched several months late, just before everyone else released their new next-gen SoCs, so it was too little, too late to gain much market share. Tegra 4 was largely in the same boat, and I'm guessing many device manufacturers shied away due to the lack of unified shaders and GPGPU too.

    If Tegra 5/K1 is delivered on schedule and priced right, it should have a decent shot at the market. That's a lot of ifs given a sub-par track record, so Chris' implied skepticism (shared by many in these comments, myself included) is very well warranted.

    Time will tell.
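
An editor's footnote on the numbers in the thread: the factor-of-10 exchange between photonboy and Durandul above is easy to sanity-check. Here's a minimal sketch, assuming the commenters' rough power estimates; none of these wattages are measured figures.

```python
# Sanity check of the photonboy/Durandul wattage exchange. The 200 W
# desktop-GPU figure and the ~2 W Tegra K1 GPU budget are the commenters'
# rough estimates, not measurements.

desktop_gpu_watts = 200   # full 8-SMX desktop Kepler card (estimate)
smx_count = 8             # SMXes in that desktop part
tegra_k1_gpu_watts = 2    # mobile GPU power budget cited in the thread

whole_gpu_ratio = desktop_gpu_watts / tegra_k1_gpu_watts               # 100x
per_smx_ratio = (desktop_gpu_watts / smx_count) / tegra_k1_gpu_watts   # 12.5x

print(f"Whole desktop GPU vs. K1 GPU: {whole_gpu_ratio:.0f}x")
print(f"Single SMX vs. K1 GPU:        {per_smx_ratio:.1f}x")  # roughly 'a factor of 10'
```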