
Wii U CPU and GPU Rendered in High-Res

Source: WiiU Daily | 53 comments

Information about Nintendo's Wii U CPU is being unveiled by the slice.

Today we got a decent rendering of the Wii U CPU and GPU that complements the previous blurry chip shots. It is not an actual photo, but rather a photoshopped recreation of the package.

The image shows the PowerPC 750-based tri-core Wii U CPU as the smaller of the two chips, as well as the larger GPU with 32 MB of embedded memory. The 1.23 GHz CPU has been widely criticized for its lackluster performance, though the console has been praised for its low power consumption of about 33 watts under load.

The Wii U's GPU reportedly runs at 550 MHz.


Comments
  • 22 Hide
    tomfreak , December 13, 2012 11:11 AM
    I still have no idea why they clock the CPU at such a low frequency. This isn't a mobile device; power consumption isn't an issue here.

    Higher-power VRMs and power components on the motherboard to supply a higher-clocked CPU, plus better CPU cooling, isn't gonna cost another $50.
  • 20 Hide
    vaughn2k , December 13, 2012 11:16 AM
    Wow, this is amazing! /sarcasm/
  • 25 Hide
    thecolorblue , December 13, 2012 11:18 AM
    LOL - "rendering" of a few squares and rectangles

    does nintendo pay you guys 'per story'
  • 23 Hide
    BeatMason , December 13, 2012 11:20 AM
    Yay, a green rectangle with 2 grey rectangles inside of it :|
  • 12 Hide
    wanderer11 , December 13, 2012 11:21 AM
    Why would anyone care about power consumption on a stationary box? I would rather have a 200W chip if it was going to be more powerful.
  • 5 Hide
    eklipz330 , December 13, 2012 11:32 AM
    Whatever gets the job done, I suppose. I don't think they'll make the same killing as they did with the Wii, though.
  • 8 Hide
    spentshells , December 13, 2012 11:33 AM
    I saw it at Walmart last weekend; the graphics are actually pretty great.
  • -5 Hide
    silverblue , December 13, 2012 11:36 AM
    550MHz... that'd make this slower than a 4850 and only marginally faster than the 4830, assuming it had the full 800 shaders. From what I've heard though, it's far closer to Llano in specs.
  • 9 Hide
    TheViper , December 13, 2012 11:47 AM
    This was actually found to be newsworthy?
  • 8 Hide
    TW_Honorius , December 13, 2012 11:56 AM
    WARNING!!! pic is NSFW
  • -2 Hide
    Anonymous , December 13, 2012 11:59 AM
    Is it just me, or is the Nexus 4 more powerful?
  • 1 Hide
    SinisterSalad , December 13, 2012 12:05 PM
    I half expected Zak to be the "author" of this "news" piece. lol
  • 7 Hide
    tipoo , December 13, 2012 12:06 PM
    I think more important than the clock speed is the size of the CPU. Anandtech measured it to be 30 mm², and we know it's made on a 45 nm process. That's about the size of a *single*-core Atom, while this thing has three cores. The transistor budget per core is even lower than the 7-year-old Xenon and Cell. So about a third the clock speed and about a third the transistor budget; even Intel didn't get 9x the performance per transistor from 2006 to now.
  • 1 Hide
    tipoo , December 13, 2012 12:07 PM
    silverblue: 550MHz... that'd make this slower than a 4850 and only marginally faster than the 4830, assuming it had the full 800 shaders. From what I've heard though, it's far closer to Llano in specs.

    Consensus on NeoGAF seems to be 480 shaders or less; with the 32 MB eDRAM on-die, it's not big enough for 800 by far.
  • 4 Hide
    tipoo , December 13, 2012 12:10 PM
    Tomfreak: I still have no idea why they clock the CPU at such a low frequency. This isn't a mobile device; power consumption isn't an issue here. Higher-power VRMs and power components on the motherboard to supply a higher-clocked CPU, plus better CPU cooling, isn't gonna cost another $50.

    Because it's Nintendo, that's why. Slightly higher clocks may not have cost much more, but they want to hit certain price points, and they were already taking a small loss on Wii U hardware; taking a loss on hardware isn't Nintendo's style. They got to a level where it can get most ports from the 360/PS3 gen, and that's good enough for them; forget about next gen. Same strategy as the Wii.

    I do agree, though; I'd rather it draw 70 watts or even more for higher-clocked parts and better cooling. The difference is like half a lightbulb; switch one off or to a CFL if it's the planet you're worried about.
  • -6 Hide
    matt_b , December 13, 2012 12:12 PM
    This thing is so low-tech now; can you imagine how horrible games will look/play in 5 years? This is a very lackluster job on Nintendo's part; I cannot find any reason to root for them right now. Who cares about 33 watts TDP when the two processing units are running at 1230 and 550 MHz respectively - I should hope consumption would be that low. So this would make it the GameCube revision 3 console now, correct? Unbelievable, Nintendo........

    solomang: Is it just me, or is the Nexus 4 more powerful?

    I would bet money that when the Wii U gets towards EOL, most smartphones will probably be right there neck and neck with it on hardware capability.
  • 2 Hide
    tipoo , December 13, 2012 12:15 PM
    solomang: Is it just me, or is the Nexus 4 more powerful?

    I doubt it, but I think it's close enough that smartphones will close in within two or three years. They were already a generation or two away from beating the PS3 and 360 in raw performance, and the Wii U doesn't seem far from there. With the SGX Rogue series and Cortex-A15s, I think phones will be close to the Wii U.

    As for why it's not more powerful yet, remember that current ARM cores have pretty narrow front ends to save power, etc. And the GPUs are still not there by quite a margin; current smartphone GPUs may hit 30 GFLOPS, while the Rogue series will hit the ~200 that the PS3/360 have.
  • 6 Hide
    Lyrick , December 13, 2012 12:55 PM
    At the very least they're going to skip the whole overheating (RRoD/YLoD) fiascoes the current-gen platforms all went through. All Sony and MS proved this gen is that cramming desktop- and server-based CPUs into tiny little cases with little to no cooling will just put the consumer out $399 and/or $599 when the temperatures get high enough to melt the solder and disconnect internal components.
  • 5 Hide
    kawininjazx , December 13, 2012 1:00 PM
    You can't get too wrapped up in numbers; hardware in game consoles is extremely customized and therefore makes a lot better use of its power. Compare the specs of an Xbox 360 vs. a PC with the same specs (Pentium D/7600GT maybe), and then see how much better the Xbox runs the same game.
  • 5 Hide
    tipoo , December 13, 2012 1:01 PM
    Lyrick: At the very least they're going to skip the whole overheating (RRoD/YLoD) fiascoes the current-gen platforms all went through. All Sony and MS proved this gen is that cramming desktop- and server-based CPUs into tiny little cases with little to no cooling will just put the consumer out $399 and/or $599 when the temperatures get high enough to melt the solder and disconnect internal components.

    Not necessarily; the first-revision PS3s had a pretty standard electronics failure rate, I think it was like 1-3%. Good design can offset high thermals. It was one of the quietest, too. The 360 was just poorly designed: the GPU heatsink was tiny and sat under the DVD drive with low airflow, and the fans didn't directly touch the heatsinks at all; they just sucked air out the back of the case, with some airflow over the heatsinks as a result. The Wii U uses a similar method, but at far lower wattage.

    I wouldn't mind if the next gen was also 200-watt boxes. Heck, so long as the reliability and fan noise were taken care of, I wouldn't care if they went higher. The power draw is still small beans compared to, say, all the lightbulbs in your house.
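
A rough back-of-envelope version of tipoo's die-size comparison above. This is only a sketch: the 30 mm² die, three cores, and 1.23 GHz clock are the figures quoted in the thread, the 3.2 GHz Xenon clock is the commonly cited Xbox 360 number, and the "about a third the transistor budget" claim is taken at face value rather than from a measured transistor count.

```python
# Sketch of the per-core comparison in tipoo's comment. All inputs are either
# figures quoted in the thread or the commonly cited Xenon clock; none are
# confirmed Wii U specifications.

wiiu_cpu_clock_ghz = 1.23    # clock reported in the article
wiiu_cpu_die_mm2 = 30.0      # Anandtech die measurement cited in the comment
wiiu_cpu_cores = 3
xenon_clock_ghz = 3.2        # Xbox 360 CPU, commonly cited figure

clock_ratio = wiiu_cpu_clock_ghz / xenon_clock_ghz      # ~0.38, "about a third"
area_per_core_mm2 = wiiu_cpu_die_mm2 / wiiu_cpu_cores   # ~10 mm^2 per core at 45 nm
transistor_budget_ratio = 1 / 3                         # "about a third", per the comment

# To match Xenon per-core throughput with roughly a third the clock and a third
# the transistors, each transistor-clock would need to do roughly 3 x 3 = 9x the
# work; using the exact clock ratio the figure comes out slightly lower (~8x).
required_gain = (1 / clock_ratio) * (1 / transistor_budget_ratio)

print(f"clock ratio:            {clock_ratio:.2f}")
print(f"die area per core:      {area_per_core_mm2:.1f} mm^2")
print(f"implied efficiency gap: ~{required_gain:.1f}x")
```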
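For the 4850/4830 discussion between silverblue and tipoo, a minimal sketch of the peak-throughput arithmetic behind it. The Wii U shader counts are the rumoured figures from the comments, not confirmed specs; the desktop cards use their reference clocks, and the usual 2 FLOPs per shader per clock for this GPU generation is assumed.

```python
# Peak single-precision throughput for the parts compared in the thread.
# Wii U shader counts are rumours from the comments, not confirmed specs.

def peak_gflops(shaders: int, clock_mhz: float) -> float:
    """Peak GFLOPS = shaders * 2 FLOPs per clock * clock (MHz) / 1000."""
    return shaders * 2 * clock_mhz / 1000.0

parts = [
    ("Wii U GPU, rumoured 800 shaders @ 550 MHz", peak_gflops(800, 550)),  # silverblue's assumption
    ("Wii U GPU, rumoured 480 shaders @ 550 MHz", peak_gflops(480, 550)),  # NeoGAF consensus per tipoo
    ("Radeon HD 4850, 800 shaders @ 625 MHz",     peak_gflops(800, 625)),
    ("Radeon HD 4830, 640 shaders @ 575 MHz",     peak_gflops(640, 575)),
]

for name, gf in parts:
    print(f"{name}: {gf:.0f} GFLOPS")
```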
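Finally, a small sketch of the mobile-GPU gap tipoo describes. The 30 and 200 GFLOPS values are the round numbers used in the comment, and the doubling-per-generation growth rate is purely an illustrative assumption, not a sourced figure.

```python
# How many doublings the ~30 GFLOPS smartphone GPUs of 2012 would need to
# reach the ~200 GFLOPS class of the PS3/360, per the figures in the comment.
# The doubling-per-generation rate is an assumption for illustration only.

import math

phone_gflops_2012 = 30.0     # round figure from the comment
console_gflops = 200.0       # round figure from the comment

doublings_needed = math.log2(console_gflops / phone_gflops_2012)
print(f"~{doublings_needed:.1f} doublings (roughly 2-3 mobile GPU generations)")
```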