Nintendo Finally Chats About Wii U Guts, Tears It Apart

In the latest edition of Iwata Asks, four key players in the development of the upcoming Wii U actually open up about what's inside the new console. Granted, they didn't offer a full list of specs that can be compared to the current crop of hardware, but it's probably the first time we've heard the term "multi-core CPU" from Nintendo's lips.

In this session, Nintendo president Satoru Iwata speaks with four members from the Product Development Department located in the Integrated Research and Development Division: Nobuyuki Akagi, Yasuhisa Kitano, Deputy General Manager Ko Shiota, and Senior Managing Director and General Manager of the Integrated Research and Development Division, Genyo Takeda.

Iwata asked what was key to achieving low power consumption and high performance with the Wii U. Takeda began his response by noting that this will be the first time a Nintendo console uses a multi-core CPU. He then explained that they used a multi-chip module (MCM) that packages the IBM Power Architecture-based CPU, the AMD Radeon-based GPU and high-density eDRAM together, so data moves between the components faster and more efficiently while requiring less energy.

"The LSI chips were made by different companies, so when a defect arose, it was difficult to isolate the cause. Since the defect was inside the MCM, figuring out the problem was incredibly difficult," Shiota added. "We really drew on the wisdom of Renesas, IBM and AMD, who cooperated with us. To isolate the problem, we devised a way to route a minimal amount of signal outside of the MCM, so we could verify the problem with a minimum of overhead."

The interview actually moves on to show the console's motherboard and MCM, seemingly addressing the hardcore gamers and critics who want to know what's inside the box. They eventually talk about the console's casing, revealing that their intention was to make the console somewhat unnoticeable when sitting next to an HDTV – to play an "unobtrusive role behind the scenes."

Given the console's compact size, they also talked about the Wii U's thermal design and how it dissipates heat. In the Wii, the CPU and GPU were separate chips, so each required its own heat sink. In the Wii U, Nintendo needs only one, but it's larger than what was used in the original Wii because the console puts out about three times as much heat.

"We really had to wrack our brains," Kitano said. "We considered solutions such as making the fan bigger and raising the number of fan revolutions. We conducted heat tests for prototypes a number of times and optimized placement of the air holes. Another small detail is the vent cover in the back of the fan. We had to put a lot of work into improving efficiency, making it thinner and slanting the inside so that the air could escape more smoothly."

To see the full 4-page Wii U teardown by Nintendo, head here. The console will arrive on North American store shelves on November 18 in 8 GB and 32 GB flavors.

  • memadmax
    Sigh...
    It's a hybrid version of SOC...
    They basically took the CPU, GPU and stuck them on the same chunk of circuit board and X'd out the bus interface chip between the two while integrating the necessary circuitry for communication between the two on the chips...

    Why they didn't go with a pure SoC route is anyone's guess... maybe for future upgradeability or something...
    Reply
  • rebel1280
    Quoting Takeda from the article: "I think that's the magic of game-console development. We carry out development together with other partner companies, but rather than having IBM employees and AMD employees and Renesas employees, we joined into what might be called 'Team Nintendo'. That happened because, it seems, they can talk to their families – their children, grandchildren and spouses – about what they have made. In that respect, one of the good points of game-console development is how the participants' motivation inspired the team as a whole."

    That is awesome, the full article is very revealing as to how the whole thing came together. Great read! :)
    Reply
  • usbgtx550
    Well, it seems Nintendo went for innovation versus performance, and that seemed to do them well with the Wii. You do have to admit the controllers have a lot of potential if the right developers get behind it.
    Reply
  • hapkido
    memadmax said:
    Why they didn't go with a pure SOC route is anyones guess... maybe for future upgrade-ability or something...

    They're using an IBM CPU and an AMD GPU. An SoC is designed as a whole; you can't just put processors from different vendors on the same die.
    Reply
  • internetlad
    Tom's Hardware users: LOOK AT THESE SPECS THIS IS NOWHERE NEAR MY 3000 DOLLAR GAMING RIG

    4 year old: Man this tennis game is fun. My Mii looks so cool.

    Grandparents: Only 199 bucks? And it has that Italian man on the box, the kids are sure to like this.



    Nintendo: IT STILL PRINTS MONEY :D :D
    Reply
  • atminside
    Cool and everything, but I was really looking forward to hearing about the architecture of the CPU and GPU. What type of memory does it have, how much, how many cores or threads does the CPU have, and what version of OpenGL can the GPU support? What type of GPU is used? Sheesh, this breakdown was like reading a transcript from a political debate – no details, just some bare-minimum information. Seems like Nintendo is REALLY downplaying the importance of hardware. Do they really think they will have enough titles at launch to make consumers ignore that deficiency?
    Reply
  • Daki
    @internetlad
    Except this one is 299.99 or 349.99 depending on the SKU (there's supposed to be a 249.99 SKU but it doesn't seem to exist here)
    Reply
  • internetlad
    Daki said: "@internetlad Except this one is 299.99 or 349.99 depending on the SKU (there's supposed to be a 249.99 SKU but it doesn't seem to exist here)"
    lol yeah I forgot the big N was pricing its systems all stupid now.

    Never mind, the N ship is going down. Everybody pray to your respective gods.
    Reply
  • scottiemedic
    I just like the pic with the honking GPU and the little bitty CPU, makes me giggle...
    Reply
  • CaedenV
    usbgtx550 said: "Well, it seems Nintendo went for innovation versus performance, and that seemed to do them well with the Wii. You do have to admit the controllers have a lot of potential if the right developers get behind it."

    The motion controller was great because it brought more natural control to everyone from little kids to senior citizens, it worked well (unlike Kinect), and it stayed out of the way (unlike the PS Move). Having to continually switch between two screens – one 5-10 feet away and another 1-2 feet away – is not natural, and not user friendly. It can allow for some more interesting game mechanics (especially for a card-based game such as Pokemon), but most developers will not build specifically for this feature, or will tack it onto a ported title and not really make it work like it ought to. Basically, having two screens will decrease the audience by pricing the accessories out of the reach of the casual/budget gamer, and by being too complicated for older audiences.

    It will still sell well and print lots of money for Nintendo, but I do not think it will gain the widespread audience that the Wii had. Personally, I even got a Wii (and I never get consoles) and enjoyed using it, but there was a lack of games that appealed to my age bracket which were not ports I could simply get for my PC (and even an entry-level PC with onboard graphics today looks infinitely better than the Wii does). So after getting the hang of bowling, golf, and Zelda I pretty much never picked it up again, and I will not be getting the Wii U. 5-8 years from now, when the next-gen console comes out, my kiddos will be old enough to play, and I will probably feed the N machine for a generation then, but by the time the generation after that rolls around they will be teens with either big-kid consoles or their own game rigs, so it will probably be the only N console I ever get.
    Reply