IBM Confirms that Nintendo's Wii U Has a Power-based CPU

September 26, 2012 7:33:31 PM

What a piece of crap.

They should be ashamed of themselves for selling something like this.

I am not sure who would even buy it. $300 dollars for 5 year old hardware... or $350 if you get the 32 gig version (the extra 16 gig costing the manufacturer about $3 dollars, but somehow translating into an extra 50 dollars retail).

This is a POS and no one should buy it.
Score
-16
September 26, 2012 7:35:15 PM

"According to a tweet, the CPU is not based on IBM's Power 7 chip, as previously believed, but rather a Power-based CPU. "

Stupid and it's a contradiction

We know just as much as we did yesterday. Not only could it still be a Power 7 chip, since a Power 7 chip IS a Power-based CPU, but it could be ANY Power-based chip.
Score
23
Anonymous
September 26, 2012 7:37:55 PM

@southernshark next time, before making such a stupidly retarded comment, check your facts:

The Pro model, which is currently sold out for pre-orders, includes a $60 game, a couple of accessories (about $60 if bought separately) AND 32 GB of memory instead of 8 GB. That's a lot of extra value for just $50.
Score
12
September 26, 2012 7:38:33 PM

Sluggish, like a wet sponge
Score
2
September 26, 2012 7:40:01 PM

I know I catch heat for this but so what. I personally think Nintendo's time in the sun as a console manufacturer has come and gone. The Wii was an awesome concept that they seriously dropped the ball on with games. I got a Wii the Christmas of the year it released and I was like DAMN, this thing is awesome.

Then I waited for games:
Zelda
Metroid
Mario
Dragon Ball Z
RE4

These were, to me, the only decent games ever released for the console. Nintendo really dropped the ball on pushing developers to come out with decent new games. As far as graphics go, they were not on par with the Xbox or PS3, but that was OK; they made up for it with the controls.

This new console looks like Nintendo is dropping the ball the same way, by using an old, outdated processor. Why not go ahead and use a new processor to get some more performance out of the product?
Score
10
September 26, 2012 7:46:25 PM

southernshark said:
What a piece of crap. They should be ashamed of themselves for selling something like this. I am not sure who would even buy it. $300 dollars for 5 year old hardware... or $350 if you get the 32 gig version (the extra 16 gig costing the manufacturer about $3 dollars, but somehow translating into an extra 50 dollars retail). This is a POS and no one should buy it.


32-8=24.

Basic math, I'm not sure who would even know this. ;) 

And since when are price and demand based on logic? You want something, pay the price or don't buy it. Just because it cost me $10 on eBay doesn't mean I can't sell it to you for $2000.
Score
6
September 26, 2012 7:48:10 PM


This is good. As much as I like AMD, I'm glad it isn't an APU. X86 is much too prevalent. We need variety in this world.
Score
-11
September 26, 2012 7:50:44 PM

southernshark said:
What a piece of crap. They should be ashamed of themselves for selling something like this. I am not sure who would even buy it. $300 dollars for 5 year old hardware... or $350 if you get the 32 gig version (the extra 16 gig costing the manufacturer about $3 dollars, but somehow translating into an extra 50 dollars retail). This is a POS and no one should buy it.


But how else are you going to waggle through your favorite rehashes? With modern graphics!? Blasphemy!
Score
3
September 26, 2012 7:52:45 PM

Combined with a few well-known developers saying the CPU is causing difficulty with ports, this doesn't look good. Broadway is an ancient architecture based on Gamecube/PowerPC G3 era stuff. Even with the eDRAM cache, multicore support, and a higher clock rate, the instructions per clock can't be up to par with something modern like Power7. Plus, with a 75 W max power draw, the CPU probably only has 25 W or so to work with. At least the GPU and RAM are better than the current consoles'.
Score
4
September 26, 2012 7:55:52 PM

ddpruitt"According to a tweet, the CPU is not based on IBM's Power 7 chip, as previously believed, but rather a Power-based CPU. "Stupid and it's a contradictionWe know just as much as we did yesterday. Not only that it could still be a Power 7 chip since a Power 7 chip IS a Power-based CPU, but it could be ANY Power based chip.


Following the tweet thread, the context is clear: they backtracked on saying it's Power7-based and apologized for saying so, then said it's simply Power-based. That could mean Broadway, that could mean Power6, etc. And developers are complaining about the CPU but praising the GPU, so lipstick on a Broadway looks likely.
Score
6
September 26, 2012 8:05:15 PM

tipoo said:
Following the tweet thread, the context is clear: they backtracked on saying it's Power7-based and apologized for saying so, then said it's simply Power-based. That could mean Broadway, that could mean Power6, etc. And developers are complaining about the CPU but praising the GPU, so lipstick on a Broadway looks likely.


If it's simply Power based, then even if it's Broadway-based, it could be a well-modified version of it that is simply difficult for these devs to code for. Heck, look at the Cell CPU in the PS3. We probably still don't get the full performance out of it, but it's a very fast CPU for the time if it can be fully utilized. Maybe there are again many cores or there's something else going on. I think that it's best to not make assumptions about it until we have more concrete evidence.
Score
1
September 26, 2012 8:07:22 PM

We do still know for a fact that it uses out-of-order execution. Broadway was in-order execution, so it can't be Broadway. And only Power 7 and Power 5 are out-of-order.
Score
6
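To make the in-order vs. out-of-order distinction in this exchange concrete, here is a toy Python sketch. The three-instruction program and its latencies are invented purely for illustration and have nothing to do with any real Power or Broadway pipeline; it only shows why an out-of-order core can hide a slow load behind independent work.

# Toy model: a slow load, an add that depends on it, and an independent multiply.
# An in-order core stalls the multiply behind the dependent add; an out-of-order
# core issues it while the load is still in flight. Latencies are made up.
program = [
    ("load", [],       10),   # (name, dependencies, latency in cycles)
    ("add",  ["load"],  1),
    ("mul",  [],        3),
]

def run(out_of_order):
    done = {}                 # name -> cycle its result becomes available
    pending = list(program)
    issued = []
    cycle = 0
    while pending:
        ready = [i for i in pending
                 if all(done.get(d, 10**9) <= cycle for d in i[1])]
        if not out_of_order:
            # in-order issue: only the oldest pending instruction may go
            ready = ready[:1] if ready and ready[0] is pending[0] else []
        if ready:
            name, deps, lat = ready[0]
            done[name] = cycle + lat
            issued.append((cycle, name))
            pending.remove(ready[0])
        cycle += 1
    return issued, max(done.values())

for mode in (False, True):
    label = "out-of-order" if mode else "in-order"
    print(label, *run(mode))   # the out-of-order run finishes several cycles earlier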
September 26, 2012 8:13:02 PM

luciferano said:
If it's simply Power based, then even if it's Broadway-based, it could be a well-modified version of it that is simply difficult for these devs to code for. Heck, look at the Cell CPU in the PS3. We probably still don't get the full performance out of it, but it's a very fast CPU for the time if it can be fully utilized. Maybe there are again many cores or there's something else going on. I think that it's best to not make assumptions about it until we have more concrete evidence.


Fair enough about not making assumptions, but it's almost certain it's either three or four cores, maybe four with one reserved for the OS, nothing crazy complex like the Cell.

TheViper said:
We do still know for a fact that it uses out-of-order execution. Broadway was in-order execution, so it can't be Broadway. And only Power 7 and Power 5 are out-of-order.


Actually, Broadway WAS out of order; it was one of the few things it had going for it at the time.
Score
2
September 26, 2012 8:14:52 PM

I also wonder how broad the GPGPU implementation they mentioned is: whether it's just used for texture decompression like Civ 5, or whether it can be used to offload physics and AI to the GPU.
Score
1
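For readers wondering what "offloading physics to the GPU" would even look like: below is a minimal NumPy sketch of the kind of per-element, no-cross-dependency work (a particle integration step) that GPGPU targets. NumPy on the CPU is only a stand-in here; nothing is known about whatever GPGPU API the Wii U might actually expose, and all the constants are made up.

# Each particle is updated independently of every other, which is exactly the
# shape of work that maps well onto thousands of GPU threads. Done here on the
# CPU with NumPy purely as an illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
pos = rng.random((n, 3)).astype(np.float32)          # positions
vel = rng.random((n, 3)).astype(np.float32)          # velocities
gravity = np.array([0.0, -9.8, 0.0], dtype=np.float32)
dt = np.float32(1.0 / 60.0)                          # one 60 fps frame

def step(pos, vel):
    # one integration step; no particle depends on any other particle
    vel = vel + gravity * dt
    pos = pos + vel * dt
    return pos, vel

pos, vel = step(pos, vel)
print(pos[:3])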
September 26, 2012 8:15:04 PM

If it is indeed Power-based, it could still be a Power 7. However, Broadway is a PowerPC (not Power) CPU, based on the PowerPC 750CXe.
Score
2
September 26, 2012 8:18:10 PM

(Actually, I may be wrong that Broadway was OoO; I was sure it was, but I can't find a source now.)
Score
1
September 26, 2012 8:21:00 PM

tipoo said:
(Actually, I may be wrong that Broadway was OoO; I was sure it was, but I can't find a source now.)


I could be wrong, but I don't think that it was OoO.
Score
0
September 26, 2012 8:29:04 PM

It mystifies me why they don't use AMD products. Powerful, cheap, reliable, low power. Stupid move they made.
Score
1
September 26, 2012 8:41:44 PM

Quote:
Nintendo has been taking some heat for an apparently sluggish processor in its Wii U console.

The same sentiments were felt by many about the "improved" Gamecube with motion control, the Wii. If, as stated in the article, this thing turns out to be a refreshed Wii chip, then I'm left absolutely baffled at Nintendo using a core design (even if tweaked and updated) dating back OVER a decade, for the third time now! No wonder complaints about processing inferiority are coming in from multiple sources. It's very apparent these days that Nintendo has lost its core-roots focus on what it used to mean to be Nintendo. All they seem to want to sell you is overpriced, way-outdated, gimmicky hardware with a clearly uncertain path for quality software releases. Quality, not quantity; remember that, Nintendo? Let's at least hope the software side isn't so lackluster this go-around.
Score
0
September 26, 2012 8:45:44 PM

HotRoderx said:
I know I catch heat for this but so what. I personally think Nintendo's time in the sun as a console manufacturer has come and gone. The Wii was an awesome concept that they seriously dropped the ball on with games. I got a Wii the Christmas of the year it released and I was like DAMN, this thing is awesome. Then I waited for games: Zelda, Metroid, Mario, Dragon Ball Z, RE4. These were, to me, the only decent games ever released for the console. Nintendo really dropped the ball on pushing developers to come out with decent new games. As far as graphics go, they were not on par with the Xbox or PS3, but that was OK; they made up for it with the controls. This new console looks like Nintendo is dropping the ball the same way, by using an old, outdated processor. Why not go ahead and use a new processor to get some more performance out of the product?


IMO gameplay > graphics, so I'll take an interesting input device over CPU/GPU improvements anytime.

You should also take into consideration that the next Xbox will likely integrate Kinect, increasing the cost of the device and likely meaning lowered specs elsewhere (probably about the same as or a little higher than the Wii U), and I seriously doubt Sony will release a high-cost device this time (they suffered quite a bit because of it last time).
Score
-4
September 26, 2012 8:49:13 PM

tipoo said:
Fair enough about not making assumptions, but it's almost certain it's either three or four cores, maybe four with one reserved for the OS, nothing crazy complex like the Cell.



Actually Broadway WAS out of order, it was one of the few things it had going for it at the time.



I think we can agree that we know nothing of the current CPU. And even if it is a heavily modified Broadway, it may not necessarily be a bad thing (after all, Cell is a heavily modified PowerPC).
Score
1
September 26, 2012 9:07:48 PM

It would cost Nintendo, let alone IBM, way too much money to repurpose the Broadway-based CPU. It is so dang old it would not be capable of speeds over 1 GHz, let alone have the architecture to be a multi-core chip, because the G3 chips came years before multi-core and don't have the supporting infrastructure for that stuff.


Score
-2
September 26, 2012 9:12:00 PM

Bloob said:
IMO gameplay > graphics, so I'll take an interesting input device over CPU/GPU improvements anytime.

You should also take into consideration that the next Xbox will likely integrate Kinect, increasing the cost of the device and likely meaning lowered specs elsewhere (probably about the same as or a little higher than the Wii U), and I seriously doubt Sony will release a high-cost device this time (they suffered quite a bit because of it last time).


I doubt that the next Xbox won't be superior to the Xbox 360 in performance, and if it doesn't beat the Wii U, then it's unlikely that it will really beat the Xbox 360.
Score
0
September 26, 2012 9:15:49 PM

Bloob said:
IMO gameplay > graphics, so I'll take an interesting input device over CPU/GPU improvements anytime. You should also take into consideration that the next Xbox will likely integrate Kinect, increasing the cost of the device and likely meaning lowered specs elsewhere (probably about the same as or a little higher than the Wii U), and I seriously doubt Sony will release a high-cost device this time (they suffered quite a bit because of it last time).


Considering that the Xbox 360, according to this article, is more powerful than the Wii U, I don't see how it would be possible for them to come out with a system that's similar in power. Additionally, an "integrated Kinect" is not possible with the Xbox console in the way you are thinking. Who would want a sensor built into the console, which is usually hidden from sight, buried behind piles of games or inside a closed cabinet? It might come with a sensor bar and have the software integrated.

Additionally, the Xbox chips have seen over a 50% reduction in size (and heat) since their release. I am positive that Microsoft could use the exact same chips and scale up the frequency by 50-60% if they wanted to, but why bother when you can use a 32 nm fab instead of 65 nm.
Score
0
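For what it's worth, the rough geometry behind that shrink claim: to a first approximation, die area scales with the square of the feature size, so the nodes mentioned in the post imply large reductions even before any redesign. A back-of-the-envelope sketch (ideal scaling only; real shrinks never hit these ratios):

# Ideal (best-case) area scaling between process nodes: area ~ feature_size^2.
def ideal_area_ratio(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

print("65 nm -> 45 nm:", round(ideal_area_ratio(65, 45), 2))  # ~0.48x the area
print("65 nm -> 32 nm:", round(ideal_area_ratio(65, 32), 2))  # ~0.24x the area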
September 26, 2012 9:19:23 PM

notuptome2004 said:
It would cost Nintendo, let alone IBM, way too much money to repurpose the Broadway-based CPU. It is so dang old it would not be capable of speeds over 1 GHz, let alone have the architecture to be a multi-core chip, because the G3 chips came years before multi-core and don't have the supporting infrastructure for that stuff.


Any CPU, regardless of the architecture's age, would be using newer processes than back then, so much higher clock frequencies are to be expected.

GHz is not performance.

Any arch can be made into a multi-core CPU. The same buses used to communicate with the memory or anything else can be used to link different CPUs and/or different cores, and this is exactly what was done with some of the early dual-core CPUs and with other multi-die parts even up to recently: AMD's 12/16-core CPUs all use two distinct CPU dies that are simply linked through the HyperTransport buses with the chipset(s). Older examples include the Pentium D and Core 2 Quad (Netburst dual-core CPUs and quad-core Core 2 CPUs that had two dies communicating over the FSB, a bus that was on CPUs from like ten years before them). Neither of them needed architectural changes from the single-core CPUs, as far as I'm aware.

Beyond that, IBM/Nintendo could have (assuming they actually used Broadway) modified the architecture rather than using a copy of older Broadway CPUs, giving it native multi-core support instead of needing multiple single-core dies.
Score
2
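A toy numeric sketch of the trade-off described above: gluing two existing dies together over a pre-existing bus (the Pentium D / Core 2 Quad / Magny-Cours approach mentioned in the post) avoids architectural changes, but core-to-core traffic that crosses the package link pays extra latency compared with cores sharing one die. Every number below is invented for illustration, not a measurement of any real CPU.

# Two ways to get a multi-core part out of a single-core design:
#   "native" - cores share one die and one last-level cache
#   "mcm"    - two single-core dies linked over an existing external bus
#              (FSB / HyperTransport style), no architectural changes needed
ON_DIE_NS    = 15   # assumed cost of hitting a cache shared on the same die
CROSS_BUS_NS = 60   # assumed extra cost of crossing the package/bus link

def core_to_core_latency(topology):
    if topology == "native":
        return ON_DIE_NS
    if topology == "mcm":
        return ON_DIE_NS + CROSS_BUS_NS
    raise ValueError(topology)

for t in ("native", "mcm"):
    print(f"{t:>6}: {core_to_core_latency(t)} ns core-to-core (illustrative only)")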
September 26, 2012 9:31:41 PM

luciferano said:
Any CPU, regardless of the architecture's age, would be using newer processes than back then, so much higher clock frequencies are to be expected. GHz is not performance. Any arch can be made into a multi-core CPU. The same buses used to communicate with the memory or anything else can be used to link different CPUs and/or different cores, and this is exactly what was done with some of the early dual-core CPUs and with other multi-die parts even up to recently: AMD's 12/16-core CPUs all use two distinct CPU dies that are simply linked through the HyperTransport buses with the chipset(s). Older examples include the Pentium D and Core 2 Quad (Netburst dual-core CPUs and quad-core Core 2 CPUs that had two dies communicating over the FSB, a bus that was on CPUs from like ten years before them). Neither of them needed architectural changes from the single-core CPUs, as far as I'm aware. Beyond that, IBM/Nintendo could have (assuming they actually used Broadway) modified the architecture rather than using a copy of older Broadway CPUs, giving it native multi-core support instead of needing multiple single-core dies.




OK, but that would cost way too much money for Nintendo and IBM, because IBM would not take an old chip and put in the time to make it work like a modern-day chip.


As to your other statements: the Core 2 series chips differ a lot from the crap P4 stuff, and as for AMD, they built the AMD64 arch with future multi-core CPUs in mind, with gateways planted in the chip for those possibilities. At any rate, the Wii U is more than likely just slower in clock speed, and thus the developers are complaining big time; they are just half-assing their work and not doing proper coding to make their threads efficient.
Score
0
September 26, 2012 9:44:13 PM

notuptome2004 said:
OK, but that would cost way too much money for Nintendo and IBM, because IBM would not take an old chip and put in the time to make it work like a modern-day chip. As to your other statements: the Core 2 series chips differ a lot from the crap P4 stuff, and as for AMD, they built the AMD64 arch with future multi-core CPUs in mind, with gateways planted in the chip for those possibilities. At any rate, the Wii U is more than likely just slower in clock speed, and thus the developers are complaining big time; they are just half-assing their work and not doing proper coding to make their threads efficient.


Core 2's micro-architecture was different and did support native dual-core CPUs, but the Core 2 Quads were made with the exact same methods as Intel's early Netburst dual-core CPUs, just with two dual-core dies instead of with two single-core dies.

Athlon 64 was not what I talked about and has nothing to do with this conversation. I spoke of AMD's MCM CPUs such as their Magny-Cours CPUs and their Interlagos CPUs that use the same methods as Intel's older Pentium Ds and Core 2 Quad CPUs, IE re-purposing preexisting buses instead of changing the architecture to support more cores.

IBM re-using an older arch would be a helluva lot easier and cheaper than making a whole new CPU arch. Considering that console CPUs are usually in development for several years, this is entirely possible, and regardless, there's no guarantee that the Wii U actually uses a Broadway-based CPU.

Don't limit your perspective so much. This has all been done before and can be done again. For example, from Athlon 64 up until Bulldozer, AMD mostly just kept updating the Athlon 64 arch rather than making a new one, and the same is true for Intel since Core 2, IIRC.
Score
0
September 26, 2012 9:51:15 PM

ddpruitt"According to a tweet, the CPU is not based on IBM's Power 7 chip, as previously believed, but rather a Power-based CPU. "Stupid and it's a contradictionWe know just as much as we did yesterday. Not only that it could still be a Power 7 chip since a Power 7 chip IS a Power-based CPU, but it could be ANY Power based chip.

It baffles me how this got rated up. We do know more than we did yesterday: we know it isn't a Power7 chip now. The qualifications for the processor are now any Power-based chip EXCEPT Power7.
Score
0
Anonymous
September 26, 2012 9:58:31 PM

I can't make sense of what notuptome2004 is actually saying.
Props to the guy that suggested AMD. It seems like even a low-end APU would be more powerful than what's been rumored, and it'd solve a number of problems; unified architecture, ample CPU power for a console, good video horsepower, low power usage, and it's extremely affordable. Plus it's gotta take the difficulty out of coding a game if the console is using off-the-shelf x86 PC parts, right? Who doesn't know how to make that work and optimize it?
Score
0
September 26, 2012 10:05:20 PM

Nintendo said the typical power draw during gaming was 45 watts. This is with a CPU on the same process, 45 nm, as the 360 slim. Even with the max draw of 75 watts, after power supply inefficiency, the GPU, the RAM, charging controllers, and the big one, a spinning optical drive, the CPU is even more wattage-constrained than in the old consoles, no matter which architecture it is on. With a modern architecture like Power7 there would have been some relief knowing the instructions per clock were far higher, but I don't feel comforted by all that is being said about the weak performance. I suspect it's geared towards low power draw in favor of giving the GPU more headroom; Nintendo and IBM did mention its power-saving features a few times.
Score
0
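To show where an estimate like that comes from, here is the budget arithmetic as a minimal sketch. Only the 45 W typical / 75 W maximum figures are from the post above; the power-supply efficiency and every per-component number are assumptions invented for illustration.

# Rough console power budget. Only the 45 W typical figure is quoted above;
# all other numbers are assumed placeholders.
wall_draw_typical_w = 45.0        # quoted typical draw at the wall while gaming
psu_efficiency      = 0.80        # assumed
dc_budget_w = wall_draw_typical_w * psu_efficiency

assumed_other_loads_w = {
    "GPU + eDRAM":              18.0,  # assumed
    "RAM":                       3.0,  # assumed
    "optical drive (spinning)":  5.0,  # assumed
    "gamepad radio, I/O, fans":  3.0,  # assumed
}

cpu_budget_w = dc_budget_w - sum(assumed_other_loads_w.values())
print(f"DC power available: {dc_budget_w:.1f} W")
print(f"Left for the CPU:   {cpu_budget_w:.1f} W under these guesses")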
September 26, 2012 10:12:55 PM

tipoo said:
Nintendo said the typical power draw during gaming was 45 watts. This is with a CPU on the same process, 45 nm, as the 360 slim. Even with the max draw of 75 watts, after power supply inefficiency, the GPU, the RAM, charging controllers, and the big one, a spinning optical drive, the CPU is even more wattage-constrained than in the old consoles, no matter which architecture it is on. With a modern architecture like Power7 there would have been some relief knowing the instructions per clock were far higher, but I don't feel comforted by all that is being said about the weak performance. I suspect it's geared towards low power draw in favor of giving the GPU more headroom; Nintendo and IBM did mention its power-saving features a few times.


Something like AMD's High-density library could be implemented too. It wouldn't be quite as good as a die shrink to 32nm, but it'd be helpful for optimizing for low power draw without sacrificing performance.
Score
0
September 26, 2012 10:24:29 PM

I honestly couldn't care less about how a game looks; I'm only interested in how it plays, and how much fun it is =]
Score
2
September 26, 2012 11:38:40 PM

Amazing the volume of fanboys who have pre-ordered something without knowing what they are actually buying. The Wii U sounds like 5-year-old hardware being sold at a premium... but isn't that Nintendo: repackage the old and sell it as new.
Score
-2
September 26, 2012 11:40:28 PM

ikaruga said:
I honestly couldn't care less about how a game looks; I'm only interested in how it plays, and how much fun it is =]
So you aren't bothered about value for money? Fanboy logic is so tragic to see.
Score
-5
September 26, 2012 11:42:00 PM

DarkPGR said:
@southernshark next time, before making such a stupidly retarded comment, check your facts: The Pro model, which is currently sold out for pre-orders, includes a $60 game, a couple of accessories (about $60 if bought separately) AND 32 GB of memory instead of 8 GB. That's a lot of extra value for just $50.

No, it's just less of a rip-off.
Score
-3
September 27, 2012 12:01:45 AM

Meh POS console with crappy hardware. They are the ruination of gaming.
Score
0
September 27, 2012 12:27:59 AM

mazty said:
No, it's just less of a rip-off.




Let's see: the fact that it uses flash-based storage makes it a good deal, because flash storage is $$$, and how about the fact that you can plug in an external 3 TB HDD if you want to expand storage?
Score
0
September 27, 2012 12:35:52 AM

mazty said:
So you aren't bothered about value for money? Fanboy logic is so tragic to see.


I'm a fanboy of nothing and nobody. If I see value in something, I decide if it's worth the price or not, and that's solely a subjective decision. It might be worth it for me at the given price while at the same time being too much for you.
Luckily, it's a known fact among mature people that nobody forces you, me, or anybody else to buy anything, and that very same fact is what makes you the only tragic element of this story.
Score
3
September 27, 2012 12:45:01 AM

mazty said:
Amazing the volume of fanboys who have pre-ordered something without knowing what they are actually buying. The Wii U sounds like 5-year-old hardware being sold at a premium... but isn't that Nintendo: repackage the old and sell it as new.


Unless you are a part of Nintendo's hardware team(s), chances are slim that you actually know for sure what hardware is being used, so you seem to be a Nintendo hater, aka not any better than a fanboy.
Score
1
September 27, 2012 12:51:40 AM

hasten said:
I'm starting to think that all the rumors are correct and this toy will be extremely underpowered. With the release date about a month and a half away, you would think they would want to start generating buzz for their "next-gen" console by giving us details other than the latest gimmick...

Consoles don't move products by bragging about stats; this isn't a PC. The latest gimmick is their selling point. It seems to be working, since they sold out of preorders on the first day, within hours really. Even though they're likely manipulating supply to increase demand, they still sold what they wanted before Christmas several months beforehand.

mazty said:
So you aren't bothered about value for money? Fanboy logic is so tragic to see.

The enthusiast logic that a system is better purely because it has higher stats is also pretty sad. Have fun playing Duke Nukem Forever on a PC that cost you 3k. The idea that gameplay is more important than graphics is clearly lost on some. I'm sure they'll make the next Dora the Explorer game in 4K just for you. Value comes from enjoyment; if they enjoy the game they're playing, they got their value for their money.

The speculation that it is Broadway, the same as in the Wii, is an absurd conclusion to draw and PURE speculation. There are far more possible chips than Broadway. A similar chip would not surprise me, though, in order to make backwards compatibility easier. They don't promise backwards compatibility and pull it later due to "incompatibility" like some do -coughSonycough-. Wanting to pad their library with GC/Wii classics, what few there are, isn't surprising.

I'm also not really concerned whether or not the devs at Koei are having trouble with Dynasty Warriors, a game that's played almost the same since the PS2 days. Do not tell me a modified PS2 engine cannot run on this CPU. Koei's idea of updated gameplay is apparently drawing 200 fodder enemies on screen rather than 50. There is a difference between "the CPU is difficult to utilize properly," which is often the case on new systems using new chips, and "the CPU is too weak." Dynasty Warriors has NEVER been on a Nintendo console before; it's not surprising a PS-friendly dev team isn't used to developing for a Nintendo chip. It speaks volumes to me that yet another dev team that hasn't touched Nintendo before is finally on board at all.

With that being said, this system isn't going to deliver a whole lot of horsepower. Lost value? Wasn't the controller estimated to cost at least $100? That leaves the system itself at around $200, only slightly higher than the Wii. For that price, you're getting an HD Wii with a fancy controller; take it or leave it. You will not get a top-of-the-line system at any rate for $200. The pricing is in line with what it should be, considering the controller's cost. I don't expect miracles out of this hardware at that price; doing so would be foolish. That's why I have a PC.
Score
1
September 27, 2012 12:51:52 AM

ikaruga said:
I'm a fanboy of nothing and nobody. If I see value in something, I decide if it's worth the price or not, and that's solely a subjective decision. It might be worth it for me at the given price while at the same time being too much for you. Luckily, it's a known fact among mature people that nobody forces you, me, or anybody else to buy anything, and that very same fact is what makes you the only tragic element of this story.


I'm with you on this.
I prefer gameplay and great story over eye-blowing graphics.

Sometimes I prefer to play older games because of the gameplay.
Score
3
September 27, 2012 1:25:34 AM

Bloob said:
IMO gameplay > graphics, so I'll take an interesting input device over CPU/GPU improvements anytime. You should also take into consideration that the next Xbox will likely integrate Kinect, increasing the cost of the device and likely meaning lowered specs elsewhere (probably about the same as or a little higher than the Wii U), and I seriously doubt Sony will release a high-cost device this time (they suffered quite a bit because of it last time).


I'd like a car with 80 MPG, but I don't really want it if it rides like total crap. :)  How about a reasonable compromise? Decent graphics and good gameplay.

Seriously? The original Wii was hyped far too much. I can only name eight or so games that were remotely worth playing and instead we ended up with crapware for the vast majority of releases.

As for performance... my cell phone can play games with better than Wii level graphics and has a multicore CPU that is likely faster than anything that will be in the Wii U. Color me not impressed. Nintendo needs to just let the G3 die already.
Score
-2
September 27, 2012 1:47:29 AM

nocteratus said:
I'm with you on this.
I prefer gameplay and great story over eye-blowing graphics.

Sometimes I like better to play older games because of the gameplay.


Agreed. You can always go graphics-whore style as well, of course, but we have our high-end PCs for that.

I have no idea how the new Wii U will turn out. The Wii was the weakest of its generation and still had very good sales figures for quite a long time. The info we have now suggests that Nintendo still doesn't want to compete on the hardware side, and that they are targeting a more casual audience (that's the majority, btw). They spent a significant amount of the target price on the new controller; that's where they will focus the most with the gameplay.

A console which allegedly has only a 40 W average power draw (70 W at peak) surely won't work wonders graphics-wise.
Reports and leaks from devs suggest that the GPU does shader model 4 and can do all the things the current gen (Xbox 360 / PS3) is doing in 720p or less. The CPU is reported to be weaker and problematic, which must indeed be true, since it won't have more than 10-20 W to work with, and that's very low indeed.
Perhaps things will get better as devs get to know the hardware, when they have better libraries and engines, and when compilers start to produce more optimized code, but I don't expect anything like the current maxed-out DX11 visuals on the PC, except perhaps at the very end of the console's lifetime when some devs start to pull miracles out of the hardware.

The Wii U will probably do well with recent DX9 titles in 720p or 1080p, so it will probably handle whatever the PS3 and the Xbox 360 are capable of, and that's more than enough for a casual gamer or for the "console of the family." Many of the successful games of recent years (e.g. Source engine games like Portal 2, Unreal Engine games like the Mass Effect series, MOBA games like LoL or Dota, Minecraft and other indie games, etc.; I could go on) do not require state-of-the-art hardware, so perhaps they can ride a good trend with this approach. It will all come down to the online service they provide and also to the big first-party titles like Zelda, Metroid or Mario, because those will be the major selling points.
Score
0
September 27, 2012 2:31:30 AM

tajisi said:
I'd like a car with 80 MPG, but I don't really want it if it rides like total crap. How about a reasonable compromise? Decent graphics and good gameplay. Seriously? The original Wii was hyped far too much. I can only name eight or so games that were remotely worth playing and instead we ended up with crapware for the vast majority of releases. As for performance... my cell phone can play games with better than Wii level graphics and has a multicore CPU that is likely faster than anything that will be in the Wii U. Color me not impressed. Nintendo needs to just let the G3 die already.


The graphics in your phone, assuming that you have an iPhone 5 or a phone with comparable graphics performance (I'm not aware of any Android phones with graphics comparable to the iPhone 5's), might be almost as good as those of the Xbox 360 and PS3, but that's it. The CPU is not even close to them and is probably far weaker than the Wii U's CPU too, although that obviously assumes the Wii U is at least as fast as it is supposed to be compared to the Wii. Even then, your phone's CPU is no faster than the original Wii's, because even the fastest ARM/Medfield CPUs can't compare to it. The CPUs in these devices are much more outdated than the GPUs.

Letting their consoles die would be very stupid. It has already sold out of preorders, so Nintendo is obviously gearing up to make some money off of this regardless of how good it is in your opinion (which is really useless considering that we don't actually know its specs).
Score
1
September 27, 2012 3:36:33 AM

If they refuse to properly update it, then yes they should let it die.

I don't care one bit if my original post about this product being a POS has upset people, or if the fact that I thought, for 300 dollars, it had 16 gigs of RAM instead of the ridiculously low 8... has set off microwaves in people's heads either.

The bottom line is that for late 2012, this product is fail.
Score
0
September 27, 2012 3:45:20 AM

southernshark said:
If they refuse to properly update it, then yes they should let it die. I don't care one bit if my original post about this product being a POS has upset people, or if the fact that I thought, for 300 dollars, it had 16 gigs of RAM instead of the ridiculously low 8... has set off microwaves in people's heads either. The bottom line is that for late 2012, this product is fail.


The only fails are that you think the 8GB and 32GB refer to RAM and that you thought it was 16GB, not 32GB. Even worse is that you think this matters. The games are on optical discs. Having more than 1 or 2 GB of flash memory is unlikely to make a difference.

Furthermore, you don't even know if this is a proper update or not because you don't even know the actual hardware specs. Well, that's assuming that you're not on the inside of this, but if you were, I doubt that you'd be on some forum such as this whining about it.
Score
1
Anonymous
September 27, 2012 5:50:27 AM

Running Return to Castle Wolfenstein (2001) on a Pentium II 400 MHz with 256 MB of RAM and a Voodoo3 3000 was, some time ago, a wonderful experience... and it still has some juice, HA HA HA!! We just need fun, well-designed games... crappy programmers are redundant now.
Score
0
Anonymous
September 27, 2012 6:20:54 AM

Cliff Bleszinski must go to Nintendo and kick out all the shit that console can give...
Score
0
September 27, 2012 6:41:46 AM

HotRoderx said:
I know I catch heat for this but so what. I personally think Nintendo's time in the sun as a console manufacturer has come and gone. The Wii was an awesome concept that they seriously dropped the ball on with games. I got a Wii the Christmas of the year it released and I was like DAMN, this thing is awesome. Then I waited for games: Zelda, Metroid, Mario, Dragon Ball Z, RE4. These were, to me, the only decent games ever released for the console. Nintendo really dropped the ball on pushing developers to come out with decent new games. As far as graphics go, they were not on par with the Xbox or PS3, but that was OK; they made up for it with the controls. This new console looks like Nintendo is dropping the ball the same way, by using an old, outdated processor. Why not go ahead and use a new processor to get some more performance out of the product?


I mostly agree, though I think Nintendo should just go strictly third-party and stop making consoles. I mean, why does anyone get a Nintendo console at all? Because you want Mario Kart and Zelda and all those other cool Nintendo-only franchises. They should release themselves from the burden of making the hardware and just make their games available across everyone else's consoles.

They could still make add-on accessories for their kookier games that plug into an Xbox or a PlayStation. They could have made the Wii-mote and the Fit pad add-ons for the Xbox easily.
Score
0