Previous Generation Radeon HD Powers the Wii U
The big reveal at the E3 games show this year was the Wii U, Nintendo's next-generation gaming system that's set to appear in 2012.
Like the Wii, the Wii U appears to be using modest technology that's affordable today, rather than the state-of-the-art parts that Microsoft and Sony chose for the Xbox 360 and PlayStation 3 over five years ago.
According to Japanese publication Game Watch, powering the graphics in the Wii U is an AMD Radeon HD 4000 series class GPU. Yes, this does mean that the Wii U, which is still a year away from market, is already spec'ed with a GPU that's decidedly last-generation.
Still, despite being quite old by PC standards, it's more advanced than Microsoft's and Sony's current offerings. The ATI GPU in the Xbox 360 and the Nvidia GPU in the PS3 are of the DirectX 9 generation and feature Shader Model 3.0.
The Wii U's GPU is reportedly based on the RV770 core, which is of the DirectX 10.1 generation with Shader Model 4.1.
Nintendo's use of Xbox 360 and PS3 game footage as its own at the E3 keynote may have raised a few eyebrows, but with this GPU the Wii U should have no problem at least matching the graphical output of the current generation.

I can buy a 6870 RIGHT NOW for $165 after rebate. I'm sure if Nintendo was approaching Nvidia or AMD, they could strike a handsome deal on something similar.
Why are they going to charge in the range of $400 for a console that has CHEAP-ASS hardware in it? Nintendo is seriously the Apple of consoles.
Hopefully the Xbox 720 will come out a year later than the Wii U sporting an HD 8000-based GPU and, you know, be caught up with current PC hardware.
The Wii U should have been called the Wii 2; it will be just as far behind, and based around a stupid controller, like the first one!
The console will be cheap, but the controller will be $200 by itself!
If they use an HD 48xx core, that should do for the foreseeable future - you really don't need an HD 6000 series GPU in a console; it'd be an absolute waste of resources.
Pretty much all games being released these days are straight console ports.
When the game code is fully optimized for it, as it will be on the Wii U, the RV770 core will be a huge step ahead of the graphical abilities of the other consoles. I don't see people complaining about Xbox graphics at the moment, do you?
1. Higher-end GPUs today require quite a lot more power than was needed a few years ago.
2. With power comes heat.
Both of these are severe issues for console makers. A console is generally very restrictive when it comes to cooling and doesn't have much space for a proper PSU (a good PSU also costs a lot of money, and console makers generally want to fit the cheapest parts).
No console maker wants to deal with issues like RROD or YLOD. Not only would a high-end GPU cost a lot, it would also require a huge case, a good PSU, and high airflow.
[quote]I can buy a 6870 RIGHT NOW for $165 after rebate. I'm sure if Nintendo was approaching Nvidia or AMD, they could strike a handsome deal on something similar.[/quote]
That's not how it works.
When Nintendo started developing the Wii U, that GPU probably didn't even exist. It's a pretty power-hungry card, and why would you think Nintendo is spending even $100 per unit on a GPU? According to Wired, the upclocked Gamecube chip in the Wii cost ~$30 at launch.
[quote=assmar]But when people have talked about the controller they have pointed out the terribly outdated tech in those as well.[/quote]
Are you talking about Nintendo's decision to go with a single-touch resistive display over a multi-touch capacitive one?
Nintendo decided against multi-touch after finding that capacitive screens "don't work well with old hands," and over concerns about precision, according to an interview with a German publication.
If you're talking about the screen's speculated resolution (853×480), it has a higher pixel density than the iPad and most laptops.
And it's the perfect resolution for playing any game Nintendo releases on its Virtual Console service, from NES to (possibly) Gamecube.
You're joking, right? Using three-year-old hardware is, without a doubt, limiting graphical potential.
That's great, isn't it: Nintendo is aspiring to beat Xbox 360 graphics, all the while it's 2012 and we're still not achieving much.
They should be pushing for the best graphics and effects, not this "econo-we-buy-cheap-shit-so-we-make-extra-profits-and-lol-at-you-bro" approach.
They could slap in a 5000/6000 series Radeon that supports DX11 and it seriously wouldn't cost much more. It would also open up a lot of room in the graphics department down the road.
So the Wii U will have slightly better graphics than the 360 and PS3 - remember, that will be IN 2012. Assuming the console has at least five years of life in it, by 2017 we'll be at a PATHETIC level compared to what's out for PCs, which will hardly cost anything. Seriously, high-end GPUs run for around $150 these days.
Nintendo is taking the cheap route and doing what a typical greedy corporation does best: Expects mega-bucks, for cheap hardware.
I hope MS and Sony release high-end GPUs and stomp the ever-living shit out of Nintendo. We'll see then how successful they are at bringing back the hardcore crowd, which they've already admitted to losing due to the casualness of the Wii. Trust me, it wasn't the lack of HD; it was that your console was an upgraded Gamecube with a gimmick controller.
Um, no and no.
Power management and heat have actually gotten a bit better. You might really want to look into this. Assume they're using a 4850, and then really look over your statement and see if you want to run with that.
At load, a 4000 series GPU would be very comfortable at about 150 watts. A 6000 series GPU or a GTX 580 will draw anywhere between 300 and 400 watts at full load. Try fitting one of those GPUs into a small HTPC with console-like airflow and see where the temps get to. I can assure you it won't be pretty.
The last thing console makers will be doing is investing in large, PC-like cases and then giving the same machine an expensive PSU just to accommodate today's high-end GPU requirements.
Sony and MS will at best be going for entry-level to mid-range parts so that they can start making profits from day one. Let's not even count Nintendo in this, as they aren't going for entry-level or mid-range; they're going for outdated stuff.
http://www.tomshardware.com/reviews/radeon-hd-6870-radeon-hd-6850-barts,2776-22.html
And what, put $1,200 worth of GPU into it? Please, think before you post. It may be a single card in five years, but not next year or the year after that.
To even begin to take advantage of DX11 you need two of the highest-end cards. Yeah, I'm not spending $1,000+ on a console, and Nintendo isn't eating a $500-600 loss per system to make it reasonably priced.
I actually don't think they're the Apple of consoles; seriously, you're putting your foot in your mouth saying this. Yeah, they're using an older video chip, but consider the cost of packing that chip into a smaller profile than it's ever been in before (onboard video chips in consoles are indeed smaller than the graphics-card version of the same chip), not to mention the cooling space constraints it will be under compared to a 4000 series video card in a PC. Regardless, you haven't paid attention at all to what the Wii U's big selling point apparently is. The console is not the main cost factor in that $400 price tag; those controllers equipped with an HD LED-LCD display are the big cost factor on this system, not the graphics hardware. The lower-end video chip was likely used as a means to bring down the total cost of the system, so this is by no means an "Apple-sell-on-looks/name" case here. Imagine the cost of this system if they had slapped a brand-new 6000 series chip in it.
Nintendo's pricing has always been fair compared to MS and Sony. The original Wii launched at a flat $250; compare that to the Xbox 360 releasing at $399, or $499 for a model that came with a hard drive, or to Sony's ridiculous $499 20 GB and $599 60 GB launch models, and you can see a very different picture.
If you ask me, Sony is your "Apple" of consoles, NOT Nintendo.
At least MS & Sony are willing to provide you with hardware that they take an initial loss on. Nintendo wants to milk you for the console, the games, and everything else.
Except that Nintendo doesn't take a loss on their hardware like MS and Sony did.
The Wii was profitable from DAY 1. MS and Sony were losing HUNDREDS per console.
MS & Sony are willing to give me more for my money, knowing they'll make it up on games and such. I'll take the better hardware, please.