ikaz :
Murissokah :
From what I hear, they will both be based on the Jaguar architecture, but they won't use the same processor. It seems to be true that Sony went GDDR5 while MS stuck with DDR3, but added 32MB of ESRAM on the chip. I read one game dev stating that it won't be enough for the Xbox to do 1080p60, while the PS4 may be able to.
Both systems will use 8GB of shared memory. I'm not sure they will use 8 cores, though; I had read the PS4 would be a quad-core APU @ 2.7GHz, though clock frequencies mean very little nowadays with all the dynamic "turbo-boost"-like technologies.
Umm, won't be able to do 1080p@60? Whatever dev said that must not know what he's talking about, since both the Xbox 360 and PS3 can do 1080p@60 if they wanted to. The question is how much eye candy will be enabled. Also, if you look at the interviews with most of the major publishers and gaming companies (EA, Activision, etc.), most are still talking about targeting 1080p@30, since it allows them to put more visual refinement on the screen. Not to mention a large number of console gamers can't tell the difference (not to bash console users, since I game on both PC and console).
I went searching for the original article:
http://www.examiner.com/article/ps4-gddr5-ram-compared-against-xbox-720-ddr3-ram-and-esram
The guy in question is Timothy Lottes, not a game developer as I had thought, but a graphics engineer and the inventor of FXAA, so he certainly knows what he's talking about. When he says it won't be able to handle 1080p60, it should be inferred that he means it won't handle it at an acceptable level of detail. Maybe that was my bad phrasing; after all, even an Intel GPU can do 1080p60 if you're playing solitaire. What he meant is that, with the rumored specs, everything else aside, the bandwidth provided by the PS4's GDDR5 is enough for most game engines to hit 1080p60 with a good level of detail, whereas the rumored specs for the Xbox would not allow that.
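Just to put rough numbers on that bandwidth argument, here's a quick back-of-envelope sketch. The peak bandwidth figures below are my own placeholder assumptions based on the rumors floating around (roughly 176 GB/s for the PS4's GDDR5 and around 68 GB/s for the Xbox's DDR3, ignoring the ESRAM), not anything confirmed or taken from the article:

```python
# Rough estimate of how many bytes of memory traffic each console could
# spend per pixel per frame at 1080p60, given rumored peak bandwidths.
# Bandwidth numbers are placeholder assumptions, not confirmed specs.

PIXELS = 1920 * 1080   # one 1080p frame
FPS = 60

rumored_bw_gb_s = {
    "PS4 GDDR5 (rumored)": 176.0,
    "Xbox DDR3 (rumored, ESRAM not counted)": 68.0,
}

for name, gb_s in rumored_bw_gb_s.items():
    budget = gb_s * 1e9 / (PIXELS * FPS)   # bytes per pixel per frame
    print(f"{name}: ~{budget:.0f} bytes/pixel/frame")
```

If those assumed numbers are anywhere near right, the GDDR5 setup gives roughly 2.5x the per-pixel byte budget per frame, which is exactly the kind of headroom that decides whether an engine targets 1080p60 with a good level of detail. The 32MB of ESRAM would help for render targets that fit in it, but that's extra juggling for developers.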
I have a feeling he was actually needling Microsoft, trying to convince them to go GDDR5 too.