AMD May Supply GPU for PlayStation 4, Say Former Employees
It is no surprise that AMD is in the race to provide the graphics chip (or some other processor) for Sony's next-generation game console.
It appears that AMD may actually have won the contract to supply the chip. Former AMD employees told Forbes that AMD may play a "key role" in this next product. Both AMD and Sony declined to comment on the statement. Sony even denied the existence of a potential PlayStation 4.
Forbes' information, attributed to an industry source of unknown credibility and unknown access to current information, leaves quite a bit of room for interpretation. Does "key role" refer to graphics alone, or to more? In the end, there has been no recent news about continued development of the Cell Broadband Engine, the heart of the PS3. The PS3 Slim received the most recent version of the chip, a 45 nm shrink, in August 2009. IBM, which played a critical role in the chip's development, said in 2009 that it had halted development of a Cell processor with 32 SPUs. The two most recent ISSCC events, historically prominent venues for Sony to reveal new Cell developments, included no news about the processor.
There is not much substance to this information at this time, but any part of Sony's next console would be an attractive deal for AMD. Of course, it is most likely that AMD would be the graphics chip supplier for this game console. Nvidia currently provides its IP for the RSX graphics chip built into the PS3. The GPU ships as a 40 nm version with a 550 MHz clock speed.
Then you clearly do not understand how consoles work... expect a 6870 at best in it
Sony needs to counter the cheap-console idea and make two versions of the PS4: one that costs $600-$700 and has something like a Radeon HD 7950 3GB, and another at $350-$400 with a cheap card for casual gamers. They may make less profit in the first 1-2 years, but as die shrinks come in, the cost will drop and they will make more money on the expensive console. It will also mean their console isn't obsolete in 3-4 years.
I like playing BF3 on ultra on my PC, but I also want to play God of War on ultra someday. We really need 2GB+ of VRAM for eye candy on consoles; that way PC gamers will benefit too.
What do you guys think?
I think in 2016 it will be outdated; 4GB of VRAM is good enough! ;D
I got all hot and bothered about the PS3 when it was coming out, and really wanted one very badly. Then it came out... and I quickly realized that my PC at the time (which was nothing particularly special) was better than the PS3. The PS3 was only 720p (and even now has a rough time with 1080p), had terrible screen-tearing issues and no filtering (AA or otherwise), while my PC did 1200p (16:10 instead of 16:9) and could hack in a little bit of filtering to even out the rough spots. Not to mention the PS3 was 30 fps (or less), while my PC was pushing 30-50 fps on similar games. There is simply no comparing PC gaming with consoles: the image quality is better, the frame rate is better, the resolution is higher, and there are very few games that are console-exclusive (though I do miss playing Final Fantasy).
I am certain that my current GTX570 will beat any next gen console coming out, and I get to enjoy that level of graphics today, instead of waiting a few more years.
Besides, consoles are going to do the same thing they did last time. When the last generation of consoles came out, they were really designed with high-end SD content in mind, and they were pushed to work with the 720p and 1080p standards that had just arrived. By the time the next consoles come out they will be designed for 1080p, just as 2K (slightly wider than 1080p) and 4K (think four 1080p screens tied together) show up on the scene, and consoles will again be pushed beyond their capabilities to run at the new standard.
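As a quick sanity check on the "four 1080p screens" comparison, the pixel arithmetic works out exactly (a minimal sketch, assuming the standard 1920x1080 and 3840x2160 resolutions):

```python
# Pixel-count arithmetic behind the "4K = four 1080p screens" comparison.
def pixels(width, height):
    """Total pixels in a frame of the given resolution."""
    return width * height

p_1080 = pixels(1920, 1080)   # 2,073,600 pixels per 1080p frame
p_4k = pixels(3840, 2160)     # 8,294,400 pixels per 4K UHD frame

print(p_4k // p_1080)  # -> 4: a 4K frame holds exactly four 1080p frames
```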
Then you clearly do not understand how consoles work... expect a 6870 at best in it
Considering such a card would cost $500-600 when it comes out, and the entire console needs to fit in the $500-600 range, this is highly unlikely.
It is much more likely to be in the 6670-to-6870 range, but as a dedicated-use machine it will perform better than a 6670-6870 would in a Windows environment, and it will be paired with a CPU that will match it or slightly bottleneck it.
The PS3's CPU was bottlenecked by the GPU and memory quite a bit.
Their GPUs are great too.
It might mean console ports will run better on AMD graphics cards, giving them an edge over Nvidia's!
As long as GlobalFoundries doesn't manufacture the GPU...
I doubt you will ever see 1200p or 2K resolutions as standard; they offer very little benefit over 1080p.
I hate to be the bearer of bad news, but manufacturers are not going to deviate from fabbing 1080p panels on a large scale until the major networks and the TV industry move the HD standard from 720p to 4K. Unfortunately, movies will not be a major factor in forcing a standards increase; it just won't happen for at least another 10 years. The cost of doing so is too great for them to make it profitable. You are going to need upwards of four life cycles of 1080p televisions (i.e., people buying their third or fourth main-screen replacement) before the cost of producing 4K screens becomes an acceptable expense for most households.
As it stands today, $10-15k for a 42" 4K screen is more unreasonable than the SD plasmas that were selling for $17-20k in 1999-2000, especially since you can get either a 56" 1080p plasma for $1,300 or a 50" 1080p LED 3D 120+ Hz set for $1,600-2,000, and that's the bottom of the high-end TVs on the market today.
Most people wouldn't even spend that much on a TV now. The standard is going to stay at 1080p for at least the next 8 years before 4K even starts hinting at saturation.
It would make the most sense for the PS5 to push 4K resolution along with either another physical medium or, praise Jesus, an upgrade of the US network infrastructure to accommodate the 100 Mb/s+ speeds that would be required to deliver 4K media to most households in an acceptable amount of time.
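A back-of-the-envelope calculation shows why 100 Mb/s-class links matter here (a rough sketch; the 100 GB file size is a hypothetical ballpark for a 4K feature film, not a figure from the article):

```python
# Rough download time for a large 4K movie file over a given link speed.
def download_hours(file_gb, link_mbps):
    """Hours to transfer file_gb gigabytes over a link_mbps megabit/s link."""
    bits = file_gb * 8e9              # 1 GB = 8e9 bits
    seconds = bits / (link_mbps * 1e6)
    return seconds / 3600

# A hypothetical 100 GB 4K film on a 100 Mb/s line:
print(round(download_hours(100, 100), 1))  # -> 2.2 hours
```

At typical 2012-era US speeds of 10-20 Mb/s, the same transfer stretches to roughly half a day, which is the "acceptable amount of time" problem in a nutshell.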
God no. The one advantage of consoles is that you can use low-level assembly to squeeze every single ounce of power out of a machine, because you have a single HW specification. If you use two different HW specs, you HAVE to code against higher-level APIs, and performance will fall off a cliff.
Hence why the 360's GPU is powerful enough: it's good enough to drive 1080p with 2x/4x AA, which is all next-gen consoles have to push. Hopefully by the time the following wave comes out, we'll have moved to ray tracing instead of rasterization, giving us a graphical boost.
The PS3 has a weaker GPU than the 360. The reason it cost more was the Blu-ray drive, the CPU, and better connectivity.
Here's my wish list for PS4.
1. Use the highest-end CPU/GPU available at the time of the release date.
2. Play PS1, PS2, and PS3 games with high-quality antialiasing (which can be done on PC using emulators).
3. Let users mod it as much as they want to!
4. Holodisk?
Yes, and the reason the PS3 can get by with the weaker GPU is that the Cell handles physics instead of the GPU.
The 360's GPU is better partly because it has more RAM to work with: the 360's CPU and GPU share a unified memory pool, while the PS3 splits its memory between the CPU and GPU.