Why Can't Next Gen Consoles Do HD?

How To Be A Loser

Reputable
Aug 6, 2014
I used my Intel HD 4000 integrated GPU for 2 years, playing Battlefield 3 on the lowest graphics settings at HD resolution, before buying a dedicated GPU (a GTX 660 OC).

I was deciding between a $500 GPU for my 16GB RAM, i7-3770K PC or a "next-gen console".

I was going to go console, but now I see that consoles have certain games that struggle to do HD and many games that run at a visibly inferior 30 fps, so I have decided against it.

My question is simple: how can the PS4 claim to have an HD 7850 in it? The 7850 gets over 30 fps on max settings in BF3, and the "next-gen" consoles' graphics are NOWHERE near what BF3 and BF4 look like maxed out. The optimisation doesn't seem very good either, certainly no more than about 10% over a PC running the latest drivers.

So my question is: has Sony lied about the specs of the machine? A graphics card that can play BF3 on max in HD at 30 fps can CERTAINLY play any console game at console quality (which is about medium graphics on a PC) in HD?! Ryse: Son of Rome and all the other console games look like poor-quality rail shooters compared even with BF3 or BF4 (an HD 7850 will even play Crysis 3 at decent settings).

My concern is that people are buying off specs. 8 cores at 1.6-1.8GHz doesn't mean better than a Core2Duo or i3; a 2GB GPU doesn't mean as good as a 7850 with superior cooling; soldered RAM is slower than socketed RAM even if it is GDDR5 (otherwise PC RAM would be attached to the board, trust me); and "optimisation" doesn't mean better than PC drivers.
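
To put rough numbers on the core-count point, here's a back-of-the-envelope sketch; the clock speeds and relative-IPC figures below are illustrative assumptions, not benchmarks:

# Crude throughput estimate: cores x clock x relative IPC per core.
# The relative-IPC values are rough assumptions for illustration only.
def rough_throughput(cores, clock_ghz, relative_ipc):
    return cores * clock_ghz * relative_ipc

console_8_core = rough_throughput(8, 1.6, 1.0)  # 8 slow tablet-class cores
desktop_i3     = rough_throughput(2, 3.4, 2.0)  # 2 fast desktop cores

print(console_8_core)  # 12.8 "units"
print(desktop_i3)      # 13.6 "units" -- 2 fast cores edge out 8 slow ones

And that crude estimate even assumes a game can keep all 8 cores busy, which most can't.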

I am not saying you could build a PC with the same specs for less than $50-100 more than a PS4 (although you would save in the long run on game costs); I am just asking if console sellers are selling on hype and not merit.
 

Syntax42

Reputable
Aug 4, 2014
Next-gen consoles are using the equivalent of a laptop processor with AMD integrated graphics. High resolution and high detail require more performance than integrated graphics can deliver. The solution is either to lower the resolution to 720p or to lower the detail settings. Lowering the detail settings results in people saying the "graphics suck", so they reduce the number of pixels instead.
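
To see why dropping resolution is the easy lever, compare raw pixel counts; this simple arithmetic assumes the workload scales with pixels shaded, which is only roughly true in practice:

# Pixels per frame at common resolutions.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p  = 1600 * 900    # 1,440,000
pixels_720p  = 1280 * 720    #   921,600

print(pixels_1080p / pixels_720p)  # 2.25 -- 1080p shades 2.25x the pixels of 720p
print(pixels_1080p / pixels_900p)  # 1.44 -- why 900p is a popular compromise

So a GPU that can barely hold 30 fps at 720p would need well over twice the shading power to render the same frame at 1080p.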
 
Solution

The_Freeman

Reputable
Apr 3, 2014


Yes, they are!

Hope this helps :)

The_Freeman
 

Vergilangleos

Honorable
Apr 22, 2014
When you compare the PS3's and Xbox 360's first-year games with their later ones, you can see consoles need more time to reach their peak; I can say these consoles will do MUCH better in the future.
The Xbox 360 and PS3 only have 512MB of RAM on their spec sheets, but their performance is far better than PCs with 2GB of RAM or even more! It's the same story with the PS4 and Xbox One.
 
They're mostly selling on hype, tbh.

The 8-core CPUs they're using are meant for tablets and tiny netbooks, and really get outperformed by even a Pentium G3258.
The 'graphics cards' the consoles use are also just APUs.

I've mostly been testing Warframe because I don't have a lot of multi-platform games, but at least in Warframe my rig (specs in signature) is ahead of the PS4. Higher framerates at ultra settings with PhysX enabled. And the X1 is weaker still.

And a PC comparable to mine in raw performance only has to cost $600-$700. Not bad, when you consider that practically anyone who has a console for gaming needs a PC for the internet and general use anyway. It actually works out cheaper to build a gaming PC than to get a gaming console + an XBL/PS Plus subscription + a general-use PC.
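
Rough math, using assumed 2014-ish prices (placeholders, not quotes -- adjust for your region):

# Illustrative prices -- all assumptions.
console    = 400         # PS4 at launch
online_sub = 50 * 5      # ~$50/year for PS Plus / XBL over 5 years
basic_pc   = 300         # the general-use PC you'd need anyway
gaming_pc  = 650         # one rig that covers gaming and general use

print(console + online_sub + basic_pc)  # 950 for the console route
print(gaming_pc)                        # 650 for the single-PC route

And that's before the cheaper PC games widen the gap further.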

Of course, the PS4 and X1 are still big upgrades over the PS3/360, but... There's a reason why all these console exclusives that are supposed to be selling the systems keep flopping or getting delayed. Developers are culling graphics, backpedaling, delaying releases for optimization, and still getting sub-par performance and worse graphics. The next-gen systems aren't nearly as strong as they were expected to be, and developers must be frantic about it.

Ryse got graphically downgraded and got trashed in the reviews.
Killzone: Shadow Fall runs at 1080p/30 fps in single player or 720p/60 fps in multiplayer, and doesn't look nearly as good as the pre-launch footage.
The Order: 1886 is looking worse by the day, and they've already cut it to 30 fps.
Watch Dogs is at 900p/30 fps on the PS4, and still not quite at ultra settings.
Titanfall on the X1 runs at a paltry 792p, as low as some PS3 games.
Warframe on the PS4 is aiming for 1080p/60 fps, but they had to disable the particles and it still drops to ~25 fps at times.
Deep Down gets more graphically downgraded every time they show a video, to the point where people suspect the earliest footage was actually running on a PC.

And in the years to come, we can only expect framerates and resolutions to fall lower and lower. Even at launch, the PS3 managed a minimum of 720p/30 fps, and a few games hit 720p/60 fps or 1080p/30 fps. By the end of the PS3's life cycle, games were usually around 500p at an unstable 30 fps, on low settings.

By the time the PS4 is 5-6 years old, will it be running games at 720p/30 fps on low-medium settings? Probably.
I'd expect that lifespan out of a PC with no upgrades, really.
 

mlga91

Admirable
The PS4 indeed has something like an HD 7850 (they have more or less the same number of shaders), but it's a cut-down version, so it doesn't have the same raw power as a desktop HD 7850. Now you wonder: how can a cut-down HD 7850 run those games at that resolution at 60 fps? That's the magic of console optimization: consoles don't have the driver layer, because there is only one GPU to target.

In the long run, you can get a good gaming PC for something like $500-600, and you can easily upgrade it when you need to.