Console Wars: PS4 and Xbox One vs. PC

CmdrJeffSinclair

Reputable
Aug 29, 2014
785
1
5,010
I've been keeping up with the headlines and news on video game sites like GameSpot, and it seems to me that consoles in general are really suffering hardware-wise. Even people who don't own nice gaming computers may be unconsciously perceiving the end of consoles as a whole.

The more I read, the more I see how surprisingly underpowered and uncompetitive each console is, even compared to some of the oldest PCs and laptops still in use. My seven-year-old Alienware M17 is only a few hairs less powerful than either console, and I've been complaining about how !@#$ty it is for a very long time.

Is it just me, or did the consoles screw themselves over by trying to hit a $400 price point? Perhaps a $500 mark would have made them more reasonable.

1080p is great, but 30FPS is pretty difficult to accept. In the end, the difference between 1080p/30FPS and 720p/60FPS is minor. Better graphics at 720p upscaled to 1080p would, I think, be better all around than pure 1080p with two-year-old graphics.

What does everyone here think? I love pure 1080p, don't get me wrong. I'm a videophile and I see the difference. But if $400 was the mark the console makers didn't want to budge from, then I'd imagine the best graphics possible at 720p would be more beneficial than bad graphics rendered with perfect clarity, so you can see just how bad they really are. Rendering at 720p and upscaling to 1080p instead of running full 1080p would save some memory and bandwidth for better graphics while still offering a taste of the 1080p resolution arena, perhaps?
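To put rough numbers on that trade-off, here's a quick back-of-the-envelope comparison of the two targets (assuming a single 32-bit color buffer; real engines use several intermediate buffers, so treat these as lower bounds):

```python
# Back-of-the-envelope comparison of 720p/60 vs 1080p/30.
# Assumes one 32-bit (4 bytes/pixel) framebuffer; real renderers use
# multiple buffers, so these figures are lower bounds.

def stats(name, width, height, fps):
    px = width * height
    framebuffer_mb = px * 4 / (1024 ** 2)  # one 32-bit color buffer
    throughput = px * fps                  # pixels delivered per second
    print(f"{name}: {px:,} px/frame, "
          f"{framebuffer_mb:.1f} MB framebuffer, "
          f"{throughput / 1e6:.0f} Mpx/s")
    return throughput

t720 = stats("720p/60", 1280, 720, 60)
t1080 = stats("1080p/30", 1920, 1080, 30)

# 1080p has 2.25x the pixels per frame, but at half the frame rate the
# two targets need similar raw pixel throughput (1080p/30 is ~12.5% more).
print(f"throughput ratio: {t1080 / t720:.3f}")
```

The point the numbers make: the raw pixel throughput of the two modes is nearly identical, so the choice is mostly about where you spend the GPU budget, not how much of it you spend.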

I see the stupid battles over the PS4 running COD: AW at 1080p/60 but with minor frame-rate issues, vs. the XBONE running at some odd resolution at 60FPS but with smoother playback via a weird dynamic upscaling. It is just wayyyyyy too soon to be seeing such issues in these consoles. Right out of the gate they are struggling to hit levels of graphics that I've had for seven years. Something tells me that no matter how much Sony and Microsoft optimize their consoles, the PS4 and XBONE are pretty screwed, and they have only themselves to blame!!
 

CmdrJeffSinclair

Aug 29, 2014

That was my first thought too. They wanted to stick with that $400 tradition from previous generations, but technology has not progressed far enough beyond the PS3 and Xbox 360 to pull off a new generation at that price. Perhaps they should have waited another year or two if they wanted that $400 mark so badly. In the end the comparison between PC and console will always be ugly, so screw it all and just wait. Next gen is definitely not next gen. More like negative-three gen.
 

CmdrJeffSinclair

Aug 29, 2014

Very, very, very true. Somehow I think the tech in the PS4/XBONE is only a couple of years ahead of the PS3 and Xbox 360. It's been about six years since either of those launched, and the new consoles' graphics are at LEAST four years out of date.

This probably means that budget gaming rigs like the Steam Machines may wipe away consoles. Also, anything from Alienware is steaming pooooooo. Just had to throw that in there.
 
Things are only going to get worse as 4K TVs start to proliferate and the consoles fall even further behind: they simply can't play any game at 4K resolution, and I doubt 720p upscaled to nine times the pixel count is going to look good. 2013 does seem to have been an awkward time to launch new consoles. 4K was starting to show up and may well be the new standard in a couple of years, but the GPU horsepower needed to run games at 4K was, and still is, prohibitively expensive for a console. If Sony and Microsoft wanted another ten-year generation, they probably needed to wait another two to three years for GPU technology to advance enough to cope with 4K.
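For concreteness, the resolution gap is easy to quantify. Assuming the standard 3840x2160 UHD panel, 4K has exactly nine times the pixels of 720p and four times the pixels of 1080p:

```python
# Pixel counts for the resolutions under discussion. A 4K UHD panel has
# nine times the pixels of 720p and four times the pixels of 1080p, which
# is why upscaling a 720p render to 4K is such a stretch.
px_720p  = 1280 * 720    # 921,600
px_1080p = 1920 * 1080   # 2,073,600
px_4k    = 3840 * 2160   # 8,294,400

print(px_4k / px_720p)   # 9.0
print(px_4k / px_1080p)  # 4.0
```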

As it stands right now, we might see a rather short generation, with a PS5 and an Xbox-whatever-stupid-name-MS-can-come-up-with in four or five years, as the current-generation consoles are struggling to even hit 1080p/30FPS in some cases, and 4K gaming is completely out of the question on those machines. Developers are also likely to start complaining within the next couple of years about hardware limitations negatively affecting their ability to make games, much like what started happening around 2010/2011 with the seventh-generation consoles.
 
Solution

CmdrJeffSinclair

Aug 29, 2014

You're right in some respects, but 4K can't possibly break into mainstream media for at least another five years. Higher-resolution TVs first need higher refresh rates and faster pixel response: by my estimate, between 40-60% of the image's clarity is lost during even moderately fast scenes. That's why 4K is considered a gimmick at this point in time. While people think they have the best tech, TV makers aren't really doing anything new; they're still scratching their heads over how the hell to improve pixel response times and panel refresh rates enough to make that 4K worth the dough.

No matter how anyone shakes it, people expected 1080p/60 from these consoles, and the consoles are simply never going to provide that without dumbing down the already lame graphics. So instead of making a console $100 more expensive to meet our expectations, millions of people have wasted $400 on tech that is archaic by computer standards.

By some measures the PS2 was over 100x more powerful than the PS1 (33.9MHz vs 294.9MHz in clock speed alone, plus dedicated vector units), and the PS3 was roughly 35x more powerful than the PS2 in floating-point terms (6.2 vs 218 GFLOPS).

The only real gain the PS4 truly has in spades is its 8GB of high-speed unified GDDR5 RAM, shared between the CPU and GPU, plus a reasonably well-optimized CPU; but only about 5.5GB of that can be used for gaming, with the rest on OS reserve.

Here is the absolute breakdown:
PLAYSTATION 1
CPU: 32-bit RISC (33.9MHz)
RAM: 2MB, 1MB Video RAM
Graphics: 3D Geometry Engine, with 2D rotation, scaling, transparency and fading and 3D texture mapping and shading
Colors: 16.7 million
Sprites: 4,000
Polygons: 360,000 per second
Resolution: 640x480
Sound: 16-bit 24 channel PCM

PLAYSTATION 2
CPU: 128-bit "Emotion Engine" @294.912MHz (16KB instruction cache, 8KB data cache, 16KB scratchpad)
RAM: 32MB Direct Rambus DRAM, 3.2GB/s (roughly equivalent to 800MHz DDR3 RAM)
Co-Processor for FPU @6.2 GFLOPS
GPU: 4MB Graphics Synthesizer™ @147.456MHz
(DRAM Bus bandwidth 48GB/s, 2560 Bits wide interface)
Max Resolution: 1280 x 1024

PLAYSTATION 3
CPU: Cell Broadband Engine @3.2GHz: 64-bit PowerPC-based core (PPE) with 512KB L2 cache, plus 8 SPEs
7 x 256KB SRAM local store (one per usable SPE, for data crunching)
1 of the 8 SPEs disabled for manufacturing redundancy
total floating point performance: 218 GFLOPS
GPU: 256MB NVIDIA RSX @550MHz w/ GDDR3 VRAM @700MHz (VRAM: 22.4GB/s)
(G70/GeForce 7800 GTX)
1.8 TFLOPS floating point performance
RAM: 256MB XDR Main RAM @3.2GHz (25.6GB/s)
System Floating Point Performance: 2 TFLOPS

PLAYSTATION 4
CPU: AMD x86-64 "Jaguar" 8-core @1.6GHz (two 4-core modules, each with 2MB shared L2 cache; out-of-order execution)
GPU: Custom integrated AMD Radeon (GCN architecture, 18 CUs, 1.84 TFLOPS), roughly comparable to a desktop Radeon HD 7870
RAM: 8GB unified GDDR5 (~3.5GB reserved for the OS, ~5.5GB max for games)
HDD: 500GB 5400RPM
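The claim that this generation's jump is unusually small can be sanity-checked against the peak floating-point figures in the lists above. These are marketing peaks rather than benchmarks, so take the ratios as rough orders of magnitude:

```python
# Rough generational leaps implied by the spec lists above, using the
# peak floating-point figures quoted there (marketing numbers, so treat
# these as order-of-magnitude comparisons rather than real benchmarks).
ps2_gflops = 6.2      # Emotion Engine FPU peak, per the list above
ps3_gflops = 218.0    # Cell CPU peak
ps4_gflops = 1840.0   # 1.84 TFLOPS GCN GPU

print(f"PS2 -> PS3: {ps3_gflops / ps2_gflops:.0f}x")   # ~35x
print(f"PS3 -> PS4: {ps4_gflops / ps3_gflops:.1f}x")   # ~8.4x
```

Even on paper, the PS3-to-PS4 jump is several times smaller than the PS2-to-PS3 one, which is the heart of the thread's complaint.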