Here's one for you: why do PC specs keep growing when console specs don't?

iZver

Distinguished
Oct 1, 2010
48
0
18,530
So, I don't have a console, so correct me if I'm wrong, but the first Xbox 360 ever released can play the same titles that the newest one can. I don't even know if the performance is any different - you'd have to tell me.

Meanwhile, PC performance has grown tenfold since 2005. Yet an Xbox 360 from 2005 can run Far Cry 3, just as a €2000 PC from today can.

Before you frantically begin explaining to me how PC graphics are over 9000 times better: I know this. It's just that when viewing Far Cry 3, the difference looks small at first glance. Okay, the resolution is lower, the textures are lower res, and other details are missing, but generally speaking the difference is not worth 8 years of progress.
Here's a video: www.youtube.com/watch?v=mz8paB0KOHQ

I know the developers are using some serious ghetto hacks on the Xbox 360 to pull this off - for example, Far Cry 3's grass is completely whack even on PC because of that. Still, I ask: how can you make Far Cry 3 look comparable on 8-year-old mediocre hardware and on a brand-new high-end PC?

I also know that consoles have been seriously slowing down the graphics development of games, but that is not the issue in this topic. What I want to discuss with you is how it is even possible for an 8-year-old console to run games like Far Cry 3, and, More Importantly!, why have the PC requirements grown while the Xbox remains the same? Are the devs just lazy, making poorly optimised ports, or are the graphics on PCs really using up all the power of today's gaming computers? If they are, why are the differences so minute? (Yes, tessellation is nice; yes, AA is nice; yes, 8K textures are nice. But compare 1997 games to 2005 games and you will see what a proper world of difference looks like.)

The difference in specs is also partly due to the hardware industry lobby. This cannot be denied: Nvidia, Intel and AMD press the gaming industry to raise the specs. It is an important factor to note. So are we PC gamers getting fucked by everyone, the hardware makers and the game industry alike? Are we just the suckers everyone can abuse?

I look forward to reading your reply!

Cheers!

I will add some comparison shots:
skyrim:
http://4.bp.blogspot.com/-9ivkuf5iHzM/UDnDGL0-fQI/AAAAAAAAAXU/7OxsESXO2gU/s1600/comapreeee.jpg

crysis2:
http://bossfightgamingblog.files.wordpress.com/2011/03/crysis_2_pc_xbox_360_vun9r.jpg

http://fileframe.sector.sk/files/novinky/2011-3-2-20-8-31/pict-370.jpg

crysis3:
http://www.youtube.com/watch?feature=player_embedded&v=hGRI7_jPL2U
YOU ARE NEVER going to convince me the difference is worth the Crysis PC specs! Or am I just stupid and viewing a comparison of prerendered videos? xD
 

this6guy

Honorable
Feb 12, 2013
74
0
10,660
2 things:
--> The same hardware spec across all consoles means extreme optimization, whereas on PC you have to make sure a wide variety of hardware will work fine
--> Consoles get more direct access to the hardware without having to go through an OS (Windows); the entire console's hardware is dedicated to games, while the PC also has to run Windows, AV, background processes, etc.
 

Ponyface

Honorable
Jan 23, 2013
217
0
10,760
I agree with this6guy: optimization is the problem for PC games. Console games are tailored to take advantage of every drop of power from a console, whereas with a PC there are too many different hardware configurations.

The PS4 has specs similar to a mid-to-high-end gaming system, and look at what it can do! Our PCs are capable of so much more... It's such a pity that the consoles dictate when PC gamers can have a "next gen" experience; we could have been experiencing "next gen" years ago.
 

corvak

Honorable
Jan 31, 2013
58
0
10,630
1. The console runs at 720p, where the PC runs at 1080p or higher. PC gamers also aim for 60fps, as opposed to 30.

2. A PC running Crysis 3 at 720p and 30fps, with the graphical details turned off - the same way consoles run it - does not cost $2000. You could get that from a $500 machine.

3. The savings for PC come from software, not hardware. When you consider sales on places like Steam, you're saving 25% off an often lower base price per game, so the extra $300-400 spent on the PC is reclaimed in savings by the time you want to upgrade again. PC is also less likely to require a subscription to play a game online.

4. The other major advantage of PC is competition in the digital distribution space. This leads to lower prices, more sales, and the ability for developers to go DRM-free, as opposed to a monopoly from someone like Microsoft or Sony.
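To put point 1 in perspective, here's a rough back-of-the-envelope sketch. It assumes the exact figures above (720p/30fps for console, 1080p/60fps for PC) and counts nothing but raw pixel throughput, ignoring effects, draw distance, and AA:

```python
# Rough pixel-throughput comparison: console target vs. common PC target.
# Assumed figures: 1280x720 @ 30fps (console) vs. 1920x1080 @ 60fps (PC).

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels the GPU must fill per second at a given resolution and frame rate."""
    return width * height * fps

console = pixels_per_second(1280, 720, 30)   # 27,648,000
pc = pixels_per_second(1920, 1080, 60)       # 124,416,000

ratio = pc / console
print(f"The PC target pushes {ratio:.1f}x the pixels per second")  # 4.5x
```

By this crude measure alone, the common PC target is 4.5 times the workload before any of the extra PC-only settings even enter the picture.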

In closing, if you just prefer the idea of a console - being able to stick a disc in and play - and don't mind the online platform on your console of choice, by all means stick with consoles. I have both, and each has strengths and weaknesses.
 

this6guy

Honorable
Feb 12, 2013
74
0
10,660


Isn't there a setting for FOV in the options menu? I've never played it on a console so I'm just asking.
 
I don't think so; it was my brother who told me how bad the FOV is on his mate's Xbox, so I cannot check. Also, if you search Google there are loads of sites that say FC3 is a great demonstration of how the PC is far better than current consoles. They also show the consoles dip to 20fps and are like playing on the PC's lowest settings.
 

iplikator3333

Honorable
Jan 18, 2013
169
0
10,690


You don't have to run an AV on your PC
 

iZver

Distinguished
Oct 1, 2010
48
0
18,530
Yes, but I feel like you are all missing the point here. Sure, the consoles are more optimised; I wrote as much in the first post.

The thing that baffles me is that the PC requirements for playing games keep growing, while the requirements for the Xbox remain the same. Yet the graphics are (approximately) improving linearly on both systems.
 

iplikator3333

Honorable
Jan 18, 2013
169
0
10,690


They aren't improving much on console. Consoles often use cheap tricks to make the graphics seem better than they actually are, and developers also get much more direct access to the hardware without an OS like Windows in the way.
 

iZver

Distinguished
Oct 1, 2010
48
0
18,530



I wrote as much. Yes, they use a lot of illusions and tricks; Xbox 360 game makers are more illusionists than game programmers :D. The amount of prerendered content is staggering.

But anyway, how can you say graphics aren't improving? Just look at the difference between skyrim and oblivion.
http://gamervets.com/wp-content/uploads/2012/01/Argonian-Oblivion-vs-Skyrim.bmp
http://livingwithanerd.com/wp-content/uploads/2011/11/oblivion-vs-skyrim-khajit.jpg
http://www.fmpatch.com/wp-content/uploads/2011/12/oblivionvsskyrim.jpg

Or remember GTASA
http://1.bp.blogspot.com/-3xeDPeGfsDo/Tjz2WBromQI/AAAAAAAAAvg/YzwKV0mL1wU/s1600/gta_sanandreas_screenshot.jpg

Surely you can't claim graphics haven't improved much?
 

this6guy

Honorable
Feb 12, 2013
74
0
10,660


You're missing the point: console games are optimized as hell, while PC games rely on the brute force of hardware more than anything else. It's just the reality. I don't think you fully grasp just how much difference optimization for a single fixed hardware spec can make...
 

iZver

Distinguished
Oct 1, 2010
48
0
18,530


Yes, they are optimised, and I get why the PC requirements are greater. BUT!!! And this is the important bit right here: the Xbox hardware has stayed the same for a long time, while PCs keep getting better. So maybe asking a little differently will help: how have the graphics on consoles advanced that much on the same hardware?
 

this6guy

Honorable
Feb 12, 2013
74
0
10,660


Better hardware isn't the sole criterion for advancement in graphics, especially since 90% of graphics work is software based. SOFTWARE, not hardware. Sure, DX11 needs DX11 hardware, but let's not forget that consoles are using DX9; if you turn Far Cry 3 down to DX9 on your PC, the result won't be staggeringly ugly. Developers grow more used to the hardware the longer it's on the market, and the more familiar they get with it, the more efficiently they can use it. In closing, here's a thought for you: the Xbox/PS3 having the same hardware for a long time is exactly what let developers extract that level of graphics from them. If there were a new Xbox every year, developers would never fully utilize the hardware.
 

iplikator3333

Honorable
Jan 18, 2013
169
0
10,690


The PS3 and Wii U don't use DX; that's for Microsoft platforms only.
 

jesot

Distinguished
Dec 19, 2008
260
0
18,790
Software really hasn't forced hardware to advance very far recently. Most games run on a few different engines and have the same basic requirements. I have only recently started putting games on medium settings, and I haven't upgraded in 5 years.

When Epic started putting out UE4 demos, it took three Keplers (guessing 660s or 670s) in SLI to run the demo. When they put out the demo last year, they were easily down to a single Kepler and said that they weren't even close to done optimizing it.

With more and more console titles also being released on PC, I'm making the switch to PC in my living room, too.
 


I don't get what is so difficult to understand.

First off, the PC version of Far Cry 3 is not the same as the console version. The PC version has DX11 features and many more visual options not available on the console, and in many games, maybe even Far Cry 3, the PC has better AI and physics as well. On top of that, PCs play at higher FPS.

PCs continue to improve, so developers are able to add new visual improvements and raise the requirements as well. If they did not, those with newer computers would complain. It is also a means to get people to upgrade their PCs.

So while the console can play Far Cry 3, it is playing a dumbed-down version of Far Cry 3.

Lastly, the more visual improvements we add, the higher the hardware requirements climb and the smaller each improvement looks. In the early days of gaming, each advancement was huge visually; today those advancements are harder to recognize and require more and more power to achieve, but they are what keeps everything moving forward.

You seem to recognize that the PC versions have better graphics, so I don't understand why you can't comprehend all this.
 

michaelmk86

Distinguished
Dec 9, 2008
647
1
19,015


Xbox 360: runs FC3 at 1274x702, low settings, barely 30fps.
€2000 PC: runs FC3 at 2560x1600, ultra settings, 60fps.
Basically, all you need to run FC3 at console settings is a €30-40 GPU.
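Those two setups can be compared directly. A quick sketch, taking the figures quoted above at face value (including the sub-720p 1274x702 number):

```python
# Compare the rendering workload implied by the two setups above.
# Assumed figures: Xbox 360 at 1274x702 @ 30fps, PC at 2560x1600 @ 60fps.

xbox_pixels_per_frame = 1274 * 702    # 894,348 - slightly below a full 720p frame
hd720_pixels_per_frame = 1280 * 720   # 921,600

xbox_throughput = xbox_pixels_per_frame * 30
pc_throughput = 2560 * 1600 * 60

print(f"The 360 renders {xbox_pixels_per_frame / hd720_pixels_per_frame:.0%} of a 720p frame")
print(f"The PC setup pushes {pc_throughput / xbox_throughput:.1f}x the pixels per second")  # 9.2x
```

So on raw fill rate alone, before ultra settings are even counted, the quoted PC setup is doing roughly nine times the work per second.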
 

biffordm

Honorable
Mar 21, 2013
4
0
10,510
I think it really comes down to monetizing through different groups. If a developer only makes a game for PC, then yes, it can have the most amazing graphics, performance, etc.; however, they shut themselves off from a huge number of people who are not technically savvy, do not wish to build out a gaming PC, and so forth.

A 14-year-old kid is more apt to get a PS3 or a 360 for $199 and buy a ton of games than to invest $1,000+ in a PC that plays essentially the same games.

It all comes down to the return on investment: Far Cry 3 for PC alone would not make nearly as much as Far Cry 3 distributed on all major platforms.
 

ShindoSensei

Honorable
Mar 6, 2013
147
0
10,690
There are two ways of looking at it.

First off, PCs are very easy to customize. Need a new GPU? Spend 300-1000 bucks and you have a huge list of games to play. Want more than one monitor? You have now unleashed NVIDIA Surround. For the convenience and consistency of the customer, console makers can't just release hardware for the person to add (unless they plan on easy upgrades in the future, i.e. the N64 Expansion Pak). A PC is almost universal in every aspect: you can do anything you want without anyone really caring whether you can or can't do it. Microsoft makes the OS, other companies make the hardware, and sometimes yet other companies assemble it for you and put their label on it.

With a console, that last step is done for you, so you can just take it out of the box, put a game in, and be sure it will play.

Now for part two.

A console isn't a PC (surprise!). A console, for the most part, was engineered to run one kind of software: games. Games for consoles are written for that console and that console only. A PC is a general computational device; it can run any software, and the software can do anything (in theory). For a computer to run the latest games well, you need a lot of horsepower to run the game along with everything else you expect your computer to do. A console is just expected to play games, and thus doesn't require that much power. Also, since the developer has control over almost all the hardware in a given console, they can utilize it to its full extent. You have gorgeous-looking games like Halo 4 running on the same hardware that was barely running Perfect Dark Zero. The developer has full control of something that was built for exactly what the developer wanted.

With that said, games can only do so much with a console. The leap between PlayStation generations is much larger than the leap between GTX cards one (or even two) generations apart. I hope this answers your question, and happy gaming! (You should get into console gaming; there are some great exclusives. Personally I've been a PC gamer all my life, as I was privileged enough to always have a good PC, but I made sure to always get the latest consoles just so I could play something fantastic. The list is endless.)
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
The main, correct answer seems to be overlooked by many people in this thread.

PC gamers tend not to aim for such low visual quality and FPS. The Xbox 360 and PS3 play games at 720p and 30fps, as already stated in this thread.

That's the majority of the "optimization" you're seeing in action. Sure, they might gain a handful of FPS from unified hardware, but the major kicker is the settings at which console games are running.

Look at how much a PS3 cost at launch, then look at how much it would cost for a PC to run a game at 1280x720 @ 30fps on DX9, and not even at the highest settings for things like shadows/physics/textures/etc.

All it takes to run Crysis 2 on High, DX9, at 1280x720 on a PC is a Core 2 Duo CPU, 2GB of RAM and a 7900 or 8800GT GPU (all of which were mid-grade components around the time the Xbox 360 launched).

The main thing is that 1080p has long been the standard now, and most people probably have a system that's actually stronger than both major consoles combined, yet they're trying to run 1080p/high settings and get frustrated when it plays poorly. If they simply set their game to the same graphics quality and resolution as the consoles, they'd see that almost any low-to-mid-grade gaming PC built over the past 5 to 7 years can do the job just fine.

TL;DR: optimization isn't the major factor; it's how low the settings are that the consoles are actually running.