PC games are optimised as well as their console equivalents

tomsjbhard432

Distinguished
Oct 4, 2011
7
0
18,510
Hi all,

I'm raising this discussion because:

1: I keep seeing console owners saying, "Why do we have the same game as the PC owners, when we have older tech?".
2: I keep seeing PC owners saying, "We all have different PC setups so the developers can't optimise for us".

We've all seen these kinds of remarks across the net.

There are exceptions to the rule, but mostly I think the games are optimised to the same degree.

I've got an old PC with an e6300 and an ATI x1950pro in it. It plays L4D2, FEAR 2, Portal 2, Bioshock 1 and 2, Unreal Tournament 3, Oblivion, Dead Space 2 and a few others with performance mostly between 30-60 fps at max settings (or near it) at 720p. The consoles get around 30fps at mostly 720p (not quite so good in all games). I tried the resolution at 1680x1050 and the card could still handle most things at 30fps, which is of course a better resolution and probably higher IQ settings.

In June I bought a budget laptop with an AMD 3500M APU with a 6620G. This is definitely weaker than the x1950pro, but it can play games well at 1366x768 (the laptop's native resolution). I've been playing Gears of War at around 3/4 settings at mostly 60fps, and the same with UT3, L4D2 and Portal 2. I tried Crysis 2 on the laptop and it could just manage a resolution of 1024x768 at the lowest in-game setting (High) and get 15-35fps. It looked awful, but that is what it's like on the PS3 (same settings and a bad frame rate). And this laptop has a weaker GPU than the Xbox or PS3.

The x1950pro and the 6620g are not any more powerful than the consoles' GPUs but seem to perform at least as well if I lower some settings and play at about 720p, in some cases much better as I get 60fps.

There is the question of higher amounts of RAM. I only have 2GB in my XP PC running the x1950pro and the e6300, which is of course a lot more than the consoles have, but RAM would not be the limiting factor in building a PC as it is quite cheap. What matters more is of course the GPU and CPU.

My laptop GPU is weaker, but the CPU element is probably stronger, and it seems to perform better than it should.
The x1950pro is similar to the x1800xt, which is similar to the Xbox GPU, and it performs in a similar way to the consoles, sometimes better.

Now, some will say, "Who cares?" But lots of PC owners seem to think that games are optimised to run better on the consoles, and I wanted to share what my experience of this is.

Also, I know the x1950pro isn't going to run BF3 well at all, but neither do the consoles!

Has anyone else had the same experience of running old hardware and getting similar results to the consoles?

Thanks, John.
 

MajinCry

Distinguished
Dec 8, 2011
958
0
19,010
I ran my Pentium D @ 2.8GHz with an 8600GT DDR2 and got abysmal framerates with Skyrim: <10 fps. Though I got amazing FPS with Call of Juarez; even the heaviest parts of the game didn't faze it.
Then I got an e6700 @ 3.2GHz with an AMD Radeon HD 6670 DDR3. It didn't run CoJ as well, but Skyrim improved to a max of 30 fps (even with half the particles, I still didn't pass 30 fps).
Now I've upgraded to an AMD Phenom II X4 965 BE and still have my 6670. Framerates are a steady 40 FPS (with more grass, no need to halve the particles, and all the trees tripled in size and more abundant), though they would dip in certain areas until I got SkyBoost. Now everything is A-OK.

Funnily enough, The Witcher 2 runs exactly the same on all three setups. It never gets past 15 FPS unless I'm looking at the ground.
 

dscudella

Honorable
Sep 10, 2012
892
0
11,060
You kind of answered your own question. The difference is that the Xbox 360 & PS3 hardware hasn't changed in years and everyone has the exact same setup. It's easy for game developers to optimize a program for one specific set of hardware. Also, the consoles have hardware that isn't available in PCs.

Check out Team Xbox for the 360's tech specs. The hardware was specifically designed for the console with one thing in mind: game performance.
 

tomsjbhard432

Distinguished
Oct 4, 2011
7
0
18,510
@ dscudella

My post is claiming there isn't any special optimisation for the consoles. Most console and PC gamers seem to think there is.

So I am saying that with an old dual core and an old GPU from that time, you can still play Crysis 2 (harder to run than most games) at the consoles' settings and get similar performance.

I am not saying they optimise!

For the odd game this may happen, but then again, the old dual core and GPU can outperform the consoles in games up to about 2010.

Cheers for your reply, as I really want to make people aware of this.

John.
 
AntiZig

Firstly, I find problems with your opening statement:
1: I keep seeing console owners saying, "Why do we have the same game as the PC owners, when we have older tech?".
Really? You mean to tell me you've found console gamers who whine about getting the latest video games on their consoles? Would they rather those games were exclusive to the PC platform? Give me a break; that's just beyond laughable.
2: I keep seeing PC owners saying, "We all have different PC setups so the developers can't optimise for us".
Sorry, you got this one wrong. What PC gamers typically say is, "Why do we have to play a game that was designed and optimized for consoles and then ported to run on PC with minimal optimization for PC hardware?" There are hardware abstraction layers that allow a developer to optimize for a multitude of hardware configurations without having to nitpick through every single combination.
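To illustrate what an abstraction layer buys you (a minimal made-up sketch in C++, not any real engine's or graphics API's interface): the layer reports what the hardware can do, and the game derives its settings from that instead of being hand-tuned per card.

```
#include <cstdio>

// Hypothetical capability report filled in by the abstraction layer.
struct GpuCaps {
    int videoMemoryMB;
    int shaderModelMajor;
};

enum class Preset { Low, Medium, High };

// One decision function covers every card the layer can describe,
// instead of one hand-tuned code path per GPU.
Preset choosePreset(const GpuCaps& caps) {
    if (caps.shaderModelMajor >= 4 && caps.videoMemoryMB >= 1024) return Preset::High;
    if (caps.shaderModelMajor >= 3 && caps.videoMemoryMB >= 512)  return Preset::Medium;
    return Preset::Low;
}

int main() {
    GpuCaps caps{512, 3};  // pretend values for an x1950pro-class card
    std::printf("preset: %d\n", static_cast<int>(choosePreset(caps)));
}
```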

As for the rest of your "awareness" proof: sorry, but you're looking at the problem from an FPS point of view, which is fundamentally wrong.

There is a hardware difference between a console and a PC, not just because the parts are different but because consoles are designed very differently from a PC. So, to your point, games are optimized to run on that specific hardware. The problem is that console hardware architecture and properties are often vastly different from a PC's. Even Xbox vs. PS3, the hardware performs differently depending on the game engine, the features of the game and how they are implemented (look up some AMD or nVidia presentations; you'll see differences in how the Xbox and PS3 handle various graphics features pointed out on practically every fifth page).

just take this for instance:
http://en.wikipedia.org/wiki/Synergistic_Processing_Unit

You will never see this in a regular desktop PC. Telling me that optimization for the Cell processor and for a typical CPU is the same is just downright ignorant.

You're probably closer to the truth when comparing an Xbox to a PC, but let's not forget that the GPU in the Xbox takes over some responsibilities of a typical motherboard. Also, let's not forget that in most cases a console CPU is able to address VRAM directly, which is impossible on a regular PC.
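A rough sketch of what that means for the programmer (hypothetical names in C++, not any real console SDK or PC graphics API): with unified memory the CPU writes straight into memory the GPU reads, while on a PC the same data goes through a mapped staging buffer and a bus transfer.

```
#include <cstdio>
#include <cstring>
#include <vector>

struct Vertex { float x, y, z; };

// Console-style: CPU and GPU share one pool of addressable memory,
// so the CPU writes vertex data exactly where the GPU will read it.
void uploadUnified(Vertex* sharedPool, const std::vector<Vertex>& verts) {
    std::memcpy(sharedPool, verts.data(), verts.size() * sizeof(Vertex));
    // No further transfer; the GPU reads the same addresses next frame.
}

// PC-style: the CPU can only touch a driver-managed staging area;
// unmap() is where the driver would push the data over the bus into VRAM.
struct StagingBuffer {
    std::vector<Vertex> cpuSide;
    Vertex* map(size_t count) { cpuSide.resize(count); return cpuSide.data(); }
    void unmap() { /* driver schedules a PCIe transfer into VRAM here */ }
};

void uploadPc(StagingBuffer& buf, const std::vector<Vertex>& verts) {
    Vertex* dst = buf.map(verts.size());
    std::memcpy(dst, verts.data(), verts.size() * sizeof(Vertex));
    buf.unmap();  // an extra copy and transfer the console path never pays
}

int main() {
    std::vector<Vertex> verts = {{0, 0, 0}, {1, 0, 0}, {0, 1, 0}};
    std::vector<Vertex> fakeSharedPool(verts.size());  // stands in for unified memory
    uploadUnified(fakeSharedPool.data(), verts);
    StagingBuffer buf;
    uploadPc(buf, verts);
    std::printf("uploaded %zu vertices both ways\n", verts.size());
}
```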

Due to fundamental differences in hardware, a graphics engine that was designed for PC can run vastly differently on a console, and vice versa. Take Skyrim: at release, the CPU was tasked with shadow processing, because that's how you do it on the consoles. Guess what, running shadows on a PC CPU is a very bad idea. Because of that, Bethesda spent months optimizing the game for PC (patches 1.1 through 1.4).

So, in closing, think what you wish. If you believe that comparing two machines to two consoles lets you draw a conclusion about graphics engine optimization for the multitude of video games out there, be my guest. But please refrain from "really want to make people aware of this fact".
 

dscudella

Honorable
Sep 10, 2012
892
0
11,060
I'm sorry John, I don't agree with you. I am, however, with AntiZig, and I believe you misread my post. Games ARE optimized for the consoles' aging hardware. The Xbox 360 was officially released in the US in November 2005 and the redesigned "Slim" released in 2010, so its hardware is 7 years old. The PS3 was released in winter 2006, so its hardware is about to hit 6 years old.

If you take a look at today's games, Skyrim & BF3, you have to have a hefty setup to run them at the intended 1080p, but not on Xbox or PS3. I'm not even sure you could run them on a 7-year-old desktop configuration.

Let's take a look at CPUs from that generation. AMD's Athlon 64 3200+ was around at the same time as the Xbox 360: 2.0GHz, 64-bit, single core, 64K+64K L1 & 512K L2.

The Xbox 360 had a specially designed PowerPC Xenon: 3.2GHz, 64-bit, 3 cores, 1MB L2, and 512MB of system RAM. That's quite a difference.

Are games better optimized for consoles? Yes, simply because developers have had 6-7 years of the same hardware to code their games for. PCs are ever evolving, ever changing, which is why there are constant patches and updates for games that have been out a year or more; developers are still finding ways to optimize them for PC.

Edit: After re-reading your post I had some things to say about your "observations". Consoles only run games at 720p because that's the TV's max resolution. Plug that same console into a TV, or better yet a monitor, that supports 1080p and voila: 1080p, what the console was designed to display at. Also, games are designed to run @ 60fps on both consoles. If you have a lot of action on your screen it will dip, but if there's a normal amount of stuff needing to be rendered you get 50-60fps.

In conclusion, I should say games are EASIER to optimize for consoles because developers have had 6-7 years to work at it. When a game goes to PC and, like AntiZig said, they don't optimize it for the different hardware (the shadow processing, for example), you get major flaws.
 
cats_paw

I am afraid you are still mistaken. First of all, the console versions don't have the same quality as their PC counterparts. There are tons of things that could be improved.
So... optimized vs. non-optimized.

The thing is very, very, very simple:

It's easy to test a game on one hardware setup (a console) and then improve it, fix it, etc.
It's cheap, easy and fast.
Now, to do that on a PC, you need to test various hardware combinations, on various drivers, on various operating systems (not only the type, but also the updates to every one of them), and probably a few more variables I am forgetting about.
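To put rough numbers on it (the counts below are made-up assumptions; the multiplication is the point):

```
#include <cstdio>

// Back-of-envelope test matrix. Every count here is an assumption,
// but it shows why PC QA costs multiply while console QA does not.
int main() {
    const int gpus    = 25;  // assumed distinct GPU families worth testing
    const int drivers = 4;   // assumed driver versions per GPU
    const int oses    = 3;   // e.g. XP, Vista, 7 (ignoring patch levels)
    std::printf("console configs to test: 1\n");
    std::printf("PC configs to test: %d\n", gpus * drivers * oses);  // 300
}
```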

So companies take the easy way: why spend so much cash on the PC platform when we will sell on consoles for sure, and will still get a few customers from the PC platform who believe our game will work well for them?
In other words, they make more money this way, so there is no point for them to change that.

Optimizing for PC could make a game run around 10 times faster than on a console, since the hardware is that much more powerful. Or better looking (but that's more work as well, so more costs for the company).

Oh, BTW, consoles render at 720p; what happens is that your TV/console setup SCALES the image up to 1080p. That is not the same as native 1080p.
The difference is like taking photographs with optical zoom vs. digital zoom (something to compare it to).
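For anyone who wants the numbers behind that: 1080p has 2.25 times the pixels of 720p, which is roughly the extra rendering work a console skips by drawing at 720p and letting the scaler stretch the image.

```
#include <cstdio>

// Pixel-count arithmetic: native 1080p is 2.25x the work of 720p.
int main() {
    const long p720  = 1280L * 720;    //   921,600 pixels
    const long p1080 = 1920L * 1080;   // 2,073,600 pixels
    std::printf("720p:  %ld pixels\n", p720);
    std::printf("1080p: %ld pixels\n", p1080);
    std::printf("ratio: %.2f\n", (double)p1080 / p720);  // 2.25
}
```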

If you ever play a high-quality game (Metro 2033, The Witcher 2, etc.) on full quality settings with a PC powerful enough to run it at Full HD or above, and then play a game ported from a console, you will probably understand most of the anger PC gamers feel at dealing with games ported from consoles :D.
 

tomsjbhard432

Distinguished
Oct 4, 2011
7
0
18,510
Thank you for all your replies.

It seems that everyone is in agreement that console versions are optimised to a great degree compared to the PC versions of games, and that the consoles have hardware that gives them an advantage over the PC (AntiZig's post).

But then why do I get similar results with a similar set of hardware, unless the hardware I'm using is more powerful than the consoles'? Surely I should be getting less than half their frame rates, given the optimisations they do for the consoles.

Looking at the specs for the Xbox 360, they did say its GPU was a cut-down x1800xt, and that could explain it. It may be that the GPU is more like an old 6800gt, and for that to get the same results as an x1800xt would be good optimisation.

But the performance of the consoles is still no more than an x1800xt/x1950pro (approximately).

And, BTW, as cats_paw has said, hardly any games on the consoles run at 1080p, and newer, harder-to-run games barely hold 30fps.