I want technical answers to this, not simple ones, because there are too many articles on the web that don't really answer my question. I want to understand specifically why consoles have better frame rates, less stutter, and less volatility in frame rate than a PC for all their games. I specifically want to compare the Xbox 360 to the PC; I consider the Xbox 360 and PS3 similar. I used to have an Xbox 360, and I noticed that game frame rate performance was optimal and I didn't notice any stutter, compared to a PC with a better graphics card than the Xbox 360's ATI Xenos. I don't understand how a console with an old ATI Xenos is capable of playing the latest games like Battlefield 3 with good average frame rate performance.
Some of the reasons I think help the Xbox 360/PS3 keep a good, non-volatile frame rate are that the games are obviously optimised better for its operating system and hardware, that system resources are used less for background tasks and devoted more efficiently to games, and that the graphics are slightly inferior to PC. But I don't understand how these few reasons alone allow the Xbox 360 to maintain such a good, stable frame rate compared to the PC. I also get the impression that the Xbox 360 and PS3 might have a nearly free form of anti-aliasing like FXAA, optimised for polygons and 3D images so it doesn't affect text. I noticed that turning AA off for games on the PC substantially increases performance, but I don't understand how, given the massive hardware gap between an ATI Xenos and, say, an ATI 4670 (I don't have this card anymore), the ATI Xenos still seems to get much higher performance even in the latest games. When I play on the Xbox the FPS feels near 60, so don't tell me it's playing at 30 and that's why it is less volatile than a PC frame rate.
Also take into account that the PC I'm comparing to the Xbox is one with no background applications or tasks using system resources other than the game itself, and one with a good (but older) graphics card and CPU for gaming.
First off, FPS aren't all that volatile on all PCs, which touches on the first thing to consider: there are millions of different hardware combinations in a PC. Devs can't develop for a single setup like on a console; it is up to the PC gamer to put together a balanced system that will perform well.
PCs have GPUs that are several years ahead of consoles, but if you have a slower CPU in your PC, the CPU may hold back performance at different places in a game. In some cases, even the fastest CPU cannot allow a GPU to reach FPS beyond a certain point. In multiplayer BF3, on a few maps, 60 FPS is the fastest your CPU will allow, while some GPU setups could otherwise push 100+ FPS. While the CPU limits you, FPS are low; when it no longer limits you, FPS shoots up. V-sync and FPS limiters can be used to cut off the high end. In contrast, console GPUs typically can only handle 30-ish FPS, so they never run into this problem, if you can call having multiple times more FPS a problem.
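The bottleneck behaviour described above can be sketched as a toy model: the frame rate you actually see is the minimum of what the CPU and GPU can each sustain, with an optional cap standing in for V-sync or an FPS limiter. This is an illustrative simplification, not how a real engine schedules frames; the numbers are hypothetical.

```python
def effective_fps(cpu_fps, gpu_fps, cap=None):
    """Toy bottleneck model: the visible frame rate is limited by the
    slower of the CPU and GPU; an optional cap stands in for V-sync
    or an FPS limiter. Numbers are illustrative, not measurements."""
    fps = min(cpu_fps, gpu_fps)
    if cap is not None:
        fps = min(fps, cap)
    return fps

# CPU-bound map: a fast GPU can't push past the CPU's 60 FPS
print(effective_fps(cpu_fps=60, gpu_fps=120))           # 60
# GPU-bound scene: now the GPU is the limit
print(effective_fps(cpu_fps=120, gpu_fps=45))           # 45
# V-sync at 60 Hz caps an otherwise faster system
print(effective_fps(cpu_fps=150, gpu_fps=120, cap=60))  # 60
```

The point of the model is that the same machine can swing between the CPU-bound and GPU-bound cases from scene to scene, which is exactly the volatility being asked about.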
On consoles, devs gain the advantage of knowing that everyone will have the same hardware, so they can test that single setup and ensure everyone will have playable FPS. On PCs, all they can do is expose multiple sliders for visual settings and let users figure out what works best for their hardware. People with lower- to medium-power PCs tend to get themselves into trouble here, as they all want the highest settings possible, often making their games questionably playable. All you have to do is turn down the visual settings and you'll still have better graphics and FPS than a console.
Personally, the FPS and visuals of console games make me want to puke. They are nowhere near as good or as fluid. 30 FPS does not cut it; some people can get used to it, but don't pretend it's as good as a PC.
Oh, so you're saying that consoles really run at 30 FPS? If that's true, then it's totally understandable that the console has less frame rate volatility and appears to perform better. What's all this stuff about PAL-50 and PAL-60 Hz for certain games, though? I remember my brother wanted to play a tennis game that was PAL-50. Doesn't that have anything to do with 50 FPS? It's odd, though, because I'm sure that when I use an Xbox controller on the PC with Fraps running, 30 FPS feels a lot slower on the PC than it does on the Xbox. It doesn't seem that the Xbox enables motion blur to make it feel smoother, if that would be a possible explanation. To be honest, I was definitely impressed with Xbox graphics as being only slightly inferior to computer graphics; the main thing I saw that was inferior was object detail, but I thought the lighting was good and the textures were good. It was like comparing a game with four settings (low, medium, high, and ultra), and to my eye the Xbox looked like PC graphics on high, or at least between medium and high. PCs also tend to stutter even on just high graphics, and putting them on medium settings honestly does look inferior to the Xbox.
Another thing: I think I saw somewhere that what makes consoles faster is the way the console motherboard is laid out, with the chips arranged more efficiently than on a computer. Is that true? For instance, there's the PCIe frequency setting (100 MHz by default) in the BIOS that you can overclock, which speeds up the connection between hardware components, am I wrong? Something about the northbridge as well?
First off, stop with the "consoles are faster..." stuff. They aren't. They aren't anywhere near as fast as medium- to high-end PC systems. That said, if you are using a low-end PC as your comparison, then you are going to have a different experience. Remember, PC gaming isn't a set standard.
One of the tricks they use to perform better is to render everything at 720p instead of 1080p, with an upscaler to make it fit a 1080p screen. With less than half the pixels to render, they can keep up better.
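The pixel-count arithmetic behind that trick is easy to check: a native 1080p frame has 2.25 times as many pixels to shade as a 720p frame, which is roughly the workload gap the upscaler hides.

```python
# Pixel counts for the two render resolutions mentioned above
pixels_720p = 1280 * 720    # 921,600 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p
print(ratio)  # 2.25 -- native 1080p shades 2.25x as many pixels as 720p
```

So a GPU rendering at 720p only has to shade about 44% of the pixels a native-1080p PC does, before the cheap upscaling pass.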
Consoles also do not have AA options in most cases, and they lack tessellation, SSAO, and many of the more advanced features of modern PC games.
That said, if you compare based on a single console port, you may get a skewed picture. Not all ports are done well or upgraded to take advantage of PC hardware.
The PAL-50 and PAL-60 Hz thing sounds like a compatibility issue between European and American hardware. TVs in Europe run at 50 Hz; TVs in the USA run at 60 Hz.
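The practical difference between the two refresh standards is the per-frame time budget, which is straightforward to compute:

```python
# Frame budget per refresh for PAL (50 Hz) vs NTSC-style (60 Hz) displays
for hz in (50, 60):
    frame_ms = 1000 / hz
    print(f"{hz} Hz -> {frame_ms:.2f} ms per frame")
# 50 Hz -> 20.00 ms per frame
# 60 Hz -> 16.67 ms per frame
```

So a PAL-50 game presenting one frame per refresh runs at 50 FPS at most, which is why the tennis game's label matters; it's a display-timing limit, not a measure of the console's rendering power.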
Thank you for that response. Oh, so newer games that came out during DirectX 11 can make use of tessellation on the PC, but the Xbox is using graphics rendering similar to DirectX 9, would you say? I thought they would update graphical rendering and quality in sync with the game. I'm not an expert on this, but I'd like you to talk more about console porting and simplify it. I've looked on Wikipedia and got a small idea about it, but most of it I cannot understand.
Yes, this is correct. As DirectX advances, its console implementation remains roughly the same. This is because consoles and their games are, for the most part, designed to run at their maximum potential from day one. Several years and several generations of PC advances later, the console doesn't have much to show comparatively.