I once read someone say (cannot remember where I read it) that console games are developed to maximize the capabilities of the console hardware more efficiently, whereas PC games are less efficient. Does anyone know how true this statement is? Because if developers do not adequately maximize consoles' abilities, then the graphics would not be able to compete with PC games.
This leads me to believe that game developers are purposefully making PC graphics less efficient so that consoles can still compete with PC gaming (or because they are lazy and don't have to be as careful when making PC games). If they do this and keep the console industry alive, they will be able to sell more copies. After all, the majority of people playing video games are teens who can't afford, or don't properly understand the capabilities of, gaming PCs.
It has more to do with the fact that console specs are static, so developers can work efficiently with hardware they have known for years. With the PC you have to account for many players with different setups, parts, OSes, etc. I wouldn't say one is more efficient than the other, nor would I don a tin foil hat and proclaim conspiracy. I cannot think of a game that consoles run better than a well-equipped computer, especially with HD texture packs being common for major releases.
That being said, there are issues with porting titles to the PC from console versions that can end up being a mess (GTA IV).
Take off your tin foil hat and enjoy the games. The console industry does not have an issue staying alive. For the average person, the work required to maintain and understand a PC, actively finding the right parts and upgrades, as well as the cost, make consoles very attractive. There are no compatibility or system specs to look at really; just give 'em your money and get gaming. In the end, people will do what is easiest.
Haha no, from a purely objective and technical standpoint, a console of this generation could never compete with a mid/high-range PC, simply because the components in a PC are better. I think you're assuming that developers design games to look like 'X' and that's exactly how consumers play them, which is actually false. On consoles this is closer to the truth, but you still must take into account the displays being used. As for PC, the spectrum varies wildly depending on the player's PC. Oblivion on my PC is 10x better graphically than Oblivion on my PS3, simply because my PS3 is using a GPU derived from the Nvidia 7800 GTX, which is way behind almost all of the modern GPUs gaming computers are using, such as the GTX 560 Ti I have in this computer, or the GTX 580 I have next door.
Consoles have to remain at a set standard during their market lives so that the business model can actually work. The whole business model for consoles is squeezing as much performance as possible out of the lowest-cost hardware, which an evolving market just can't handle, especially when it's based around communal interaction (online gaming, etc.). If manufacturers started selling vastly more capable models of a console, not only would people cry foul about how unfair it was, it'd really limit the console generation's length. It'd cause issues with the community, who'd feel cheated (think how irritated people get with Apple each year they bring out a marginally different iPhone), and push competitors to try to one-up them constantly, meaning they'd get no respite or chance for consumers to buy the darn things before they'd have to start shipping updates. Furthermore, I imagine the console developers would whinge.
The problem for console developers is that they're working within a box. Certainly, that can make development easier because they don't have to consider consumer variation, but it also limits them greatly compared to the few PC developers still going, who have a much larger canvas to paint on and a lot more paints to paint with.
Like the previous poster said, issues can arise when porting (part of the reason why the PC community often feels distrustful towards anything designed for both consoles and PC).
No, they're not more efficient. I have recently tested several games on an X1950 Pro, which is from 2006 and about the same power as the GPUs in the Xbox 360 and PS3, and the results generally favoured the PC.
If you set the resolution to 720p, turn off AA and anisotropic filtering, and max all other settings, you get frame rates of 15-60 fps. The console versions might not be at max settings and are generally capped at 30 fps. These tests I did with Borderlands, Left 4 Dead 2, and Fear 2.
Fear 2 and COD 4 were at 1680x1050, which is about 1.9 times 720p in terms of pixels, so oddly I think the PC can be more efficient.
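As a quick sanity check on that pixel arithmetic (a hypothetical snippet, not from the original posts), 1680x1050 pushes roughly 1.9 times as many pixels as 1280x720:

```python
# Compare total pixel counts of the two resolutions mentioned above.
def pixel_count(width, height):
    """Total number of pixels rendered per frame at a given resolution."""
    return width * height

native = pixel_count(1680, 1050)  # 1,764,000 pixels
hd720 = pixel_count(1280, 720)   # 921,600 pixels (720p)

ratio = native / hd720
print(f"1680x1050 renders {ratio:.2f}x the pixels of 720p")  # ~1.91x
```

So holding similar frame rates at that resolution means the PC is doing nearly twice the rasterization work of the console's 720p output.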
Fear 2 looked awful on my PS3 and couldn't maintain 30 fps; the same went for Borderlands, which dropped to 20 fps.
But users want high resolutions, AA, and 60 fps, so we need stronger hardware.