Getting sick of these fake system specs...

I'm beginning to suspect console devs are trying to make the PS4 and X1 look better than they really are, or to artificially brag up their games' graphics, by inflating the recommended system requirements for PC ports. I'm not talking about optimization or how the games actually perform - I'm talking purely about the "recommended system specifications" they release to the media.

For example, Watch Dogs recommends an i7 or an 8-core AMD CPU, yet the game runs above 60 fps on ultra on an i3-4130 or a slightly overclocked FX-6300 (probably on a stock FX-6300 as well, though I haven't seen benchmarks with one at stock), provided the video card is strong enough to keep up.

Similarly, the Thief remake also recommends an i7 or an FX-8000-series CPU, yet it runs far better on even the low-clocked Ivy Bridge i5s, such as the i5-3470, than it does on the FX-8350; in fact, the game is mostly optimized for two cores. It runs above 50 fps on ultra on the old Ivy Bridge i3-3220.

And now Ryse: Son of Rome has had its recommended system specs released, and not surprisingly, it's recommending a GTX 690 and a "quad-core or hexa-core CPU". First, I'll eat my shirt if it takes anything stronger than an R9 270 to run at ultra settings, because it's on a well-optimized engine and it doesn't look very impressive. Second, there are a great many quad-cores stronger than hexa-cores, and there are hexa-cores weaker than dual-cores. Their vague CPU recommendation is going to severely disappoint Phenom II 1030T owners, though Pentium G3258 owners may be pleasantly surprised.

The Phenom II 940, by the way, is similar in strength to the PS4 and X1 CPUs: terrible core-for-core, and only passable when all cores are being used. Even then, it's outstripped by just about any modern CPU, even the bargain-bin ones. Treating the current recommended system specs as anything other than fodder for jokes would be a mistake.

I'm getting very tired of these bloated recommended system specs. They seem to be assigned at random, and they only push less-informed PC gamers into building their systems inefficiently.
 

Selenog

Well, you can't just compare consoles and PCs in raw compute power. Console games are optimized to run on one very specific set of hardware; PC ports, or even PC-only games, need to support a whole lot of configurations, so the room for optimization is small.
 


I'm not convinced you read the details.
PC ports are easily running on far weaker hardware than the recommended specs say they should, not the other way around.
 

Selenog



No, I read it; I just didn't write out the answer completely. Like I said, PC games need to support a whole lot of configurations and still perform well. The listed specs are usually higher than what is actually playable because it's simply not possible to test every configuration, so developers raise the specs a little just to avoid complaints as much as possible. And that's not to rip you off - remember, game developers don't sell you the hardware. They do it so people's experiences are good and they buy the next game.
 

GObonzo

Turn everything up to max and boost the filters through the Nvidia Control Panel, and you will not be getting 60+ fps in 2014 PC games with a 650 Ti. I tried it, and got better frame rates with the AMD 7870 I ended up putting in their system.

After reading the title, I imagined you'd be commenting on all the people posting over-glorified personal PC specs, which I keep seeing over and over: people claiming every day that they've just built an $8,000+ system, when all they're doing is searching Newegg or somewhere similar for the most expensive parts they can find, then posting them over and over as their own 32 GB DDR3, 3x Titan SLI / 290X CrossFire BS.
 


No, he's talking about game publishers'/developers' marketing scheme of using system requirements to make the new consoles (PS4/X1) look more powerful than the PC.

Edit: Ryse has updated its recommended system requirements, clarifying that they were for 4K resolution. It needs 2 GB of VRAM for 1080p.
 

Vynavill

AFAIK, it's mainly because of what Selenog said. Marketing is just a side effect.

Honestly though, I actually find it funny to see how badly they inflate the recommended specs. Shadow of Mordor is definitely the funniest one out ATM. The minimums look OK-ish for a refresh-gen game, but the recommended specs are downright hilarious.

- Minimum requirements -
Processor: Intel Core i5-750, 2.67 GHz | AMD Phenom II X4 965, 3.4 GHz
Memory: 3 GB RAM
Graphics: NVIDIA GeForce GTX 460 | AMD Radeon HD 5850
Hard Drive: 25 GB available space

- Recommended -
Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM ("What the...?!", strike 1)
Graphics: NVIDIA GeForce GTX 660 | AMD Radeon HD 7950 ("What the...?!", strike 2)
Hard Drive: 40 GB available space ("What the...?!", strike 3 and out)

I can't wait to see the mass of budget gamers screaming at it. They're literally saying "cough up a grand or get the hell outta here if you want to play nice" with this one.
I can understand a slight increase, but its recommended specs could be compared to two PS4s/XBOs in a hypothetical CrossFire setup...
 


Recommended specs are for high and/or default ultra. Jacking up 8xSGSSAA and HBAO+ in the Nvidia Control Panel is neither optimized nor taken into account in the recommended specs the developers release.
 


Inflated specs must actually sell more games, because before Watch Dogs released, some Ubisoft executive was literally bragging about how the game was so cutting edge it wouldn't even run on dual-cores. Now, of course, we know the game averages over 60 fps on ultra even on the lower end of i3s.
 

Vynavill

Sucks being on mobile, can't quote decently...
In any case:

[...]Now, of course, we know the game averages over 60 fps on ultra even on the lower end of i3s.
Unless some miraculous performance patch I'm not aware of came out on PC - and that may very well be true - last time I looked, that was valid only for the green team; the red team still struggles to keep 40~50 fps at medium-high, even on flagship cards and i5s...
Everyone I know who bought the game had something to say about it being hyped more than it should have been (I, for one, consider it a relatively stealth-oriented, more serious GTA with funky hacking mechanics, so I do agree), but while Nvidia owners played decently, it was, and still is, hell for AMD owners (me included).
 


I was talking about the CPU side only, not video cards.
 

Vynavill

Yet they're directly related, in this precise case. Correct me if I'm wrong, but you can't completely disable PhysX in Watch_Dogs, right? Yet even the lowest settings for it are taxing enough.
Any Nvidia card from the 660/750 models and above can provide enough brawn to handle them, and thus even lower-end i3s run it relatively effortlessly. With AMD, as you surely know, PhysX gets offloaded to the CPU, with what I believe is nothing more than an instruction translator (I never bothered to look, so once again, correct me if I'm wrong), adding an incredible overhead. I'd gladly see a low-end i3's attempt at processing the game's logic AND PhysX at the same time, as top-notch i5s and stock-clock i7s already struggle badly with them...
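
To make the idea concrete, here's a minimal C++ sketch of the fallback I'm describing. The names (hasGpuPhysicsSupport and so on) are hypothetical, not the actual PhysX SDK API; the point is just that when no GPU back end is detected, every physics step lands on CPU worker threads that the game's own logic also needs:

// Conceptual sketch only - hypothetical names, not the real PhysX SDK API.
// Illustrates why AMD users pay a CPU tax: with no GPU back end available,
// the physics simulation competes with game logic for the same cores.
#include <iostream>
#include <thread>

enum class PhysicsBackend { Gpu, Cpu };

// Stand-in for a real capability query (e.g. "is a CUDA-capable GPU present?").
bool hasGpuPhysicsSupport() {
    return false; // pretend this is an AMD-only box: no GPU PhysX path
}

PhysicsBackend selectPhysicsBackend() {
    return hasGpuPhysicsSupport() ? PhysicsBackend::Gpu : PhysicsBackend::Cpu;
}

int main() {
    if (selectPhysicsBackend() == PhysicsBackend::Cpu) {
        // Leave a couple of cores for game logic; on a dual-core i3 that
        // leaves almost nothing for the simulation itself.
        unsigned cores = std::thread::hardware_concurrency();
        unsigned workers = (cores > 2) ? cores - 2 : 1;
        std::cout << "No GPU physics: stepping the simulation on "
                  << workers << " CPU worker thread(s)\n";
    } else {
        std::cout << "Dispatching physics to the GPU\n";
    }
}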
 


Watch Dogs doesn't use PhysX. It was patched out and replaced with Ubisoft's in-house physics solution to accompany the graphics downgrade before launch.
 
Watch Dogs is probably my most disappointing game purchase of 2014. Press-X-to-win gameplay, cars that drive like trucks, and far too much hype.

With regard to the hardware requirements, much of that comes down to poor coding rather than next-gen graphics, IMO.