3D Engine vs GPU - Red Orchestra 2

simplefranco

Hello everyone. I've always been a huge fan of WWII games; 'realism' is a big part of the fun for me. I've long been hoping for a WWII game similar to the Quake 2: D-Day Normandy mod. Recently, while browsing Steam, I found Red Orchestra 2 and bought it; having access to the beta, I'd say it's a very promising game for WWII fans!

Anyway, there's something that's bugging me: the game uses the well-known Unreal Engine 3, which always catches my eye when I'm hunting down a video game; however, this is a game with up to 64 players on wide-open maps, which drags performance down.

My question would be: is it reasonable to blame the 3D engine for not being able to handle that many models / that high a polygon count? I can run at a steady 50-66 fps on High/Ultra settings on some maps. However, when I turn from one area to another on, let's say, 'Pavlov's House' for those who are familiar with it (a very long urban map with lots of buildings and debris, all the houses populated with furniture, etc.), my FPS takes a crushing blow (down to around 30, even 25).

I was wondering if it's worth upgrading any of my system specs, as I am really into this game, or if there is a limit set by the 3D engine that developers can't get past, so that no matter what computer you have, it's impossible to improve the frame rate.

Here are my specs (any tips on a new graphics card are welcome):

CPU: i7 950 @ 3.07 GHz (stock)
RAM: 12 GB @ 1600 MHz (stock)
GFX: GTX 460 1 GB - core 763 MHz / shader 1526 MHz / memory 1900 MHz (stock)
PSU: Corsair 650W - 52 A @ 12 V

As it is now, it's a very stable machine.

Thanks in advance!

(Edit: Added PSU, as I thought it could be relevant)
 
If your problem is mostly when you are out in the open looking across long distances, then it's probably your GPU; you could turn down the view distance a bit and see if that improves things. I assume you are at 1920x1080? If so, running Ultra settings on a GTX 460, you will occasionally have moments when it can't keep up.
 
One thing you could consider is adding another GTX 460 and running SLI; your PSU might be able to handle it, but I think you are right on the edge of it being able to handle two cards. The other option is to get a GTX 580 with 1.5 GB of video RAM and use the 460 as a dedicated PhysX card; however, going that route you might have to upgrade the PSU as well. I don't know if you are passionate enough about your games to go all out for an enjoyable experience. In my case I do tend to do that and will spare no expense to get what I want.
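Rough back-of-the-envelope numbers, assuming the published TDPs (roughly 160 W per GTX 460 and about 130 W for the i7 950), just to show why I say you're on the edge:

Two GTX 460s: 2 x 160 W = 320 W
i7 950 under load: ~130 W
Motherboard, RAM, drives, fans: ~75-100 W
Total: ~525-550 W

Your Corsair can deliver 52 A x 12 V = 624 W on the 12 V rail, so two 460s should just fit, but there isn't much headroom left for load spikes or overclocking. A GTX 580 is rated around 244 W on its own, so a 580 plus the 460 as a PhysX card pushes the total past 600 W, which is why I'd budget for a bigger PSU on that route.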
On another note, your other components are certainly up to spec with the 12 GB of RAM and the i7 950; there is no weakness there. With the games coming out now, it seems more and more that you'd better have a top-notch video card or you won't be able to play at high resolutions.
 

simplefranco

Thanks for the quick replies!
I am running at 1680 x 1050, as that's the highest resolution my screen supports :bounce:

I was already considering an upgrade (preferably a single GPU), but I was wondering whether it would be worth it, or whether I'd see no improvement whatsoever because the game might simply be poorly optimized.

I like to be on par with what the game has to offer. I am not a graphics freak, but the 'looks' play a huge role in the gameplay feeling.
Even if I had to upgrade my PSU, would the 580 (on its own) be able to handle upcoming games? Or would it need to be paired with the 460 as a dedicated PhysX card?

Thanks again for the input.

(Edit: Holy mother of Jesus! The 580 costs about 440€!)
 

simplefranco

Hey, thanks for pointing that out, jscottmoss!
I've noticed a few improvements in the framerate, to be honest! I'm very glad it turned out to be a good game.

As far as I know there aren't many games like RO2 out there, so this is really great news! Thanks for the input.
 



Sticker shock!!! :eek: