HD 7850 FPS Drops in Tomb Raider, is it just me?

mike2012

Distinguished
Mar 15, 2012
329
0
18,790
Hey, Mike2012 here. I have a rig with a Core i3 2120 CPU and an HD 7850 OC GPU.

Tomb Raider in most places will run at 60fps on high settings at 1080p...

However, at the part of the game where Lara needs to find the wolves and there's a lot of rain, the fps drops below 60 even on low at 720p.

In similar games like Black Ops II or Just Cause 2, I can still maintain 60fps when there's a lot of rain.

Is this an optimization issue or is it just me?
 

Your CPU is bottlenecking the graphics card: the rain starts using a lot of your CPU, so it can't feed the HD 7850 fast enough and the game starts to lag. Solution: get a better CPU, e.g. a cheap i5 with onboard HD Graphics, so you can use those to help the fps in your games.
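
One way to test that bottleneck claim instead of arguing about it: log per-core CPU load while replaying the rain section. Here is a minimal Python sketch (assuming the third-party psutil package is installed; the 95% threshold and 60-second window are arbitrary choices, not anything from this thread). If one or more cores sit pinned near 100% exactly when the fps drops, that is consistent with a CPU bottleneck.

# Minimal per-core CPU load logger: run it while playing the rain section.
# Assumes psutil is installed (pip install psutil).
import time
import psutil

DURATION_S = 60      # how long to sample
INTERVAL_S = 1.0     # seconds between samples

end = time.time() + DURATION_S
while time.time() < end:
    per_core = psutil.cpu_percent(interval=INTERVAL_S, percpu=True)
    flag = "  <-- possible CPU bottleneck" if max(per_core) > 95 else ""
    print(" ".join(f"{p:5.1f}%" for p in per_core) + flag)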
 

mike2012

Distinguished
Mar 15, 2012
329
0
18,790

Then we can conclude the game is unoptimized. Like I said, there are sections with a ton of rain in Black Ops II and it still maintains 60fps.

No other game but BF3 64-man multiplayer noticeably bottlenecks my Core i3 2120.
 

mike2012

Distinguished
Mar 15, 2012
329
0
18,790

But like I said, on low graphics at 720p I'm only getting fps in the low 50s at that spot.

The game will run at 60fps on high at 1080p before that, then at that section it will drop all the way down to 30fps.

The point I was trying to make, though, is that in Black Ops II with everything on max and tons of enemies on screen, I still get 60fps when there's a lot of rain.

I think this is a software issue.
 

Never heard of the thing where you can use the onboard Intel graphics together with the graphics card you have to boost your games? lavcopricetech did a video about that with their 670 FTW Edition and it gained 2,300 points in 3DMark, plus better performance in games, so I would go for it if you have it.
 

Try using the latest beta Catalyst drivers from AMD; that could help.
 

mike2012

Distinguished
Mar 15, 2012
329
0
18,790
Not sure if you're trolling or not, but there's not enough going on in Tomb Raider to justify getting less than 60fps on low at 720p on a system that can do 60fps in most games on high or max settings.

It's an optimization issue from it being a console port, and all I can do is wait for them to patch it.

Getting 60fps the whole rest of the game, then having a massive fps drop when there's rain, means there's something wrong with the game.

I've boiled it down to two settings that are unoptimized and causing the low fps (one way to measure the difference between runs is sketched after this list):

1. Level of Detail. If I turn Level of Detail to low, I get a huge fps boost.

2. Post Processing. If I turn Post Processing off, I get a solid 60fps even with all the rain.
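
If anyone wants to quantify that instead of eyeballing the fps counter, here is a rough Python sketch for comparing two recordings of the same section, e.g. Post Processing on vs. off. It assumes a two-column CSV of frame index and cumulative timestamp in milliseconds, which is roughly what FRAPS' frametimes log looks like; adjust the parsing if your logger writes something else. The file names and labels are placeholders.

# Compare average fps and worst-case frame times between two runs.
import csv
import sys

def load_frame_times_ms(path):
    # Return per-frame durations (ms) from a cumulative-timestamp CSV.
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                 # skip the header row
        for row in reader:
            stamps.append(float(row[1]))
    return [b - a for a, b in zip(stamps, stamps[1:])]

def report(label, times):
    avg_fps = 1000.0 / (sum(times) / len(times))
    worst = sorted(times)[int(len(times) * 0.99)]  # 99th-percentile frame time
    print(f"{label}: avg {avg_fps:.1f} fps, 1% low ~{1000.0 / worst:.1f} fps")

if __name__ == "__main__":
    # e.g. python compare_runs.py pp_on.csv pp_off.csv
    report("run A (Post Processing on) ", load_frame_times_ms(sys.argv[1]))
    report("run B (Post Processing off)", load_frame_times_ms(sys.argv[2]))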
 


If it's a lie, why is this true then? http://www.youtube.com/watch?v=Ne9pqx9ZJfk That's the Lavco Price Tech video that shows it's true, and my friend does this and it WORKS.
 

In SLI/CrossFire, or if you have a really powerful GPU, it works.
Edit: SORRY, I linked the wrong video. Right link: http://www.youtube.com/watch?v=3i5OKS1mj7E SORRY, what a fail.
 

True, but if you get it with your CPU and you don't have a high-end card, then it is usable. And why not? You have it, so why not use it?
 

Weird, because my friend gets a bit of a performance boost in every game he plays when that is on, but what you say is true; it should be better known.
 


That good old Hyper-Threading. I did a generation test (P4 through i7) and the result was that the P4 had the best Hyper-Threading boost (12%) and the i7 the smallest (4%), but my friend gets about 5 frames more if he enables it, so you're right, it's just like Hyper-Threading.
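
For clarity, those percentages are presumably just the relative gain of a score with HT on over the same test with HT off; a trivial sketch of the arithmetic (the scores below are made-up placeholders, not real benchmark results):

# Percent gain of a benchmark score with HT on vs. the same test with HT off.
def ht_boost(score_off, score_on):
    return (score_on - score_off) / score_off * 100.0

print(f"P4 example: {ht_boost(100, 112):.0f}%")  # ~12%, matching the claim above
print(f"i7 example: {ht_boost(100, 104):.0f}%")  # ~4%, matching the claim above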
 

I know that HT doesn't work in games because the extra threads aren't real cores, and yes, this is off topic.