Shadow of mordor on my laptop.. settings confusion

uthvag

Reputable
Oct 5, 2014
1,115
0
5,660
Hello,
I recently got my hands on Shadow of Mordor and have just started playing it.
My laptop:
i7-4700HQ
8GB DDR3 1600MHz
GTX 860M 2GB (Maxwell)
Res: 1080p (of course!!!)
So I started the game and began tinkering with the graphics settings,

and the infamous 6GB Ultra texture quality caught my eye.
But as a guy who was forced to get a laptop (living in a dorm, no desktops allowed... meh),
I kept it on Medium (it said 2GB needed)
and then started increasing the other settings one by one.
I ended up with all settings on High except the object draw distance one (at Low)
and texture quality at Medium, and I was getting 40-50 FPS at the first intro battle with the orcs.
So:
FPS: 40-50 during battle
All settings High except textures at Medium and draw distance at Low

Next I wanted to push it harder, so I raised texture quality to High.
And a surprise awaited me:
the FPS was still almost 40??!!!!

Now my question is: how is this possible?
Is it because of shared VRAM (Windows 8.1, where up to 1/4 of system RAM can be lent to the GPU when needed)?
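As a sanity check on that shared-memory theory, here is a purely back-of-the-envelope calculation using the 1/4-of-RAM figure quoted above (the real split is OS- and driver-dependent):

```python
# Rough effective-VRAM estimate, going by the 1/4-of-RAM shared-memory
# figure mentioned above (the real split depends on the OS/driver).
system_ram_gb = 8
dedicated_vram_gb = 2

shared_gb = system_ram_gb / 4          # 2.0 GB borrowed from system RAM
effective_gb = dedicated_vram_gb + shared_gb

print(f"Effective graphics memory: {effective_gb:.1f} GB")
```

With roughly 4 GB to draw on, a 2056 MB peak (per MSI Afterburner) fits comfortably, which would explain High textures running fine.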

MSI Afterburner said peak memory usage was 2056 MB.
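For completeness, a peak figure like that can also be pulled from a plain log instead of watching Afterburner live. A tiny, hypothetical parser over samples in the style that `nvidia-smi --query-gpu=memory.used --format=csv` can emit:

```python
# Hypothetical log parser: find peak VRAM usage from comma-separated
# "timestamp, memory.used" samples (values are illustrative).
samples = """\
12:00:01, 1830 MiB
12:00:02, 2056 MiB
12:00:03, 1990 MiB"""

peak = max(int(line.split(",")[1].strip().split()[0])
           for line in samples.splitlines())
print(f"Peak VRAM: {peak} MiB")
```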

Can someone clarify this?? Is that battle scene just less demanding?

PS: the CPU temp was around 70-75°C and the GPU was at 70°C at max load.

Thanks!!
Sorry for the long post!

UPDATE: with everything on High I get 33-40 FPS during some open-world battles.
And no OC'ing either.
 

Vynavill

Honorable
In all honesty, from what I've read and heard, the game's PC port is probably one of the best this year, so it's not surprising that cards like a mobile 860, which isn't that bad for a mobile GPU, can still perform well enough with it. The Ultra texture package itself isn't shipped with the game; it's available to anyone who wants it, free of charge. The game already looks awesome with just High settings, but if your PC can afford it, you can grab the Ultra ones.

I'm not much into Windows 8's wacky memory management mechanisms, but yes, shared memory may very well be the reason you can run it on high detail. And as a matter of fact, there are plenty of people playing Ultra with 3GB cards, so that 6GB requirement is kinda excessive (but then again, official requirements are always excessive)...
Anything lower, however, and you can't avoid running into memory allocation issues.

With a 2GB card, you'll probably fail to run it at Ultra, and if it does run, it'll run badly. You're at 40 FPS already with High settings, which is usually considered the "playable" limit; Ultra is going to dip into the lower 30s. Your CPU helps too, since it's not a ULV chip (which could have bottlenecked the card after heating up).

If I were you, I'd try Ultra at something lower than 1080p, just for the sake of testing :p
Texture detail and resolution are among the settings that affect performance the most, so cranking them up obviously hits your framerate.
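To put rough numbers on the lower-resolution suggestion (plain arithmetic, no game-specific data): GPU load from fill rate and texturing scales roughly with pixel count, so dropping below 1080p cuts the work proportionally:

```python
# Pixel-count comparison against 1080p; GPU fill/texture work scales
# roughly with the number of pixels rendered.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
base = 1920 * 1080

ratios = {name: w * h / base for name, (w, h) in resolutions.items()}
for name, r in ratios.items():
    print(f"{name}: {r:.0%} of 1080p's pixel count")
```

So 900p renders only about 69% of 1080p's pixels, which is often enough headroom to make Ultra textures viable on a mid-range card.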
 
Solution

uthvag

Reputable


Thanks!!
And are the temps normal under that load?
The ambient temp was around 35°C.
 

Vynavill

Honorable
I'd say yes for a laptop, especially with ambient around 35°C. They could be lower, but they're still in the safe zone for under-load operation. If possible, and if you're mainly going to use it for gaming (for long periods at that), try getting a cooling pad or something to raise it a little off the surface it's sitting on; you can get them for as little as $15, so they're pretty affordable ;)
 
I think in the end you'll find 33-40 FPS isn't all that good overall.

You say "some open world battles". You don't really know how bad performance can get until you've really engaged a lot of Orcs in one of the more graphically intense areas in Nurn.

Play the game some more, do some more testing. I've seen loads of people say they can run higher than recommended texture detail, then they come back saying they're getting massive frame drops.

Monolith is one of the more diligent dev teams when it comes to realistic settings recommendations, because they look at performance across the entire game rather than giving best-case scenarios to sell more copies.
 

uthvag

Reputable


Tested it further; it stayed around 40 when fighting almost 20 Uruks.

Plus I get around 35 with 16x AA.

 
In my experience, playing with VSync off results in unacceptable micro stutter. Some may not notice it, but this game isn't optimized very well in that respect, at least on older (7970) AMD GPUs anyway.

I dropped settings one at a time from Ultra to High until I could run the benchmark with no average frame rate dips below 60. I ended up dropping everything but Texture Filtering to High, and turning Order Independent Transparency off.
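That elimination pass can be sketched in a few lines. Here `bench()` and the setting names are hypothetical stand-ins for running the in-game benchmark and reading its average FPS, not the game's actual options:

```python
# Sketch of dropping settings from Ultra to High one at a time until the
# benchmark average clears the 60 FPS target.
SETTINGS = ["textures", "shadows", "lighting", "mesh", "motion_blur"]

def find_stable_config(bench, target_fps=60):
    config = {s: "ultra" for s in SETTINGS}
    for setting in SETTINGS:
        if bench(config) >= target_fps:
            return config          # good enough, stop dropping
        config[setting] = "high"   # drop the next setting and retest
    return config

# Fake benchmark for illustration: each "ultra" setting costs 5 FPS off 70.
fake_bench = lambda cfg: 70 - 5 * sum(v == "ultra" for v in cfg.values())
best = find_stable_config(fake_bench)
print(best)
```

The point of the greedy order is that you stop as soon as the target is met, keeping as many settings on Ultra as the budget allows.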

Reason being, I wanted to see if having VSync on smoothed out the micro stutter, and I wanted the game to hold a solid 60 FPS with it on. It did, and last night I tried turning it off; even with everything still dropped to High, I got noticeably worse micro stutter.

So in my experience, going by frame rate alone does not tell the whole story. I can be averaging 90 FPS or more and still get micro stutter. With VSync on, it's limited to minimal amounts, and only just after coming out of a Dominate. With it off, I see it a lot, especially while moving.
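The average-vs-stutter point is easy to show with toy frame-time numbers (illustrative values only): two runs can share the same average FPS while one has a visible hitch.

```python
# Why average FPS hides micro-stutter: same average frame rate,
# very different frame-time consistency.
def stats(frame_times_ms):
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)
    return avg_fps, worst

smooth = [11.1] * 9            # steady ~90 FPS
stuttery = [8.0] * 8 + [36.0]  # also ~90 FPS average, with one 36 ms hitch

print(stats(smooth))    # ~90 FPS, worst frame 11.1 ms
print(stats(stuttery))  # ~90 FPS, worst frame 36 ms -> visible hitch
```

This is why benchmarking tools report frame-time percentiles alongside average FPS.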

That said, I'm still using Cat 14.6, but from what I've seen, every driver since has no improvements for my GPU, or for any of the games I play. I'll definitely be going back to Nvidia when Pascal comes out.
 

Vynavill

Honorable
Just wondering but...
Are you sure you're not mixing up micro-stutter with screen tearing? It feels weird that you're experiencing that with VSync off, considering it should be the other way around given how VSync works.
You also say you see it a lot when moving, which points towards tearing even more...
 


No, it's not screen tear. My display resists tearing very well, I'm really not using VSync for that, and I know the difference between the two.

 

Vynavill

Honorable
Uhhh... screen tearing directly correlates to your monitor's refresh rate and the application's framerate, so your monitor (like anybody else's) can't be "tear resistant"; it all depends on your hardware, the game's actual requirements, and its settings. I also wasn't trying to say you didn't know the difference. It's just that, reading your post, it really felt like it could have been screen tearing: you said you notice it a lot more with VSync off than on, and the main reason to use VSync is to remove screen tearing (or, more rarely, to shed useless workload from the GPU when it can run everything at a much higher framerate than the monitor's refresh rate).
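A minimal sketch of the trade-off being described, under an idealized 60 Hz model that ignores driver-side buffering: with VSync on, a frame is only shown at the next vertical blank, so frame times snap to multiples of the refresh interval instead of the swap tearing mid-scan.

```python
# Idealized VSync model at 60 Hz: a frame that misses one vblank waits
# for the next, so present times quantize to multiples of ~16.7 ms.
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per vblank at 60 Hz

def vsync_present_time(render_ms):
    # The frame is shown at the first vblank after rendering finishes.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10, 17, 30):
    print(f"{render_ms} ms render -> shown at {vsync_present_time(render_ms):.1f} ms")
```

Note the 17 ms frame: barely missing a vblank doubles its effective frame time, which is exactly the judder/stutter VSync can introduce, whereas with VSync off that frame would appear immediately but could tear.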

In any case, could literally be anything without any additional information. Just for the sake of completeness, can you post your full specs?
A 7970 backed by a decent CPU should be able to run SoM with everything on High at 1080p@60Hz. Ultra might occasionally dip into the mid-40s on 3GB models (when not in SLI/Crossfire setups) and most probably won't work at all on 2GB and lower (SLI/Crossfire or not), but it shouldn't stutter either way.
 
I'll leave it at your not understanding how displays can differ. I've seen tons of people complain of bad tearing, even on GPUs the same as or similar to mine, and the main difference was their display.

Yes, some displays are worse than others when it comes to tearing. This forum has a lot of noobs, though, so I'm not surprised to hear arguments to the contrary.