GPU not at 100%

Giorgos

Distinguished
Feb 19, 2011
26
0
18,530
I play a game called Darkfall. With my old 8600M GT I had 150 fps and the GPU was at 100% usage. Now, with a better CPU and a 525M GPU, I get 80 fps and 65% usage. Neither the framerate nor the usage really changes whichever graphics settings I use. Is this a problem with the card's drivers or with the game?
 
Well, is it not playing well? This game does not need much to run; I looked at the recommended specs and they are low. My guess is that the newer drivers just don't make the card work that hard, because it doesn't need to in order to play smoothly. A video card does not sit at 100% regardless of the task: newer cards try to balance the power needed against the power draw, and the mobile versions even more so.

T
 

caqde

Distinguished
Well, another thing to consider is that, depending on the game, this card may not be faster than the 8600GT (desktop or notebook). Although many of its specs are higher, one part is slower: the 8600GT has 8 pixel shaders while the 525M has 4, meaning it can't shade pixels at the same rate, and this may lower framerates in certain games compared to the 8600GT.

Max shader rate for each card: 8600GT = 4.32 GP/s, 8600M GT = 3.8 GP/s, 525M = 2.4 GP/s. Given that your framerates are about 53% of what you had, and the GPU's shading power is a similar fraction of the older GPU's, I would say this game is likely highly dependent on pixel shading.
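For what it's worth, the ratio argument above can be sanity-checked in a few lines. This is just a sketch using the framerates and fill rates quoted in this thread (treat them as assumptions, not verified specs); the fps drop works out to about 53%, while the fill-rate ratio is roughly 63% against the 8600M GT and roughly 56% against the desktop 8600GT, so the correlation is rough rather than exact.

```python
# Sanity check of the fill-rate vs framerate ratio argument,
# using only the numbers quoted in this thread (assumed, not verified).
fps_old, fps_new = 150.0, 80.0   # Giorgos's reported framerates
fill_8600m_gt = 3.8              # GP/s, quoted above (notebook part)
fill_8600gt = 4.32               # GP/s, quoted above (desktop part)
fill_525m = 2.4                  # GP/s, quoted above

fps_ratio = fps_new / fps_old                 # ~0.53
fill_ratio_mobile = fill_525m / fill_8600m_gt  # ~0.63
fill_ratio_desktop = fill_525m / fill_8600gt   # ~0.56

print(f"fps ratio:             {fps_ratio:.2f}")
print(f"fill ratio vs 8600M GT: {fill_ratio_mobile:.2f}")
print(f"fill ratio vs 8600GT:   {fill_ratio_desktop:.2f}")
```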
 

legendkiller

Distinguished
Jun 19, 2011
1,812
0
19,960
THE GAME IS TOO OLD!... Depending on how old the game is, it won't use 100% of the GPU. For example, Warcraft 3 is old, like 10 years old or so, and my GPU (GT 220) isn't getting used 100%, nor even near 80%, but averages 40-60%, and my FPS goes from 60 down to 23 when there's real action going on...
 

Giorgos

Distinguished
Feb 19, 2011
26
0
18,530
caqde has a good point here. I used to run this game on a GTX 460M; I had more than 200 fps and the GPU was at 100% constantly. How many pixel shaders does the 460M have? Also, where do you find this info? GPU-Z for the 525M says Shaders: 96 Unified and Pixel Fillrate: 9.6 GPixel/s.
 

legendkiller

Distinguished
Jun 19, 2011
1,812
0
19,960

Whatever GPU it is doesn't matter; it's what the game supports and whether it even supports the full speed of the GPU... To get 200+ FPS in a low-end game, get a GTX 580 and you won't be slowed down even at 20% GPU usage...
 

caqde

Distinguished
Well, I guess looking further I was a bit wrong about what was slower: the number I was looking at was ROPs (Render Output Units), which means the figure was pixels drawn to the final screen buffer per second. So your new card is faster at modifying the data being drawn to the screen, but it can't keep up with the old one when it comes to actually drawing that data to your screen.

On another point, Legend, that makes no sense. Of course it matters what your GPU is, and it also matters how the game was coded. Both matter, not necessarily in the same amount, but they both matter.

Information on the technical aspects of Nvidia GPU's was obtained from -> http://en.wikipedia.org/wiki/Comparison_of_NVIDIA_Graphics_Processing_Units#GeForce_8_.288xxx.29_series

There are other sites that provide this information, including www.techarp.com, but that was the first one I went to, as both desktop and mobile parts are on the same page on Wikipedia.
 

Giorgos

Distinguished
Feb 19, 2011
26
0
18,530
It turns out there is a problem with the CPU. The reason the GPU isn't at 100% is that the CPU is the bottleneck, sitting at only 25-30%, and Darkfall is a CPU-intensive game. I remember the T8300 being at 70%+ all the time. Is the T8300 in any way better than the 2410M?
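One plausible reading of those usage numbers (a sketch, assuming the published core/thread counts for the two CPUs): a mostly single-threaded game pegging one logical CPU shows up as low *total* usage on a chip with more threads, which would explain 25-30% on the i5-2410M versus much higher readings on the T8300.

```python
# How one fully loaded thread appears in "total CPU usage", per CPU.
# Core/thread counts are from Intel's published specs for these parts.
cpus = {
    "Core 2 Duo T8300": 2,  # 2 cores, 2 threads
    "Core i5-2410M":    4,  # 2 cores, 4 threads (Hyper-Threading)
}
for name, logical_cpus in cpus.items():
    one_thread_pct = 100.0 / logical_cpus
    print(f"{name}: one pegged thread reads as {one_thread_pct:.0f}% total usage")
```

On the i5-2410M one saturated thread reads as 25%, right in the reported 25-30% range, so the game's main thread may be maxed out even though Task Manager's total looks low.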
 

Giorgos

Distinguished
Feb 19, 2011
26
0
18,530
I solved the puzzle today. I noticed that the more the HD 3000 is used, the less CPU Darkfall uses, for some reason. So I disabled Windows Aero, which was using the HD 3000, and now I have much higher fps.

Is there an explanation for that? Can I use the 525M exclusively for all apps, including Windows?