Hello everyone! It's been a really long time since I last posted a thread, so I hope I'm not breaking any rules here haha!
I upgraded from a Gigabyte GTX 970 to an Asus Strix GTX 1070 a few days ago and I can't figure out why the GPU usage is so low. In most games I get 60-70%, but in some titles only 30%, and most of the time the performance is the same or even worse than with the previous card.
I want to point out a few things. First of all, I need the full 95-100% usage the GPU can provide, because I'm playing on a 1080p 144Hz screen and YES, I can see the difference every day, so I'd be happy if the card delivered all the power it's capable of. I don't want to upgrade to a 1440p or 4K display, and I don't want to max out my settings (a lot of people suggest choosing one of those options to push the card to 100%, but I'd rather use its power to reach the maximum refresh rate my monitor supports, in this case 144 FPS).
So I started digging into my current setup to see if something is bottlenecking my GPU, but even though it's a pretty old system, everything seems fine to me. Here is the full spec list:
MOBO: Asus Maximus IV GENE-Z/GEN3
CPU: i7-3770K @ 4.2GHz (Cooler: Corsair H100)
RAM: 4x4GB Corsair Vengeance 2133MHz
GPU: Asus Strix GTX 1070 (not OC yet)
SSD: Kingston HyperX Fury 120GB (for OS)
HDD: 2TB Seagate Barracuda (for data and games)
PSU: Corsair TX650 v2
OS: Windows 10 Pro
Monitor: Asus VG278HV
The CPU should be fine; as far as I know, every i7 from the 2600K onward shouldn't bottleneck any GPU. The only really weak part of the system is that Seagate drive; even though it's 7200 RPM, it's really slow. I can live with the game load times, but does it have any impact in-game? I tried installing Fortnite (since it's not that big) on my small 120GB SSD, but I got the same performance.
Speaking of Fortnite: at around medium settings I get 70-100 FPS with less than 50% GPU usage, and in areas with a lot of buildings and players it can drop as low as 30-40 FPS at 20-30% GPU usage. When I max out the game, the frame rate stays roughly the same (sometimes even higher), but GPU usage rises to 70-90%.
So basically the GPU refuses to reach a higher FPS, or somehow only draws the minimum amount of power it needs to run the game.
In HITMAN (2016) it's the same thing: around 60% usage at medium settings with 70-80 FPS, and with everything on ultra, mostly the same frame rate but at 99% load.
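Here's a little sketch I put together to reason about this pattern (the function name and the 10%/90% thresholds are just my own guesses, not anything official): if raising the settings barely changes the frame rate while GPU usage was well below 100%, then something other than the GPU, most likely the CPU, is setting the frame rate.

```python
def looks_cpu_bound(fps_low_settings, fps_high_settings, gpu_util_low_settings):
    """Heuristic: if FPS barely moves when graphics settings go up,
    while GPU usage sat well below 100% at the low settings, the
    frame rate is being capped by something else (usually the CPU)."""
    fps_change = abs(fps_high_settings - fps_low_settings) / fps_low_settings
    return fps_change < 0.10 and gpu_util_low_settings < 0.90

# My HITMAN numbers: ~75 FPS at medium (60% GPU) vs ~75 FPS at ultra (99% GPU)
print(looks_cpu_bound(75, 75, 0.60))  # True -> classic CPU-bottleneck pattern
```

By this logic my HITMAN and Fortnite numbers both look CPU-bound, which would explain why maxing out the settings raises GPU usage without lowering FPS much.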
Also, really important, I forgot to mention: this card came out of a mining rig that ran it non-stop for a year or so. I know that can damage a card in a lot of ways, but I ran benchmarks beforehand and everything looked fine; in benchmarks like FurMark I get 99% usage the whole time.
The only clue I've discovered recently is that the GPU temperature won't go above a fixed value that depends on the game: in one game 59°C was the max and it always sat there; in others 55°C was the ceiling, so it either stayed pinned at that temperature or dropped below it. In MSI Afterburner the Temp Limit is set to 83°C by default. Is there any other way the card could be restricted?
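To check whether something is actually capping the card, I'm planning to look at the throttle reasons that `nvidia-smi -q -d PERFORMANCE` prints. Here's a small sketch that picks out the ones marked Active; the sample text below just mimics the shape of that output, it's not a real capture from my card:

```python
def active_throttle_reasons(query_output: str) -> list[str]:
    """Parse 'nvidia-smi -q -d PERFORMANCE' style text and return the
    throttle reasons currently reported as Active."""
    reasons = []
    in_section = False
    for line in query_output.splitlines():
        stripped = line.strip()
        if stripped.startswith("Clocks Throttle Reasons"):
            in_section = True  # the reasons are indented under this header
            continue
        if in_section:
            if ":" not in stripped:
                in_section = False  # left the throttle-reasons block
                continue
            name, _, state = stripped.partition(":")
            if state.strip() == "Active":
                reasons.append(name.strip())
    return reasons

# Illustrative sample (shape only, not my actual capture):
sample = """\
    Clocks Throttle Reasons
        Idle                              : Not Active
        Applications Clocks Setting       : Not Active
        SW Power Cap                      : Active
        HW Slowdown                       : Not Active
"""
print(active_throttle_reasons(sample))  # ['SW Power Cap']
```

If anything other than "Idle" shows up as Active while a game is running at low usage, that would point at a power/thermal/driver cap rather than a CPU bottleneck.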
I couldn't find any other way to fix this problem. I uninstalled all the NVIDIA drivers and reinstalled them, but it didn't help.
Thank you so much for reading this long post haha!