I've had a few questions lingering that I can't seem to find information about. The first question is about GPU load/stress. I downloaded GPU-Z and monitored the GPU Load bar while playing some games, and many times the load was well below 100%. So now my question is this: shouldn't the load always be at 100%, no matter the game, in order to achieve maximum FPS? That load reading was from a game I was playing at around 60-70 FPS. What if I had a 120 Hz monitor and wanted 120 FPS? How does the card know how many FPS to crank out, and doesn't that make benchmarks showing maximum FPS useless? I am really stumped on this. I thought the GPU should process as many frames as it can at all times. Do the driver and operating system control that, or is the game itself the limiting factor, or is the GPU-Z reading showing something completely irrelevant to what I'm asking?
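From what I've read, the game itself usually decides when to start the next frame: with vsync or a built-in frame limiter it waits for the monitor's refresh instead of rendering flat out. Here's a rough sketch of the kind of loop I mean (Python, all the numbers are made up, just to show the idea):

```python
import time

TARGET_FPS = 60            # e.g. vsync on a 60 Hz monitor, or an in-game FPS cap
FRAME_TIME = 1.0 / TARGET_FPS

def simulate_and_render():
    # Stand-in for the real per-frame work; pretend it only takes ~10 ms,
    # i.e. the GPU could do ~100 FPS if nothing held it back.
    time.sleep(0.010)

for _ in range(300):       # run ~5 seconds of frames for the example
    start = time.perf_counter()
    simulate_and_render()
    elapsed = time.perf_counter() - start
    # If the frame finished early, the game just waits for the next refresh
    # instead of rendering extra frames. During this wait the GPU sits idle,
    # which (as I understand it) is why a load monitor can read well below 100%.
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
```

If that's right, then with the cap removed (or a higher refresh rate) the same card would just keep rendering and the load would climb toward 100%, which I guess is what benchmarks measure.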
The second question is about drivers. Over the years, it seems that drivers (specifically Nvidia's) have drastically improved. If I took a really old card, say a GeForce 6 series (not the 600s, but the 6 series) from 6-8 years ago, ran benchmarks with the drivers of that era, and then compared the results against the same card running today's latest drivers like 310.70, would the benchmark results be vastly different? Doubled? Tripled? I was sure I could find somebody who had done this experiment before, but I haven't yet.