Nvidia Explains Why it Thinks High Frame Rates Matter in Competitive Games

(Image credit: Nvidia)

Some people value high performance just because they like to watch numbers rise. Others want to push their systems to deliver more frames per second (fps) for another reason: having an edge over their opponents in competitive titles. Nvidia published a blog post today explaining why it feels that playing with higher fps can be beneficial.

The blog post quantifies many of the arguments made by people who've played games at higher fps on equally high refresh rate monitors: animations are smoother, graphical issues are rarer and games are more responsive. These benefits aren't exclusive to competitive games, but they are particularly important there.

Smoother animations aren't just better to look at; Nvidia claims they're also crucial in high-stakes moments. Nvidia explained that "smoother animations help you track your target," because "when micro-correcting your aim for overshooting or undershooting, having a smooth target helps you get back on target faster." That's vital for aim-heavy titles.

The reduction of issues such as ghosting, which leaves "a trail behind the object typically found in the object's position from the previous frame," and screen tearing also matters. The former could lead people to shoot at where their opponent was rather than where they currently are, for example, while the latter is just distracting.

Perhaps the most important point in Nvidia's blog post has to do with system latency. Many people blame latency on their ISP, a game's servers or a loss of favor with their deity of choice. But Nvidia said that playing at higher frame rates (again, with high refresh rate monitors, natch) can help reduce latency at the system level.

The company explained how system latency works using the example of a player emerging from behind a corner in Counter-Strike: Global Offensive at 60 fps and 240 fps:

[E]ach system’s CPU receives the player’s position at the same time. In this example, the CPU and the GPU take approximately the same time to prepare and render the frame. The CPU portion of the pipeline on the 60 FPS system is 4 times longer than the 240 FPS system. Similarly, the GPU render time is also four times longer on the 60 FPS system. Finally, the display section is also 4 times longer on the 60 FPS system as the refresh cycle is 4 times slower than a 240 Hz display. On a 60 FPS/Hz system it simply takes longer to process and is therefore further behind the actual state of the game. At 240 FPS/Hz, the rendering is much closer to the actual state of the game, but there is still some difference.

And it explained what that means in terms of the actual playing experience:

Lower system latency allows you to see [the] player earlier. Additionally, reducing system latency makes the game feel more responsive as the time between your mouse movements and the results on screen is shorter. With these benefits together, lower system latency gives you a competitive edge on the battlefield.
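Nvidia's four-times arithmetic is easy to sanity-check. The sketch below (our simplification, not Nvidia's actual latency model) assumes, as in the quoted example, that the CPU, GPU-render and display stages each take roughly one frame interval:

```python
# Rough illustration of how frame rate scales each stage of the
# pipeline described above. Assumes each of the three stages
# (CPU, GPU render, display scan-out) takes one frame interval.

def pipeline_latency_ms(fps, stages=3):
    """Approximate end-to-end latency if each pipeline stage
    takes one frame interval at the given frame rate."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * stages

for fps in (60, 240):
    print(f"{fps} fps: {1000.0 / fps:.2f} ms/frame, "
          f"~{pipeline_latency_ms(fps):.1f} ms pipeline latency")
```

Under those assumptions, the 60 fps system sits roughly 50 ms behind the game state versus about 12.5 ms at 240 fps, which matches the four-times ratio Nvidia describes; real pipelines overlap stages, so treat these as illustrative ceilings rather than measurements.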

Now, there is a caveat to all this: higher fps and a 240Hz monitor won't suddenly turn you into an esports-level player. Some of those folks have managed to hone their skill playing on laptops with nothing but their CPU's integrated GPU and a trackpad; equipment only makes so much of a difference.

But anyone looking to have a better experience and potentially increase their skill beyond the limits of regular practice, well, they could do worse than to heed Nvidia's advice and push for higher fps. Checking out our GPU Hierarchy and CPU Hierarchy posts for top-performing products would probably be a good place to start.

Nathaniel Mott
Freelance News & Features Writer

Nathaniel Mott is a freelance news and features writer for Tom's Hardware US, covering breaking news, security, and the silliest aspects of the tech industry.

  • King_V
    Yeah, I'll buy into this when they can prove that people can, first of all, SEE things quickly enough that 100fps vs 240fps or more will make any sort of difference, and second, that they can physically respond that quickly.

They're trying to sell more powerful video cards, I get it - but this seems, to me, to be feeding into a line of BS about the "need" for faster frame rates that can't be backed.

    240fps = 0.00417 seconds per frame.
  • drea.drechsler
    Pumping unnecessarily high frames generates a lot of heat and wastes power. It's the computing equivalent of 1960's muscle cars... an illusion of superior performance wasting irreplaceable energy and filling the air with CO2 while bolstering fragile egos.

    Thank you, Nvidia, for such forward thinking.
  • Gurg
    Yet the impetus for TV buying and viewing is now 4k with more details rather than faster refresh rates.
  • Mottamort
    King_V said:
    Yeah, I'll buy into this when they can prove that people can, first of all, SEE things quickly enough that 100fps vs 240fps or more will make any sort of difference, and second, that they can physically respond that quickly.

They're trying to sell more powerful video cards, I get it - but this seems, to me, to be feeding into a line of BS about the "need" for faster frame rates that can't be backed.

    240fps = 0.00417 seconds per frame.
Actually, Linus Tech Tips did quite an in-depth video with different e-sports "pro" players using 60/144/240Hz setups and different tests.
Results were pretty conclusive... higher fps gives better results across the board.

    Next thing you'll say is that the eye can't see faster than 24fps :P
  • DSzymborski
    Gurg said:
    Yet the impetus for TV buying and viewing is now 4k with more details rather than faster refresh rates.

    You don't interact with a TV in the same manner you interact with a monitor when gaming.
  • King_V
    Mottamort said:
Actually, Linus Tech Tips did quite an in-depth video with different e-sports "pro" players using 60/144/240Hz setups and different tests.
Results were pretty conclusive... higher fps gives better results across the board.

    Link? Because that sounds pretty questionable at best. So, if they put pro players in front of monitors with randomly assigned refresh rates, these players could tell when it was 60, 75, 120, 144, 165, and 240? I doubt it.

    What was their methodology? Do they have a write-up? I'd rather read it than slog through a video.

    I'll believe it when there's a repeatable, provable process.

    Mottamort said:
    Next thing you'll say is that the eye can't see faster than 24fps :p
    You'd better take your crystal ball in to the shop - it's giving you bad info.


    drea.drechsler said:
    Pumping unnecessarily high frames generates a lot of heat and wastes power. It's the computing equivalent of 1960's muscle cars... an illusion of superior performance wasting irreplaceable energy and filling the air with CO2 while bolstering fragile egos.
    To be fair, you most certainly could feel the difference with the ridiculous gobs of torque those things produced. Of course, the tires, brakes, and suspensions of the day weren't up to that kind of abuse.

    Naturally, with the progress of technology, those old cars are laughably inefficient for fuel usage, as well as the amount of power they produce per amount of displacement, compared to today's standards.

    There were some odd outliers, in terms of efficiency, though, strangely enough.
  • nikolajj
    This article attracted a lot of deniers.
More frames = more good. (Diminishing returns apply, ofc.)

Passive media like TV and movies are different in that there is no user performance to be affected. Higher FPS is still more true to life, but people are so used to 24 FPS that they are thrown off by new standards like HFR. "Looks too realistic" and "It's like being there for real" are things that I have, for SOME reason, seen thrown around as negatives. Unbelievable!

The Linus Tech Tips video mentioned: https://www.youtube.com/watch?v=OX31kZbAXsA
  • Gurg
    DSzymborski said:
    You don't interact with a TV in the same manner you interact with a monitor when gaming.

Both are 4K (3840x2160) @ 60Hz, and both the computer and the TV are connected to Ethernet, though the PC drives the monitor over a 60Hz DisplayPort connection. My PC has a six-core processor and a GPU capable of 60 fps at 4K, while my TV has a "Quad-core CPU/Multi-core GPU for responsive streaming – experience ultra-smooth streaming of 4K video at up to 60 fps, instant search results and fast and fluid responsiveness". The monitor is 28" vs. 55" for the TV, although when factoring in distance from the screen, both have a similar effective viewing size.

    I admit I haven't tried gaming on my TV as I'm very often using both PC and TV at the same time. When there are competing ball games on, I often fire up my laptop to carry a second game. I can't really watch all three TV broadcasts simultaneously, as ATT Now limits me to two devices at a time.

    Just like King_V, I'd also like to see reliable, high-level research evaluating gaming performance differences above 60 fps among experienced, high-level gamers, to find the point of diminishing returns.
  • sizzling
    King_V said:
    Yeah, I'll buy into this when they can prove that people can, first of all, SEE things quickly enough that 100fps vs 240fps or more will make any sort of difference, and second, that they can physically respond that quickly.

They're trying to sell more powerful video cards, I get it - but this seems, to me, to be feeding into a line of BS about the "need" for faster frame rates that can't be backed.

    240fps = 0.00417 seconds per frame.

    This is such a difficult thing to prove. I liken it to audio. I grew up with studio grade equipment in my home and I know I am more sensitive to different equipment than most people. I have got people to try different setups and usually they will say they can’t tell the difference but I can. I’m not as sensitive as a professional musician but also I definitely notice a lot more than most people. I really would not be surprised if very experienced and skilled players can tell the difference that the vast majority cannot.
  • TJ Hooker
    Gurg said:
    I admit I haven't tried gaming on my TV as I'm very often using both PC and TV at the same time.
    If you're not gaming on your TV then reduced input response/latency (the goal of high refresh rates/fps) is irrelevant.