
Do I Need To Worry About Input Lag?

The Myths Of Graphics Card Performance: Debunked, Part 1

Myth: Graphics Cards Affect Input Lag

Let’s say you’re getting shot up in your favorite multi-player shooter before you have the chance to even react. Is your opposition really that much better than you? Could they be cheating? Or is something else going on?

Aside from the occasional cheat, which does happen, the truth might be that those seemingly super-human reflexes are at least partly assisted by technology. And they might have very little to do with your graphics card.

It takes time for what happens in a game to show up on your screen. It takes time for you to react. And it takes time for your mouse and keyboard inputs to register. Somewhat improperly, the delay between you issuing a command and the resulting on-screen action is commonly called input lag. So, if you press the trigger in a first-person shooter and your weapon fires 0.1 seconds later, your input lag is effectively 100 milliseconds.
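Since input lag is just the accumulation of delays along that chain, it can be sketched as a simple sum. The stage names and figures below are illustrative placeholders, not measurements of any particular system:

```python
# Rough model of end-to-end input lag: each stage in the chain adds delay.
# All figures are hypothetical examples, not measured values.
delays_ms = {
    "mouse_polling": 4,      # average wait for the next USB poll
    "game_processing": 17,   # roughly one frame of simulation at 60 FPS
    "render_queue": 33,      # frames queued ahead of the display
    "display": 10,           # panel response and video processing
}

total_lag_ms = sum(delays_ms.values())
print(f"estimated input lag: {total_lag_ms} ms")  # prints 64 ms
```

The point of the model is simply that no single component dominates; shaving any stage reduces the total.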

Human reaction times to visual stimuli vary. According to a 1986 U.S. Navy study, the average F-14 fighter pilot reacted to a simple visual stimulus in 223 ms. Perhaps counterintuitively, human beings actually react faster to sound than to visual input; reactions to auditory stimuli tend to be in the ~150 ms range.

If you're curious, you can test for yourself how quickly you react to either by clicking the simple visual test and then the audio test.

Fortunately, no matter how poorly configured your PC may be, it probably won't hit 200 ms of input lag. So, your personal reaction time remains the biggest influence on how quickly your character responds in a game.

As differences in input lag increase, however, they increasingly do affect gameplay. Imagine a professional gamer with reflexes comparable to the best fighter pilots at 150 ms. A 50 ms slow-down in input means that person will be roughly 33% slower (that's three frames on a 60 Hz display) than the competition. At the professional level, that's notable.
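The arithmetic behind that comparison takes only a few lines; the 150 ms reaction time, 50 ms of added lag, and 60 Hz display are the figures from the example above:

```python
reaction_ms = 150    # pro-level reaction time from the example above
added_lag_ms = 50    # extra input lag relative to the competition
refresh_hz = 60

frame_time_ms = 1000 / refresh_hz              # ~16.7 ms per refresh
frames_of_lag = added_lag_ms / frame_time_ms   # 3 frames behind
slowdown = added_lag_ms / reaction_ms          # ~33% slower to respond

print(f"{frames_of_lag:.0f} frames, {slowdown:.0%} slower")
```

Running this prints "3 frames, 33% slower": a whole-frame deficit that matters at the competitive extreme and is lost in the noise for everyone else.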

For mere mortals (including me; I scored 200 ms in the visual test linked above), and for anyone who would rather play Civilization V leisurely than Counter Strike 1.6 competitively, it’s an entirely different story; you can likely ignore input lag altogether.

Here are some of the factors that can worsen input lag, all else being equal:

  • Playing on an HDTV (even more so if its game mode is disabled) or playing on an LCD display that performs some form of video processing that cannot be bypassed. Check out DisplayLag's Input Lag database for a great list organized by model.
  • Playing on LCD displays built around slower panel types: IPS panels typically run 5-7 ms (G2G) response times, TN+Film panels can reach 1-2 ms (G2G), and CRT displays remain the fastest available.
  • Playing on displays with lower refresh rates; the newest gaming displays support 120 or 144 Hz natively.
  • Playing at low frame rates (30 FPS is one frame every 33 ms; 144 FPS is one frame every 7 ms).
  • Using a USB-based mouse with a low polling rate. The default 125 Hz rate is an 8 ms cycle time, yielding ~4 ms of input lag on average. Meanwhile, gaming mice can poll at up to 1000 Hz for ~0.5 ms average input lag.
  • Using a low-quality keyboard (keyboard input lag is 16 ms typically, but can be higher for poor ones).
  • Enabling V-sync, especially when combined with triple buffering (there is a myth that Direct3D does not implement triple buffering; in reality, Direct3D does support multiple back buffers, but few games exploit the option).
  • Playing with high render-ahead queues. The default in Direct3D is three frames, or 48 ms at 60 Hz. This figure can be increased to 20 for greater “smoothness” or dropped to one for increased responsiveness, at the cost of greater frame time variance and, in some cases, somewhat lower FPS overall. There is no zero setting; entering zero simply resets the value to the default of three. Check out Microsoft's write-up, if you're technically inclined.
  • Playing on a high-latency Internet connection. While this goes beyond what would strictly be defined as input lag, it effectively stacks with it.
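Several of the figures in the list above fall out of the same cycle-time arithmetic: the interval between events is just the reciprocal of the rate, and the average wait for the next event is half that interval. A quick sketch:

```python
def cycle_ms(rate_hz):
    """Interval between events at a given rate, in milliseconds."""
    return 1000 / rate_hz

# Mouse polling: average added lag is roughly half the polling interval.
print(f"125 Hz poll:  {cycle_ms(125):.0f} ms cycle, ~{cycle_ms(125)/2:.0f} ms avg lag")
print(f"1000 Hz poll: {cycle_ms(1000):.0f} ms cycle, ~{cycle_ms(1000)/2:.1f} ms avg lag")

# Frame rate: how long each frame persists on screen.
print(f"30 FPS:  {cycle_ms(30):.0f} ms per frame")
print(f"144 FPS: {cycle_ms(144):.0f} ms per frame")
```

This reproduces the numbers in the bullets: 8 ms/~4 ms for a stock USB mouse, 1 ms/~0.5 ms for a 1000 Hz gaming mouse, 33 ms per frame at 30 FPS, and about 7 ms at 144 FPS.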

Factors that do not make a difference include:

  • Using a PS/2 or USB keyboard (see a dedicated page in our article: Five Mechanical-Switch Keyboards: Only The Best For Your Hands)
  • Using a wireless or wired network connection (just try pinging your router if you don’t believe us; you should see ping times of less than 1 ms). 
  • Enabling SLI or CrossFire. The longer render queues required to enable these technologies are generally compensated by higher frame throughput.

Bottom Line: Input lag only matters in "twitch" games, and really matters only at highly competitive levels.

There is a lot more to input lag than just display technology or a graphics card. Your hardware, hardware settings, display, display settings, and application settings all influence this measurement.

Comments
  • 5 Hide
    ingtar33 , February 10, 2014 12:43 AM
    awesome article, looking forward to the next half.
  • 27 Hide
    blackmagnum , February 10, 2014 1:08 AM
    Myth #123: Gamers are lonely boys in Mother's dark basement or attic...
  • 4 Hide
    AlexSmith96 , February 10, 2014 1:09 AM
    Great Article! I love you guys for coming up with such a nice idea.
  • 2 Hide
    hansrotec , February 10, 2014 1:09 AM
with overclocking, are you going to cover water cooling? It would seem disingenuous to dismiss overclocking based on a generation of cards designed to run up to a given speed if there is headroom, and not include water cooling, which reduces noise and temperature. My 7970 (pre-GHz edition) is a whole different card water cooled vs air cooled: 1150 MHz without having to mess with the voltage, with temps in the 50s (°C) without the fans or pumps ever kicking up, whereas on air that would be in the upper 70s to lower 80s and really loud. On top of that, tweaking memory incorrectly can lower frame rate
  • 6 Hide
    hansrotec , February 10, 2014 1:18 AM
I thought my last comment might have seemed too negative, and I did not mean it in that light. I did enjoy the read, and look forward to more!
  • -1 Hide
    noobzilla771 , February 10, 2014 1:26 AM
    Nice article! I would like to know more about overclocking, specifically core clock and memory clock ratio. Does it matter to keep a certain ratio between the two or can I overclock either as much as I want? Thanks!
  • 5 Hide
    chimera201 , February 10, 2014 1:28 AM
    I can never win over input latency no matter what hardware i buy because of my shitty ISP
  • -1 Hide
    immanuel_aj , February 10, 2014 2:00 AM
I'd just like to mention that the dB(A) scale is attempting to correct for perceived human hearing. While it is true that 20 dB carries 10 times the sound power of 10 dB, because of the way our ears work it would seem only about twice as loud. At least, that's the way the A-weighting is supposed to work. Apparently there are a few kinks...
  • 0 Hide
    FunSurfer , February 10, 2014 3:35 AM
    On Page 3: "In the image below" should be "In the image above"
  • -1 Hide
    Formata , February 10, 2014 3:37 AM
    "Performance Envelope" = GeniusNice work Filippo
  • 0 Hide
    beetlejuicegr , February 10, 2014 4:19 AM
I just want to mention that dB is one thing, health of the GPU over time is another. In many cases I have seen graphics cards going up to 90°C before the default ATI/Nvidia driver starts to throttle down. I prefer a 50-70°C scenario
  • 16 Hide
    cats_Paw , February 10, 2014 4:45 AM
Awesometacular article. Not only is it a new standard for GPU performance, but the Human Benchmark and audio test were really fun! I'm normally very critical of Tom's articles because many times they feel a bit weak, but this one? 10/10
  • 0 Hide
    ubercake , February 10, 2014 5:00 AM
What's up with Precision X? It seems like they would update it every couple of months, and now there hasn't been an update since last June or July. Is EVGA getting out of the utility software business?
  • 8 Hide
    kzaske , February 10, 2014 5:01 AM
It's been a long time since Tom's Hardware had such a good article. Very informative and easy to read. Thank you!
  • -1 Hide
    ddpruitt , February 10, 2014 5:04 AM
    Very good article even though there are some technical errors. I look forward to seeing the second half! I would also be interesting in seeing some detailed comparisons of the same cards with different amounts and types of VRAM and case types on the overall impact of performance.
  • 12 Hide
    Jaroslav Jandek , February 10, 2014 5:38 AM
    Quote:
    The info on V-Sync causing frame rate halving is out of date by about a decade. With multithreading the game can work on the next frame while the previous frame is waiting for V-Sync. Just look at BF3 with V-Sync on you get a continous range of FPS under 60 not just integer multiples. DirectX doesn't support triple buffering.
    The behavior of V-Sync is implementation-specific (GPU drivers/engine). By using render ahead, swap chains, Adaptive V-Sync, etc., you can avoid frame halving.

    DirectX DOES support TB by using DXGI_SWAP_CHAIN_DESC.BufferCount = 3; (or D3DPRESENT_PARAMETERS.BackBufferCount = 2; for DX9). It actually supports more than triple buffering - Direct3D 9Ex (Vista+'s WDDM) supports 30 buffers.
  • 8 Hide
    Adroid , February 10, 2014 5:55 AM
    I would love to see a Tom's article on debunking the 2GB vs 4GB graphic card race. For instance, people spam the Tom's forum daily giving advice to buy the 4GB GTX 770 over the 2GB. Truth is, the 4 GB costs 50$ more and offers NO benefit over the 2GB. Even worse, I see people buying/suggesting the 4GB 760 over a 2GB 770 (which runs only 30$ more and is worth every penny). I am also curious about the 4GB 770 sli scenario. For everything I have seen, even in Sli the 4GB offers no real-world benefit (with the exclusion of MAYBE a few frames per second higher at 3 monitor scenarios, but the rates are unplayable regardless so the gain is negligible). The other myth is that the 4GB 770 is more "future proof". Give me a break. GPU and future proof do not belong in the same sentence. Further, if they were going to be "future proof" they would be "now proof". There are games that are plenty demanding to show the advantage of 2gb vs 4gb - and they simply don't. It's tiring seeing people giving shoddy advice all over the net. I wish a reputable website (Tom's) would settle it once and for all. In my opinion, the extra 2 GB of RAM isn't going to make a tangible difference unless the GPU architecture changes...
  • 0 Hide
    ubercake , February 10, 2014 5:55 AM
DisplayLag.com lists 120Hz and 240Hz HDTVs amongst the monitors, but the maximum input speed for the HDTVs' inputs equates to 60fps? Or am I missing something? If I buy a 240Hz refresh TV, that's output. It processes the 60Hz signal to transform it to a 240Hz output (usually through some form of frame duplication) to minimize motion blur. Does this DisplayLag.com site mentioned in the article compare apples to oranges by listing HDTVs with monitors as if they operate the same way, or am I way off here?