As I understand it, the response time is the time from when the blur appears until the frame is fully made. I don't know if "made" is the right word, though. Anyway, a 144 Hz monitor shows 144 frames each second, which means 1000 ms / 144 ≈ 7 ms. So each frame has about 7 milliseconds before the next frame takes its turn, and a 1 ms response time display would show blur for the first 1 ms and the finished frame for the remaining 6 ms.

However, I have seen a 10 ms 144 Hz monitor on Amazon. How can the monitor have a 10 ms response time if each frame only has 7 ms to show? The next frame would arrive while the first frame hadn't even finished being "made", which would cause blur all the time and make the monitor useless. Am I missing something here?
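The arithmetic behind the question can be sketched like this (a minimal illustration; the refresh rates and the 10 ms figure are just the numbers from the question):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """How long each frame stays on screen at a given refresh rate."""
    return 1000.0 / refresh_hz

# At 144 Hz, each frame gets roughly 6.94 ms on screen.
frame = frame_time_ms(144)
print(f"frame time at 144 Hz: {frame:.2f} ms")

# The puzzle: the quoted response time is longer than the frame time.
response_ms = 10.0
print(response_ms > frame)  # pixels would still be transitioning when the next frame arrives
```

If the comparison prints `True`, that is exactly the apparent contradiction being asked about: the pixel transition would outlast the frame itself.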