A majority of monitors, especially newer models, display excellent grayscale tracking (even at stock settings). It’s important that the color of white be consistently neutral at all light levels from darkest to brightest. Grayscale performance impacts color accuracy with regard to the secondary colors: cyan, magenta, and yellow. Since computer monitors typically have no color or tint adjustment, accurate grayscale is key.
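To make the idea concrete, here's a minimal sketch (assuming Python with the Pillow library; this is not part of our test workflow) that generates the kind of full-field gray patches a meter reads at each brightness step. A truly neutral gray has equal R, G, and B values; any channel imbalance tints the white point.

```python
# Rough sketch: neutral gray patches from 0 to 100 percent in 10-percent steps.
# Equal R, G, and B values mean neutral gray; an imbalance tints the white point.
from PIL import Image  # assumes the Pillow library is installed

for pct in range(0, 101, 10):
    level = round(pct / 100 * 255)                         # 8-bit level for this step
    patch = Image.new("RGB", (400, 400), (level, level, level))
    patch.save(f"gray_{pct:03d}.png")                      # display full-screen and measure
```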
Since most folks don't calibrate their monitors, we'll show you the results from three of the RL2460HT's picture modes without any adjustments.

Fighting is BenQ's default mode, and it takes some artistic license with color, as you'll see illustrated in the gamut results. Fortunately, grayscale performance isn't too bad. The white point runs a little blue as brightness rises, and there's a slight green error in the darkest areas of the screen. The mid-tones (30-50 percent) are the best part of this chart.

Changing to Standard mode makes the tracking a little more linear, if not entirely flat. Green errors are visible from 50 percent on up; the exception is 100 percent, where the white point suddenly improves.

The best out-of-box accuracy comes from the sRGB preset. Unlike a lot of other monitors, you can still change brightness and contrast in this mode. The only locked-out controls are color temp and gamma. For an uncalibrated picture mode to measure so well on a sub-$250 monitor is pretty astounding.

Going back to the Standard mode, we tweaked the RGB sliders and recorded a superb result. With the exception of 0 and 10 percent, all errors are well under one Delta E.
Here is our comparison group:

Since Fighting is the RL2460HT's default mode, we're using that as our stock Delta E value. To BenQ's credit, it doesn't claim this mode meets typical color standards. According to the company's marketing, it was created to highlight certain colors in fighting games, so the modifications are intentional. That said, an average of 3.65 Delta E is not a bad result; the error is barely visible.

Calibrating the Standard mode vaults the RL2460HT into some impressive (and expensive) company. An average error of 0.71 Delta E is right up there with all of the pro monitors we've tested.
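For readers curious how those numbers come about: Delta E is the distance between a measured gray and its neutral target in CIE Lab space. Below is a rough sketch using the simple 1976 (CIE76) formula with made-up readings; calibration software typically uses more refined variants such as Delta E 2000.

```python
import math

def delta_e_76(lab1, lab2):
    """Euclidean distance between two CIE Lab colors (the original 1976 Delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings: a slightly blue 80-percent gray vs. a neutral target.
target   = (80.0, 0.0, 0.0)    # L*, a*, b* of a perfectly neutral gray
measured = (80.0, -0.5, -3.5)  # negative b* indicates a blue shift

print(round(delta_e_76(measured, target), 2))  # ~3.54, roughly the "barely visible" range
```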
Gamma Response
Gamma is the measurement of luminance levels at every step in the brightness range from 0 to 100 percent. It's important because poor gamma can either crush detail at various points or wash it out, making the entire picture appear flat and dull. Correct gamma produces a more three-dimensional image, with a greater sense of depth and realism. Meanwhile, incorrect gamma can negatively affect image quality, even in monitors with high contrast ratios.
In the gamma charts below, the yellow line represents 2.2, which is the most widely used standard for television, film, and computer graphics production. The closer the white measurement trace comes to 2.2, the better.
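As a rough illustration of what the charts plot: the effective gamma at each step can be derived from the measured luminance relative to the display's own black and white levels. The sketch below uses made-up meter readings and one common convention (subtracting the black level first); measurement packages differ in the details.

```python
import math

def gamma_at_step(signal_pct, luminance, black, white):
    """Derive the effective gamma exponent at one stimulus level from measured luminance."""
    y = (luminance - black) / (white - black)   # normalize to the display's own range
    v = signal_pct / 100.0                      # normalized input signal level
    return math.log(y) / math.log(v)

# Hypothetical meter readings in cd/m2 for a 50-percent gray patch:
black, white = 0.12, 200.0
print(round(gamma_at_step(50, 42.0, black, white), 2))  # ~2.25, slightly darker than the 2.2 target
```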
Our gamma results presented us with some choices. Like most computer monitors, the RL2460HT offers multiple gamma presets. Only two measure close to our preferred average value of 2.2, though. Neither curve is ideal, so we're showing you both.

Gamma 3 offers the flattest tracking, but it's a little too dark, averaging over 2.4. Since the RL2460HT is not super-bright, this preset might make the image too dim for some tastes. That was our impression, and it compelled us to try the Gamma 2 preset as well.

Gamma 2 tracks right around the 2.2 mark. However, it shows a dip-and-rise at 10 percent and a hump at 80 percent. Still, when it comes to watching real-world content, this option looks better. The errors are small, and you might not even be able to distinguish them from a perfectly flat trace. In case you're wondering, sRGB mode generates exactly the same result.
Here is our comparison group again:

We include both runs in our round-up so you can make up your own mind which gamma to choose. Gamma 3 has the tightest tracking. Even though its average value is too high (meaning too dark), it’s much more consistent than Gamma 2.
Gamma deviation is calculated by simply expressing the difference from 2.2 as a percentage.
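As a quick worked example of that calculation (the 2.4 figure comes from the Gamma 3 result above; the near-2.2 value is only a hypothetical stand-in, since we don't quote Gamma 2's exact average):

```python
def gamma_deviation(measured_avg, target=2.2):
    """Express the difference from the 2.2 target as a percentage."""
    return abs(measured_avg - target) / target * 100

print(round(gamma_deviation(2.40), 2))  # ~9.09% for Gamma 3's average of about 2.4
print(round(gamma_deviation(2.18), 2))  # ~0.91% for a hypothetical near-2.2 average
```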

On the other side of the equation, Gamma 2 comes much closer to 2.2 than Gamma 3. Ultimately, that’s why it’s our preference. If you recall the calibration notes from page three, you need to set the brightness, contrast, and RGB sliders differently for each gamma preset. You can’t just switch between them without altering the calibration.
No, but chances are if you're dropping $300+ on a monitor and genuinely want the extra frame rate, you'll be the type of person who is ready and expecting to tweak the game's files to run at those frame rates.
I regularly finish in first place in multiple games when playing multiplayer.
60Hz is not the problem; the problem is your system if it CAN'T sustain 60 fps.
http://frames-per-second.appspot.com/
I don't think competitive players win because they have 144Hz monitors and can react with all that information being fed to them. I think they win because they are proactive, and there are many tells anyway that allow someone who's tuned in to the game to react quickly.
I mean, StarCraft has choppy animation that is independent of refresh rate (the animations look like they run at 20 FPS), but there's a lot of high-level competition there.
You do have a point with newer games that have very nice graphics. For titles such as BF, Metro LL, and Arma 3 you need a beefy GPU setup, or people turn down the settings (eye candy is nice, but if it's going to be a slideshow it isn't worth it). However, older titles like CS:GO, where having the higher FPS gives you an edge, don't take much to hit 200+ FPS. Basically, a computer with at least an i5 and a 6970 or 580 can hit 100+ FPS in older titles. For newer titles you want an i5/i7 (depending on whether the game takes advantage of hyper-threading) and a 7970 (280)/290X or 680/780. CrossFire or SLI helps, but I personally find the gaming experience smoother playing CS:GO on one 7970 instead of two in CrossFire, and with one I'm still well over 100 FPS. When I play BF4 with CrossFire enabled and high settings (some things turned down), I get over 100 FPS on the DX11 API. When I try Mantle (when it works...), I get an extra 10 FPS if I'm lucky, and it feels smoother. You can also check Tom's GPU charts or even their recently released SMB. I own an Asus 144Hz and can never go back to playing FPS games on anything less. I just wish they would catch up to my golden days with CRT refresh rates.
Yes, to a point. The 100 FPS ceiling, or headroom, gives you a 20-30 FPS buffer to account for high-action scenes where your FPS might be pulled down to a noticeable level.
Being able to tell 60 FPS from 80 FPS might not be so easy. Watching 100 FPS drop to below 60 FPS can be detected, though; even if you don't "see" it, something becomes noticeable in the gameplay.
This may be a silly question, and if it is I apologize, but are you sure your 27" LG HDTV is actually a 120Hz panel? I seem to recall LG being one of the manufacturers that have sold TVs with the description '120Hz' in the past, when all they were really referring to was a post processing trick to smooth out 60Hz feeds to get rid of motion blur, with the actual panel itself still being a 60Hz one.
I could of course be completely wrong, your panel is natively 120Hz, and you just don't see a difference between 60Hz and 120Hz (many people don't - like you say, everyone has different tolerances on what they can see when it comes to refresh rates); I just wanted to clarify and double check.
People's views are also different. I aim for 60 fps at the highest possible detail, such as BF4 at ultra at 2560x1440 using a Dell 27-inch U2713HM and an AMD 290X, which usually manages 55 to 80 FPS. Once you go 1440p, you realise just how crap 1080p looks. Cannot wait for 4K! The extra color of sRGB is also so much better than standard color. My advice to people is to spend good money on a screen; it's worth it.
Oops actually it's a 47" HDTV, my Dell is the 27". And yes, it's native 120Hz. And I can't tell any difference when test gaming on my 600Hz 42" Samsung plasma either just for the record. The older and lower-level LG TVs simulate 120Hz for those models that advertised 120Hz.
No TVs are true 120Hz. They may have 120Hz panels, but they only accept 60Hz input, and then either make up interframes or insert blank frames, etc. Your 600Hz plasma definitely isn't able to be fed a 600Hz input, as there is no connectivity standard capable of transporting that amount of data (10x 1080p60???).
If you connect to your "120Hz" TV and look in the advanced monitor settings in Windows, you will notice that it is still outputting 60Hz.
These TVs are amazing for video and Blu-ray, which is what they are designed for. Blu-ray content tops out at 60 fps, so to get more you make up interframes, and it looks good. Also, 24 fps video can now run without 3:2 pulldown (as you display each frame five times).
They are not designed for PC input.