Color gamut is measured using a saturation sweep that samples the six main colors (red, green, blue, cyan, magenta, and yellow) at five saturation levels (20, 40, 60, 80, and 100 percent), yielding a more realistic view of color accuracy.
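The sweep described above can be sketched in a few lines of Python. This is only an illustration of the methodology: the `delta_e_76` helper uses the simple CIE76 (Euclidean L\*a\*b\*) formula, and the reference/measured values are made-up sample readings, not the review's actual data.

```python
from itertools import product
from math import sqrt

# The sweep: six primary/secondary colors at five saturation levels.
COLORS = ("red", "green", "blue", "cyan", "magenta", "yellow")
LEVELS = (20, 40, 60, 80, 100)  # percent saturation
patches = list(product(COLORS, LEVELS))  # 6 x 5 = 30 measurement patches

def delta_e_76(lab_ref, lab_meas):
    """CIE76 Delta E: Euclidean distance between two L*a*b* triples."""
    return sqrt(sum((r - m) ** 2 for r, m in zip(lab_ref, lab_meas)))

# Hypothetical readings for two patches (illustrative values only).
reference = {("red", 100): (53.2, 80.1, 67.2), ("blue", 20): (40.0, 10.0, -30.0)}
measured  = {("red", 100): (52.4, 78.8, 66.5), ("blue", 20): (40.5, 11.2, -31.0)}

errors = [delta_e_76(reference[p], measured[p]) for p in reference]
average_error = sum(errors) / len(errors)
```

Averaging the per-patch errors over all 30 patches is what produces a single figure like the 1.19 Delta E quoted later in the review.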
First up is the Fighting mode, which by design makes some alterations to the standard sRGB color gamut.

Red and magenta show the largest deviations, but all of the colors are off in saturation, hue, and luminance by varying degrees. Using the Fighting mode is purely a matter of personal preference. In our experience, games look best when the display is calibrated to a proper sRGB color gamut.

The Standard mode is pretty good overall. Our only real concern is the under-luminance of blue, red, and magenta. Those same colors are over-saturated at the 20-, 40-, 60-, and 80-percent levels. If you only look at the 100-percent saturations, the gamut looks very good.

Calibrating the grayscale improves the gamut results significantly. Now the luminance levels are near-perfect and the saturation problems have been mostly fixed. Red is a little under, but only just. This is a great example of how getting the white point correct can improve a display's total color performance.
Now we return to the comparison group:

An average error of 1.19 Delta E is amazing when you consider that the RL2460HT is the least expensive monitor in our round-up. We wouldn’t expect a gaming monitor to perform this well, but we’ll take it! BenQ really raises the bar here.
Gamut Volume: Adobe RGB 1998 And sRGB
There are basically two categories of displays in use today: those that conform to the sRGB/Rec. 709 standard like HDTVs, and wide-gamut panels that show as much as 100 percent of the Adobe RGB 1998 spec. We use Gamutvision to calculate the gamut volume, based on an ICC profile created from our actual measurements.

The RL2460HT is clearly an sRGB-only display, and its large sRGB gamut volume of 97.72 percent goes hand in hand with the 1.19 Delta E average error from our color saturation sweep test. Few professionals would put this monitor on their short list, but it could serve for video work, where the wider Adobe RGB gamut isn't required.
- BenQ RL2460HT 24” TN Gaming Monitor Review
- Packaging, Physical Layout, And Accessories
- OSD Setup And Calibration Of The BenQ RL2460HT
- Measurement And Calibration Methodology: How We Test
- Results: Brightness And Contrast
- Results: Grayscale Tracking And Gamma Response
- Results: Color Gamut And Performance
- Results: Viewing Angles And Uniformity
- Results: Pixel Response And Input Lag
- BenQ RL2460HT: Half The Speed Equation
No, but chances are that if you're dropping 300+ on a monitor and genuinely want the extra frame rate, you will be the type of person who is ready and expecting to tweak the game's files to run at those frame rates.
I achieve first place in multiple games when playing multiplayer, on a regular basis.
60Hz is not the problem; the problem is your system if it CAN'T sustain 60 FPS.
http://frames-per-second.appspot.com/
I don't think competitive players win because they have 144Hz monitors and can react with all that information being fed to them. I think they win because they are proactive, and that there are many tells anyway to allow someone who's tuned in the game to react quickly.
I mean, StarCraft has choppy animation that is independent of refresh rates (they look like they move at 20FPS), but there's a lot of high level competition there.
You do have a point with newer games that have very nice graphics. In titles such as BF4, Metro: Last Light, and Arma 3, you need a beefy GPU setup, or you turn the settings down. (Eye candy is nice, but if it's going to be a slideshow it isn't worth it.) However, older titles such as CS:GO, where higher FPS gives you an edge, don't take much to reach 200+ FPS. Basically, a computer with at least an i5 and a 6970 or 580 can hit 100+ FPS in older titles; newer titles want an i5/i7 (depending on whether the game takes advantage of Hyper-Threading) and a 7970 (280)/290X or 680/780. CrossFire or SLI helps, but I personally find CS:GO smoother on one 7970 than on two in CrossFire, and with one I'm still well over 100 FPS. When I play BF4 with CrossFire enabled and high settings (some things turned down), I get over 100 FPS on the DX11 API. When I try Mantle (when it works...) I get an extra 10 FPS if I'm lucky, and it feels smoother. You can also check Tom's GPU charts or their recently released SBM. I own an Asus 144Hz monitor and can never go back to playing FPS games on anything less. I just wish they would catch up to my golden days with CRT refresh rates.
Yes to a point. The 100 FPS ceiling or headroom gives you a 20-30 FPS buffer to account for high action scenes where your FPS might be pulled down to a noticeable level.
Being able to tell 60 FPS from 80 FPS might not be so easy. Watching 100 FPS drop to below 60 FPS can be detected, though; even if you don't "see" it, something becomes noticeable in the gameplay.
This may be a silly question, and if it is I apologize, but are you sure your 27" LG HDTV is actually a 120Hz panel? I seem to recall LG being one of the manufacturers that have sold TVs with the description '120Hz' in the past, when all they were really referring to was a post processing trick to smooth out 60Hz feeds to get rid of motion blur, with the actual panel itself still being a 60Hz one.
I could of course be completely wrong; your panel may be natively 120Hz, and you just don't see a difference between 60Hz and 120Hz (many people don't; like you say, everyone has different tolerances for what they can see when it comes to refresh rates). I just wanted to clarify and double-check.
People's views are also different. I aim for 60 FPS at the highest possible detail, such as BF4 on ultra at 2560x1440 using a Dell 27-inch U2713HM and an AMD 290X, which usually manages 55 to 80 FPS. Once you go 1440p, you realise just how crap 1080p looks. Cannot wait for 4K! Full sRGB color is also so much better than a standard gamut. My advice to people is to spend good money on a screen; it's worth it.
Oops actually it's a 47" HDTV, my Dell is the 27". And yes, it's native 120Hz. And I can't tell any difference when test gaming on my 600Hz 42" Samsung plasma either just for the record. The older and lower-level LG TVs simulate 120Hz for those models that advertised 120Hz.
No TVs are true 120Hz. They may have 120Hz panels, but they only accept a 60Hz input, and then either make up in-between frames or insert blank frames, etc. Your 600Hz plasma definitely can't be fed a 600Hz input, as there is no connectivity standard capable of transporting that amount of data (10x 1080p60???).
If you connect to your "120Hz" TV and look in the advanced monitor settings in Windows, you will notice that it is still outputting 60Hz.
These TVs are amazing for video and Blu-ray, which is what they are designed for. Blu-ray tops out at 60 fps, so to get more, the TV makes up in-between frames, and it looks good. Also, 24 fps video can now run without 3:2 pulldown (since a 120Hz set displays each frame 5 times).
They are not designed for PC input.
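The pulldown point in the comments above comes down to simple arithmetic: 120 is an integer multiple of 24, while 60 is not. A minimal sketch (the `repeat_cadence` function is my own illustration, not from any library) shows how many refreshes each film frame occupies on a given display:

```python
import math

def repeat_cadence(source_fps, refresh_hz, frames=6):
    """Return how many display refreshes each of the first `frames`
    source frames is shown for, when the display repeats frames to
    match its refresh rate."""
    # Frame i spans refresh ticks [i*Hz/fps, (i+1)*Hz/fps); count whole
    # ticks per frame by differencing the rounded-up boundaries.
    bounds = [math.ceil(i * refresh_hz / source_fps) for i in range(frames + 1)]
    return [bounds[i + 1] - bounds[i] for i in range(frames)]

# 24 fps film on a 60Hz display forces uneven 3:2 pulldown...
print(repeat_cadence(24, 60))   # [3, 2, 3, 2, 3, 2]
# ...while 120Hz divides evenly: every frame is shown exactly 5 times.
print(repeat_cadence(24, 120))  # [5, 5, 5, 5, 5, 5]
```

The uneven 3:2 cadence is what causes judder on 60Hz sets; the even 5:5:5:5 cadence on a 120Hz panel is why 24 fps film plays back smoothly there.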