BenQ XL2720Z Monitor Review: A 27-Inch, 144 Hz Gaming Display

Many displays are marketed as gaming monitors, but only a tiny handful operate at refresh rates greater than 60 Hz. The BenQ XL2720Z sails along at 144 Hz and offers many gaming-specific features. Today we run it through our benchmark suite.

  1. What a coincidence. I got an XL2720Z just this Wednesday. I would've preferred to wait for the arrival of Haswell-E before buying a new monitor, but my Samsung T260 emits something that causes reddening of the skin around my nose, above my left eyebrow, and smack in the middle of my forehead. (Sunlight and fluorescent tubes don't do this to me.) Happy to say the XL2720Z does not cause me any injury, or at least not yet...

    It's still an interim monitor, though. What I really want is a large affordable WQHD or UHD IPS gaming monitor.
  2. When will monitor manufacturers understand that 1080p resolution is a JOKE, especially on a large 27" screen? In the early 2000s it might have been OK to have such a resolution, but nowadays it is no longer acceptable. Even for a 24" screen, the minimum resolution is 1920x1200. Until they give these 27" screens more pixels, they won't see any cash from me. I'd rather buy cheap monitors mail-ordered from Korea on eBay.

    Monitor manufacturers, please stop living in the 80s and stop hustling us with your prices!
  3. "Oh neat, a new monitor. Let's check the specs. Oh cool, 1080p and a bunch of buzzwords."
    Sometimes I feel like nobody is listening... or they're just trying to sell me a bridge for the low low.
    Decent review, but if you've got the GPU for 144 Hz, 2160p@60 is just a cootie shot away.
  4. So from the results, I still need an IPS with G-Sync, or I'm stuck with TN. I'm hoping by Christmas there are a dozen good choices of G-Sync monitors, with 144 Hz in there too (why not, what if I go AMD again after Maxwell?). Might as well get as many bases covered as possible. IF monitor makers are reading this: 2560x1600! Screw this 1440p crap. Wider is NOT better in monitors of these sizes. I want to scroll up and down less than I already do at 1920x1200, and at least at 1600p I stay the same. I won't buy 1440p.
  5. Quote:
    "Oh neat, a new monitor. Let's check the specs. Oh cool, 1080p and a bunch of buzzwords."
    Sometimes I feel like nobody is listening... or they're just trying to sell me a bridge for the low low.

    To add insult to injury, in 2008 I bought my first LCD (a 24" Samsung with 1920x1200 pixels) for 330 euros. Now a 24" Samsung would cost me 400 euros, and it has only 1920x1080 pixels.

    What on earth has happened? Why did prices go up while resolution went down? Why are there no 30" 2560x1600 TN panels out there for gamers at 700 euros?
  6. Quote:
    When will monitor manufacturers understand that 1080p resolution is a JOKE, especially on a large 27" screen? In the early 2000s it might have been OK to have such a resolution, but nowadays it is no longer acceptable. Even for a 24" screen, the minimum resolution is 1920x1200. Until they give these 27" screens more pixels, they won't see any cash from me. I'd rather buy cheap monitors mail-ordered from Korea on eBay.

    Monitor manufacturers, please stop living in the 80s and stop hustling us with your prices!


    This monitor is made for gaming, and true gamers don't care a great deal about resolution. We are in it for the refresh rates and the response time. I'd been gaming on an old CRT monitor until last year, when I switched to an Asus VG248QE. Even though it's still much slower than my old CRT, it works. I have two computers: one for gaming and one for everyday and video work. The gaming machine is built around a single R9 290X, a 4770K, and the Asus VG monitor. My other computer has crossfired 295s, a 3930K, and three IPS 4K monitors.

    A single card is better for latency, the 4770K is more than enough to feed the 290X, and the monitor has a fast refresh rate. It's better at gaming than my extremely high-dollar build. Peripherals are set up differently, as you can imagine: the gaming computer has Razer gear and a 7.1 headset; the other is mostly set up for 2.1, but I do have a 7.1 headset for room sound.
    @siman0

    "This monitor is made for gaming, and true gamers don't care a great deal about resolution"

    You mean online multiplayer gamers don't care a great deal about resolution.

    I prefer to play single-player FPS, where I do want all the eye candy, and I want to see the vegetation, desert, sky, etc. The main advantage of a PC is that it can provide better graphics; that is the whole point.

    Otherwise I could just go out, buy a 1080p TV, a Crapbox1, Crapbox360, or PlayStopper 4, and game on that thing in 1080p.

    I think monitor technology is not moving forward (in fact it's moving backwards) exactly because people are happy to buy this 1080p crap for 500 euros.
  8. This seems a little weird to me: the panel uses constant current to drive its LEDs because some people claim to see flicker at the ~20 kHz PWM frequency, yet the very same display uses backlight strobing to reduce blur, and that occurs at 144-288 Hz, roughly 100X lower.
  9. Serious question. Why not just buy a quality HDTV with a 120 Hz (or higher) refresh rate for your gaming monitor? Especially if you'll be gaming at 1920x1080. A neighbor has his PC hooked up to a quality HDTV, and it looks great to me. I've played Battlefield on it with no issues at all. It's pretty awesome!
  10. Quote:
    Monster Cookie:
    When will monitor manufacturers understand that 1080p resolution is a JOKE, especially on a large 27" screen? In the early 2000s it might have been OK to have such a resolution, but nowadays it is no longer acceptable. Even for a 24" screen, the minimum resolution is 1920x1200.


    In the early 2000s, CRTs were still the standard, and 4:3 was the standard aspect ratio. There were no 1080p LCD monitors, let alone large 1080p LCD monitors, and I paid ~$1200 (NZD) for a 17" 1280x1024@60Hz (16 ms) LCD display in 2003. That's how bad it was back then.
    DookieDraws said:
    Serious question. Why not just buy a quality HDTV with a 120 Hz (or higher) refresh rate for your gaming monitor?

    Most 120+ Hz TVs take a 60 Hz input and pulse their backlight 2-5X per frame to reduce blur during display refreshes and to reduce perceivable flicker.
  12. Quote:
    DookieDraws:
    Serious question. Why not just buy a quality HDTV with a 120 Hz (or higher) refresh rate for your gaming monitor? Especially if you'll be gaming at 1920x1080. A neighbor has his PC hooked up to a quality HDTV, and it looks great to me. I've played Battlefield on it with no issues at all. It's pretty awesome!


    Try searching for a 120 Hz HDTV in my country and see what comes up:
    http://pricespy.co.nz/category.php?k=107
  13. Quote:
    @siman0

    "This monitor is made for gaming, and true gamers don't care a great deal about resolution"

    You mean online multiplayer gamers don't care a great deal about resolution.

    I prefer to play single-player FPS, where I do want all the eye candy, and I want to see the vegetation, desert, sky, etc. The main advantage of a PC is that it can provide better graphics; that is the whole point.

    Otherwise I could just go out, buy a 1080p TV, a Crapbox1, Crapbox360, or PlayStopper 4, and game on that thing in 1080p.

    I think monitor technology is not moving forward (in fact it's moving backwards) exactly because people are happy to buy this 1080p crap for 500 euros.


    It has absolutely nothing to do with people being happy to buy them. The reason monitors are being made at 1080p or 1440p is media. Media companies set the 16:9 standard, which is why the majority of monitors are now built at 16:9 instead of 16:10. So blame the media companies for basically forcing the monitor manufacturers into 16:9: far more TVs/monitors are sold at 16:9 than at 16:10 (this is also why there are so few TVs/monitors at 2.35, true letterbox).

    Also, you're missing why a monitor refresh above 60 Hz is useful even in single-player games. There are two benefits. One is 3D: you can play true stereoscopic 3D games (actual depth of image rather than the pseudo-depth illusion rendered in 2D) at a respectable per-eye frame rate of 60 or 72, depending on whether your monitor is 120 Hz or 144 Hz. The second, already mentioned, is smoothness when turning: at 60 Hz and 60 FPS, a full revolution in one second advances the view 6 degrees per frame, whereas at 120 Hz and 120 FPS it advances only 3 degrees per frame (see the sketch below).

    There is a very noticeable difference between playing on a 120+ Hz monitor and a 60 Hz monitor. It's not something that can really be imparted through explanation; it's something you have to see for yourself. After spending a few years gaming on a 120 Hz monitor, I can't go back to 60 Hz without getting some level of motion sickness: after the fluidity of movement on a 120 Hz monitor, I notice the jerkiness of a 60 Hz monitor.
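    To put numbers on that turning example, here is a minimal sketch; the one-revolution-per-second turn speed is just the illustrative figure from the paragraph above:

    ```python
    # Degrees the view jumps between displayed frames for a full
    # 360-degree turn completed in one second, assuming the game
    # renders one unique frame per refresh.
    TURN_SPEED_DEG_PER_SEC = 360  # illustrative: one revolution per second

    for refresh_hz in (60, 120, 144):
        arc = TURN_SPEED_DEG_PER_SEC / refresh_hz
        print(f"{refresh_hz:>3} Hz: {arc:.1f} degrees between frames")

    # 60 Hz -> 6.0, 120 Hz -> 3.0, 144 Hz -> 2.5 degrees per frame:
    # the higher the refresh rate, the smaller each visual jump.
    ```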
  14. A whopping savings of $10?

    I'm assuming this was either a typo or sarcasm.
  15. I have the little brother, the XL2420Z, and the monitor is amazing for FPS games.
  16. So what beast of a PC will run that? I reckon you'd need at least a 780 Ti or R9 290X to play games at ultra above 55 fps...
  17. Balister said it best.
  18. Great article!
  19. cool!
  20. This monitor will face a hard market: Asus just released a 27" 4K 60 Hz display for $600. Personally, I have two cards, 60 FPS is fine for me, and 4K would be fantastic!
  21. People like me buy 'gaming' monitors not just for the high refresh rate, but also for the fast response time.

    I can game on a normal monitor just fine, but at times I crave a smoother gaming experience without noticeable blur. At the end of a gaming session I just switch back to my regular monitor.
  22. Meh, another 1080p display - Who cares? /yawn
  23. Since I do photography and gaming (but not competitive FPS), I've been waiting and waiting like everyone else. Unfortunately my monitor is starting to go, and I don't know how much longer I can wait. Is there ANY hint out there that a new, better panel tech is in the works that might be out by the end of 2014?

    WARNING: Incoming run-on sentence.
    Like a lot of people, I would love to see a new or upgraded panel tech that gives me at least 27" ABOVE 1920x1200 at 120 Hz+ with 8-bit sRGB and AdobeRGB that has close colour accuracy when calibrated, even colour and light distribution, low to nil light bleed, and a response time of no more than 5 ms.

    Grats if you got through that without having to stop for a breath. :)

    SO! Anyone know if anything is on the horizon that will do something like that? So that we can start merging the needs of photographers and gamers into one?

    Thanks!
  24. 1080p TN panel? Why does this disgusting piece of shit even bother existing?

    When will 1440p/1600p IPS panels reach 120Hz in non-junk brands?
  25. Make it 30 inches and 2560x1600 at 144 Hz, LED, and thin, and I'm sold. I'm not going backwards in tech and size for a refresh-rate difference.
  26. InvalidError said:
    This seems a little weird to me: the panel uses constant current to drive its LEDs because some people claim to see flicker at the ~20 kHz PWM frequency, yet the very same display uses backlight strobing to reduce blur, and that occurs at 144-288 Hz, roughly 100X lower.

    There are two modes:
    Blur Reduction OFF -- completely PWM-free, but you have to live with motion blur
    Blur Reduction ON -- one strobe per refresh, to reduce persistence for less motion blur

    Also, detecting flicker at high frequencies, from >100 Hz up to 10 kHz, isn't done directly but via stroboscopic side effects (e.g. the wagonwheel effect and the phantom array effect). Humans can't directly see flicker at ultra-high frequencies, but they can be affected by its stroboscopic effects. There are people who actually detect kilohertz flicker via wagonwheel-type side effects (e.g. spinning wheels looking stationary), and sometimes it is the wagonwheel effect that induces the headaches, rather than the flicker itself. Even indoor scenery panning sideways can produce stepping effects (the phantom array effect) for some people, so something as simple as turning your head while walking under a 500 Hz square-wave LED light source can be instantly headache-inducing. See the scientific lighting study at http://www.lrc.rpi.edu/programs/solidstate/assist/pdf/AR-Flicker.pdf -- page 6 has the graph, including people who detected that something was flickering at 10 kHz by observing (or being bothered by) the stroboscopic/wagonwheel side effects of the light source. Most old fluorescent light ballasts (120 Hz flicker) have now been replaced by 20,000 Hz ballasts to cover the outliers, greatly reducing headaches.

    Now, back to displays. Some people get issues from flicker (seen up to ~100 Hz-ish; thresholds vary), others get issues from the stroboscopic effect (seen far beyond 100 Hz), and others get issues from motion blur (meaning strobing is the lesser evil). On Blur Busters, there are people who actually get headaches from motion blur (i.e. for them, motion blur is the worse evil). It depends on the person. Hundreds of forum postings exist, from Overclock.net to the Blur Busters Forums, where proper motion-blur-reducing strobing reduced eyestrain (while it increased it for others) -- so there are plenty of people in this niche market who are bothered far more by motion blur than by flicker.

    InvalidError said:
    DookieDraws said:
    Serious question. Why not just buy a quality HDTV with a 120 Hz (or higher) refresh rate for your gaming monitor?

    Most 120+ Hz TVs take a 60 Hz input and pulse their backlight 2-5X per frame to reduce blur during display refreshes and to reduce perceivable flicker.

    They actually do one strobe per pixel per unique refresh. Mathematically, for proper motion blur reduction, you need exactly one strobe. However, several of them 'scan' (e.g. scanning backlights), so different parts of the screen may flash at different times (a segmented scanning backlight). Regardless of how the screen is strobed, each pixel needs to be strobed once per unique frame (or per unique interpolated frame) to get effective motion blur reduction.

    For example, a 240 Hz HDTV with four scanning-backlight segments may be marketed as "Clear Motion Ratio 960", even though each segment strobes only 240 times a second (during 240 fps interpolation). That said, it aims for 1/960 sec persistence even with 240 fps material. Different displays implement strobing differently, e.g. segmented backlight scanning versus all-at-once global strobing (easier with edge-lit backlights).

    Regardless, what this means is that for 120 fps the ideal strobe rate is 120 per second, and for 240 fps it is 240 per second, since you want one strobe per frame; otherwise you get double-image effects (like 30fps@60Hz on a CRT, or 60fps@120Hz LightBoost), because during eye tracking your eyes have already tracked onwards in the time between the flashes, so the flashes land in two different places on your retina. Strobing 3 or 4 times per frame can likewise lead to triple- or quadruple-edge effects. However, it doesn't matter whether the whole screen is flashed at once or scanned sequentially top to bottom (like a CRT): display-based motion blur (persistence) is proportional to how long a pixel is illuminated to the human eye. High persistence, more motion blur. Low persistence (shorter illumination), less motion blur.
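    As a rough illustration of that persistence relationship, here is a minimal sketch built on the common approximation that perceived blur width is tracked motion speed multiplied by how long each pixel stays lit; the 1000 px/s panning speed is an assumed example figure:

    ```python
    # Approximate perceived blur for eye-tracked motion:
    # blur width (px) ~= motion speed (px/s) * persistence (s).
    def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
        return speed_px_per_s * persistence_ms / 1000.0

    speed = 1000.0  # assumed panning speed, pixels per second

    print(blur_px(speed, 1000 / 60))   # sample-and-hold 60 Hz: ~16.7 px smear
    print(blur_px(speed, 1000 / 144))  # sample-and-hold 144 Hz: ~6.9 px
    print(blur_px(speed, 2.0))         # ~2 ms strobe per refresh: ~2 px,
                                       # which is why one short strobe per
                                       # unique frame cuts blur so sharply
    ```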

    Anonymous said:
    So what beast of a PC will run that? I reckon you'd need at least a 780 Ti or R9 290X to play games at ultra above 55 fps...

    Sometimes reducing detail actually increases motion detail. Otherwise you have ultra-detailed graphics while standing still, but blurry VHS quality during fast motion. Essentially, you lower static detail to gain extra motion detail, so backing off slightly from Ultra settings can be advantageous when you want more motion detail. Though some of us run multiple Titans so we can have our cake and eat it too (for the most part) during strobed 120 Hz operation.

    Disclaimer: I am the creator of Blur Busters, of the TestUFO.com motion tests, and of the Strobe Utility mentioned on page 9 of the Tom's Hardware review.
  27. mdrejhon said:
    Also, detecting flicker at high frequencies, from >100 Hz up to 10 kHz, isn't done directly but via stroboscopic side effects (e.g. the wagonwheel effect and the phantom array effect).

    The "wagonwheel effect" only applies to a strobe lighting an object in motion, where the timing correlation between the movement and the strobe can create the illusion of slow/stopped/reversed motion, or discontinuous motion in the case of your lighting research paper. The image on an LCD, on the other hand, is fundamentally static between screen refreshes, so there is nothing to "wagonwheel" with in between.

    Your paper says everyone was very satisfied with anything over a 2 kHz strobe rate at 100% modulation, so the 20+ kHz PWM rate on LED-lit LCDs is an order of magnitude beyond what the test group effectively deemed indistinguishable from continuous light.

    The reason electronic FL/HID/sodium/etc. ballasts operate at over 20 kHz has nothing to do with "outliers"; it is mainly to avoid producing human-audible whine in HF transformers, the discharge arc, and filaments. Were it not for that, most ballasts would likely operate below 10 kHz, since that is high enough to prevent arc extinction in discharge lamps, thereby significantly reducing EMI, increasing efficiency, and improving tube lifespan by reducing electrode/filament sputtering. They cannot crank the frequency arbitrarily high due to parasitic inductance in lighting wiring and lamps -- particularly in electronic ballast conversion kits, which have to cope with a much broader range of tube types than a CFL, where the ballast is tuned specifically for the tube type it is bonded to.
  28. If it were 1440p I would buy it. Alas it is not, so I have to pass.
  29. Quote:
    Quote:
    Monster Cookie:
    When will monitor manufacturers understand that 1080p resolution is a JOKE, especially on a large 27" screen? In the early 2000s it might have been OK to have such a resolution, but nowadays it is no longer acceptable. Even for a 24" screen, the minimum resolution is 1920x1200.


    In the early 2000s, CRTs were still the standard, and 4:3 was the standard aspect ratio. There were no 1080p LCD monitors, let alone large 1080p LCD monitors, and I paid ~$1200 (NZD) for a 17" 1280x1024@60Hz (16 ms) LCD display in 2003. That's how bad it was back then.


    I paid $300 (a bargain favor) for an $800 Gateway LCD monitor in 2001, with the same resolution and timings. It was expensive compared to a CRT, but my eyes were safer :)
  30. I guess Christmas came early :)
  31. InvalidError said:
    The "wagonwheel effect" only applies to a strobe lighting an object in motion, where the timing correlation between the movement and the strobe can create the illusion of slow/stopped/reversed motion, or discontinuous motion

    John Carmack and Michael Abrash might have a word about that. Wagonwheel effects and other stroboscopic effects (relatives of the wagonwheel effect) also affect displays. I see it in operation, too, when I play games. Some people are sensitive to it, while others don't notice it or pay it any attention -- just as some people see stutters and others see tearing.

    See Michael Abrash's paper that talks of the effect even of sample-and-hold displays:
    Michael Abrash: Down The VR Rabbit Hole

    Also a very good discussion thread:
    So what refresh rate do I need?
    (Especially see the 60 Hz vs 120 Hz photo halfway down the thread, and then do the same test with your own eyes)

    It is correct that the image on an LCD is static and strobe-free between screen refreshes. However, the stroboscopic effect still exists on such displays when the eyes are not tracking the on-screen motion, because a finite number of images (positions on screen) is visually mimicking continuous motion (the infinite positions of an object in real life). As an image moves on a display, such as the frames of a spinning wagon wheel, the wheel can look stationary on the display too, unless you add artificial motion blur to it -- and adding GPU-based motion blur effects is not always desirable (unnatural extra blur forced on your eyes above and beyond natural human limitations), especially in fast-action FPS games.

    Also, recognize that the stroboscopic effect is another artifact of finite-refresh-rate displays. Wave a mouse arrow around on a black background: you see a multiple-mouse-arrow effect if you hold your gaze stationary while moving the mouse. The only ways to turn the mouse arrow into a continuous motion blur are to add a GPU motion blur effect, or to have enough refreshes for each pixel of movement (e.g. a 1000 pixel/second mouse movement at 1000 Hz would fill each refresh with a unique cursor position only one pixel away from the previous one). Stroboscopic-like effects of this type also occur on sample-and-hold displays (regardless of strobing or impulse driving), as the static refreshes paint objects on different parts of the retina with no continuous motion path between frames.

    Scientific 500 Hz and 1000 Hz displays show far less of this problem in the laboratory (e.g. ViewPixx sells a true-500Hz projector for vision research), exhibiting much less of this stroboscopic/stepping issue. If you hold your gaze stationary while motion scrolls past (e.g. the Eiffel Tower in the TestUFO Moving Photo Test), you will see a stepping effect (the stroboscopic effect) on any display, even a flicker-free LCD, as the static frames land on different parts of the retina with no continuous motion path between individual refreshes. Even 120 Hz isn't the final frontier in eliminating all human-detectable effects: researchers can tell 250 Hz from 500 Hz via the reduced stroboscopic effect on a ViewPixx true-500Hz scientific projector, and the effect is still visible even at 500 Hz (you can move the mouse arrow faster than 500 pixels per second, and lots of computer motion goes faster than that).
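    To make that stepping arithmetic concrete, here is a minimal sketch; the 1000 px/s cursor speed is the example figure from the paragraph above:

    ```python
    # Gap between successive cursor images for a stationary eye:
    # step (px) = cursor speed (px/s) / refresh rate (Hz).
    cursor_speed = 1000  # pixels per second, as in the example above

    for refresh_hz in (60, 120, 500, 1000):
        step = cursor_speed / refresh_hz
        print(f"{refresh_hz:>4} Hz: {step:5.1f} px between cursor images")

    # Only at 1000 Hz does a 1000 px/s cursor advance one pixel per
    # refresh, approaching a continuous trail instead of a phantom
    # array of discrete arrows.
    ```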

    Granted, the stroboscopic effect / phantom array effect is not exactly the same as the wagonwheel effect, but they are very similar visual phenomena, both caused by static frames with no continuous motion path between object positions.

    For displays, the stroboscopic effect is caused by the lack of intermediate motion between adjacent frames, so you don't need flashes between the frames to see it in the stationary-eye, moving-object scenario. The effect is beautifully illustrated in Michael Abrash's diagrams, and well demonstrated in certain TestUFO motion tests.

    The discrete stepping-forward of frames provides its own built-in stroboscopic effect when the eye is not in sync with the motion. The artificial invention of a "frame rate" (ever since the zoetropes and kinetoscopes of the 19th century), attempting to represent continuous motion with a series of static frames, creates a stroboscopic effect by virtue of those frames containing objects in different positions relative to the eye's stare position. People like John Carmack, Michael Abrash, good VR display developers such as Oculus, various vision researchers, and other "Blur Busters"-minded people (like me) understand the vision phenomena behind this.

    The effect is not a major hindrance, especially if you use low persistence via strobe techniques, but it exists (in stationary-eye-while-objects-move-past situations) regardless of 60 Hz or 120 Hz, and regardless of whether the display is flicker-free sample-and-hold or uses scanning/impulse-driven/strobed-backlight techniques, because a finite, discrete frame rate is being used to represent continuous motion. It is true that it is not a major problem once we hit 120 Hz+ refresh rates, and points of diminishing returns exist, but by all means, that doesn't mean the stroboscopic effect isn't detectable on displays at current common refresh rates (and far higher).
  32. Most people are still crying about the resolution. Until recently I had a 32" Sony HDTV, and that thing was a gem to watch TV on -- crisp, clear, etc., even up quite close. I think 27" HD is just on the edge of being perfectly decent in terms of resolution. I agree with the gamers: speed is king when choosing a monitor, because it DOES cost you kills. Someone asked why not use a TV as a monitor... well, because of response time, too. That's one of the factors that makes a TV inappropriate for serious gaming on a powerful PC.
  33. I don't know why people demand QHD monitors with 120-144 Hz as if it were easy to achieve 120-144 fps. Even now, the most demanding games at ultra settings barely reach 120-144 fps at 1080p. Find any review of, say, the Titan Black and see for yourself what I'm talking about -- and that's a $1k GPU! Here is proof:

    http://www.bit-tech.net/hardware/graphics/2014/02/26/nvidia-geforce-gtx-titan-black-review/3

    So I think that's the reason they release monitors like this one: so you can really get those frames.
  34. heydan said:
    I don't know why people demand QHD monitors with 120-144 Hz as if it were easy to achieve 120-144 fps. Even now, the most demanding games at ultra settings barely reach 120-144 fps at 1080p. Find any review of, say, the Titan Black and see for yourself what I'm talking about -- and that's a $1k GPU! Here is proof:

    http://www.bit-tech.net/hardware/graphics/2014/02/26/nvidia-geforce-gtx-titan-black-review/3

    So I think that's the reason they release monitors like this one: so you can really get those frames.


    I absolutely agree with you! That's my point as well. Pixel density on a 27" FHD panel is still decent enough, and on the most demanding games you finally get absolutely fluid motion from high-end cards at maxed-out settings. Even then, with some games, as soon as you switch on some of the more intensive settings, you'd still fall short of 144 fps. I say well done with this monitor: it's super fast, and I'm certain the resolution is perfectly fine for a 27" display.
  35. For a 27" monitor, 1080p looks as sharp as the human eye can resolve from about 42 inches away (see the quick calculation below). I sit about 35" from my VG278H 120 Hz 1080p monitor and can barely see any pixels, and the 120 Hz, the LightBoost mod, near non-existent input lag, and 1 ms response time make gaming a very smooth and blur-free experience.
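    That 42-inch figure lines up with a standard acuity estimate, sketched below under the assumption that 20/20 vision resolves about one arcminute:

    ```python
    import math

    # Distance at which one pixel of a 27" 1920x1080 panel subtends
    # one arcminute (roughly the resolving limit of 20/20 vision).
    diag_in, w_px, h_px = 27.0, 1920, 1080
    ppi = math.hypot(w_px, h_px) / diag_in        # ~81.6 pixels per inch
    pixel_pitch_in = 1.0 / ppi                    # ~0.0123 in per pixel
    one_arcmin_rad = math.radians(1.0 / 60.0)     # ~0.000291 rad

    distance_in = pixel_pitch_in / one_arcmin_rad
    print(f"{distance_in:.1f} inches")            # ~42.1 inches
    ```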


    2560x1440 monitors only look a tiny bit better at the same distance, but there haven't been any good gaming monitors that can match the smoothness of gaming TN panels yet. The ROG Swift will be the first one that can, as far as I know, and it isn't due out until the end of July or so.

    In any case, the point is that 1080p/1440p is perfectly good pixel quality at computer-desk viewing distances, and 120 Hz with low response times is what's important when gaming.

    You'd have to stick your face 15 inches from a 4K 27" monitor to see any real difference between 1440p and 4K, and to top it off, you increase your GPU costs and murder your frame rates at 4K at this stage of the game.

    Unless you can afford two Titan Zs and plan on playing at 4K on a 50"+ screen on an HTPC or something, 1080p/1440p is more than good enough for your needs.
  36. LOLz at you guys.
  37. Quote:
    Serious question. Why not just buy a quality HDTV with a 120 Hz (or higher) refresh rate for your gaming monitor? Especially if you'll be gaming at 1920x1080. A neighbor has his PC hooked up to a quality HDTV, and it looks great to me. I've played Battlefield on it with no issues at all. It's pretty awesome!


    It depends on your GPU... some don't have the proper connections available for a 120 Hz television, or won't output more than 1080i at a 30 Hz refresh rate over HDMI. Most gaming monitors with low millisecond response times and above-100 Hz refresh rates use DVI or DisplayPort, which televisions do not always have. So again, it would probably take a side-by-side comparison for most people to see the actual difference, but the monitor will have lower response times (1 to 5 ms) and less input lag than the TV in most cases, plus the capability of running 2 or 3 screens for gaming with PhysX.
  38. Also remember that if your PC doesn't run games at 100 fps or above, this doesn't really matter anyhow... you won't notice the difference playing an MMO at 38 fps or less; it will feel the same as a 60 Hz refresh rate. So this is mainly for FPS games and well-built gaming rigs.
  39. Quote:
    This monitor will face a hard market: Asus just released a 27" 4K 60 Hz display for $600. Personally, I have two cards, 60 FPS is fine for me, and 4K would be fantastic!


    That monitor won't work at 60 Hz without DisplayPort 1.2 -- make sure your card has it, as an HDMI cable will only run it at 30 Hz.
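    The rough bandwidth arithmetic bears this out; here is a minimal sketch (the link rates are the published video data rates for HDMI 1.4 and DisplayPort 1.2 HBR2, as I understand them):

    ```python
    # Uncompressed video data rate at 8 bits per colour channel,
    # ignoring blanking overhead (real links need somewhat more).
    def data_rate_gbps(w: int, h: int, fps: int, bpp: int = 24) -> float:
        return w * h * fps * bpp / 1e9

    print(data_rate_gbps(3840, 2160, 60))  # ~11.9 Gbit/s for 4K60
    print(data_rate_gbps(3840, 2160, 30))  # ~6.0 Gbit/s for 4K30

    # HDMI 1.4 carries ~8.16 Gbit/s of video data: enough for 4K30,
    # not 4K60. DisplayPort 1.2 (HBR2) carries ~17.28 Gbit/s, which
    # is why DP 1.2 is needed for 4K at 60 Hz.
    ```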