
ViewSonic Upgrades LG’s 1ms IPS Panel With Mouse Bungees, Headset Hook, G-Sync

ViewSonic Elite XG270QG with headset hook out.
(Image credit: Tom's Hardware)

When comparing monitor panel types, gamers seeking speed typically opt for TN. LG shook up that conversation in June, though, by introducing the first IPS gaming monitor with a 1ms response time, along with the image quality and superior viewing angles expected of IPS technology. Now, ViewSonic is taking that same groundbreaking panel and enhancing the package with thoughtful gaming perks, like a mouse bungee, plus true G-Sync rather than G-Sync Compatibility.

The 27-inch model in the LG UltraGear Nano IPS gaming monitor lineup has QHD resolution and a 144 Hz refresh rate. It's G-Sync Compatible, meaning it's been certified to run Nvidia G-Sync successfully despite lacking Nvidia's proprietary module. A ViewSonic exec told me that the gaming community is asking for true G-Sync because of features you can't get with G-Sync Compatibility, like overdrive and ultra low motion blur, plus the ability to run adaptive sync at lower refresh rates.

The ViewSonic Elite XG270QG, which I got to check out in person (without a connection to a PC), instead uses true G-Sync with the same LG panel. It's also QHD but can hit a 165 Hz refresh rate when overclocked. With an identical “IPS Nano Color” LG panel, both monitors cover 98% of the DCI-P3 color gamut. LG's Nano Color technology treats each particle with an extra phosphor layer, allowing it to absorb more light and color for greater color gamut coverage.

Two mouse bungees.
(Image credit: Tom's Hardware)


In addition to true G-Sync, the Elite XG270QG also stands out from the LG with bonus convenience features, namely two mouse bungees (one each for right- and left-handers) to keep your gaming mouse's cable from snagging during aggressive play, as well as a hook for stashing your gaming headset. It all adds up to a cleaner desk that lets you focus on what matters: gaming.

(Image credit: Tom's Hardware)

Like the LG UltraGear 27GL850-B, the Elite XG270QG comes with RGB lighting. There's an RGB hexagon on the back of the aluminum build, but the ViewSonic exec we spoke with admitted that it may not be bright enough to make a huge impact on your wall. So ViewSonic also incorporated an RGB strip at the bottom of the panel, which cast a nice glow on our office's conference room table, even with the room lights at max brightness. You can easily turn off either (or both) RGB effects with a joystick on the bottom of the monitor should those lights distract you.

(Image credit: Tom's Hardware)

In that same vein, ViewSonic told us it wanted to keep things more subtle than the aggressive, colorful and angular looks of other gaming monitors, so the Elite XG270QG could be versatile, perhaps even fitting into an office setting and appealing to older users. In fact, ViewSonic's more monotone approach will likely be the Elite line's look for the foreseeable future. The stand was also redesigned, including a hook-like handle up top, and you can run cables through its central loop.

Additionally, the lighting will be controllable via the Elite RGB Controller on-screen display (OSD) software, which will be the first app for ViewSonic’s Elite line when it debuts in November. It lets you pick colors and browse through different lighting modes.

The XG270QG will debut worldwide in November for an estimated $599.99, about $100 more than the LG UltraGear 27GL850-B with the same panel.

ViewSonic is also releasing two other Elite monitors with the Elite Display Controller, mouse bungee and headphone hook (the XG270 in November and the XG270QX in Q1 2020), plus its XG05 series (in North America in Q1 2020). You can see full specs for all the upcoming Elite monitors below.

ViewSonic Elite Monitors Specs

An Elite Ally

(Image credit: ViewSonic)

ViewSonic is also sharing details on a new touchscreen peripheral called the Elite Ally. According to the vendor, gamers will be able to use the device for easy access to OSD settings, like game modes, FreeSync or G-Sync, RGB lighting, brightness, HDR and contrast. The accessory will only work with the Elite line of monitors and will come out in November. The price wasn't disclosed.

  • Giroro
    Does anyone actually know when the release date is for LG's UltraGear 27GL850-B (https://www.bhphotovideo.com/c/product/1488890-REG)?
    In the past couple weeks B&H has pushed back availability from early October to November, and dropped Best Buy from their "Where to Buy" section. All the early reporting for this monitor said it would come this summer.... which is over.
    I'm similarly frustrated that TCL's new 55R625 series 6 TVs were also supposed to come out "this summer" and still haven't surfaced.
  • bit_user
    Giroro said:
    Does anyone actually know when the release date is for LG's UltraGear 27GL850-B (https://www.bhphotovideo.com/c/product/1488890-REG)?
    Allegedly, it's already shipping. You can find reviews of it and some people claim to have them, even though availability is spotty.
  • bit_user
    I'm disappointed the table indicates no HDR. I'd also like to know whether this monitor includes FreeSync/FreeSync2 support.

    What's attracting me to LG's monitor is the combination of high refresh rate + HDR. For me, 2560x1440, 27", and non-curved are also requirements. For a larger display, I'd go with curved, but I'd also want 4k and I don't care to spend on a graphics card that can drive it well. So, if it doesn't have quality issues (and I'm reading some people are experiencing bleed), I think LG's is the monitor for me.

    BTW, I'm skeptical of array backlighting and I don't even want a monitor so bright that it'd hurt my eyes. Two more points in favor of this panel, vs. some of the more "premium" options.
  • cryoburner
    ...and true G-Sync, rather than G-Sync Compatibility.
    I would rather have a "G-Sync Compatible" (FreeSync) display than one with "True G-Sync". With FreeSync over DisplayPort, you get a screen that should support adaptive sync with cards from all the major vendors, including Nvidia, AMD and likely Intel once they enter the graphics card market, along with some titles for the Xbox One for screens that support the feature over HDMI, and I would bet that the upcoming generation of consoles from Microsoft and Sony will both offer wider support for it. With "True G-Sync" you get what amounts to the same thing, only locked to Nvidia graphics cards and nothing else, and there's no telling who will be providing the best options for graphics hardware a few years down the line.

    bit_user said:
    What's attracting me to LG's monitor is the combination of high refresh rate + HDR...
    ...BTW, I'm skeptical of array backlighting and I don't even want a monitor so bright that it'd hurt my eyes.
    Yeah, I'm not sure I'd be all that fond of super-bright full-array backlighting on an HDR screen, at least in a desktop environment where it's filling one's field of view in what may be a relatively dimly lit room.

    However, full array backlighting would pretty much be a necessity to get an HDR experience out of a panel like this. Judging by the TechSpot/HardwareUnboxed review for the LG 27GL850, that screen's panel has a relatively weak native contrast ratio at under 800:1, placing its contrast even below most of the recent TN panels they tested, despite being IPS, which are usually a bit better...
    https://www.youtube.com/watch?v=T5Loh7vOcVM

    https://www.techspot.com/review/1908-lg-27gl850
    Even if you don't want parts of a scene in HDR content to be uncomfortably bright, one of the primary points of HDR is to offer an increased range of brightness levels, and you are unlikely to see any benefit from that without either making the brightest parts of a scene brighter, or the dimmest parts dimmer. A panel like this isn't going to provide decent HDR output using a standard backlight, since it doesn't have enough contrast on its own. A VA panel provides around 3 to 4 times the native contrast, and would almost certainly be a better starting point for HDR without array backlighting.

    The screen's backlight apparently doesn't even meet the requirements for HDR400 certification, only offering up to 350 nits of peak brightness, so while it technically may "support" an HDR input, it would be a stretch to consider the output "HDR".
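    To put those figures in perspective, here's a rough back-of-the-envelope sketch (illustrative numbers only: the ~350-nit and <800:1 figures are the ones quoted above, and 3000:1 is just a typical VA contrast ratio) of how peak brightness and native contrast translate into black level and dynamic range in stops:

```python
import math

def dynamic_range(peak_nits, contrast_ratio):
    """Return (black level in nits, dynamic range in stops) for a panel
    with the given peak brightness and native contrast ratio."""
    black_level = peak_nits / contrast_ratio
    stops = math.log2(contrast_ratio)  # each stop is a doubling of luminance
    return black_level, stops

# Illustrative values: the Nano IPS panel discussed above vs. a typical VA panel.
for name, peak, contrast in [("Nano IPS (27GL850-class)", 350, 800),
                             ("Typical VA", 350, 3000)]:
    black, stops = dynamic_range(peak, contrast)
    print(f"{name}: black ~ {black:.2f} nits, range ~ {stops:.1f} stops")
```

    Run as-is, that works out to roughly 9.6 stops for the 800:1 panel versus 11.6 for the 3000:1 one, which is the "3 to 4 times the native contrast" advantage described above expressed another way.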

    The main draw of that LG screen (and this one that apparently uses the same panel) is that they offer some of the fastest pixel response times for IPS. Again, it's probably a stretch to call it a "1ms" response time, since that apparently requires setting overdrive to a level that causes substantial inverse ghosting, but even at normal overdrive levels, they perform pixel transitions faster than other IPS displays. Of course, again, that results in below-normal contrast for an IPS screen, so there's a tradeoff there in terms of image quality.
  • bit_user
    Thanks for the wealth of info.

    cryoburner said:
    Even if you don't want parts of a scene in HDR content to be uncomfortably bright, one of the primary points of HDR is to offer an increased range of brightness levels, and you are unlikely to see any benefit from that without either making the brightest parts of a scene brighter, or the dimmest parts dimmer. A panel like this isn't going to provide decent HDR output using a standard backlight, since it doesn't have enough contrast on its own. A VA panel provides around 3 to 4 times the native contrast, and would almost certainly be a better starting point for HDR without array backlighting.

    The screen's backlight apparently doesn't even meet the requirements for HDR400 certification, only offering up to 350 nits of peak brightness, so while it technically may "support" an HDR input, it would be a stretch to consider the output "HDR".
    What I really care about is reducing banding. That's why I want a 10-bit panel, even if it's 8-bit + FRC (the artifacts of which I feel should be diminished with a higher refresh rate).
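    As a quick aside on the arithmetic there (plain illustration, not tied to any particular monitor), the extra bits matter for banding because they shrink the size of each brightness step per channel:

```python
# Levels per channel and step size as a fraction of full scale, per bit depth.
# (Ignores gamma/PQ encoding, which spreads the steps non-uniformly, but the
# relative difference between bit depths is the same.)
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    step_pct = 100.0 / (levels - 1)
    print(f"{bits:>2}-bit: {levels:>5} levels, step ~ {step_pct:.3f}% of full scale")
```

    An 8-bit + FRC panel approximates the intermediate 10-bit levels by dithering between adjacent 8-bit values over successive frames, which is why the artifacts should indeed be harder to notice at a higher refresh rate, as noted above.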

    To the extent that I care about HDR, it might be just to dabble with it from a programming perspective. And for that, as long as I can see a discernible difference on the monitor, I don't really care if it's subtle. I still don't know a whole lot about HDR, however (i.e. getting into the nitty gritty of how to actually use it).

    It seems like HDR has been coming for more than a decade - I can remember back when the PS3 rolled out firmware updates that enabled deep color (up to 16-bit per channel!) and it seemed like that + xvYCC was gonna be the next big thing. And when AMD's and Nvidia's pro cards started adding support for 10-bit and 12-bit. At my office, we even have monitors from 7-8 years ago that are 8-bit + 2-bit FRC - so old, I'm pretty sure they still have fluorescent backlights!
  • SirGalahad
    cryoburner said:
    I would rather have a "G-Sync Compatible" (FreeSync) display than one with "True G-Sync". With FreeSync over DisplayPort, you get a screen that should support adaptive sync with cards from all the major vendors, including Nvidia, AMD and likely Intel once they enter the graphics card market, along with some titles for the Xbox One for screens that support the feature over HDMI, and I would bet that the upcoming generation of consoles from Microsoft and Sony will both offer wider support for it. With "True G-Sync" you get what amounts to the same thing, only locked to Nvidia graphics cards and nothing else, and there's no telling who will be providing the best options for graphics hardware a few years down the line.

    I much prefer True Gsync as it’s far superior to freesync and freesync 2. There is a lot of misinformation out there with people thinking that freesync and gsync are the same thing. Mostly because Nvidia has done a terrible job at actually explaining what their technology does and why it’s better. Gsync compatible and freesync are mostly the same. But actual gsync is different.
    Gsync works from 1 hz to the max refresh rate of the monitor (say 144 hz or 240 hz) whereas freesync is within a window. such as 60 - 144hz. If you go over or under that it introduces tearing, stutter, and latency.
    The majority of freesync monitors are less than 75 hz. With the next step being 76 - 144 with only a small margin being over 144hz. Whereas gsync is almost all 144 - 240 hz.
    Gsync goes through a rigorous certification Process that freesync doesn’t. Although some freesync 2 monitors go through more of a process but still not to the same standard as gsync.
    a. This includes 300 plus image tests (there are a ton of freesync monitors that suffer from image artifacts and flickering and stuff. They also struggle a ton with ghosting)
    b. variable overdrive (1hz - max refresh rate) not available on freesync.
    c. factory color calibrated (you’d be shocked how bad color is for the majority of monitors and what you are missing due to that).
    d. Handles frametime variance, frame time spikes, and ghosting a lot better for way more consistent and smoother results.
    e. Has increased scanout speed (meaning on the NES there is a game that is 60.1 hz. Which gsync can do. It also decreases input lag from 16.9 ms to 6.9 ms)
    f. There is no latency increase when setup correctly.

    Yes, they both provide a tear-free and stutter-free experience. But that should be with an * as freesync isn't as consistent or smooth even under ideal conditions as gsync. You can take a gsync monitor in the poorest of conditions and it will perform better than freesync in the best of conditions. As gsync has so many more things happening behind the scenes with their chip that is talking to the graphics card.
  • cryoburner
    SirGalahad said:
    There is a lot of misinformation out there...
    Misinformation, indeed. : D

    SirGalahad said:
    Gsync works from 1 hz to the max refresh rate of the monitor (say 144 hz or 240 hz) whereas freesync is within a window. such as 60 - 144hz. If you go over or under that it introduces tearing, stutter, and latency.
    That's not exactly accurate. Both AMD's FreeSync and Nvidia's "G-Sync Compatible" implementation support Low Framerate Compensation (LFC), so long as the screen's maximum refresh frequency is at least 2.4x its minimum, which applies to nearly all 144+Hz FreeSync displays. In the event that the framerate drops below the minimum refresh rate window supported by the hardware, the refresh rate is simply doubled to stay within the supported range, effectively removing that lower limit.
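    As a rough sketch of that behavior (illustrative only; the exact multipliers and thresholds are chosen by the driver, and the 2.4x ratio above is a commonly cited rule of thumb rather than a hard spec), LFC just repeats frames at an integer multiple of the content frame rate so the panel stays inside its supported range:

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Pick a panel refresh rate for a given frame rate on a VRR display.

    If the frame rate falls below the panel's minimum VRR frequency, repeat
    each frame an integer number of times (2x, 3x, ...) so the effective
    refresh rate lands back inside [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return min(fps, vrr_max)        # in range: refresh simply tracks the frame rate
    multiple = 2
    while fps * multiple < vrr_min:     # e.g. 20 fps on a 48-144 Hz panel -> 3x = 60 Hz
        multiple += 1
    return fps * multiple

# Example: a 48-144 Hz FreeSync panel (144/48 = 3.0, comfortably above 2.4)
for fps in (144, 90, 47, 30, 20):
    print(f"{fps:>3} fps -> panel refreshes at {lfc_refresh(fps, 48, 144):.0f} Hz")
```

    With that kind of frame repetition in place, the lower bound of the VRR window stops being something you'd ever run into on the high refresh rate panels being discussed here.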

    SirGalahad said:
    The majority of freesync monitors are less than 75 hz. With the next step being 76 - 144 with only a small margin being over 144hz. Whereas gsync is almost all 144 - 240 hz.
    This is not meaningful or particularly accurate either. Sure, there are plenty of FreeSync displays with standard refresh rates available, but those are in addition to a wide selection of high refresh rate displays. There are simply far more FreeSync options available, ranging all the way from budget $100 75Hz screens up to $1000+ premium high refresh rate screens. In fact, even just looking at high refresh rate displays, there are significantly more available with FreeSync than with G-Sync. Let's see what's currently available for FreeSync and G-Sync screens in Newegg's Gaming Monitor section (limited to new hardware, in stock and sold by Newegg)...

    (True) G-Sync:
    (3) 200-240Hz models
    (8) 144-165Hz models
    (4) 100-120Hz models
    (1) 60 Hz model

    FreeSync:
    (10) 200-240Hz models
    (44) 144-165Hz models
    (5) 100-120Hz models
    (20) 60-75Hz models

    So sure, there's a bunch of standard refresh rate screens that support FreeSync (actually Newegg has more than what's shown here, since a number of them are not listed in their Gaming Monitor section), but those are in addition to a wider selection of high refresh rate displays. In fact, Newegg stocks around four times as many 100-240Hz FreeSync displays as G-Sync ones, before we even get into the lower refresh rate options. And while those low refresh rate options might not provide an ideal adaptive sync range, that's still better than the range offered by true G-Sync hardware anywhere near their price points, which is nothing at all. G-Sync displays tend to be priced well outside what the vast majority of people are willing to pay for a monitor, so they are not particularly relevant to most people.

    SirGalahad said:
    Gsync goes through a rigorous certification Process that freesync doesn’t. Although some freesync 2 monitors go through more of a process but still not to the same standard as gsync.
    Again, if we're talking "G-Sync Compatible" displays, Nvidia certifies those to meet their standards for adaptive sync and refresh rate capabilities as well. And their standards appear to primarily amount to just keeping pixel response times within a certain threshold, not having any obvious flaws, and likely paying them a decent amount for the certification process that probably takes one guy an afternoon's worth of testing. It's not a bad thing that they have a certification process, but G-Sync certification alone doesn't necessarily mean that a screen has good all-around image quality or design, just that it meets certain arbitrary standards set by Nvidia, so one should still check out detailed reviews to determine how good a monitor is.

    SirGalahad said:
    You can take a gsync monitor in the poorest of conditions and it will perform better than freesync in the best of conditions.
    This is not true. Many manufacturers produce both FreeSync and G-Sync versions of a display that use the same panels and are nearly identical outside of pricing. There are undoubtedly a number of screens that don't meet Nvidia's specifications for G-Sync at the lower end, but again, those are not priced anywhere remotely close to what a comparable G-Sync screen costs. At a similar price, a FreeSync screen should typically be very similar to its G-Sync counterpart, and one might even be able to use the money saved to move up to a "better" screen.