Asus ROG Swift PG258Q Monitor Review

Ever since the original Asus VG248QE was introduced, the race toward higher refresh rates has continued unabated. It seems like only recently that we saw the first 160Hz screens in our lab. Now Asus has broken new ground with its first 240Hz display, the ROG Swift PG258Q. But that isn’t the whole story. It also sports G-Sync from 24-240Hz, ULMB up to 144Hz, and a host of features sure to appeal to enthusiasts.

Right off the bat, some of you are saying “not another TN panel!” But stick with us here; it’s not so bad. For starters, it’s one of the nicest TN screens we’ve ever seen, with viewing angles that look far better than pretty much any other TN display we’re aware of; our photos on page five support that. And yes, it uses Frame Rate Control (FRC) to take its native 6-bit color depth up to 8 bits. But thanks to that speedy TN panel, the PG258Q also has the lowest input lag we’ve ever measured. Again, page five has the details.
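For those wondering how FRC gets 8-bit color out of a 6-bit panel, it’s essentially temporal dithering: each pixel alternates between its two nearest native levels so quickly that your eye averages them into an in-between shade, and at 240Hz that flicker is effectively invisible. Here’s a minimal Python sketch of the idea; it’s our own toy illustration, not Asus’ actual algorithm.

    # Toy illustration of temporal dithering (FRC): a 6-bit panel approximates
    # an 8-bit target by alternating between its two nearest native levels.
    # Simplified sketch only -- not how Asus actually implements it.

    def frc_sequence(target_8bit, frames=8):
        """Return the 6-bit level shown on each of `frames` refreshes."""
        low = target_8bit // 4             # nearest 6-bit level at or below target
        high = min(low + 1, 63)            # next 6-bit level up
        fraction = (target_8bit % 4) / 4   # how often the higher level must appear
        shown, error = [], 0.0
        for _ in range(frames):
            error += fraction              # accumulate the leftover shade...
            if error >= 1.0:               # ...and show the brighter level
                shown.append(high)         # whenever a whole step has built up
                error -= 1.0
            else:
                shown.append(low)
        return shown

    seq = frc_sequence(130)                # 8-bit 130 sits between 6-bit 32 and 33
    print(seq, "average =", sum(seq) / len(seq) * 4)   # ~130 when scaled back to 8-bit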

Specifications

The feature list is just what you’d expect from Asus’ premium ROG line. You get ULMB, and it works up to 144Hz. The panel is bright enough that output remains high in both G-Sync and ULMB modes. And you can take advantage of independent brightness settings to keep levels matched. Another improvement is the elimination of the refresh overclock feature. You can simply choose 240Hz in Nvidia Control Panel rather than rebooting the monitor to change rates.

Held over from other ROG screens is the GamePlus feature with its reticles, timers, FPS counter, and screen alignment guide. Asus has also retained the excellent OSD joystick navigation and styling cues reminiscent of something you’d see on the deck of a starship. It’s a premium package at a premium price, but so far it seems like you’re getting your money’s worth. Let’s take a look.

Packaging, Physical Layout & Accessories

The carton is quite substantial and more than capable of protecting its expensive contents. Rigid foam surrounds the panel and attached upright. The metal base, input panel cover, and cable bundle reside in a top tray. Those cables include HDMI, DisplayPort, and USB 3.0. The power supply is external and looks like a miniature Apple TV.

Assembly requires only that you screw the base on with a captive bolt. You can also snap lenses in place that work with the Light In Motion feature to project colored patterns onto your desk. More on that below.

Product 360

To me, the panel says “starship” while the base says “steampunk.” The same accents we first saw on the PG348Q ultra-wide are here on the PG258Q. The base has copper-colored trim, which continues around back on a large ring that surrounds the panel/upright interface. As far as we can tell, there is no VESA mount, so you’ll have to use the included hardware.

The screen is surrounded by a very thin bezel that measures just 7mm on the sides, which is great for multi-monitor setups. The top is slightly wider at 9mm, and the bottom goes to 13mm. No lighting is visible on the front except for a tiny power LED at the lower right edge. It glows red for G-Sync, green for ULMB, and white for Normal mode.

Control keys are around back, topped by a slick joystick that makes OSD navigation a snap. Our only nit-pick is that the power button is the same size and shape as the others and isn’t separated from them, so it’s too easy to accidentally turn off the monitor when you’d rather pop up the GameVisual menu. It does have a different texture, though, so if you’re careful, you’ll soon get the hang of it.

Like other ROG Swift monitors, the PG258Q has a Light In Motion effect. It takes the form of a small LED that projects a colored symbol on your desktop. It has three intensity levels, or you can turn it off.

Styling dictates a somewhat thick panel with a rounded taper across the back. There are no speakers, nor are there side USB ports. One upstream and two downstream USB 3.0 ports reside on the bottom-facing input panel, along with one HDMI input, one DisplayPort input, and a 3.5mm headphone output. Once you’ve plugged in all the cables, an included cover snaps on to hide the jacks. Cable management is aided by a hole in the upright.


Comments from the forums
  • apertotes
    I just don't get it. How can you leave the contrast (arguably the most important feature on a screen after the resolution) out of the first page of the article? I couldn't care less about the bezel width, but please, state the contrast!
  • cknobman
    G-Sync, no thanks.

    No wonder this thing is $600.

    Have fun paying the Nvidia tax.
  • ahnilated
    1080P, *sigh* who wants this anymore! Get me a 4K monitor at 30-32" with good specs and G-Sync that isn't $4000.
  • dstarr3
    Anonymous said:
    1080P, *sigh* who wants this anymore! Get me a 4K monitor at 30-32" with good specs and G-Sync that isn't $4000.


    Because that's the reality of GPUs at the moment. GPUs can get you a solid 4K/60 or a solid 1080p/144. You're not going to get 4K/144 in any modern games on any PC at the moment (unless you're only playing 20-year-old games). So there's not a lot of sense in wanting to invest in a 4K/144 monitor now, only in anticipation of when GPUs can finally push that many pixels, because you'll be wasting the monitor while you wait, and when such GPUs finally do arrive, the monitors will be better and cheaper.
  • Rosanjin
    Do we know if this monitor is 3dVision capable?

    I would assume so, but I've learned some very expensive lessons by making purchases based on assumptions. : /
  • __Isomorph__
    @apertotes: LOL damn right! who cares about the bezel??!
  • ahnilated
    Anonymous said:
    Anonymous said:
    1080P, *sigh* who wants this anymore! Get me a 4K monitor at 30-32" with good specs and G-Sync that isn't $4000.


    Because that's the reality of GPUs at the moment. GPUs can get you a solid 4K/60 or a solid 1080p/144. You're not going to get 4K/144 in any modern games on any PC at the moment (unless you're only playing 20-year-old games). So there's not a lot of sense in wanting to invest in a 4K/144 monitor now, only in anticipation of when GPUs can finally push that many pixels, because you'll be wasting the monitor while you wait, and when such GPUs finally do arrive, the monitors will be better and cheaper.


    The reason you won't get it is because the GPUs won't do it, not because the games won't. Game developers want to make more realistic games, but the GPUs are lagging way behind. Nvidia hasn't had any real competition for many years, so there was no need for them to push to 4K gaming at 144Hz or higher. I am hoping AMD's cards will force Nvidia to get off their butts, as it seems the consumers aren't going to pull their money from Nvidia until Nvidia gets back on the ball.
  • Geo Matrix
    I agree with AHNILATED! DSTARR3 says, "Because that's the reality of GPUs at the moment". I say let's have some serious change! Asus and Nvidia are "milking the cow" with these old relics. Everything is now going 4K, 6K and 8K. It's time to stop milking the cow and people's wallets and put out the new technology. It's 2017, not 1980. We all know the new tech is already here.
  • dstarr3
    Based on the performance bump we saw from the 1080 Ti, I don't think it's fair to say that nVidia's slouching when it comes to GPU performance. Pricing, sure, they could use more competition. But something like 4K/144 is a seriously enormous amount of processing to do. The DisplayPort and HDMI interfaces themselves had to be updated to transfer that much data. I'm amazed we got 4K/60 out of GPUs as quickly as we did. Give it another generation and we should be hovering around 4K/144. But you ask why there's no 4K/144 gaming monitors coming out yet, and this is why. There aren't any 4K/144 GPUs out yet, either. And it's not because any particular company is stagnating. It's because pushing that many pixels to a monitor is a huge, huge task.
  • Deadshot-89
    I use an Eizo EV2336W. It has incredibly accurate, deep, and vivid colors right out of the box. It also has a very deep picture and, for an IPS screen, very deep blacks. And to top it all off, it has extremely nice viewing angles, no color shifts, and it retains a ton of brightness at very steep angles. Motion performance is OK for a 60Hz screen.

    My current hardware isn't really capable of producing more than a reliable 1080p60. (GTX 970, i5-4590). So I see no reason to switch to a higher res screen or higher refresh rate screen.
  • Kridian
    Anonymous said:
    @apertotes: who cares about the bezel??!
    Right? And who wants more lights shooting out the bottom to distract from the screen? Lame.
  • dstarr3
    Don't get me started on that tacky, tacky base. I HATE that base, so, so much.
  • Stone Cold
    A $600 TN, 1080p panel gets a recommended... what the actual hell.
  • dstarr3
    Anonymous said:
    A $600 TN, 1080p panel gets a recommended... what the actual hell.


    TFT Central's got their review up and they love it, too. TN panel, yes, but apparently just a really damn good TN panel.
  • ahnilated
    Anonymous said:
    Based on the performance bump we saw from the 1080 Ti, I don't think it's fair to say that nVidia's slouching when it comes to GPU performance. Pricing, sure, they could use more competition. But something like 4K/144 is a seriously enormous amount of processing to do. The DisplayPort and HDMI interfaces themselves had to be updated to transfer that much data. I'm amazed we got 4K/60 out of GPUs as quickly as we did. Give it another generation and we should be hovering around 4K/144. But you ask why there's no 4K/144 gaming monitors coming out yet, and this is why. There aren't any 4K/144 GPUs out yet, either. And it's not because any particular company is stagnating. It's because pushing that many pixels to a monitor is a huge, huge task.


    I understand it is a huge task, but we have been getting low-quality upgrades from one GPU generation to the next. It is only because AMD came back into the market and scared Nvidia that we recently got a good bump with the 10XX series. If you don't get at least a 30% improvement, it is a waste. Game developers have been waiting for years for GPUs that can push the stuff they want to put out.
  • dstarr3
    Anonymous said:
    Anonymous said:
    Based on the performance bump we saw from the 1080 Ti, I don't think it's fair to say that nVidia's slouching when it comes to GPU performance. Pricing, sure, they could use more competition. But something like 4K/144 is a seriously enormous amount of processing to do. The DisplayPort and HDMI interfaces themselves had to be updated to transfer that much data. I'm amazed we got 4K/60 out of GPUs as quickly as we did. Give it another generation and we should be hovering around 4K/144. But you ask why there's no 4K/144 gaming monitors coming out yet, and this is why. There aren't any 4K/144 GPUs out yet, either. And it's not because any particular company is stagnating. It's because pushing that many pixels to a monitor is a huge, huge task.


    I understand it is a huge task but we have been sitting on getting low quality upgrades in GPU versions. It is only recently because AMD came back into the market and scared Nvidia that we got a good bump in the 10XX series. If you don't get at least a 30% improvement it is a waste. Game developers have been waiting for years for GPU's that can push the stuff they want to put out.


    That's not a new problem at all. It's a problem that's probably as old as computer gaming itself. I remember back in the days of Doom development in 1993, they were lamenting how underpowered PCs were and how hard they had to work to get Doom to play well on reasonable machines. The problem is not recent and has nothing to do with nVidia. There's just always some developers out there that're pushing the limits, and there will never be enough computing power for them. And that's fine, that's how we make progress. But to blame technology for not keeping up with demands is a bit unfair, because there will always be a demand for more than what's possible at any given point.

    Also, "if you don't get at least a 30% improvement it is a waste" is rubbish. It's the tick-tock cycle. One generation improves compute power, the next generation improves efficiency. Those efficiency-boosting generations are not even slightly wasteful just because they didn't produce "a 30% improvement" in performance.
  • Joe Black
    Ehm... GSync... Bane of my PC planning existence at the moment. You've never seen a panda as sad as this one that wants to upgrade to a really good monitor now and doesn't like proprietary shit. Why did I buy that NVidia card? Why!?

    I'm jumping ship back to AMD when Vega and Freesync 2 come. Can't stand this walled garden with all its little greedware shrubs anymore. They can keep their 5-10 extra fps.

    And if AMD alone ends up supporting Freesync then fine - at least it won't be just because some other company decided not to support an open standard.
  • caustin582
    I agree that 4k is probably impractical for most gamers, but at this price point I would at least expect 1440p. I just can't justify dropping $600 on a 1080p display in 2017, even if it is a quality product.
  • photonboy
    SIGH... such strange comments. Oh, why can't it be 4K? I just want an inexpensive, high refresh, high quality, high res monitor...

    *This monitor fits a NICHE. A very budget friendly (for the specs) monitor aimed at people who play fast SHOOTERS who care more about response times, minimum blur, and game smoothness.

    Playing CSGO at 200FPS with GSYNC enabled would be incredibly smooth. Just amazing, to the point where you would pull your HAIR out going back to a 4K, 60Hz monitor without GSYNC.

    (As for the "walled garden" with GSYNC, I've heard rumors that FREESYNC HDR (aka Freesync 2) may be licensed, and thus closer to GSYNC in that respect. They WILL need to do a lot of work to get HDR working properly... in fact, it's possible, even LIKELY, that the GSYNC module will make it much easier, so prices may end up being similar for GSYNC 2 vs FREESYNC 2, because NVidia charges more for the module (which will get cheaper with mass production) but can save money developing the monitor since they've done a lot of the work already.)

    I have a GSYNC GPU, but when I heard that the XBOX Scorpio is going to support 4K Freesync HDR I was a little annoyed. That's great news, but if I want the best experience I would need both GSYNC HDR and Freesync HDR. One for my PC, and one for the Scorpio. SIGH. Please just make it universal.
  • chumly
    Nobody wants a 1080p TN panel anymore. Especially not for what these crazy people are asking for it. I'm not really seeing any positive comments from people either.

    The resolution sucks. The color quality sucks. $600? You're going to offend people.

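A footnote on the 4K/144 bandwidth point raised in the thread above: the arithmetic really is the bottleneck. The Python sketch below works out the raw pixel data rates involved; it assumes 8 bits per channel RGB and ignores blanking overhead, so real-world figures run a little higher.

    # Rough pixel-data-rate math behind the 4K/144 vs. 1080p/240 discussion.
    # Assumes 8 bits per channel RGB (24 bits per pixel); blanking is ignored.

    def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
        """Raw pixel data rate in gigabits per second."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    fhd_240 = pixel_rate_gbps(1920, 1080, 240)  # ~11.9 Gbit/s -- this monitor
    uhd_144 = pixel_rate_gbps(3840, 2160, 144)  # ~28.7 Gbit/s -- hypothetical 4K/144

    # DisplayPort 1.4 carries roughly 25.9 Gbit/s of payload (32.4 Gbit/s raw,
    # less 8b/10b encoding overhead), so uncompressed 4K/144 doesn't fit without
    # compression or chroma subsampling, while 1080p/240 fits with room to spare.
    print(f"1080p @ 240Hz: {fhd_240:.1f} Gbit/s")
    print(f"4K    @ 144Hz: {uhd_144:.1f} Gbit/s")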