Gigabyte M27QP QHD 170 Hz Gaming Monitor Review: Blistering Performance and Lots of Color

Gigabyte’s M27QP is a 27-inch QHD/IPS gaming monitor with a 170 Hz refresh rate, Adaptive-Sync, HDR and wide-gamut color.

Gigabyte M27QP
(Image: © Tom's Hardware)


I often talk about balanced performance, and the QHD/165 Hz category is still the best place to find it. Ultra HD at 144 Hz is great, but you’ll need an expensive video card to make that happen, and it still won’t be as smooth as a good QHD/165 monitor. Not only will you save money on graphics hardware, but the displays themselves are 30 to 50% less expensive than comparable Ultra HD screens.

This means the category is well-stocked with competent screens, so comparisons come down to the minutiae. In my experience, overdrive quality and video processing are the deciding factors. If you can also have accurate color that covers a wide gamut, so much the better.

(Image credit: Gigabyte)

The Gigabyte M27QP checks almost every box. It has one of the largest gamuts in the category, covering nearly 95% of DCI-P3. It has very accurate color right out of the box, so no calibration is necessary to maximize image fidelity. And it has plenty of light output, 400 nits, to produce bright HDR. The only downside is black levels, which are a bit higher than the competition’s. This means contrast is around 800:1 instead of the preferred 1,000:1.
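To put those contrast numbers in perspective: static contrast is simply peak white divided by black level, so the figures above imply the black levels sketched below. This is a minimal illustration using the review’s numbers; the function name is my own, and real measured black levels vary with panel sample and brightness setting.

```python
# Static contrast is defined as peak white luminance divided by black
# level, so a known peak and contrast ratio imply the black level.

def implied_black_level(peak_nits: float, contrast_ratio: float) -> float:
    """Return the implied black level in nits for a given peak white and contrast."""
    return peak_nits / contrast_ratio

# M27QP: ~400 nits peak at roughly 800:1 contrast
m27qp_black = implied_black_level(400, 800)        # 0.50 nits

# A more typical IPS competitor: same peak at 1,000:1
typical_black = implied_black_level(400, 1000)     # 0.40 nits

print(f"M27QP implied black level:   {m27qp_black:.2f} nits")
print(f"Typical IPS implied black:   {typical_black:.2f} nits")
```

In other words, at the same 400-nit peak the M27QP’s blacks sit around 0.5 nits where a 1,000:1 panel would sit around 0.4 nits, a visible but modest difference in a dark room.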

The M27QP’s video processing definitely impresses. You can run its excellent overdrive, or engage Aim Stabilizer Sync (a.k.a. backlight strobing, ULMB, blur reduction) alongside Adaptive-Sync. It’s one of the few monitors that can do both at once; most require you to choose one or the other. And input lag is very low, almost as low as a 240 Hz monitor’s, which will attract competitive players.

When it comes to versatility and usability, a 27-inch QHD monitor already has an advantage. The Gigabyte M27QP increases that advantage with excellent gaming performance and plenty of rich, vibrant color. With competitive pricing, it’s definitely worth checking out and is highly recommended.


Christian Eberle
Contributing Editor

Christian Eberle is a Contributing Editor for Tom's Hardware US. He's a veteran reviewer of A/V equipment, specializing in monitors. Christian began his obsession with tech when he built his first PC in 1991, a 286 running DOS 3.0 at a blazing 12MHz. In 2006, he undertook training from the Imaging Science Foundation in video calibration and testing and thus started a passion for precise imaging that persists to this day. He is also a professional musician with a degree from the New England Conservatory as a classical bassoonist which he used to good effect as a performer with the West Point Army Band from 1987 to 2013. He enjoys watching movies and listening to high-end audio in his custom-built home theater and can be seen riding trails near his home on a race-ready ICE VTX recumbent trike. Christian enjoys the endless summer in Florida where he lives with his wife and Chihuahua and plays with orchestras around the state.

  • truerock
    In 2012 I built a PC with an nVidia GeForce GTX 690 video card.
    USB 3.0, PCIe 3.0, SATA III, DDR3 memory, 120GB SSD, etc.
    It runs a 27" monitor at 1080p, 8-bits, 60Hz.

    I can't believe PCs have advanced so little in 10 and 1/2 years.

    I guess when PCs have moved up to USB 4 and PCIe 4, DDR 4... I'll be ready to upgrade.
    If it will run a 4k, 10-bits, 120Hz monitor (80 Gb/s).
    I guess that will be in 2024? 2025? I hope I don't have to wait until 2026.
    Reply
  • SyCoREAPER
    truerock said:
    In 2012 I built a PC with an nVidia GeForce GTX 690 video card.
    USB 3.0, PCIe 3.0, SATA III, DDR3 memory, 120GB SSD, etc.
    It runs a 27" monitor at 1080p, 8-bits, 60Hz.

    I can't believe PCs have advanced so little in 10 and 1/2 years.

    I guess when PCs have moved up to USB 4 and PCIe 4, DDR 4... I'll be ready to upgrade.
    If it will run a 4k, 10-bits, 120Hz monitor (80 Gb/s).
    I guess that will be in 2024? 2025? I hope I don't have to wait until 2026.

    What on earth are you rambling about? None of what you said has any relevance to this monitor.

    Can I call you a cab to take you home?
    Reply
  • truerock
    sycoreaper said:
    What on earth are you rambling about? None of what you said has any relevance to this monitor.

    Can I call you a cab to take you home?

    Fair point... I guess it was off topic.
    My "ramble" was trying to say, "170Hz QHD... so what". I was trying to put that in context.
    Reply
  • SyCoREAPER
    truerock said:
    Fair point... I guess it was off topic.
    My "ramble" was trying to say, "170Hz QHD... so what". I was trying to put that in context.

    Makes more sense.

    The 170hz is the big deal. It wasn't until fairly recent that monitors moved above 144hz which was a big deal. IIRC there are monitors that go above that but you are getting into build multiple computers for the price territory.

    As for resolutions, 4K isn't really that prevalent, at least not with high refresh rate and affordable prices mainly because most cards until this Gen simply could barely get triple digits at 1440 in AAA titles.
    Reply
  • truerock
    sycoreaper said:
    Makes more sense.

    The 170hz is the big deal. It wasn't until fairly recent that monitors moved above 144hz which was a big deal. IIRC there are monitors that go above that but you are getting into build multiple computers for the price territory.

    As for resolutions, 4K isn't really that prevalent, at least not with high refresh rate and affordable prices mainly because most cards until this Gen simply could barely get triple digits at 1440 in AAA titles.

    I absolutely agree with you.

    I'm expressing an emotional impatience with how slowly PC technology has advanced over the last 10 years.
    I'm kind of the opposite of a lot of people who want future technology to support old technology standards.
    I think Apple is good at getting rid of the old and moving more quickly to new technology.
    I occasionally will attach a 4k Samsung TV to my 10-year-old PC just to get a feel of the experience. It is very cool. Unfortunately, on my 10-year-old PC 4k-video runs at 30Hz, 8-bit.

    Oh... I just rambled aimlessly again... my bad.
    Reply
  • SyCoREAPER
    truerock said:
    I absolutely agree with you.

    I'm expressing an emotional impatience with how slowly PC technology has advanced over the last 10 years.
    I'm kind of the opposite of a lot of people who want future technology to support old technology standards.
    I think Apple is good at getting rid of the old and moving more quickly to new technology.
    I occasionally will attach a 4k Samsung TV to my 10-year-old PC just to get a feel of the experience. It is very cool. Unfortunately, on my 10-year-old PC 4k-video runs at 30Hz, 8-bit.

    Oh... I just rambled aimlessly again... my bad.

    Moore's Law is dead, has been for a while unfortunately.
    Reply
  • Wimpers
    sycoreaper said:
    Moore's Law is dead, has been for a while unfortunately.

    What did you expect? We can't keep cramming more and more transistors on the same space and/or ramp up the frequency, there actually are physical constraints to about everything.
    We haven't reached them yet when it comes to storage capacity and perhaps memory and network or bus speeds but for a lot of other things only parallelisation is an option but this is not possible everywhere and sometime requires a redesign and adds some overhead.
    Reply
  • SyCoREAPER
    Wimpers said:
    What did you expect? We can't keep cramming more and more transistors on the same space and/or ramp up the frequency, there actually are physical constraints to about everything.
    We haven't reached them yet when it comes to storage capacity and perhaps memory and network or bus speeds but for a lot of other things only parallelisation is an option but this is not possible everywhere and sometime requires a redesign and adds some overhead.

    Congrats?

    I know that, I was explaining to OP why he feels that PC components haven't evolved further than they thought by now.
    Reply