ASRock preps a superfast monitor for competitive gamers — PG27FFX2A boasts a 27-inch, 1080p display with a 520 Hz refresh rate

ASRock's 520 Hz LCD
(Image credit: VideoCardz/ASRock)

ASRock is a relatively new entrant to the display world, but the company is ambitious enough to demonstrate something that can fairly be called the King of Speed: a Phantom Gaming monitor with a 520 Hz refresh rate. The LCD joins several other Phantom Gaming displays the company plans to showcase at CES next week.

The Phantom Gaming monitor in question is called the PG27FFX2A, and we are likely looking at a TN-based 27-inch LCD with a 1920x1080 (1080p) resolution and a whopping 520 Hz refresh rate, a set of characteristics that could earn it a place among the best gaming monitors for professional e-sports players. ASRock has yet to disclose the rest of the specifications, but we expect the monitor to support VESA Adaptive-Sync variable refresh rate technology; whether it will also carry an AMD FreeSync or Nvidia G-Sync badge on top is unknown. We will refrain from discussing how such a display is made (but you can read about a 600 Hz display in another story).

When it comes to connectivity, ASRock remains mum for now. It is an interesting point to consider, though, as even a 1080p video stream at a 520 Hz refresh rate requires a lot of bandwidth if left uncompressed. At a color depth of eight bits per channel (8 bpc), one would need a DisplayPort link with UHBR 10 support (38.68 Gbit/s) or HDMI 2.1 over a 48G cable using the FRL5 40G transmission mode. At ten bpc, we are looking at graphics processors that support DisplayPort with UHBR 13.5, as the HDMI 2.1 specification tops out at 505 Hz for this color depth even with a 48G cable and the FRL6 transmission mode.
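For a rough sense of the numbers, here is a minimal back-of-the-envelope sketch (in Python) of the uncompressed data rate such a stream needs. The ~35% blanking overhead is an illustrative assumption standing in for real display timings, and the link figures are approximate usable payload rates after line encoding; none of this comes from ASRock.

```python
# Back-of-the-envelope bandwidth check for 1920x1080 @ 520 Hz, uncompressed RGB.
# BLANKING_OVERHEAD is an assumed factor for horizontal/vertical blanking
# (reduced-blanking timings at this refresh rate land roughly in this range);
# the link rates are approximate usable payloads after line encoding.

WIDTH, HEIGHT, REFRESH_HZ = 1920, 1080, 520
BLANKING_OVERHEAD = 1.35  # assumption: ~35% of the pixel clock spent on blanking

def required_gbps(bits_per_channel: int) -> float:
    """Uncompressed video data rate in Gbit/s for RGB at the given color depth."""
    bits_per_pixel = 3 * bits_per_channel
    pixel_rate = WIDTH * HEIGHT * REFRESH_HZ * BLANKING_OVERHEAD
    return pixel_rate * bits_per_pixel / 1e9

# Approximate usable payload per link type (Gbit/s)
LINKS = {
    "DisplayPort 1.4 HBR3 (8b/10b)":         25.92,
    "HDMI 2.1 FRL5 '40G' (16b/18b)":         35.56,
    "DisplayPort 2.1 UHBR 10 (128b/132b)":   38.68,
    "HDMI 2.1 FRL6 '48G' (16b/18b)":         42.67,
    "DisplayPort 2.1 UHBR 13.5 (128b/132b)": 52.22,
}

for bpc in (8, 10):
    need = required_gbps(bpc)
    print(f"{bpc} bpc: ~{need:.1f} Gbit/s needed uncompressed")
    for name, payload in LINKS.items():
        verdict = "fits" if payload >= need else "too slow without DSC"
        print(f"  {name:38s} {payload:5.2f} Gbit/s -> {verdict}")
```

Under these assumptions the stream works out to roughly 35 Gbit/s at 8 bpc and 44 Gbit/s at 10 bpc, which lines up with the interface requirements outlined above.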

AMD's high-end Radeon RX 7000-series graphics cards support DisplayPort 2.1 with UHBR 13.5 (offering 54 Gbit/s of raw bandwidth) and HDMI 2.1 with up to 48 Gbit/s of throughput, so they can handle such displays without problems at eight bpc. By contrast, Nvidia's GeForce RTX 30 and RTX 40-series only support DisplayPort 1.4 (which rules out compatibility) and HDMI 2.1 48G with DSC 1.2a, which should technically be sufficient for both eight and ten bpc. Still, we recommend consulting ASRock's compatibility table before making a purchase decision. Furthermore, since most GeForce RTX 40-series graphics boards have only one HDMI output, driving two of ASRock's Phantom Gaming PG27FFX2A displays from a single PC with these cards will be impossible.
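To illustrate the DSC point with a quick sketch: DSC is commonly described as achieving roughly 3:1 compression while remaining visually lossless, and that ratio (an assumption here, not a figure from ASRock or Nvidia) comfortably brings both color depths under the HDMI 2.1 FRL6 payload.

```python
# Rough illustration of why HDMI 2.1 48G plus DSC should cope at either color depth.
# The uncompressed rates reuse the sketch above; the 3:1 ratio is the commonly
# cited "visually lossless" DSC figure and is an assumption, not a spec guarantee.

UNCOMPRESSED_GBPS = {8: 34.9, 10: 43.7}  # 1080p @ 520 Hz, incl. assumed blanking
HDMI_FRL6_PAYLOAD_GBPS = 42.67           # 48 Gbit/s raw, 16b/18b encoding
DSC_RATIO = 3.0                          # assumed compression ratio

for bpc, rate in UNCOMPRESSED_GBPS.items():
    compressed = rate / DSC_RATIO
    fits = compressed <= HDMI_FRL6_PAYLOAD_GBPS
    print(f"{bpc} bpc: ~{compressed:.1f} Gbit/s after DSC -> "
          f"{'fits' if fits else 'does not fit'} HDMI 2.1 FRL6")
```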

The Phantom Gaming PG27FFX2A monitor is just one of the displays that ASRock plans to demonstrate at CES. The lineup also includes OLED models, curved monitors, and moderately sized gaming IPS displays with a 180 Hz refresh rate and, presumably, a reasonable price.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and the latest fab tools to high-tech industry trends.

  • bit_user
    I seem to recall Nvidia having a tech demo of a display that ran at like 8 kHz. I have no idea what resolution it was or how it was driven, but this was like 8 years ago. My former boss saw it on a VIP tour of Nvidia HQ back then, and said that monitor was like looking through a window.

    Perhaps this is something a little different, but wow...
    View: https://www.youtube.com/watch?v=nUXZwzH-114
    Reply
  • Notton
    I am a firm believer that the human eye can see the difference between 60Hz and 120Hz. That said, I don't think the human eye can see the difference between 240Hz and 520Hz.
    Reply
  • bit_user
    Notton said:
    I am a firm believer that the human eye can see the difference between 60Hz and 120Hz. That said, I don't think the human eye can see the difference between 240Hz and 520Hz.
    I think it mostly has to do with motion fidelity, IMO. With eye-tracking + accurate motion blur, we could probably get by with lower framerates.

    In this case, I think the main point of 240 vs. 520 Hz is probably more about latency than visual fidelity. Sure, it's only about 2 ms you're shaving off, but if you're a competitive gamer facing off against others with similar reaction times, maybe 2 ms is enough to confer a meaningful advantage?
    Reply
  • George³
    bit_user said:
    I think it mostly has to do with motion fidelity, IMO. With eye-tracking + accurate motion blur, we could probably get by with lower framerates.

    In this case, I think the main point of 240 vs. 520 Hz is probably more about latency than visual fidelity. Sure, it's only about 2 ms you're shaving off, but if you're a competitive gamer facing off against others with similar reaction times, maybe 2 ms is enough to confer a meaningful advantage?
    ...In track and field sprints, the sport's governing body, the IAAF, has a rule that if the athlete moves within 0.1 seconds after the gun has fired the athlete has false-started. This figure is based on tests that show the human brain cannot hear and process the information from the start sound in under 0.10 seconds,
    From Wikipedia. Add the time needed to send the signal from the eyes to the brain and, once the brain has reacted, from the brain to the finger muscles.
    Reply
  • DavidLejdar
    George³ said:
    From Wikipedia. Add the time needed to send the signal from the eyes to the brain and, once the brain has reacted, from the brain to the finger muscles.
    Yes, but bit_user's point still stands. In the example of 2 ms, the signal from the outside world reaches the eye 2 ms earlier, consistently.

    In particular, the issue is about precision and accuracy. When a target is only something like 20x10 pixels, and moving, one wants to see where the target actually is right now rather than with a delay (with which the shown position may be off by a few pixels compared to the real position). Even just 2 ms can mean a noticeable difference in pixels, depending on the game: taking a peek may be a movement at a speed of "10,000 pixels per second", and 2 ms would then, simplified, mean a 20-pixel difference. There are still other factors to compensate for, of course.
    Reply
  • bit_user
    George³ said:
    From Wikipedia. Add the time needed to send the signal from the eyes to the brain and, once the brain has reacted, from the brain to the finger muscles.
    Let's say all e-sports competitors have a reaction time around 100 ms. That's the amount of time between when photons hit their eyeballs and when they can twitch an arm or finger muscle to act on it. Next, let's say each frame takes about 4 ms to render and up to another 4 ms to get displayed (due to the 240 Hz refresh rate). That's an end-to-end latency of 108 ms.

    Now, if one competitor is able to shave more than 2 ms off of that end-to-end latency and react within 106 ms, maybe that's enough to gain a slight advantage. At the highest levels of competition, sometimes a slight advantage can make the difference between winning and losing. Not for the average gamer, but competitors at the elite tier are all pushing the limits of human capability and I'm sure will take any advantage they can get.
    Reply