RTX 4090 Can Run 'Genshin Impact' at 13K Resolution

RTX 4090 Running Genshin Impact at 13K
(Image credit: YouTube - Golden Reviewer)

Nvidia's RTX 4090 is an impressive GPU and one of the best graphics cards out there; it can push triple-digit frame rates at 4K without breaking a sweat. But YouTuber Golden Reviewer has taken things to another level by playing Genshin Impact at 13K resolution. Incredibly, the game managed a playable 30 frames per second at this extreme resolution.

If you're unfamiliar with the game, Genshin Impact is a hugely popular open-world action RPG with over 60 million monthly active players. You've probably heard of it as a mobile game that is difficult to run on most smartphones, but it is also available on PC, PlayStation 5 and PS4 (in addition to Android and iOS).

Because the game also has to run on mobile devices, it isn't especially demanding on PC, despite its reputation in the mobile space. The minimum requirement is an Nvidia GT 1030, and the recommended GPU is a GTX 1060.

In the video, we can see Golden Reviewer running the game on the high and highest graphical presets at a resolution of 13760 x 5760, which works out to roughly 79.26 megapixels per frame (hence "13K"). For reference, that is about 2.4 times the pixel count of 8K, and nearly 9.6 times that of 4K.
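
For the curious, the pixel math is easy to verify yourself; here is a quick sketch in plain Python (nothing game-specific, just the resolutions mentioned above):

```python
# Pixel-count arithmetic for the resolutions discussed in the article.
def megapixels(width, height):
    """Return the pixel count of a frame in megapixels."""
    return width * height / 1e6

mp_13k = megapixels(13760, 5760)  # the "13K" resolution from the video
mp_8k = megapixels(7680, 4320)    # standard 8K UHD
mp_4k = megapixels(3840, 2160)    # standard 4K UHD

print(f"13K: {mp_13k:.2f} MP")          # ~79.26 MP
print(f"vs 8K: {mp_13k / mp_8k:.1f}x")  # ~2.4x the pixels of 8K
print(f"vs 4K: {mp_13k / mp_4k:.1f}x")  # ~9.6x the pixels of 4K
```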

With the RTX 4090, the game is surprisingly playable at this daunting resolution, with an average frame rate of 30 fps. It isn't a perfectly locked 30, but it's still impressive considering the resolution. For reference, the game is rendered internally at 13K and then downsampled for display - there are no 13K monitors on the market that we know of today.

At 13K, aliasing is effectively eliminated from the image, with every edge looking clean and smooth. Even in zoomed-in captures of the game session, pixelation is nowhere to be seen.

This is one of the great advantages of running upscaled resolutions, especially ones as extreme as 13K. Image detail is crisp and sharp, lacking the jagged edges you might see at native or sub-native resolutions without anti-aliasing techniques. In fact, at 13K you could probably disable AA completely and still not notice any jagged edges whatsoever.
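
To illustrate why rendering above native resolution smooths edges, here is a minimal toy sketch of the idea behind supersampling: render at a multiple of the target resolution, then average each block of samples down to one display pixel. (This is a simplified box filter on a hypothetical single-channel image; real games do this on the GPU with more sophisticated filters.)

```python
# Toy supersampling sketch: box-filter a high-res render down to display size.
def downsample(image, factor):
    """Average each factor x factor block of a high-res image into one pixel."""
    h, w = len(image) // factor, len(image[0]) // factor
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            block = [image[y * factor + dy][x * factor + dx]
                     for dy in range(factor) for dx in range(factor)]
            out[y][x] = sum(block) / len(block)  # average the sample block
    return out

# A hard black/white edge rendered at 2x the target resolution: downsampling
# produces gray intermediate values, i.e. smoothed (anti-aliased) edges.
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 1]]
print(downsample(hi_res, 2))  # [[0.0, 1.0], [0.5, 1.0]]
```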

But it is easy to tell that this game wasn't designed to run at resolutions anywhere near 13K. Texture resolutions are nowhere close to 13K, resulting in blurriness all around. The overall lighting presentation is also sub-par compared to what you might see in a modern AAA title designed for PC. But that is to be expected from a game built to scale down to smartphones.

Still, it's amazing to see a game playable at 13K at all, and it's a testament to the sheer capabilities of Nvidia's RTX 4090. 

If you have an RTX 4090 and want to experiment with upscaling yourself, plenty of games have built-in resolution scaling for you to play with - and you don't have to go all the way to 13K to get really crisp graphics; generally, anything up to around twice your native resolution is good enough. If you want to lessen the performance impact of upscaling, try Nvidia's DLDSR (Deep Learning Dynamic Super Resolution) feature in the Nvidia Control Panel, which uses AI-enhanced downsampling to reduce the cost of rendering a game above native resolution.

Aaron Klotz
Contributing Writer

Aaron Klotz is a contributing writer for Tom’s Hardware, covering news related to computer hardware such as CPUs and graphics cards.

  • PlaneInTheSky
    I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

    If you need a magnifying glass to perceive 8k, you probably need an electron microscope to see 13k.
    Reply
  • ManDaddio
    I have been playing at 4k for several years now. I have both 4k and 1440p 144hz screens. There is a noticeable difference. But viewers may not notice because of the quality of the display.

    My 1440p screen has great colors and blacks versus my 4k TV. Therefore, the 1440p screen seems to look better. But when comparing similar quality screens the 4k will look better every time.

    And most people are close to their screens so I am not sure about the comment referring to distance.

    Size of screen matters, color quality matters, and resolution matters.

    If you look at screenshots, 4k will look 100% better. Video editors will also testify that 1440p is not better than 4k when up- or downscaling.
    Context matters; perception and preconceived thinking do too, as does the quality of hardware, etc.

    I think some people crap on 4k because they want high refresh and/or can't afford a good 4k solution.

    Yes, I said it. Get over it. 😄♥️
    Reply
  • -Fran-
    I play Waifu Impact at 1440p with my Vega64 and it looks quite good (hovers around 60FPS). I also play it on my phone (Samsung Note 10+) and it also looks quite good. Wherever you want to run this game, your Waifus are guaranteed to look good, so don't worry.

    At over "8K" resolution, you just get more Waifu details and that is always welcome, just like you get tiny Waifus in your phone.

    I did try Eyefinity with it, but not quite "13K" resolution. I haven't tried VR yet for it, but I'm sure the Waifus will still look great.

    Regards :LOL:
    Reply
  • brandonjclark
    ManDaddio said:
    I have been playing at 4k for several years now. I have both 4k and 1440p 144hz screens. There is a noticeable difference. But viewers may not notice because of the quality of the display.

    My 1440p screen has great colors and blacks versus my 4k TV. Therefore, the 1440p screen seems to look better. But when comparing similar quality screens the 4k will look better every time.

    And most people are close to their screens so I am not sure about the comment referring to distance.

    Size of screen matters, color quality matters, and resolution matters.

    If you look at screenshots, 4k will look 100% better. Video editors will also testify that 1440p is not better than 4k when up- or downscaling.
    Context matters; perception and preconceived thinking do too, as does the quality of hardware, etc.

    I think some people crap on 4k because they want high refresh and/or can't afford a good 4k solution.

    Yes, I said it. Get over it. 😄♥


    This is actually a fair take.

    When I purchased my 3090 rig I was dead set on a quality 1440p solution because to me, refresh rate is king, even in non-competitive titles.

    So, I agree.
    Reply
  • Nolonar
    PlaneInTheSky said:
    I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.
    This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen". All 3 screenshots were taken at maximum graphical settings, including AA. Without AA, the differences would be even more obvious. I recommend opening all 3 screenshots in a new tab and using CTRL+TAB to see the differences even more clearly.

    1080p

    1440p
    Notice the increased detail in the fence and the antenna on the house.

    2160p (4K)
    Notice the increased detail on the crane (one of the two support cables is now much more clearly visible); the antenna has gained even more detail, but the most obvious improvement is the tree in the background on the left, where some of the branches are now being rendered when they used to be invisible. Fun fact: without AA, the mesh in the fence would be barely recognizable at 1080p and 1440p, and perfectly rendered at 4K. The tree in the foreground (at the top right) would also have holes at 1080p.

    There's a reason why super resolution is a thing.
    By the way, I originally wanted to take a screenshot at 3240p (5K Dynamic Super Resolution), but only the top left corner was visible (even in the screenshot), which wasn't very useful for comparing.

    I mean, you guys are free to prioritise framerate over resolution, that's your choice. But to claim that there is no increase in (visible) detail is plain wrong. Have fun playing with your microscope, but stop assuming that everybody's as blind as you are.
    Reply
  • hotaru251
    1440p is the sweet spot for almost everyone... up to a point.


    say you have 2 monitors with the same specs, one 1440p and the other 4k.

    if they're 30 inches? you ain't gonna notice much. (the more you try to cram into a small thing, the less discernible it becomes)

    now if they're 40 inches?
    you might notice, because as the size gets bigger, the increased resolution becomes more noticeable.


    but if you're an average Joe with a 32" monitor... you likely don't even want 4k.
    it pushes your hardware harder for no real benefit while lowering your frames. (and 1440p at 240Hz is gonna feel/look better than 120Hz 4k at a smaller screen size)

    now if you're more into cinematic-style games, 4k is likely more beneficial (as you'd want a brighter/larger screen and not care as much about frames), but those types of games are in the minority.
    Reply
  • drivinfast247
    PlaneInTheSky said:
    I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

    If you need a magnifying glass to perceive 8k, you probably need an electron microscope to see 13k.
    Screen size is important.
    Reply
  • ikernelpro4
    PlaneInTheSky said:
    I don't find resolutions above 1440p useful. Nothing above that seems to actually add any detail unless you're sitting way too close to your screen.

    If you need a magnifying glass to perceive 8k, you probably need an electron microscope to see 13k.
    The difference between 1440 and 2160 is substantial just in pixel count alone.

    I would say that after 4k the differences become hard to tell.
    Reply
  • LordVile
    Nolonar said:
    This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen". All 3 screenshots were taken at maximum graphical settings, including AA. Without AA, the differences would be even more obvious. I recommend opening all 3 screenshots in a new tab and using CTRL+TAB to see the differences even more clearly.

    1080p

    1440p
    Notice the increased detail in the fence and the antenna on the house.

    2160p (4K)
    Notice the increased detail on the crane (one of the two support cables is now much more clearly visible); the antenna has gained even more detail, but the most obvious improvement is the tree in the background on the left, where some of the branches are now being rendered when they used to be invisible. Fun fact: without AA, the mesh in the fence would be barely recognizable at 1080p and 1440p, and perfectly rendered at 4K. The tree in the foreground (at the top right) would also have holes at 1080p.

    There's a reason why super resolution is a thing.
    By the way, I originally wanted to take a screenshot at 3240p (5K Dynamic Super Resolution), but only the top left corner was visible (even in the screenshot), which wasn't very useful for comparing.

    I mean, you guys are free to prioritise framerate over resolution, that's your choice. But to claim that there is no increase in (visible) detail is plain wrong. Have fun playing with your microscope, but stop assuming that everybody's as blind as you are.
    They all look the same to me; resolution only matters when you factor in distance. On a 32” TV, for example, anything over 1080p at a normal viewing distance isn’t noticeable, and that’s a physical limitation of the human eye.
    Reply
  • PlaneInTheSky
    Nolonar said:
    This is actually demonstrably wrong. Here are 3 screenshots I took in Half Life 2 at different resolutions, all downsampled to 1080p for your convenience, so you can't argue with "sitting way too close to your screen".

    What you're doing is downsampling. With downsampling you can, for example, overcome chroma subsampling and turn a 4:2:2 image into a 4:4:4 image.

    But that has nothing to do with being able to distinguish extra detail at higher resolutions. That extra detail per pixel only appears when you downsample.

    There are several videos on YouTube where they test 1080p vs 4k screens at close viewing distances, and people had a really hard time telling the difference.

    This idea that you can easily tell the difference is simply baloney imo. I think most people cannot tell the difference from normal viewing distances.

    Between 1440p and 4k, there's no way people can spot the difference from regular distances; it is incredibly hard to do even at 1080p.

    First people said they could see the difference between 4k and 1080p, now you have people claiming they can tell 8k from 4k, soon you'll have people claiming they can see 16k. Either some people have become mutants with incredible hawk-like vision, or they're trying to justify their $10,000 8k TV. I'm going with the latter.
    Reply