Remnant II Devs Designed Game With DLSS/FSR 'Upscaling In Mind'

(Image credit: Remnant 2)

The developers behind the Unreal Engine 5 game Remnant 2 have confirmed via Reddit that the game was designed with DLSS/FSR/XeSS image upscaling in mind to achieve playable frame rates on modern PC hardware, rather than being optimized to run at native resolution, as many PC gamers would expect. Unsurprisingly, this did not sit well with the Remnant 2 community, with many Redditors calling out the devs for laziness.

Image upscaling is a very effective tool for improving performance. With the right upscaler, you can get excellent returns in frame rate at a low cost to image quality. However, relying on upscaling from the outset to reach playable performance is still uncommon on PC, though it is becoming more frequent.
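To put numbers on that trade-off: the common upscaler quality presets render internally at a fraction of the output resolution, so shading cost drops with the square of the per-axis scale factor. Below is a minimal Python sketch, assuming the commonly published DLSS 2 / FSR 2 preset factors; individual games can and do override these values.

```python
# Minimal sketch: internal render resolution per upscaler quality mode.
# Assumed scale factors follow the commonly published DLSS 2 / FSR 2
# presets (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra 3.0x).
SCALE = {
    "Quality":           1 / 1.5,
    "Balanced":          1 / 1.7,
    "Performance":       1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before the upscaler runs."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_resolution(1920, 1080, mode)
    share = (w * h) / (1920 * 1080)
    print(f"{mode:>17}: {w}x{h} ({share:.0%} of output pixels)")
```

At Remnant 2's default of DLSS Performance mode and 1080p output, the GPU is shading roughly 960x540, about a quarter of the output pixels per frame, which is where most of the frame-rate gain comes from.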

Historically, upscalers on PC were a last resort for clawing back performance, not a first-line tool the way they are on consoles. Another problem is that upscaling can mask poor or sloppy optimization, reducing the performance a system can deliver natively.

Unfortunately, YouTuber Daniel Owen confirmed the developers' statement, finding that the game is extremely demanding at every graphics preset. At the game's default settings (medium graphics with DLSS Performance mode), an RTX 2060 / Core i5-9600K system managed just over 60 fps at 1080p. Owen tried to reach 60 fps at native resolution but found it impossible, even at low settings. On top of this, the game suffered from severe micro-stutter on the older Intel CPU.

To hit 60 fps at 1080p ultra settings, a configuration modern 60-class cards can achieve in most titles, Owen had to step up from the RTX 2060 to an RTX 3080 12GB. Matching that performance at 1440p (let alone 4K) required an even bigger jump, to Nvidia's current flagship GPU, the RTX 4090.

There is no denying that this game is incredibly demanding, and most gamers will almost certainly need image upscaling to run it at playable frame rates. Unfortunately, we don't know how much of that stems from sloppy optimization and how much reflects a game that is genuinely this intensive.

The worst part is that the game does not look visually stunning compared to similarly demanding AAA titles like Cyberpunk 2077 (or even Unreal Engine 5 demos), which makes its reliance on heavy upscaling rather embarrassing. In fact, we're not sure the game has any ray-tracing effects at all, since the graphics menu lacks any such options. Hopefully, the developers will push out performance updates that alleviate at least some of these demands.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom's Hardware US, covering news related to computer hardware such as CPUs and graphics cards.

  • emike09
    There are many better-looking games that run much smoother. I was excited to play a full UE5 title, so I thought I'd give it a try, but ended up refunding it (for now). Nanite sure does make objects look great, but the shadows are a joke and constantly flicker, especially in cutscenes, and the lighting is flat and boring. 30-40 fps at native 4K on my 4090 is unacceptable for a game that looks like it came out 5 or 6 years ago. I guess I see why they skimped on Lumen and other RT rendering. I had also hoped UE5 would overcome poor development choices. I'm not a developer, so I can't speak much to optimization, but this is ridiculous.
  • -Fran-
    And this is why I've always said upscaling for PC games is stupid. Keep console stuff on consoles.

    Enjoy crappier and crappier performance in games thanks to everyone parroting DLSS (and even FSR/XeSS) as a "feature". Talk about drinking the whole Kool-Aid bottle and then some.

    Cynicism aside, I hope people don't give them a pass for this and actually push back with a "yeah, no".

    Regards.
  • Makaveli
    "To hit 60 fps at 1080p ultra settings — a configuration modern 60-class cards can achieve with most titles — Owned had to jump up from the RTX 2060 to an RTX 3080 12GB."

    I know it's an autocorrect typo, but I found it funny. Going to start calling him Daniel Owned now ;)
  • Alvar "Miles" Udell
    Translation: In 6 years GPUs have only about doubled in speed (the 4090 doesn't count, as it's Titan-class), so since progress has stalled you need to reduce your IQ to play our game, and since nobody wants to admit they reduce their IQ in 2023, you must use "upscaling techniques" that allow you to say you're using max settings.
  • umeng2002_2
    What an odd way of saying that you are bad at optimization...
  • mhmarefat
    RT and Nvidia are to blame for this freak show, not only for this game but for all the other modern, poorly optimized games that have made jokes of themselves by completely hiding behind upscaling tech. DLSS/FSR was supposed to make the so-called "RT" performance tax acceptable, yet 6 years later games are releasing without "RT" while keeping the upscaling clown show, without bothering to truly optimize... thank you, Nvidia (and all those who supported Nvidia's greed show). The best part? These billionaire companies are now telling us "RT" was all an illusion and "Path Tracing" is the real "RT". So yeah, be prepared for $10k graphics cards if you want "RT". For the rest of us, all our games are being ruined, RT or not.
  • tamalero
    umeng2002_2 said:
    What an odd way of saying that you are bad at optimization...
    This. What was feared has happened.
    Devs are now using DLSS and frame generation as a crutch instead of optimizing.
  • Amdlova
    Whatttt, RTX 4060 720p edition

    The old times are coming back... upscaling 4:3 1024x768
  • Aleksey Green
    tamalero said:
    This. What was feared has happened.
    Devs are now using DLSS and frame generation as a crutch instead of optimizing.
    And what's wrong with developers using upscalers?
    What's wrong with upscalers?
    99.9% of the content people consume is lossy (MP3, MP4, JPEG...), tons of content, and then suddenly some nerds decided to hate on DLSS, whose flaws need to be examined under a microscope.
    You have decided for yourself how things should be, without taking into account all the nuances of chip production in 2023 and the stagnation in the gaming industry.
    But when ordinary gamers realize that the hate for the 4000 series was largely unjustified, they will quickly shut up "specialists" like you.
  • umeng2002_2
    At 1440p and DLSS Quality, you can still tell when it's on. It's not nearly as bad as simple bilinear scaling, but it's not imperceptible. In a lot of titles where I use DLSS, I'm still injecting AMD CAS via ReShade to bring the sharpness back up to near native. Add to that the fact that a lot of games don't adjust their texture mip LOD bias when DLSS or other upscalers are engaged, and you have an even more compromised experience.

    Look at the Witcher 3 RT updates. They bloody just used a DX11-to-12 wrapper and told people to just use frame generation because of the CPU penalty of d3d11on12.
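On the mip LOD bias point raised in the last comment: upscaler integration guidance from both Nvidia (DLSS) and AMD (FSR 2) tells developers to apply a negative texture LOD bias when rendering below output resolution, so texture sampling stays as detailed as it would be at native; games that skip this sample blurrier mip levels than the output resolution warrants. Below is a minimal sketch of the commonly cited formula; some vendor guides recommend a further small negative offset on top, so treat this as an approximation rather than any specific SDK's exact rule.

```python
from math import log2

def upscaler_mip_bias(render_width: int, display_width: int) -> float:
    """Negative mip LOD bias compensating for a reduced internal resolution.

    Commonly cited form: log2(render / display). Some vendor guides add
    a further small negative offset; check the specific SDK docs.
    """
    return log2(render_width / display_width)

# DLSS Performance mode at 1080p output renders 960 pixels wide:
print(upscaler_mip_bias(960, 1920))  # -1.0, i.e. one mip level sharper
```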