Nvidia's midrange GPUs through the years revisited: pitting the RTX 5070 against the 4070, 3070, and 2070 in an all-encompassing gaming showdown

GeForce RTX 3050 graphics cards
(Image credit: Nvidia)

Nvidia’s midrange GPUs have long been a sweet spot for gamers seeking solid performance without breaking the bank, so it's high time we revisited them. ComputerBase.de’s recent testing of the RTX 2070, 3070, 4070, and 5070 at 1080p and 4K provides valuable insight into how each generation has evolved, not just in raw performance but in efficiency, thermal behavior, and architectural advancement. Don't expect many surprises from this roundup, and it runs a bit longer than the average news piece, but the numbers are worth a look.

Compared at 1080p

In more forgiving rasterized titles like Diablo II: Resurrected and Overwatch 2, which already run comfortably above 100 FPS on the 2070 at 1080p, the latest cards push into the 300–380 FPS range, far beyond the limits of most monitors, though display makers keep pushing refresh rates higher. The takeaway is that not every title stresses modern GPUs equally at lower resolutions, but the newer architectures consistently scale better in demanding scenarios, especially where ray tracing is involved.

4K is a different story

The 4K benchmarks are where these GPUs really stretch their legs and the performance hierarchy comes into sharper focus. In Cyberpunk 2077 at ultra settings with ray tracing and FSR Balanced, the 4070 delivers 27.8 FPS while the 5070 manages 32.0 FPS, a modest uplift of around 15% but still a generational gain. Ratchet & Clank: Rift Apart tells a similar tale: the 4070's 92.6 FPS and the 5070's 119.2 FPS work out to nearly five and over six times the 2070's 19.2 FPS baseline, respectively. Horizon Forbidden West also benefits heavily, climbing from 31.6 FPS on the 2070 to 91.1 FPS on the 5070, showing how newer GPUs increasingly make 4K playable even in modern open-world titles.
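
To put those gains in perspective, here's a minimal Python sketch, using only the ComputerBase.de figures quoted above, that works out each card's 4K uplift over the 2070 baseline (the dictionary layout is just illustrative):

# 4K results (FPS) quoted above from ComputerBase.de's testing
quoted_fps = {
    "Ratchet & Clank: Rift Apart": {"RTX 2070": 19.2, "RTX 4070": 92.6, "RTX 5070": 119.2},
    "Horizon Forbidden West": {"RTX 2070": 31.6, "RTX 5070": 91.1},
}

for game, fps in quoted_fps.items():
    baseline = fps["RTX 2070"]  # every uplift is measured against the 2070
    for card, value in fps.items():
        if card != "RTX 2070":
            # Express the uplift as a multiple of the 2070's frame rate
            print(f"{game}: {card} runs at {value / baseline:.1f}x the 2070")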

The esports angle scales as well: Overwatch 2 runs at 99.3 FPS on the 2070 but hits 275.2 FPS on the 5070, showing that high-refresh 4K competitive gaming is no longer out of reach. The generational contrast is sharpest here, not only because raw throughput matters more at 4K, but also because features like DLSS, which recently moved to a new Transformer-based model, gain greater leverage.

4K Performance (Source: ComputerBase.de)

Metric                          | RTX 2070 | RTX 3070 | RTX 4070 | RTX 5070
--------------------------------|----------|----------|----------|---------
Raster (Avg FPS)                |       36 |       61 |       85 |      107
Ray Tracing (Avg FPS)           |       25 |       48 |       55 |       63
Average Power (W)               |      171 |      218 |      188 |      231
Efficiency (FPS/W, Raster)      |     0.21 |     0.28 |     0.45 |     0.46
Efficiency (FPS/W, Ray Tracing) |     0.15 |     0.22 |     0.29 |     0.27

Power and efficiency further tilt the scales. Despite being slower in absolute terms, the 4070 consistently draws 20–25% less power than the 5070, translating to 15–25% stronger performance-per-watt in titles like Cyberpunk 2077, Horizon Forbidden West, and Doom Eternal. In some cases, such as Doom Eternal with ray tracing, the 4070 more than doubles the 2070's performance while consuming almost the same wattage, underscoring the architectural leaps. The RTX 4070 and 5070 are effectively neck and neck on efficiency, trading blows from title to title, though the 4070 often leads.
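
The efficiency rows in the table above are simple division; this short Python sketch, using only the averages from the table, reproduces them:

# Averages from the 4K table above (Source: ComputerBase.de)
avg_fps_raster = {"RTX 2070": 36, "RTX 3070": 61, "RTX 4070": 85, "RTX 5070": 107}
avg_fps_rt = {"RTX 2070": 25, "RTX 3070": 48, "RTX 4070": 55, "RTX 5070": 63}
avg_power_w = {"RTX 2070": 171, "RTX 3070": 218, "RTX 4070": 188, "RTX 5070": 231}

for card, watts in avg_power_w.items():
    # Performance-per-watt is just average FPS divided by average board power
    print(f"{card}: {avg_fps_raster[card] / watts:.2f} FPS/W raster, "
          f"{avg_fps_rt[card] / watts:.2f} FPS/W ray tracing")

Run it and you get the same 0.45 versus 0.46 raster and 0.29 versus 0.27 ray-tracing split that puts the 4070 and 5070 so close together.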

This small advantage positions it as the most balanced upgrade in the lineup, particularly for players who care about thermals and acoustics as much as raw frame rates. Owners of the 2070 will find transformative gains by moving to either the 4070 or 5070, while 3070 users benefit most from the 4070's efficiency and DLSS enhancements. For 4070 owners, though, the leap to a 5070 often feels underwhelming: the higher power draw and thermal cost buy modest raw FPS improvements while carrying an efficiency penalty that undermines the real-world value.

RTX 4070 is a standout

In conclusion, ComputerBase.de’s testing confirms many anticipated trends while offering insights that clarify the bigger picture in hindsight. As expected, the 4070 stands out as the best balance of performance, efficiency, and thermal management, while the 5070 provides peak throughput in most scenarios but with diminishing practical returns. Of course, if money is no object, there's no reason not to pick the 5070, and the 3070 also offers significant improvements over its Turing-based predecessor.

Moreover, this testing shows how the midrange segment today rewards intelligent architecture, efficient memory handling, and AI-assisted frame generation over brute-force scaling, which is why the RTX 5070 fares unimpressively against the RTX 4070, a card that exemplifies the principle in practice. Make sure to check out ComputerBase.de's coverage yourself; their interactive charts offer an even more granular look at the differences.


Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • Notton
    I glanced at the review.
    Not including 1440p results on xx70 series cards... is a choice.
    Reply
  • DexSK
    Seriously, midrange? The last one I remember worthy of that description was the 1660.
    From the age of RTX there's no midrange anymore. Either it's underperforming entry-level 50/60's, or high-end 70/80's and overkill 90's. All of them are overpriced in my view. Anytime I upgrade I'm furious that I have to either pay half the price of the rig for a single component (the GPU) to get something at least moderately worth the money, or settle for previous-gen that got cheaper because of the new bells and whistles on the market. Next upgrade I'm considering a Radeon instead, after ditching the f00ked up 13th gen i7 for a Ryzen three weeks ago. Midrange used to be the best bang for the buck; nowadays it's a sad joke.
    Reply
  • Diabl0
    Midrange is 60 not 70.
    Reply
  • Baeloro1481
    Be me: browsing articles on doomscroll... See an article about 70 class cards. Reading some nonsense...

    My setup:
    MSI Mag z590 tomahawk
    Intel i5 10400 cooled by a Hyper 212 Black RGB edition.
    Corsair 2x16 ddr4 3200
    Rtx 3070 fe
    Corsair Rm750e PSU
    Sn770 500gb as a boot drive.
    Sn850 1tb as a game drive.

    Using the 1080p settings described... DLSS Balanced with Rtx medium... Getting 58fps with an average of 55 and a max of 79, 38 fps 1% low. Frametime showing 19.5 ms (cuz DLSS) and yes I waited for Frametime to normalize.

    These settings result in a very poor looking image... Turning OFF rtx, suddenly you're not blowing out pixels with highlights everywhere. Turning off DLSS results in better response times (go figure).

    By running native 1080p without rtx, cyberpunk 2077 on a 3070 pushes 90fps with an average of 85, max of 146 and 52fps 1% low. Looking at 9ms on the Frametime and the game generally looks better with these settings at this resolution.

    Unsure what you guys are doing to test games, but this is what I threw together in about 30 minutes of testing on a fresh cyberpunk install. Again, this was my result after running through the city after letting Frametime normalize. No driving, no combat, no sitting in a corner to get best results. Normal game play.

    Even setting rtx to ultra with DLSS quality results in 69 fps avg with 81 max and 54 1% low. 13ms response time. My CPU is running 4000mhz and my ram at 3200mhz. My GPU is running stock clock speeds.

    My Rtx 3070 is doing as well as the 4070/5070 numbers you are showing here for 1080p. Unsure why yours are so bad...
    Reply
  • mortsmi780
    How are they getting such low fps on Cyberpunk 2077 with RT medium and DLSS balanced at 1080p? I get 138fps avg with RT psycho and DLSS balanced at 1440p with a 4070ti.
    Reply
  • nitrium
    I'm happy enough with my RTX 5070. Of course I was coming from an RTX 2060 (6GB), so of course I was going to be massively impressed with the performance boost. Not so much the price though.
    Reply
  • MarKers94
    Baeloro1481 said:
    Be me: browsing articles on doomscroll... See an article about 70 class cards. Reading some nonsense...

    My setup:
    MSI Mag z590 tomahawk
    Intel i5 10400 cooled by a Hyper 212 Black RGB edition.
    Corsair 2x16 ddr4 3200
    Rtx 3070 fe
    Corsair Rm750e PSU
    Sn770 500gb as a boot drive.
    Sn850 1tb as a game drive.

    Using the 1080p settings described... DLSS Balanced with Rtx medium... Getting 58fps with an average of 55 and a max of 79, 38 fps 1% low. Frametime showing 19.5 ms (cuz DLSS) and yes I waited for Frametime to normalize.

    These settings result in a very poor looking image... Turning OFF rtx, suddenly you're not blowing out pixels with highlights everywhere. Turning off DLSS results in better response times (go figure).

    By running native 1080p without rtx, cyberpunk 2077 on a 3070 pushes 90fps with an average of 85, max of 146 and 52fps 1% low. Looking at 9ms on the Frametime and the game generally looks better with these settings at this resolution.

    Unsure what you guys are doing to test games, but this is what I threw together in about 30 minutes of testing on a fresh cyberpunk install. Again, this was my result after running through the city after letting Frametime normalize. No driving, no combat, no sitting in a corner to get best results. Normal game play.

    Even setting rtx to ultra with DLSS quality results in 69 fps avg with 81 max and 54 1% low. 13ms response time. My CPU is running 4000mhz and my ram at 3200mhz. My GPU is running stock clock speeds.

    My Rtx 3070 is doing as well as the 4070/5070 numbers you are showing here for 1080p. Unsure why yours are so bad...
    It's great you're getting better numbers; I've seen others report better numbers. However, the takeaway from the article, since I assume consistent benchmarking from test to test, is the gen-on-gen improvements. Presumably if you had access to the next gen newer card and reran YOUR test methodology you would see similar improvements.

    This isn't about how good the cards are or aren't compared to what you (anyone reading the article) are capable of. It's one person's testing methodology on successive gens of xx70 cards and how they compare to each other.

    Take the info for what it is: proof that there have been gen-over-gen improvements even if we dislike the price.
    Reply
  • VizzieTheViz
    MarKers94 said:
    It's great you're getting better numbers; I've seen others report better numbers. However, the takeaway from the article, since I assume consistent benchmarking from test to test, is the gen-on-gen improvements. Presumably if you had access to the next gen newer card and reran YOUR test methodology you would see similar improvements.

    This isn't about how good the cards are or aren't compared to what you (anyone reading the article) are capable of. It's one person's testing methodology on successive gens of xx70 cards and how they compare to each other.

    Take the info for what it is: proof that there have been gen-over-gen improvements even if we dislike the price.
    Yeah but there’s the rub isn’t it?

    I don’t think anyone disputes there’s been gen-on-gen improvement in absolute performance; however, it used to be (yeah, getting old over here) that the next gen was in roughly the same price bracket as the previous gen. So you got more performance per dollar or euro or whatever currency.

    Now you get more performance but also a price increase that’s at least as high as the performance increase (usually more than the performance increase) so per dollar there’s no performance increase, might even be a degradation in performance.
    Reply
  • das_stig
    Baeloro1481 said:
    Be me: browsing articles on doomscroll... See an article about 70 class cards. Reading some nonsense...

    My setup:
    MSI Mag z590 tomahawk
    Intel i5 10400 cooled by a Hyper 212 Black RGB edition.
    Corsair 2x16 ddr4 3200
    Rtx 3070 fe
    Corsair Rm750e PSU
    Sn770 500gb as a boot drive.
    Sn850 1tb as a game drive.

    Using the 1080p settings described... DLSS Balanced with Rtx medium... Getting 58fps with an average of 55 and a max of 79, 38 fps 1% low. Frametime showing 19.5 ms (cuz DLSS) and yes I waited for Frametime to normalize.

    These settings result in a very poor looking image... Turning OFF rtx, suddenly you're not blowing out pixels with highlights everywhere. Turning off DLSS results in better response times (go figure).

    By running native 1080p without rtx, cyberpunk 2077 on a 3070 pushes 90fps with an average of 85, max of 146 and 52fps 1% low. Looking at 9ms on the Frametime and the game generally looks better with these settings at this resolution.

    Unsure what you guys are doing to test games, but this is what I threw together in about 30 minutes of testing on a fresh cyberpunk install. Again, this was my result after running through the city after letting Frametime normalize. No driving, no combat, no sitting in a corner to get best results. Normal game play.

    Even setting rtx to ultra with DLSS quality results in 69 fps avg with 81 max and 54 1% low. 13ms response time. My CPU is running 4000mhz and my ram at 3200mhz. My GPU is running stock clock speeds.

    My Rtx 3070 is doing as well as the 4070/5070 numbers you are showing here for 1080p. Unsure why yours are so bad...
    Basically all I read from this article was: buy the new shiny Nvidia card for daft money, as it gives a small bump each generation that we carefully control to squeeze every cent of revenue out of at every level, that you may or may not need, while boosting our profits.
    Reply
  • kyzarvs
    In 2018 (2070), the UK Average weekly wage was £510, launch price £479
    In 2020 (3070), the UK Average weekly wage was £550, launch price £469
    In 2022 (4070), the UK Average weekly wage was £610, launch price £589
    In 2025 (5070), the UK Average weekly wage was £715, launch price £549

    As a non-fan of nVidia's pricing, I have to say I was really quite surprised when collating these figures; they are much more reasonable than my bias wanted them to be!

    Of course the unknown is just how many people managed to get cards at launch at that price. It'd be interesting to see that stat through the generations.
    Reply