Nvidia GeForce RTX 2070 Founders Edition Review: Replacing GeForce GTX 1080

Results: Destiny 2 and Doom

Destiny 2 (DX11)

A 37% lead over GeForce GTX 1070 lines up with Nvidia's internally generated results comparing RTX 2070 to its generational predecessor. But the real comparison should be to GeForce GTX 1080. In that face-off, RTX 2070 is 9% faster for a 20%-higher price tag.

To RTX 2070’s credit, we’re glad to see 100+ FPS average performance in Destiny 2 using the game’s Highest quality preset. At the right price, this card could be a real winner (particularly since its nearest competition from AMD, Radeon RX Vega 64, is only 80% as fast).

Turning off SMAA allows the GeForce RTX 2070 Founders Edition to maintain playable frame rates at 4K in Destiny 2. The lead over GTX 1080 is a mere 8%, but a 28% lead over the $500 Radeon RX Vega 64 does speak to Nvidia’s competitive positioning.

Doom (Vulkan)

Although the five fastest graphics cards in our chart run into Doom's 200 FPS frame rate cap, GeForce RTX 2070 Founders Edition does not. An 8% lead over GTX 1080 and an 18% advantage over Radeon RX Vega 64 translate to perfectly smooth frame rates in Doom at 2560x1440.

Upping the resolution to 3840x2160 imposes a more taxing graphics workload, knocking frame rates down across the board. Now, GeForce RTX 2070 is 14% faster than GTX 1080 and 26% quicker than Radeon RX Vega 64.
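
To put those percentages in perspective, here is a minimal performance-per-dollar sketch based on the Destiny 2 QHD results above. The performance ratios (RTX 2070 roughly 9% ahead of GTX 1080, Vega 64 roughly 80% as fast as RTX 2070) come from this page; the street prices are assumptions for illustration only, with the RTX 2070 Founders Edition at its $599 list price and the GTX 1080 and Vega 64 near $500.

```python
# Hedged sketch: rough performance per dollar using the relative-performance
# figures quoted in this review. Prices are illustrative assumptions.

cards = {
    # name: (relative performance vs. GTX 1080, assumed price in USD)
    "GeForce GTX 1080":    (1.00, 500),          # baseline
    "GeForce RTX 2070 FE": (1.09, 599),          # ~9% faster than GTX 1080
    "Radeon RX Vega 64":   (1.09 * 0.80, 500),   # ~80% as fast as RTX 2070
}

base_perf, base_price = cards["GeForce GTX 1080"]
base_value = base_perf / base_price

for name, (perf, price) in cards.items():
    value = perf / price  # performance per dollar
    print(f"{name:22s} perf/$ = {value:.5f}  ({value / base_value:.0%} of GTX 1080)")
```

Under those assumed prices, the RTX 2070 Founders Edition lands slightly below the GTX 1080 on performance per dollar, which is the same arithmetic several readers work through in the comments below.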

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

63 comments
  • coolio0103
    The RTX 2070 is $200 more expensive than a GTX 1080 Strix or MSI, which has the same performance as the FE 2070. Bye bye 1080? Really? This page has dropped to the lowest tier of internet news pages. Bye bye Tom's Hardware.
  • 80-watt Hamster
    Frustrating how, in two generations, Nvidia's *70-class offering has gone from $330 to $500 (est). Granted, we're talking more than double the performance, so it can be considered a good value from a strict perf/$ perspective. But it also feels like NV is moving the goalposts for what "upper mid-range" means.
  • TCA_ChinChin
    GTX 1080 - $470
    RTX 2070 - $600

    A $130 increase for less than 10% FPS improvement on average. Disappointing, especially with the increased TDP, which means efficiency didn't really increase, so even for mini-ITX builds the heat generated is gonna be pretty much the same for the same performance.
  • jaexyr
    What a flop
  • cangelini
    2755195 said:
    The RTX 2070 is $200 more expensive than a GTX 1080 Strix or MSI, which has the same performance as the FE 2070. Bye bye 1080? Really? This page has dropped to the lowest tier of internet news pages. Bye bye Tom's Hardware.


    This quite literally will replace 1080 once those cards are gone. The conclusion sums up what we think of 2070 FE's value, though.
  • bloodroses
    2755195 said:
    The RTX 2070 is $200 more expensive than a GTX 1080 Strix or MSI, which has the same performance as the FE 2070. Bye bye 1080? Really? This page has dropped to the lowest tier of internet news pages. Bye bye Tom's Hardware.


    Prices will come down on the RTX 2070s. GTX 1080s won't be available for much longer. Tom's Hardware is correct in its assessment of the RTX 2070. Blame Nvidia for the price gouging on early adopters, and AMD for not providing proper competition.
  • demonhorde665
    Wow, seriously sick of the elitism at Tom's. Seems every few years they push up what they deem "playable frame rates." Just 3 years ago they were saying 40+ was playable, 8 years ago they were saying 35+, and 12+ years ago they were saying 25+ was playable. Now they're saying in this test that only 50+ is playable? Seriously, read the article, not just the frame scores; the author says at several points that the tests fall below 50 fps, "the playable frame rate." It's like they're just trying to get gamers to rush out and buy overpriced video cards. Granted, 25 fps is a bit of an eyesore on today's LCD/LED screens, but come on. 35 is definitely playable with little visual difference. 45+ is smooth as butter.
    Yeah, it would be awesome if we could get 60 fps on every game at 4K. It would be awesome just to hit 50 at 4K, but FFS, you don't have to try to sell the cards so hard. Admit it: gamers on a less-than-top-cost budget will still enjoy 4K gaming at 35, 40, or 45 fps. Hell, it's not like the cards doing 40-50 fps are cheap themselves… GTX 1070s still obliterate most consumers' pockets at $420-450 a card. The fact is, top-end video card prices have gone nutso in the past year or two... $600-800 for just a video card is f---king insane and detrimental to the PC gaming industry as a whole. Just 6 years ago you could build a decent mid-tier gaming rig for $600-700; now that same rig (in performance terms) would run you $1,000-1,200 because of this blatant price gouging by both AMD and Nvidia (but definitely worse on Nvidia's side). 5-10 years from now everyone will be saying that 120 fps is ideal and that anything below 100 fps is unplayable. It's getting ridiculous.
  • jeffunit
    With its fan shroud disconnected, GeForce RTX 2070’s heat sink stretches from one end of the card, past the 90cm-long PCB...

    That is pretty big, as 90cm is 35 inches, just one inch short of 3 feet.
    I suspect it is a typo.
  • tejayd
    Last line in the 3rd paragraph: "If not, third-party GeForce RTX 2070s should start in the $500 range, making RTX 2080 a natural replacement for GeForce GTX 1080." Shouldn't that say "making RTX 2070 a natural replacement"? Or am I misinterpreting "natural"?
  • Brian_R170
    The 20-series has been a huge letdown. Yes, the 2070 is a little faster than the 1080 and the 2080 is a little faster than the 1080 Ti, but they're both more expensive and consume more power than the cards they supplant. Shifting the card names to a higher performance bar is just a marketing strategy.
  • 0InVader0
    132378 said:
    Wow, seriously sick of the elitism at Tom's. Seems every few years they push up what they deem "playable frame rates." Just 3 years ago they were saying 40+ was playable, 8 years ago they were saying 35+, and 12+ years ago they were saying 25+ was playable. Now they're saying in this test that only 50+ is playable? Seriously, read the article, not just the frame scores; the author says at several points that the tests fall below 50 fps, "the playable frame rate." It's like they're just trying to get gamers to rush out and buy overpriced video cards. Granted, 25 fps is a bit of an eyesore on today's LCD/LED screens, but come on. 35 is definitely playable with little visual difference. 45+ is smooth as butter. Yeah, it would be awesome if we could get 60 fps on every game at 4K. It would be awesome just to hit 50 at 4K, but FFS, you don't have to try to sell the cards so hard. Admit it: gamers on a less-than-top-cost budget will still enjoy 4K gaming at 35, 40, or 45 fps. Hell, it's not like the cards doing 40-50 fps are cheap themselves… GTX 1070s still obliterate most consumers' pockets at $420-450 a card. The fact is, top-end video card prices have gone nutso in the past year or two... $600-800 for just a video card is f---king insane and detrimental to the PC gaming industry as a whole. Just 6 years ago you could build a decent mid-tier gaming rig for $600-700; now that same rig (in performance terms) would run you $1,000-1,200 because of this blatant price gouging by both AMD and Nvidia (but definitely worse on Nvidia's side). 5-10 years from now everyone will be saying that 120 fps is ideal and that anything below 100 fps is unplayable. It's getting ridiculous.


    The 10-series cards and the consoles were the ones that marked a big step up in standards, namely making 60 fps at 1080p the minimum - which it fcking should be by now, let's be honest. If my ancient GTX 760 manages 60 fps in most modern games, I'd expect these cards to do 144+.

    Speaking of which, 144Hz at 1080p/1440p is what most people want anyway, and they want the high refresh rates a lot more than 4K too.
  • bit_user
    132378 said:
    Wow, seriously sick of the elitism at Tom's. Seems every few years they push up what they deem "playable frame rates." Just 3 years ago they were saying 40+ was playable, 8 years ago they were saying 35+, and 12+ years ago they were saying 25+ was playable. Now they're saying in this test that only 50+ is playable?

    I don't know if it's fair to pin fps inflation on Tom's. It would be interesting to do a similar survey of other publications and perhaps forum posts on various sites.

    Maybe what's happening is that many people are upgrading graphics cards separately from their monitors, and getting spoiled on high fps at lower resolutions.

    132378 said:
    35 is definitely playable with little visual difference. 45+ is smooth as butter

    Depends on what kind of game and, to some extent, whether your monitor is variable refresh. With a flick of the wrist to glance around a corner, a few extra frames could save your life. But in an RTS game, I think high fps is basically just eye candy.

    VRR 4K monitors are a fairly recent development - some 4K owners might still be using fixed-refresh models. Dropped frames on fixed-refresh monitors show as motion judder, and that can be pretty noticeable.
  • cangelini
    70240 said:
    With its fan shroud disconnected, GeForce RTX 2070’s heat sink stretches from one end of the card, past the 90cm-long PCB... That is pretty big, as 90cm is 35 inches, just one inch short of 3 feet. I suspect it is a typo.


    Believe that should have been 19cm--fixed!
  • cangelini
    328798 said:


    Thank you bit!
  • csm101
    Not going to get the FE edition. I'm waiting for EVGA to release their SC Gaming edition, which will have a price tag of $499. My 970 is dead and I'm playing games on an old 670, so it's time to get that 2070.
  • AgentLozen
    The more I dug into this article, the more frustrated I grew with the RTX 2070. I was bottling up my anger until the end of the article for a scathing comment, but this article's conclusion as well as the top comments in the forum really summarized what I was feeling.

    Nvidia has refined Pascal into Volta and then into Turing over the last 2 years, but all it seems to amount to is enhanced AI computation, ray tracing, and a minor efficiency bump. They're moving in the right direction, but I'm a little underwhelmed with the finished product.

    I feel like in 1st grade of elementary school, Nvidia taught its Pascal chips single-digit addition. Two years later, in 3rd grade, they just got around to double-digit addition. Not a very big improvement for how much time has gone by.
  • magresens
    So in conclusion, the new Turing cards are not worth the money at this time.
  • mlee 2500
    1781251 said:
    Frustrating how, in two generations, Nvidia's *70-class offering has gone from $330 to $500 (est). Granted, we're talking more than double the performance, so it can be considered a good value from a strict perf/$ perspective. But it also feels like NV is moving the goalposts for what "upper mid-range" means.


    Other PC components, including GPUs and other semiconductor parts, have historically improved without prices adjusting far in excess of inflation. No, this is far more about what the market will bear, and maybe just a little bit about the higher cost of technologies (like faster DDR) or foundry time.
  • mlee 2500
    This seems like a really nice card, but when you're already paying well over $500, it's hard not to justify paying another $200 for a card with ~33% or more performance (like the RTX 2080).

    That is, based on the results in this article, the value proposition doesn't seem to hold up. The performance delta is far greater than the cost savings.

    I'd pay half as much for 2/3rds the performance, but that's really not the case here. Seems like a lot less GPU for just a little less money.
  • keith12
    1791309 said:
    1781251 said:
    Frustrating how, in two generations, Nvidia's *70-class offering has gone from $330 to $500 (est). Granted, we're talking more than double the performance, so it can be considered a good value from a strict perf/$ perspective. But it also feels like NV is moving the goalposts for what "upper mid-range" means.
    Other PC components, including GPUs and other semiconductor parts, have historically improved without prices adjusting far in excess of inflation. No, this is far more about what the market will bear, and maybe just a little bit about the higher cost of technologies (like faster DDR) or foundry time.



    Exactly. People can give out about the price and/or performance, but sadly it's down to what people are willing to pay. In a market where Nvidia have almost complete dominance (at least for the high end), they can pretty much charge what they want! The premium covers the generational leap in performance, as it nearly always has, and also includes features that will possibly bring massive improvements in graphical goodness and realism.

    I would agree with others, though; at least for me, it's not worth it. I couldn't care less about ray tracing or the fancy effects it may deliver. But I do care about max FPS with high Hz. Does the card deliver on both fronts? Yes. Will I make use of the extra features now if I buy one? No. The caveat being, the extra capabilities may be worth the extra spondulix. We just don't know yet.

    This reminds me of the days when the GeForce 256 brought hardware T&L to the mainstream. Prices were crazy, but the performance and effects were worth it for the enthusiast. Back then, did I pay the extra bucks for the performance the GF256 offered at the time? Yes, I did. And it was glorious as more games took advantage of it.
  • mlee 2500
    182540 said:
    1791309 said:
    1781251 said:
    Frustrating how, in two generations, Nvidia's *70-class offering has gone from $330 to $500 (est). Granted, we're talking more than double the performance, so it can be considered a good value from a strict perf/$ perspective. But it also feels like NV is moving the goalposts for what "upper mid-range" means.
    Other PC components, including GPUs and other semiconductor parts, have historically improved without prices adjusting far in excess of inflation. No, this is far more about what the market will bear, and maybe just a little bit about the higher cost of technologies (like faster DDR) or foundry time.
    Exactly. People can give out about the price and/or performance, but sadly it's down to what people are willing to pay. In a market where Nvidia have almost complete dominance (at least for the high end), they can pretty much charge what they want! The premium covers the generational leap in performance, as it nearly always has, and also includes features that will possibly bring massive improvements in graphical goodness and realism. I would agree with others, though; at least for me, it's not worth it. I couldn't care less about ray tracing or the fancy effects it may deliver. But I do care about max FPS with high Hz. Does the card deliver on both fronts? Yes. Will I make use of the extra features now if I buy one? No. The caveat being, the extra capabilities may be worth the extra spondulix. We just don't know yet. This reminds me of the days when the GeForce 256 brought hardware T&L to the mainstream. Prices were crazy, but the performance and effects were worth it for the enthusiast. Back then, did I pay the extra bucks for the performance the GF256 offered at the time? Yes, I did. And it was glorious as more games took advantage of it.


    Right! And let's face it, there has to be a certain amount of ray tracing market saturation... a certain number of consumers who have the feature... before developers will really start to code for it. That means the first generation of cards with ray tracing GPU resources will never or hardly ever use it, and it will be two or three more generations of NVIDIA products before it's meaningful to consumers.

    Just like nobody would bother coming out with a 4K 120Hz monitor back when the most prevalent GPU interface was DisplayPort 1.2 (4K@60Hz). It will take a few years of saturation with 10-series and 20-series cards, which have DisplayPort 1.4, before there exists a consumer base to even sell such a product to.
  • logainofhades
    Sounds like another "don't buy it." We sorely need AMD to bring some competition to the table. Nvidia's price gouging has gotten out of control.
  • hannibal
    Hmmm... third-party 2070s seem to cost about 650€ in Europe, so even higher than the FE version... The 1070 was almost as fast as the 1080; now there is a huge gap between the 2080 and the 2070...
    In reality, this 2070 is in the same category as the 1060 was compared to the 1080. So it seems, like we all know, that the 1070 was too good and too cheap. Now that is "remedied" by these new namings. Interesting to see what the 2060 will look like next year. Cost $500? Good price for a "mid-range" card...

    These are good GPUs; the price just doesn't match the old series. Normally, the old cards get so cheap that they have better bang for the buck. Now the old cards keep their old price and the new ones get more expensive, resulting in the same situation but offering nothing to customers.

    Update: Asus price in Germany ;)
    https://geizhals.de/asus-rog-strix-geforce-rtx--oc-90yv0c90-m0na00-a1907931.html?