Nvidia Teases GeForce RTX 2080 Performance in Rasterized Games

Status
Not open for further replies.

Patrick_1966

Reputable
Jan 19, 2016
If I can't get 60-144 fps in 4K, then RTX is a total failure for me and not worth the outrageous price. Ray tracing really doesn't help me play my FPS games better. Most of the games I play don't have photorealistic imaging anyway. I am not doing engineering design of living room furniture, where the final render would benefit from ray tracing enhancing the coffee table's reflections of the lamp sitting on it to make it look photoreal. The new Tomb Raider benchmark shows 40 fps as it is now. The first thing I would do is turn off ray tracing, since it slows me down. For the majority of the gaming and mining market, the RTX is a complete failure and should be ignored.
 

Patrick_1966

Reputable
Jan 19, 2016
Nvidia has decided that gamers have no value and, after spending two full years, has added nothing to the value of its graphics cards. There is no value here for gamers. Maybe in another two years, after all the games have been rewritten and are able to take advantage of the secret tweaks that are currently unreleased, it might improve. But that will make a total of four years without significant improvements over the 1080 Ti. The pricing is simply ridiculous as well. Gamers starting out simply do not have the money and cannot justify thousands of dollars for a graphics card. This is a total failure on the part of Nvidia and its partners.
 
I think we need to see this whole RTX thing from Nvidia's perspective. What does the average/casual gamer want? Image quality. They don't care diddly about frame rate; all they want is that fancy cinematic experience. Unfortunately for all of us hardware enthusiasts/hardcore gamers, we are not exactly a large community compared to the average consumer.

Nvidia is always going to target the largest consumer market first, hence Ray Tracing taking full priority.

I'm not saying I like this decision one bit, but Nvidia is a company and companies do what gives them the best $$$ for their work.

I wish Nvidia would have launched Volta AND Turing: have Turing be the premium GPU for those consumers who want to get into ray tracing fast, then have Volta be the standard card that specifically replaces Pascal.
 


Dude, the haters are always gonna hate. It wouldn't matter if Nvidia released a card that could produce hard-light holograms, turn lead into gold, cure cancer, and bring about an end to world hunger and poverty.
 

MCMunroe

Distinguished
Jun 15, 2006
First of all, I have to disclose that I am an enthusiast PC hardware guy who is into maximum visual quality and tends to turn up the graphics until I'm in the 45-60 FPS range. I own a water-cooled GTX 1080 Ti and play all my games in 4K.
I also think that people who think they need over 100 FPS to enjoy a game are morons. … if I read one more forum post of "Help, I can't get 500 FPS in CS:GO"...

Guys, let us wait for the real reviews and benchmarks. Ray tracing may be too slow to turn on right now, but all these effect upgrades start out too slow, like HairFX and even anti-aliasing.

Ray tracing is a holy grail of rendering; it is what modern animated movies are rendered with. It is good that they are pushing it, even if it will take a few generations of hardware to be broadly usable.
 

Giroro

Splendid
@TechyInAZ

The "average/casual" gamer isn't out there spending $800 on a graphics card.
The "largest consumer market" is the 85% of gamers who spend less than $300, and Nvidia has not announced any plans whatsoever to bring ray tracing to that market segment. Even if they do, the performance hit on the lower end hardware will be so dire that most games will become totally unplayable, regardless.

Of course, I would argue that Nvidia doesn't care about ray tracing or AI in the high-end gaming segment either. They are just trying to figure out a sales pitch to convince gamers to pay a markup for the enterprise features left over in these factory-reject Quadro chips.
The GeForce RTX series has ray tracing because Disney/Pixar/Autodesk/Adobe (Nvidia even uses those logos in their marketing) are willing to pay five figures for a fully featured Turing card. The AI features are similarly there for Google, Amazon, and automakers.

Gamers are just being served leftovers that are being repurposed as a marketing gimmick. This isn't a big deal in itself; that's what they always do. But I don't think Nvidia actually has much motivation to ensure it catches on with developers, or even to fully flesh out drivers to ensure that the tech can be used effectively in games.
There's a good chance that Nvidia will drop ray tracing and DLSS when they start trying to get the die size down for low/mid-range gaming. Of course, that's assuming they don't just slap 2060/2050 labels on their overstock 1080/1070 chips like AMD would.
 

Geef

Distinguished
**Think of it like this**: When anti-aliasing first came out, how often were you using it to play games? Not much. We had to wait for new cards with more power to show up and allow it to be used without slowing things down.

It will be similar with Ray Tracing.
 

tran.bronstein

Prominent
Aug 21, 2018
While I am as outraged at the high RTX prices as other enthusiasts, I think it was disingenuous to actually believe there would be no improvement over the current GTX 1000-era cards. There will be some, for sure. It's just a matter of whether we consider that improvement worth the price. I sense people are hoping for the drastic 56% increases in FPS that the 1000 series brought over the 900 series, and NVIDIA, to be fair, has been saying all along that ray tracing is what they implemented to deliver that kind of leap. I agree with their pushing the technology forward, if not the exorbitant price they want to charge for it.
 

TJ Hooker

Titan
Ambassador

That's very contradictory. You say that ray tracing is important to Joe Walmart because all they care about is the best visuals, but these are high-end, expensive cards, marketed heavily around an immature technology with limited game support (ray tracing). That puts the cards firmly in the relatively small enthusiast/hardcore market segment that is willing to pay a premium for the best performance and being on the cutting edge, not the mainstream demographic who want midrange cards that offer decent performance and great bang for the buck.
 

Patrick_1966

Reputable
Jan 19, 2016


So what games that would benefit from ray tracing would casual gamers play that would justify CDN$2,500 (assuming current conversion to Canadian dollars, plus tax and GST) for a video card? The tournament gamers that gathered in Vancouver last week (an estimated 10 million, both in person and online) are the people who would want this, and this ends up being a really steep price for something that is really only aesthetics and doesn't improve the 4K experience and performance to a significant degree. I hope that I am wrong, but the two initial reviews I have seen suggest otherwise.
 
Let me ask you this, Hooker: how do you know it is only the hardcore/enthusiasts buying the ultra-high-end cards? Not to mention prebuilt PCs that will have these GPUs in them (in the future). Now, obviously, yes, most people are going to buy an xx60 or xx70 series GPU, but I believe there are quite a few regular "rich" people that'll buy the best of the best, simply from how popular gaming has gotten and how many streamers I've seen with ultra-high-end components (I would argue that not all streamers are hardcore gamers).

Also, we still have no idea what the final product is. As many have said already, this is basically a beta test of ray tracing: games aren't done being optimized, drivers aren't finished, yada yada yada.

Patrick, why do you call ray tracing "aesthetics"? ALL graphics quality settings are aesthetics. Some would easily say that 4K is stupid and is just for aesthetics. Ray tracing is like 4K in the sense that it isn't for everybody; it looks good, but it's a personal preference.
 

AlistairAB

Distinguished
May 21, 2014
Everyone is missing the main problem. I judge my video cards based on frames per dollar (can't wait to see Tom's review, as they do the same). In Canada, the 2080 is 87 percent more expensive, for 40 percent more frames per second.

This is the first time in history nVidia has released new products that are a worse value than their existing ones at normal game rendering. I'm actually in shock. Nobody should buy a single one of these cards, honestly. I don't mind if they add ray tracing abilities, but not if the fps per dollar has actually declined vs. the previous generation.

That's why they called it the 2080, to confuse people, as they could have called the chip anything they wanted to. It's not a 1080 successor when it is in an entirely different price bracket.
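For anyone who wants to sanity-check that, here's a quick sketch of the frames-per-dollar math. Only the +87% price and +40% performance figures come from my Canadian pricing above; the baseline price and frame rate are placeholder numbers:

```python
# Frames-per-dollar comparison. Only the +87% price and +40% fps
# figures come from the post; the baselines are placeholders.

def fps_per_dollar(fps: float, price: float) -> float:
    """Frames per second delivered per dollar spent."""
    return fps / price

baseline_fps, baseline_price = 100.0, 1000.0   # hypothetical GTX 1080 numbers
new_fps = baseline_fps * 1.40                  # 2080: +40% performance
new_price = baseline_price * 1.87              # 2080: +87% price (Canada)

ratio = fps_per_dollar(new_fps, new_price) / fps_per_dollar(baseline_fps, baseline_price)
print(f"fps per dollar vs last gen: {ratio:.2f}")  # ~0.75, i.e. ~25% worse value
```

The exact baseline doesn't matter; the ratio only depends on the two percentage changes.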
 


The difference between this and HairFX or even AA is that ray tracing here has dedicated hardware for the purpose. That alone allows it to surpass what existing hardware can do.

Even if nVidia had to turn off AA to get these results with DLSS, if the image quality is equal to or better than the best AA while giving 35% or better performance gains, is that not a good thing?

AA is a major performance killer. The good kind, that is. Sure, we could be like consoles and be happy with the low performance drag of FXAA, but that's not what we do. The best AA out there is good, but not perfect, and yet it drags performance down a ton. If you can get the same or better results without compromising performance, I say why not.
 
If FPS were the only thing that mattered to gamers we wouldn't even have modern consoles. Or textures. Or high def. We'd still be rocking pure vector graphics on our souped up oscilloscopes. But it turns out, gamers also like pretty pictures. When was the last time a game engine came out and didn't tout the new graphical capabilities? Has any game developer ever announced they were stripping out all the eye candy in order to maximize refresh rate?

All the way up until 2002, the best-selling PC game was Myst, which was released at a time when CD-ROM drives weren't yet widespread. Technology has to start at the enthusiast level before it can become mainstream. Will ray tracing take off? It already has at the commercial level, so I'd give it significantly better odds than the 3D or VR that a lot of companies are banking on.
 
Aug 22, 2018
Is this graph comparing the 2080 to the first 1080 with slower memory, or the refreshed version? I assume the first, because the 1080 Ti released with faster memory, and then the comparison wouldn't look so good. Second: how will the 2080 perform compared to the 1080 in FHD games?
 

mellis

Distinguished
Jun 17, 2011
Wow, I can't believe this card has finally come out. Hopefully I will now be able to play games in 4K resolutions. Wait, I can't afford it, therefore this new product is no good to me. I guess it will make more sense to get the latest console for 4K gaming, unless AMD can come up with something cheaper for 4K PC gaming. And I thought I spent a lot of money in the '90s on those two 3dfx Voodoo cards to be able to play games at 1024 x 768. 1024 x 768 back then reminds me of 4K today. In the '90s, you were doing good to play games at 800 x 600. Don't get me started about the '80s, when 320 x 200 was the greatest thing, minus any polygons. Lol!
 


The 1080 Ti is not a refreshed 1080. It's a larger GPU with more CUDA cores, on top of the additional memory.
 
Pfft. Once independent testers have their hands on these and can draw their own conclusions, then and ONLY then will people have something to consider over what they may already have in their rig. Nvidia (or any manufacturer) supplying their own numbers without independent verification is laughably weak. It's right up there with Intel claiming their 28-core processor was totally legit while it was running on an undisclosed industrial chiller.

Thanks but no thanks.
 

thrakazog

Distinguished
Aug 16, 2011
I made this comment elsewhere, but I'll make it here as well. They are claiming a 135% performance increase while showing a graph that gives absolutely no useful information. The graph shows that the 1080 goes to "1" while the 2080 goes to "2.35". The only thing this tells us is that the tallest bar for the 2080 is 135% longer than the bar for the 1080. Until we get third-party (independent) reviews showing actual performance numbers, all we can do is wait... and certainly not pre-order something we know nothing useful about.
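To make the point concrete, here's a quick sketch. The 2.35 figure is off Nvidia's chart; the baseline frame rates are made up for illustration:

```python
# A normalized chart only gives relative bar heights, never actual
# frame rates. The 2.35 is Nvidia's tallest 2080 bar; everything
# else below is hypothetical.

gtx_1080_score = 1.0
rtx_2080_score = 2.35

uplift = (rtx_2080_score - gtx_1080_score) / gtx_1080_score
print(f"Relative uplift: {uplift:.0%}")  # 135% -- and that's all the chart says

# Without an absolute baseline, 2.35x of an unknown number is still unknown:
for baseline_fps in (30, 60, 100):  # hypothetical 1080 frame rates
    print(f"1080 at {baseline_fps} fps -> 2080 at {baseline_fps * rtx_2080_score:.0f} fps")
```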
 


Congratulations. You've made it into the Top 10 Most Ignorant Posts of All Time category on Tom's Hardware. That's a tall order, because there are a LOT of them over the site's 20 years of existence. The funny thing is that there's a pattern here: just about all of them come from relatively new accounts, <3 years old, with only a handful of comments (like yours). Let me try and help you out here:

1) Ray tracing tech is the future. It can no longer be ignored, and it is NOT about increasing FPS or making you play better. It's about moving towards a more beautiful, near real-world visual gaming experience WITHOUT the kind of performance loss we saw with Crysis, the next level of gaming graphics that brought new generations of top-end video cards to their knees for YEARS.

2) If these figures are accurate, and I would take them with a grain of salt coming from Nvidia, I'd suggest you look at the 1080 FE's original launch price ($699). The regular (non-FE) 2080 is expected to be $699 and will still blow the 1080 FE out of the water, PLUS it brings new AA/ray tracing tech targeted at 4K. For background, Nvidia previously went with their own light-on-GPU AA tech called TXAA, which mimicked higher traditional AA settings without the performance loss of those settings.

3) If you think there is a single-GPU solution out there capable of running 4K at high quality settings consistently above 100 FPS, then you are out of your mind (by consistent, I'm referring to MINIMUM FPS). Not even two GTX 1080 Tis can accomplish that, and that's ASSUMING those games scale well with SLI (most do not these days). You have just demonstrated why people use the popular phrase "Better to remain silent and be suspected a fool than to speak and remove all doubt."
 

TJ Hooker

Titan
Ambassador
@TechyInAZ
I thought I remembered reading that the largest volume of sales of discrete GPUs was in the $150-$300 price range a while back (when AMD was promoting the first generation of Polaris cards), but unfortunately I can't find a source that provides actual numbers at the moment.

I will say, anecdotally, whenever I've seen the word "mainstream" used in the context of graphics cards by someone in the tech industry or press (which is fairly often), it is used to refer to cards with MSRP <=$300, maybe a bit higher. So in the Nvidia lineup, it'd be up to an [X]60, or occasionally an [X]70 depending on the price. So if we assume that the price range commonly referred to as mainstream really is what the mainstream gamer is buying, then anyone buying a high end card is not a mainstream gamer by definition (I'm treating "mainstream", "average", "casual", "regular" etc. gamer as being synonymous here).

Of course, it's entirely possible that the convention of calling $150-$300 cards "mainstream" is inaccurate and doesn't actually represent what the average person is buying.
 