Sticky

Nvidia Turing & Volta MegaThread


Welcome to the Volta & Turing MegaThread! I know it's strange to mash two different architectures into one thread, but given how similar the Volta and Turing GPUs are, plus the fact that Turing doesn't directly succeed Volta, it's much easier to discuss both architectures together in one thread.

VOLTA:

The Volta architecture was designed around the need for a cheaper and simpler way to power AI/deep learning compute tasks. Traditional deep learning workloads require vast server arrays that cost hundreds of thousands of dollars to build and tens of thousands of dollars a month to power.

Running these resource-hungry AI algorithms on regular CPUs and GPUs isn't the most effective way of processing AI, since general-purpose processors are designed to handle many kinds of tasks rather than one specific workload. This is where Tensor cores come into play: these cores are designed specifically for deep learning and machine learning compute tasks. The result is a core that runs FAR more efficiently and performs better than traditional CPUs (in AI computing); a single server with several Volta GPUs can now process as much data as a line of servers stretching from wall to wall in a large warehouse.
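
To make that concrete, the operation Tensor cores are built around is a fused matrix multiply-accumulate: half-precision (FP16) inputs multiplied together and summed into a full-precision (FP32) accumulator. Here is a minimal CPU-side NumPy sketch of that math; the tile size is illustrative, and real code would go through CUDA libraries or a deep-learning framework to actually run on the Tensor cores.

```python
import numpy as np

# Illustrative tile size; real Tensor cores work on small fixed tiles
# (e.g. 16x16x16 for FP16) and the GPU stitches them into larger matrices.
M, N, K = 16, 16, 16

A = np.random.rand(M, K).astype(np.float16)   # half-precision inputs
B = np.random.rand(K, N).astype(np.float16)
C = np.zeros((M, N), dtype=np.float32)        # full-precision accumulator

# D = A*B + C: FP16 multiplies accumulated into FP32, the fused
# multiply-accumulate a Tensor core executes as one hardware operation.
# (Emulated here on the CPU purely to show the arithmetic.)
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.shape, D.dtype)   # (16, 16) float32
```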

However, Volta didn't exactly take off as many expected; there are only two variants of the Volta core, a Quadro variant and a prosumer variant, the Titan V.


Titan V:

Price: (Around) $3000

CUDA Cores: 5120
Streaming Multiprocessors (SMs): 80
Texture Units: 320
Base Clock: 1200 MHz
Boost Clock: 1455 MHz
Memory Clock: 850 MHz (1.7 Gbps effective)
Memory Bus: 3072-bit
VRAM: 12GB HBM2
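
As a quick sanity check on the memory figures above (my own arithmetic, not part of Nvidia's spec sheet), peak bandwidth is simply the effective per-pin data rate multiplied by the bus width, converted to bytes:

```python
def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak VRAM bandwidth in GB/s: per-pin data rate times bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

# Titan V's HBM2 runs at 850 MHz and is double data rate -> 1.7 Gbps per pin,
# across a 3072-bit bus.
print(peak_bandwidth_gbs(0.85 * 2, 3072))  # ~652.8 GB/s
```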


TURING:

While Volta was focused almost exclusively on machine learning, Nvidia's latest architecture, Turing, shifts the focus back to the pure graphical performance of a GPU, this time in the form of real-time ray tracing.

Turing is a special architecture compared to all (known) previous Nvidia architectures: it has been in the making for over 10 years (yes, that means Nvidia has been developing Turing since the days of Fermi). You might be asking, is ray tracing really that important? Nvidia seems to think so.

Without going too deep, ray tracing is incredibly intensive to run on normal GPUs since it simulates actual light rays. It can take a day, several days, or even a WEEK to generate ONE ray-traced image. This is why the movie industry has been practically the only place to take advantage of ray tracing, since studios can afford to wait that long for a single frame.

Turing, on the other hand, can generate ray-traced images in REAL TIME, thanks to Nvidia's new RT cores and other optimizations for ray tracing.

This is why we are now seeing Nvidia pushing ray tracing into video games.
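
For a rough sense of why it is so heavy, the basic building block of any ray tracer is a ray/geometry intersection test, repeated millions of times per frame (at least one ray per pixel, plus shadow and reflection rays). Below is a purely illustrative ray-sphere test in Python; Turing's RT cores accelerate this class of work, ray-triangle intersection and bounding-volume traversal, in dedicated hardware.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    # Vector from ray origin to sphere center
    oc = [o - c for o, c in zip(origin, center)]
    # Quadratic coefficients for |origin + t*direction - center|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None         # hit must be in front of the origin

# One primary ray per pixel of a 1920x1080 frame already means ~2 million of
# these tests per bounce, before any shadow or reflection rays are counted.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```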

Of course, ray tracing isn't the only thing Turing and the RTX gaming cards are good at; these cards also get a big bump in CUDA cores compared to their Pascal predecessors and include an unspecified number of Tensor cores (most likely for Nvidia's new deep-learning anti-aliasing tech, DLSS).

So for the first time ever in a GPU, we have three different kinds of cores designed for three separate functions, all working towards one goal. Pretty interesting stuff.

For now, Nvidia has launched three RTX gaming cards, the RTX 2080 Ti, RTX 2080 and RTX 2070, plus three more Turing cards in the Quadro family for the enterprise space.


Note: Turing GPUs list support for GPU Boost 4.0 on the spec sheet, something Nvidia hasn't shed much light on (yet).



RTX 2080 Ti:

Price: Around $1150

CUDA cores: 4352
Base Clock: 1350 MHz
Boost Clock: 1545 MHz (1635 MHz OC on FE)
VRAM: 11GB GDDR6
Memory Speed: 14Gbps
Memory Bus: 352-bit
Memory Bandwidth: 616GB/s
TDP: 250-260W
Power Connectors: Dual 8-pin.
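
The bandwidth figures for the GDDR6 cards fall out of the same data-rate times bus-width arithmetic used for the Titan V above:

```python
# Peak bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte
print(14 * 352 / 8)  # 616.0 GB/s -- RTX 2080 Ti
print(14 * 256 / 8)  # 448.0 GB/s -- RTX 2080 and RTX 2070 (see below)
```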


RTX 2080:

Price: Around $750

CUDA cores: 2944
Base Clock: 1515 MHz
Boost Clock: 1710 MHz (1800 MHz OC on FE)
VRAM: 8GB GDDR6
Memory Speed: 14Gbps
Memory Bus: 256-bit
Memory Bandwidth: 448GB/s
TDP: 215-225W
Power Connectors: 8-pin + 6-pin (FE configuration; AIBs can use different configurations)

RTX 2070:

Price: Around $550

CUDA cores: 2304
Base Clock: 1410 MHz
Boost Clock: 1620 MHz (1710 MHz OC on FE)
VRAM: 8GB GDDR6
Memory Speed: 14Gbps
Memory Bus: 256-bit
Memory Bandwidth: 448GB/s
TDP: 175-185W
Power Connectors: 8-pin (FE configuration; AIBs can use different configurations)
  1. I find it odd they released the 2080ti alongside the 2080 and 2070... It just feels... Odd...

    Oh welp, let's hope it's just a coincidence and they'll still release something better for a refresh later on.

    Cheers!
  2. I think the 2080 Ti was ready for prime time because they delayed the release of RTX due to the 10 series still selling strong. I imagine they never really stopped R&D and continued until the Ti was ready, then launched them all together. This also might pace them well for the next wave, as AMD, while not really competitive right now, has 7nm in the pipeline. They can define ray tracing as the new direction and further entrench themselves in the market because AMD has no equivalent. Then, as AMD drops 7nm, Nvidia drops 7nm with ray tracing - advantage extended. Someone mentioned another reason, which is that the premium pricing of Turing GPUs props up the 10 series, which is still in demand because Turing is out of reach for a great many gamers. I'm sure after waiting and then seeing the prices for Turing, many people said to heck with that and bought a 10 series after all.
  3. Thanks to no competition from AMD they are able to quote the price of their GPUs as high as they want.
  4. King Dranzer said:
    Thanks to no competition from AMD they are able to quote the price of their GPUs as high as they want.


    But I think the question will be the usual "is it worth it or not?" If the 2070, for example, is capable of matching the 1080 Ti in classic games (no ray tracing, no DLSS), then 1080 Ti-level performance has officially been brought down from $700 to $500.
  5. If the 2070 performs at the level of the 1080 Ti, then yes. But the 2070 is not performing at the level of the 1080 Ti; it is only performing a bit better than the 1080, as per the leaks.

    If the 2080 performs nearly at the level of the TITAN V, then yes. But the 2080 is only performing a bit better than the 1080 Ti, as per the leaks and the NVIDIA chart.

    If the 2080 Ti performs at double (2x) the 1080 Ti, then yes. But the 2080 Ti is only about 50% more powerful, which is only 1.5x the 1080 Ti, as per the leaks.
  6. Why don't you ask the editor? He says just buy it. Removed
  7. The 2080 and 2070 might be expensive, but I'm not sure about the 2080 Ti. I think the majority of people want the 2080 Ti to be priced at the exact same price as the 1080 Ti while offering 50%-60% more performance. But the thing is, right now the competitor cannot even touch the 1080 Ti. There is no pressure for Nvidia to drop the 1080 Ti's price, so it is more logical for Nvidia to charge even more for the 2080 Ti. But ultimately, if Nvidia did price the 2080 Ti at $700, it would not be good for consumers down the road either.
  8. renz496 said:
    The 2080 and 2070 might be expensive but i'm not sure about 2080ti. I think majority of people want 2080ti to be priced at the exact same price as 1080ti while offering 50%-60% more performance. but the thing is right now the competitor canmot even touch 1080ti. There is no pressure for nvidia to drop 1080ti price. So it is more logical for nvidia to charge even more for 2080ti. But ultimately if nvidia did price 2080ti at $700 it does not good for consumer either down the road.


    The day the Pascal series is either out of stock or no longer being bought, NVIDIA will have to lower prices on the RTX series, especially the RTX 2080 Ti (to around $800; we cannot expect NVIDIA to drop its price to $700 at any given point). Till then we have to wait. All this is caused by two main reasons. First, AMD is unable to compete with NVIDIA. Second, but equally important, cryptocurrency mining, because of which NVIDIA had to delay the Turing launch by months just to clear already-manufactured Pascal GPUs. If both of these factors were resolved, the RTX 2080 and 2070 would have launched long ago at much more meaningful prices of $650 and $450 respectively, and the RTX 2080 Ti would now be launching for $750.
  9. The problem here is how you put the price into the wider picture of things. Currently, the price of the 2080ti (or anything above $400, really) goes to weeks (months?) of groceries, maybe a full rent or even a nice 3 day vacation somewhere with your family/friends. Out of those, you have to make choices on what you can live with and without. Given that, and being a tad obvious (needed at times), it seems like nVidia is touting the cards for people that:

    1.- Don't need to choose and can afford everything.
    2.- Have little understanding of life costs and/or priorities.
    3.- Are willing to get credit for one.
    4.- Need the extra FPS'es at all costs and/or the eye candy.
    5.- Are raging fanbois.

    There might be a few more, but that list is pretty damn good and it even applies to every single expensive piece of tech out there. Including Teslas!

    People need to take a stand here on what they consider affordable for themselves and what they're willing to put up with. I'm not going to put my hand in the fire for any big corporation, so I won't even give nVidia the benefit of the doubt on pricing. If people don't buy the cards, nVidia (and the whole chain behind it) will get the message instantly and get reasonable with pricing.

    Cheers!
  10. Yes, and trust me, I thought of that as well, but NVIDIA can get away with saying that the RTX 2080 Ti was made for the crowd who direct their budget toward TITAN cards and that there will simply be no Turing TITAN; or, even easier, they can simply produce fewer 2080 Ti cards. Only if people stop buying the RTX 2080 (which I doubt will happen) can we expect a price cut.
  11. Nah, let the price go sky high for a generation or two. If Nvidia drops the price for Turing after Pascal inventory depletes, that is still going to give a lot of trouble to AMD. AMD needs the room to put themselves in a better position first, and Nvidia charging crazy prices will give AMD that opportunity. That's why I said that if Nvidia had priced the 2080 Ti at $700 from the get-go, it would not be good for us in the long run either.
  12. until we have actual answers about direct comparisons between pascal and turing, nobody is gonna have the true answers
    if we get 30% and above 1080ti performance in rasterization then the card is definitely worth the price, if not, sticking with my 980tis lol
  13. Those prices are salty

  14. People are considering it to be the RTX 2080, but I can argue for it being the RTX 2070 based on the limited info given there. Both the RTX 2080 and 2070 pack 8GB of GDDR6 memory and have similar clock speeds. Why consider it the RTX 2080 when there is a possibility of it being the RTX 2070?
  15. Lowering expectation. Better to be surprised later than being disappointed.
  16. renz496 said:
    Lowering expectation. Better to be surprised later than being disappointed.


    That's a positive way to take in the hit. Well it works for me.

    We will have to wait for 2 more weeks to get exact info on the performance of cards.
  17. King Dranzer said:
    (snip)
    Why consider it RTX 2080 when there is a possibility of it being RTX 2070.


    I'd heard it said that NVidia hasn't given out any 2070s for review, so that makes it likely that the benchmark is for the 2080. At only marginally faster rasterized graphics speed than 1080 ti, that is disappointing if it is the 2080.
  18. mortemas said:
    King Dranzer said:
    (snip)
    Why consider it RTX 2080 when there is a possibility of it being RTX 2070.


    I'd heard it said that NVidia hasn't given out any 2070s for review, so that makes it likely that the benchmark is for the 2080. At only marginally faster rasterized graphics speed than 1080 ti, that is disappointing if it is the 2080.


    It has not been given out, but this could possibly be an internal testing leak. I am saying that because the possibility of it being the 2080 is as low as of it being the 2070. The case against it being the 2070 is simply that the 2070 has not been given out yet. The 2080, on the other hand, is only weeks away from launch, and I doubt it still lacks developed drivers that would display it as a 2080 rather than as "Generic VGA". The 2070, being still far from its launch date, could well be under testing on undeveloped drivers that show it as "Generic VGA".

    If this had been posted a few weeks earlier, I would possibly have agreed blindly. But presently the RTX 2080 is already being transported to retail outlets all over the world, so I expect it to have finalized, properly working drivers instead of under-developed drivers unable to identify it as a 2080. There is no reason for anyone to keep testing on older drivers when NVIDIA already has new drivers ready for the RTX 2080s being shipped.
  19. Yeah, could go either way. It's all just speculation. We'll find out when we find out.
  20. Here are my 2 cents in regards to all the speculation and rumors when it comes to the RTX line. I am neither for nor against the line and have a neutral stance on pricing. I'm really following every little piece of information regarding the RTX launch and performance leaks. Here are a few thoughts:

    1. Nvidia researched this technology for 10 years. They could have launched it last year, they could have launched it next year, but they decided to launch it this year. Considering that they skipped a year between GeForce releases shows me that they finally got it to a point where it will make an impact on the graphics market. It was not ready last year, otherwise we would have seen it (this also has to do with the availability and cost of GDDR6).

    2. Nvidia is doing pretty well as a company overall. So there is no desperate financial need to throw a product on the market that is not ready yet.

    3. Nvidia is very well aware that most players want to play games at 60fps+, and they know if they can't deliver this experience with the new technology that their loyal customer base would get a little upset

    4. Price: yeah, it sucks, but there are a lot of factors to consider. GDDR6 RAM, the second largest chip ever built (that costs extra money), 12nm technology (which gives lower yields than 14nm), new import duties on some electrical components, and the R&D that was invested over the last 10 years into Tensor cores and ray tracing. I don't love the high price, but I understand their pricing to some degree. Don't get me wrong, I would still prefer to pay $600 for an RTX 2080.

    5. Will ray tracing make gaming slower? Probably yes, but I speculate that the technology is good enough to deliver 60fps at least at 1080p and 1440p. As soon as there are optimized drivers and game developers have played around with the technology, it will get much better than the demos shown at Gamescom. Ray tracing will make games look significantly better, but if you play online competitively and you need 144fps, ray tracing might not be for you.

    6. Don't forget, if you play a game that uses the RTX chip for ray tracing to create the soft shadows or reflections (or both), the CUDA cores have a lot of extra resources for other stuff.

    7. The same will most likely be true for the new AI AA that the tensor core is supposed to do. If that turns out to look nice and game developers integrate it into more games, not doing AA with the CUDA cores will open up HUGE resources. Just play a game and turn off all AA and see how that affects your FPS.

    This is all just speculation, and it could turn out better or worse. But I just really think that Nvidia wouldn't throw a product on the market that does not deliver. Nvidia can't just live off preorders....

    Just be patient and wait until mid-September. Read the reviews.

    This all looks like I am very optimistic. Yeah, I don't have any reason not to be optimistic. But based on this I will NOT go ahead and pre-order a RTX without reading at least a couple of reviews when they come out.

    Again, take all performance leaks with a grain of salt. We're talking about unofficial benchmarks run without optimized drivers. NEVER underestimate optimized drivers.
  21. I just watched an HW News video by GamersNexus which included info on the overstock of 10-series GPUs. So it would make sense that Turing is so expensive, since Nvidia wants to sell through the remaining 10-series cards first.

    At least it isn't too big of an issue for now; it seems like (performance-wise) the 10 series is still very good even though the 20 series is coming.
  22. MERGED QUESTION
    Question from hmonkey20 : "What does everybody think about the RTX?"

    hmonkey20 said:

    What does everybody think about the RTX?


    I think until real benchmarks come out, everyone is hyping up ray tracing for nothing. Yes, it's a cool technology, but it will probably be a while before it's taken advantage of well.
  23. The most benefit is for game developers, not so much for gamers. For the foreseeable future, hybrid rendering is the way to go, though game engines need to adapt first before it can really shine. Just like the 4A devs (Metro series) said, what we see right now is still a naive implementation of RT.
  24. NVIDIA : RTX 2080 Ti, 2080 & 2070 Are ~40% Faster vs Pascal in Gaming
    By Khalid Moammer
    Aug 30


    https://wccftech.com/nvidia-rtx-2080-ti-2080-2070-are-40-faster-vs-pascal-in-gaming/?utm_source=dlvr.it&utm_medium=twitter
  25. Going for Turing over Pascal is only worth it if (at the launch prices of $499, $699, $999, and not the current pricing, which is high):

    RTX 2070 performs equal to or greater than TITAN XP
    RTX 2080 performs equal to or greater than TITAN V
    RTX 2080Ti performs equal to or greater than twice the GTX 1080Ti

    Unless that happens, Turing is a big fail on NVIDIA's part.
    With all the leaks we have at present, it is a fail.
  26. King Dranzer said:
    Going for Turing over Pascal is only worthy if(for launch price of $499, $699, $999 and not the current pricing which is high)

    RTX 2070 performs equal to or greater than TITAN XP
    RTX 2080 performs equal to or greater than TITAN V
    RTX 2080Ti performs equal to or greater than twice the GTX 1080Ti

    Unless that happens Turing is a big fail on NVIDIAs part.
    With all the leaks we got presently it is a fail.


    lol, not defending Nvidia here, but even if the 2080 Ti is "only" 40% faster than the 1080 Ti in traditional games, I can't see where Nvidia failed. Not when the competitor cannot even touch the cut-down version of GP102 (let alone fully enabled GP102). With Pascal, Nvidia in general still gave consumers a 30% performance increase each year, whereas on the AMD side you need to wait two years for that 30%. Now Turing will give us another 40% jump. Can we at least expect AMD to come up with a Vega 64 successor this year that is 40% faster? So Nvidia "fail" when they are the only one consistently giving consumers a performance boost every year? Right now Nvidia is competing with themselves, so very high prices are very much expected. I agree the prices are getting outrageous, but I would not call Nvidia a "fail" just because they are going crazy with the pricing.
  27. What defines a "win" it's always an interesting topic, but I think it kind of escapes the scope of this thread.

    You all know what makes you want to purchase a card or not.

    That being said, and going mostly on a hunch / assumption, I'd say it's better FPS per dollar spent at ~$300 for a LOT of people. Just look at the Steam survey stats: https://store.steampowered.com/hwsurvey/videocard/

    If nVidia wants to charge an arm and leg for the latest and greatest, they're most welcome to do so. Will they actually turn a profit that way? Well, looking at how expensive they are, they will. Is RTX good enough to make current 10-series owners upgrade? Hell no; period. Is the performance increase over the 10-series worth it? Hard to say for current 1080ti owners, really. I'm not one, so no idea.

    Cheers!
  28. renz496 said:


    lol not defending nvidia here but even if 2080ti "only" 40% faster than 1080ti in traditional games i can't see where nvidia fail. not when competitor cannot even touch cut down version of GP102 (let alone fully enabled GP102). with pascal nvidia in general still give consumer 30% performance increase each year where as on AMD side you need to wait two years for that 30%. now turing will give us another 40% jump. can we at least expect AMD to come up with Vega64 successor this year that is 40% faster? so nvidia "fail" when they are the only one consistently giving consumer performance boost every year? right now nvidia are competing with themselves. so very high price is very much expected. i agree the price are getting outrages but i would not going to call nvidia "fail" just because they are going crazy with the pricing.


    They failed to provide the performance increase they have consistently delivered for the past three generations (this statement only applies if the leaks of the RTX 2080 performing only 8% better than the GTX 1080 Ti are true). That is an unwanted hit for consumers.
  29. Compare the 2080 with the 1080, not the 1080 Ti. Look at the 780 Ti vs the 980: the gap is only around 10 to 15 percent. The gap only widened when more games were developed exclusively for 8th-gen consoles. And realistically, we can't expect a major performance boost every year either, not when smaller nodes are becoming very expensive and directly negating the reason for adopting a smaller node: to reduce cost. To increase performance, Nvidia can't rely only on node shrinks. They need to make a more efficient (in terms of IPC) architecture, and that is going to take a lot of R&D to make something better (Turing) than what many people consider the best architecture available for gaming workloads (Maxwell). AMD, for their part, rely heavily on node shrinks, and we can see how a node shrink can't even save the aging GCN. They need a drastic change in their architecture if they want to compete on even ground with Nvidia, not just adding stuff and minor tweaks to GCN.
  30. Frankly, I'll have to see actual performance, in a game I'm willing to play, before I make any decision about buying a new card. The ray tracing looks good, but I don't need it to enjoy the games I play. If I do pick up a 20 series card, it will be based on the frame rate. In other words, make my frame rate soar in 1440p gaming first, then we'll discuss pricing. Otherwise, I'll yawn and look away.
  31. So review on 14th?
  32. Since they go on sale on the 10th, it would make more sense to be the 10th? Wasn't it?

    Cheers!
  33. renz496 said:
    Compare 2080 with 1080. Not 1080ti. Look at 780ti vs 980. The gap is around 10 to 15 percent only. The gap only widen when more games being develop exclusively for 8th gen console. And realistically we can't expect major performanve boost every year either. Not when smaller node becoming very expensive and directly negating the reason of adopting smaller node: to reduce cost. To increase performance nvidia can't rely only on node shrink. They need to make more efficient (on terms of IPC) architecture. And that going to need a lot of R&D to make something better (Turing) than something that many people consider as the best architecrure available for gaming workload (maxwell). AMD for their part rely heavily on node shrink and we can see how node shrink can't even save the aging GCN. They need drastic change on their architecture if they want to compete on the even ground as nvidia. Not just adding stuff and minor tweak to GCN.


    Yes, comparing the 2080 to the 1080, the performance gain is not as huge as it has been for the last three generations. It should be the 2070 that is 8% more powerful than the 1080 Ti, not the 2080 (that was my unelaborated point above). We will have to see how AMD's node shrink levels things up.
  34. That AMD node shrink won't do much, if anything at all.

    If you ask me, even with nVidia jumping 10% from gen to gen, that nets a bigger jump than AMD due to the current gap and net gain from AMD.

    This is to say, AMD might have a chance to shrink the gap, but it won't be remotely close to close it nor even surpass nVidia's performance lead.

    Cheers!
  35. Price-wise, the 1080 Ti is almost half the price of the 2080 Ti, which offers only ~30% improved performance. Nvidia's new 2000 series is just too expensive because of a hardware-assisted "lighting effect."
  36. Nvidia Turing and Nvidia Pascal cards both drive HDMI 2.0 output. Both architectures will provide 3840x2160 video at 60Hz (12.54 Gbit/sec).
    I'm thinking a DirectX 12 game will play about the same on either architecture. I do not anticipate any reason someone would want to replace a Pascal card with a similar Turing card.
    Perhaps there will be some new games developed that will be able to max out HDMI 2.0 with Turing but will not perform as well on Pascal.
  37. goldstone77 said:
    Price wise, the 1080Ti is almost half price compared to the 2080Ti, with ~30%% improved performance. Nvidia's new 2,000 series are just too expensive, because of hardware assisted "lighting effect."


    That's the problem when Nvidia only have themselves to compete with: the best from the competitor cannot even touch the 1080 Ti, let alone a fully enabled GP102. Actually, I heard dedicated RT hardware was supposed to be part of Nvidia's GPUs maybe as far back as Maxwell, but back then it was not ideal to put such hardware inside their GPUs when AMD could still compete head to head with their solution; they needed all the die area they could get their transistors on to maximize rasterization performance. Right now Nvidia have both a performance-efficiency and a die-size advantage over AMD.
  38. truerock said:
    Nvidia Turing and Nvidia Pascal cards both drive HDMI 2.0 output. Both architectures will provide 3840x2160 video at 60Hz (12.54 Gbit/sec).
    I'm thinking a DirectX 12 game will play about the same on either architecture. I do not anticipate any reason someone would want to replace a Pascal card with a similar Turing card.
    Perhaps there will be some new games developed that will be able to max out HDMI 2.0 with Turing but will not perform as well on Pascal.




    The main target for Turing will mostly be those with Maxwell-based GPUs, not Pascal. Though for those that want a single-GPU solution that is much faster than a 1080 Ti/Titan Xp (and don't want to deal with SLI issues), Turing is the only answer. Also, Turing most likely inherits some Volta traits and is then optimized further for rendering performance; we have already seen that Volta fares much better with DX12 than Pascal. For games that are really optimized for the Turing architecture, the game might end up being very fast on Turing even without using DLSS to boost performance!
  39. renz496 said:
    truerock said:
    Nvidia Turing and Nvidia Pascal cards both drive HDMI 2.0 output. Both architectures will provide 3840x2160 video at 60Hz (12.54 Gbit/sec).
    I'm thinking a DirectX 12 game will play about the same on either architecture. I do not anticipate any reason someone would want to replace a Pascal card with a similar Turing card.
    Perhaps there will be some new games developed that will be able to max out HDMI 2.0 with Turing but will not perform as well on Pascal.




    the main target for turing will mainly for those with maxwell based GPU not pascal. though for those that want a single GPU solution that is much faster than 1080ti/Titan Xp (does not want to deal with SLI issues) Turing is the only answer to that. also Turing most likely inherit some of volta trait and then optimized for rendering performance more. we already see that Volta fares much better with DX12 than pascal. for games that really optimized for Turing architecture the game might end up being very fast on turing even without using DLSS to boost performance!


    So, I guess we are mostly agreeing with each other.
    Still, to further illustrate - I'm writing this reply on a PC I built in 2012 that has an Nvidia GTX 960 card that outputs HDMI 1.4a which gives me 3840x2160 video at 30Hz (6.18 Gbit/sec). I'm just thinking a 2018 Nvidia Turing card will not really seem significantly different from my 6 year old Nvidia GTX 960 card... until perhaps some new games come out that really push the Turing envelope. Because if new games continue to run really well on my 6 year old Nvidia card - I'm just not going to upgrade. I think it is going to take full HDMI 2.1 support with games that want to be run at 3840x2160 video at 120Hz to finally push me to upgrade.
  40. I will hold off on my opinion of the Turing series until the day I see official gaming benchmarks and head-to-head comparisons with the previous gen. I am also interested in the switch of multi-GPU bridging from SLI to NVLink and any improvement in performance scaling (if there is any).

    Going by the present leaks (which I don't trust completely), it is not going well for NVIDIA from a consumer point of view.
  41. @truerock maxwell v2 was barely 4 years old at this point not 6 (980 and 970 launch in september 2014, 960 in january 2015).

    About Turing, I'm more interested in the architecture improvements themselves rather than how much faster the 2080 Ti will be compared to the 1080 Ti. Is there any IPC increase vs the Maxwell/Pascal SM configuration? What benefit will Turing have in gaming applications from being able to run floating-point and integer operations at the same time? RT is just something extra that is new for a traditional GPU. Rather than the "shiny" and photorealistic graphics, I'm more interested in how RT will help to "clean" existing game engines in terms of coding.
  42. MERGED QUESTION
    Question from mushroom23 : "Should i get the 2080 TI"

    mushroom23 said:
    Right now I have a Titan Xp. Is there a big enough difference between them that it's worth upgrading?

    Thanks in advance for an answer


    wait until the review
  43. Do you think RTX 2080 TI would be able to run games on high at 2560 x 1440p x 2 @90 HZ (1 panel per eye) - Pimax "5K+" VR headset?
    Upscaling 1440p to 2160p x 2 (Pimax "8K") should have the same performance requirement, no?
  44. King Dranzer said:
    renz496 said:
    The 2080 and 2070 might be expensive but i'm not sure about 2080ti. I think majority of people want 2080ti to be priced at the exact same price as 1080ti while offering 50%-60% more performance. but the thing is right now the competitor canmot even touch 1080ti. There is no pressure for nvidia to drop 1080ti price. So it is more logical for nvidia to charge even more for 2080ti. But ultimately if nvidia did price 2080ti at $700 it does not good for consumer either down the road.


    The day Pascal series is either out of stock or not being bought anymore NVIDIA will have to lower its price on RTX series especially RTX 2080Ti(to around $800, we cannot expect NVIDIA to drop its price to $700 at any given point). Till then we have to wait. All this is caused by two main reasons. First being that AMD is unable to compete with NVIDIA. Second but equally important the Crypto-currency mining because of which NVIDIA had to delay Turing launch by months only to clear already manufactured Pascal GPU. If both of these factors were answered then RTX 2080 and 2070 would have launched long ago and at much meaningful price of $450 and $650 respectively and now RTX 2080Ti would have launched for $750.


    In 2016, there was no Ryzen; its arrival tossed Intel into panic mode. With 7nm in the upcoming mix, Nvidia is probably eyeing AMD closely. Nvidia doesn't own ray tracing, and just because they have a process for working with it doesn't mean it can't and won't be done better in the future. And as an aside, other than a very limited range of programs that can utilize it, who needs live ray tracing? Much like buying a VHS player... or "pre-ordering" one when there were only 4 movies you could buy for it... all insanely overpriced because they knew they could milk the greed and ego of those that just "had to be 1st".
  45. starborn63 said:
    King Dranzer said:
    renz496 said:
    The 2080 and 2070 might be expensive but i'm not sure about 2080ti. I think majority of people want 2080ti to be priced at the exact same price as 1080ti while offering 50%-60% more performance. but the thing is right now the competitor canmot even touch 1080ti. There is no pressure for nvidia to drop 1080ti price. So it is more logical for nvidia to charge even more for 2080ti. But ultimately if nvidia did price 2080ti at $700 it does not good for consumer either down the road.


    The day Pascal series is either out of stock or not being bought anymore NVIDIA will have to lower its price on RTX series especially RTX 2080Ti(to around $800, we cannot expect NVIDIA to drop its price to $700 at any given point). Till then we have to wait. All this is caused by two main reasons. First being that AMD is unable to compete with NVIDIA. Second but equally important the Crypto-currency mining because of which NVIDIA had to delay Turing launch by months only to clear already manufactured Pascal GPU. If both of these factors were answered then RTX 2080 and 2070 would have launched long ago and at much meaningful price of $450 and $650 respectively and now RTX 2080Ti would have launched for $750.


    In 2016, there was no Ryzen. Tossed Intel into panic mode. With 7nm in the upcoming mix, Nvidia is probably eyeing AMD closely. Nvidia doesn't own ray tracing and just because they have a process in working with it doesn't mean it can't and won't be done better in the future. and as an aside, other than a very limited range of programs that can utilize it, who needs live ray tracing? Much like buying a VHS player...or "pre-ordering" when there were only 4 movies you could buy for it....all insanely overpriced because they knew they could milk the greed and ego of those that just "Had to be 1st".


    Heh... Although I don't disagree with your main point of "early adoption fee", you need to take into account the silver lining of it: you DO need early adoption for any technology to flourish. You might not like nVidia, Microsoft or even their products, but Ray Tracing is a technology that has been around for a very long time. Seeing it being approached again is amazingly nice. It's what GPUs were created for: better eye candy. Otherwise, why the hell do we even want better GPUs for? Low polygon count should be enough, right? No lighting effects and everything should look like Minecraft with no shader effects, or not?

    In your example though, VHS didn't succeed because of its early adoption rate; quite the contrary. VHS was competing with BetaMax. Sony pushed BetaMax to compete against VHS, but VHS was way cheaper to produce due to it being licensed out, whereas BetaMax was Sony-only. BetaMax was technologically superior, but way more expensive. Now, here's an interesting parallel you can draw: how much more expensive is it to add ray tracing into games than traditional lighting techniques? Well, nVidia, if you remember the presentation, made it very very clear: "it is cheap". Why do you think that is, thinking back to "VHS vs BetaMax"?

    Being "better" doesn't mean you'll succeed over the cheaper alternative that is "good enough".

    Cheers!
  46. starborn63 said:
    King Dranzer said:
    renz496 said:
    The 2080 and 2070 might be expensive but i'm not sure about 2080ti. I think majority of people want 2080ti to be priced at the exact same price as 1080ti while offering 50%-60% more performance. but the thing is right now the competitor canmot even touch 1080ti. There is no pressure for nvidia to drop 1080ti price. So it is more logical for nvidia to charge even more for 2080ti. But ultimately if nvidia did price 2080ti at $700 it does not good for consumer either down the road.


    The day Pascal series is either out of stock or not being bought anymore NVIDIA will have to lower its price on RTX series especially RTX 2080Ti(to around $800, we cannot expect NVIDIA to drop its price to $700 at any given point). Till then we have to wait. All this is caused by two main reasons. First being that AMD is unable to compete with NVIDIA. Second but equally important the Crypto-currency mining because of which NVIDIA had to delay Turing launch by months only to clear already manufactured Pascal GPU. If both of these factors were answered then RTX 2080 and 2070 would have launched long ago and at much meaningful price of $450 and $650 respectively and now RTX 2080Ti would have launched for $750.


    In 2016, there was no Ryzen. Tossed Intel into panic mode. With 7nm in the upcoming mix, Nvidia is probably eyeing AMD closely. Nvidia doesn't own ray tracing and just because they have a process in working with it doesn't mean it can't and won't be done better in the future. and as an aside, other than a very limited range of programs that can utilize it, who needs live ray tracing? Much like buying a VHS player...or "pre-ordering" when there were only 4 movies you could buy for it....all insanely overpriced because they knew they could milk the greed and ego of those that just "Had to be 1st".


    There is no "probably". Nvidia is always watching AMD, and as a matter of fact Nvidia is very well aware of what AMD is capable of (there is a lot of evidence of this over the years). But because of this, you are probably never going to see AMD pull a "Ryzen" on Nvidia the way they did with Intel. About ray tracing, don't just look at it from the consumer perspective. For consumers the benefit is just somewhat better graphics, but for developers the benefit can be very big in terms of much easier game development and cleaner game engine code.
  47. If the rumor that the 20 series is 50% better in performance than the 10 series is true, only then can I argue to some extent that it is meaningful to price them this high, since they would be giving both a decent improvement in rasterized games and ray tracing on top. But if they are only 35% better in performance, then even ray tracing cannot justify the crazy pricing scheme.

    5 Days to go. Waiting for official benchmarks which can be trusted.
  48. King Dranzer said:
    If 20series is 50% better in performance than 10series(according to new rumors) is true only then I can argue to some extent of it being meaningful to price them high as they are giving both decent improvement in rasterized games as well as ray tracing in addition. But if they are only 35% better in performance then even the ray tracing cannot justify its crazy pricing scheme.

    5 Days to go. Waiting for official benchmarks which can be trusted.


    We want the 2080 Ti to cost the same as the 1080 Ti's MSRP while offering more performance, but the problem is the 1080 Ti still hasn't met a challenger to this date that is not from Nvidia's own lineup. Yeah, the price is going crazy, but this is simply how they work when there is no competitor. I for one think it is not all that bad, because this high price will be the main reason the competitor keeps competing in this market. Well, yeah, maybe some people hope Nvidia will push the performance to a much cheaper price point and hope AMD will react to it, so they can finally get their Vega at a much more affordable or even "steal" price. I know some AMD "fans" like to wait for this moment because to them AMD is their budget king, giving high-end performance for the price of mid-range, haha.