Nvidia Unlikely To Unveil 2018 Graphics Cards At GDC, GTC
Anyone waiting for a big Nvidia news blowout at the Game Developers Conference (GDC) or Nvidia's own GPU Technology Conference (GTC), both coming up later in March, will probably be disappointed. We’ve learned from multiple independent sources that, apart from a possible announcement and a rather vague “appetizer,” nothing concrete about Nvidia’s next-gen gaming graphics products is likely to be revealed at these shows. Rumors that these venues would host the big reveal of the 2018 cards succeeding Nvidia’s current Pascal lineup should be put to rest.
It does appear, though, that Nvidia will continue to roll out its Volta architecture while maintaining a clear separation between silicon meant for work and silicon meant for play. Word from server-industry circles already indicated that Ampere, pegged as the successor to Volta, will not (at least initially) spawn any consumer offshoots. Instead, gamers will feast on cards built on the rumored Turing architecture.
That strategy makes some sense; there are advantages to maintaining two separately optimized architectures. On the consumer side, certain features aimed at professional workloads can be dispensed with from the outset, which makes development and production significantly cheaper. The era of all-in-one architectures, which manufacturers could reconfigure at will by laser cutting or feature activation, is likely over (at least at Nvidia).
What remains for gamers? AMD just isn’t putting enough pressure on the market with Vega right now to force Nvidia’s hand. And if you leave the crypto-mining sector out of the equation, Nvidia has little impetus to push innovation on the gaming side. This squares with our information that Nvidia’s Turing launch has apparently been pushed back. Board partners were expecting to receive Turing specifications by now so they could start creating their bill of materials (BOM) and begin development and testing, but they likely won’t get even that until May. The reason why (and whether the mining scene plays a role) remains speculation for the time being.
Tidbits we’ve heard from various other sources with knowledge of the situation also suggest that the mass production of Turing cards will not start until mid-June, and thus a hard launch of board-partner cards is not expected before July. This should rule out a rollout even at Computex in Taipei in June, although we may see some non-functioning mockups.
It looks more likely that the big unveiling of Turing-based partner cards will come at Gamescom in August, at which point gamers will finally be able to get their hands on them--if the crypto miners don't eat up the supply straight from the factory.
Of course, there are still plenty of question marks and details that could change, but the puzzle pieces we’ve gathered from around the industry are forming a clearer, fuller picture. We suspect that by July we’ll see a next-gen GeForce launch, and all will be made clear.
King_V: I'll admit it - even if there's no incentive for them to push faster, I was hoping Nvidia would tease us with something graphics-card related, at the very least.
redgarl: I doubt we are going to see these cards before October. Nvidia is already making so much money off the generic stuff.
Also, the new cards are supposed to use HBM2 or a new memory interface... with the current situation, the prices are going to be insane once you add the Nvidia tax (they love to price their products with a healthy profit margin).
A Vega card is already selling for 2-3 times its MSRP; I can't expect things to change anytime soon.
Jeremy Griner: What they need to do is make a sub category of cards for the asshats who are into mining. That way when it comes time for the majority of the world to upgrade their PC we are not having to pay 1000 dollars for a 600 card. I am sure I am not the first person to have thought of this. But either way I said it lol.
spdragoo:
20753386 said: "Why are they called 20 and not 11? Has Nvidia confirmed this?"
Don't think it's official yet -- Techspot's article only has the 20-series numbers in its title, but the body of the article indicates they could be the 11-series or the 20-series (https://www.techspot.com/news/73456-nvidia-new-geforce-cards-rumored-arrive-next-month.html).
Although, while nVidia & AMD both have a history of "restarting" their number sequences, historically they tend to lower the numbers for the newer sequence -- i.e. the top-line card in the generation preceding the GTS 150 was the 9800 GTX/GTX+. And while it kind of made sense when they skipped over 800-series GPUs (used in laptop models but not the desktop ones), I don't see any purpose in moving from 1000-series to 2000-series...unless they're hoping the unwashed masses will assume they're twice as powerful ("2000 is 2 times as big as 1000!!")...
TJ Hooker:
20753374 said: "What they need to do is make a sub category of cards for the asshats who are into mining. That way when it comes time for the majority of the world to upgrade their PC we are not having to pay 1000 dollars for a 600 card. I am sure I am not the first person to have thought of this. But either way I said it lol."
You're correct in that what you're suggesting has been said 100 times before. But AMD has already said that a large part of the supply issue (and the associated price gouging) has to do with VRAM shortages, meaning that creating a separate line of mining GPUs wouldn't help anything.
kenzen22b: More time that computer builders will have to endure the pain of high prices and lack of supply.
With supply channels empty for the 10-series, you would like to believe that Nvidia will start building the new cards sooner rather than later (both to make money and to keep their customers happy).
InvalidError:
20753374 said: "What they need to do is make a sub category of cards for the asshats who are into mining."
That won't make anywhere near as much of a difference as you think it will: most alt-coins are memory-intensive, and memory is currently the main bottleneck to increasing GPU production. If you want alt-coin miners to buy your crypto-oriented CPU/GPU, you need to make it more cost-effective than GPU-mining. If you do that, though, pressure on the RAM supply will get even worse. If you restrain alt-coin miners' memory supply to increase GPU production, alt-coin miners will go back to GPU-mining. If you sink more RAM into alt-coin miners, then you still have no RAM to make more GPUs.
Rinse and repeat for every type of component common to GPUs and alt-coin miners. The only thing that can 'fix' this is popping the crypto bubble.
Giroro: At this point I'm expecting Nvidia to release a new line of rebranded Pascal cards to take advantage of the current market, with the only major difference being a higher MSRP.
Giroro:
20753468 said: "Rinse and repeat for every type of component common to GPUs and alt-coin miners. The only thing that can 'fix' this is popping the crypto bubble."
Or we teach miners how to use the big FPGAs that Intel is marketing for AI use, as those have more than enough memory to mine "FPGA-resistant" crypto. Sure, a dev board is over $2k, but so are some GPUs.