Get a GTX 970 and upgrade later when the AMD and Nvidia 16nm GPUs come out, or get the GTX 980 Ti now?

Apr 27, 2015
I am planning to upgrade my GPU and was undecided on whether I should get a GTX 980 Ti, because it looks reasonably future-proof with its 6 GB of VRAM.

However, recently I have been reading about the AMD 300 series and the Nvidia Pascal series of GPUs which will be coming out and are expected to be much faster (I know that Pascal is unlikely to come out until 2016).

As I only game at 1080p currently, would I be better off getting a GTX 970 for now to save my money, and then upgrading in maybe a year when both AMD and Nvidia have released their new 16nm GPUs?

I want to play The Witcher and GTA V at pretty much the highest settings at 1080p. In the next couple of years (max) I intend to move to gaming at 4K, which I appreciate the 970 would not be powerful enough for.

Thanks for your help in advance guys!
 

senseijtitus

I would say get either a GTX 970 or a 980 for now. Later on, when the AMD card is out, look at benchmark comparisons and user reviews for it and compare it with the GTX 980 Ti. Then decide on the best card at that time for any upgrade from there.
 
The AMD 300 series video cards are already here; they just have 200 series numbers on them. Yes, AMD is rebadging the entire 200 series line into the 300 series. From what we have seen so far, only the new AMD Radeon Fury will have new silicon, and since it's not going to carry a number, I think it's safe to say the entire 300 series will be rebadges. The mobile lineup was done last month, and now it is time for the desktop announcement.

As for what to buy, it depends on what you have now.

One year from now, roughly, both AMD and Nvidia will most likely begin releasing video cards on a smaller process node (likely 14nm or 16nm), and both companies have announced that HBM2 (second-generation High Bandwidth Memory, with stacked dies) will be used on at least some of those cards. That will give the cards that use it massive amounts of memory almost (but not quite) on the GPU itself, not to mention incredible memory bandwidth. But that is a year from now, possibly a bit later in the year.
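The bandwidth advantage is easy to sanity-check with rough arithmetic. The figures below are illustrative numbers typical of the era (a 256-bit GDDR5 bus at 7 GT/s versus four 1024-bit HBM stacks at 1 GT/s), not exact vendor specs:

```python
# Peak theoretical memory bandwidth:
# bandwidth (GB/s) = bus width (bits) / 8 * data rate (GT/s)

def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    """Peak theoretical bandwidth in GB/s for a given bus and data rate."""
    return bus_width_bits / 8 * data_rate_gt_s

# Typical GDDR5 card of the time: 256-bit bus at 7 GT/s
gddr5 = bandwidth_gb_s(256, 7.0)      # 224 GB/s

# First-generation HBM: four stacks, each with a 1024-bit interface, at 1 GT/s
hbm = bandwidth_gb_s(4 * 1024, 1.0)   # 512 GB/s

print(gddr5, hbm)
```

Even at a much lower per-pin data rate, the enormously wide interface is what gives HBM its bandwidth edge.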

So you get to choose: buy what's out there today, or wait a year and see what you think. I believe HBM is going to revolutionize the memory industry, and pretty quickly, because that memory works equally well with any processor. Intel and AMD could turn out to be the two largest memory vendors on the planet in two years' time because of HBM. But it comes in the processor package, not on memory sticks we plug into slots on our motherboards or modules mounted on a video card's circuit board. Things are a-changin'!
 
If you need a new GPU urgently, I would say get a GTX 970 or GTX 980 Ti, depending on your budget.
If you are not in dire need of a new GPU, I would rather wait, as the others have suggested.

In my opinion, we are now in a big graphics standard transition, from 1080p to 4K (like several years ago, when we started to move towards 1080p).
During such transitions, GPU and monitor tech advances faster than usual.
Waiting for later GPUs is thus usually the best option, if possible.
 


How did you come to that conclusion? AMD had a hand in HBM, but the effort was a joint one with Hynix. AMD was also involved in GDDR5's creation, but they never sold GDDR5 chips to any graphics card maker; instead, the memory was sold by established players in the market like Samsung, Hynix, Elpida, etc. Also, I don't see RAM sticks going away in favor of HBM on desktops or laptops.
 
If you're going to stick with a 60Hz 1080p display, the GTX 970 is the way to go; no need to wait. By the time you go 4K, the current hardware will be at least one, if not two, generations old, and what's available then should be better suited to 4K displays, without the large price premium that 4K-capable hardware currently carries.
Otherwise, I'd wait. As has been said, the 'new' R300 is pretty well a washout as far as tech goes, but if you're planning on spending that much, I feel it would be wise to at least see what the AMD Fury can do with new silicon and HBM before deciding to go super high end now.
The Fury should be out mid-June, which will give the supply chain time to get some better-cooled GTX 980 Ti cards into circulation; the reference cooler seems unable to cope with this card, with all the reviews I've seen so far commenting on high temperatures and noise. Either way, the wait will be worth it.
 
Solution
Thanks guys. That's all really helpful. I think I will get a GTX 970 at this stage and put some money aside for a bigger upgrade in a year or so when GPUs will be able to handle 4k better without such a huge price tag.

My HD 6950 is in desperate need of an upgrade, so the GTX 970 should give me the boost I need to keep going in the latest games until both AMD and Nvidia have released HBM cards down the line.

Cheers!
 

Intel keeps saying that Skylake, coming at the end of this year, has new memory technology. AMD has already announced that starting with their new Zen CPUs and SOCs next year, processors will have HBM. They have also said that HBM will help all processors. It will take time for the cost of HBM to come down, but within a couple of years you are going to see HBM almost everywhere.

I predict that Intel and AMD will become the world's largest memory manufacturers, simply because if you start dropping 16 GB to 64 GB of HBM on a processor, the vast majority of processors will have all the memory they will ever need. Xeons might need more, and they will find a way to do that: they will likely put 128 GB on the CPU package and then have DDR4 slots to expand that as needed.

 
As I said, just because HBM is included on the package does not turn AMD or Intel into a memory vendor; they will still need to buy or license it from a memory maker. By your definition, Nvidia will also become a memory vendor, because they are also going to use HBM starting with Pascal.
 
There won't be HBM memory sticks. The memory will be on the new processor packaging, like this shot yesterday of the new AMD Fiji GPU with four 1 GB stacks of HBM on the package. Those four rectangles next to that huge GPU are the four stacks of HBM:

[Image: AMD Fiji GPU package with four HBM stacks beside the GPU die]


By putting the memory on the GPU package, the Fiji card's length has shrunk to just 19cm. The exact same thing can be done with any kind of processor or SOC.

Right now, they are only using 4-layer stacks, and only four stacks on this GPU. AMD has said that 16-layer HBM is doable, and that more stacks per processor is very possible. So we could see 4 GB single stacks, and if they used eight of those, you would be looking at 32 GB of memory with an incredible 8192-bit-wide bus.
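The arithmetic behind those numbers is simple: each HBM stack exposes a 1024-bit interface, so capacity and bus width both scale with the stack count. A quick sketch (the 8-stack, 4 GB-per-stack part is the speculative configuration above, not an announced product):

```python
# Illustrative HBM configuration arithmetic.
# Assumption: each HBM stack exposes a 1024-bit interface.

def hbm_config(stacks, gb_per_stack):
    """Return (total capacity in GB, aggregate bus width in bits)."""
    return stacks * gb_per_stack, stacks * 1024

# Fiji as shipped: four 1 GB stacks
print(hbm_config(4, 1))   # (4, 4096)

# Speculated future part: eight 4 GB stacks
print(hbm_config(8, 4))   # (32, 8192)
```

Eight stacks at 1024 bits each is where the 8192-bit figure comes from.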

The HBM that AMD is using right now was made on a very old process, which makes sense because it's easier to work with for now. The process will shrink, and stacks will go higher fairly quickly. As HBM shrinks, more stacks per processor will become possible.

Imagine a motherboard where the CPU comes with 32 GB or 64 GB of HBM on the package. We would no longer need memory slots on that motherboard. That opens up a lot of new possibilities: more M.2 slots, smaller motherboards. All kinds of things could be added to those boards that engineers have not even attempted until now because of the incredible number of memory traces on them. Removing the four memory sockets would just open things up nicely.
 
Except that would kill flexibility, and probably affordability as well. What if the memory somehow became faulty or damaged? You would end up needing to replace your entire CPU because of that. Replacing a memory stick is a lot cheaper than having to replace the processor plus memory.
 
No. Stop and think about this for a second. Right now, a software designer has to worry about how much memory a person might have in their computer. And a whole lot of people only have 4GB.

Now if CPUs started having 16 GB or 32 GB of HBM on the package, the cost of the CPU with HBM would go up a little, but you would not need to buy memory sticks, and suddenly everyone has 16 GB or 32 GB, and software designers can begin planning programs that take advantage of that. This includes, but is not limited to, game designers. The possibilities with those amounts of memory in every computer are endless.

It also would mean that consumers and many businesses would no longer have to worry about memory upgrade problems because they would never need to upgrade. Compatibility problems would just cease to be a problem. Old memory vs new memory would just not even be a thought.

Now sure, Xeons are a different beast; some of those systems use terabytes of RAM. At least for now, I don't see HBM being able to do that, so there might well be some huge amount of HBM on the CPU package, plus slots for DDR4 in addition to that.

And finally, Intel and AMD would not be memory vendors in the sense of selling memory sticks, but they would be putting more memory on CPUs and SOCs than is being sold today by other companies.
 
