Will the 970's price drop because of the AMD 300 and Fury release?

xStampede

Distinguished
Jun 18, 2013
Hi, I need some input from experienced computer enthusiasts: how long does it usually take for prices to adjust after new releases?

How long should it take for the 970 to drop due to the new AMD releases?

 

oczdude8

Distinguished
We can only speculate at this point, but it seems highly unlikely. The new 300 series is just a rebadged version of the 200 series, and the R9 Fury is meant to compete more with the 980/980 Ti/Titan X. The 970 seems to remain pretty much uncontested in its category.

I doubt Nvidia will even lower its 980 Ti prices because of the new R9...
 

StarChief

Reputable
Jun 22, 2015
Why would the price drop due to a new AMD release? The price will drop due to age and new GTX products. Nvidia won't drop its prices because it doesn't have to. AMD, on the other hand, drops the prices on its hardware because it needs to stay competitive with Intel/Nvidia; that's why you saw the R9 2xx series drop in price upon the release of the Maxwell 900 series. It doesn't work the other way, unfortunately, at least not recently. AMD has been too far behind Intel/Nvidia, so they lower prices to compete.
 
Price drop? I doubt it.

As said, the 300 series is mostly a rebadged 200 series, and it sits at the price point of the GTX970. So for the same price you'll have to look at what is offered.

I would get the GTX970 over any other R9-290/290X/390/390X.

You can find a GTX970 for as cheap as $300 USD, so it's hard to recommend anything AMD offers until it's below, say, $250. The 8GB of VRAM on the R9-390/390X seems pretty ridiculous to me as well, but maybe someone will be laughing at me in a year and saying "see, you NEEDED more than 4GB of memory for the 390X".
 

xStampede

Distinguished
Jun 18, 2013
The AMD R9 390 has better DirectX 12 support and 8GB of RAM at the same price; it trades blows with the 970 at 1080p and beats it at higher resolutions. Once DX12 comes into widespread use, the 970 will fall far behind due to the Maxwell architecture's lower level of DX12 support. If consumers were informed, Nvidia would be forced to drop prices, because for gamers the 390 is the much better choice if you are looking forward.

Also consider that Nvidia has terrible drivers for older GPUs: people with a 770 or 780 are getting 30 fps in games with newer drivers because Nvidia is trying to force them to buy a 900 series GPU. The same will happen when Nvidia releases newer GPUs; they will gimp the 970 and make it slow with drivers.

I need Nvidia for CUDA in video editing and Blender, so it's a different story for me; if I were using it for gaming only, I would go with the 390 without a doubt. No contest.
 


Wow... where to start?

1. The R9-390 is NOT comparable to the GTX970 if you look closer. The new cards are the same GPU as last gen but are HEAVILY OVERCLOCKED at stock and thus have minimal overclocking headroom left.

Thus STOCK vs STOCK is not a fair comparison, as the GTX970 overclocks further. Here's a benchmark (couldn't find a good R9-390 review with lots of games): http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/30.html

Note how high the R9-390X scores compared to the R9-290X. Assume the R9-390 does the same and it should match the GTX970. Again though, the GTX970 also has overclocking left.

2. Better DX12 support?
R9-390/390X-> DX12_0
GTX970/980-> DX12_1
http://www.extremetech.com/extreme/207598-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver

Some people were confused by an AMD slide presentation that said "DX12 Tier 3" support, which is misleading.

3. Lower level of support for NVidia DX12?
I think I just proved that incorrect, though the advantages of DX12_1 over DX12_0 might not get used much (a quick way to check what a card actually reports is sketched below this list).

4. NVidia poor support for older GPUs?
Proof?
I have a GTX680 and have tested over 300 games with NO PROBLEMS to speak of. Most of the community seems to agree that NVidia is now ahead on driver support, probably due to AMD's financial problems.
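If anyone reading wants to verify this on their own card rather than take my word (or a vendor slide) for it, here's a rough, untested C++ sketch against the public D3D12 API that probes the highest feature level the default adapter reports. D3D12CreateDevice fails when the hardware can't meet the requested minimum level, so we try 12_1 first and walk down:

```cpp
#include <windows.h>
#include <d3d12.h>
#pragma comment(lib, "d3d12.lib")

D3D_FEATURE_LEVEL HighestSupportedLevel()
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_1, // what Maxwell 2.0 (GTX970/980) reports
        D3D_FEATURE_LEVEL_12_0, // what GCN (R9-290X/390/390X) reports
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };
    for (D3D_FEATURE_LEVEL level : levels) {
        ID3D12Device* device = nullptr;
        // nullptr adapter = system default GPU; creation succeeds only
        // if the hardware supports at least the requested level.
        if (SUCCEEDED(D3D12CreateDevice(nullptr, level,
                                        IID_PPV_ARGS(&device)))) {
            device->Release();
            return level;
        }
    }
    return D3D_FEATURE_LEVEL_11_0; // no D3D12-capable adapter found
}
```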

Other:
We also didn't discuss power, MFAA, H.265, HDMI 2.0 support, voxel lighting, HUD support, and PhysX.

*So I'm not sure where you're getting your information, but you really aren't correct.
 

xStampede

Distinguished
Jun 18, 2013


1. I agree with you on point 1 for 1080p; an overclocked 970 should do better overall.

2. That article is extremely biased, but the table showing support levels is correct. The most important feature for games is resource binding: AMD's current GCN architecture has tier 3 support while Nvidia's Maxwell has only tier 2, which should give AMD cards around a 5-25% advantage over the Nvidia 900 series in actual games on Windows 10 with DX12.

4. I was talking specifically about the 780 and 770 driver issues many of their users are experiencing; one example from the Tom's Hardware forum:
http://www.tomshardware.co.uk/answers/id-1931735/gtx-780-low-fps.html

I have had both Nvidia and AMD GPUs over the last 10 years; a GTX 580 and an R9 280X are the most recent examples. In general, AMD works better all around and in less famous games, while Nvidia works better in famous games that have great Nvidia optimization.

I trust the AMD drivers more because they never caused the fans to turn off like the Nvidia drivers that killed thousands of Nvidia GPUs.

You can read more about that in this article, "WARNING! NVIDIA 196.75 drivers can kill your graphics card":

http://www.zdnet.com/article/warning-nvidia-196-75-drivers-can-kill-your-graphics-card/

The main thing I don't like about Nvidia is its compression and less detailed specifications like GFLOPS and bus speed. Nvidia gets better performance by lowering detail and compressing textures, which decreases quality to the point where the eye still can't tell the difference. I don't mind using a little more power and electricity for the real thing without compromises.
 
Too lazy to read all the responses in this thread, but I don't think the 970 price will drop because of the 390/390X or Fury/Fury X. The reason is that lowering the price of the 970 would make it a better deal to SLI two 970s than to have one 980 Ti, and Nvidia pays attention to such things. The only way that would happen is if competition forced the 980 Ti price down and the 970 went down in response. For instance, if the price of the 980 Ti went from $650 to $600, then the 970 might go from $325 to $300. Get it? The discount would effectively be half of what is seen.
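A quick back-of-the-envelope version of that logic, with my illustrative numbers (obviously not Nvidia's actual pricing model):

```cpp
#include <cstdio>

int main()
{
    // Constraint: two 970s in SLI shouldn't undercut one 980 Ti,
    // so the 970 roughly tracks half the 980 Ti's price.
    double ti_before = 650.0, ti_after = 600.0;
    double drop970 = (ti_before - ti_after) / 2.0; // half the Ti discount
    std::printf("970: $325 -> $%.0f\n", 325.0 - drop970);
    // 2 x $300 = $600 still matches, rather than undercuts, the new Ti price.
    return 0;
}
```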
 
Nvidia won't drop the 970's price unless AMD prices the newly released 390X below the $300 mark. That's the reality when it comes to AMD vs Nvidia. If anything, AMD would like Nvidia to price its stuff high so they can get better margins. Nvidia was well aware of this, hence they released the 980 Ti ahead of time with that $650 price tag to pressure AMD. Some said that move would kill the Titan X, but I think Nvidia has its own plan: instead of making tons of Titan X cards, they can now put most full-fledged GM200 chips towards the Quadro lineup. The ones that don't make the cut will go towards the 980 Ti.
 
xStampede,
If you want to PM me we can discuss this further; however, your knowledge of the subject is very skewed.

For example, you admit that the AMD series under discussion supports DX12_0 and the NVidia cards DX12_1, but then state that the AMD cards support "Level 3" and NVidia only "Level 2", which makes absolutely no sense.

If you support DX12_1, then DX12_0 is a SUBSET of that, so you support everything it does plus more features.

(I did try to explain that AMD has a misleading slide stating "TIER 3" for DX12_0 cards, which got people confused.)

And for VIDEO DRIVERS: if you do some research, it's generally agreed that AMD is behind overall. I can provide lots of links to reputable sites, but I'm not sure that will change your mind.

*And the "main thing" you don't like about NVidia is texture compression to increase bandwidth?

First of all it's LOSSLESS which means you can't see the difference in quality. The point is they can make a cheaper card that does the same thing. And it's not only NVidia that does this:

"This isn't actually entirely new, we just saw something similar on AMD's launch of the Radeon R9 285 (Tonga) GPU. The Radeon R9 285 introduced something called: "Lossless Delta Color Compression." The Radeon R9 285 with a 256-bit memory bus matched the performance, and in some cases a little faster, than the AMD Radeon R9 280 which has the same GPU specs, but a faster 384-bit memory bus. AMD's Delta Color Compression allowed the R9 285 with a narrower bus to perform like that of a wider 384-bit memory bus.

NVIDIA is doing something similar to AMD’s Lossless Delta Color Compression. NVIDIA is using its third generation Delta Color Compression. NVIDIA's Delta Color Compression is also lossless. Delta Color Compression was actually present all the way back to Fermi, but it was limited in capability. Maxwell improves effectiveness by offering more choices of delta calculation to the compressor. Delta Color compression reduces the memory bandwidth needed. It has major results on R9 285, and we are seeing that it has evolved on Maxwell."
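To make "lossless" concrete, here's a toy C++ sketch of plain delta encoding. This is only the general idea behind schemes like Delta Color Compression, not AMD's or NVIDIA's actual tile-based hardware implementation:

```cpp
#include <cstdint>
#include <vector>

// Store each pixel as the difference from the previous one. Smooth
// gradients produce runs of tiny deltas that pack into fewer bits,
// yet decoding reconstructs every original byte exactly.
std::vector<int16_t> DeltaEncode(const std::vector<uint8_t>& pixels)
{
    std::vector<int16_t> deltas;
    deltas.reserve(pixels.size());
    int prev = 0;
    for (uint8_t p : pixels) {
        deltas.push_back(static_cast<int16_t>(p - prev));
        prev = p;
    }
    return deltas;
}

std::vector<uint8_t> DeltaDecode(const std::vector<int16_t>& deltas)
{
    std::vector<uint8_t> pixels;
    pixels.reserve(deltas.size());
    int prev = 0;
    for (int16_t d : deltas) {
        prev += d; // exact inverse of the encode step
        pixels.push_back(static_cast<uint8_t>(prev));
    }
    return pixels;
}
```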
 
Quite the AMD fanatic, aren't we, xStampede? If you are going to bring up the driver issue from five years ago, why not mention that Nvidia instructed its AIBs to replace any damaged cards? And while you are at it, why don't you also mention the GSOD/vertical line issue that plagued the 5xxx series?
 

sz0ty0l4

Distinguished
There seems to be big confusion about DirectX 12. D3D feature levels are not the same as DirectX point updates: DX12.0 is not the same as DX12_0. Direct3D is a part of DirectX; it is responsible for the rendering features in DX12 and consists of header files that contain predefined functions, values, etc. Depending on the physical GPU architecture devs work with, they can use different rendering features in DX12. The whole point of these features is to create a much better and more accurate graphical view under the DX12 API: more lifelike shadows, reflections, and so on.
There are two 12_1 features; you can read about both here:
https://msdn.microsoft.com/en-us/library/windows/desktop/dn903929(v=vs.85).aspx
https://msdn.microsoft.com/en-us/library/windows/desktop/dn903791(v=vs.85).aspx

The GTX 970 supports ROVs and tier 2 conservative rasterization, so it's 12_1. Current AMD cards are 12_0.
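For anyone who would rather query than argue, here's a hedged, untested C++ sketch that reads exactly the caps being debated in this thread through the documented CheckFeatureSupport API (it assumes `device` is an ID3D12Device you've already created, e.g. via the probe earlier in the thread):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

void PrintDebatedCaps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts))))
        return;

    // Resource binding tier is a caps bit, not a feature level:
    // GCN reports TIER_3 here while Maxwell 2.0 reports TIER_2.
    std::printf("Resource binding tier: %d\n", opts.ResourceBindingTier);

    // The two 12_1 requirements mentioned above:
    std::printf("ROVs supported:        %d\n", opts.ROVsSupported);
    std::printf("Conservative raster:   tier %d\n",
                opts.ConservativeRasterizationTier);
}
```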

By the way, I think a Maxwell 2.0 card is a better investment for any gamer. It's a great architecture with tons of untapped performance even for the average customer. There is actually no sense in comparing a GCN card to a Maxwell 2.0 card at stock clocks; that's a totally illogical decision by reviewers.
Just look at the situation with the Fury X and the GTX 980 (not Ti). The Fury X can gain 5% performance at maximum, while the 980 can gain around 20-25% on a closed loop. Even the hybrid Asus ROG Poseidon card can do 1600MHz, where it's roughly on par with an overclocked Fury X and gets close to a 295X2 in gaming performance; at even higher clocks it will outperform the Fury X.

The 300 series is even worse. AMD just increased the core clock and wrote DirectX 12 support on the box of the 290X, and yes, the 290X already supported DX12 before the rebrand. :D :D

There are no AMD cards that support D3D feature level 12_1.
 

StarChief

Reputable
Jun 22, 2015
Guy buys an HD series AMD card. Card eats 400W, heats up to 85C, can't play utoob videos. Guy looks for drivers; drivers blue-screen the PC repeatedly when installing. u wot m8? Guy trades in the AMD HD card and buys the mighty R9 290X, goes home excited. Guy plugs the card in; all the issues and more persist. Guy throws the 290X into the sun and buys a GTX 970. Guy has had no problems ever since.

Plot twist

guy is me.
 


Welcome to the dark side. :ange:
 


Customers have already cast their vote; just look at the market share between the two. Some of these issues are old and quite well known to the public, but despite that, people still continue buying Nvidia cards.

People can point at Nvidia's flaws, but do people try to do the same when looking at AMD? Regarding Project Cars, the dev clearly mentioned the lack of cooperation from AMD, which AMD did not deny at all. If that weren't the case, AMD would have come out with a public statement denying what Slightly Mad Studios claimed. They did it during the Batman: Arkham Asylum fiasco, so why didn't they do the same this time? Even with The Witcher 3, Richard Huddy dared to say both Nvidia and CDPR were sabotaging AMD, but why not Project Cars? GameWorks crippling AMD? But when Dirt Showdown came out, did we ever hear rage about AMD crippling Nvidia performance?

Take something like G-Sync: while proprietary, Nvidia put in the effort to properly support the tech, and it's the same with their other proprietary tech. Can the same be said of AMD? Look at their 3D support. For games that have no built-in HD3D support, it is up to the monitor maker to provide the driver; AMD's stance is that they want monitor makers to compete among themselves to provide the best driver. But what happens if the monitor maker of your choice goes out of business and can no longer provide driver support? With Nvidia 3D Vision there is no such worry, because all the drivers are provided by Nvidia themselves. And I'm already seeing something similar brewing with FreeSync.
 

xStampede

Distinguished
Jun 18, 2013
There is no doubt that at the moment Nvidia's Maxwell architecture is superior to any other. We still have to see how Windows 10 and DX12 will impact the performance of AMD and Nvidia GPUs.

Microsoft and AMD have collaborated a lot lately on the Xbox One, and DX12 took a lot from Mantle. Most insiders are saying that resource binding is the most important feature for gaming; the rest is mainly for After Effects, Blender, and similar CAD and video editing programs.

The table shown on the website you linked doesn't even show AMD's latest GCN 1.2; either way, you can see that GCN 1.1 has resource binding tier 3 while Maxwell has only tier 2.



We won't have a definite answer on that until Windows 10 and DX12 are out, with games made for DX12.
 

razamatraz

Honorable
Feb 12, 2014


AMD basically raised prices on the same old cards and changed the 2s to 3s, so if anything, Nvidia could raise prices.