AMD " FURY X " coming with 4gb HBM ## own - goal??

sz0ty0l4

Distinguished
So I've just recently read that the new AMD flagship card, which is advertised for 4K gaming, is coming with 4GB of HBM. As I understand it, the whole point of extending memory bandwidth is to provide a performance advantage at higher resolutions.

Now here is the issue: in 4K gaming, all modern (2014-2015) games [NOT BF4, before anyone starts - that's 2013!] already use more than 4GB of VRAM, usually around 5-6GB or even more. Honestly, you can push games like Watch Dogs to take up to 10GB of VRAM with 8x MSAA at 4K resolution. Nvidia has already realised this, and that's the reason their 4K gaming cards (980 Ti 6GB, Titan X 12GB) come with sufficient VRAM.

What's actually the point of this design flaw? Is it intended, or is it the result of bad communication between AMD and game devs? Maybe an engineering issue in the implementation of HBM? (my best bet)

PC gamers are usually smart; by this I mean that 80% of PC builders understand hardware and software and know what they are doing when it comes to building PCs. So why would these smart people invest in a 4K gaming card with high bandwidth but an insufficient memory buffer? In reality this drops the Fury X back to being a 1080p gaming card, where the HBM memory is totally redundant.

This topic is for sharing your opinions on this matter in a civilised way. I'm greatly interested in everyone's opinion, but please no personal attacks or hardcore fanboyism.

I will also share a "leaked" synthetic benchmark about the card:
http://wccftech.com/amd-radeon-fury-x-fiji-based-graphics-cards-synthetic-benchmarks-revealed/
 
As I understand it, HBM is still new and thus expensive, so I guess this is a way to keep costs down. It might be a bit on the short side for 4K gaming, but you have dual/triple-monitor 1080p and you have 1440p, so there are a lot of possibilities to put it to good use.
 

Azn Cracker

Distinguished
4GB is not enough in some cases, but AMD uses compression to lower the need for memory. It worked fine on the 2GB R9 285 and it will work on the Fury. The compression probably won't magically make it match 12GB of RAM, but it will help.

Also keep in mind that HBM is a lot faster, so data held in VRAM can be moved and processed more quickly.
 

Dan414

Reputable
Dec 24, 2014
Seems like 4GB is just a starting point; as HBM v2 comes out with higher densities in the stack, they will be able to increase memory without changing the layout of the card. Currently, the bandwidth the GPU can handle seems to be the limiting factor: more than 4GB of memory would be wasted, and the GPU would need double the stream processors in order to make use of 8GB. However, since it's more efficient, you could imagine more dual-GPU implementations, which might be useful in home-theater settings and smaller builds, which is where VR is more likely to take hold.
 

sz0ty0l4

Distinguished
HBM is around 3x faster than GDDR5, but that won't offset the lack of VRAM capacity, because the performance decrease from having insufficient memory comes from the memory transfers between RAM and VRAM, which are initiated by the CPU, not from the GPU's memory interface. If the VRAM is insufficient, swapping will occur and performance will decrease: so-called "stutters" and high frame-time spikes will happen.

@RobCrezz The 8x MSAA in Watch Dogs at 4K was an extreme example for the 10GB VRAM usage. Most games at 4K without any kind of AA already use 5-6GB of VRAM; they even use 4.5-5GB at 1440p. You can check for yourself in YouTube benchmarks.
To back up my "statements":

1440p: https://www.youtube.com/watch?v=WE9kE1TSrBU [fxaa ~ nearly 0 impact on performance or vram]
4k: https://www.youtube.com/watch?v=lxzYoFWXn9Q [AA off]
4k: https://www.youtube.com/watch?v=ofSfLtD-0oQ [AA off]

@Dahotshot As I said already, the increased memory bandwidth has nothing to do with the performance decrease when running out of VRAM. The advertised memory interface width and bandwidth sit between the GPU and the VRAM; when you run out of VRAM, the swapping happens between RAM and VRAM, which is initiated by the CPU, not the GPU.
But if my conclusion is wrong, please correct it.
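The point about the slow leg of the copy can be put in rough numbers (assumed round figures, not measurements: ~512 GB/s for the Fury X's HBM, ~16 GB/s for a PCIe 3.0 x16 link):

```python
# Back-of-envelope: why spilling past VRAM stutters regardless of HBM speed.
# Assumed round numbers: Fury X HBM ~512 GB/s, PCIe 3.0 x16 ~16 GB/s.
def transfer_ms(size_gb, bandwidth_gb_s):
    """Milliseconds to move size_gb at the given bandwidth."""
    return size_gb / bandwidth_gb_s * 1000

spill = 1.5  # GB of texture data that no longer fits in the 4 GB buffer

on_card  = transfer_ms(spill, 512)  # GPU <-> HBM: ~3 ms
over_bus = transfer_ms(spill, 16)   # system RAM <-> VRAM over PCIe: ~94 ms

print(f"on-card: {on_card:.1f} ms, over PCIe: {over_bus:.1f} ms")
# ~94 ms is several whole frames at 60 FPS (16.7 ms/frame): a visible stutter.
# The wide HBM interface never enters this slow leg of the copy.
```

However fast the on-card memory is, the spill traffic always crosses the much narrower expansion bus, which is why extra bandwidth cannot substitute for missing capacity.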
 
Solution
4GB is 4GB. Period. There is no such thing as excessive bandwidth making up for a lack of VRAM; if that were really the case, high-end cards would be equipped with much less VRAM than mid-range and low-end ones, because they have higher bandwidth. What AMD has been talking about is how to manage the VRAM usage, probably via drivers. AMD never claimed that HBM will make up for the lack of VRAM. As for the OP's question, the current 4GB limitation is because of a limitation of HBM (v1) itself, not because AMD did not want to put more VRAM on the card. It is possible to have 8GB of VRAM using certain workarounds (like the dual interposer suggested by many), but I heard that using such a method will introduce latency and cut the bandwidth in half (defeating the purpose of using HBM in the first place).
 

RobCrezz

Expert
Ambassador


Usage doesn't = needed. Many games and programs will fill VRAM if it's free; that doesn't always mean they need it.
 
Very true. As an owner of a 660 SLI setup, which has a weird memory config, I can see that under the same settings my 660 SLI will use much less VRAM than my current 960. You could say the 970's memory config was an improvement on what Nvidia did with the likes of the 550 Ti, 660 Ti and 660.
 

Azn Cracker

Distinguished


Like I said before, AMD implements compression of the data, which is supposed to be lossless:

http://techreport.com/review/26997/amd-radeon-r9-285-graphics-card-reviewed/2
 

Azn Cracker

Distinguished



Ahh, I think you are right.

Still, I haven't seen many instances where more than 4GB is required. I mean, the GTX 970 does OK with its 3.5GB.
 

sz0ty0l4

Distinguished
RobCrezz, you are partly right, but not fully accurate. It all comes down to how the game engine handles dynamic memory allocation. What you are saying is that if more VRAM is available, data will be preloaded into it for instant access, and in some cases this data can be swapped out quickly enough (because it's not actually needed, as you said) not to cause any trouble, so the player won't notice any difference between 4GB of usage and 4.3GB of usage on a 4GB card. However, this doesn't mean that 1.5GB of the VRAM is merely preloaded and games run fine on a 4GB card while they use 5.5GB; sadly, that's not how it works in most cases.
There are of course certain games whose data streaming system allows for effective VRAM usage, but that also produces unavoidable pop-ins, like the NPC pop-ins in The Witcher 3. And in most games that's not the case. It also depends on what data is missing from the VRAM: for example, if there is not enough room to store the ultra textures at 4K or higher resolution, that will still lead to terrible frame-time spikes.
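The preloading-vs-needed distinction can be sketched as a toy cache model (asset names and sizes are made up for illustration): an LRU-managed VRAM happily fills up with optional data, but thrashing only starts once the per-frame working set actually exceeds capacity.

```python
# Toy model of "usage != needed": sizes are illustrative, not from any game.
from collections import OrderedDict

def frame_misses(vram_gb, assets):
    """Simulate two frames touching the same assets; count second-frame misses."""
    cache, used, misses = OrderedDict(), 0.0, 0
    for frame in range(2):
        for name, size in assets:
            if name in cache:
                cache.move_to_end(name)     # hit: data stays on-card, cheap
                continue
            if frame == 1:
                misses += 1                 # re-upload over the bus -> stutter
            while used + size > vram_gb and cache:
                _, evicted = cache.popitem(last=False)   # evict least recent
                used -= evicted
            cache[name] = size
            used += size
    return misses

assets = [(f"tex{i}", 0.5) for i in range(11)]   # 5.5 GB per-frame working set
print(frame_misses(6.0, assets))  # 0  - working set fits, steady frame times
print(frame_misses(4.0, assets))  # 11 - every asset is re-streamed each frame
```

With 6GB every second-frame access is a hit; with 4GB the LRU order guarantees the asset needed next was just evicted, so all 11 assets re-stream every frame, which is the frame-time-spike scenario described above.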
 

RobCrezz

Expert
Ambassador


How is what I said not accurate?
 

sz0ty0l4

Distinguished
I was referring to "4GB is enough for 4K" and "VRAM usage != VRAM needed", and I just tried to explain why that's not the case in reality in most games.

Both statements are partly right under certain circumstances, but not fully accurate, because the average everyday experience shows the opposite. Don't you think?

When I say certain circumstances, I mean: playing at 4K with "medium" textures, for example.
 
