The cards are one tier apart in performance on Tom's GPU hierarchy chart.
http://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
The difference shows up in a synthetic benchmark, but in actual gameplay it will not be very significant.
VRAM is used differently by AMD and Nvidia cards.
More is better, but is the added cost worth it?
VRAM has become a marketing issue.
My understanding is that VRAM is more of a performance issue than a functional one.
A game needs most of the data it uses frequently to be resident in VRAM, somewhat like real RAM.
If a game needs something that is not in VRAM, it has to fetch it across the PCIe boundary,
hopefully from real RAM and hopefully not from a hard drive.
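To see why that matters, here is some back-of-envelope arithmetic. The bandwidth figures are my own rough assumptions for the era (actual numbers vary by card and system), not measurements:

```python
# Rough fetch-time comparison for one texture, using assumed bandwidths:
#   GDDR5 VRAM      ~ 200  GB/s
#   PCIe 3.0 x16    ~  16  GB/s
#   SATA hard drive ~ 0.1  GB/s
texture_mb = 64  # hypothetical 64 MB texture

for source, gbps in [("VRAM", 200.0), ("PCIe from RAM", 16.0), ("hard drive", 0.1)]:
    ms = texture_mb / 1024 / gbps * 1000  # MB -> GB, then seconds -> ms
    print(f"{source:>14}: {ms:8.2f} ms")
```

Same texture, three very different waits, which is why a miss that goes all the way to the hard drive shows up as a stutter.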
Knowing how full the available VRAM is tells you little; much of what is there may not actually be needed.
What is not known is the rate of VRAM exchange.
VRAM is managed by the graphics card driver and by the game, so there may be differences in effectiveness between AMD and Nvidia cards, and between games.
Here is an older performance test comparing 2GB with 4GB of VRAM.
http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
Spoiler... not a significant difference.
And... no game maker wants to limit their market by requiring huge amounts of VRAM.
The VRAM you see will be appropriate to the particular card.
One other possible factor is power draw.
The AMD cards generally need about 75W more power.
Here is a chart:
http://www.realhardtechx.com/index_archivos/Page362.htm
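To put that 75W in perspective, here is a quick running-cost sketch. The hours per day and electricity price are assumptions; swap in your own numbers:

```python
# Annual cost of an assumed 75W extra draw (all inputs hypothetical):
extra_watts = 75       # assumed extra power draw of the AMD card
hours_per_day = 2      # assumed daily gaming time
price_per_kwh = 0.12   # assumed electricity price in USD

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.2f} kWh/yr, about ${cost_per_year:.2f}/yr")
```

At those assumptions the dollar cost is small; the bigger practical question is whether your PSU has the headroom and connectors for the extra draw.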
Go to Newegg and find the candidate cards.
Filter the reviews to verified buyers only.
Then look at what percentage of the reviews have zero or one eggs, indicating some sort of problem.
In particular, read the reasons given for a bad review; some are not very valid, so exclude those.
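The tallying step can be sketched like this, with a made-up review count purely for illustration:

```python
# Hypothetical review tally for one card (verified buyers only),
# rating in eggs -> number of reviews:
eggs = {5: 120, 4: 40, 3: 10, 2: 6, 1: 9}

total = sum(eggs.values())
bad = eggs.get(0, 0) + eggs.get(1, 0)  # zero- or one-egg reviews
print(f"{bad / total:.1%} of reviews flag a problem")
```

Compare that percentage between the candidate cards; a few percent is normal, a large share of one-egg reviews is a warning sign.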