There are many variables that go into a GPU's performance numbers, and memory bus width is just one of them.
That number usually does not change for a given GPU die. On the Nvidia side, you are referring to either a GTX 770 (256-bit, in 2GB and 4GB VRAM versions) or a GTX 780 (384-bit, 3GB of VRAM).
The 256-bit memory interface on the GTX 770 4GB is likely one of the reasons it cannot take advantage of the extra VRAM until you get to very high resolutions and/or multiple displays in SLI. Right now there is almost no performance difference in any game between the 4GB and 2GB GTX 770 on a single 1080p display.
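For context, bus width matters because, together with the memory clock, it sets the card's theoretical peak memory bandwidth. A quick sketch of the arithmetic, using Nvidia's reference memory clocks for these two cards:

```python
# Theoretical peak memory bandwidth = bus width (in bytes) x effective memory clock.
# Reference clocks: GTX 770 = 7010 MHz effective GDDR5, GTX 780 = 6008 MHz effective.

def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8           # bits -> bytes per clock
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gbs(256, 7010))   # GTX 770: ~224.3 GB/s
print(peak_bandwidth_gbs(384, 6008))   # GTX 780: ~288.4 GB/s
```

So the 780's wider bus gives it roughly 29% more raw bandwidth, but as the answers below note, that spec alone doesn't tell you which card wins in an actual game.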
Both cards can make good arguments for a 1080p display. The GTX 770 is good enough to max a 60Hz display while the GTX 780 will show you frames in triple digits on a 120-144Hz display.
At these levels you're talking mid-to-high-end graphics cards anyway. I would compare actual game benchmarks to determine which is the better GPU, rather than bus width. A 256-bit card can be faster than a 384-bit card in certain games, and vice versa.
It's like buying a car solely based on rim size; it just doesn't make sense. With a car you'd compare all the specs, but GPU buying isn't even that: you can largely throw away the spec sheet, because in the end the only thing that matters is which card actually performs better. We'd need to know which specific cards you're comparing.
The 2GB version. The 4GB version won't do much at all for you on a single 1080p display. The GTX 770 2GB should handle 90%+ of games for the next two years on High or Ultra settings on a 60Hz display, pretty easily.