I'm sure this has been asked a million times, but I can't find the answer.
What's better to go with for a graphics card: a wider memory interface, or a larger memory size? I see a lot of cards that have one higher than the other. From what I remember of my old-school gaming days, the 16-bit SNES was far inferior to the 64-bit N64: the N64's graphics were much cleaner and crisper, and the games still seemed to run at a steady frame rate, as they were designed to. So I'm guessing the bit interface determines the final rendered detail of the graphics, and the memory size determines the rate at which it's read? Am I right?
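For what it's worth, the usual way to think about the interface width is bandwidth: peak memory bandwidth is roughly (bus width in bytes) × (effective memory clock). Here's a rough sketch of that arithmetic; the 3600 MHz effective GDDR5 clock below is just an assumed illustration figure, not an exact spec for either card:

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth: bytes moved per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    # MHz -> transfers/s, then bytes/s -> GB/s
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# 256-bit bus at an assumed 3600 MHz effective clock
print(bandwidth_gb_s(256, 3600))  # -> 115.2 GB/s
```

So a card with a wider bus (or faster memory) can feed the GPU textures faster, while the memory *size* mostly limits how much texture/frame data fits on the card at once.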
Looks like the 1GB HD 4890 has a lot of bang for the buck, but at about $200 it's a little more than I wanted to spend at the moment. Found out that the 1GB HD 4870 is almost as good and runs about $160... hmm, decisions, decisions.