
GPU RAM vs 24" monitor

February 18, 2009 4:25:28 PM

Hello, my question is: what is the proper amount of RAM needed on a GPU to properly drive a 24" monitor at 1920x1200 resolution?
I understand the highest amount of memory per GPU currently offered is 1GB. From what I understand, this is not enough and creates framebuffer issues at any resolution of 1920x1200 and up. With that in mind, do you think GPU makers will be bringing cards with more than 1GB of RAM per GPU to market anytime in the near future? Thank you for any information you can provide.
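For rough context, here is a back-of-the-envelope sketch of what the raw framebuffer itself costs at 1920x1200 with no AA. The assumptions (32-bit RGBA color, 32-bit depth/stencil, double buffering) are mine, not from the thread, and textures, geometry and extra render targets, which are where most VRAM actually goes, are not counted.

```python
# Very rough back-of-the-envelope: raw framebuffer size at 1920x1200, no AA.
# Assumes 32-bit RGBA color, 32-bit depth/stencil, double buffering; textures,
# geometry and extra render targets (the bulk of VRAM use) are not counted.

WIDTH, HEIGHT = 1920, 1200
BYTES_COLOR = 4          # 32-bit RGBA
BYTES_DEPTH = 4          # 24-bit depth + 8-bit stencil

pixels = WIDTH * HEIGHT                      # 2,304,000 pixels
color = pixels * BYTES_COLOR * 2             # front + back buffer
depth = pixels * BYTES_DEPTH                 # one depth/stencil buffer

print(f"~{(color + depth) / 1e6:.0f} MB")    # ~28 MB
```

Under these assumptions the plain framebuffer is tiny next to 512MB or 1GB of VRAM; most of the memory on a card goes to textures, geometry and intermediate render targets.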


February 18, 2009 4:40:08 PM

My own 512MB card seems OK with only rare "glitches" in games like Crysis if I set any AA, so 1GB should be more than enough.
February 18, 2009 4:45:35 PM

Well, the correct answer to your question is that it depends on the game you are playing. I don't know where you got your info, and if you have a link I would be interested in reading what it has to say.
As far as I am aware there is no framebuffer issue at 1920x1200. I couldn't say for anything above that, as I don't know enough about those resolutions, but I suspect that is the point where dual-card setups become needed as opposed to wanted.
I can see some vendors doing it for marketing reasons, but I don't think extra RAM would help at those kinds of resolutions.
The chips can only push so many pixels regardless of how much RAM you give them.
This is a good article that covers the pros and cons:
http://www.anandtech.com/video/showdoc.aspx?i=3415&p=2

Mactronix
February 18, 2009 4:45:39 PM

Right, I currently have an OC 4830 with 512MB and it runs Crysis fine on medium/high with no AA. I'm more curious what amount of RAM would be needed to run with something like 8xAA, because that kills even the best cards with 1GB of RAM.
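For a rough sense of what AA adds in raw capacity terms, the earlier sketch can be extended with an MSAA sample count. Again, the 32-bit color/depth formats and the simple "multisampled buffers plus a resolved double-buffered swap chain" layout are my own assumptions, not measured numbers.

```python
# Rough sketch: framebuffer footprint at 1920x1200 with multisample AA.
# Assumes 32-bit color and depth, multisampled buffers plus a resolved
# double-buffered swap chain; ignores textures, geometry and other targets.

WIDTH, HEIGHT = 1920, 1200
BYTES_COLOR = BYTES_DEPTH = 4
pixels = WIDTH * HEIGHT

def framebuffer_mb(samples):
    msaa_color = pixels * BYTES_COLOR * samples   # multisampled color buffer
    msaa_depth = pixels * BYTES_DEPTH * samples   # multisampled depth/stencil
    swap_chain = pixels * BYTES_COLOR * 2         # resolved front + back buffer
    return (msaa_color + msaa_depth + swap_chain) / 1e6

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA: ~{framebuffer_mb(samples):.0f} MB")
```

Even at 8x the multisampled buffers come out on the order of 150-200 MB under these assumptions, so raw capacity alone doesn't explain why 8xAA cripples 1GB cards at this resolution; every extra sample also has to be written, blended and resolved each frame, which costs memory bandwidth and GPU throughput rather than space.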
February 18, 2009 5:00:20 PM

Hey Mactronix, in Tom's most recent SBM of the $5,000 PC, they used 2x MSI N295GTX-M2D1792 GeForce GTX 295 GPUs:
http://www.tomshardware.com/reviews/ssd-overclock-i7,21...
which had horrible output in Crysis with AA enabled.
I'm trying to find where I read it, but I remember this being largely due to RAM + framebuffer at resolutions over 1920x1200.

February 18, 2009 5:55:38 PM

Well, if you read what you posted, you will see that they explain the main problem, which is that Crysis scales really badly, and also that Nvidia's AA isn't as good as ATI's; they even say a 4870 X2 would be a better solution.
Seriously, it's mainly down to the GPU having the power to push the resolution in the first place. That's basically how all cards work: first they render the scene, then any extra performance is taken up with shading, and finally AF/AA. Some of the newer AA filters are helping the situation, but I think we are still a long way off from having serious AA in all games as standard.
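To put a very rough number on "pushing the resolution": the sketch below only shows the arithmetic, and the per-pixel shader cost and overdraw figures are made-up illustrative assumptions, not benchmarks.

```python
# Illustrative arithmetic only: why the GPU core is usually the bottleneck.
# Per-pixel shader cost and overdraw are assumed values, not measurements.

WIDTH, HEIGHT, TARGET_FPS = 1920, 1200, 60

pixels_per_second = WIDTH * HEIGHT * TARGET_FPS   # ~138 million pixels/s

SHADER_OPS_PER_PIXEL = 500   # assumed: heavy game, hundreds of ops per pixel
OVERDRAW = 2.5               # assumed: each screen pixel drawn ~2.5 times

ops_per_second = pixels_per_second * SHADER_OPS_PER_PIXEL * OVERDRAW
print(f"~{pixels_per_second / 1e6:.0f} Mpixels/s just to fill the screen")
print(f"~{ops_per_second / 1e9:.0f} billion shader ops/s at the assumed cost")
```

None of that per-frame work gets cheaper by adding VRAM; more memory only helps once you actually run out of it.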
If you want a laugh, read this: http://forums.slizone.com/lofiversion/index.php?t31683....

Mactronix
February 18, 2009 6:16:16 PM

It depends on the games you play ... REALLY. Sometimes ATI is better, other times it's Nvidia. Read reviews about the games you play on this site:

hardocp.com

In the GPU section, find the games you play and read about them. They test both ATI and Nvidia.

The goal is to find which one runs at the highest settings while keeping decent fps (30 fps+).

Have fun!
February 18, 2009 6:19:21 PM

Haaaa, and about Crysis: it's true that Nvidia doesn't scale really well with this game. My SLI setup only lets me use 2x or 4x AA. CoD: WaW, maxed out, still runs over 60 fps all the time ...
February 18, 2009 6:21:08 PM

Thanks for the replies, guys =)