Is usable monitor size dependent upon the graphics card?

Hi all,

I am interested in this Samsung monitor:

I have a GeForce 8800 GTX but only an AMD 3800+ dual core. From what I've read, the larger the monitor, the more stress is placed on the graphics card. Is there any truth to this?

I use the computer almost exclusively for gaming and I wanted a bigger display than my current 21" widescreen, but I also do not want a noticeable drop in gaming performance.

  1. Yes
  2. runswindows95 said:

    LoL, thank you for that very detailed response. Haha. Anyways, is a 24" monitor good to go with a single 8800 GTX or does something that large need SLI support? Especially considering games such as Crysis and Age of Conan, which will tax a graphics card right out of the gate without even considering the size of the monitor.
  3. I had some similar considerations when I had to buy a new monitor. From what I read, the main issue with larger LCD monitors is that they display better quality when used at their native resolution. I moved up from an old 19 inch CRT to a 19 inch widescreen LCD (with a native resolution of 1440 x 900). I was used to running at 1024 x 768 before, so my graphics card wasn't up to running games at the higher resolution, although I haven't noticed a big difference running at lower (non-native) resolutions.

    Hopefully someone who runs a monitor that large can add their input with regards to FPS in certain games.
  4. Geeky has it right. The larger the monitor, the larger the native resolution... and you need a beefy graphics card to handle a high native resolution, because if you're using an LCD monitor your image quality will suffer at anything but the native resolution.
  5. "From what I've read, the larger the monitor, the more stress is placed on the graphics card. Is there any truth to this? "

    One thing to note, as Cleeve said, is that a higher native resolution requires a beefy graphics card. This is because at higher resolutions the fps depend more on the GPU and less on the processor. I may be wrong, but I think your processor might be a bottleneck when gaming at those high resolutions. If Cleeve sees this he might be able to refer you to an article that shows that (I can't find it ATM). If not, a single GTX should be fine for gaming at 1920x1200.
  6. drudge, I can confirm that higher resolution will indeed give you a noticeable drop in performance. I play games at 2560x1600, and the FPS did go down a lot from 1600x1200, my previous resolution.

    You have a very powerful GPU, so I think you will be alright on that front. You might have to OC your CPU a bit, but I think the experience of a large monitor is worth it. That's just my personal preference, though.

    I also have a "weaker" CPU the E4300 and here is some results for me running FEAR both at stock and OCed. Hope it helps yu decide:

    | CPU (GHz) | GPU (core/mem MHz) | 3DMark06 | FEAR FPS (min/avg/max) | FPS % dist (<25 / 25-40 / >40) |
    | 3.1       | 625/1000           | 11364    | 26 / 42 / 60           | 0 / 52 / 48                    |
    | 1.8       | 576/900            |  7689    | 23 / 39 / 60           | 9 / 39 / 42                    |

    These are for FEAR @ 2560x1600 with everything on max, using an E4300 with an eVGA 8800 GTX.

    Good luck.

    EDIT - It might be obvious, but I figure I'd mention it anyway. If you get the Samsung and a game you play slows down too much for your taste, you can always bring down the resolution. The opposite is not possible, so with the larger monitor you also have more choice.

    But remember that LCDs at non-native resolutions don't look as good, though not that bad either. I play some really slow games (e.g. DiRT) at 1280x800 and you know what? It's still a much better experience because of the larger screen area.
  7. We did a 'budget gamer's special' test with an Athlon X2 3800+ with an 8800 GTX vs an e6600 with an 8800 GTS 320mb.

    The 3800+ and GTX beat the e6600/8800 GTS combo at high resolutions:

    At 1600x1200 and over - the resolutions you'd buy an expensive videocard for - the impact of the CPU is greatly minimized.
  8. I have an 8800 GTS and a C2D @ 3.5 GHz. I play on a 24" LCD. As it stands now the GTS will run more or less anything at native rez, with some exceptions, so the GTX should do even better. The only question, I suppose, is your CPU. I'm just not sure. (Ah, just saw Cleeve's post above, which answers that.)

    You can always run at a lower rez if need be. I do that for BioShock because in some scenes my FPS were pretty low; the game seems to look fine at 1680 x 1050. Not as crystalline as native, I suppose, but honestly I hardly notice the difference. Maybe I'm just easy to please. You can also run at a lower rez and turn off scaling, in which case you have black bars but get the full image quality.

    There is a lot of uncertainty about future games.
  9. Hm, that's very interesting, because I plan on buying a 20.1 inch widescreen LCD with a resolution of 1680 x 1050, but my configuration is an old P4 630 3 GHz (2 MB cache) with a 6600 GT 128 MB.
    I play old games like CM 2005, F1 Challenge, and Half-Life.
    How will the performance be at this resolution? Right now I play these games with full effects.
  10. Yes, it is based on your graphics card. Normally for 24"+ monitors SLI is recommended...

    But if you have a good monitor that supports 1:1 pixel mapping, you can get away without SLI. Basically, 1:1 pixel mapping places black around the borders of your screen so you can play at a smaller resolution. Let's say you have a 1920x1200 monitor: you enable 1:1 pixel mapping and play at 1600x1050. The monitor will center the 1600x1050 image on your screen and surround the rest with a black border. This is the best way to play at lower resolutions without the image being scaled and becoming slightly blurry.

    Not all monitors support this though.
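    To put numbers on the 1:1 mapping described above, here is a minimal sketch (Python; `border_sizes` is a hypothetical helper for illustration, not part of any driver API) computing the border thickness for the 1920x1200 / 1600x1050 example:

```python
# Sketch of the 1:1 pixel mapping example above: running a 1600x1050
# game on a 1920x1200 panel leaves unscaled black borders around it.
# border_sizes is a made-up name purely for this illustration.

def border_sizes(panel_w, panel_h, game_w, game_h):
    """Return (left/right, top/bottom) border thickness in pixels."""
    return (panel_w - game_w) // 2, (panel_h - game_h) // 2

h_border, v_border = border_sizes(1920, 1200, 1600, 1050)
print(h_border, v_border)  # 160 75: 160 px bars left/right, 75 px top/bottom
```

    Because every game pixel maps to exactly one panel pixel, nothing is interpolated, which is why the image stays sharp.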
  11. 1:1 pixel mapping is nice but it kind of defeats the whole purpose of wanting a bigger monitor. When I play games in 1600x1050 with 1:1 on my Dell 24 it becomes the equivalent size of a 21" monitor.

    I'm sure the 8800 GTX will be fine at 1920x1200 in today's games. If you've got the money to spend on a second GTX, you might as well save it for when next-gen games come out, then sell your card and put that cash towards the next top-of-the-line card.
  12. Well, I am looking at the Dell 20.1" and this LCD comes in two versions: widescreen, and non-wide 1600 x 1200, which is like the "big brother" of 1024 x 768.
    Is the 1600 x 1200 better than the widescreen in games for my 6600 GT 128/128 DDR3? I hope you understand what I mean.
  13. It would be easier on the graphics card and I find games over a year old have better support for the older 4:3 resolutions.
  14. Drudge said:
    From what I've read, the larger the monitor, the more stress is placed on the graphics card. Is there any truth to this?

    No. It's not size but resolution that affects video card performance. At first glance it might seem that the larger the LCD, the larger the resolution, but this does not hold. For example, I have a 17" LCD which runs at 1920 x 1200, while a typical 22" is 1680 x 1050. My 17" puts a 31% bigger load on the system than the typical 22" LCD does.

    You should be able to find reviews of your card at various resolutions. The brain can recognize performance degradations when frame rates drop below 24-30 fps, so dropping from say 80 fps to 45 only affects bragging rights, not the game experience.
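    The pixel-count arithmetic behind that 31% figure can be checked directly; a quick sketch in Python:

```python
# Check of the load comparison above: pixels rendered per frame on a
# 1920x1200 panel vs a typical 1680x1050 panel.
wuxga = 1920 * 1200    # 2,304,000 pixels
wsxga = 1680 * 1050    # 1,764,000 pixels
print(f"{wuxga / wsxga - 1:.0%} more pixels per frame")  # 31% more pixels per frame
```

    Rendering load scales with the number of pixels per frame, not the physical panel size, which is why the small high-resolution panel is the heavier one.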