Force a 24-inch image on a 27-inch display?

December 9, 2012 9:27:01 PM

Hey guys, I'm currently gaming on a 27-inch iMac display that is connected to my PC tower. I have already forced 1920x1200 on my display to get better frames per second in my games (which gives me big black bars around my screen), but I measured the actual screen size I'm getting with it and it's only about 19-20 inches. I was just wondering if there is a way I could increase the image size to maybe 23-24 inches while keeping 1920x1080 or 1920x1200 forced? Thanks for the help.

Btw I have a 7950 if it makes a difference.


December 10, 2012 6:55:08 AM

bump
December 10, 2012 6:57:40 AM

Go into your drivers and enable GPU scaling.
December 10, 2012 7:08:13 AM

darth pravus said:
Go into your drivers and enable GPU scaling.


What would this accomplish exactly? I already know how to force 1080p; I'm just wondering if I could make that screen size any bigger.
December 10, 2012 7:11:16 AM

It would stretch the image out to a larger size to fill the screen.

http://support.amd.com/us/kbarticles/Pages/UnableToSetG...

There is a reason it's not usually done. It tends to look blurred as it's stretching pixels.

The reason it's small is because it's displaying exactly the number of pixels you're telling it to.
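
A quick back-of-the-envelope check of that, as a Python sketch. It assumes the 27" iMac panel is 2560x1440 with square pixels; the numbers land right where he measured:

# Why a 1:1 (unscaled) image measures ~20" on a 27" panel.
# Assumes the 27" iMac's native 2560x1440 resolution and square pixels.
import math

native_w, native_h, diag_in = 2560, 1440, 27.0
pitch = diag_in / math.hypot(native_w, native_h)   # inches per pixel (~109 PPI)

for w, h in [(1920, 1080), (1920, 1200)]:
    print(f"{w}x{h} unscaled -> {math.hypot(w, h) * pitch:.1f} in diagonal")
# 1920x1080 -> ~20.2 in, 1920x1200 -> ~20.8 in, i.e. the ~19-20" he measured.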
December 10, 2012 7:14:51 AM

Would there be a way to set a custom resolution, so I could keep trying different numbers until I find the right size? Or are there only set resolutions one can use in games, like 1080p and 1920x1200?
December 10, 2012 7:17:23 AM

Some games limit your resolution; you can't even use 1920x1200 in some of them. I don't know if they can natively set custom resolutions. Maybe via the INI file.

It's always better to lower the details and play at native res though.

December 10, 2012 7:18:48 AM

Alright, I guess I'll do that. It's true, 2560x1440 looks pretty good; performance does take a pretty massive hit though.
December 10, 2012 7:21:40 AM

That's the price you pay for a huge res. Multi-GPU setups are the only way, to be honest.
December 10, 2012 7:22:39 AM

Yeah, my next GPU upgrade, next year or the year after, will probably be another 7950.
December 10, 2012 7:31:21 AM

I wouldn't go for CrossFire, as the 7000 series is known to have micro-stuttering problems.

Just sell your 7950 and get the next dual GPU monster ;) 
December 10, 2012 7:53:51 AM

kalvus5 said:
Alright, I guess I'll do that. It's true, 2560x1440 looks pretty good; performance does take a pretty massive hit though.


*You may wish to PRINT THIS out.

Hi,
Let me solve your problem.

I game on a 2560x1440 monitor. Most of the time I choose 1920x1080 due to the performance hit. There is an optimal combination of screen resolution and all the other quality settings. In some cases 1600x900 might even be optimal.

You can change the resolution in the GAME, and you should NEVER choose an aspect ratio that causes black bars.

60FPS and VSYNC:
If your computer is capable, it's usually desirable to output 60FPS synced to the monitor. In the game you choose "VSYNC" (if it's there). Example: Torchlight 1 and 2 aren't too demanding; you should be able to run them at 60FPS with VSYNC. You can monitor your frame rate using FRAPS.

Here's an example of how to set up your game (a rough frametime-log sketch follows after the list):
1) run FRAPS
2) start the game
3) set the game to 1920x1080
4) set game quality to MEDIUM/HIGH (depends on your PC)
5) play and observe the frame rate (FRAPS shows it in one of the corners)
6) ideally, you want 60FPS (for SHOOTERS, aim for at least 40FPS)
7) for Starcraft 2-style games, you want at least 30FPS
8) *sometimes it's best to increase anti-aliasing and decrease SHADOWS (play around for the desired quality, but achieving a high enough frame rate is the goal, and again, if possible, sync with VSYNC at 60FPS)
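
If you want to go beyond eyeballing the FRAPS overlay, here's that rough sketch in Python. It's only an illustration: it assumes a FRAPS-style frametimes CSV (header row, cumulative elapsed milliseconds in the second column), and "frametimes.csv" is just a placeholder name.

# Rough sketch: summarize a frametime log to judge whether a game holds 60FPS.
# Assumes a FRAPS-style CSV: header row, cumulative elapsed ms in column 2.
import csv

def summarize(path, target_fps=60.0):
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)                                        # skip the header row
        stamps = [float(row[1]) for row in rows]          # cumulative ms
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]  # per-frame ms
    budget = 1000.0 / target_fps                          # ~16.7 ms at 60FPS
    avg_fps = 1000.0 / (sum(deltas) / len(deltas))
    misses = sum(d > budget for d in deltas)
    print(f"avg {avg_fps:.1f} FPS, worst frame {max(deltas):.1f} ms, "
          f"{misses} frames over the {budget:.1f} ms budget")

summarize("frametimes.csv")   # placeholder path, not a real FRAPS default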

NOTE:
AMD users can get RadeonPro. I used it to force VSYNC and anti-aliasing when not supported by the game. Mass Effect 1 didn't have AA, and forcing 4x SuperSampling AA made a huge difference (normally I don't use SuperSampling).

The Witcher 1 also doesn't have VSYNC; I could FORCE it using RadeonPro (and now with the NVidia Control Panel).

Other settings:
1) choose 2560x1440 as your DESKTOP resolution
2) change the DPI scaling (make text and other items larger or smaller) to 40%
3) In your NVidia or AMD Control Panel->
a) Perform scaling on: "GPU"
b) Scaling mode: "ASPECT"

*3 a) is not strictly necessary, but I prefer it. It does all the scaling on the actual graphics card and sends a 2560x1440 video feed to your monitor, which prevents an incorrect setting on your monitor from stretching the picture. If you set a GAME to 1920x1080, the card does all the processing at 1920x1080 and then simply scales the result just before sending it to the monitor.
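
To make 3 b) concrete, here's a small sketch of what "Aspect" scaling works out to on this panel (my own illustration of the arithmetic, not AMD's or NVidia's exact code):

# Sketch of "Aspect" GPU scaling on a 2560x1440 panel: the image is scaled up
# as far as it can go without changing shape; leftover area becomes black bars.
def aspect_scale(src_w, src_h, panel_w=2560, panel_h=1440):
    scale = min(panel_w / src_w, panel_h / src_h)
    print(f"{src_w}x{src_h} -> {round(src_w * scale)}x{round(src_h * scale)}")

aspect_scale(1920, 1080)  # -> 2560x1440: same 16:9 shape, fills the panel
aspect_scale(1920, 1200)  # -> 2304x1440: 16:10, thin black pillars each side
aspect_scale(1600, 900)   # -> 2560x1440: fills the panel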

SUMMARY:
- desktop to 2560x1440
- most GAMES to 1920x1080 or 1600x900
- adjust game quality to achieve the desired frame rate

Good luck.
Feel free to ask anything you wish.
December 10, 2012 8:02:41 AM

darth pravus said:
I wouldn't go for CrossFire, as the 7000 series is known to have micro-stuttering problems.

Just sell your 7950 and get the next dual GPU monster ;) 


Don't go with a dual-GPU card or SLI/Crossfire. As said, it causes micro-stutter (a dual-GPU card is the same as two cards).

A single high-end card works great. I have one of the best cards available (an ASUS GTX680 DC2T), which costs $520, and I can play nearly every single game at 60FPS and the highest quality settings. The HD7950 is still an awesome card too. You still need a good CPU and RAM (e.g. an i5-3570K, a Z77/1155 motherboard, 8GB DDR3 at 1600MHz or 2133MHz) or you'll see some performance drop.

Again, absolutely stay away from any multi-GPU setup (please guys, don't start a discussion here).

*UPDATE:
I forgot to add: in addition to the DPI scaling, you can also use CTRL+Scroll (up or down) to easily adjust the text/image size in WEB PAGES. I also use the "NoSquint" add-on for Firefox to pre-set every page; I use "FULL PAGE ZOOM" with 200%, 100%, 10% under "Zooming" (it makes sense once you're in the add-on's options).
December 10, 2012 8:08:51 AM

Dual-GPU cards work in a different way because of their internal bandwidth, so they don't really stutter.

Anyway, moving on from that, a single card isn't great at that high a res. A 680 will struggle playing Metro/Crysis/Far Cry/most new RTS games.
December 10, 2012 8:20:02 AM

darth pravus said:
Dual-GPU cards work in a different way because of their internal bandwidth, so they don't really stutter.

Anyway, moving on from that, a single card isn't great at that high a res. A 680 will struggle playing Metro/Crysis/Far Cry/most new RTS games.


I own a GTX680. I can tell you from experience that the visual-quality difference between 2560x1440 and 1920x1080 is negligible. The actual textures don't improve. The ONLY improvement is possibly slightly sharper text, such as the small text in Diablo 3, and I can run that game at 2560x1440, max quality, 60FPS.

I have basically EVERY GAME available and have toggled between 2560x1440 and 1920x1080; again, the quality difference is usually not even noticeable.

(Just a quick response to SLI/Crossfire micro-stutter. Dual-GPU cards aren't immune. I think the GTX690 uses Frame Rate Metering to help the situation but it still exists and is very obvious in some scenarios: http://www.tomshardware.com/reviews/radeon-hd-7990-devi... ; I even saw a video with the main NVIDIA representative discussing this situation on the GTX690. He was annoyed though, as he obviously didn't expect or want the question).
December 10, 2012 8:25:22 AM

I think there are some third-party solutions to limit micro-stuttering. It exists on single GPUs too.

I know the on-screen difference isn't huge; it's more the fact that stretching the res and running non-native tends to look pretty awful.

I use 1920x1200, as I don't feel anything higher is worth the performance hit, but he already has a higher-res screen.
December 10, 2012 10:38:47 AM

darth pravus said:
I think there are some third-party solutions to limit micro-stuttering. It exists on single GPUs too.

I know the on-screen difference isn't huge; it's more the fact that stretching the res and running non-native tends to look pretty awful.

I use 1920x1200, as I don't feel anything higher is worth the performance hit, but he already has a higher-res screen.


Hi,
There's no reason to stretch the resolution. I recommended he use 1920x1200, which is the same aspect ratio (1.6), so there's no issue there.

As to running 2560x1600, as I've stated, the actual visual difference is minor. It makes NO difference to the main content; basically, HUD elements like text are slightly sharper. Again, in Diablo 3 I do recommend running at 2560x1600, since he can get 60FPS at max quality and the smaller text looks sharper.

As to these so-called third-party solutions to micro-stutter, I've read about them, but there is still no way to negate the issues completely.

Again, I've toggled many games back and forth between 2560x1440 and 1920x1080 to compare. In the end, the (usually) minor quality difference versus the major performance impact makes 1920x1080 (or 1920x1200 for him) optimal for most games.

December 10, 2012 10:44:17 AM

I think RadeonPro had something which eliminated it entirely. The frame times were completely even.

It looks to me like a bad case of overused FXAA, but he can try it. To each his own and all. It just seems pointless to have such a high-res screen and blur it up.

December 10, 2012 4:05:17 PM

darth pravus said:
I think RadeonPro had something which eliminated it entirely. The frame times were completely even.

It looks to me like a bad case of overused FXAA, but he can try it. To each his own and all. It just seems pointless to have such a high-res screen and blur it up.


I have no idea what you are referring to with that FXAA comment (I know what FXAA is).

You keep implying that the best solution is to scale his games to the highest resolution or there will be major visual issues. That is simply not the case. Why you keep stating this baffles me, unless you've actually tested this yourself, which you appear not to have done.

I've tested this on over TWENTY games.

Again, the ONLY advantage to using that highest resolution is slightly sharper text, which in most games is rarely noticeable. The drawback is a huge framerate drop, in many cases 30%.

I have set 2560x1440 in Diablo 3, Torchlight 2, and a couple of others. The rest scale up from 1920x1080, and they look great.
December 10, 2012 4:47:09 PM

I meant the way FXAA tends to overblur textures. I have tested this before with multiple monitors, and it's usually not worth the lost definition.

Nothing major, I just love the clarity. To my eyes it's like the Xbox upscaling 720p.

And yes, the framerate drop is huge, but why invest so much in the monitor if you're not willing to invest in the GPUs?
December 10, 2012 5:43:28 PM

darth pravus said:
I meant the way FXAA tends to overblur textures. I have tested this before with multiple monitors, and it's usually not worth the lost definition.

Nothing major, I just love the clarity. To my eyes it's like the Xbox upscaling 720p.

And yes, the framerate drop is huge, but why invest so much in the monitor if you're not willing to invest in the GPUs?


The decision to use FXAA is independent of the monitor resolution. If you don't like it, don't use it.

I keep saying that the quality difference at 2560x1600 versus 1920x1080 is usually quite minor, yet the frame rate drop is major.

Not only is it quite expensive to add a second card, but you also add micro-stutter. This is the LAST time I'm commenting, but let me leave you with this->

Micro-stutter:

Note:
http://en.wikipedia.org/wiki/Micro_stuttering

In particular:
"AMD's Radeon HD 7000 series is severely more affected by micro stuttering than nVidia's GeForce 600 Series"

If you still disagree, then there's little more to be said. How about we just agree to disagree at this point.

(FYI, even NVidia's main product rep discussed micro-stutter. Both AMD and NVidia are well aware of this problem. The ONLY people stating it's a non-issue are people who don't understand the situation well enough. It's actually understandable when you look at a game and it seems "fine", that is until you compare it to a similar setup with no micro-stutter and play both for an extended time, then it's clear there's a problem.)
December 10, 2012 6:26:07 PM

NVidia is in a much better position on the micro-stutter front; however, I did see a benchmark somewhere showing that RadeonPro has some kind of limiter which gave it perfect frame timings.

I meant it looked like it, not that it actually was. I agree that 1920x1200 is optimal, but he already has such a high-res screen, so maybe a dual-GPU solution with the frame-time limiter would be an option.
December 10, 2012 9:57:49 PM

darth pravus said:
NVidia is in a much better position on the micro-stutter front; however, I did see a benchmark somewhere showing that RadeonPro has some kind of limiter which gave it perfect frame timings.

I meant it looked like it, not that it actually was. I agree that 1920x1200 is optimal, but he already has such a high-res screen, so maybe a dual-GPU solution with the frame-time limiter would be an option.


It's called Frame Rate Metering for NVidia.

The best dual-GPU solution is the GTX690.
The best multi-GPU solution is a 3xGTX670 or 3xGTX680.

The RadeonPro solution referred to is here:
http://www.tomshardware.com/reviews/radeon-hd-7990-devi...

*It seems great at first, but as I read it, one thing concerned me: I see no syncing to the monitor's refresh rate (i.e. 60FPS). This indicates to me that you will get screen tearing.

I'll admit this sounds very interesting. I've looked for information on the issue of screen tearing and can't confirm anything.

**I found this comment referring to RadeonPro:
"I've tried new Dynamic frame rate control which gives me massive screen tearing ,same with Dynamic Vsync when fps below refresh."

I'm not sure his issue is fixable. Tom from NVidia said they could fix the issue easily as well, but the buffering required introduces lag (the time from mouse movement to a change on screen), like triple buffering.

If NVidia and AMD technicians have had difficulty with this for YEARS, I don't think the issue will be fixed easily in software. There's got to be a trade-off.
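
To put rough numbers on that trade-off (illustrative figures for the usual double/triple-buffering behavior, not NVidia's measurements):

# Rough numbers on the VSYNC trade-off: at 60Hz, every frame must be ready
# within ~16.7 ms. These are illustrative figures, not measurements.
refresh_hz = 60
budget_ms = 1000 / refresh_hz        # ~16.7 ms per refresh
render_ms = 18.0                     # a frame that just misses the budget

# Double-buffered VSYNC: a missed refresh waits for the next one, so a steady
# 18 ms frame is shown every 2nd refresh -> 30FPS.
waits = -(-render_ms // budget_ms)   # ceiling division
print(f"double buffering: {refresh_hz / waits:.0f} FPS")

# Triple buffering keeps rendering into a spare buffer (~1000/18 = 56FPS), but
# the queued frame can add up to one refresh (~16.7 ms) of input lag.
print(f"triple buffering: ~{1000 / render_ms:.0f} FPS, up to +{budget_ms:.1f} ms lag")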

But hey, I'll admit I don't know everything.
December 11, 2012 6:00:45 AM

Three GPUs have always been the best solution, but obviously that requires major power and space.

I didn't know about the screen-tearing issue. I'm guessing it's not as simple as applying VSYNC or triple buffering in a supported game?

Whatever the OP decides, good gaming and may your PC treat you well for many years :) 
December 11, 2012 9:58:56 PM

Final note.
On reading this again: I said I recommended "1920x1200", but that was likely for a different person. The iMac 27" is 2560x1440, so it should be "1920x1080" most of the time, and "2560x1440" when 60FPS can be achieved (i.e. Diablo 3/Torchlight).