
Widescreen Gaming effects on performance???

August 7, 2006 7:15:43 PM

Just a generic question for those who might be doing actual game development.

Do widescreen, custom, or non-standard resolutions cause any hit on performance?

Are games/gaming engines optimized for standard resolutions like 800x600, 1024x768, 1280x1024 or 1600x1200?

I am currently running a res of 1680x1050 on my 20.1" widescreen monitor.
I also hook up on occasion to my 720p projector for Gaming Nights.

I am also using an older proc and vid card (P4 3.0C and ATI 9600 256MB).

This came up in another forum and I was just trying to put it back on topic by moving it here.

Thanks in advance for any replies (I think :) ).
August 7, 2006 8:44:08 PM

Quote:
Do widescreen, custom, or non-standard resolutions cause any hit on performance? Are games/gaming engines optimized for standard resolutions like 800x600, 1024x768, 1280x1024, or 1600x1200?


In general, the larger the screen resolution, the more of a performance hit. I cannot recall the formula, but take a 7 MP picture from a digital camera, for example: the file is approximately 3 MB, and its dimensions are 3072x2304. I am not a game developer, but I do know that a 7600GT has a memory bandwidth of 24 GB/s, and a 7800GT 35 GB/s. Now you're probably wondering: 3 MB is far lower than either card's memory bandwidth, so why can't we have true-color images moving in real time? Sorry, I don't have the answer to that, but I can say that developers probably do a lot more than shift an image around in video memory; let's not forget about shaders, mathematical calculations done in video memory, etc.

[EDIT]

Ah, don't forget, a game (or at least most modern games) is at the very least pseudo-3D, and has more than just the two dimensions of a digital picture to calculate. That's a lot of calculations per second, per frame.
August 7, 2006 9:15:07 PM

Obviously they are 3D now... I understand the X, Y, and Z side of the house...

The question is: do the engines render the image based on the given res, or do they, for instance, always render at the largest available/possible resolution?

Is it case by case (the rendering changes for the res you are set to), or is it a single render regardless of the res?

With the difference being the viewable area of the rendered images.
August 7, 2006 10:50:41 PM

Quote:
The question is: do the engines render the image based on the given res, or do they, for instance, always render at the largest available/possible resolution?


Those are questions for the developers of EACH game. I thought I made that clear like 5 posts ago ;) 
August 8, 2006 2:47:20 AM

This should clear some things up... first off, the memory bandwidth being 36 GB/sec or whatever doesn't have much to do with it... yeah... lemme explain.

The memory on a graphics card does more than just hold the frame data that gets sent off to the monitor. It stores all the information about textures and wireframes, plus partially rendered stuff as well... basically, whatever doesn't fit into the GPU has to sit in memory.

Now, your thing about a 7 megapixel image taking up 3 MB... the problem with that is that digital cameras generally store things as JPEGs. Flash memory takes a while to write to, so by compressing to a JPEG before storing, you get a small loss of quality but an increase in speed and capacity. Here's the formula for the size of an uncompressed image, in bytes: height x width x color depth / 8. The number of megapixels is height x width, and color depth is generally 24-bit, so to simplify, it's 3 bytes times the number of megapixels. That 7 megapixel image would take up about 21 MB if it were a BMP.
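To make that concrete, here's a minimal Python sketch of the formula (my own illustration, assuming 24-bit color and ignoring file-header overhead):

```python
# Sanity check of the size formula above: width x height x color depth / 8
# gives bytes. Assumes 24-bit color and ignores any file-header overhead.

def raw_image_size_mb(width, height, bits_per_pixel=24):
    return width * height * bits_per_pixel / 8 / 1_000_000

print(raw_image_size_mb(3072, 2304))  # ~21.2 MB for the 7 MP camera shot
print(raw_image_size_mb(1680, 1050))  # ~5.3 MB for one 1680x1050 frame
```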

Now, in terms of the graphics card, it stores frames as raw bitmaps (BMP-style, uncompressed) for a number of reasons. First, even at high resolutions like 1920x1080 we're talking about only around 2 megapixels, so the raw frame is about 6 MB. That's not very much in modern cards. You were at least right when you said that with such high memory bandwidth we should be able to push full-color images: even if you send one of those 2 megapixel frames to the display 100 times a second, that's still only about 600 MB/sec. This is done instead of JPEG because, while JPEG would be smaller, the monitor would take time to decode it, the graphics card would take time to encode it, and there would be color loss... basically it would be bad in all ways.
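That 600 MB/sec figure checks out with the same arithmetic; a quick sketch (assuming 3 bytes per pixel and ignoring link blanking/overhead):

```python
# Rough scan-out bandwidth for pushing raw frames to a display.
# Assumes 3 bytes per pixel (24-bit color); real links add blanking overhead.

def scanout_mb_per_sec(width, height, refresh_hz, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * refresh_hz / 1_000_000

print(scanout_mb_per_sec(1920, 1080, 100))  # ~622 MB/s, close to the
                                            # "600 MB/sec" ballpark above
```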

But back to the question at hand... the higher the resolution, the more calculations have to be done, so the more RAM bandwidth is needed and the more GPU calculations are needed. But as far as widescreen versus 4:3, I'll say again, I don't see how performance could be impacted by the orientation alone. It's the same amount of data, just looked at a different way. Take your widescreen display and turn it on its side so it's really tall. Is it any slower? No. Just as your graphics card doesn't care how your monitor is oriented, it doesn't care how the frame it's rendering is oriented. And if it does, it's probably a very minor thing due to some technical limitation or optimization--nothing game devs intended to do. I'd say the most you'll see out of widescreen optimization would be interfaces that take advantage of the extra horizontal screen space by putting heads-up display elements along the top and bottom of the screen rather than the sides and corners.
August 8, 2006 4:56:55 AM

From what I know, it's based more on the total number of pixels.

E.g., 1024x768 has fewer total pixels to calculate than 1280x1024.

It's (nearly) all maths:
1024x768 = 786,432 pixels to show, whereas
1280x1024 = 1,310,720 pixels to show.

See the pattern? The higher the res, the more calculations. As it goes, a 19" widescreen at
1440x900 = 1,296,000
has slightly fewer pixels than a monitor set at 1280x1024, and as such the fps will be slightly higher than in a game set to 1280x1024.

Your 20.1" has a res of 1680x1050:
1680x1050 = 1,764,000,
less than
1600x1200 = 1,920,000,
so the widescreen res is actually cheaper (see the quick script at the end of this post).

The game being optimised for widescreen means that a 4:3 image will be stretched to 16:10. It doesn't really have to do with performance (fps); rather, it has to do with image quality: the widescreen mode will add the additional FOV to 'normalize' the image.

I think that this is what you were on about... though how your 9600XT can handle a 20.1" at 1680x1050 plus an HD projector is beyond me, unless you've put the resolution down to about 800x600 with no AA and no AF, or you're playing the original Counter-Strike or the equivalent.
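A quick script extends that comparison (plain arithmetic over the resolutions mentioned above):

```python
# Total pixel counts for the resolutions discussed above, normalized to
# 1280x1024 -- pixel count is the main driver of per-frame fill cost.

resolutions = [(1024, 768), (1280, 1024), (1440, 900), (1680, 1050), (1600, 1200)]
baseline = 1280 * 1024

for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:>9,} pixels ({pixels / baseline:.2f}x of 1280x1024)")
```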
August 8, 2006 7:02:52 AM

The higher the resolution, the more graphics card performance it needs. You can, however, do low-res widescreen.
August 8, 2006 5:50:56 PM

Not really asking "IF" I can do widescreen gaming. I already do that....

I play COD, COD2, and UT2k4 mostly... I run the 720p projector only during my LAN parties; I normally play on my 20.1" widescreen at full 1680x1050 with really no lag/stutter.

So I know my machine is capable; I was just wondering if it was a disadvantage or not.

I am still not totally convinced by some of the explanations. For instance, there are many different tricks that devs play in rendering the NEXT scene, like knowing you are entering a dark room with a small amount of light on the far wall. To speed things up, they skip rendering what is behind the door until it swings wide open to reveal the next bit of the scene.

This is the reason we hear about issues with wide-open fields with lots of characters. So technically, in the above scenario, they are not really worried about the rest of the screen as far as textures and shading go...
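(That trick is portal/occlusion culling. A toy Python sketch of the idea; the area names and structure here are hypothetical, not from any real engine:)

```python
# Toy sketch of "don't draw the room behind the closed door": an area is only
# rendered when a portal (the door) connecting to it is open. Hypothetical
# names; real engines use precomputed visibility sets and areaportals.

PORTALS = {("hallway", "dark_room"): False}  # door starts closed

def areas_to_render(current_area):
    visible = {current_area}
    for (a, b), is_open in PORTALS.items():
        if is_open and current_area in (a, b):
            visible.add(b if current_area == a else a)
    return visible

print(areas_to_render("hallway"))         # {'hallway'} -- door closed
PORTALS[("hallway", "dark_room")] = True  # the door swings open
print(areas_to_render("hallway"))         # now includes 'dark_room'
```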

Do similar tricks apply to resolutions? Do they render ONLY at the selected res, or do they render more and show LESS?

Twile, I respect some of your thoughts on this so far... Are you a games dev?

I would feel more comfortable/confident with the responses if it were not an "I think" scenario.
August 8, 2006 5:52:14 PM

I can even play with AA but not with AF...
August 8, 2006 5:56:15 PM

Just so you know, I am NOT stupid... I do understand that high-resolution gaming is indeed more intensive. Otherwise I would be able to play something like Oblivion at full res with everything turned on with my aged ATI 9600...
August 8, 2006 9:11:40 PM

Quote:
Twile, I respect some of your thoughts on this so far... Are you a games dev?

I would feel more comfortable/confident with the responses if it were not an "I think" scenario.


Hey, I'm trying to do the best I can. How many game devs do you think frequently comment on obscure threads on Tom's Hardware Forumz? I'm not one (yet).

I suppose that, depending on the particular scene, having different aspect ratios would impact performance. If your screen is 16:9 rather than 4:3, and you're outside, your screen will be displaying more horizontal space and less vertical space, which might mean more objects on the horizon and less "empty space" in the sky. However, the opposite could be true--you might have some very detailed 3D clouds or objects in the air such as planes or tall buildings which would take as much or more performance to render than the horizon. So I guess the aspect ratio could impact the number of objects on-screen and the performance that they take. However, I highly doubt that game developers take this into consideration when they're making games. Even if half the people out there used widescreen, would you put fewer objects on the ground, changing level design just so people could squeeze out an extra FPS or two?
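One common convention, "Hor+" scaling, matches this: the vertical FOV stays fixed and the horizontal FOV is derived from the aspect ratio, so widescreen genuinely shows more to the sides. Whether any given game does this varies; here's a quick illustration of just the math:

```python
import math

# Hor+ scaling: vertical FOV is held fixed and horizontal FOV grows with the
# aspect ratio, so a widescreen player sees extra world at the sides.
# Whether a particular game behaves this way varies; this is just the math.

def horizontal_fov_deg(vertical_fov_deg, aspect_ratio):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

print(horizontal_fov_deg(60, 4 / 3))    # ~75.2 degrees at 4:3
print(horizontal_fov_deg(60, 16 / 10))  # ~85.5 degrees at 16:10
```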

Frankly, I don't see why you need an absolute, definite answer from some guru of videogame design to feel confident. Even if you were choosing between buying a 1920x1080 display and a 1600x1200 display and wanted to know which has performance optimizations (or lack thereof), the two are similar resolutions anyway (1600x1200 is only 7-8% smaller than 1920x1080), so I'd say the more important things are which aspect ratio you like most and which costs the least. As for you, who already plays in HD... why this question is so burning and must be answered with absolute certainty is just beyond me O.o
August 8, 2006 10:11:49 PM

Because I am a geek!! :) 

It is nice to know this stuff, and I guess I can do some testing...

Run UT benchies with standard vs widescreen... I ask so I can KNOW. Knowledge is a good thing, and widescreen monitors are becoming more prevalent, so it would be nice to know if there is a difference.

There are people on these forums who would spend an extra $100 for a Video card that gets a few more FPS in their favorite game.

So far nobody, including me, has been able to do anything but speculate.

Therefore an answer would be nice. Not to mention you have already given at least a few minutes to this thread... why did you do that? :)
August 9, 2006 2:17:31 AM

Quote:
Just so you know, I am NOT stupid... I do understand that high-resolution gaming is indeed more intensive. Otherwise I would be able to play something like Oblivion at full res with everything turned on with my aged ATI 9600...


Yes... it has NOTHING to do with technological advancement. HDR with AF, SM3... none of that has anything to do with technology, just with the resolution...
Well... I think that you've proved your first point...
Although it does have a lot to do with res, I'm pretty sure your 256MB card can handle it... it's more to do with emerging technology. E.g., a 9600XT outperforms a 6600 but does not have support for HDR or SM3, just like the X1xxx series has SuperAF (or whatever the hell it's called), allowing it to have double the AF of the 7 series.
As I've said before, resolution is to do with the total number of pixels shown, not with the quality of the game. The reason more pixels look better has to do with an LCD's native resolution, as well as there simply being more pixels to display: something shown with 10,000 pixels at 800x600 might get 20,000 at a higher resolution.
August 9, 2006 8:23:46 PM

Get a 7950gtx and there won't be any choppiness.
August 10, 2006 12:03:20 AM

Thanks for the replies so far...
August 11, 2006 5:09:53 AM

Quote:
Get a 7950gtx and there won't be any choppiness.


Thanks for the advice, but 1) I don't have the money, 2) my Shuttle doesn't have a free slot on the correct side, 3) I don't have PCI-Express capabilities either, 4) I'm not complaining about choppiness, and 5) I'd rather stay clear of nVidia :p 

Ima wait until early 2008 for my next computer overhaul. I figure that a dual quad-core CPU setup, 4 GB of RAM, dual graphics cards with 1 GB o' RAM each and 2 TB of total drive space should be doable by then.