what is doing the most work?

xxcoop42xx

Distinguished
Aug 7, 2008
509
0
18,980
Hey guys, I'm playing games at a resolution of 1440x900 and was wondering: is that putting more load on the GPU or the CPU, or is it about equal at that res? Also, what is the aspect ratio of 1440x900? Thanks in advance for the answers.
 

alex_oneill2006

Distinguished
May 23, 2008
119
0
18,680
Hey mate, I game at the same res, and I think it's all down to what card you are running. If you have a monster like a GTX 280 or a 4870X2, then it is heavily CPU bottlenecked, so a fast CPU makes all the difference at that res. And you don't need to splash the cash to enjoy games :)

Regards,

Alex
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
It depends on what hardware you have, what game you're running, and what settings are enabled. Otherwise things wouldn't have changed much at all over the years, now would they? Since hardware changes radically, as do game engines, this varies, so it's impossible to say flat-out where your bottleneck is from the resolution alone.

As far as the aspect ratio goes, 1440x900 is a 16:10 widescreen format, and just shy of 1.3 megapixels in size.
 

xxcoop42xx

Distinguished
Aug 7, 2008
509
0
18,980
Well, right now I'm playing Gears of War. My CPU is a quad core at 2.5 GHz and I have dual 9800 GTs in SLI, but I turned off SLI so only one GPU is on, and I get a solid framerate of about 62 fps. It never goes higher than 62 even with SLI on; is the game limiting the fps?
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
It's a possibility... especially if you happen to have vsync enabled, which prevents the video card from rendering more frames than your monitor can display. Also, a lot of games ported from consoles to the PC have a built-in framerate limiter, because on the console they need one in order to run smoothly, lest spikes of very high framerate make the slow spots stand out even more.

At any rate, if you're consistently getting equal to or greater than 62 fps, there's little reason to worry; it's running fine anyway. IIRC, it's capped to 30 fps on the Xbox 360...
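
For what it's worth, a built-in limiter is conceptually very simple: the engine measures how long each frame took and waits out the rest of the frame budget before starting the next one. Here's a rough sketch of the idea in Python (the 62 fps target and the names are just mine for illustration, not anything pulled from the game):

    import time

    TARGET_FPS = 62                     # assumed cap, picked to match the ~62 fps you're seeing
    FRAME_BUDGET = 1.0 / TARGET_FPS     # seconds allowed per frame

    def run_capped(render_frame, num_frames=1000):
        # Render frames, sleeping off whatever is left of each frame's budget.
        for _ in range(num_frames):
            start = time.perf_counter()
            render_frame()                          # do the actual work for one frame
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)  # never start the next frame early

Vsync works the same way in spirit, except the "budget" is tied to your monitor's refresh interval instead of a number the developers picked.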
 

xxcoop42xx

Distinguished
Aug 7, 2008
509
0
18,980
OK, thanks for your info. I thought it was weird that two very good cards only got 62 fps, so I turned one off. It still looks way better than the 360 version.
 

ZootyGray

Distinguished
Jun 19, 2008
188
0
18,680
Graphics load varies with screen resolution: a higher resolution is a heavier load on the graphics card. Simply put, it has to draw more pixels to fill a bigger resolution, which means more detail and a clearer picture, so the GPU's work grows roughly in proportion to the pixel count as the resolution goes up.

Ratios are simple: divide the screen dimensions to get the ratio, so 1440 / 900 = 1.6 = 1.6/1 = 16/10, i.e. 16:10.
You might also want to know your monitor's native resolution. Some resolution settings are odd ratios, giving you pictures of short fat people or tall skinny people, or black bands at the top or sides of the screen. The common ratios changed between older CRTs and newer LCDs because the physical screen shape changed.
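
If you want to check the numbers yourself, it's just division and multiplication. A quick sketch in Python (the function name is mine, nothing special):

    from math import gcd

    def describe_resolution(width, height):
        # Reduced aspect ratio plus megapixel count for a given resolution.
        d = gcd(width, height)
        ratio = f"{width // d}:{height // d}"     # 1440x900 reduces to 8:5, same as 16:10
        megapixels = width * height / 1_000_000   # 1440 * 900 = 1,296,000, just shy of 1.3 MP
        return ratio, megapixels

    print(describe_resolution(1440, 900))    # ('8:5', 1.296)
    print(describe_resolution(1920, 1080))   # ('16:9', 2.0736)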
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
At the highest possible levels of detail enabled through in-game settings, and forced through the drivers:

The GPU will be doing the most work in most games, /except/ real-time strategy; in modern RTS games there is a ton of CPU calculation for physics and unit AI.

That's likely the simplest answer I can give.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310

That's probably because the 9800 GT (also known as the 8800 GT, oddly enough) happens to be vastly more powerful than the Xbox 360's graphics setup, which is closer to a GeForce 8600 GT or so.
 
