Why is a GTX Titan 3-way SLI not enough for the latest games on ultra settings?

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815
Hello there everyone,
I was just taking a look at this review, which compares a GTX Titan 3-way SLI to a GTX 680 3-way SLI and a 7970 3-way CrossFire.
I know this may be due to some optimization problem with the game engine or whatever, but why would such a setup not always max out some of the latest games at the highest resolution and detail settings? I mean, what do game developers and testers use when they test a game with everything on ultra at the highest resolution? What systems do they have? Or do they just trust some theoretical calculation and never actually test the game at those resolutions and settings?
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815


So it depends not only on graphics, but on other factors like the CPU, motherboard, RAM...? Is AA CPU-heavy?
 
For gaming, the most important factors are CPU and GPU performance. RAM will not impact performance as long as you have plenty available for your system. The motherboard will not impact gaming performance at all. As for storage, using an SSD can shorten loading times compared to an HDD, and I've heard that in some cases an SSD helped reduce stuttering in games.

As for the games themselves, some games can be much more demanding on the CPU, the GPU, or both.
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815


Well, the tests were performed with an i7-3770K overclocked to 4.8GHz. Isn't that enough?
They even got a difference of 20fps depending on the system, and in each case the CPU was the same while the GPUs changed... so I guess what makes the difference here is the GPU, and at that level of detail the CPU is doing its best just to keep up with the graphics cards. So testers have more powerful systems, don't they? Maybe a socket 2011 system with an overclocked i7-3970X? Though not all the cores would be used...
 
The major problem when using 3 or more GPUs is not a lack of raw performance. The problem is getting three discrete GPUs to work together so that performance scales properly with the available raw power. With two cards we can see up to a 90% performance increase if the drivers are well optimized for the game, but adding a third card often increases performance by only 20-30% more, and in some cases a third card brings no increase at all. Worse, if the CPU can't keep up with the GPUs, we might see negative scaling in games (worse performance than two cards). As for the CPU, there is a possibility that even 4.8GHz might bottleneck three Titans. HardOCP ran into this very problem when benchmarking 3x GTX 580 with an i7-920 overclocked to 3.6GHz (they pitted it against a 6990 + 6970). After listening to reader feedback they re-benched the cards with an overclocked 2600K (I don't remember the exact clock, but it was in the 4.6-4.8GHz range) and found that their previous i7 really had been holding the three 580s back from showing their true colors.
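The diminishing-returns pattern described above can be sketched with some quick arithmetic. The scaling percentages and FPS numbers here are made up for illustration; real scaling varies per game and driver:

```python
# Rough illustration (hypothetical numbers) of how multi-GPU scaling
# compounds: each extra card contributes a diminishing fraction of one
# card's performance, and a CPU bottleneck caps the total.

def effective_fps(single_card_fps, scaling_per_card, cpu_cap=None):
    """Sum each card's diminishing contribution, then clamp to the CPU limit."""
    total = single_card_fps * sum(scaling_per_card)
    return min(total, cpu_cap) if cpu_cap is not None else total

# One card = 100%; a well-optimized 2nd card adds ~90%; a 3rd only ~25%.
scaling = [1.0, 0.9, 0.25]

print(round(effective_fps(40, scaling), 1))     # 3 cards, no CPU limit -> 86.0
print(effective_fps(40, scaling, cpu_cap=70))   # CPU-bound: the 3rd card's extra power is wasted
```

If the CPU caps the frame rate below what two cards already deliver, the third card shows zero or even negative benefit, which matches the HardOCP anecdote.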
 
Some developers tend to push system requirements to the limit. It's just how they want to design their games. Crysis was pretty brutal on your PC when it first came out in 2007, and it is actually still used as a benchmark. Metro 2033 is a more recent game that is brutal on your system.

Why? Notoriety, perhaps. How many games can you name from 2007 (or before) that are still being used to benchmark graphics cards today? I can easily see Metro 2033 being used to benchmark graphics cards 4 or 5 years from now.
 


People seem to think that PC games are meant to be played on Ultra settings with SGSSAA x4 or Ubersampling, advanced DoF, or whatever other setting is offered. If they can't use every setting, the game is "poorly optimized."

That is a very costly view. It will always lead to frustration.

PC games are created with options, and sometimes with new features to experiment with. Some are built on top of a game engine the studio is selling to other devs for future games. Devs have also found that if they offer some extreme settings, their games get placed in benchmark suites by all the review sites and constantly get free advertisement. As gamers, if a game has settings beyond today's systems, we go back to those old games in a few years and see how the latest hardware handles them. You also have to realize that the benchmarks you showed were at an extreme resolution. Then there are the AA settings: if they make the game work at 2x or 4x MSAA, they can allow 8x MSAA with no extra work, so they do.

As a PC gamer, it is up to us to pick and choose the settings we want for the system we have. These aren't consoles, where everyone has the same system, and they won't ever change. If these "poorly optimized" games were created for the console, they'd strip those higher end settings away, taking away the choice of higher IQ. There is nothing wrong with giving us options.

Take the same system, use a single, standard 1080p resolution, and everything in the benchmarks would change. Use no more than 4x MSAA, and things change again. People get too hung up on having to max out everything offered. There is no requirement for that, and just because an option is there doesn't make the game "poorly optimized".
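The gap between a triple-monitor setup and standard 1080p is easy to quantify with pixel counts. This is only back-of-envelope math (real cost depends on the engine and hardware), with MSAA sample count used as a rough proxy for extra fill-rate and memory work:

```python
# Why 5760x1200 with high MSAA is so much heavier than plain 1080p.

res_1080p = 1920 * 1080          # 2,073,600 pixels
res_surround = 5760 * 1200       # 6,912,000 pixels

print(round(res_surround / res_1080p, 2))  # -> 3.33: over 3x the pixels per frame

# Coverage samples the GPU must resolve per frame at different MSAA levels
# (a crude proxy for the added cost; real engines vary):
for msaa in (1, 4, 8):
    print(msaa, res_surround * msaa)
```

So before AA even enters the picture, the surround setup is pushing more than three times the pixels of a 1080p monitor every frame.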
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815




I get your point. Maybe I didn't make myself clear, but I always assumed that a system like that would max out anything you threw at it, which is why I asked myself, and then you guys, what in the world it takes to run those games at the highest settings and resolution. The Titans handled it pretty well, no wonder why, but I never thought 3x 680s or 3x 7970s would have a hard time with those games... that's why I asked, just out of curiosity. :)
 


The point is, PC gaming is open ended and just because a setting exists, doesn't mean it is meant to be used on a system using 3 monitors today. The setting might have just been an experiment to begin with, and rather than removing it, they left it there for us to mess around with. I doubt they intended people to actually use it in normal game play at 5760x1200 resolutions for a few years.

You'll notice that PC games recommend systems not capable of using everything maxed out. They expect you'll use High settings with reasonable to no AA.
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815
And thanks everyone for clearing up my doubts. I've learned a couple of things today: a multi-GPU setup doesn't always result in performance gains, games still look awesome at the details my GTX 285 can handle, and to max out games you've gotta be disgustingly rich, haha.
 


A lot of people say they max out games but don't. You can do that on a more reasonable budget :p They get away with saying it by using Ultra while turning off experimental features and AA; a lot of people don't consider those part of the normal settings. They don't play at 5760x1200 either.
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815


You're right, plus who could afford a rig like that? They did the benchmarks just for the benchmarks' sake. For example, could a single 7970 with 3GB of VRAM handle a 5760x1200 resolution? Not maxed out, just smooth gameplay.
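For a sense of why VRAM comes up at that resolution, here is a rough, purely illustrative estimate of raw buffer memory at 5760x1200. Real VRAM use is dominated by textures and other assets, so these numbers are a floor, not a prediction of actual usage:

```python
# Raw render-target memory at 5760x1200 with 32-bit color (RGBA8).

width, height = 5760, 1200
bytes_per_pixel = 4  # 8 bits each for R, G, B, A

framebuffer_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(round(framebuffer_mb, 1))  # one color buffer: ~26.4 MB

# Double/triple buffering plus a depth buffer multiplies this, and 4x MSAA
# stores 4 samples per pixel for the multisampled targets:
msaa4_mb = framebuffer_mb * 4
print(round(msaa4_mb, 1))  # ~105.5 MB for a single 4x MSAA color target
```

The buffers alone are small next to 3GB; it's high-resolution textures plus MSAA across that many pixels that eat the rest.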
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815


Take me, for example. I play BF3 on ultra textures but I keep AA off because I lose 20fps during gameplay. Not worth it.
 
As long as the game is fun, and looks good, it doesn't have to be maxed out. That said, it is pretty cool to see what it can look like. Though your 285 is limited to the DX9 or DX10 version of the game. Back in the day, I'd turn all the settings up, get 10 FPS and just ooh and ahh over what it looked like. Now I get near max settings in most games, and it looks awesome with the top settings offering less and less added visual improvements.

Due to my simulator sickness issue, I always play at 60 FPS or higher. Typically 80+ in 1st person games, which make me sick the easiest.
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815
I get 40 to 60 fps in BF3 and I'm always under 85% GPU load because of CPU limitations. I'm waiting for Haswell for my next upgrade, and maybe I'll go for a high-end GPU and a 120Hz monitor. A high framerate rocks. I've never had the chance to see 80+ fps in a game. Must be fun!
 


My reasons for 80+ FPS are more about avoiding sickness in first-person games. It doesn't seem much better than 60, other than the lack of nausea after 30-60 minutes of play. The most fun aspect of the 120Hz monitor is 3D Vision, though at 60 FPS it still gives me nausea after 30-60 minutes, so I take breaks.
 

j0ndafr3ak

Distinguished
Feb 11, 2012
409
1
18,815


That might be why I feel kinda dizzy after a while. So you feel sick when you play at 60fps... but how can that be different on a 60Hz monitor? What changes? Whatever is past 60fps, the monitor does not display...
 


The problem is either latency, uneven frame delivery, or a particular gap between some frames that causes me nausea. All of these are helped, to varying degrees, by higher FPS. This is something that affects me, and it affects me most in first-person games when I'm controlling the view with a mouse. I believe it is most likely latency, as watching cutscenes at 30 FPS has no negative effect on me, even without motion blur (assuming real-time animated cutscenes have none). It seems to require that I control the view in a very connected way: a mouse causes me issues when controlling the view, but a joystick or controller does not.

My monitor is 120Hz, but I have not really tested it at 60Hz with v-sync off to see whether the added FPS helps there or not.
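The frame-time math behind the 60 vs 80+ FPS distinction is simple: each step up in FPS shaves milliseconds off the time between frames, which also shrinks the worst-case delay between input and display. A quick sketch:

```python
# Time between frames at a given frame rate.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 80, 120):
    print(fps, round(frame_time_ms(fps), 2))
# 30 -> 33.33 ms, 60 -> 16.67 ms, 80 -> 12.5 ms, 120 -> 8.33 ms
```

Going from 60 to 80 FPS only saves about 4 ms per frame, but if latency or uneven delivery is what triggers the nausea, those few milliseconds (and the steadier pacing a GPU with headroom provides) can plausibly matter.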
 

klepp0906

Distinguished
Apr 29, 2013
150
0
18,710


Didn't learn much, then. A multi-GPU setup does ALWAYS result in performance gains, lol. I guess I could create an exception by using a super low resolution with a super outdated CPU; even then you would likely still see a performance gain.

Sorry bud, you got a lot of idiotic responses that were either off topic or just plain wrong. I call 'em like I see 'em.
 
