Need help choosing, torn between 21:9 and 16:9

john _k

Honorable
Feb 11, 2013
14
0
10,520
Hey guys,

I'm planning my very first PC build and I'm stuck on choosing a monitor that will give me the best experience. The ultrawide I'm looking at is the LG 29UM68 (2560x1080). It's 60 Hz (75 Hz only with FreeSync). I'm planning to get a GTX 980 Ti for my build, so the monitor's refresh rate would cap my fps. I don't know whether to go with the 60 Hz ultrawide or with a 1920x1080 (16:9) monitor at a much higher refresh rate. I would appreciate some guidance on this dilemma.

Thanks,
 
Solution
Nvidia is for G-Sync, not FreeSync. Either go for an AMD card with a FreeSync monitor, or for a G-Sync monitor with an Nvidia card.
If you post your build, we can perhaps suggest a good monitor.

KindaHardcoreGamer

Reputable
Jan 9, 2016
258
0
4,790
I personally hate ultrawide monitors, to the point where I don't know why they exist. Therefore, I must suggest the 16:9 monitor. You'll love the higher refresh rate.
Instead of the 980 Ti, you should get a GTX 1070 or 1060. Better price/performance.
 

Nymical

Reputable
Sep 20, 2015
122
0
4,760
A refresh rate much higher than 60 Hz, especially across such a large viewing area, is not that significant to the human eye; only experienced FPS gamers can even notice the difference in-game. Besides, you would be much better off getting a 10-series GTX at this point, considering they are all cheaper than a 980 Ti and the 1070 and 1080 both heavily outperform it. The 1060, on sale soon, is less than $300, is still in the same league as the Ti, and would be a better fit for that monitor anyway. Good luck, whatever you choose.
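For what it's worth, the "diminishing returns" side of this argument comes down to frame-time arithmetic: each step up in refresh rate shaves fewer milliseconds off how long a frame sits on screen. A quick sketch (my own numbers, just illustrating the math):

```python
def frame_time_ms(hz: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / hz

# Each doubling-ish of refresh rate saves fewer milliseconds per frame.
for low, high in [(30, 60), (60, 144)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} Hz -> {high} Hz: "
          f"{frame_time_ms(low):.1f} ms -> {frame_time_ms(high):.1f} ms "
          f"(saves {saved:.1f} ms per frame)")
```

Going 30 to 60 Hz saves about 16.7 ms per frame; going 60 to 144 Hz saves only about 9.7 ms, even though the Hz jump is bigger.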
 

CropEditPaste

Honorable
May 13, 2016
235
0
10,760
Most gamers (myself included) would definitely take the 16:9 monitor over the 21:9 because of the refresh rate and also the response time (from PC to monitor). If you want the ultrawide monitor, go for it. The choice is yours, but I'd recommend 16:9 for gaming.
 

Mouldread

Distinguished
Apr 17, 2013
985
0
19,360
Nymical.....I bet you've never used a monitor with a refresh rate higher than 60 Hz, otherwise you would never have said such a thing. You don't need to be a professional gamer, or a gamer at all, to notice the difference. Even just dragging a window around on your desktop is a very different experience on a 120/144 Hz monitor than on a 60 Hz one.
 
I can't see a 980 Ti in any build today ... the 1070 is faster

[Attached chart: relative GPU performance at 2560x1440]


I wouldn't match an AMD FreeSync monitor with an Nvidia (G-Sync) card. The main difference between G-Sync and FreeSync is ULMB: G-Sync monitors have it, FreeSync ones don't.

I'd do a G-Sync monitor at 144 Hz. At 1080p, in most games, you'll be able to run it in ULMB mode.

http://pcpartpicker.com/products/monitor/#A=1&r=192001080


 

Nymical

Reputable
Sep 20, 2015
122
0
4,760


That's why I said in-game. I don't mean to sound disingenuous, but while focused on everything else happening, people don't much notice frame rates at high speeds. I recall an experiment (performed by PC Gamer, I believe) where, to compensate for the increased attention people pay in explicit "tell the difference" frame-rate experiments, about 30 people (familiar with games but not MLG) were simply sat down at computers and told to play while someone in the background raised and lowered the frame rate. Afterwards, they were asked if they noticed anything peculiar about the display, and no one did. You would always notice it in a sterile environment; however, when actually embroiled in what matters most, it does not make an obscene, conscious difference.

As for my not having used high-refresh monitors: I have, and do on a regular basis. A friend of mine has a nice tri-monitor WQHD setup he paid at least 3k for, and he lets me try out his rig occasionally to see how games run at best settings (I get 60 FPS on medium settings in most of my games, whereas he runs High presets at about 120 FPS), and his monitors are 144 Hz. I immediately notice the difference from my humble PC every time, but once I get into things, it really doesn't make anything prettier or improve playability; the mind still absorbs and follows the environment equally well. Whereas with a widescreen monitor you have more to see, your peripheral vision is not distracted, and it can therefore be more immersive.

Maybe some people can sit on their 144 Hz high horse, but I find it to be more of a "Bombur's fat pony" situation. Not to add any more vitriol to this, lol. Sorry, mould.
 

Nymical

Reputable
Sep 20, 2015
122
0
4,760
To be totally honest, you would probably be better off getting a WQHD monitor at 60 or 100 Hz, or else going dual-monitor with 16:9 screens; even the cheaper 144 Hz models are too expensive...
Though for sure you should get a GTX 1060 or 1070 over a 980 Ti.
 

Nymical

Reputable
Sep 20, 2015
122
0
4,760


+1 for the links, but unless you already have a high-FPS monitor on hand (which I do not, *sigh*), it is impossible to view the differences accurately. At lower frame rates, increases are much more pronounced, while at higher ones it is hard to see unless you are focusing on it, which is what I was saying. There is one basic problem with all the "see what a high refresh rate monitor would be like" programs: you need a high refresh rate monitor to actually view them at scale.
 

Nymical

Reputable
Sep 20, 2015
122
0
4,760


Yes, but what I am saying is that human perception of light and clarity is roughly logarithmic in scale. By scaling everything down to the display's speed, you are effectively comparing the two slowed-down speeds. It does not matter if a video was rendered at 600 FPS: if you slow 60 Hz and 144 Hz footage down to fit on a 60 Hz screen, you are viewing a de facto 25 and 60 rendered FPS. As FPS increases, the gain in perceived clarity per added frame decreases, so what is a large difference in clarity between 25 and 60 is much less of a difference between 60 and 144. The problem is scale: it occurs linearly in the program but logarithmically in human perception. So you need a monitor of the same speed you are trying to compare against to get an accurate representation of the difference in clarity. Truly, the actual blur scales linearly with FPS, but at higher speeds human perception does not have the time or ability to take all of it in.
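The 25-vs-60 figure in the post above falls out of simple ratio arithmetic: to show both streams side by side on a 60 Hz display, both must be slowed by the same factor so the faster one fits. A minimal sketch (my own framing of the poster's numbers):

```python
DISPLAY_HZ = 60  # the fastest display assumed available for viewing the comparison

def effective_fps(source_fps: float, fastest_source: float,
                  display_hz: float = DISPLAY_HZ) -> float:
    """FPS actually shown once both streams are slowed by the same factor
    (display_hz / fastest_source) so the fastest stream fits the display."""
    return source_fps * display_hz / fastest_source

print(effective_fps(144, 144))  # 60.0 -- the 144 Hz stream fills the display
print(effective_fps(60, 144))   # 25.0 -- the 60 Hz stream becomes a de facto 25 FPS
```

So a 60 Hz vs 144 Hz comparison viewed on a 60 Hz screen is really a 25 FPS vs 60 FPS comparison, which is the poster's point about needing a monitor as fast as the thing being demonstrated.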
 
Well, you're right: without ever having sat in front of one, you are not in a position to tell. The links provided are intended to **simulate** what you would see and provide as close an approximation as possible, but no, it is not exact. The strobe test, however, has nothing to do with what you are describing. Look at the still shot before starting the video: 30, 60, 120 Hz is meaningless there, it's a still. When I look at that link, I can clearly see both 47 and 46 at the same time.

Take a 48" TV, hit the pause button, and walk six feet away: it looks like a watercolor painting that got wet and the paint ran. Everything has "trails".

OTOH, I have done side by side tests with the following:

Twin 970's w/ 1440p, IPS, 144 Hz Screen 3 ms (real) lag
Twin 980 Tis w/ 3440 x 1080 screen, 75 Hz, TN, 5 ms (advertised)

Sitting down at the widescreen, the first thing that hits you is the sense of immersion provided by the wider curved screen. As time goes on, however, the loss of fluidity becomes noticeable and grows annoying at times; things in your peripheral vision can, surprisingly, be more annoying, as the change in angle of moving objects is smaller. Going back to the 1440p screen, it really "popped": slight color variations were better defined and detail more pronounced, due in part no doubt to the superior color of the IPS screen. Change to 60 fps and those subtle color differences are often harder to distinguish.

If ya want an accurate explanation of the science associated with this phenomenon, you can read it here:

http://www.blurbusters.com/faq/oled-motion-blur/