How do I get 120fps?

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690
I'm using a PC to play on my 60" Sharp Aquos LED TV. The TV says it's capable of 120 Hz and 1920x1080 resolution.
But for some reason, when I use "detect" to find my resolution in Windows, it recommends something like 1750x1000... and I have black bars on my screen.
If I select 1920x1080, my screen is zoomed in.
Why is this happening?
Also, the recommended resolution it gave me for Fallout was 1200x800, but there was an option for 1600x1200, although when I clicked it the game ran zoomed in.. WTF



Also, my GTX 670 is more than enough to play most games at 100+ fps easily, but for some reason my TV shows it's running at 60 Hz (when it pops up upon starting a game).

Thanks a lot.



 

trogdor796

Distinguished
Nov 26, 2009
998
0
19,160
I'm not sure about the resolution problem, my guess would be that it has something to do with a setting on the TV for the aspect ratio, maybe set it to "just scan" instead of 16:9, or vice versa. You could also try looking in the Nvidia control panel for a setting that has to do with connecting to an HDTV.

As for the TV only displaying 60 Hz, I can explain that for you. Your TV probably does say it's 120 Hz, like most TVs today. However, TVs don't work the same way as computer monitors. While monitors can actually accept and display a true 120 Hz signal, TVs simply take a 60 Hz signal and use something called frame interpolation to make it look smoother. This is used for movies and TV programs, but not for games, as it causes massive input lag due to the processing.

TLDR: Only computer monitors labeled as 120 Hz can display 120 frames. TVs can only display 60, no matter what the box/specs say.
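
To picture what that interpolation is doing, here is a minimal Python sketch using plain frame blending; real TVs use motion-compensated estimation, so this only illustrates where the extra frames come from (they are generated inside the TV, not sent by the PC):

# Naive "120 Hz" interpolation: 60 real frames/s come in, and a synthesized
# frame is inserted between each pair. Frames here are just lists of 0-255
# pixel values to keep the illustration tiny.

def blend(frame_a, frame_b):
    """Average two frames pixel-by-pixel (very crude stand-in for motion estimation)."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def interpolate_to_120hz(frames_60hz):
    """Turn a 60 fps frame list into a ~120 fps list by inserting blended frames."""
    out = []
    for cur, nxt in zip(frames_60hz, frames_60hz[1:]):
        out.append(cur)              # the real frame from the source
        out.append(blend(cur, nxt))  # the synthesized in-between frame
    out.append(frames_60hz[-1])
    return out

# Example: 3 tiny one-pixel "frames" become 5 displayed frames.
print(interpolate_to_120hz([[0], [100], [200]]))  # [[0], [50], [100], [150], [200]]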
 

benski

Distinguished
Jun 24, 2010
1,611
0
19,960
^+1 on the 120 Hz, TVs don't have inputs that can actually accept a 120 Hz signal.

Are you using the analog RGB (VGA) connection like your TV recommends for PC connection, or are you using HDMI? I would use the RGB input, set it to 1080p, and try using your TV's auto-sync function to resize the image to fit the screen if it looks zoomed in.
 

mikes1992

Honorable
Mar 27, 2012
292
0
10,810
I had that problem. I've been using my TV as a monitor for years. All you have to do is change the aspect ratio settings, if your TV supports the right ratios anyway. For 1920x1080 I use "Just scan", for 1776x1000 I use "Cinema zoom", and for 1600x1200 I use "Original".

The recommended resolutions don't really mean anything. Some games have profiles containing optimal settings for hardware, but since you're using a 670, most games won't be able to give recommendations and will most probably default to low or medium settings.
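
A rough sketch of the aspect-ratio arithmetic behind those mode choices (the mode names above are just what that particular TV calls them):

# The panel is 16:9 (1920/1080 ~= 1.778), so anything with a different shape
# has to be letterboxed, stretched, or zoomed by the TV's scaler.

panel = 1920 / 1080

for w, h in [(1920, 1080), (1776, 1000), (1600, 1200)]:
    ratio = w / h
    note = ("same shape as the panel" if abs(ratio - panel) < 0.01
            else "different shape -> bars, stretch, or zoom")
    print(f"{w}x{h}: aspect {ratio:.3f} ({note})")

# 1920x1080: aspect 1.778 (same shape as the panel)  -> 1:1, "Just scan"
# 1776x1000: aspect 1.776 (same shape as the panel)  -> scaled up to fill
# 1600x1200: aspect 1.333 (different shape -> bars, stretch, or zoom)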
 

mikes1992

Honorable
Mar 27, 2012
292
0
10,810
As for the 120 Hz, I wouldn't bother... 60 Hz is just fine, and like trogdor said, I wouldn't use any frame-smoothing processing while gaming. It causes lag and in some cases can make the picture look very choppy.
 

voiidwulf

Distinguished
Jun 11, 2012
903
0
19,010
I assume you are using HDMI to connect?

I use a 22-inch 1920x1080 60 Hz HDTV as my monitor, and I have to change the overscan/underscan options or I either get black bars or a zoomed-in picture.

For AMD cards it's in the HDTV settings, but I've never owned an Nvidia card, so I'm not familiar with the drivers.
 

blakwidowrsa

Honorable
Aug 10, 2012
277
0
10,810



:non: OK, OK, hold your horses. :non:

TLDR: Only computer monitors labeled as 120 Hz can display 120 frames. <-- :pfff: partly correct. FACT: it's because only computers generally generate frame rates above 60 that computer monitors came to be designed for high refresh rates.
TVs can only display 60, no matter what the box/specs say. <-- :fou: incorrect. FACT: whether your display's internal refresh rate is 120 Hz or some other rate, the incoming signals run at frame rates determined by their sources. That typically means 30 Hz for interlaced formats like 1080i, 60 Hz for progressive formats like 720p or 480p, or 24 Hz for players that support 1080p/24. And what do you know... 24, 30, and 60 all divide evenly into 120, so the TV only needs to refresh at one preset frequency to handle all the standard inputs: HDMI (30/60), DVI (60, or 120 over dual-link), coaxial (30/24), DisplayPort (60/120), and so on.
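
To make that arithmetic explicit, a small sketch (the source labels are just common examples):

# A single internal refresh of 120 Hz can show every common incoming frame
# rate without judder, because each source frame maps to a whole number of
# panel refreshes. None of this means the TV's *input* accepts 120 Hz.

panel_hz = 120
source_rates = {"1080p/24 Blu-ray": 24, "1080i broadcast": 30, "720p/480p/PC": 60}

for name, hz in source_rates.items():
    repeats = panel_hz / hz
    even = "even" if repeats.is_integer() else "uneven -> judder"
    print(f"{name}: {hz} Hz -> each frame shown {repeats:.1f} times ({even})")

# 24 Hz -> each frame shown 5 times, 30 Hz -> 4 times, 60 Hz -> 2 times.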

This is the reason non-standard resolutions don't display properly on certain TVs, and why an overscan/underscan setting needs to be set manually at the source (i.e. the computer): a 1600x1200 60 Hz HDMI signal will be seen as progressive, and the TV will try to fit and scale 1600x1200 into 1920x1080 (1080p/720p/480p), which is what produces the black bars or the zoomed-in picture.
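
A rough sketch of the two ways a 16:9 panel can handle that 4:3 signal; the numbers below are pure geometry, not any particular TV's behaviour:

src_w, src_h = 1600, 1200        # 4:3 source
panel_w, panel_h = 1920, 1080    # 16:9 panel

# Option 1: fit inside the panel (preserve aspect, add black bars at the sides)
scale_fit = min(panel_w / src_w, panel_h / src_h)        # 0.9
fit_w, fit_h = src_w * scale_fit, src_h * scale_fit      # 1440 x 1080
bars = (panel_w - fit_w) / 2                             # 240 px each side
print(f"Fit: {fit_w:.0f}x{fit_h:.0f} with {bars:.0f} px black bars per side")

# Option 2: fill the panel width (preserve aspect, crop top/bottom = "zoomed in")
scale_fill = max(panel_w / src_w, panel_h / src_h)       # 1.2
fill_w, fill_h = src_w * scale_fill, src_h * scale_fill  # 1920 x 1440
crop = (fill_h - panel_h) / 2                            # 180 px lost top and bottom
print(f"Fill: {fill_w:.0f}x{fill_h:.0f}, cropping {crop:.0f} px top and bottom")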

Settled then. :lol: Try using standard resolutions whenever possible on 'standard' equipment like TVs. Why do you think the Xbox 360 only offers standard HDMI/VGA resolutions, and no resolution settings at all when composite (AV) is used?
 

trogdor796

Distinguished
Nov 26, 2009
998
0
19,160
Unless you are watching a movie with TruMotion or whatever setting a TV uses for its "120 or 240 Hz", a TV cannot display over 60 frames. There is no possible way to hook a PC up to one and display 120 frames. HDMI doesn't carry it, and no TVs with DVI or DisplayPort exist. TVs simply can't display a signal over 60 Hz from their source without using frame interpolation. That's the way I thought it was.
 

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690
Oh, is it that the 1600x1200 resolution isn't compatible with 1920x1080 in screen ratio...

I kind of understand what you guys are saying, but my TV has DVI and VGA ports. Would they be better? Because as far as I know, they are all the same except DVI can support 2560x1600.

Also, are converters bad? Like VGA-to-DVI converters?

Thanks.
 

blakwidowrsa

Honorable
Aug 10, 2012
277
0
10,810


Plain converters are fine. Remember, at the end of the day it's basically just a different socket and pinout for the same signals: red, green, and blue video with their returns, two synchronisation channels (horizontal and vertical), plus +5V and ground. A DB15 D-sub port has 15 pins but only needs about 10 of them, and the plug's metal housing is earthed to the housing at the other end, effectively acting as one big neutral.
The problem with converters is that they wear out and stop making good connections later in life, and some can also lose pins, like an LCD monitor's data sub-channel (DDC/EDID), which is sometimes lost through a cheap DVI-male to VGA-male adapter.
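
For reference, the commonly documented VGA (DE-15) pin roles, quoted from memory as a sketch only, so double-check a real pinout chart before relying on any single assignment:

# A passive DVI-I -> VGA plug simply re-routes the DVI connector's analog RGB,
# sync, return, and DDC contacts onto these pins; nothing is converted.
VGA_PINS = {
    1: "Red video",       2: "Green video",      3: "Blue video",
    5: "Ground",          6: "Red return",       7: "Green return",
    8: "Blue return",     9: "+5V (DDC power)",  10: "Sync return",
    12: "DDC data (SDA)", 13: "Horizontal sync",
    14: "Vertical sync",  15: "DDC clock (SCL)",
}

for pin, role in sorted(VGA_PINS.items()):
    print(f"pin {pin:2d}: {role}")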

It's like when you plug a monitor (NOT a screen/TV/CRT) into the DVI port of the gfx card with a DVI cable: you'll notice it detects the native resolution of the monitor, and switching between display modes is much faster than with VGA, which needs to auto-adjust every time the display mode changes.



Why don't you switch the TV on, grab a DVI-to-DVI cable and plug the TV in first, then when you plug it into the computer the gfx card will detect a new display and should pick up its name and native resolution... what does that produce? What resolution does it detect?
 

mathew7

Distinguished
Jun 3, 2011
295
0
18,860


The zooming problem is strange (as in "why?")... When you output a 1920x1080 image, you should get 1:1 pixel display... but all TVs apply a slight zoom by default to eliminate the edges that may not be part of the image (maybe because of the upscalers?). I don't know why they do it at 720p or 1080p, but all of them do. It's called overscan. On newer TVs there may be an option to disable it (my Panasonic offers this option when I have the 16:9 aspect ratio selected).
Using a PC-dedicated port may also disable it. But this is all model-dependent (the same manufacturer may allow it on some models and not on others).



A DVI-to-VGA converter is indeed a passive device, but that converter will NEVER send a digital signal, just as an HDMI-to-DVI converter will never send an analog signal (which is why they output nothing when both are used together). To convert analog to digital (or the reverse), an active converter (a chip) is required. PS: the DVI connector allows for both VGA signals and digital signals; the digital signals were later taken into the HDMI specs.
All pins are susceptible to wear (no difference between VGA and DVI). The VGA cable has the same EDID pins as DVI, so the maximum resolution is identified in the same way.
The VGA auto-adjust you mention happens because VGA only provides one line-break signal and one frame-break signal, so a digital display device (LCD/plasma) must work out the pixel timing by itself (this isn't really needed for CRTs). Digital DVI and HDMI carry an explicit clock tied to the pixel stream, so the display knows exactly where every pixel starts, whereas VGA only has a pulse for "end of line". The auto-adjust is the monitor establishing the margins of the VGA image so it can display it properly. It has nothing to do with the GPU identifying the monitor/TV.
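
Some rough numbers behind that, assuming the usual CEA 1080p60 timing (2200x1125 total including blanking):

total_w, total_h, fps = 2200, 1125, 60

pixel_clock = total_w * total_h * fps    # ~148.5 million pixel clocks/second
hsync_per_s = total_h * fps              # ~67,500 line pulses/second
vsync_per_s = fps                        # 60 frame pulses/second

print(f"DVI/HDMI pixel clock : {pixel_clock / 1e6:.1f} MHz (explicit timing per pixel)")
print(f"VGA timing references: {hsync_per_s:,} hsync + {vsync_per_s} vsync per second")
# Over VGA the display has to infer ~2200 pixel positions between each pair of
# hsync pulses -- that inference is what the "auto-adjust" is doing.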
 

mathew7

Distinguished
Jun 3, 2011
295
0
18,860


As others said, no TV has a 120 Hz input. I'm not sure about 3D TVs, but even there, 120 Hz implies a 3D signal.
As for 1750x1000: the driver offers that resolution to combat overscan (1750 + 10% = 1925, which means it compensates for roughly 10% of overscan... your TV may do just 3% or 5%). Every TV has a different overscan level. In ATI/AMD drivers I can choose any overscan compensation from 0% to 10% (and see the effect immediately). But no matter how you do it, you'll never get 1:1 pixel mapping to your TV while overscan is active. My Panasonic will show the disable option only when 16:9 is selected (yes, I said this before, but I wanted to add that auto-aspect hides it... I need specifically 16:9).
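
The arithmetic behind those odd driver resolutions, with the overscan percentages below as examples only:

# Pick an overscan percentage, shrink the desktop by that much, and the TV's
# built-in zoom blows it back up to roughly the panel size.

panel_w, panel_h = 1920, 1080

for overscan in (0.03, 0.05, 0.10):
    w = round(panel_w / (1 + overscan))
    h = round(panel_h / (1 + overscan))
    print(f"{overscan:.0%} overscan -> render at about {w}x{h}, "
          f"TV scales it back to ~{round(w * (1 + overscan))}x{round(h * (1 + overscan))}")

# 10% overscan -> about 1745x982, which is where resolutions like 1750x1000
# come from. Whatever the percentage, the image gets rescaled, so you never
# see 1:1 pixel mapping until overscan itself is turned off on the TV.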
 

mathew7

Distinguished
Jun 3, 2011
295
0
18,860


His problem is the TV, since no non-3D TV will accept a 120 Hz signal, and 3D TVs will accept 120 frames/s only as 60p 3D video.
PS: An HDMI 1.3 Category 2 cable already has the required bandwidth for 1080p120.
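
A back-of-the-envelope check of that cable claim, assuming CEA-style 1080p blanking (2200x1125 total) and 8-bit colour:

total_w, total_h, fps = 2200, 1125, 120
bits_per_pixel = 24                                  # 8 bits x 3 channels

pixel_clock = total_w * total_h * fps                # ~297 MHz
tmds_bps = pixel_clock * bits_per_pixel * 10 // 8    # TMDS puts 10 bits on the wire per 8-bit value

print(f"1080p120 pixel clock: ~{pixel_clock / 1e6:.0f} MHz")
print(f"TMDS data rate      : ~{tmds_bps / 1e9:.1f} Gbit/s")
# HDMI 1.3 Category 2 cable is rated for 340 MHz / 10.2 Gbit/s, so the cable
# itself has the headroom -- the limitation is the TV's input, not the wire.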
 

benski

Distinguished
Jun 24, 2010
1,611
0
19,960

You should use VGA because that's how your TV manufacturer intended for you to hook up a PC, and using that port will disable the post-processing and overscan options that you don't want turned on. At least, that's how my Vizio works.

 

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690
Is there frame interpolation on monitors... or does a monitor just magically not have it because it's a monitor?


Also, why is there mention of dual DVI? What does dual DVI do? I thought you only need one for 2560x1600 or lower resolutions... what do two accomplish? Especially for 1920x1080.
Also, my TV has 1 DVI and 1 VGA, that's it.
My GPU has 2 HDMI and 2 DVI (one looks like DVI, but my DVI cable can't fit into it, and it's not VGA), so I guess it's 1 DVI?

I have an MSI Power Edition/OC GTX 670.
This my TV-- http://www.sharpusa.com/ForHome/HomeEntertainment/LCDTV/Models/LC60C6400U.aspx

 

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690



I found the second DVI slot's classification, but is there a special cable to link the two? Also, my TV only has 1 DVI and 1 VGA, so is anything lost by converting?
Also, what is the advantage? More bandwidth? But isn't an HDMI cable enough for 1080p @ 120 fps?

If I do buy a 1920x1080 monitor at 120 fps or more, would I need dual-link DVI, or would single-link DVI be fine? And would HDMI be fine?
 

benski

Distinguished
Jun 24, 2010
1,611
0
19,960
It's dual-link DVI, not dual DVI; there aren't 2 cables, just a single cable with more pins in it than a single-link cable. And yes, if you bought a 120 Hz monitor you would need a dual-link DVI cable. No, an HDMI cable isn't enough for 1080p @ 120 fps. 120 Hz frame interpolation is a feature that costs extra to add to a TV and would be of no value in a PC monitor.
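
Why the dual link matters, as a rough sketch: single-link TMDS tops out around a 165 MHz pixel clock, dual-link doubles the data pairs, and the blanking figures below are only an estimate:

def pixel_clock_mhz(w, h, fps, blank_w=80, blank_h=45):
    """Approximate pixel clock with reduced-blanking style margins."""
    return (w + blank_w) * (h + blank_h) * fps / 1e6

single_link_limit = 165.0   # MHz, the usual single-link DVI ceiling
for fps in (60, 120):
    clk = pixel_clock_mhz(1920, 1080, fps)
    verdict = "single-link is fine" if clk <= single_link_limit else "needs dual-link"
    print(f"1920x1080 @ {fps} Hz -> ~{clk:.0f} MHz pixel clock ({verdict})")

# 1920x1080 @ 60 Hz  -> ~135 MHz (single-link is fine)
# 1920x1080 @ 120 Hz -> ~270 MHz (needs dual-link)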
 

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690


So if I had a monitor, I would need dual-link DVI to run 1080p @ 120 fps?
Sorry, could you explain dual-link DVI a bit? Like, would I need a special DVI cable for dual link? And my TV only has 1 DVI and 1 VGA, so would converting one to VGA mess it up?

 

trogdor796

Distinguished
Nov 26, 2009
998
0
19,160
Yes, there are dual-link DVI cables and regular single-link DVI cables. The only difference is the number of pins.

Computer monitors that are 120 Hz require a dual-link DVI cable to display 120 frames. HDMI won't cut it.

Your TV has DVI and VGA but not HDMI? I think it must be VGA and HDMI, because as far as I know, no TV has a DVI port. I've never seen one.
 

xtwiinky

Honorable
Jul 29, 2012
133
0
10,690


OOOPS, sorry, it's 1 VGA and 1 weird kinda-VGA one.


 

benski

Distinguished
Jun 24, 2010
1,611
0
19,960

Yes, you would need a special cable, but if you buy a 120 Hz monitor (not all PC monitors are 120 Hz; they are much more expensive) it will likely come with one. When you use the DVI-to-VGA adapter plug that came with your GTX 670, the card will know it has an analog VGA cable plugged in and will output the appropriate signal, so there is no conversion.
 

hollett

Distinguished
Jun 5, 2001
246
0
18,710
Going back to the original issue the OP was having with the picture zooming in on the TV: I am guessing that you are using an HDMI connector, as the TV does not have DVI (just VGA and RS232). I know when I was using my PC to display on my TV (a Samsung, so it might be different), there was an option on the TV to disable 'Picture processing' for PC use. If I did not enable this setting, the picture would be slightly zoomed in and I would not be able to see the start bar.