
Slow reacting in 1080p HD?

Graphics & Displays
Anonymous
Graphics card
October 16, 2010 5:17:50 PM

First of all, hey folks (new guy here).

I've been browsing Tom's Hardware for a while now, be it looking for reviews of new hardware or using the site as a guide for buying new parts for my PC.

Anyway, I'm having a persistent, kind of weird problem (to me anyway): my system seems to slow down when I'm running at my TV's native resolution of 1920x1080. It seems like everything slows down, be it the Windows desktop itself or games. The only thing that still works fine is playing movies.

At first I thought my system couldn't handle it, but after installing Fraps I noticed something weird. My FPS does not drop when playing, for example, Fallout 3. What happens instead is that the system's reaction time slows down. I'm currently running at 1680x1050 smoothly, but when I switch to HD (1920x1080) the delay between me moving the mouse or hitting a key and the game actually reacting grows a lot, and this is at exactly the same framerate.
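To make the distinction concrete: framerate and input latency are independent quantities. A display pipeline that buffers extra frames (render queue, scaler, TV post-processing) adds lag without lowering FPS at all. A minimal sketch of that arithmetic (the function names and frame counts are illustrative assumptions, not measurements of this setup):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

def pipeline_lag_ms(refresh_hz: float, frames_buffered: int) -> float:
    """Input-to-screen delay if the display chain holds this many frames."""
    return frames_buffered * frame_time_ms(refresh_hz)

# At 60 Hz each buffered frame costs ~16.7 ms; the game can still
# report a full 60 FPS while the player feels the extra delay.
print(round(pipeline_lag_ms(60, 1), 1))  # ~16.7 ms
print(round(pipeline_lag_ms(60, 5), 1))  # ~83.3 ms with heavy TV processing
```

This is why Fraps shows no drop: the FPS counter measures how often frames are produced, not how long each one takes to reach the screen.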

I was hoping to be enlightened a little on the matter, since I can't wrap my head around what's happening.

My current system specs are as follows:

Motherboard: P5N32-E SLI Plus
Processor: Core 2 Duo E6600 @ 2.4 GHz
RAM: 3 GB Corsair XMS2
Graphics: 2 x GTX 460 in SLI (I know they are waaaay overkill for my system, but the rest is getting upgraded soon)

As part of my system upgrade I recently switched from a Gigabyte 9600 GT, and the exact same thing happened on that card.

I hope someone can shed some light on this.
Thanks in advance,
- Cybertrash


October 16, 2010 5:52:25 PM

- What's the brand and model number of your TV?
- How are you connected to it? DVI-I, HDMI, or VGA?

Also, what nVidia driver version are you using?
Anonymous
October 16, 2010 6:05:19 PM

My TV is a Samsung LE40B655T2W 40" TV (1920x1080 is its native resolution).
I'm running NVIDIA's 258.96 drivers.
October 16, 2010 6:10:51 PM

Doesn't make much sense. LOL. Your system is obviously capable of running 1080p perfectly fine. My interest was in making sure your TV really was 1080p capable. I'm wondering if the slowdown you're seeing is an issue with the TV itself (i.e. the TV's chipset struggling to draw the image, not your PC sending it).

I'm not finding specs for your TV online. :( I'm wondering what its native refresh rate is. You could try setting a fixed refresh rate equivalent to that of your TV and see if anything changes.

Otherwise, I can only suggest uninstalling your drivers, then installing a newer or older NVIDIA driver to see if that fixes things.

You might also consider (as a side note) overclocking your E6600 to get more overall performance out of your GTX 460s.
Anonymous
October 16, 2010 6:15:33 PM

Yeah, I considered overclocking, but since I'm upgrading soon I didn't actually go ahead with it. And since I'm not seeing any general performance issues, I thought this might be software-related, or related to my TV and/or how I set it up or connected it.

Right now I'm taking the signal from the video card's DVI-I output and plugging it into my TV's HDMI port (my computer case doesn't have room for the HDMI plug on the video card).
Anonymous
October 16, 2010 9:43:59 PM

I think the bottleneck is actually my TV. I'm not quite sure how to work around it, but I found a "Gaming Mode" in the TV's menu and performance improves a little with it on.

I can't figure out why I can't run at 100 Hz though, since it is a 100 Hz TV. In my NVIDIA Control Panel I only have options for 60 Hz and lower, so I can't test whether or not it'll improve in 100 Hz mode, unless there's a workaround for that?
October 16, 2010 9:48:41 PM

Connecting a PC to a TV has its inherent problems. They aren't really 100% compatible: they operate at different standard refresh rates, have different pixel densities, etc. Not to mention, they don't always have the same input/output types.

With a TV, you also run into the fact that its built-in video chip wants to alter incoming images in some situations to make them fit the screen properly, etc.

The "Gaming Mode" setting typically only changes image settings (i.e. contrast, sharpness, cool/warm tones, etc.). It probably won't have any real effect on actual performance. And if it did make any changes, they were likely intended for console gaming, since consoles make up 95% of what's connected to a TV gaming-wise.

As far as refresh rates go, your computer is going up to 60 Hz because that's what a standard LCD PC monitor maxes out at for its native resolution. You may be able to go into the NVIDIA Control Panel and create a custom resolution/refresh rate to force it to run at 100 Hz. :) Then you should set each game or application to use VSync (or enable it by default through the NVIDIA Control Panel); otherwise you could get tearing or other image problems on the TV screen (in cases where you have more than 100 FPS).
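The VSync point can be sketched numerically: with VSync on, a finished frame waits for the next refresh boundary instead of being shown mid-scan, which is what prevents tearing. A toy model (the function and the timing figures are illustrative assumptions, not anything from the NVIDIA driver):

```python
import math

def vsync_present_ms(render_done_ms: float, refresh_hz: float) -> float:
    """Time the frame actually appears on screen: the next refresh
    boundary at or after the moment rendering finished."""
    interval = 1000.0 / refresh_hz
    return math.ceil(render_done_ms / interval) * interval

# At 100 Hz the screen refreshes every 10 ms, so a frame finished at
# t = 12 ms is held until the 20 ms boundary rather than tearing mid-scan.
print(vsync_present_ms(12.0, 100))  # 20.0
```

The trade-off is the same one discussed above: VSync removes tearing but can add a few milliseconds of wait per frame.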

On the other hand, if your TV doesn't have a native 100 Hz refresh (it could be 50 Hz being doubled), that could potentially cause issues too. A lot of TVs in the first wave of the "120 Hz refresh" variety were not true 120 Hz, but used image doubling to make the picture appear to refresh more quickly than it really did.
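That doubling has a latency cost of its own: to interpolate between two source frames, the set has to buffer at least the next frame before it can draw anything, so a "100 Hz" panel fed a 50 Hz signal adds a minimum of one 50 Hz interval of lag. A back-of-the-envelope sketch (the one-frame lookahead is an assumption; real sets may buffer more):

```python
def interpolation_min_lag_ms(source_hz: float, lookahead_frames: int = 1) -> float:
    """Minimum lag added by motion interpolation that must see
    `lookahead_frames` future source frames before drawing."""
    return lookahead_frames * 1000.0 / source_hz

# Doubling a 50 Hz source costs at least one 20 ms source interval,
# lag that the GPU-side framerate counter will never show.
print(interpolation_min_lag_ms(50))  # 20.0
```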
Anonymous
October 16, 2010 9:54:27 PM

Thanks for the replies, mate. Much appreciated, and I learned a bunch too, lol.

I think I've come to the conclusion that I'm just going to run my games at 1680x1050, which is a more than decent resolution for my TV, and so far I've had no problems at all with that.

Cheers
October 16, 2010 9:57:16 PM

Not a bad idea. Sorry it's being a pain; no fun having to fight electronics. I wish I could tell you exactly why it's doing it.

Ultimately it doesn't make a whole lot of sense, ya know? :(