
Console gaming on a 120Hz TV

Last response: in Home Theatre
March 21, 2011 12:40:21 PM

I've got an odd issue while playing certain Xbox 360 games on my 120Hz Samsung LCD TV. I've only seen this problem in a very small number of games, such as Mafia 2. During gameplay, every few seconds you get what looks like a drop in framerate and the game appears to "lag". This happens in so few games that at first I thought it was the game itself, until I switched over to a 60Hz TV and the "lag" went away. I've gone through the TV's menu system a few times and turned off everything I've found related to motion enhancements and the like, but the issue still occurs. At this point I may have to stay on the 60Hz TV, but I would prefer to use the 120Hz TV if this is fixable, as it is my only 1080p TV.


March 21, 2011 3:17:34 PM

You act as if 60Hz is somehow inferior. 60Hz is the standard LCD refresh rate; in fact, your Xbox 360 most likely sends only a 60Hz signal when you are playing games.

120Hz is meant for 3D. If you aren't running 3D, there is no need to enable 120Hz.

If you send a 60Hz signal to a TV that wants 120Hz, the TV has to process the extra frames and double them up, which is probably the lag you were mentioning.

Short answer: the TV is fine; use 60Hz.
March 21, 2011 3:37:18 PM

"At this point I may have to stay on the 60Hz TV, but I would prefer to use the 120Hz TV if this is fixable, as it is my only 1080p TV."

I don't think my 60Hz set is inferior at all; I just prefer to use my 1080p TV. As for just using 60Hz on the 120Hz TV, maybe I'm missing something, but I don't see any option to tell it to use 60Hz. I believe it defaults to whatever it detects. So if you're right about the Xbox using 60Hz by default, there is something specific in these games that it doesn't like about my 120Hz TV. As for 120Hz being just for 3D, I'm pretty sure some TV gurus on here would debate you on that :-) . There is a huge difference between a 3D TV and a 120Hz TV.
March 21, 2011 6:37:37 PM

The game is probably causing high processor usage when it lags. The same thing happens on computers.

The only way to fix that would be to solder a new processor onto the circuit board, and it isn't worth it to break your Xbox. For starters, even if you do manage to properly solder the new chip, that doesn't mean the processor will work. And second, you don't know whether it's the CPU or the graphics processor that is reaching 100%.

**edit**
Maybe the television connection is robbing the processors of precious voltage needed to function. Lower voltage could prevent the processor from running all the way up to 100%, because the processor runs on electricity, and if you don't have enough electricity, you don't have enough 'run'. It's kind of like putting a 75 lb backpack on someone and asking them to run as fast as they can: if the weight is too much, it's going to slow them down.
March 21, 2011 6:53:12 PM

Not to bash you, dude, but this makes no sense. Any kind of CPU or GPU problem on a console is going to be seen in every game and on every TV you use. Even on the very slight chance that the TV itself was somehow affecting the power distribution of the CPU or GPU, again, that would be seen in every game, not just Mafia 2, etc.
March 21, 2011 8:01:57 PM

120Hz HDTVs only take 60Hz input. 120Hz refers to some video processing (video frame interpolation) the HDTV does to smooth out video playback and give movies that "live look", kind of like watching a soap opera. Thus, 120Hz causes input lag in games because it takes a little time to do the video processing. This applies to 240Hz HDTVs as well, which do even more video processing.

120Hz 3D HDTVs also do video processing and likewise only accept 60Hz input. Additionally, when watching a 3D movie the input drops to 48Hz; 24Hz per eye.

The option to switch from 120Hz to 60Hz should be somewhere in the HDTV's Picture Menu.
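As a rough illustration of why interpolation adds delay, here is a minimal Python sketch of the arithmetic. The two-frame buffer and the processing time are assumptions for illustration, not figures from any TV's spec sheet:

```python
def interpolation_lag_ms(buffered_frames: int, input_fps: float,
                         processing_ms: float = 0.0) -> float:
    """Added display latency when a TV buffers frames to interpolate.

    The TV must hold `buffered_frames` incoming frames before it can
    compute in-between frames, so the newest frame is delayed by the
    time those frames take to arrive, plus any processing time.
    """
    frame_interval_ms = 1000.0 / input_fps
    return buffered_frames * frame_interval_ms + processing_ms

# A TV buffering 2 frames of a 60 Hz signal waits ~33 ms before processing:
print(round(interpolation_lag_ms(2, 60), 1))   # 33.3
# With a 30 fps game feed, the same 2-frame buffer costs ~67 ms:
print(round(interpolation_lag_ms(2, 30), 1))   # 66.7
```

Any per-frame processing time the TV needs is added on top, which is why turning interpolation off removes the delay entirely.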
March 22, 2011 3:05:37 AM

Cold71 said:
Not to bash you dude but this makes no sense. [...]


You are stating that each game requires the same percentage of processing power. That's not true. Look at all of the PC games that require faster graphics cards to get the frames per second up high enough. Have a look at the Call of Duty: Black Ops forum; you will see people complaining that the game uses more processing power than other games with the same graphics.

There is a need to know what the processor usage is and what data is being processed. Looking doesn't mean you have permission, and looking doesn't mean you are going to see all of the data being processed.

You said the problem went away on a different television. There are only TWO rational options:
1. the television input is putting a strain on the console video output
2. the way you situated the console when you moved to a different television was enough to decrease (or even increase) the heat, and the difference in heat caused the processor to run better.

Anything else is 'top-secret' data transfers robbing you of performance. Maybe your console was connected to the internet and was updating itself. Maybe the console was being checked for cheats or modifications.

Can you reproduce what happened? Like, can you go to the same area of the map and have it lag each time?
March 22, 2011 3:43:06 AM

jaguarskx said:
120Hz HDTVs only take 60Hz input. [...]


A video frame buffer would/should cause a constant input lag. There is no artificial intelligence that can become confused. All that happens is that two frames get compared, and the object in motion gets placed directly in the middle of where the two frames said the object was. If there is no change in movement, no movement is needed for the 'middle' frame. The result looks fantastic because the picture appears more solid.

The logic used is very simple. If the two frames in the buffer are drastically different, as when you change channels or the scene cuts to a different camera, the software will do nothing while it fills the buffer with a frame of the new scene, and then it starts the 'mixing' process again.
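The 'mixing' logic described above can be sketched in a few lines of Python. This is only an illustration of midpoint interpolation on toy 2x2 grayscale frames; the `threshold` used to detect a scene change is an arbitrary assumption, not a value any real TV uses:

```python
# Midpoint ("mixing") frame interpolation: the in-between frame is the
# per-pixel average of the two buffered frames.
def midpoint_frame(frame_a, frame_b):
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Guard: skip interpolation when the frames differ too much
# (a channel change or camera cut), as described above.
def is_scene_cut(frame_a, frame_b, threshold=64):
    diffs = [abs(a - b)
             for row_a, row_b in zip(frame_a, frame_b)
             for a, b in zip(row_a, row_b)]
    return sum(diffs) / len(diffs) > threshold

dark = [[0, 0], [0, 0]]
slightly_brighter = [[10, 10], [10, 10]]
white = [[255, 255], [255, 255]]

print(midpoint_frame(dark, slightly_brighter))  # [[5.0, 5.0], [5.0, 5.0]]
print(is_scene_cut(dark, slightly_brighter))    # False: safe to interpolate
print(is_scene_cut(dark, white))                # True: restart the buffer
```

Real motion interpolation estimates motion vectors rather than blending pixels, but the buffering cost is the same either way.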

A key point to note here: 60Hz of 'mixing' is twice as many frames as the 30 frames per second being input, and 120Hz of 'mixing' is four times as many frames as 30, and twice as many as 60 frames per second. Neither of those ratios says anything about HOW LONG the 'mixing' process takes. If your television is slow, you will see a constant input lag. If the television can do the 'mixing' fast enough, you won't notice it at all.

The math needs care here. At 30 frames per second, each frame lasts about 33 ms, and the television doesn't hold 30 frames in its buffer; it holds 2. Two frames in the buffer therefore amount to roughly 66 ms of delay at 30 frames per second. Add to that whatever time it takes to process both frames and display a 'middle' frame, and the sum is the input lag. For comparison, reaction time for college-age individuals is about 160 milliseconds to detect an audio stimulus and approximately 190 milliseconds to detect a visual one, taken from here:
http://en.wikipedia.org/wiki/Mental_chronometry

66 ms is still well below either of those numbers.
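The comparison can be sketched numerically. The reaction-time figures are the ones cited from the Wikipedia article above; the two-frame buffer is the post's own assumption:

```python
# Buffer delay at 30 fps versus the cited human reaction times.
frame_ms = 1000.0 / 30            # one frame at 30 fps is ~33.3 ms
buffer_delay_ms = 2 * frame_ms    # two buffered frames: ~66.7 ms

AUDIO_REACTION_MS = 160           # cited detection time for audio
VISUAL_REACTION_MS = 190          # cited detection time for a visual

print(round(buffer_delay_ms, 1))              # 66.7
print(buffer_delay_ms < AUDIO_REACTION_MS)    # True
print(buffer_delay_ms < VISUAL_REACTION_MS)   # True
```

Note that being below the reaction threshold does not make a delay imperceptible in a game, since the delay shifts every response by that amount rather than hiding a single event.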


If the 120Hz television makes a 'middle' frame from two video frames, it might also try to make another 'middle' frame from one of the previous 'mixed' frames; the same logic that accepts a drastic scene change should stop the process from reusing 'old' frames.



What I find most important is that the original poster said the game 'lagged'; the word 'stutter' wasn't used. If the game had lots of drastic change going on and managed to trick the processing technology, it should have shown up as a flicker, the same kind of flicker that happens when you are watching a movie and somebody sneaks a different video frame into the film. It comes and goes fast enough that many people can't even see it; other people's subconscious sees it and they blink. It's fast enough to hypnotize you, and that wouldn't bring somebody to a forum to complain about 'lag'; they would probably be afraid that something is about to break and stop working entirely.

The 60Hz versus 120Hz debate for video games can stop here and now, because the math doesn't support a visible difference. If the processing time were long enough to cause lag, it would also push the audio and video out of sync, which would be irresponsible of the industry.
March 22, 2011 12:52:23 PM

anwaypasible said:
you are stating that each game requires the same percentage of processing power. [...]



I don't believe that every game puts the same amount of strain on the CPU/GPU. Clearly an Xbox arcade game is going to cause less strain than, say, Crysis 2, but the difference is smaller on consoles than on PCs. Console games do not let you adjust graphics settings, and most high-profile console games have roughly comparable graphics, close enough that you're not going to see a 15-degree-or-more temperature difference causing a large gap in the performance of the console itself. If that were the case, it would defeat the "sandbox" style of performance that console companies shoot for: a large collection of games that look and run within a fairly small performance range of each other, which is what makes console games so appealing to game designers. This isn't a computer that can have tons of different GPU types and a range of specs.

I continue to believe this issue is related to the refresh rate. I will check tonight whether I am able to force the TV down to 60Hz and see if this resolves the problem.
March 22, 2011 2:04:16 PM

Cold71 said:
I don't believe that every game puts the same amount of strain on the CPU/GPU. [...]



Having the same graphics says absolutely nothing about the code those visuals are running on. That's like saying you and your friend ride to the same place every day, but you don't pay attention to who is driving or what car you are in. There are options:
you drive your vehicle
the other person drives their vehicle
either one of you drives the other person's car
you take a taxi
maybe a bus
or a subway

When games get ported to a console, most of the work is done by the translation program. That translation program could be perfect, or it could have flaws. The program might have to literally change things to patch a problem, and if that change isn't handled correctly, problems arise. I don't think all game developers have any idea of what happens or how the game gets ported; they are probably using a pre-made program to translate the port, maybe one they paid to use.

Games appear to run the same because each game gets molded and shaped to fit onto the console and forced to run well. The game developer might lose their license and/or their publishing rights if the game doesn't pass quality assurance. It wouldn't surprise me if, when a game doesn't fit on the console, they delete some things to make it fit, then delete the same things in the PC version so that everything is fair.

Black Ops runs like crap on my computer; the Crysis 2 demo ran faster and has higher-quality graphics. There's no point arguing graphics quality.

You chew up what I said and throw it out as if I am clueless, yet you haven't said anything MORE accurate. Instead you say that the math doesn't matter when seeing 'lag'.

How come you haven't gone into detail about the lag you are experiencing? I gave solid rationale and you chose not to provide further details. If you're not willing to help us help you, how are we supposed to help?
March 22, 2011 3:13:48 PM

From a console point of view, you sound pretty clueless. You're taking a cookie-cutter PC perspective and trying to make it fit a console, and it doesn't work that way. Great, they have a translation program for porting games that were made on the PC so they run on the console; this fact is meaningless. The game is made to run within the constraints of the console itself. If a problem arises at any point while porting it, or while designing it to run on one particular console, that's called a bug, and that bug is going to be in every single copy. Black Ops running like crap on your computer while Crysis 2 runs fine comes down to the specs each game was programmed to run at; it's the same reason WoW can run on pretty much anything. All these facts are great, but they mean nothing from a console point of view.

In my case there is absolutely nothing wrong with my console or the game, which I've proven. I've been getting a little irritated because I was clearly able to prove this from the start: the CPU or GPU stress isn't going to change from one TV to another. From the start your advice has been, more or less, to "burn the house down". Soldering a new CPU or GPU onto a console over such a basic issue never passes for solid logic in my book.
March 23, 2011 2:57:09 AM

The 'cut' is that the console has a processor. Processors are no different from one another; how they process is irrelevant, but what they process is cause for concern.

Having code that only points to 'regards' isn't the same as having code that only provides 'hints'. Either way, neither has any bearing on the fact that you complained of lag. The architecture works like an engine: processing incorrectly, or with a lack of electricity, has the same effect an engine has when the assembly isn't functioning properly or there isn't enough gas.

You have continued to say that the television refresh rate is the cause of some sort of visual problem that you don't have the decency to describe. I've already told you how the processing is supposed to work on those televisions, regardless of what refresh rate is used. If you have a television with inferior video processing, I would expect you to see the visual problem more often, not only when playing video games.

I asked a most important question: can the visual problem be reproduced? It wouldn't hurt to step away from the video games and use video to try to reproduce the problem. Video frame rate upsampling/upscaling (whatever) shouldn't have any problem with confusion; the standard logic doesn't allow such a thing.

Maybe there are only two things you need to be worried about:
1. your television has inferior video processing logic
2. your television's video processor is dying

You dare say that the video inputs of all televisions have the same resistance, yet lose faith in the industry standards when it comes to the video processor.

Realize that any video processing logic other than what I stated is MORE complex than it needs to be. You should have saved a large amount of money on the television, rather than being forced to pay more because the frame rate logic is more complex than needed.

Your options are quite simple:
bring the television back for a refund if it's not too late.
consider keeping the television because the visual problem doesn't happen often.
try to use the warranty and see if the video processor is malfunctioning or dying.
sue the manufacturer because you were forced to pay more money for inferior logic.

But be warned: if you sue, chances are the manufacturer is going to say 'if the logic had higher functionality, you would have paid $____ instead of $____'. You can't hold the manufacturer responsible if the FCC allows them to use the logic; in that case, the manufacturer is free to use an alternative to the standard. You would then be fighting the FCC to ban all alternatives, and the FCC will probably say 'if the final cost is lower, we don't have a problem with it'. Then you would have to challenge that decision for a lack of ethics, which allows you to bring the manufacturer back into the case.

The final ruling is simple: there is no reason to do more and get paid less. The manufacturer is at a painful loss, and the consumers who purchase the television are also at a painful loss.

Then you would have to listen to the manufacturer argue that there is no painful loss for some reason. And if that reason is enough to counteract the emotional pain caused by selling higher-complexity logic for less money, you would then be stuck with the fact that people who buy the television are saving money because the video processing is inferior. With that said, you could request that the televisions actually sell for less money. Then, if the manufacturer says they will sell the television for less but actually sells it for more, you could be reimbursed for being lied to (and having your time wasted). But as far as I'm concerned, some people aren't entitled to receive money from such a method (or they will receive much less).

You aren't going to budge me without first reproducing the problem, then taking accurate resistance measurements from the video inputs, then measuring the heat of the CPU/GPU/RAM/chipset of the console.

You won't proceed without learning.
March 23, 2011 6:43:54 PM

Reproduce the problem? This entire thread is about me having issues with a very small handful of games on one TV, and you're asking me if I was able to reproduce the problem? Seriously? If I hadn't been able to reproduce the problem, if it had happened only once or a few times, I wouldn't have wasted my time. Like reading four paragraphs of meaningless sentences such as "solder a new chip to the motherboard" and "take resistance measurements from the video inputs". It's like the help desk call from hell: someone calls in about a mouse not working, and instead of asking them to reseat the connection or try a new mouse first, you're asking them to roll Windows back or install a new motherboard. That's what I meant by you just trying to immediately "burn the house down". Start small and work your way up.

I was able to solve the issue yesterday by forcing the TV into 60Hz. Another example of why you start small and work your way up; it's called basic troubleshooting. I would imagine that if you had started by replacing chips and testing connections, it would have taken you roughly ten times longer to solve this problem.
March 23, 2011 6:49:08 PM

did you try updating your gp

and all of the wifi
March 24, 2011 11:53:20 PM

anwaypasible said:
having a video frame buffer would/should cause a constant input lag. [...]


Eh, Jaguarskx has been here on THG for many years and knows what he's talking about. I don't think he needs a dictionary definition of "interpolation"; I'm pretty sure he's already aware of it. Anyway, he is absolutely correct that the 120Hz, 240Hz, etc. interpolation going on in the TV does require internal processing time, and thus the time difference between the player's input to the game and the resulting action on the TV screen (aka "lag") can become significant.

Imagine you're in an FPS and swinging your gun around to shoot the enemy. You mentally gauge when to press the trigger since ammo is limited, and lo and behold the enemy kills you, because the TV display was a quarter-second behind where your console said you were, due to interpolating the console output to 120Hz. In other words, the TV showed your gun crosshairs still some distance from the enemy when in fact you had already got him in your sights (or else swung past him, which could be even worse for rapidly moving targets). In the meantime the AI enemy already popped a couple of rounds at you.
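The quarter-second scenario above can be put in rough numbers. This is only an illustration; the screen width, target speed, and lag figures are made-up assumptions:

```python
# How far a moving target drifts between where the TV shows it and
# where the console says it is, given a display lag.
def drift_px(target_speed_px_per_s, lag_ms):
    return target_speed_px_per_s * lag_ms / 1000.0

# A target crossing a 1920 px screen in 2 s moves at 960 px/s.
# With a quarter-second of display lag it is ~240 px from where it appears:
print(drift_px(960, 250))   # 240.0
# At a smaller ~67 ms of interpolation delay, the drift shrinks:
print(round(drift_px(960, 67), 1))   # 64.3
```

Even the smaller drift is an eighth of the screen width in this example, which is why turning interpolation off for the game input matters.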
March 25, 2011 6:09:42 AM

fazers_on_stun said:
the 120Hz, 240Hz, etc interpolation going on in the TV does require internal processing time and thus the time difference between the player's input to the game and the resulting action on the TV screen (aka "lag") can become significant.


'Can become significant' is exactly what I was trying to get at. That is why I asked the person whether the scene lagged every time it was on the screen. If it can be recreated every time, then something in the scene is confusing the logic (which points to inferior logic). If the lag cannot be recreated every time, that suggests the video processor is failing randomly, which means it might be dying and need a warranty repair before the warranty expires.

How are we supposed to know whether the original poster played the exact same part of the map on the other television? Maybe they popped in the disc and fired up the game without going to the same area that caused the problem. Popping in the disc and waiting for a problem isn't enough to test accurately.

I know the forum rule is to attack the situation, not the poster. Asking whether the original poster can recreate the problem by going to the same place on the map and doing the same things follows that rule. It was a simple question that was purposely avoided.

I asked whether the console was close to something that might cause it to heat up. That question was purposely avoided too.

The math above is accurate, and it is without processing time. The processing time is important for knowing the final amount of lag. But as I initially said, if the video processor is causing lag because it's slow, the audio won't match the video. The logic I described shouldn't take long at all, and if it does, the processor is running too slow, which means the manufacturer cut costs really badly.

We still don't know whether the input resistance of the televisions differs, or whether the console's air vents were blocked, or whether the visual problem was properly recreated. So we can't say whether the original television is a bad design, or needs warranty service, or whether the console got hot and was throttled.

Forcing the television to 60Hz doesn't mean the console is sitting in the same place. It doesn't even tell us whether the input resistance has become less of a problem thanks to lowering the demands on the video processor.

I just don't see how asking for help and then withholding real information is worth the original poster's time. And the information is valid; you said it yourself: 'I don't think he needs a dictionary definition'.

If we are supposed to fix a problem when the console and television aren't sitting right here in front of us, we need all of the information to create a virtual situation.

As for the buffering delay, I can't imagine the video processor needing 160 milliseconds to get the job done. When a television is too slow to keep the audio and video in sync, what does that say about the television manufacturer?

I tried to think of EVERYTHING that could be possible. That wasn't welcomed. I suppose to avoid a 'help desk call from hell' we should simply ignore what might actually be the problem and blame something else. That is lying and misleading, but it appears the original poster would rather hear those things than the truth.

No valid reason was given that could prove my help was false information.
March 25, 2011 2:50:33 PM

anwaypasible said:
the math above is accurate, and it is without processing time. [...]


The trouble is, the 120+ Hz lag problem is widely known and affects most TVs sold in the last few years, including my Sony 46" XBR4; a quick Google search will find tons of results. The solution Jaguarskx mentioned, setting that particular input on the TV to turn off interpolation, is also widely known. And the OP did just that and solved the problem. There's no need to get into whether the TV is defective or overheating. Usually the simplest effective suggestion is the one most appreciated.
March 25, 2011 7:11:07 PM

fazers_on_stun said:
The trouble is, the 120+ Hz lag problem is widely known and affects most TVs sold in the last few years. [...]


To say that MOST televisions have this lag is really putting down the industry as a whole.
I don't own one of those televisions, but I know that if you are getting lag while playing video games, the vocals are going to drift out of sync with the video IF the television doesn't delay the audio to match the struggling video processor.

It sounds like a case of not appreciating how far computer processing has come.
There are hundreds of thousands of graphics cards in people's homes and offices.
Why is it so hard to expect the same horsepower in televisions?

Methinks a boycott is necessary.
Really think about what is being said: the television can take 30 frames per second and interpolate them to 60 frames per second, but can't interpolate to 120 frames per second without lagging.

That is what is supposed to happen when you buy a 60 Hz television, 'hack' it, and force it to display 120 frames per second.
Lag is the downfall, and the reason the television came with a 60 Hz sticker rather than a 120 Hz sticker.


Hey, it's your money being wasted on a 120 Hz television that can't faithfully do 120 Hz.
But it is also bad publicity for people like me who might want one of these televisions in the future.

I'm not trying to ignore what I have read.
I've seen a dozen instances of people saying 120 Hz televisions don't play well with consoles.
But it's usually the same thing: somebody replying to a forum post forcing a subpar result down somebody's throat.

Where are the ARTICLES going into detail about 120 Hz televisions not working properly?
I might expect such bad performance from a generic-brand television.
I cannot imagine the entire market flooded with inferior video processing.
It downright angers me to think people are being lied to when it shouldn't be such a problem.
I don't believe anybody has actually tested what frame rate those consoles output to the television.
It seems hard to believe that the console only outputs 30 frames per second.
It doesn't seem all that hard to believe that the televisions have a hard time accepting 60 frames per second when they are designed for a maximum of ___?

As technology evolves, it shouldn't be a surprise to someday see most television stations broadcast 35 frames per second.
Even movies could bump up the frame rate, simply because we can and it looks better.


It's been said again and again in this thread that the television isn't compatible with the console.
Does that mean the television wasn't properly tested before it was bought?
The original poster said only a FEW games do it, not all of them.
If that doesn't seem strange to any of you, I really feel sorry for the company I keep.
March 25, 2011 7:52:34 PM

Quote:
LCDs don't have a refresh rate in the CRT sense. They don't have an electron gun. An effective 'refresh rate' for an LCD is roughly 1000 / response time (in ms).
That's nowhere near the one you can set it to.
The only reason they have a selectable one is to stay compatible with the GPU, which has worked on a refresh-rate basis since the old days of the CRT: it sends images to the screen when it gets the signal, and the CRT needs to refresh.
Now the Xbox GPU is basically out of sync with your display, hence the lag.
Just set it to 60 Hz. Don't worry, an Xbox is locked to 35 fps for most games, so you would not see a difference. Remember, a lower refresh rate doesn't make the quality poorer than a higher one; it is just emulated for hardware compatibility. It's not a CRT; it doesn't actually refresh.


This response always fascinates me, because the person is either an engineer who designed a television or has simply copied and pasted information from an engineer who designed a television.
Designed A television, not the entire industry.

You would have to open the television and probe the circuit board to know whether the television pours all received input onto the screen where needed, or whether the entire scene is refreshed regardless of movement.

The 1000 in that formula is simply milliseconds per second, divided by the panel's response time in milliseconds.
I wouldn't expect the response time to be the same for every television, or even for every part of the circuit.
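The quoted rule of thumb can be sketched in a couple of lines. This is the heuristic as quoted above, not a figure from any manufacturer's specification.

```python
# Sketch of the quoted rule of thumb: the fastest full pixel
# transition an LCD panel could complete is roughly 1000 ms
# divided by its response time in ms. A heuristic, not a spec.

def max_transitions_per_second(response_time_ms: float) -> float:
    return 1000.0 / response_time_ms

# An 8 ms panel could complete at most ~125 full transitions per second:
print(max_transitions_per_second(8))   # 125.0
```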

I appreciate the people who sell these televisions and test them, so they know what the television can and can't do.
It helps shave off 'features' and 'quality' to point the customer toward what they are willing to do without.
If I walked in and said I don't need this, this, and this, the seller should say, 'Okay, that rules out all of these televisions.'
And perhaps the price will go down because of the things I am willing to do without.

I don't think 35 frames per second is enough; it should be at least 40.
Playing games on the computer, the difference between 35 and 40 is a feeling of transparency.
At 35 frames per second, all movement feels weighed down by ankle weights and wrist weights.
Going up to 40 helps remove the sluggish feeling.
Another bump to 60 frames per second feels smoother still.
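The 'sluggish feeling' described above maps directly onto frame time, i.e. how long each frame stays on screen. The numbers below are simple arithmetic, not measurements.

```python
# Frame time (how long each frame persists) for the frame rates
# discussed above. Pure arithmetic, for illustration only.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (35, 40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 35 fps -> 28.6 ms per frame
# 40 fps -> 25.0 ms per frame
# 60 fps -> 16.7 ms per frame
```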

Maybe we simply have to yell at the films that continue to get by with frame rates well below 30 (theatrical film is typically 24 frames per second).
It's not fair to our brains to be forced into a seat and made to view blurry movement.
I usually wish people the best, and it's no different here, talking about sluggish frame rates.

You know what happens when movies continue to run near 30 frames per second?
It allows television manufacturers to build a television that is designed for 31 frames per second and nothing else.
It allows television manufacturers to build televisions that are designed for 40 frames per second, label them as superior, and attach an inflated price tag.
And when we ask the television to keep pace with the parts of the industry that run faster (video game consoles), there is a conflict.
Then we get forum posts from people who have to turn off the very feature that made the television cost more.

What might their argument be?
If every television were fast enough to work properly with a console, there would be nothing left to advertise one television as superior to the others.
Are they keeping frame-rate quality low so they can bump the frame rate up by 10 frames per second and call that television their flagship?
And what about when we buy the television that can do 10 extra frames per second, and manage to make it look just as bad as the other ones?

CRT computer monitors aren't going anywhere anytime soon.
I have a CRT for my computer monitor and a CRT for my main television.

Don't they realize that a CRT's refresh rate effectively is its total response time?
I mean, my computer monitor does a maximum resolution of 2,048 x 1,536 at 75 Hz,
and we are doing 1600x1200 @ 120 Hz.
That is more resolution than 1080p, AND running at a faster frame rate.

That is strong competition compared to all of the LCDs.
I mean, c'mon: 2048x1536 @ 75 frames per second, compared to a 1920x1080 set that struggles with the 35 frames per second coming from the console?!
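The comparison above can be made concrete by counting pixel throughput. This is a rough sketch: it counts visible pixels only and ignores blanking intervals, so real pixel clocks are somewhat higher.

```python
# Rough pixel-throughput comparison between the CRT mode mentioned
# above and a 1080p/60 signal. Visible pixels only; blanking
# intervals are ignored, so real pixel clocks run higher.

def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

crt = pixels_per_second(2048, 1536, 75)   # ~236 million pixels/s
lcd = pixels_per_second(1920, 1080, 60)   # ~124 million pixels/s
print(crt, lcd, round(crt / lcd, 2))      # the CRT mode moves ~1.9x the pixels
```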

How about a fair comparison?
Even the generic CRT monitors that came with a Dell, eMachines, or Compaq can do 720p at 60-72 frames per second.
Compare that with a 720p LCD that struggles with 35 frames per second.

It would seem we are being taught a lesson, but that lesson is simply that the industry is willing to charge lots of money for inferior things.
You could say we are being forced to appreciate how far electronics have come, but what about all of those people who ditched their CRT for the new technology, simply because the new technology must have been hard to develop?
It would appear that those people made a mistake.

Regardless of the response time of the LCD screen itself,
if the video processor is slowing down the entire system, the final frame rate either drops or lags, and the audio has to be routed through the television so that it can also be delayed and stay in sync.

Asking people to purchase receivers and surround sound systems, but we need to connect the audio to the television so the audio stays in sync?
The home theater industry would be highly upset!
And the people who see the vocals mismatch the video would be upset.

I don't see home theater enthusiasts complaining about such a thing.
So what makes that console special?!
Was the console using a refresh rate ahead of its time?
That would explain everything.

March 25, 2011 8:04:19 PM

fazers_on_stun said:
the TV is defective or overheating, etc. Usually the simplest effective suggestion is the one most appreciated.


Something to mention here:
I didn't say the television might be overheating; you did.

Unaware consumers are going to think exactly that: 'the TV is defective or overheating'.

I am not a general consumer.
I like everything on an organized plate so that it can be reviewed as a whole.
It's not my fault that the extra effort is unappreciated.
It boils down to the fact that most people are uneducated and lazy.
They want the simple fix without appreciating the time and care that goes into design and engineering.
That's why they are replacing jobs with robots.
If people had no choice but to look at the economy and all of its technical achievements, would they learn to appreciate them then?

It kind of sucks that if a business owner replaces all of the workers with robots, the business owner is the only one who walks away appreciating the new technology.
The other people get upset because they lost their jobs.
But the business owner doesn't have to pay workers, doesn't have to worry about them calling in sick without warning, and doesn't have to worry about being lied to or about arguments.
The boss doesn't have to smile and be nice when they really want to cry and avoid people.


ANYWAY...
I continue to say that these people shouldn't have to turn off a feature that drove the price of the television way up.
It's cruel, it's abusive, it stains the desire to spend money, and it ruins that person's day; they might get mad and ruin somebody else's day.

We had an example of that here.
The original poster was already upset because of the problem, then lashed out and took that anger out on me, as I was trying to help by doing anything other than turning off a feature that made the television cost more.
Besides, if something I said was actually wrong, the problem is large enough to make the television age faster and eventually break.

Excuse me for trying to prevent the television and/or console from going dead.
March 26, 2011 6:33:37 AM

Now we are so far zoomed in,
it's not fair to say a CRT has a refresh rate,
the same way it's not fair to say an LCD has a refresh rate.

You would have to open up each individual model of CRT monitor to know how the circuit board handles the electron gun and the video input.

It could well be that the electron gun simply POURS onto the screen.
Whether the rate of pouring onto the screen actually changes is a debate specific to each design choice.

See, if the electron gun is pouring onto the screen, what does it pour when there isn't any video input?
Usually it's pouring a shade of black.
The pouring continues, and as pixels arrive at the electron gun, they are simply spit out.
Deciding where they need to go requires processing.
That processing would be the first place that needs a 'rate',
because as long as the electron gun is pouring onto the screen without any blank spots, the picture stays solid.
There's no reason for the electron gun to change ANYTHING except the pixels.
theres no reason for the electron gun to change ANYTHING except the pixels.

And that raises the question: how fast can the electron gun's controller receive pixels without choking or slowing down?
That is a bandwidth question.
It becomes a simple calculation, because each frame is a picture, and that picture has a constant x/y resolution.
So you can cap the bandwidth with one known quantity:
KNOWING that each picture (frame) has the same x/y resolution,
and given the bandwidth available, how many pictures (frames) can be sent before the bandwidth reaches its maximum?

That is why CRTs have a refresh rate.
It's simpler to say the CRT can accept 120 frames per second
than to say
the CRT's controller can process _____ vertical pixels per second and ______ horizontal pixels per second.

The refresh rate is supposed to be capped close to the bandwidth maximum.
When you know the frame's x/y resolution, you can see that either X or Y is close to the vertical or horizontal pixels-per-second maximum.
You might be able to squeeze some more resolution out of the other direction, because the controller isn't close to its bandwidth maximum,
but without a video processor to tell those extra pixels where to go, they simply cannot be sent to the CRT.




Do I have to go into detail about an LCD having a refresh rate?
The same thing can be said about the controller in the LCD television:
it can only accept and process _____ vertical pixels per second and ______ horizontal pixels per second.
KNOWING that each frame has an x/y resolution, you can divide the maximum number of pixels the controller can receive per second by the pixels per frame.
The result tells you how many frames per second the controller can accept and process.
That is a refresh rate.
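The division described above can be sketched in a few lines. The pixel budget below is a made-up illustrative number, not any real panel's specification.

```python
# Sketch of the derivation above: if a display's processor can
# accept at most some number of pixels per second, the maximum
# frame rate at a given resolution follows by division. The
# 250-megapixel/s budget is a hypothetical figure for illustration.

def max_fps(pixel_budget_per_sec: int, width: int, height: int) -> float:
    return pixel_budget_per_sec / (width * height)

# A hypothetical 250-megapixel/s processor driving 1080p:
print(round(max_fps(250_000_000, 1920, 1080), 1))   # 120.6
```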

But don't think the LCD screen's liquid crystal has a response time fast enough to keep up with the controller.
If the controller is faster than the liquid in the screen, you aren't going to get a higher frame rate by hacking the LCD.

CRTs can put the electricity onto the screen and leave it there, constantly pointed, no?
Why not?
If it can touch once for a brief moment, why can't it simply stay there?
It's going to be back again in a very short while.

You seem to think the electron gun is like a paint brush that touches the screen line after line.
That isn't fair to a CRT that simply pours electricity, making constant contact with the screen.
The electron gun wouldn't have to 'update' anything that doesn't move; it would simply hold it there.

And just because one thing moves (for example, the cursor blinking as I type), that doesn't mean the entire screen has to refresh itself.
Yes, the entire screen might get updated as a whole, but it doesn't have to.

Once you realize how solid the connection can be, any movements or changes can simply be communicated by their respective x/y coordinates.
You don't have to resend all of the x/y coordinates when nothing has moved or changed.

Dare I say interlaced or progressive?

LCDs are the same way, unless the liquid has to remain in motion to keep the liquid crystals from aging (or becoming stuck).

It makes me think of the difference between an 'electron gun tube' and a 'cathode ray tube'.
One of them might be drawing line by line to keep the screen from burning, but the other one would be constantly touching the screen.
Check whether your television is a CRT or really an EGT.

Remember, those types of televisions can also work with a light gun.
That's how the Nintendo video game 'Duck Hunt' works.
Strictly speaking, the gun doesn't fire light into the screen; the NES Zapper contains a light sensor, and the console briefly flashes the target areas white so the gun can tell whether it was aimed at a duck.

Drawing line by line is what makes the timing of that detection possible.
Constantly pouring onto the screen would need logic to read 'feedback', i.e. changes in the pixel output, picked up by the gun (or by a video camera itself).

I really like how the entire screen doesn't have to refresh or update when it isn't necessary.
But it's not fair to withhold credit from all of those who have helped develop LCDs into what they are today.
I saw an LG television at Best Buy last week.
The screen claimed to be 3D capable.
And when I watched the demo without any glasses, sure enough, the video had depth.
It wasn't full 3D, but it was depth.
And to me, that is much the same thing.

3D is popping out of the screen.
Depth is sinking deep into the screen.
More and more high-definition video is starting to show depth.
You can see depth on a daily basis, even in a commercial.

LCDs had to get the color accuracy right so they could compete with CRTs in the high-definition era.
You can't have depth without using many shades of color.
CRTs had to master phosphor screens to get the color accuracy.
LCDs had to master chemistry to get the colors accurate and long-lasting.


There are more chains of events taking place between the television's pixel processor and the video input.
But once you realize how and why the entire screen doesn't need to be updated, you don't have to interpolate the frames.
It's a clumsy concept when it's just easier to deliver more real frames per second.
Otherwise you have video processing logic that detects movement and interpolates it to make the frame rate appear higher than it really is.

Recording video at a higher frame rate is more satisfying and more revealing.
And it removes the obnoxious need to interpolate low frame rates.
Simply raising the standard frame rate would solve the biggest problem with these new televisions.
That is where all of the war is.
Color accuracy and the actual number of colors available have always been design choices; that's what makes one television better than another.
But now they are trying to interpolate the frames to make it appear as if there are more frames than there actually are.
You see price wars over 60 Hz and 120 Hz and 240 Hz,
and then we get complaints in the forum that the higher amounts of interpolation aren't even working correctly.

I wonder how long this will go on.