Homemade Arduino scanning LED backlight to emulate 960Hz in 120Hz LCD!

Last response: in Computer Peripherals
September 12, 2012 7:44:50 AM

Hello,

Goal: Eliminate motion blur on an LCD, and allow LCD to approach CRT quality for fast-motion.

Scanning backlights are used in some high-end HDTV's (google "Sony XR 960" or "Samsung CMR 960"). These HDTV's simulate 960 Hz using various techniques, including scanning backlights (sometimes also called "black frame insertion"). The goal is to greatly reduce motion blur by pulsing (flickering) the backlight in a scanning pattern, much as a CRT scans its flickering phosphor. These home theater HDTV's are expensive, and scanning backlights are not really taken advantage of (yet) in desktop computer monitors. Although there are diminishing returns beyond 120Hz, it is worth noting that 120Hz eliminates only 50% of the motion blur of 60Hz, while 480Hz eliminates 87.5%. Scanning backlights can simulate the motion-blur reduction of 480Hz without added input lag, and without needing to increase the actual refresh rate beyond the panel's native rate (e.g. 120Hz).
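Those percentages are just sample-and-hold persistence arithmetic: motion blur scales with how long each frame stays displayed, so relative to a 60Hz baseline the blur eliminated is 1 - 60/rate. A quick sanity check in plain Python (the function name is mine):

```python
# Motion blur on a sample-and-hold display scales with frame persistence.
# Relative to a 60 Hz baseline, persistence shrinks to (60 / rate) of original.
def blur_reduction_vs_60hz(rate_hz):
    """Fraction of 60 Hz sample-and-hold motion blur eliminated at rate_hz."""
    return 1.0 - 60.0 / rate_hz

print(blur_reduction_vs_60hz(120))  # 0.5    -> 50% of the blur eliminated
print(blur_reduction_vs_60hz(480))  # 0.875  -> 87.5%
print(blur_reduction_vs_60hz(960))  # 0.9375 -> diminishing returns
```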

I have an idea for a homemade scanning backlight, using an Arduino, some white LED strips, and a modified monitor (putting the Arduino-driven white LED strips behind the LCD glass).

Most LCD's are vertically refreshed, from top to bottom.
The idea is to use a homemade scanning backlight, by putting the LCD glass in front of a custom backlight driven by an Arduino project:

Parts:
1. Horizontal white LED strip segments, put behind the LCD glass. The brighter, the better! 4 or 8 strips.
2. Arduino controller (to control LED strip segments).
3. 4 or 8 pins on Arduino connected to a transistor connected to the LED strip segments.
4. 1 pin connected to vertical sync signal (could be software such as a DirectX program that relays the vertical sync state, or hardware that detects vertical sync state on the DVI/HDMI cable)

The Arduino controller would be programmed to flash the LED strip on/off, in a scanning sequence, top to bottom. If you're using a 4-segment scanning backlight, you've got 4 vertically stacked rectangles of backlight panel (LED strips), and you flash each segment for 1/4th of a refresh. So, for a 120Hz refresh, you'd flash one segment at a time for 1/480th of a second.
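The per-segment flash time above is just the refresh period divided by the segment count; a tiny Python check (function name is mine):

```python
def segment_flash_seconds(refresh_hz, segments):
    """How long each segment stays lit when exactly one segment is lit at a time."""
    return 1.0 / refresh_hz / segments

# 4 segments at 120 Hz -> each segment is lit for 1/480 s per refresh:
print(segment_flash_seconds(120, 4))  # ~0.00208 s
print(segment_flash_seconds(120, 8))  # ~0.00104 s (1/960 s)
```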

The Arduino would need to be adjustable to adapt to the specific refresh rate and the specific input lag specific to the monitor:
- Upon detecting a signal on the vsync pin, the Arduino would begin the flashing sequence at the first segment. This permits synchronization of the scanning backlight to the actual output.
- An adjustment would be needed to compensate for input lag (either via a configurable delay or via configuring the flash sequence on a different segment than the first segment.)
- Configurable pulse length, to optimize image quality with the LCD.
- Configurable panel flash latency and speed (to sync to the LCD display's refresh speed within a refresh) -- this would require one-time manual calibration, via testing for elimination of tearing/artifacts. For example, a specific LCD display might only take 1/140th of a second to repaint a single 120Hz frame, so this adjustment allows compensation for this fact.
- Configurable number of segments to illuminate simultaneously -- e.g. illuminate more segments at a time for a brighter image, at a trade-off (e.g. simulating 240Hz with a double-bright image by lighting up two segments of a scanning backlight at a time, rather than simulating 480Hz with one).
- If calibrated properly, no extra input lag should be observable (at most, approximately 1-2ms extra, simply to wait for pixels to fully refresh before re-illuminating backlight).
- No modification of the computer monitor's electronics is necessary; you're just replacing the backlight with your own, and using the Arduino to control it.
- Calibration should be easy; a tiny computer app would be created -- just a simple moving test pattern and two or three software sliders -- adjust until motion looks best.
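Putting the adjustments above together, here's a minimal Python simulation of the schedule the Arduino firmware would follow after each vsync; the function name, microsecond units, and parameter names are my assumptions, and this models the timing only (no real I/O, not actual Arduino firmware):

```python
# Hypothetical timing model of the scanning-backlight sequencer. After each
# vsync, wait out the configured input-lag delay, then flash one segment at
# a time from top to bottom.
def scan_schedule(refresh_hz, segments, lag_delay_us=0.0, pulse_us=None):
    """Return (segment, on_us, off_us) events for one refresh, relative to vsync."""
    period_us = 1_000_000 / refresh_hz
    slot_us = period_us / segments          # time window per segment
    if pulse_us is None:
        pulse_us = slot_us                  # default: lit for the whole slot
    pulse_us = min(pulse_us, slot_us)       # pulse cannot exceed its slot
    return [(seg, lag_delay_us + seg * slot_us,
                  lag_delay_us + seg * slot_us + pulse_us)
            for seg in range(segments)]

# 4 segments, 120 Hz, 2 ms of input-lag compensation:
for seg, on_us, off_us in scan_schedule(120, 4, lag_delay_us=2000):
    print(f"segment {seg}: on at {on_us:.0f} us, off at {off_us:.0f} us")
```

A shorter pulse_us trades brightness for less motion blur, the same trade-off as the configurable pulse length in the list above.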

Total cost: ~$100-$150. Examples of parts:
- $35.00 (RadioShack) -- Arduino Uno Rev 3. You will need an Arduino with at least 4 or 8 output pins and 1 input pin (most Arduinos qualify).
- $44.40 (DealExtreme) -- LED tape -- white 6500K daylight LED's, 50 watts worth (a 5 meter roll of 600x 3528 SMD LED's).
- Plus other appropriate components as needed: power supply for LED's, wire, solder, transistors for connecting Arduino pins to the LED strips, resistors or current regulators or ultra-high-frequency PWM for limiting power to the LED's, etc.

LED tape is designed to be cut into segments (most LED tape can be cut in 2-inch increments). Google or eBay "White LED tape". A 5 meter roll of white LED tape is 600 LED's at a total of 50 watts, and this is more than bright enough to illuminate a 24" panel in 4 segments, or it can be doubled up. LED tape is now pretty cheap off eBay, sometimes as low as under $20 for Chinese-made rolls, but I'd advise 6500K full-spectrum daylight white LED's with reasonably high CRI, or color quality will suffer. Newer LED tape designed for accent lighting applications would be quite suitable, though you want daylight white rather than warm white or cold white, to match the color of a typical computer monitor backlight. For testing purposes, cheap LED tape will do. You need extra brightness to compensate for the dark time: a 4-segment backlight that's dark 75% of the time would ideally need to be 4 times brighter than the average preferred brightness setting of an always-on backlight. For even lighting, a diffuser (e.g. translucent sheet, wax paper, etc.) will likely be needed between the LED's and the LCD glass.
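The 4x brightness figure is a duty-cycle ratio: average brightness equals peak brightness times the fraction of time the backlight is lit. A one-liner (my function name):

```python
def required_brightness_multiplier(segments, lit_at_once=1):
    """How much brighter a scanning backlight must be than an always-on
    backlight to keep the same average brightness."""
    return segments / lit_at_once

print(required_brightness_multiplier(4))     # 4.0 -> dark 75% of the time
print(required_brightness_multiplier(8, 2))  # 4.0 -> two of eight segments lit
```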

This project would work best with 120Hz LCD panels on displays with fast pixel response, rather than 60Hz LCD panels: there would be no annoying flicker at 120Hz (each segment of the scanning backlight would flicker at 120Hz instead of 60Hz), and the pixel decay would need to be quick enough to be virtually complete before the backlight segment illuminates.

Scanning backlight is a technology that already exists in high-end home theater LCD HDTV's (960Hz simulation in top-model Sony and Samsung HDTV's -- google "Sony XR 960" or "Samsung CMR 960"), and most of those include motion interpolation and local dimming (turning off LED's behind dark areas of the screen) in addition to the scanning backlight. We don't want input lag, so we skip the motion interpolation. Local dimming is complex to do cheaply. However, a scanning backlight is rather simple -- and achievable via this Arduino project idea. It would be a cheap way to simulate 480Hz (or even 960Hz) via flicker on a 120Hz display, by hacking open an existing computer monitor.

Anybody interested in attempting such a project?

[EDIT: This is an old post from 2012, archived for historical reasons -- Arduino Scanning Backlight on Blur Busters Forums.]
September 13, 2012 9:05:04 PM

There is vigorous discussion now on this topic (as of today) in threads on HardForum and 120hz.net. It's become pretty realistic and practical, because:

1. LCD's are now fast enough to finish refreshing before the next frame (requirement of 3D LCD's)
2. LED's are now bright and cheap enough (requirement of extra brightness needed in ultra-short flashes in scanning/strobed backlight)
3. Today's 120Hz LCD's mean that the flicker of a scanning backlight will not bother _most_ people. (3D LCD's brought us 120Hz LCD's)
4. Controllers for scanning backlights are now cheap (it can be done with an Arduino)
5. Scanning backlights make it possible for LCD motion blur to be lower than the LCD's own pixel response speed would otherwise allow.
6. Finally, scanning backlights can be made configurable in on-screen menus (e.g. turn off scanning and make it behave conventionally), if you only want to use it during videogames.

LCD blur can be less than CRT, since LCD pixel response no longer matters (once you meet prerequisite #1 above).
For example, a single 8ms refresh (1/120th second) on a 120Hz display can be enhanced with a scanning/strobed backlight:
2ms -- wait for LCD pixel to finish refreshing (while in the dark)
5ms -- wait a little longer for most of ghosting to disappear (while in the dark)
1ms -- flash the backlight quickly. (1/960th second or 1/1000th second)
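Those three numbers have to fit inside one refresh period; a quick budget check in Python (function name is mine):

```python
def strobe_slack_ms(refresh_hz, settle_ms, ghost_wait_ms, flash_ms):
    """Slack left in one refresh after the dark time and the strobe."""
    period_ms = 1000.0 / refresh_hz
    used_ms = settle_ms + ghost_wait_ms + flash_ms
    assert used_ms <= period_ms, "budget exceeds the refresh period"
    return period_ms - used_ms

# 2 ms settle + 5 ms ghost decay + 1 ms flash inside a ~8.33 ms 120 Hz refresh:
print(strobe_slack_ms(120, 2, 5, 1))  # ~0.33 ms of slack
```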

Here, you just bypassed LCD pixel response as the factor in motion blur.
Heck, you could strobe for only 0.5ms instead of 1ms -- and you get sharper motion on an LCD than on a CRT, imagine that! Flicker fusion and persistence of vision (exactly the same principle as with a CRT display) do the rest for you -- motion is crystal sharp.

In high-speed camera footage, an LED scanning backlight would look very similar to a CRT (see YouTube videos of high-speed footage of a CRT scanning). To do this, you need very bright illumination per scanning segment for about a millisecond, similar to or faster than the phosphor decay of a CRT -- and plenty of brightness, for a scanning backlight that is dark 90% of the time.

All these values could be adjustable in the Arduino scanning backlight project, to reduce input lag, etc. Heck, if your backlight was sufficiently bright, you could do 0.5ms flashes of the backlight segment and still have a very bright image. Also, scanning the backlight means you don't need a power supply sized for the whole LED panel (e.g. only 15 watts out of a 100 watt backlight is illuminated at a time). And you can scan in sync with the LCD refresh (the LCD is updated sequentially, one row of pixels at a time) at a different phase within the refresh (e.g. refresh the top part of the LCD while illuminating part of the bottom of the screen).

All you need is a sufficient overkill amount of LED's for the extra brightness required by a backlight that's dark 90% of the time. The good news is that LED's have fallen a lot in price in the last few years. (Don't believe me? Visit Las Vegas or Times Square. The video screens you see are LED's -- and stadium jumbotrons typically have about a million LED's on them.) LED billboards standing next to expressways are often underdriven at night to 5% of their full brightness, only because of local bylaws banning overly bright billboards (a hazard); they need to be very bright to be daylight-visible, especially when the sun is shining on them. LED's today ARE bright enough for a high-speed scanning backlight that's dark 90% of the time. Search "600 white LED strip" on eBay -- only $15 each, and that gives you more light than a typical computer monitor's built-in backlight. Today, we're finally at a point where we can have a backlight 10 times brighter than necessary, in order to have a scanning backlight that's dark 90% of the time, to achieve CRT quality on an LCD.
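The power-supply point is the same duty-cycle arithmetic from the other direction: only the lit segments draw current at any instant. A sketch (the segment count is illustrative; the ~15 W figure above is the same idea, rounded):

```python
def instantaneous_watts(total_led_watts, segments, lit_at_once=1):
    """Peak power drawn when only some segments of the backlight are lit."""
    return total_led_watts * lit_at_once / segments

# A 100 W backlight scanned in 8 segments only ever draws 12.5 W at once:
print(instantaneous_watts(100, 8))  # 12.5
```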
September 13, 2012 9:30:46 PM

I have been waiting for you to try this experiment. I wouldn't think of trying something like this myself unless I saw someone else propose, analyze, and try it first.

I have also seen a new, even higher refresh rate on a gaming monitor recently: 144Hz, on the ASUS VG278HE. That alone is exciting to me. It means 72Hz per eye in 3D Vision, which almost reaches the 75-80 FPS that removes motion sickness from gaming for me.
September 20, 2012 4:12:49 AM

Thanks. By the way, I'm surprised TomsHardware is quiet on this thread; there are at least a few smart lurkers around here. There are tons of discussions in the identical threads I posted on other forums:
- This thread on HardForum
- This thread on 120hz.net
- This thread on AVSFORUM (amazing replies!)
- This thread on TomsHardware
- This thread on overclockers

Also, John Carmack actually replied to me on Twitter, telling me I am on the right track!

Mark Rejhon @mdrejhon
@ID_AA_Carmack I'm researching home-made Arduino scanning backlight (90%:10% dark:bright) using 100-200 watts of LED's. 120hz.net/showthread.php…

John Carmack @ID_AA_Carmack
@mdrejhon Good project. You definitely want to find the hardware vsync, don't try to communicate it from the host.

---

I've commenced with the project, at least preliminary experimentation over the next few weeks/months.
September 20, 2012 3:37:41 PM

bystander said:
I have also seen a new, even higher refresh rate on a gaming monitor recently. It's 144hz, the ASUS VG278HE. That alone is exciting to me. That means 72hz in 3D Vision, which almost reaches the 75-80 FPS that removes motion sickness from gaming for me.
For alternate-frame active shutter glasses 3D, you do realize you're already getting 120 images per second through 120Hz shutter glasses, even though it's only 60 per eye? You can already get the "120fps feel" through 120Hz 3D glasses, because the alternating shutter action gives you 120 separate discrete image samples per second (60 for the left eye on the even-numbered refreshes, and 60 for the right eye on the odd-numbered refreshes).

Have you ever tried 3D at 120Hz with, say, a GeForce GTX 680 league graphics card or similar, capable of 120fps in many games (at least at slightly reduced settings)? You're still getting the motion resolution of 120fps, even though you're only getting 60 per eye. Therefore, you need to ask yourself: is your specific motion sickness caused by insufficient framerate, or by flicker? If your motion sickness is not caused by flicker, you should be perfectly fine with 3D on standard 120Hz monitors -- 144Hz is overkill.

For 3D at 144Hz, you ideally need GPU horsepower for 288fps (144fps x 2 eyes) to get the full 'perfect' fluidity. Even though it's only 72fps per eye, you need to render both eyes' frames (144fps each). The 3D shutters give you 144 discrete time-based samples per second (the left eye temporally displaced 1/144sec from the right eye), so you need to render both left/right eye frames 144 times a second, for a total required GPU horsepower of 288fps. The extra frames (not seen by the "other eye") are often wasted, but they need to be rendered just in case there are freezes and stalls; the 3D glasses must continue to alternate between left/right eye even through the freezes. So both frames must often be available at all times, just in case, even if the extra frames turn out to be unnecessary. (For software developers and driver developers, rendering optimizations around this problem are an extremely difficult science.)

Thus,
You need GPU horsepower of 288fps for maximum fluidity with 144Hz shutter 3D (72/72 for left/right eye), and
You need GPU horsepower of 240fps for maximum fluidity with 120Hz shutter 3D (60/60 for left/right eye), and
You need GPU horsepower of 120fps for maximum fluidity with old-fashioned 60Hz shutter 3D (30/30 for left/right eye).
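The rule behind those three lines: with alternate-eye shutter glasses you render both eyes at the full refresh rate, so the required GPU throughput is always twice the shutter rate:

```python
def gpu_fps_required(shutter_hz):
    """Frames/sec the GPU must render for 'perfect' alternate-eye shutter 3D:
    both eyes rendered at the full refresh rate, i.e. 2x the shutter rate."""
    return 2 * shutter_hz

print(gpu_fps_required(144))  # 288
print(gpu_fps_required(120))  # 240
print(gpu_fps_required(60))   # 120
```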

(P.S. On a related note, scanning backlights are compatible with 3D, provided the scanning is properly synchronized with the 3D shutter. It's a matter of shutter timing.)
September 20, 2012 7:38:01 PM

I have a 3D Vision setup with a 120hz monitor and I use it any chance I get. I have been playing Skyrim this way for a while, Crysis 2 and Metro 2033 most recently, and while I realize I get 120hz, between the two eyes, the reality is, I feel about the same latency and motion sickness I get from 60 FPS in normal 2D mode. I have found that in 2D mode, if I get my FPS up to 80+, my motion sickness pretty much disappears aside from games that cause a lot of bouncing.

It isn't the refresh rate that causes the issue for me, although the shutter glasses cause a sort of flickering between the eyes, which can cause eye strain. What causes me the motion sickness is latency, and higher refresh rates with FPS to match, help lower latency.

For reference:
At 30-40 FPS, I will get motion sickness in about 5-10 mins, if not sooner, and it is pretty severe.
At 40-50 FPS, I will get motion sickness in about 30 mins.
At 60 FPS, I will get motion sickness in about 60 mins (same for 3D Vision with 120hz monitor).
At 80 FPS with hz above that, I will not get motion sickness.

This assumes a first person game that I control the view with a mouse. Watching a movie does not cause problems at all.
September 20, 2012 10:05:23 PM

Aha, latency -- that would explain things. I would think that you can solve the latency during 3D mode by doubling your GPU resources, to gain the same effective fps that you're getting in 2D mode.

I observe that the same GPU that can keep up at 80+fps in 2D mode isn't going to be able to generate the 160+fps required in 3D to match the effective 80+fps you're getting in 2D mode. You need to double the GPU to get the same fps in 3D, for a more apples-to-apples comparison in latency -- since 3D 120Hz (with doubled GPU) theoretically has the same latency as 2D 120Hz.

That said, there might be other latency considerations induced by the LCD shutter being closed for part of the refresh, which probably adds a few extra milliseconds, plus added game latencies/inefficiencies caused by 3D, and other factors I haven't considered. But theoretically, assuming you double the GPU and the other variables are not a factor, the latency is the same. It is wholly possible 144Hz wouldn't solve your problem, unless a portion of the sickness is caused by flicker. Either way, it just boils down to whether it's worth spending the funds on 144Hz or on doubling your GPU (an interesting decision for you, I imagine!)

I'm getting offtopic in my own thread here :(  , so I'll gradually bring this thread back onto topic. ;) 
September 21, 2012 12:21:36 AM

mdrejhon said:
Aha, latency -- that would explain things. I would think that you can solve the latency during 3D mode by doubling your GPU resources, to gain the same effective fps that you're getting in 2D mode.

I observe that the same GPU that can keep up at 80+fps in 2D mode isn't going to be able to generate the 160+fps required in 3D to match the effective 80+fps you're getting in 2D mode. You need to double the GPU to get the same fps in 3D, for a more apples-to-apples comparison in latency -- since 3D 120Hz (with doubled GPU) theoretically has the same latency as 2D 120Hz.

That said, there might be other latency considerations induced by the LCD shutter being closed for part of the refresh, which probably adds a few extra milliseconds, plus added game latencies/inefficiencies caused by 3D, and other factors I haven't considered. But theoretically, assuming you double the GPU and the other variables are not a factor, the latency is the same. It is wholly possible 144Hz wouldn't solve your problem, unless a portion of the sickness is caused by flicker. Either way, it just boils down to whether it's worth spending the funds on 144Hz or on doubling your GPU (an interesting decision for you, I imagine!)

I'm getting offtopic in my own thread here :(  , so I'll gradually bring this thread back onto topic. ;) 


Have you actually used 3D Vision or HD3D? By what you write, I'd have to guess you haven't.

3D requires v-sync. It will not function without v-sync, and you cannot disable v-sync if you wanted to. To make sure each eye only sees the image from the proper perspective, you cannot have tearing, because tearing would cause part of the opposite eye's image to be displayed to the wrong eye, which completely destroys the effect. As a result, you cannot have more FPS than half your refresh rate. That is why 144Hz would be very appealing to me, as it would allow for more than 60 FPS in 3D.

Oh, and I have 680's in SLI to ensure 60 FPS at all times in 3D.
October 18, 2012 5:33:34 AM

You still need 240fps even with VSYNC on, for a 120Hz 3D display.

With active shutter glasses, you're getting 120 samples per second, spaced 1/120sec apart -- 60fps per eye, shown during alternating 1/120th-second periods. The even samples go to one eye and the odd samples go to the other eye.

Why?

Alternate shutter glasses mean the left eye and right eye are temporally displaced by 1/120sec.
Therefore, for optimal rendering of nearly all 3D video-based material,
1. Left eye needs 60fps......timed at T+0/120s...T+2/120sec...T+4/120sec...T+6/120sec...etc
2. Right eye needs 60fps....timed at T+1/120s...T+3/120sec...T+5/120sec...T+7/120sec...etc

Therefore, you need to generate 120 frames per second, each rendered 1/120sec apart, for smoothest motion. Thus, you're actually seeing true 120 Hz, just with one eye at a time. You want 120fps @ 120Hz, for maximum fluidity, even though 60 of that goes to one eye, and 60 of that goes to the other eye.
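The alternating presentation can be generated directly; here's the schedule in Python (assigning even-numbered refreshes to the left eye is my arbitrary convention):

```python
def eye_schedule(refresh_hz, n_refreshes):
    """(time_seconds, eye) pairs for alternate-eye shutter glasses."""
    return [(n / refresh_hz, "L" if n % 2 == 0 else "R")
            for n in range(n_refreshes)]

# 120 Hz: 120 discrete samples/sec, alternating eyes 1/120 s apart.
for t, eye in eye_schedule(120, 6):
    print(f"T+{t:.4f}s -> {eye} eye")
```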

But, wait! What happens if the computer pauses, or the game stutters, and needs to redisplay the same frame for another refresh -- but for the other eye? The shutter glasses aren't going to wait for the missing frame if you haven't rendered it for the other eye yet. So you need both frames for both eyes, just in case.

Therefore, you need 120fps for the left eye and 120fps for the right eye. Half of the frames are often 'wasted', but they become useful if there are hiccups or stutters and the same frame needs to be displayed for the other eye. If games were designed to run perfectly at 120fps at all times, with no dropped frames or pauses, then the wasted frames would not be needed.

Therefore,
1. Left eye needs 120fps......timed at T+0/120s...T+1/120sec...T+2/120sec...T+3/120sec...etc
2. Right eye needs 120fps....timed at T+0/120s...T+1/120sec...T+2/120sec...T+3/120sec...etc
Even though most of the time you're only displaying one or the other, you need the other one "just in case" for stutters, etc., unless the game has clever rendering algorithms to reduce the GPU cost of frames not expected to be seen by a specific eye. Many games do not have any such rendering optimizations. (That's why framerate looks half as smooth in 3D as in 2D on the same graphics card: the GPU is always rendering frames for both eyes, even when one of the frames might never be seen.)

Therefore, you need 240fps of GPU power for perfect smooth 120fps with 3D shutter glasses.

So I'm right. Upgrade accordingly. :-)
October 18, 2012 5:51:16 AM

Here's another example of why you need 240fps of GPU power for a 120Hz display, for the "Perfect 120fps @ 120Hz motion effect":

Generating 120fps for each eye (120+120 = 240) is necessary, even if half of frames are never seen.

In normal "Perfect 120fps @ 120Hz" operation, you have this (brackets mark the refreshes where that eye's glasses shutter is open):
Left Eye......[T+0/120s]...T+1/120sec...[T+2/120sec]...T+3/120sec...[T+4/120sec]...T+5/120sec
Right Eye....T+0/120s...[T+1/120sec]...T+2/120sec...[T+3/120sec]...T+4/120sec...[T+5/120sec]

But if you get a pause, you need the other frame in reserve (* marks a frame repeat due to the pause):
Left Eye......[T+0/120s]...T+1/120sec...[T+2/120sec]...*T+2/120sec...[T+3/120sec]...T+4/120sec
Right Eye....T+0/120s...[T+1/120sec]...T+2/120sec...[*T+2/120sec]...T+3/120sec...[T+4/120sec]

Again, therefore, you need 240fps of GPU power for 120Hz shutter glasses (60/60 per eye)
October 18, 2012 11:19:06 AM

do remember that doubling refresh rates would also greatly increase draw on system resources. in a price per performance comparison the numbers would not be very favorable.

also you would be hard pressed to find a true 240hz monitor on the consumer market. televisions accept only a 60hz signal and fake the rest. 120hz monitors accept 120 or 60hz.

---

now lets say we do have a 240hz display..

with 1/120 glasses every two frames will be skipped on a 240hz video. in essence your left eye sees 120hz for half the time and darkness half the time. the on/off ratio is still at 60hz which would limit any improvements you see significantly i would imagine.

to benefit you might need 1/240 glasses.

---

remember, the more strain you place on a system (ie, requiring more fps) the more likely your system will stutter and introduce an anomaly into the video feed. working at double the speed reduces the time your pc has to render each frame. if your pc can not keep up, you will see it.

----

all the luck to you....but i just do not see it as being a viable option if you consider costs involved, technology support and driver support.
October 18, 2012 4:59:55 PM

ssddx, a 240 Hz LCD is not required.

Also, this thread is now talking about two different topics.
I am already constructing a prototype, documented on my BlurBusters Blog at www.scanningbacklight.com, and the electronics research & development has been successful so far. Industry insiders have confirmed it is certainly possible.

(1) Scanning backlights are compatible with 60 Hz displays.
Scanning backlights already exist in a game-compatible 60 Hz mode (low input lag) in expensive top-model Sony "MotionFlow XR 960" displays and Samsung "Clear Motion Rate 960" displays. You simply leave the scanning backlight turned on but turn off the motion interpolation; there are menu settings for this. The display flickers a lot, like a 60 Hz CRT, but motion blur is up to 75% less than on a regular 60 Hz LCD without a scanning backlight. There are example instructions for configuring a Sony HDTV with Motionflow XR 960 (typically found only in their higher end $2500+ models) to run in a 60 Hz scanning backlight mode ("backlight line blinking").

(2) GPU power required for 3D active shutter glasses
This has nothing to do with scanning backlights. You need double the GPU power to get the same smoothness of framerate in 3D; it's already well known that turning on 3D approximately halves the framerate. However, you're actually still getting the "motion resolution" of 120fps using today's 120 Hz monitors like the Acer VG236H. It's actually 120fps, even though 60fps goes to one eye and 60fps goes to the other eye. What I'm trying to say is that the frame for the left eye isn't necessarily the same instant in time as the frame for the right eye, because the frames are presented 1/120 second apart. You still get the benefit of full 120fps motion resolution, but to get that, you need 240fps of GPU power on today's already-available 120 Hz 3D monitors, if you want full 120fps operation. Even though only one eye sees a frame at a time, you still benefit from going beyond 60fps with 3D shutter glasses, because of the temporal displacement of the left and right frames. This is true independently of a scanning backlight. Apologies for bringing this thread off topic.

....Again, scanning backlights are an existing technology and do not require 240Hz. They work fine at 60Hz; they just flicker too much at 60Hz (like a 60Hz CRT), because scanning backlights essentially emulate CRT flicker to get the "CRT perfect" motion feel on an LCD. It's already done (up to 75% motion blur elimination) in some high end HDTV's (see above), but not yet in computer monitors, and not yet in a high-performance mode (90% motion blur elimination). I'm simply improving an already-invented wheel, and it's quite simple to a vision expert (someone who understands how strobing sharpens motion) -- see the Scanning Backlight FAQ as well as Science & References.

The main good reason to go above 60 Hz with a scanning backlight is to reduce flicker -- people hate CRT flicker at 60 Hz. For computers, the sweet spot for a scanning backlight will probably be approximately 85 Hz, because that is where CRT's stop flickering for most people, and it's less GPU-heavy than 120Hz. Long-time CRT aficionados know that 60fps @ 60Hz on a CRT gives much sharper motion than 120fps @ 120Hz on an LCD, and a scanning backlight has exactly the same effect (it would look similar to a CRT in high speed video): sharper motion at lower framerates, due to the elimination of LCD's sample-and-hold effect.

Cheers,
Mark Rejhon
BlurBusters Blog
www.scanningbacklight.com
October 18, 2012 10:41:51 PM

mdrejhon said:
Here's another example of why you need 240fps of GPU power for a 120Hz display, for the "Perfect 120fps @ 120Hz motion effect":

Generating 120fps for each eye (120+120 = 240) is necessary, even if half of frames are never seen.

In normal "Perfect 120fps @ 120Hz" operation, you have this (brackets mark the refreshes where that eye's glasses shutter is open):
Left Eye......[T+0/120s]...T+1/120sec...[T+2/120sec]...T+3/120sec...[T+4/120sec]...T+5/120sec
Right Eye....T+0/120s...[T+1/120sec]...T+2/120sec...[T+3/120sec]...T+4/120sec...[T+5/120sec]

But if you get a pause, you need the other frame in reserve (* marks a frame repeat due to the pause):
Left Eye......[T+0/120s]...T+1/120sec...[T+2/120sec]...*T+2/120sec...[T+3/120sec]...T+4/120sec
Right Eye....T+0/120s...[T+1/120sec]...T+2/120sec...[*T+2/120sec]...T+3/120sec...[T+4/120sec]

Again, therefore, you need 240fps of GPU power for 120Hz shutter glasses (60/60 per eye)


You clearly have never used 3D, or you would know that 3D Vision forces v-sync on and does not allow more than 120 frames to be drawn (60 frames per eye). With v-sync on, the system doesn't discard extra frames; it simply doesn't allow them to be drawn in the first place.

The stereoscopic drivers are also more efficient than you think, as you normally do not get half your normal FPS with it on. You normally end up at about 60% of your 2D-mode FPS. I can only guess as to why.
October 19, 2012 7:38:51 AM

I have used 3D off and on since the very old Asus V7700 GeForce2 GTS kit for CRT displays (from year 2001 -- old AnandTech review from 11/3/2000), which was CRT-based 3D at 60Hz (30/30 per eye). It required VSYNC on, too, and it was known to need 120fps of GPU power to max out what vsync-on allowed. What I am describing occurs with VSYNC on. I am also a software developer with some familiarity with Direct3D. So I have more than 10 years of 3D understanding, and from what I know, nVidia's 3D shutter glasses are similar, except at double the refresh rate (120Hz instead of 60Hz).

Let's try to clear up some confusion....
I know VSYNC is always enabled for 3D.
Can you correct me if I am wrong below in any area:

I think the confusion is with the "frame" terminology. When I say "frame", I mean an individual eye's frame, not both eyes: there are two "frames" in one "3D frame" -- a left-eye frame and a right-eye frame. Some people call a single pair of frames (one left-eye and one right-eye image) "one 3D frame". Perhaps this is the terminology we're confusing. If you mean "120 3D frames per second", you are correct. But read on, and please correct me if you see any place where I am wrong.

Quote:
With v-sync on, the system doesn't discard extra frames, it simply doesn't allow them from being drawn in the first place.
Oh -- I know VSYNC is on. What I'm talking about is not a VSYNC-related thing; that's the confusion. Let me try again, and you can correct me where I'm wrong. I'd like to make sure we are both on the same page.

- We both already know nVidia 3D Vision is active shutter glasses: electronically powered glasses that alternate the shutters rapidly 120 times a second, meaning the left and right eye take turns getting a frame. (Is this right?)
- Essentially, left eye sees a refresh, right eye sees the next refresh after, left eye sees the next refresh after, right eye sees the next refresh after. (Is this right?)
- For today's 120 Hz displays running in 3D mode, that's 120 total displayed refreshes per second (60 refreshes for left eye, 60 refreshes for right eye). (Is this right?)
- We already know that a video game renders one 3D frame consisting of a pair of frames: one for the left eye and a second for the right eye. This 3D frame (a pair of two 2D frames) represents two views of the same scene at the same instant in time, from slightly different horizontal offsets. (Is this right?)
- We already know that the active shutter glasses 3D always alternate continuously, no how the framerate in a video game fluctuates. (Is this right?)
- Another 'equivalent' way to view this, is one eye gets the even refreshes and the other eye gets the odd refreshes. (Is this right?)
- The 3D shutter glasses do not dynamically hold the shutter open continuously for a specific eye for a longer interval than 1/120 second. Once a game starts up, it is always even refreshes for one eye and odd refreshes for the other eye. (Is this right?)
- Video games can still stutter even with 3D; framerate is not necessarily exactly the same, at all times. Framerates still slow down in more complex scenes. Stutters can still happen even with VSYNC on. (I think we already agree on this detail, right?)

Okay, hopefully we are now in agreement on the widely known aspects of active-shutter-glasses 3D technology in general. If nVidia 3D Vision made modifications to their shutter glasses technology such as dynamic shutter operation (variable shutter length per eye depending on framerate fluctuations), then that changes the whole ballgame (and invalidates my earlier post). But from what I know, this isn't the case -- it's always consistent: odd refreshes go to one eye, and even refreshes go to the other eye. If that's the case, the information still applies. But perhaps you can point out where I'm wrong (I would appreciate it!):

Known behavior
If we always display both the left/right eye of the same "video game instant":
- The 3D frame (pair of frames for left/right eye) is of the same "video game instant"
- However, the pair of frames is not presented to both eyes at the same time.
- One eye gets it first, and the other eye gets it next (1/120th of a second afterwards)
- Thus, one eye gets a delayed view. A delay of 1/120th of a second.
So, you see, the rendering of the pair is of the same in-game instant, but the presentation of the pair to the human eyes is not at the same instant. Thus, rendering timing does not correspond to presentation timing. This contributes slightly to a "stutter" sensation (roughly equivalent to playing at 60fps @ 120Hz on a 120Hz monitor). The objects in the videogame haven't moved (world object positions, positions of enemies, etc.) between the two refreshes that display the left-eye frame and then the right-eye frame 1/120th of a second later. Since the objects haven't moved, this produces stutter (it feels equivalent to a frame-repeat during 2D).
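To make the timing concrete, here is a minimal sketch (my own illustration, not anything from nVidia's drivers) of which in-game instant each 120 Hz refresh displays when both eyes are rendered from the same instant:

```cpp
#include <cassert>

// Instants are numbered in 1/60 s steps. Refresh 0 shows the left eye and
// refresh 1 the right eye of instant 0; refreshes 2-3 show instant 1, etc.
// The displayed instant advances only every second refresh, so motion
// updates just 60 times/s on a 120 Hz display -- the frame-repeat feel.
int instantShown(int refresh) {
    return refresh / 2;  // each instant is repeated across two refreshes
}
```

So refreshes 0 and 1 both show instant 0, and a tracked moving object sits still for two consecutive refreshes.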

Theoretical optimization #1 (not normally done due to disadvantage)
- One way to compensate is to render in a way that matches presentation.
- The game can render a 3D frame (left eye frame and right eye frame) of slightly different "video game instant"
- Left eye frame is rendered of one game instant, right eye is rendered of a slightly later game instant (1/120sec later)
- When you present to the human eye using alternate-frame shutter glasses, the presentation is now of correct timescale (rendering timing now matches presentation timing).
- This improves fluidity, since moving objects now have a motion resolution of 120 movement steps per second at 120 Hz. Both human eyes are always continuously tracking moving objects (even as the shutter glasses alternate), even though only one of the two eyes may be seeing an image at a time. So you're now actually getting the "120fps" sensation through 60/60 alternate-frame shutter glasses.
- However, there's a gotcha with this optimization that prevents games from doing it
....If the game has any framerate skips, freezes, stutters, etc. (they always do, at least for tiny instants) -- then you have a moment where the shutter glasses need to repeat refreshes.
....The shutter is still alternating back and forth, even during these moments.
....So, when repeating refreshes, you're alternating back and forth between two already-rendered frames.
....But the frames are of two different video game instants.
....So the frozen 3D scene is rapidly alternating back and forth in time every 1/120 second, because one eye has a rendering of a specific instant and the other eye has a rendering of the next 1/120th second.
....This is a very annoying jerkiness amplification (I've seen this effect before in experimental setups, and it also applies to certain kinds of mis-encoded videos where playback jumps backwards a frame before playing forward. It's a pretty weird and annoying jerkiness effect)
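Here is a sketch of that gotcha (my own illustration; instant numbering and parameter names are hypothetical): each eye renders a slightly different instant in 1/120 s steps, so refresh N normally shows instant N. If rendering stalls, the glasses keep alternating between the last left/right frames, which are of two different instants:

```cpp
#include <cassert>

// Normal playback: refresh N shows instant N (120 unique instants/s).
// Stalled after refresh `lastRendered`: the display keeps alternating
// between the last left/right frames -- the frozen scene jumps back and
// forth in time between two different instants.
int instantShownOpt1(int refresh, int lastRendered) {
    if (refresh <= lastRendered)
        return refresh;                      // normal case
    int pairStart = lastRendered - 1;        // older frame of the last pair
    return (refresh % 2 == pairStart % 2) ? pairStart : lastRendered;
}
```

With a stall after refresh 5, subsequent refreshes alternate 4, 5, 4, 5, ... -- the time-jumping jerkiness described above.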

Workable optimization
- This is already done with some 3D shutter glasses (it's been done before, and it's applicable to all active shutter glasses), and should be possible with nVidia 3D vision too.
- To gain the benefits of Theoretical Optimization #1 (see above), without its disadvantages.
....You need to render all 120 in-game instants per second. (for proper fluidity during full frame rate)
*and*
....You need to render the pair of frames (left and right eye) at all times. (for less stutter during stutter moments, meaning repeated refreshes give the same in-game instant to both eyes)
--> Thus, to meet both bullets above, you need 120 pairs of frames per second: a total of 240 2D frames per second.
--> Thus, for a 120fps feel in 3D, you need GPU horsepower of 240fps in 2D.
- It's quite possible that nVidia 3D Vision does not permit this optimization. If so, that would be quite bad; you would never get the same 2D-mode "120fps feel" when wearing 3D glasses (which is indeed possible with shutter glasses on 120Hz displays). Basically, this would be a software/driver limitation or a hardware restriction, if true -- definitely not a physics limitation or human vision limitation.
- Pulling this off, from a developer perspective, would be easiest with triple buffering. (six buffers -- three for each eye). That uses a lot of graphics memory. Remember, triple buffering, even with VSYNC on, discards undisplayed frames if new frames are rendered on time before the next refresh begins. (Triple buffering is not incompatible with shutter glasses technology, it's just a matter of whether nVidia allowed it or not -- triple buffering has been done before with shutter glasses, Asus V7700 Geforce2 GTS allowed it when I used that kit.)
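The arithmetic behind those two bullets, as a toy calculation of my own (one pair of 2D frames per in-game instant):

```cpp
#include <cassert>

// Standard 3D renders 60 instants/s  -> 120 2D frames/s of GPU work.
// The optimization renders 120 instants/s -> 240 2D frames/s of GPU work.
int gpuFramesPerSecond(int instantsPerSecond) {
    return instantsPerSecond * 2;  // two eye views per instant
}
```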

Other observations
- In actual practice, games always render frames for both eyes of the same in-game instant. That's "one 3D frame", a pair for left/right eye, as already mentioned earlier.
- Basically, rendering generally happens in pairs. One frame for left eye, one frame for right eye. (You could call this "one 3D frame" consisting of "two 2D frames, one for each eye")
- When games have any framerate skips, freezes, stutters, etc., frame refreshes simply alternate between eyes for the pair of the same in-game instant.
- If you're always showing to both eyes, then you're limited to 60 video game instants per second.
- Many games have frame limiters, e.g. 30fps or 60fps (A 30fps framelimiter in a game, when 3D Vision is enabled, would actually be 30 frames for each eye, meaning GPU horsepower of 60fps)

Regardless, even though the temporal science is sound, it's possible there's a technology limitation (software/software architecture/driver/etc.) specific to nVidia 3D Vision's flavour of 3D -- i.e. it technically can be done, but isn't currently being done, if that's essentially what you're trying to tell me.
October 19, 2012 4:05:59 PM

One thing that kind of confused me at first is the use of "stutter" rather than "latency" in many locations. Stuttering is a jerkiness effect, which is sometimes what you mean, but other times you are using the word in place of latency.

Anyways, A couple notes. For me to get the feeling of total fluidity, I do not need 120 FPS. I really only need about 80 FPS. Or should I say, it takes 80 FPS to remove the simulator sickness I normally experience at 60 fps and below. I happen to have 680's in SLI which allow me to consistently get those kinds of FPS, unless the game is CPU limited, in which case I turn settings down to achieve the FPS I need.

I agree with most of your check list, but I do not agree that it takes double the FPS to avoid any moments of latency to cause a delayed frame. This may be a result of improved hardware, drivers and software, but I have noticed that my system can easily maintain 60 FPS in 3D Vision without any dips, even when it can't maintain 120 FPS in 2D.

I also do like the idea of rendering each eye in real time, which would likely result in less latency and a more fluid experience. I believe the reason they don't is that rendering both eyes from the same time frame allows them to share information, such as physics. That is why it doesn't take twice the power to render in 3D as it does in 2D. I believe their decision may be the better choice, because otherwise they'd also run into CPU limitations faster.

Anyways, I think this thread has strayed off topic. I was simply letting you know of a new monitor with 144Hz, which would allow for 72Hz/fps 3D gaming. This would help with the latency issues that cause simulator sickness for some of us, and is not a waste.
October 20, 2012 2:32:43 AM

Yes, if it is a latency issue that's causing motion sickness, then 144 Hz certainly would be favourable (and also for the flicker-related reason).

Even though a higher framerate also reduces latency for either 2D or 3D in general (by displaying frames sooner), you already have dual 680's, so you're pretty high up there with an expensive SLI system. Bumping your framerate further for latency-reduction reasons is no longer low-hanging fruit, so the monitor is one of your next solutions to try.

(Just a friendly correction -- my earlier statement was "Aha, latency -- that would explain things. I would think that you can solve the latency during 3D mode by doubling your GPU resources." That statement is a completely separate topic from the "GPU horsepower of 240fps" topic that split off from it. The principle of higher framerate = reduced latency is still generally true for most games, regardless of 2D or 3D, and regardless of whatever framerate you are at -- if you're grinding along at slow framerates, e.g. doubling 45fps into 90fps, it would reduce latency since frames are delivered to your eyes sooner after rendering. But again, it appears you've already got a good SLI configuration. This was a completely separate topic from the 240fps-GPU-needed-for-120fps-feel-in-stereo topic, but I didn't make that clear at first. My apologies.)


We did stray off topic. Apologies. I shall go back to the regularly scheduled program...
November 24, 2012 5:54:10 AM

Some of the reels of ultrabright LED ribbons have now arrived!
http://www.scanningbacklight.com/one-shipment-of-ultra-bright-led-ribbons-has-arrived/

...

...

They are very BRIGHT. Read more on my blog at www.scanningbacklight.com.

I'll be using a whopping 900 of these LED's as an active strobed/scanning backlight in one 24" monitor, to permit a sufficiently bright image at 0.5 millisecond impulses per refresh -- sufficient to reduce LCD motion blur to less than CRT. At 21-22 lumens per LED, this is a total of nearly 20,000 lumens in the strobes! (possibly up to 60,000 lumens if I carefully overvolt the LED's during the short pulses, as LED's will tolerate surge current at short pulses).
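Why so many LED's: with short strobes, average perceived brightness is peak output times the duty cycle. A quick sketch of that arithmetic (values from my post; the helper functions are my own illustration):

```cpp
#include <cassert>
#include <cmath>

// A 0.5 ms pulse at a 120 Hz strobe rate means the backlight is lit only
// 6% of the time, so ~20,000 peak lumens averages out to ~1,200 lumens.
double dutyCycle(double pulseSeconds, double strobeHz) {
    return pulseSeconds * strobeHz;          // fraction of time lit
}
double averageLumens(double peakLumens, double pulseSeconds, double strobeHz) {
    return peakLumens * dutyCycle(pulseSeconds, strobeHz);
}
```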

Reasonably inexpensive #5050 Epistar LED chips (darn near 6500K) -- color purity is reasonably good at approximately CRI ~70-75; it's much better quality white light than CCFL, but not as good as high-end Samsung-manufactured LED's. (I'd end up spending four figures on CRI 90+ LED's alone for just one monitor.) Phosphor persistence is very low on these white LED's, supposedly far less than 0.5ms; I'll be conducting oscilloscope+photodiode tests in the coming weeks. The primary goal is simply to eliminate visible motion blur, not to have stunning IPS-LCD-style photo quality (for now).

I'll be testing strictly with active 3D panels, since those are the only LCD panels reliably able to clear the vast majority of pixel persistence (>99%) within the same frame refresh (by design necessity), and such panels are capable of less motion blur than CRT when coupled with this 150watt/sqft backlight in full-array-at-once strobing mode. The full strobe mode eliminates the backlight diffusion issues (between on-segments and off-segments during sequential scanning modes) that would interfere with motion blur elimination, as other experts have pointed out to me. Strobe flicker is not a concern at 120Hz native refresh rate (= 120Hz strobe rate). The first prototype will be an ultimate videogaming computer monitor for my desk.
May 23, 2013 1:51:39 PM

mdrejhon said:
Hello,

Goal: Eliminate motion blur on an LCD, and allow LCD to approach CRT quality for fast-motion.

Scanning backlights are used in some high end HDTV's (google "Sony XR 960" or "Samsung CMR 960"). These high end HDTV's simulate 960 Hz using various techniques, including scanning backlights (sometimes also called "black frame insertion"). The object of this is to greatly reduce motion blur by pulsing (flickering) the LCD -- scanning the backlight (flicker), like a CRT would scan the phosphor (flicker). These home theater HDTV's are expensive, and scanning backlights are not really taken advantage of (yet) in desktop computer monitors. Although there are diminishing returns beyond 120Hz, it is worth noting that 120Hz eliminates only 50% of motion blur over 60Hz, but if you go to 480Hz, you eliminate 87.5% of motion blur. Scanning backlights can simulate the motion-blur reduction of 480Hz, without further added input lag, and without needing to increase actual refresh rate beyond the native refresh rate (e.g. 120Hz)

I have an idea of a home-made scanning backlight, using an Arduino project, some white LED strips, and a modified monitor (putting Arduino-driven white LED strips behind the LCD glass)

Most LCD's are vertically refreshed, from top to bottom.
The idea is to use a homemade scanning backlight, by putting the LCD glass in front of a custom backlight driven by an Arduino project:

Parts:
1. Horizontal white LED strip segments, put behind the LCD glass. The brighter, the better! 4 or 8 strips.
2. Arduino controller (to control LED strip segments).
3. 4 or 8 pins on Arduino connected to a transistor connected to the LED strip segments.
4. 1 pin connected to vertical sync signal (could be software such as a DirectX program that relays the vertical sync state, or hardware that detects vertical sync state on the DVI/HDMI cable)

The Arduino controller would be programmed to flash the LED strip on/off, in a scanning sequence, top to bottom. If you're using a 4-segment scanning backlight, you've got 4 vertically stacked rectangles of backlight panel (LED strips), and you flash each segment for 1/4th of a refresh. So, for a 120Hz refresh, you'd flash one segment at a time for 1/480th of a second.

The Arduino would need to be adjustable to adapt to the specific refresh rate and the specific input lag specific to the monitor:
- Upon detecting a signal on the vsync pin, the Arduino would begin the flashing sequence at the first segment. This permits synchronization of the scanning backlight to the actual output.
- An adjustment would be needed to compensate for input lag (either via a configurable delay or via configuring the flash sequence on a different segment than the first segment.)
- Configurable pulse length, to optimize image quality with the LCD.
- Configurable panel flash latency and speed (to sync to the LCD display's refresh speed within a refresh) -- this would require one-time manual calibration, via testing for elimination of tearing/artifacts. For example, a specific LCD display might only take 1/140th of a second to repaint a single 120Hz frame, so this adjustment allows compensation for this fact.
- Configurable number of segments to illuminate -- e.g. illuminate more segments at a time for a brighter image as a trade-off (e.g. lighting up two segments of the scanning backlight at a time simulates 240Hz with a double-bright image, instead of 480Hz)
- If calibrated properly, no extra input lag should be observable (at most, approximately 1-2ms extra, simply to wait for pixels to fully refresh before re-illuminating backlight).
- No modifications of the computer monitor electronics are necessary; you're just replacing the backlight with your own, and using the Arduino to control the backlight.
- Calibration should be easy; a tiny computer app would need to be created -- just a simple moving test pattern and two or three software sliders -- adjust until motion looks best.
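The timing logic above could be sketched as a pure helper function (a hypothetical illustration; all names and parameters are my own, not from a real sketch): given microseconds since the VSYNC pulse, return which backlight segment should be lit.

```cpp
#include <cassert>

// Returns the segment (0..segments-1) to illuminate, or -1 for none.
// lagDelayUs  = configurable input-lag compensation for the monitor.
// scanoutUs   = calibrated time the panel takes to paint one frame
//               (e.g. ~7143 us if the panel scans out in 1/140 s).
int activeSegment(long usSinceVsync, long lagDelayUs,
                  long scanoutUs, int segments) {
    long t = usSinceVsync - lagDelayUs;      // shift for monitor input lag
    if (t < 0 || t >= scanoutUs) return -1;  // outside the scanout window
    return (int)(t * segments / scanoutUs);  // equal slices, top to bottom
}
```

On the Arduino itself, loop() would call something like this with micros() minus a timestamp captured in the vsync pin's interrupt handler, and drive one transistor pin per segment.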

Total cost: ~$100-$150. Examples of parts:
- $35.00 (RadioShack) -- Arduino Uno Rev 3. You will need an Arduino with at least 4 or 8 output pins and 1 input pin. (e.g. most Arduino)
- $44.40 (DealExtreme) -- LED tape -- White LED's 6500K daylight LED's, 50 watts worth (5meter of 600x3528 SMD LED 6500K).
- Plus other appropriate components as needed: power supply for LED's, wire, solder, transistors for connecting Arduino pins to the LED strips, resistors or current regulators or ultra-high-frequency PWM for limiting power to the LED's, etc.

LED tape is designed to be cut into segments (most LED tape can be cut in 2-inch increments). Google or eBay "White LED tape". A 5 meter roll of white LED tape is 600 LED's at a total of 50 watts, and this is more than bright enough to illuminate a 24" panel in 4 segments, or can be doubled up. This LED tape is now pretty cheap off eBay, sometimes as low as under $20 for Chinese-made rolls, but I'd advise 6500K full-spectrum daylight white LED's with reasonably high CRI, or color quality will suffer. Newer LED tape designed for accent lighting applications would be quite suitable, though you want daylight white rather than warm white or cool white -- to match the color of a typical computer monitor backlight. For testing purposes, cheap LED tape will do. You need extra brightness to compensate for the dark time: a 4-segment backlight that's dark 75% of the time would ideally need to be 4 times brighter than the average preferred brightness setting of an always-on backlight. For even lighting, a diffuser (e.g. translucent sheet, wax paper, etc.) will likely be needed between the LED's and the LCD glass.
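The brightness-compensation rule above can be expressed as a one-liner (my own helper, just to pin down the arithmetic):

```cpp
#include <cassert>

// A backlight lit only litAtOnce/segments of the time needs proportionally
// brighter LED's to match the average brightness of an always-on backlight.
double brightnessMultiplier(int segments, int litAtOnce) {
    return (double)segments / litAtOnce;
}
```

So a 4-segment backlight with one segment lit at a time needs 4x brightness; lighting two segments at a time halves that requirement (at the cost of less motion-blur reduction).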

This project would work best with 120Hz LCD panels on displays with fast pixel response, rather than 60Hz LCD panels, since there would be no annoying flicker at 120Hz (each segment of the scanning backlight would flicker at 120Hz instead of 60Hz), and since the pixel decay would need to be quick enough to be virtually complete before each segment's backlight flashes.

Scanning backlighting is a technology that already exists in high end home theater LCD HDTV's (960Hz simulation in top-model Sony and Samsung HDTV's -- google "Sony XR 960" or "Samsung CMR 960"), and most of those include motion interpolation and local dimming (turning off LED's behind dark areas of the screen) in addition to the scanning backlight. We don't want input lag, so we skip the motion interpolation. Local dimming is complex to do cheaply. However, a scanning backlight is rather simple -- and achievable via this Arduino project idea. It would be a cheap way to simulate 480Hz (or even 960Hz) via flicker in a 120Hz display, by hacking open an existing computer monitor.

Anybody interested in attempting such a project?


May 23, 2013 2:03:57 PM

I am interested in attempting such a project. I don't have a solution, since I've never done anything like this before.
However, since I plan to purchase a large 3D HDTV, I prefer to cut my overall costs to a minimum by utilizing your DIY approach to achieve 960Hz/CMR or more on a 120Hz (or higher Hz/CMR) HDTV.