
Encoding And Delivering 3D Video Content

Primer: The Principles Of 3D Video And Blu-ray 3D

The highest-quality method to encode and deliver a 3D video program is to store and deliver it as a dual-stream synchronized video program, with one full-quality video stream for each eye. This is how Blu-ray 3D works, storing the video for each eye as a full “Blu-ray quality” video program.

The HDMI 1.4 specification provides for 3D stereoscopic video to be delivered in several different ways, including over/under-formatted frames that are 1920 pixels wide and 2205 pixels high. The frames for the left and right eyes are delivered together, to ensure that synchronization is always maintained, even if the signal is momentarily lost and then restored.
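The arithmetic behind that 2205-line figure can be sketched in a few lines of Python (a minimal illustration, not production code; the constant and function names are ours, not from any library). The packed frame is simply the two full 1080-line eye views stacked with a 45-line "active space" gap between them:

```python
# HDMI 1.4 frame packing for 1080p: two full 1080-line eye views stacked
# in one frame, separated by a 45-line "active space" gap.
EYE_HEIGHT = 1080     # lines per eye view
ACTIVE_SPACE = 45     # blanking gap between the two views
PACKED_HEIGHT = EYE_HEIGHT + ACTIVE_SPACE + EYE_HEIGHT  # 2205 lines

def split_packed_frame(frame):
    """Split a frame-packed image (a list of rows) into (left, right) views."""
    left = frame[:EYE_HEIGHT]
    right = frame[EYE_HEIGHT + ACTIVE_SPACE:]
    return left, right
```

With a 1920x2205 packed frame, each returned view is a full 1920x1080 image, which is why this delivery method loses no resolution.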

Compressed 3D Encoding

For compatibility with existing equipment and video standards, 3D video content can be compressed to fit in a standard video signal. There are several ways that this can be done.

Side by Side encodes the video for each eye in half of a standard video frame (with the right eye picture on the right side of the frame). Thus, the video for each eye is stored with half of the horizontal resolution (960x1080 pixels in a standard 1080p video frame).

Interlaced stores the video for each eye in alternate horizontal lines. The odd lines store the picture for one eye, while the even lines store the picture for the other eye. The picture for each eye has full horizontal resolution, but half of the normal vertical resolution (1920x540 in a 1080p video frame).

Over/Under is a format that encodes the picture for each eye with half the vertical resolution stacked on top of each other in a single video frame. The picture for the left eye is stored in the upper half of the frame, and the right eye is stored in the lower half. As with the Interlaced format, the picture for each eye has full horizontal resolution, but half of the normal vertical resolution (1920x540 pixels for a 1080p video frame).
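The three compressed layouts above can be sketched in plain Python (an illustrative sketch only; frames are modeled as lists of rows, each row a list of pixels, and the function names are our own):

```python
def unpack_side_by_side(frame):
    """Side by Side: each eye gets half the horizontal resolution
    (e.g. 960x1080 from a 1080p frame); right view on the right."""
    half = len(frame[0]) // 2
    return [row[:half] for row in frame], [row[half:] for row in frame]

def unpack_interlaced(frame):
    """Interlaced: odd lines carry one eye, even lines the other, so
    each eye gets half the vertical resolution (e.g. 1920x540)."""
    return frame[0::2], frame[1::2]

def unpack_over_under(frame):
    """Over/Under: left eye in the top half of the frame, right eye in
    the bottom half; again, half the vertical resolution per eye."""
    half = len(frame) // 2
    return frame[:half], frame[half:]
```

In each case the two half-resolution views fit inside one standard video frame, which is what makes these formats compatible with existing 2D equipment.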

Displaying 3D Video

A stereoscopic 3D video contains two time-aligned video channels (one for each eye). To view 3D video, the display technology and the 3D glasses must ensure that the left eye sees only the video meant for the left eye, and likewise for the right eye. There are a number of different technologies designed to accomplish this, and each has its own benefits, drawbacks, and costs.

Anaglyphic 3D

Mention 3D video and the image that comes to mind for many people is that of the familiar 3D glasses, with one red and one blue lens. These glasses use the anaglyphic method of displaying a 3D image.

Anaglyph images are created by using color filters to remove a portion of the visible color spectrum from the image meant for each eye. When viewed through the color filters in the 3D glasses, each eye sees only the image containing the portion of the color spectrum not filtered out by its lens. The benefit of the anaglyphic method is that no special display is needed; any standard 2D display or TV can show an anaglyphic 3D image. The drawback is just as obvious: overall image quality suffers, because a large portion of the color spectrum is filtered out of the image for each eye.
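As a rough illustration of the filtering idea (a naive sketch only; real anaglyph encoders use more careful color matrices), a red/cyan anaglyph can be built by taking the red channel from the left-eye image and the green and blue channels from the right-eye image:

```python
def make_anaglyph(left, right):
    """Naive red/cyan anaglyph: red channel from the left view, green
    and blue channels from the right view. Both images are equal-sized
    lists of rows of (r, g, b) tuples."""
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]
```

Viewed through the glasses, the red lens passes only the left-eye channel and the cyan lens only the right-eye channels, at the cost of the discarded color information described above.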

Comments
  • JohnnyLucky, May 19, 2010 7:44 AM
    Very informative primer. Lots of information that was easy to understand.

    Unfortunately I am one of those who recently purchased a new TV. It will be quite some time before I upgrade.
  • TheGreatGrapeApe, May 19, 2010 8:48 AM
    Nice article, but I think there are a few issues with regard to the overall balance of the information being put forth.

    I understand the author's preference for shutter glasses (especially since it's a certain product's method of choice) even if I don't share it. The major limitation is having to buy a pair for all the friends who come over, which is impractical until the glasses are more commonplace.

    Also, polarized solutions are not limited in resolution if they are set up beyond just the example provided in this article (like they do in the theatre with dual projectors [see the THG review by Don: http://www.tomshardware.com/reviews/3d-polarized-projector,2589.html]), and they may have an improving single-source future with 2K and 4K displays on the horizon. It's a question of preference, but it seems like the full story wasn't explored on that subject.

    Now on to a pet peeve: I love the part about "While set-top Blu-ray players will need to be replaced, PC-based Blu-ray player software can be upgraded." as a subtle product benefit plug.

    Unless it's a free upgrade, you are still replacing the software, not upgrading it (it's not a plug-in), and you're likely forking out nearly the same amount of money for something that cost 1/100th as much to produce, so it's not like it's a major advantage. Especially when upgrading requires a FULL upgrade to the most expensive model, PowerDVD (version #) Ultra 3D, and I can't simply add it to my existing PowerDVD bundles, thus potentially losing backwards compatibility (Ultra 9 already removed the HD-DVD support from the Ultra 7 that I upgraded on my LG HD-DVD/BR burner [which I also used for my old Xbox USB HD-DVD player]).

    Make it a ~$20 independent 3D add-on and then you have a point [ooh, I can save $5 'til May 25 :sarcastic: gee, thanks!]; until then it's $99 (or $94.95 for loyal saps) vs. $150-200. Plus, with the set-top route I now have a second BR/DVD player for another room or to give to a friend (the BR software on its own is useless to give to someone else without a drive), and that's not even compared to the free PS3 upgrade.

    Also, can someone explain this statement:
    "Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future."

    Didn't Cyberlink already show their BR-3D solution on ATi hardware last year? So what's the issue?

    Also, why is it limited to "GeForce 300M-series mobile graphics" when often the core is the same as a previous-generation 200M-series part (example: GTS 350M / 250M)?

    And this section, "Full-quality 120 Hz frame-sequential 3D video (such as Blu-ray 3D) is only supported through a High Speed HDMI cable to an HDMI 1.4-compliant TV," seems to miss the dual-link DVI to monitor option currently being used for 3D on PCs, and also the dual 1.3-input monitors/TVs.

    A nice little article for people unfamiliar with 3D, but there's a subtle undercurrent of product preference/placement in it, and far too many generalities with little supporting information. :??: 
  • hixbot, May 19, 2010 12:10 PM
    well done. I would have liked more detail on the hdmi 1.4 spec, specifically framepacking and the mandatory standards (no mandatory standard for 1080p60 framepacking).
    also some info on AVRs and how a 1.3 hdmi AVR might pass on 3d video and still decode bitstream audio, or not - do we need 1.4 hdmi AVRs to decode audio from a 1.4 source? we shouldn't need 1.4 receivers since the audio standards haven't changed, but I'm understanding that in fact we do need new receivers. :/ 
  • hixbot, May 19, 2010 12:16 PM
    double post. good article.
  • ArgleBargle, May 19, 2010 12:37 PM
    Unfortunately, for people with heavy vision impairment (astigmatism, etc.) that requires corrective lenses, such 3D technology is out of reach for the time being, or at least next to useless. Until some enterprising company comes out with 3D "goggles", people who wear corrective lenses might as well save their money.
  • boletus, May 19, 2010 12:53 PM
    3D is cool, and high definition video is cool. But Sony's moving target of a BD standard is not cool, and Cyberlink's bait and switch tactics are not cool (unless you have bundles of money you can throw at them every 6-12 months). I sent back my BD disk drive (retail, with Cyberlink software) for a refund after finding out that I would have to shell out another $60-100 just so I could watch a two-year old movie. As far as I'm concerned, high definition DVD video is dead until some more open standards and reliable software emerge.
  • cangelini, May 19, 2010 4:08 PM
    Great,

    This piece is a prelude to tomorrow's coverage, by Don, of Blu-ray 3D on a notebook and a desktop. Perhaps that one will answer any of the questions you were left with here?

    As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that it isn't ready to discuss the technology. It's behind.

    The mention of dual-link DVI was in the first revision of this piece and removed in a subsequent iteration. I've asked the author for additional clarification there and should have an answer shortly.
  • cangelini, May 19, 2010 4:37 PM
    So it turns out there were two sections on this and one was cut accidentally. Should be good to go now, though--dual-link DVI is discussed with PC displays!
  • cleeve, May 19, 2010 4:58 PM
    Quote (TheGreatGrapeApe): Also can someone explain this statement; "Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future." Didn't Cyberlink already show their BR-3D solution on ATi hardware last year? So what's the issue?


    It turns out the demo (I think it was at CES?) only used CPU decoding over an ATI graphics card; the Radeon itself did no decoding.

    The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU. He said all four threads were being stressed under software decoding; I'm not sure what quad-core CPU they were using, though.

    Definitely something I'd like to test out in the future...
  • Alvin Smith, May 19, 2010 5:20 PM
    This was a very informative and well written article BUT, I chose to skip to the last two pages ... Because ...

    These implementations, while ever more impressive, are still being threshed out. Because of possible physiological side effects, I think I will NOT be a first adopter, with this (particular) tech (3D).

    Anyone ever watch that movie "THE JERK", with STEVE MARTIN ??

    = Opti-Grab =

    ... I can see all these class-action suits by parents of cross-eyed gamers ... hope not, tho ... I *AM* very much looking forward to the fully refined "end game", for 3D ...

    Additionally, the very best desktop workstations are only just now catching up to standard (uncompressed) HD resolution ingest and edit/render ... since that bandwidth IS shared, between both eyes, this may be a non-issue.

    I will let the kiddies and 1st adopters take-on all those risks and costs.

    Please let me know when it is all "fully baked" and field tested!

    = Alvin = (not to mention "affordable").
  • cyberlink, May 19, 2010 6:49 PM
    Quote (cangelini): As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that it isn't ready to discuss the technology.

    While AMD has not yet announced their specific plans and schedule to support Blu-ray 3D MVC hardware accelerated decoding on ATI graphics, they were willing to confirm that a solution is coming for Radeon 5000 series graphics.

    Tom Vaughan
    Cyberlink
  • TheGreatGrapeApe, May 19, 2010 7:08 PM
    Quote (Cleeve): It turns out the demo (I think it was at CES?) only used CPU decoding over an ATI graphics card; the Radeon itself did no decoding.


    Ah, that makes more sense (of what was trying to be said, not ATi/AMD's method), which is à la the AVIVO X1K series: make it 'sound' hardware accelerated. Brilliant!

    So, it's still available, just not hardware assisted. It's not like it's not possible, as that statement would suggest; you just don't get any hardware benefit. Notice they kept the Intel portion separate, mentioning only the dual-stream HD decoding (available since the HD4600 series and GF9600 series), inferring it's doable on Intel but not on the next stated option, which would be in the future. Not well written in that section, if providing clarity is the goal. One would assume from the statement that A) Blu-ray 3D is not possible if running a new HD5770 with a Core i7 920-980X, and B) that when it is 'made possible' it will only be on the HD5K series.

    Quote (Cleeve): The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependant and might even require a quad-core CPU. He said all four threads were being stressed under software decoding, not sure what quad-core CPU they were using though. Definitely something I'd like to test out in the future...


    Yeah, sorta gets back to the VC-1/H.264 decoding of the early-generation HD-acceleration GPUs.
    Still unclear why it's nV G300M-centric, though, based on the relationship of the chips as stated above.

    BTW, need to get you some new projectors for a 1080 stereo projector setup. Isn't it tax return time? :whistle: 
  • cyberlink, May 19, 2010 7:29 PM
    Quote (Cleeve): The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependant and might even require a quad-core CPU.

    To clarify: while it's possible to play Blu-ray 3D on a PC without video decoding acceleration (video decoding on your graphics processor), it takes most of the power of a quad-core CPU to do software decoding of Blu-ray 3D MVC. GPU-accelerated decoding is really the way to go, if possible.

    Tom Vaughan
    Cyberlink
  • cleeve, May 19, 2010 7:43 PM
    Quote (TheGreatGrapeApe): So, it's still available, just not hardware assisted.


    Well grape, that's where things get interesting. It might be *possible*, but it can't be *available* until they develop something.

    In Nvidia's case, they have their own 3D Vision infrastructure in place, so you plug in the 3D Vision stuff and you're off to the races.

    Radeons, on the other hand... I think it's safe to say they'll never be 3D Vision compatible. So AMD has no way I can think of to provide a full-resolution 3D solution, in the near future anyway. Maybe they'll someday be able to plug into 3D TVs and utilize their proprietary glasses, but for that they'd need HDMI 1.4, and I'm not sure if the 5000 series can handle that with the current hardware.

    There's a lot to talk about, but it's easier to direct you toward my article that's coming out tomorrow. Then we can chat. :D 

    Take care,

    - Cleeve
  • bjrobert, May 19, 2010 9:28 PM
    I'm holding out for magic eye TVs.
  • geok1ng, May 19, 2010 11:21 PM
    Quote (hixbot): well done. I would have liked more detail on the hdmi 1.4 spec, specifically framepacking and the mandatory standards (no mandatory standard for 1080p60 framepacking). also some info on AVRs and how a 1.3 hdmi AVR might pass on 3d video and still decode bitstream audio, or not - do we need 1.4 hdmi AVRs to decode audio from a 1.4 source? we shouldn't need 1.4 receivers since the audio standards haven't changed, but I'm understanding that in fact we do need new receivers.


    Great comments, but ATI is not showing up for the game. If a product is not on the shelves, it will not sell. It is as simple as that, as NVIDIA learned the hard way with Fermi.

    The 3D modes are a lose-lose alternative: it is either an expensive display coupled with inexpensive glasses, or a mildly expensive display coupled with mildly expensive glasses.

    No matter which one you choose, you lose performance or resolution: single-link DVI and HDMI can't display 3D over 1080p60 links. HDMI 1.4 was the salvation of 3D, if one can accept 24 Hz signals...

    DisplayPort would be the way to go, but TVs are HDMI domain, and will remain so for the next decades, thanks to HDMI audio.

    The point is that I see more benefit from higher resolutions than from 3D, and there is no consumer-grade cable today that can deliver 1080p60 or higher resolutions in 3D. But even modest systems demand such computational power that heat dissipation issues come into play, much like the graphics war of performance and heat.

    It would take a massive change in the way consumer-grade TVs and players are manufactured to bring the high-end visual experience of 3D images and 4K resolutions to the living room. There is no way to produce viable chips on 90nm or bigger, hotter processes.
  • cyberlink, May 19, 2010 11:48 PM
    geok1ng - HDMI 1.4 supports 1080p60 stereoscopic video with frame packing. I'm not sure what you are referring to when you say "if you can accept 24 Hz signals". While the full spec is confidential and only available to HDMI adopters, you can go to HDMI's website and request a subset of the HDMI 1.4 specs. This "extraction" document provides all of the detail about the 3D modes.

    hixbot - an HDMI 1.4 3D source (HTPC, Blu-ray player, or other device with HDMI 1.4 output) can choose to support one of several mandatory 3D video signal formats. If an HDMI 1.4 sink (device with input) signals that it supports 3D, it must support all mandatory 3D modes (it can advertise support for additional modes).

    Tom Vaughan
    Cyberlink
  • jsm6746, May 20, 2010 12:47 AM
    fantastic report tom... on tom's hardware... 5stars...
  • ca87, May 20, 2010 9:42 AM
    This is pure Computer vision! Nice report.
  • B-Kills, May 20, 2010 2:21 PM
    thanks for the report, interesting read....