Primer: The Principles Of 3D Video And Blu-ray 3D

Depth Perception, Continued

Parallax: As we move, we notice that the apparent position of nearby objects changes more than that of distant objects. In the photos below, as a virtual camera moves from left to right across a virtual three-dimensional scene, you can see that closer objects appear to move (from right to left) farther than distant ones.

Unlike other monocular depth and distance cues, parallax is only observed over time, in moving images. Movies and video are, of course, moving images, so parallax is present whenever the camera or the subjects move.

Blu-ray 3D: Parallax
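
To make the parallax relationship concrete, here is a minimal sketch (not from the article) of the pinhole-camera geometry behind motion parallax: when the camera slides sideways, an object's image shifts in proportion to the camera movement and in inverse proportion to the object's distance. The focal length and distances below are illustrative values, not figures from the article.

    # Illustrative sketch: horizontal image shift of a static object when a
    # pinhole camera translates sideways. Nearer objects shift farther on screen,
    # which is the motion-parallax cue described above.
    def parallax_shift_px(camera_move_m, object_depth_m, focal_px=1000):
        """Approximate horizontal shift (pixels) of an object at object_depth_m
        when the camera moves sideways by camera_move_m (pinhole model)."""
        return focal_px * camera_move_m / object_depth_m

    for depth_m in (2.0, 5.0, 20.0, 100.0):
        shift = parallax_shift_px(camera_move_m=0.5, object_depth_m=depth_m)
        print(f"object {depth_m:6.1f} m away shifts {shift:6.1f} px")
    # A 0.5 m camera move shifts the 2 m object by 250 px, but the 100 m object by only 5 px.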


Texture gradient: On surfaces that have a regular pattern, we can judge distance from the spacing of the pattern: the spacing appears wider and the features larger on the parts of the surface that are closer to us. In the photo below, the pattern formed by the paving stones helps us judge the relative distance of the people and objects that we see. Both the density of the pattern and the perspective it provides help us sense distance.

Air quality: Far-away objects are sometimes obscured by haze or fog, while nearby objects are not.

Accommodation (Focus) and Convergence: When we look at objects that are close to us in the real world, our eyes do two things to see the object clearly. First, our eyes converge inward, so that each eye is aimed at the spot we want to focus on. Second, muscles in each eye change the shape of the lens to adjust its focus, in a process called accommodation. The feedback these muscles send to your brain as you focus on different objects gives it some idea of the distance to each object you see.
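
As a rough illustration of the convergence cue, the sketch below (not from the article) computes the angle between the two lines of sight when both eyes fixate a point straight ahead. The interpupillary distance is an assumed typical value of about 63 mm; nearby objects demand a noticeably larger inward rotation than distant ones.

    import math

    # Illustrative sketch: total convergence angle when both eyes fixate a point
    # straight ahead. IPD_M is an assumed typical interpupillary distance.
    IPD_M = 0.063  # ~63 mm between the eyes

    def vergence_angle_deg(distance_m, ipd_m=IPD_M):
        """Angle (degrees) between the two lines of sight for a fixation point
        distance_m straight ahead of the viewer."""
        return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

    for d in (0.3, 1.0, 3.0, 10.0):
        print(f"fixating at {d:4.1f} m -> eyes converge by ~{vergence_angle_deg(d):5.2f} degrees")
    # At 0.3 m the eyes converge by about 12 degrees; at 10 m they are nearly parallel.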

All of these cues provide depth information, even when we view a scene with only one eye. They also help us sense depth when we view standard two-dimensional images. Artists and filmmakers have understood these important visual cues, and have exploited them to add a feeling of realism and depth to paintings, photos, and movies for many years.

Of course, a 2D movie is a flat rendering of a 3D scene. When you watch one, your eyes focus on the screen and stay focused there (at the same distance) throughout the movie. You don't need two eyes to perceive depth from these cues, but you do need two eyes to see stereoscopic 3D.

3D movies replicate the images that your eyes would see if you were standing where the 3D camera was when the movie was filmed. Objects and scenery appear to be at different distances, and if all goes according to plan, the audience perceives that they are "on location."
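
For a sense of how those two slightly different images translate into perceived depth, here is an illustrative sketch (not from the article): given the horizontal separation between a point's left-eye and right-eye images on the screen, simple similar-triangle geometry gives the distance at which the fused point appears. The 63 mm eye separation and 2.5 m viewing distance are assumed values chosen only for illustration.

    # Illustrative sketch: where a fused stereoscopic point appears to sit in depth,
    # given the horizontal separation between its left-eye and right-eye images on
    # the screen. EYE_SEP_M and VIEW_DIST_M are assumed values, not from the article.
    EYE_SEP_M = 0.063    # typical distance between the eyes
    VIEW_DIST_M = 2.5    # viewer-to-screen distance

    def apparent_depth_m(screen_sep_m, eye_sep_m=EYE_SEP_M, view_dist_m=VIEW_DIST_M):
        """Perceived viewer-to-point distance (similar triangles).
        screen_sep_m > 0 (uncrossed): the point appears behind the screen.
        screen_sep_m = 0: the point appears exactly at the screen.
        screen_sep_m < 0 (crossed): the point appears in front of the screen."""
        return view_dist_m * eye_sep_m / (eye_sep_m - screen_sep_m)

    for sep_mm in (-30, 0, 30, 60):
        depth = apparent_depth_m(sep_mm / 1000.0)
        print(f"on-screen separation {sep_mm:+3d} mm -> point appears ~{depth:5.2f} m away")
    # -30 mm pulls the point to about 1.7 m, in front of the 2.5 m screen;
    # +60 mm pushes it to roughly 52 m, far behind the screen.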

25 comments
  • JohnnyLucky
    Very informative primer. Lots of information that was easy to understand.

    Unfortunately I am one of those who recently purchased a new TV. It will be quite some time before I upgrade.
    1
  • TheGreatGrapeApe
    Nice article, but I think there are a few issues with regard to the overall balance of the information being put forth.

    I understand the author's preference for shutter glasses (especially since it's a certain product's preferred method of choice), even if I don't share it. The major limitation is having to buy a pair for all your friends coming over, which gets impractical until they are more commonplace.

    Also, polarized solutions are not limited in resolution if they are set up beyond just the example provided in this article (like they do in the theatre with dual projectors [like the THG review by Don, see: http://www.tomshardware.com/reviews/3d-polarized-projector,2589.html]), and they may have an improving single-source future with 2K and 4K displays on the horizon. It's a question of preference, but it seems like the full story wasn't explored on that subject.

    Now on to a pet peeve: I love the part about "While set-top Blu-ray players will need to be replaced, PC-based Blu-ray player software can be upgraded." as a subtle product benefit plug.

    Unless it's a free upgrade, you are still replacing the software, not upgrading it (it's not a plug-in), and you're likely forking out nearly the same amount of money for an update that cost a fraction of that to produce, so it's not like it's a major advantage. Especially when upgrading requires a FULL upgrade to the most expensive model, PowerDVD (version #) Ultra 3D, and I can't simply add it to my existing PowerDVD bundles, thus potentially changing my backwards compatibility (Ultra 9 already removed the HD-DVD support I had in Ultra 7, which I upgraded to on my LG HD-DVD/BR burner [and also used with my old Xbox USB HD-DVD player]).

    Make it a ~$20 independent 3D add-on and then you have a point [ooh, I can save $5 'til May 25 :sarcastic: gee, thanks!]. Until then it's $99 (or $94.95 for loyal saps) vs. $150-200, plus with the set-top route I now have a second BR/DVD player for another room or to give to a friend (the BR software on its own is useless to give to someone else without a drive), and that's not even compared to the free PS3 upgrade.

    Also, can someone explain this statement:
    "Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future."

    Didn't Cyberlink already show their BR-3D solution on ATi hardware last year? So what's the issue?

    Also, why is it limited to "GeForce 300M-series mobile graphics" when often the core is the same as a previous-generation 200M-series part (for example, GTS 350M / 250M)?

    And this section, "Full-quality 120 Hz frame-sequential 3D video (such as Blu-ray 3D) is only supported through a High Speed HDMI cable to an HDMI 1.4-compliant TV," seems to miss the dual-link DVI-to-monitor option currently being used for 3D on PCs, and also the monitors/TVs with dual HDMI 1.3 inputs.

    A nice little article for people unfamiliar with 3D, but there's a subtle undercurrent of product preference/placement in it, and far too many generalities with little supporting information. :??:
    6
  • hixbot
    Well done. I would have liked more detail on the HDMI 1.4 spec, specifically frame packing and the mandatory standards (no mandatory standard for 1080p60 frame packing).
    Also some info on AVRs and how an HDMI 1.3 AVR might pass on 3D video and still decode bitstream audio, or not: do we need HDMI 1.4 AVRs to decode audio from a 1.4 source? We shouldn't need 1.4 receivers since the audio standards haven't changed, but I'm understanding that in fact we do need new receivers. :/
    2
  • hixbot
    double post. good article.
    0
  • ArgleBargle
    Unfortunately, for people with heavy vision impairments (astigmatism, etc.) that require corrective lenses, such 3D technology is out of reach for the time being, or at least next to useless. Until some enterprising company comes out with 3D "goggles," people who wear corrective lenses might as well save their money.
    -1
  • boletus
    3D is cool, and high-definition video is cool. But Sony's moving target of a BD standard is not cool, and Cyberlink's bait-and-switch tactics are not cool (unless you have bundles of money you can throw at them every 6-12 months). I sent back my BD disc drive (retail, with Cyberlink software) for a refund after finding out that I would have to shell out another $60-100 just so I could watch a two-year-old movie. As far as I'm concerned, high-definition DVD video is dead until some more open standards and reliable software emerge.
    1
  • cangelini
    Great,

    This piece is a prelude to tomorrow's coverage, by Don, of Blu-ray 3D on a notebook and a desktop. Perhaps that one will answer any of the questions you were left with here?

    As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that it isn't ready to discuss the technology. It's behind.

    The mention of dual-link DVI was in the first revision of this piece and removed in a subsequent iteration. I've asked the author for additional clarification there and should have an answer shortly.
    2
  • cangelini
    So it turns out there were two sections on this and one was cut accidentally. Should be good to go now, though--dual-link DVI is discussed with PC displays!
    0
  • cleeve
    TheGreatGrapeApe: Also, can someone explain this statement: "Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future." Didn't Cyberlink already show their BR-3D solution on ATi hardware last year? So what's the issue?


    It turns out the demo (I think it was at CES?) only used CPU decoding over an ATI graphics card; the Radeon itself did none of the decoding.

    The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU. He said all four threads were being stressed under software decoding; I'm not sure what quad-core CPU they were using, though.

    Definitely something I'd like to test out in the future...
    0
  • Alvin Smith
    This was a very informative and well written article BUT, I chose to skip to the last two pages ... Because ...

    These implementations, while ever more impressive, are still being threshed out. Because of possible physiological side effects, I think I will NOT be a first adopter, with this (particular) tech (3D).

    Anyone ever watch that movie "THE JERK", with STEVE MARTIN ??

    = Opti-Grab =

    ... I can see all these class-action suits by parents of cross-eyed gamers ... hope not, tho ... I *AM* very much looking forward to the fully refined "end game", for 3D ...

    Additionally, the very best desktop workstations are only just now catching up to standard (uncompressed) HD resolution ingest and edit/render ... since that bandwidth IS shared, between both eyes, this may be a non-issue.

    I will let the kiddies and 1st adopters take-on all those risks and costs.

    Please let me know when it is all "fully baked" and field tested!

    = Alvin = (not to mention "affordable").
    1
  • cyberlink
    cangelini: As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that it isn't ready to discuss the technology.

    While AMD has not yet announced their specific plans and schedule to support Blu-ray 3D MVC hardware accelerated decoding on ATI graphics, they were willing to confirm that a solution is coming for Radeon 5000 series graphics.

    Tom Vaughan
    Cyberlink
    1
  • TheGreatGrapeApe
    Cleeve: It turns out the demo (I think it was at CES?) only used CPU decoding over an ATI graphics card; the Radeon itself did none of the decoding.


    Ah, that makes more sense (of what was trying to be said, not of ATi/AMD's method), which is a la the AVIVO X1K series: make it 'sound' hardware accelerated, brilliant! [:thegreatgrapeape:5]

    So, it's still available, just not hardware assisted. It's not like it isn't possible, as that statement would suggest; you just don't get any hardware benefit. Notice they kept the Intel portion separate, mentioning only the dual-stream HD decoding (available since the HD 4600 series and GF 9600 series), implying it's doable on Intel but not on the next stated option, which would come in the future. That section isn't well written if providing clarity is the goal. One would assume from the statement that A) 3D BR is not possible if running on a new HD 5770 with a Core i7 920-980X, and B) that when it is 'made possible' it will only be on the HD 5K series.

    Quote:
    The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU. He said all four threads were being stressed under software decoding; I'm not sure what quad-core CPU they were using, though. Definitely something I'd like to test out in the future...


    Yeah sorta gets back to the VC-1 H.264 decoding of the early generation HD-acceleration GPUs.
    Still unclear why it's nV G300M-centric though based on the relationship of the chips as stated above.

    BTW, need to get you some new projectors for a 1080 stereo projector setup. Isn't it tax return time? :whistle:
    0
  • cyberlink
    Cleeve: The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU.

    To clarify, while it's possible to play Blu-ray 3D on a PC without video decoding acceleration (video decoding on your graphics processor), it takes most of the CPU power of a quad-core CPU to do software decoding of Blu-ray 3D MVC. GPU accelerated decoding is really the way to go, if possible.

    Tom Vaughan
    Cyberlink
    0
  • cleeve
    TheGreatGrapeApe: So, it's still available, just not hardware assisted.


    Well grape, that's where things get interesting. It might be *possible*, but it can't be *available* until they develop something.

    In Nvidia's case, they have their own 3D Vision infrastructure in place, so you plug in the 3D Vision stuff and you're off to the races.

    Radeons, on the other hand, I think it's safe to say will never be 3D Vision compatible. So I can't think of any way AMD will be able to provide a full-resolution 3D solution... in the near future, anyway. Maybe they'll someday be able to plug into 3D TVs and use those sets' proprietary glasses, but for that they'd need HDMI 1.4, and I'm not sure the 5000 series can handle that with the current hardware.

    There's a lot to talk about, but it's easier to direct you toward my article that's coming out tomorrow. Then we can chat. :D

    Take care,

    - Cleeve
    0
  • bjrobert
    I'm holding out for magic eye TVs.
    0
  • geok1ng
    hixbot: Well done. I would have liked more detail on the HDMI 1.4 spec, specifically frame packing and the mandatory standards (no mandatory standard for 1080p60 frame packing). Also some info on AVRs and how an HDMI 1.3 AVR might pass on 3D video and still decode bitstream audio, or not: do we need HDMI 1.4 AVRs to decode audio from a 1.4 source? We shouldn't need 1.4 receivers since the audio standards haven't changed, but I'm understanding that in fact we do need new receivers.


    Great comments, but ATI is not showing up for the game. If a product is not on the shelves, it will not sell. It's as simple as that, as NVIDIA learned the hard way with Fermi.

    The 3D modes are a lose-lose alternative: it is either an expensive display coupled with inexpensive glasses, or a mildly expensive display coupled with mildly expensive glasses.

    No matter which one you go with, you lose performance or resolution: single-link DVI and HDMI can't carry 3D over a 1080p60 link. HDMI 1.4 was the salvation of 3D, if one can accept 24 Hz signals...

    DisplayPort would be the way to go, but TVs are HDMI's domain, and will remain so for decades to come, thanks to HDMI audio.

    The point is that I see more benefit from higher resolutions than from 3D, and there is no consumer-grade cable today that can deliver 1080p (or higher resolutions) at 60 Hz in 3D. But even modest systems demand so much computational power that heat dissipation issues come into play, much like the graphics war of performance and heat.

    It would take a massive change in the way consumer-grade TVs and players are manufactured to bring the high-end visual experience of 3D images and 4K resolutions to the living room. There is no way to produce viable chips on 90 nm or bigger, hotter processes.
    1
  • cyberlink
    geok1ng - HDMI 1.4 supports 1080P60 stereoscopic video with frame packing. I'm not sure what you are referring to when you say "if you can accept 24 Hz signals". While the full spec is confidential and only available to HDMI adopters, you can go to HDMI's website and request a subset of the HDMI 1.4 specs. This "extraction" document provides all of the detail about the 3D modes.

    hixbot - an HDMI 1.4 3D source (HTPC, Blu-ray player, or other device with HDMI 1.4 output) can choose to support one of several mandatory 3D video signal formats. If an HDMI 1.4 sink (device with input) signals that it supports 3D, it must support all mandatory 3D modes (it can advertise support for additional modes).

    Tom Vaughan
    Cyberlink
    0
  • jsm6746
    fantastic report tom... on tom's hardware... 5stars...
    0
  • ca87
    This is pure Computer vision! Nice report.
    0
  • B-Kills
    thanks for the report, interesting read....
    0