3D gaming via holograms - when will it become science fact?

kittyhawk

Distinguished
Sep 28, 2007
31
0
18,530
3D accelerated graphics are improving every day. But no matter how much they improve, they are still no more than a pseudo-3D image on the 2D surface of an LCD (or CRT) monitor.

My question is how much further can 3D graphics on a home PC improve on the 2D surface of today's monitors? Will there be a limit to which the 3D graphics that most of us know can improve? And when will holographic displays (and holographic computer games) become a reality at home?

Are nVidia and ATI currently doing any R&D on future GPUs that can render 3D images via hologram? Or is holographic gaming still far-fetched with the technology of the near future?

[Image: hologram.jpg - Hologram from a sci-fi movie.]
 

gamebro

Distinguished
Mar 10, 2007
239
0
18,680
5 years..... to get basic holograms, with really crappy star wars quality in resolution....
50 years to get a resolution that is life like, like in Star Trek (next gen). =D
But meh..... Who needs holograms? VR headsets would probably be easier and more immersive anyways....
 

ahmshaegar

Distinguished
Dec 15, 2007
59
0
18,630
Well, if you really want to get down to it all, everything you see is a pseudo-3D image, because the image on your retina is ultimately 2D. This was a topic of contention for the philosophers of the 1700s such as George Berkeley, among others.

That aside, I think we all would love to see true 3-dimensional displays, though that development might necessitate some kind of redesign of input devices (a chance for a company other than Logitech/Razer et al. to take over the high end input device market!) Clearly, the mouse in its current incarnation isn't ideal for three dimensions. A gyro mouse is also not ideal (I wouldn't want to be holding my hand up all the time. It'd get tiring.)

Before the 3D interface becomes reality (it will be a long time... first for maybe military or scientific use, then over the decade-plus after its invention it would trickle down to consumers, and perhaps another half-decade or so before it becomes affordable), perhaps multi-touch (like the iPhone, Microsoft Surface) will become the "next big thing." I don't see computers with multi-touch taking the form of our current desktops or laptops. They should take the form of a table, as seen in Surface. Why? Hold your hand out and touch your monitor. Keep holding it there. Pretend to drag windows around. Keep going... keep going... It gets tiring. Sure, you can put your hand down, but it gets annoying to lift your hand up, then put it down, then repeat ad nauseam.

As for nVidia and AMD doing R&D on the hardware to display 3D? I don't really know enough to comment, but I can say for sure that the bandwidth between the video card and the monitor would need to be orders of magnitude higher. Instead of sending data for a 2D display (x by y pixels at 60 Hz, for example), you'd need to send data for three dimensions (I'm guessing it'd be a square base with a height less than the length of the sides, i.e. x by x by y where x > y.) Internally (in the graphics adapter), if data is already handled in 3D, then rather than "squishing" that information down into a 2D image for a 2D display, just send it as is. Of course, I'm making the assumption that the graphics card handles data this way, which may not be true...

Let's just take a quick look at the bandwidth needed, shall we? I don't actually know the bandwidth of current displays, so let's do a comparison. Disclaimer: I know nothing about how graphics cards really work.
Resolution of 2D display is x by y at f Hz, so requires xyf pixels per second.
Resolution of 3D display is x by x by y at f Hz, so requires (x^2)yf "voxels" per second. Keeping x, y, and f equal for both examples, dividing 3D by 2D gives us x, so a 3D display at the same base resolution as a 2D display will require x times the data. Plug in even a low resolution and refresh rate, such as 800 by 600 @ 60 Hz: x = 800, so an 800 by 800 by 600 display at 60 Hz REQUIRES 800 TIMES the data of an 800 by 600 display. It only gets worse as resolution increases, and as far as I can tell, no current interface is anywhere near good enough. Someone, please correct me if I did something wrong; my initial assumptions might be entirely off, which would invalidate the whole premise.
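The arithmetic above is simple enough to sanity-check in a few lines. Here's a quick Python sketch of it (the numbers and the x-by-x-by-y volume assumption are just the hypothetical example from this post, not real display specs):

```python
# Compare the raw data rate of a 2D display (x by y at f Hz)
# against a hypothetical volumetric display (x by x by y at f Hz).

def data_rate_2d(x, y, f):
    """Pixels per second for an x-by-y display refreshing at f Hz."""
    return x * y * f

def data_rate_3d(x, y, f):
    """'Voxels' per second for an x-by-x-by-y volume refreshing at f Hz."""
    return x * x * y * f

x, y, f = 800, 600, 60  # modest 800x600 @ 60 Hz example from above

rate_2d = data_rate_2d(x, y, f)
rate_3d = data_rate_3d(x, y, f)

print(rate_2d)             # 28800000   (~28.8 million pixels/s)
print(rate_3d)             # 23040000000 (~23 billion voxels/s)
print(rate_3d // rate_2d)  # 800 -- exactly x, as derived above
```

So even before counting bits per pixel, the volumetric version needs x (here 800) times the data of the flat one, which is why the cable/interface looks like the bottleneck.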

Sorry for the reply; probably not what you expected. If you don't read what I said... in summary:

-Your eye sees 2D anyway, since the image on the retina is 2D
-I think multi-touch will be the next big thing
-It'll be a long, long time before 3D displays come out. Heck, the closest thing I see out right now is this laser that turns air into plasma, creating bright spots of light. The resolution is a far cry from sci-fi movies.
-If graphics cards function the way I think they do, then there won't be many internal changes necessary to have them feed a 3D display. A new cable will be needed, though, as the bandwidth required will be orders of magnitude greater than before.
-A ton more data is needed for 3D than 2D displays.

Oh yes, I almost forgot. Sure, the eye sees 2D, but you have two of them, and your brain provides the depth perception, so a 3D display would certainly be beneficial.

I also forgot to put the link to the 3D display. Not something you want to stick your head in! http://www.physorg.com/news11251.html
 

kittyhawk

Distinguished
Sep 28, 2007
31
0
18,530
I remember seeing holographic gaming in a recent sci-fi movie titled 'The Island'. Hmm, how much computing power did they need for the holographic game in the cloning facility?