
Wall-Sized 3D Displays: The Ultimate Gaming Room

Tags:
  • Graphics Cards
  • Gaming
  • Displays
  • 3D
  • Graphics
Last response: in Graphics & Displays
April 10, 2007 2:03:40 PM

http://www.tomshardware.com/2007/04/10/wall_sized_3d_di...

I have to start learning more about this setup. Does anyone know if there are any drivers for the x1900xt?

PS. Post # 1000. WOOOHOOOOO!!!! I feel like a dirty old man.


April 10, 2007 3:06:27 PM

You'd have to use the eDimensional driver with the X1900XT.

It might work but it'd be interlaced, and the eDimensional drivers are buggy.
April 10, 2007 5:12:48 PM

I called them, and confirmed the THG article: 1024 x 384. That really stinks. Any idea what that kind of resolution would look like? I don't even know what shape that is. What's the standard panel for that?

Games would suck.
April 10, 2007 6:00:25 PM

It's still 1024x768, just interlaced. You only see every second line of resolution.

Television is interlaced, at 60 Hz.

It works like this, except the interlaced fields change much faster:
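To make the idea concrete, here's a tiny sketch in Python (the 8-line "frame" is made up for illustration) of how one progressive frame splits into the two fields an interlaced signal alternates between:

```python
# Sketch: split one progressive frame into two interlaced fields.
# A "frame" here is just a list of scanline indices, 0..7 for brevity.
frame = list(range(8))

even_field = frame[0::2]  # lines 0, 2, 4, 6 -- drawn on the first pass
odd_field  = frame[1::2]  # lines 1, 3, 5, 7 -- drawn on the second pass

print(even_field)  # [0, 2, 4, 6]
print(odd_field)   # [1, 3, 5, 7]

# At 60 Hz interlaced, each field is shown 30 times a second.
# Stereo shutter-glasses drivers instead put the LEFT eye's image
# on one field and the RIGHT eye's on the other.
```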

April 10, 2007 6:37:42 PM

I was actually thinking about getting a setup like this on my LCD screen, since it can do 75 Hz at 1280x1024. But because I have an X1900XT, I couldn't find any drivers from ATI for this, and I had doubts about the eDimensional drivers. Thankfully, the article explained things further, so I guess I'll just have to stick with my current setup. I don't like interlaced resolutions, so the lack of progressive scan support has put me off. Oh well...
April 10, 2007 7:12:23 PM

This is actually one of the best articles on this site in a while. I want to set one up now.
April 10, 2007 7:16:38 PM

It just blows my mind that they cannot provide a simple solution to this. Anyone who has ever sat in a 3D movie at Disney will say it's just the best viewing experience ever. Game consoles, PC, TV, movies - I mean seriously, it covers all the bases. A tech like that would be extremely scalable. My research revealed that there was a push in '00/'01 to make it happen, but it never took off and manufacturers abandoned the idea. It probably had something to do with the dot-com crash, and 3D was never fully adopted by the public. If they make it mainstream, everyone will want one.
April 10, 2007 7:17:29 PM

I agree. First good article we've seen in a while. This one has me interested and wanting to do more research and get one. Gooooo Tom's!
April 10, 2007 7:18:07 PM

Does anyone know if there are any independent driver developers out there?
April 10, 2007 7:19:35 PM

Hey Cleeve, excellent illustration.
April 10, 2007 7:27:36 PM

Cleeve, how big of a difference is interlaced vs. progressive? Will you see it clearly with the naked eye? My projector can take a 1280x1024 resolution and actually output it at that (I know, random - native is 1024x768, but we discussed that a while ago; no overscan with a VGA cable). Anyway, will I still be able to view 1280x1024 using the VGA cable? If so, how different will the quality be?
April 10, 2007 7:35:04 PM

The main difference is that interlaced appears darker, because - even though the full 1024x768 resolution is being used - only half of the pixels are lit at any one time. You can also notice that the screen appears to be made up of horizontal lines.

A VGA cable can output at the same resolution as DVI, if that's what you're asking.

The most important thing you need to find out is your projector's refresh rates at different resolutions. You'll want to use the highest refresh rate you can.
April 10, 2007 7:40:56 PM

Cleeve, you may recall, but I have this projector, the Infocus LP640:
http://www.infocus.com/service/lp640/specifications.asp...

I bought a DVI to M1 cable to see if the quality would be better. All that happened is that when I selected 1280x1024 on my desktop, it overscanned on the projection screen. It fit fine when I used 1024x768 (the native). Here is the catch: when I used a VGA cable and tried 1280x1024, it fit perfectly on the screen (no overscan), and the quality was much better than at 1024x768 (fonts, etc.). I have no explanation for this.

Anyway, this is what I have on the LP640. Seems like THG was right about the setup being interlaced:

The LP640 certainly isn't a projector with any significant ability to handle video. Although it supports component video input, it does so only with a proprietary cable through the M1-DA input and not, as with so many projectors these days, through the 15-pin RGB port. What's more, even with that custom InFocus component video cable, the LP640 can handle component video only with progressive-scan sources, not far more common interlaced sources. For interlaced video, you'll have to use the S-video or composite inputs. These offer a bare minimum of quality: the range of the signal and the cable type are limited, and the deinterlacing is rudimentary and leaves plenty of video jitter.

http://digitalcontentproducer.com/mag/avinstall_infocus...
April 10, 2007 7:50:39 PM

Cleeve, another question. It says:
H-Synch Range: 16 - 110 kHz
V-Synch Range: 50 - 85 Hz

Are these my refresh rates? What should I set my output to in Catalyst if these are the specs? Thanks man.
April 10, 2007 7:55:31 PM

It's more of an issue of marketing and quality of products. I mean, sure, once you get a good product out, you'll then need support from graphics card companies, and need to make sure it supports as many screen technologies as possible and is compatible with as many games/movies as possible. Then, you need to market it so people will buy the products, and thus reinforce the support from graphics card and games companies, since then there would be a market share to aim at. What is out at the moment is alright; however, it lacks the overall support and isn't quite as polished as the mainstream market requires. If only ATI had a set of drivers that supported the eDimensional 3D glasses - but then again, I'm not surprised, since 3D displays cover a very niche market.
April 10, 2007 8:08:48 PM

Does the X1900XT output a progressive or interlaced format?
April 10, 2007 8:16:04 PM

The interlacing is in the driver, not the VGA output. The projector should handle it fine.
April 10, 2007 8:17:36 PM

Quote:

Are these my refresh rates? What should I set my output to in Catalyst if these are the specs? Thanks man.


The Catalysts should probably detect the maximum refresh rates of the different resolutions available on that projector.

It would be V-sync, 85 Hz maximum. But at what resolution?
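Those two spec lines cap the refresh rate together: the vertical sync range sets a hard ceiling, and the horizontal sync range limits how many scanlines per second the projector can draw. A rough back-of-the-envelope sketch (the ~5% vertical blanking overhead is an assumption, not part of the LP640's spec):

```python
# Rough estimate: the refresh rate at a resolution is limited by
# h_sync_max / total_lines_per_frame, and by the V-sync ceiling.
# The 5% blanking figure below is an assumed approximation.
H_SYNC_MAX_KHZ = 110.0   # LP640 spec: H-Sync range 16 - 110 kHz
V_SYNC_MAX_HZ = 85.0     # LP640 spec: V-Sync range 50 - 85 Hz
BLANKING = 1.05          # assumed ~5% vertical blanking overhead

def max_refresh(visible_lines):
    by_hsync = (H_SYNC_MAX_KHZ * 1000.0) / (visible_lines * BLANKING)
    return min(by_hsync, V_SYNC_MAX_HZ)

for res, lines in [("1024x768", 768), ("1280x1024", 1024)]:
    print(res, round(max_refresh(lines)), "Hz")
# 1024x768 85 Hz
# 1280x1024 85 Hz
```

Under those assumptions the H-sync range isn't the bottleneck at either resolution; the 85 Hz V-sync ceiling is.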
April 10, 2007 8:23:46 PM

I think it gets output as a progressive signal anyway; the interlacing only occurs within the drivers (like Cleeve explained).
April 10, 2007 9:00:06 PM

My desktop screen is a 19" Viewsonic VX922, at 1280x1024. What I did was clone that screen, so that when I open games, watch movies, etc., they automatically open on the projector screen (I don't need to drag items across, and besides, I couldn't get games to open on the projector screen when I had a "stretched" screen setup; they always defaulted to the monitor).

Anyway, my projection screen is therefore at 1280x1024. I assume then from your comments that I should up the refresh rate to 85 (I think it's currently on 75).

At 85 Hz, will the 3D setup work? If the signal is interlaced/deinterlaced/sent as progressive SOLELY using the drivers in our video cards, why the heck does eDimensional not get a grip and provide drivers that output progressive?
April 10, 2007 9:03:57 PM

Ahhhh, but the max refresh rate on the VX922 is 60 Hz, so if it's cloned, I cannot change the output of the projector separately. Difficult to explain why it's at 70, though??
April 10, 2007 9:16:01 PM

A dumb question - why do we need those special 3D glasses? If I get Shrek 3D (one of the handful of titles available on DVD), can't I just use regular plastic glasses?

Why do those glasses not work when we have to use the software to create 3D (in games, etc.)?
April 10, 2007 9:18:30 PM

I think it will be much less of an issue now than it was 5 or more years ago, in that most games are MADE with a 3D engine.

Not too long ago, they were still either simple 3D figures, or 2D sprites at 3D positions in a virtual space (like souped-up versions of DOOM/Wolfenstein).

If all it took was DRIVER creation rather than special titles just for the glasses, I think we would see a lot more of this.

Besides, with the latest cards out, do you REALLY need more than 1600x1200 with 8xAA? How much further can they push it on a flat screen? I think doing an 80 Hz refresh at 1600x1200 on an LCD capable of the same would be the next step, rather than increasing the resolution to something you cannot really appreciate!
April 10, 2007 9:28:07 PM

Eureka!

3D Stereo Anaglyph Glasses
NVIDIA now provides a low-cost solution for stereo. Using standard anaglyph glasses (red-blue filtered 3D) and the anaglyph mode in our driver, users can play games in 3D stereo for only the cost of the glasses - $1 to $5, depending on the quality. The anaglyph mode of the driver can also be used to run games in stereo using LCD flat panels, because higher refresh rates are not required.

Although the stereo effect is essentially the same, the use of red-blue glasses (E3D Stereo-Specs) can be harder on the eyes, so users may choose to upgrade to LCD glasses.

Anaglyph glasses (red-blue), the E3D Stereo-Specs, can be obtained from E3DMedia.
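For what it's worth, the anaglyph trick that doc describes boils down to simple channel mixing: the left eye's image supplies the red channel, the right eye's supplies green and blue, and the tinted lenses separate them again. A minimal sketch (the two-pixel "images" are made up for illustration):

```python
# Compose a red-cyan anaglyph from left/right eye images.
# Pixels are (r, g, b) tuples; real images would be pixel arrays.
left_eye  = [(200, 120, 40), (90, 90, 90)]
right_eye = [(10, 180, 220), (90, 90, 90)]

def anaglyph(left, right):
    # Red from the left eye, green and blue from the right eye.
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

print(anaglyph(left_eye, right_eye))
# [(200, 180, 220), (90, 90, 90)]
```

This is also why it works at any refresh rate: both eyes' images are in every frame at once, so no shutter synchronization is needed.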
April 10, 2007 9:30:20 PM

So guys, when will NVIDIA have support for the 8800GTS? I was thinking about upgrading to that card when Crysis came out. Even more of a reason to switch to NVIDIA.
April 10, 2007 9:45:21 PM

I assume the max refresh rate that this card supports is 80 Hz? Table below:

640 x 480     200 Hz
800 x 600     200 Hz
1024 x 768    200 Hz
1152 x 864    200 Hz
1280 x 1024   160 Hz
1600 x 1200   120 Hz
1920 x 1080   120 Hz
1920 x 1200   100 Hz
1920 x 1440    90 Hz
2048 x 1536    85 Hz

Am I right?
April 10, 2007 9:59:39 PM

Sweeeet! I so want one! :D 