Stereo Shoot-Out: Nvidia's New 3D Vision 2 Vs. AMD's HD3D
Nvidia just updated its stereoscopic 3D ecosystem with 3D Vision 2. We show you what makes this initiative different and how it compares to the competition. Then, we benchmark GeForce and Radeon graphics cards in a no-holds-barred stereo showdown!
Test System And Benchmarks
Our time to put this story together was relatively limited, so we chose to test the graphics cards that make the most sense. We already know that enabling stereoscopic 3D causes a performance hit. So, in general, interested parties should come to the table with the highest-end graphics hardware they can afford. AMD’s HD3D driver is not yet able to benefit from two Radeon cards in CrossFire, so the best-case scenario is a single Radeon HD 6970.
On the 3D Vision side, Nvidia's GeForce GTX 570 is comparable to the Radeon HD 6970. In addition, since 3D Vision does support multiple cards in SLI, we're also including a high-end GeForce GTX 580 SLI configuration.
Finally, we want to show you what to expect from low- to mid-range cards. Our original plan was to use a Radeon HD 5770, but both of the models we have on-hand refused to work with the TriDef driver, reporting that they cannot detect the 3D monitor over DisplayPort. So, we had to go with a Radeon HD 6790 to represent AMD’s entry-level 3D card. With Nvidia's GeForce GTX 460 768 MB no longer available, we chose the similarly-priced GeForce GTX 550 Ti to represent the bottom of Nvidia’s line-up.
TriDef’s Virtual 3D Mode
We tested each Radeon card twice: once in the default TriDef 3D mode, and once in TriDef’s Virtual 3D mode. Virtual 3D mode often provides a performance benefit by rendering a single viewpoint and using the depth buffer to extrapolate the image for the second eye. As an added benefit, this mode is usually impervious to shadow and lighting artifacts suffered by the default TriDef 3D mode and 3D Vision.
Virtual 3D mode often comes under fire because it’s misunderstood. To be clear, Virtual 3D mode is not a poorly simulated 2D-to-3D conversion like the ones you might find on 3D televisions and in DVD playback software. Instead, Virtual 3D mode uses data in the scene’s depth buffer to create a separate image for each eye. This is a valid model, and Crysis 2 uses the same technique to create stereoscopic 3D for Nvidia's 3D Vision technology.
This mode is not perfect, though. The depth buffer can’t account for transparent textures, so objects behind chain-link fences appear flat. Virtual 3D mode often struggles to identify the user interface, and as a result it's often distorted by the objects behind it. The edges of objects are sometimes blurred, as the software extrapolates pixels from limited data. It also doesn’t appear to work with multi-sample anti-aliasing. Despite those issues, Virtual 3D mode often serves up better image quality than the default TriDef 3D mode, so it’s a valid option to test. If a game we’re testing has significant problems, we’ll point that out.
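To make the technique concrete, here is a minimal, hypothetical sketch of depth-image-based rendering (DIBR), the general approach a depth-buffer mode like this is built on: each pixel is shifted horizontally by a disparity derived from the depth buffer, and the holes left behind are filled from neighboring pixels, which is exactly where the edge blurring described above comes from. The function name, parameters, and constants are illustrative assumptions, not TriDef's actual API.

```python
import numpy as np

def reproject_for_right_eye(color, depth, eye_sep=0.03, focal=1000.0):
    """Sketch of depth-image-based rendering (DIBR).

    color: (h, w, 3) array for the rendered (left-eye) view.
    depth: (h, w) array of per-pixel depth values.
    Nearer pixels (small depth) receive larger disparity, so they
    shift more between the eyes -- that is what creates the 3D effect.
    """
    h, w, _ = color.shape
    right = np.zeros_like(color)
    filled = np.zeros((h, w), dtype=bool)

    # Disparity in pixels: proportional to eye separation and
    # inversely proportional to depth (clamped to avoid divide-by-zero).
    disparity = (eye_sep * focal / np.maximum(depth, 1e-3)).astype(int)

    # Forward-warp each pixel into the synthesized right-eye image.
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                right[y, nx] = color[y, x]
                filled[y, nx] = True

    # Naive hole filling: copy the nearest filled pixel to the left.
    # Real implementations do something smarter, but this is the step
    # that smears object edges when the data runs out.
    for y in range(h):
        for x in range(1, w):
            if not filled[y, x]:
                right[y, x] = right[y, x - 1]
    return right
```

Because only one full scene render is needed, the cost of synthesizing the second eye is roughly a per-pixel pass rather than a second geometry pass, which is why this mode can outpace true two-viewpoint rendering. The flat-fence artifact also falls out of the sketch: a transparent texture has only one depth value per pixel, so everything visible through it shifts as a single surface.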
Here are the particulars of our test system:
Test Hardware | |
---|---|
Processor | Intel Core i5-2500K (Sandy Bridge), overclocked to 4 GHz, 6 MB L3 cache, power-saving settings enabled, Turbo Boost disabled |
Motherboard | MSI P67A-GD65, Intel P67 Chipset |
Memory | OCZ DDR3-2000, 2 x 2 GB, at 1333 MT/s, CL 9-9-9-20-1T |
Hard Drive | Western Digital Caviar Black 750 GB, 7200 RPM, 32 MB cache, SATA 3Gb/s; Samsung 470 Series SSD 256 GB, SATA 3Gb/s |
Graphics Cards | 2 x Nvidia GeForce GTX 580 in SLI (for 3D Vision); Nvidia GeForce GTX 570 (for 3D Vision); Nvidia GeForce GTX 550 Ti (for 3D Vision); AMD Radeon HD 6970 (for AMD HD3D); AMD Radeon HD 6790 (for AMD HD3D) |
Displays | Asus VG278, 27" 1080p 3D Vision monitor; Samsung S23A750D, 23" 1080p monitor |
Power Supply | Seasonic X760 SS-760KM: ATX12V v2.3, EPS12V, 80 PLUS Gold |
CPU Cooler | Cooler Master Hyper TX 2 |
System Software And Drivers | |
Operating System | Microsoft Windows 7 Ultimate x64 |
DirectX | DirectX 11 |
Graphics Driver | GeForce: 285.38 Beta, AMD Catalyst 11.9 |
Stereoscopic Driver | TriDef 3D 4.6 |
Games | |
StarCraft II | version 1.3.6.19269 |
World of Warcraft | version 4.2.0.2506 |
Bulletstorm | version 1.0.7147.0 |
Lost Planet 2 | version 1.0.1.129 |
Left 4 Dead 2 | version 2.0.8.5 |
Metro 2033 | version 1.0.0.1 |
DiRT 3 | version 0.1.0.11 |
airborne11b I didn't see any mention of crosstalk (*3D ghosting*) in this article.
Does this new Nvidia 3D Vision 2 really reduce crosstalk? Hell, it's even listed in the promotion of the product:
http://media.bestofmicro.com/7/X/311325/original/Third%20Generation%203D%20Monitors.JPG
Yet I saw no mention of it in this article. Any word on how well it handles crosstalk / 3D ghosting would be appreciated. -
de5_Roy The glasses look kinda dorky... still waiting for glasses-free 3D. I'd rather use a 120 Hz monitor than the eye-hurting 60 Hz ones (without 3D). -
bystander It would appear that Virtual 3D mode takes a lot less power to render a single image and extrapolate the other eye's image than to render two images independently in normal mode. This appears to be the only reason it can compete without CrossFire support. This is both good and bad: it works in almost all situations, but never at great visual quality.
I'd also like to point out that the lack of AA is not a big deal in 3D. I find I don't notice the same issues without AA in 3D. When the mind fuses two images together, it's not as bothered by aliasing. -
airborne11b [quoting greghome] "IMO, 3D is still not as appealing and not as cheap as Eyefinity or 2D Vision tri-screen gaming."
I'm a fan of both 3x monitor set ups, but 3D is a lot cooler.
Problem with 3x monitors is the fish-eye effect, which is very disturbing (and not fixable) in landscape mode. The best you can do with 3x monitors is use expensive 1920 x 1200 IPS monitors in portrait mode, but in that setup the bezels usually cut right through game HUDs / hotkey bars, and it puts the bezels far too close to your center of view.
Furthermore, this kind of Eyefinity/Nvidia Surround monitor setup costs about $1200 - $1500 (or even more for projector setups, which require a ton of space and cost as much or more if you want to get rid of the bezels).
Now consider Nvidia 3D. It adds amazing depth and realism to games over 2D, doesn't have a negative fish-eye effect, has no bezels to deal with, has the same GPU power requirements as 3x monitors (or less), doesn't interrupt game HUDs or hotbars, and only costs about $600-700 for the most expensive 27" screen + glasses combo (even cheaper with smaller monitors).
The clear choice is 3D imo.
But 3x monitors are still much better than a single 2D monitor. I rocked 5760 x 1080 in BFBC2, Aion, and L4D for a long time :P.
3D is cooler though. -
billcat479 It seems people don't follow the news in this area very much. It's not sounding all that great.
I guess most people haven't read that people using 3D TVs have been getting headaches, and it's not a few but a lot of people.
They should have left it in the theaters.
I wouldn't be surprised if, when they finally do a good long-term study of people using these for gaming, those people turn out to have long-term medical problems. I have read enough to stay away from this 3D glasses hardware. At best I'd only use it very little, and for short stretches.
They really do need to do medical testing on this, because people are being affected by prolonged use of 3D glasses with TV. Add all-day video gaming, and I think there is a possibility, small or large, of long-term or permanent damage. They dumped this on the market pretty fast without doing any studies that I know of, but with the number of people reporting headaches, somebody damn well should start checking out the possibility of eye damage or worse.
Eyesight is pretty useful.
If they ever get a holographic display, then I'd be into it.
-
amk-aka-Phantom DiRT 3, first benchmark: the 6790 and 6970 should switch places! Right now the 6790 is shown performing five times as well as the 6970 :D Fix that, please. -
CaedenV @billcat479
So shutter tech, which has been around some 15 years, is dangerous, but holographic tech, which isn't really available yet, would be good? I would think you'd want to exercise caution with any new optical tech. Personally, I am allergic to the laser-to-eye theory of hologram tech.
As for the article, it was a great review! Looks like the tech is still too high-end for my budget, but I am sure they will iron out all the kinks by the time I am ready to replace my monitor (which won't be soon, as I love the thing). I am really curious about how the next-gen graphics cards will improve in this area! Can't wait for those reviews! -
airborne11b [quoting billcat479's post above in full]
This is the kind of uninformed, ignorant post that irritates me. Interweb wannabe docs who don't know how the human body works. Allow me to educate you.
People used to say that reading a book in the dark, or watching TV or a normal PC monitor "too close," would "damage your eyes." In fact, we know today that eyesight degeneration has a few causes, none of which are normal strain.
The most common cause of eyesight degeneration is presbyopia (from the normal aging process, in which the lens progressively loses its capacity to increase its power for near vision).
Also, UV rays degenerate tissue, so it's recommended you wear UV-protective sunglasses when outside in daylight. UV rays can cause your eyesight to weaken over time.
There's also refractive error (common in people of ALL ages): a condition that occurs either because the eye is too short or too long, or because the cornea or lens does not have the required refractive power. There are three types of refractive error: myopia (near-sightedness), hypermetropia (far-sightedness), and astigmatism, the condition where the eye does not focus light evenly, usually because the cornea is more curved in one direction than the other. It may occur on its own or alongside myopia or hypermetropia.
The very worst thing 3D Vision can do in terms of negative health effects is the same EXACT thing as reading too much, i.e., an extremely short-term headache. To cure the headache, TAKE A BREAK.
Also, as you build tolerance to 3D Vision (as I have, after just a week or two of consistent use), the headaches go away. Nvidia's 3D settings also let you adjust the depth of the 3D effect; less depth = less strain, and you can progressively increase the depth as you build tolerance to 3D monitors.
In closing, don't post nonsense about what you don't understand. It makes you look stupid. -
oneseraph I normally don't chime in, but in this case I just have to say: who cares? Every time I see an article about 3D graphics and 3D display tech, there are always excuses for the technology not being ready. Did I misunderstand, or did this article point out that neither Nvidia nor AMD has a ready-for-prime-time product? So why are they releasing this crap to us consumers and calling it a feature, when the truth is it's tech that still belongs in the lab? Come on: if you bought a blender that would blend strawberries but wouldn't work if you put bananas in it, you would not only return the item, but in all likelihood there would be a class-action suit against the manufacturer. In short, 3D is just not ready right now. The marketing departments of both Nvidia and AMD are being more than a little dishonest about their respective 3D features. There are lots of good reasons to buy a new graphics card; just don't be fooled into thinking that 3D is one of them.