
Nvidia 3D Vision Vs. AMD HD3D: 18 Games, Evaluated

September 29, 2011 4:21:35 AM

Nvidia 3D Vision is the best in all of them.
September 29, 2011 5:10:22 AM

Except for the ones where it's not recommended. Good thing I have one on this rig! Now I just have to shell out some cash for some 3D tech.
September 29, 2011 5:27:04 AM

3D is overhyped in my opinion; it will be some time before games can correctly exploit it.
September 29, 2011 6:11:43 AM

Every time Nvidia pushes out a proprietary format, they shoot themselves in the foot. They just can't make it marketable with such a low market share. You need something like Microsoft's 90% market share to even think about making a closed standard.
Anyone notice the bezel on the Samsung model? That's beautiful for multi-monitor.
September 29, 2011 8:25:07 AM

Great comprehensive review! Loved it.
September 29, 2011 8:56:17 AM

Nice one, and really long awaited.
Anonymous
September 29, 2011 9:41:41 AM

I tried TriDef in EVE Online; absolutely stunning. :) 
September 29, 2011 9:43:18 AM

Quote:
During preliminary testing, we noticed that a decent Phenom II X4 had some trouble providing smooth frame rates, and mid-level graphics cards were cut down to their knees


Time for Bulldozer!!!
September 29, 2011 10:00:16 AM

I loved it. Great review, keep it up!
September 29, 2011 10:35:06 AM

Would it kill Tom's to use high-resolution pop-up pics? It's nearly impossible to discern any differences in detail or artifacts when comparing such low-resolution images. C'mon...1024 x 317? Seriously?
September 29, 2011 11:27:16 AM

I thought we just had a report a month ago that 3D is bad for your brain. Shouldn't we wait to see if there's any real permanent damage from this tech before we jump right in and start using it?
September 29, 2011 12:23:15 PM

You jerks messed up the cross-view images. Why did you include empty black space between the frames? They need to touch; the further apart and the larger the images, the harder it is. I can do it fine with the pictures on Wikipedia, but these ones are impossible.
September 29, 2011 12:35:25 PM

The unwritten message here is that this is all totally unacceptable. If I had shelled out major cash for 3D hardware and then discovered only some 3D games are playable... I'd be yelling "shenanigans" from the highest mountaintop. WHAT YEAR IS THIS? Nobody puts up with this "incompatible" crap anymore... it's not 1980. Kudos to Nvidia for trying to lock down polished experiences. Shame on any game software vendor that uses the 3D logo and doesn't deliver the goods. That is "false advertising" as far as I am concerned, and they'd better put a leash on it before they get sued. It sounds like Nvidia has a framework that succeeds when licensed, and game companies shy away from it due to its cost and/or red tape. Fine... then don't release a 3D game!!! Sheesh. The more broken 3D crap vendors put on the market, the more people will assume 3D itself is broken, which clearly isn't the case. Stop ruining the reputation of 3D by releasing half-assed titles, please!
September 29, 2011 12:49:23 PM

Props to Bulletstorm and Metro 2033 for being the only titles that simply "worked" on both Nvidia and AMD solutions. They have proven it "can" be done, and everyone else should hang their heads in shame.
September 29, 2011 1:05:16 PM

Still not interested in 3D; nice article, though.
September 29, 2011 1:29:15 PM

Wow, man, what a battle between Nvidia and AMD!
In my opinion, both are great.
Anonymous
September 29, 2011 1:31:13 PM

Here is a miserable pro-Intel campaign. The reporters dedicate the article to AMD and Nvidia, yet consider it necessary to bring up Intel. Why? How much money do you get for this hidden pro-Intel campaign?
September 29, 2011 2:09:36 PM

Quote:
Here is a miserable pro-Intel campaign. The reporters dedicate the article to AMD and Nvidia, yet consider it necessary to bring up Intel. Why? How much money do you get for this hidden pro-Intel campaign?


So for you it'd be responsible journalism if we noticed a problem with hardware and buried it so our readers wouldn't find out?

Or are you saying we shouldn't report negative findings we notice from any product? Or do you mean just AMD?

From where I'm sitting, what you're suggesting isn't even-handed, fair journalism...
September 29, 2011 2:11:36 PM

bradleyg5 said:
You jerks messed up the cross-view images. Why did you include empty black space between the frames? They need to touch; the further apart and the larger the images, the harder it is. I can do it fine with the pictures on Wikipedia, but these ones are impossible.


No. The borders are there to help you focus. If the images were touching, your eyes would pick out the discrepancy on the edge and make crossviewing more difficult.

And what's with "jerks"...? Was name calling really necessary? :) 
September 29, 2011 2:14:09 PM

the_krasno said:
3D is overhyped in my opinion; it will be some time before games can correctly exploit it.


Hype: maybe.

But as far as games that correctly exploit it, they are already out there. There are some game titles that have superb stereoscopic support already.
September 29, 2011 2:39:08 PM

Um, I love the idea of 3D games, but there is no way on God's Green Earth that I'll be spending $400-$800 for a 3D monitor.

So, see ya in 5 years, 3D gaming.
September 29, 2011 2:48:22 PM

According to the HDMI 1.4a spec, one of the "secondary" formats is 1920 x 1080 at 60 Hz, with a "Frame-Pack" encoding. This means two full 1080p frames, both at 60 Hz. So if the source (GPU) and sink (HDTV) both support that format, then a system should be able to use it.

Don W claims that HDMI supports only up to 24 Hz at Full 1080p, but that is an oversimplification. That said, I'd sure like to have a few words with whoever on the HDMI group decided that such a format shouldn't be given more importance. Implementation of any secondary format by source or sink is optional. I'm guessing that the primary formats (which include 1080p24 Frame-Pack, among others) were chosen because they all fit within the slower speed-caps of older HDMI cables. HDMI should have required that all 1.4a-compliant sources be able to generate 1080p60 Frame Pack, but they didn't. Thank you HDMI.
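As a back-of-the-envelope check on why packed 1080p60 is demanding, the pixel-clock math can be sketched as follows (raster totals follow the standard CEA-861 timings; treat this as an illustration, not a spec citation):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """TMDS pixel clock in MHz for a given total raster and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2D 1080p60 uses a 2200 x 1125 total raster (active + blanking).
clk_2d = pixel_clock_mhz(2200, 1125, 60)       # 148.5 MHz
# Frame packing stacks both eyes (plus an active-space gap) into one
# tall frame, doubling the vertical total to 2250 lines.
clk_fp60 = pixel_clock_mhz(2200, 2250, 60)     # 297.0 MHz
clk_fp24 = pixel_clock_mhz(2750, 2250, 24)     # 148.5 MHz (packed 1080p24)

print(clk_2d, clk_fp60, clk_fp24)
```

The ~297 MHz needed for packed 1080p60 sits close to HDMI's 340 MHz high-speed ceiling, while packed 1080p24 fits in the same budget as ordinary 2D 1080p60, which is consistent with the mandatory/optional split described above.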
September 29, 2011 3:08:47 PM

Teramedia said:
According to the HDMI 1.4a spec, one of the "secondary" formats is 1920 x 1080 at 60 Hz, with a "Frame-Pack" encoding. This means two full 1080p frames, both at 60 Hz. So if the source (GPU) and sink (HDTV) both support that format, then a system should be able to use it.


I wasn't able to get 60 Hz to work, so it's probably not a practical solution at this time and I therefore didn't cover it.

If I can ever get it to work, I'll certainly report it. :) 
September 29, 2011 3:30:37 PM

I'm curious whether you can use a dual-link DVI to DisplayPort converter to allow HD3D at 1080p on more monitors. Has anyone ever tested this?

I'd also mention that one of the biggest advantages on the Nvidia side is SLI support. With 3D's increased demands, this is a pretty big advantage.
Anonymous
September 29, 2011 3:35:09 PM

The article has a factual error:

"While the Samsung S27A750D is the cheapest 27" option for enabling 3D, bear in mind that it requires you also spending extra on a middleware piece of software ($20 for iZ3D and/or $25 for TriDef Ignition)."

You don't have to pay extra for middleware (like TriDef) because it comes free with the Samsung 120 Hz monitor.
September 29, 2011 3:50:39 PM

Teramedia said:
According to the HDMI 1.4a spec, one of the "secondary" formats is 1920 x 1080 at 60 Hz, with a "Frame-Pack" encoding. This means two full 1080p frames, both at 60 Hz. So if the source (GPU) and sink (HDTV) both support that format, then a system should be able to use it. Don W claims that HDMI supports only up to 24 Hz at full 1080p, but that is an oversimplification. That said, I'd sure like to have a few words with whoever on the HDMI group decided that such a format shouldn't be given more importance. Implementation of any secondary format by source or sink is optional. I'm guessing that the primary formats (which include 1080p24 Frame-Pack, among others) were chosen because they all fit within the slower speed caps of older HDMI cables. HDMI should have required that all 1.4a-compliant sources be able to generate 1080p60 Frame-Pack, but they didn't. Thank you, HDMI.


I was reading an article about how HDMI 1.4a will be able to support that mode by 2012 or 2013. The way it was written, HDMI 1.4a is not fully supported at this time by any manufacturer.
September 29, 2011 3:51:20 PM

Being a regular at the EVGA forums, one of the biggest issues is trying to get the drivers sorted out for each 3D game that comes out. A recent thread discussing 3D gaming included a majority of replies from those who made the investment in 3D equipment and now regret the decision. It seems that 3D gaming, while a cool and unique experience, can become annoying and distracting, and isn't really worth it.
September 29, 2011 3:58:21 PM

dennisburke said:
Being a regular at the EVGA forums, one of the biggest issues is trying to get the drivers sorted out for each 3D game that comes out. A recent thread discussing 3D gaming included a majority of replies from those who made the investment in 3D equipment and now regret the decision. It seems that 3D gaming, while a cool and unique experience, can become annoying and distracting, and isn't really worth it.


There are also a lot of us who still enjoy it, like myself. After using it for the last few months, I actually get sick and feel disoriented when playing a game without it. A 2D image just doesn't feel right anymore; it's weird.
September 29, 2011 4:17:30 PM

3D is a novel idea and I am glad to see that the technology is out there, but I don't really like it all that much. All the 3D tech I've seen just doesn't seem ready for prime time. It is not a realistic viewing experience in my opinion.
September 29, 2011 4:29:03 PM

OMG, Tom's Hardware!!

How can you do an article comparing 3D technology when you don't even have the right hardware??

The Samsung 750/950 series is the first (real 120 Hz) AMD HD3D-compatible screen. You don't need to pay for TriDef; you get a license for free with the monitor.

It's like comparing Ford cars to GM cars, then saying "we don't have a Ford to test, so we're going to use a Volkswagen instead"!!

I was looking forward to this article, but it's kind of pointless if you don't have the right hardware and drivers.

Please do the article again with the right hardware.
September 29, 2011 4:41:34 PM

cactus45 said:
OMG, Tom's Hardware!! How can you do an article comparing 3D technology when you don't even have the right hardware??


We have exactly the hardware we need to compare visual quality. AMD HD3D works fine over HDMI, and the visual quality at 1080p is identical to a DisplayPort monitor.

If you're complaining about performance comparisons... we didn't make any. That's what the follow-up article will be for, and for that we'll use the Samsung 27".

When you consider the focus of this article, your complaint doesn't make a lot of sense.
September 29, 2011 4:53:27 PM

Cleeve said:
We have exactly the hardware we need to compare visual quality. AMD HD3D works fine over HDMI, and the visual quality at 1080p is identical to a DisplayPort monitor. If you're complaining about performance comparisons... we didn't make any. That's what the follow-up article will be for, and for that we'll use the Samsung 27". When you consider the focus of this article, your complaint doesn't make a lot of sense.


Since you guys have a lot of different hardware around your office, would you be able to answer a question I have not seen answered? Can a DisplayPort to dual-link DVI adapter work with HD3D technology, allowing it to work on more monitors?
September 29, 2011 4:57:50 PM

bystander said:
Can a DisplayPort to dual-link DVI adapter work with HD3D technology, allowing it to work on more monitors?


I don't have a DP to DVI-D adapter in my lab, unfortunately. If I can get one, I'll try.
September 29, 2011 5:03:44 PM

cleeve said:
I don't have a DP to DVI-D adapter in my lab, unfortunately. If I can get one, I'll try.


I think I found my answer; it appears you can. This adapter claims it works with HD3D technology: http://www.club-3d.com/index.php/products/reader.en/pro...

That opens the door to buying a monitor that supports both technologies.
September 29, 2011 5:09:26 PM

bystander said:
I think I found my answer; it appears you can. This adapter claims it works with HD3D technology: http://www.club-3d.com/index.php/p [...] ml?page=10 That opens the door to buying a monitor that supports both technologies.


That assumes the monitor is made to accept both signal types... it would be interesting to see if it works. I'll try to include that in the follow-up performance article in October.
September 29, 2011 5:14:39 PM

Cleeve said:
That assumes the monitor is made to accept both signal types... it would be interesting to see if it works. I'll try to include that in the follow-up performance article in October.

I have the Acer HN274H monitor, which supports both AMD and Nvidia technologies but does not have a DisplayPort. HD3D is designed to use HDMI 1.4a. Do you think this might be a monitor that would be able to handle both types of signals... or rather, the HD3D signal over DVI-D converted to DisplayPort?
September 29, 2011 5:34:15 PM

bystander said:
I have the Acer HN274H monitor, which supports both AMD and Nvidia technologies but does not have a DisplayPort. HD3D is designed to use HDMI 1.4a. Do you think this might be a monitor that would be able to handle both types of signals... or rather, the HD3D signal over DVI-D converted to DisplayPort?


No idea without testing. I'd be surprised if it works with Radeons over DVI-D, considering that it's 3D Vision-certified.
September 29, 2011 5:45:38 PM

cleeve said:
We have exactly the hardware we need to compare visual quality. AMD HD3D works fine over HDMI, and the visual quality at 1080p is identical to a DisplayPort monitor.

If you're complaining about performance comparisons... we didn't make any. That's what the follow-up article will be for, and for that we'll use the Samsung 27".

When you consider the focus of this article, your complaint doesn't make a lot of sense.




How can you say that? You can't compare 3D using HDMI on a TV to frame-sequential 3D on a dedicated 120 Hz computer display using DisplayPort or dual-link DVI. They are different methods.

TVs have a much slower response time, and 3D in general benefits from the high refresh and fast response times of a computer monitor.

It just seems like an apples-to-oranges comparison to me. There are differences in the way a TV handles 3D compared to a computer monitor. AMD HD3D isn't a single method of 3D; the only way to test frame-sequential AMD HD3D is on the Samsung 750/950 series monitors. It can't be done on a TV, because TVs don't support frame-sequential 3D.

Frame-sequential is the best (and recommended) way to do AMD HD3D. The only way to do it is on the Samsung 750/950 120 Hz monitors. The other methods, side-by-side etc., are just a fallback if you don't have a monitor that does frame-sequential 3D.

I have a Samsung SA950, but I do agree with your summation that 3D is more game-dependent than anything else, and Nvidia support is a little better than AMD's at this stage. Nvidia has had a couple of years' head start, after all. I would say to people: choose your hardware based on other factors; 3D is a novelty and a little unreliable at this stage. I much prefer gaming in 2D at a smooth 120 Hz to gaming in 3D.
September 29, 2011 5:57:09 PM

Not to mention that it would be limited to 720p if used at 60 Hz.
September 29, 2011 6:02:05 PM

Haven't read it yet. I have been dying for this article for a long long time! Thanks!
September 29, 2011 6:11:13 PM

I'm interested in 3D, but for me there are several improvements in LCD technology that I'd like to see happen, and the 3D technology is tied up in the mix.

I'd love to have a 120Hz monitor, but I'm picky about pixel size, and I know from the 1920x1200 24" I have that the 120Hz monitors made so far won't do for me. I've been tempted to buy one and see if the 120Hz will be enough better that I can live with it even with the larger pixels, because 60Hz is a bother to me too, but I've opted to not go that route so far. My last monitor purchase was for the smallest pixel size you can get (or that I'm aware of anyway), which is .233mm in a 2560x1440 27" panel. I love the monitor for that - but it's still 60Hz.

So, for me, I want 3D, but I want smaller pixels, preferably a 16:10 panel as even the 27" is a little narrow top-to-bottom for my tastes, I want 120Hz for each eye, which would mean a true 240Hz monitor for 3D.

Did I mention it would be nice to have 10-bit panels, as well? :D  Of course, it would also be nice for game devs to make their games able to use the extra color capability.

Bottom line, I won't buy a "3D" monitor because, for me, they just aren't good enough right now, despite being able to run at 120Hz, which I'd love to have otherwise.

Great article, looking forward to what else you bring us about 3D tech.

;) 
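As a quick check, the .233 mm pitch quoted above follows from the panel geometry: physical diagonal divided by the pixel diagonal. An illustrative helper (function name is mine):

```python
import math

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    """Dot pitch in mm: physical diagonal divided by the pixel diagonal."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_inches * 25.4 / diagonal_px

print(round(pixel_pitch_mm(27, 2560, 1440), 3))  # 0.233 -- the 27" QHD panel
print(round(pixel_pitch_mm(24, 1920, 1200), 3))  # 0.269 -- the 24" WUXGA panel
```

So the 2560x1440 27" really does have noticeably smaller pixels than the 1920x1200 24" it is being compared against.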
September 29, 2011 6:21:04 PM

Wonderful article... nVidia for life... bitches
September 29, 2011 6:26:22 PM

Marcus52 said:
I'm interested in 3D, but for me there are several improvements in LCD technology that I'd like to see happen, and the 3D technology is tied up in the mix.

I'd love to have a 120Hz monitor, but I'm picky about pixel size, and I know from the 1920x1200 24" I have that the 120Hz monitors made so far won't do for me. I've been tempted to buy one and see if the 120Hz will be enough better that I can live with it even with the larger pixels, because 60Hz is a bother to me too, but I've opted to not go that route so far. My last monitor purchase was for the smallest pixel size you can get (or that I'm aware of anyway), which is .233mm in a 2560x1440 27" panel. I love the monitor for that - but it's still 60Hz.

So, for me, I want 3D, but I want smaller pixels, preferably a 16:10 panel as even the 27" is a little narrow top-to-bottom for my tastes, I want 120Hz for each eye, which would mean a true 240Hz monitor for 3D.

Did I mention it would be nice to have 10-bit panels, as well? Of course, it would also be nice for game devs to make their games able to use the extra color capability.

Bottom line, I won't buy a "3D" monitor because, for me, they just aren't good enough right now, despite being able to run at 120Hz, which I'd love to have otherwise.

Great article, looking forward to what else you bring us about 3D tech.


You will likely never be satisfied, then. 3D 120 Hz monitors require twice the bandwidth of 1920x1200 60 Hz monitors, and about the same as the 2560x1440 monitors. You have to make a compromise: either higher resolution or higher refresh rates.

If they ever offer 120 Hz at 2560x1440, then they'll be able to offer much higher resolution monitors, and by your requirements, that means you have to choose the higher-res monitor first.

That said, I moved from a 23" 1200p monitor to a 27" 1080p 120 Hz monitor. I actually find the 120 Hz monitor much crisper and better to look at, regardless of the DPI difference (this could be a result of old vs. new technology). Also, in 3D, jagged edges are far less pronounced, possibly because each eye sees a slightly different image, so the jagged edges don't line up; when the image is interpreted by the brain, it kind of meshes the two images together.
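The bandwidth claim above holds up to simple pixel arithmetic (active pixels only, blanking intervals ignored, so the ratios are approximate):

```python
def throughput_mpx_s(width, height, refresh_hz):
    """Active pixels pushed per second, in megapixels (blanking ignored)."""
    return width * height * refresh_hz / 1e6

wuxga_60 = throughput_mpx_s(1920, 1200, 60)    # ~138 Mpx/s
fhd_120  = throughput_mpx_s(1920, 1080, 120)   # ~249 Mpx/s
qhd_60   = throughput_mpx_s(2560, 1440, 60)    # ~221 Mpx/s

print(fhd_120 / wuxga_60)   # ~1.8x: roughly "twice the bandwidth"
print(fhd_120 / qhd_60)     # ~1.13x: roughly "the same"
```

So 1080p at 120 Hz pushes close to double the pixels of 1920x1200 at 60 Hz, and slightly more than 2560x1440 at 60 Hz.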
September 29, 2011 6:33:13 PM

Marcus52 said:


Bottom line, I won't buy a "3D" monitor because, for me, they just aren't good enough right now, despite being able to run at 120Hz, which I'd love to have otherwise.

;) 


You're missing out. I've heard a lot of people say similar things when they haven't seen or tried a 120 Hz monitor. The 2D image quality is as good as it gets on monitors like the SA750/SA950. They do well over 100% of the sRGB colour space, more than most of the mid-range IPS screens. It's also the cleanest, clearest panel you've ever seen.

The elimination of screen tearing is a much bigger, more noticeable benefit for image quality than the pixel pitch of a 16:10 monitor.

Some people have very stubborn views about which monitor/size/technology to go for, and it's usually based on outdated information or comparisons made back in 2002. Things change; people should find a store where they can see the displays for themselves. The SA750/950 are very nice monitors.
September 29, 2011 6:55:10 PM

God, I love Crysis 2 in 3D. AWESOME!! I'm not sure if it's true, as it's not out yet, but I read on a website that Battlefield 3 is gonna use the same type of 3D as Crysis 2 does. I have the beta but haven't tried it in 3D yet (WICKED GAME). Hope it's true, though. If it is, my wife is gonna have a hard time getting me off the PC for a year or two. :D 
September 29, 2011 6:56:53 PM

So Virtual 3D is really just that same fake 3D effect some TVs and playback software like PowerDVD 10 can apply to even 2D DVDs? The one that makes everything look like cardboard cutouts, and where something that doesn't move relative to the background loses its 3D-ness?

If so, then including Virtual 3D as a viable 3D option in this review is madness, as it's just a 3D-like post-processing effect that doesn't even use the actual shape or depth data of the 3D model.

That being said, it's a good article, and it's especially nice to see each test broken out. But the majority of results show that AMD is significantly behind Nvidia, except for perhaps 2 out of 10 notable exceptions, and even then AMD only wins because of faking with Virtual 3D.
Given the article's test results, where Nvidia repeatedly wins out, and the extra problematic middleware you need to do 3D with AMD, the article's conclusion that AMD can now be considered a viable alternative seems very biased. Tom's has always had a slight pro-AMD bias, but please don't become a one-way street regardless of actual results, or we'll have to call you Charlie Demerjian.

September 29, 2011 7:09:25 PM

Regarding Aliens vs. Predator: I have that game, and I'm not sure if it's the drivers or what, but it seems to work fine on my system. I haven't played it in about a month, though, and I have upgraded my drivers since then, so I don't know if the new drivers messed it up. The previous drivers made 3D unplayable in Crysis 2, so I stuck with the older drivers and then skipped to the newer ones, but Aliens vs. Predator looked fine in 3D on my system.

I think what needs to happen is some kind of standard coding technique for 3D. Crysis 2 is the best I've seen so far, so I think game developers should talk to Crytek, or Crytek should write some kind of article or book on how they did it so other developers can implement it the same way.

I don't really care if older games support it or not, but I think all newer games should. Even though not many people are into it, there are still many people who love 3D, and I think technology should always move forward, not backwards.
September 29, 2011 7:09:52 PM

niz said:
So Virtual 3D is really just that same fake 3D effect some TVs and playback software like PowerDVD 10 can apply to even 2D DVDs? The one that makes everything look like cardboard cutouts, and where something that doesn't move relative to the background loses its 3D-ness?


NO, NO, NO! :) 

Virtual 3D uses actual 3D information from the depth buffer; it does *NOT* simulate 3D based on a guess like PowerDVD, other playback software, and 3D TVs do.

The difference between normal stereoscopic 3D and Virtual 3D is that stereoscopic 3D renders each eye independently; Virtual 3D renders a single eye, but then uses actual 3D information from the depth buffer to construct the view for the other eye.

The advantage is that shadows and lighting effects don't get screwed up, as they often do with stereoscopic methods such as 3D Vision or TriDef's normal mode. In cases where 3D Vision shows colossal visual anomalies or is outright unusable, the TriDef Virtual 3D mode can still deliver good playability and bona fide stereoscopic 3D output.

The tradeoff is that since the views for both eyes are based on a single frame of video, some edges of objects need to be interpolated by the software. This can cause minor blurriness on some object edges, but it's still valid 3D, and WORLDS better than simulated 3D methods like PowerDVD's.

We're simply reporting it as we see it. I don't think it's fair to throw an accusation of bias without having actually tried both, or at least fully understanding how it works.

FYI: the 3D Vision-certified Crysis 2 uses the depth buffer similarly to the TriDef Virtual 3D mode.
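The depth-buffer reprojection Cleeve describes can be sketched as a toy algorithm. Everything here (the function name, the linear disparity model, the left-neighbor hole filling) is an illustrative guess, not TriDef's actual implementation:

```python
import numpy as np

def reproject_second_eye(image, depth, max_disparity=8):
    """Synthesize the second eye's view from one rendered frame plus its
    depth buffer. Each pixel shifts horizontally by a disparity that grows
    as the pixel gets closer to the camera; a small z-test keeps near
    objects in front when shifted pixels collide; leftover holes at
    disoccluded edges are filled from a neighbor -- the interpolation step
    that causes the slight edge blurriness mentioned above.

    image, depth: (H, W) arrays; depth in (0, 1], smaller = closer.
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    out_depth = np.full((h, w), np.inf)
    # Closer pixels (small depth) get larger horizontal shifts.
    disparity = (max_disparity * (1.0 - depth)).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disparity[y, x]
            if 0 <= nx < w and depth[y, x] < out_depth[y, nx]:
                out[y, nx] = image[y, x]        # closer pixel wins
                out_depth[y, nx] = depth[y, x]
    # Crude hole filling: copy disoccluded pixels from the left neighbor.
    for y in range(h):
        for x in range(1, w):
            if not np.isfinite(out_depth[y, x]):
                out[y, x] = out[y, x - 1]
    return out
```

A uniform-depth scene reprojects to itself; only depth discontinuities create the holes that the interpolation has to paper over, which matches the "minor blurriness on some object edges" tradeoff described above.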