Please Help

Last response in Graphics & Displays
December 24, 2012 12:20:24 AM

Hello,

I am so confused. I am trying to (slowly) build up my new build to 3D capability. Currently, this is what I have:

ASUS Crosshair V Formula Mobo
1X Radeon HD6870
8GB GSkill Sniper RAM @ 1600
AMD FX-6100 @ 3.3 (o.c. to 3.8)

I have already purchased a second 6870 for Crossfire. Once this is done, I know that I must connect the top card to the monitor. My confusion lies here. I have been doing my research, and I have discovered that HDMI 1.4a will not be sufficient for 120 Hz 3D at 1920x1080. I want that. So, I must hook up DL DVI, apparently. I have no idea how that works, what cable I need, etc. My question is this: do I need a cable that plugs into both DVI ports (top card) while the other end plugs into the DVI port on the monitor? That is, does the DVI cable have two connectors on one end and one on the other? I am basically asking for some brief instructions. Please....end my headache.....


December 24, 2012 12:43:49 AM

If you only have one monitor with a DVI input, you are only going to need one DVI cable. You only connect it to one of the DVI outputs on the GPU.
December 24, 2012 1:02:45 AM

Connect your 6870 through HDMI 1.4a.
If you connect through DVI, you'll get no sound, only video.

1.4a is recommended for 3D TVs.

Best solution

December 24, 2012 1:32:04 AM

*You may wish to PRINT THIS.

**WIKIPEDIA is a great source of info for some things.

Who told you HDMI 1.4a was insufficient for 3D at 1080p?

See the "HDMI Version" chart:
http://en.wikipedia.org/wiki/HDMI#Version_1.4
(The difference in HDMI versions is simply how much DATA they can reliably carry. There's no PC issue to worry about. HDMI v1.4x not only supports 1080@60Hz but it also does so with the best audio at the same time. See all the GREEN "YES" checkmarks including "3D over HDMI").
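You can sanity-check the "how much DATA they can reliably carry" point yourself by comparing the pixel clock each mode needs against each link's nominal maximum. A minimal sketch, assuming the standard CEA-861 1080p blanking totals (2200 x 1125) and the commonly cited nominal limits for each link type:

```python
# Sketch: compare the pixel clock a video mode needs against nominal link limits.
# Blanking totals (2200 x 1125) are the standard CEA-861 1080p timings; the
# limits below are the nominal spec figures, not measured values.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode, including blanking intervals."""
    return h_total * v_total * refresh_hz / 1e6

LINK_LIMITS_MHZ = {
    "single-link DVI": 165,   # max TMDS clock per the DVI spec
    "dual-link DVI": 330,     # two TMDS links, double the throughput
    "HDMI 1.3/1.4": 340,      # max TMDS clock since HDMI 1.3
}

for name, refresh in [("1080p @ 60 Hz", 60), ("1080p @ 120 Hz", 120)]:
    clock = pixel_clock_mhz(2200, 1125, refresh)
    fits = [link for link, limit in LINK_LIMITS_MHZ.items() if clock <= limit]
    print(f"{name}: {clock:.1f} MHz -> carried by: {', '.join(fits)}")
```

Running this shows 1080p @ 60 Hz (148.5 MHz) fits every link, while 1080p @ 120 Hz (297 MHz) exceeds single-link DVI but fits both dual-link DVI and HDMI 1.3/1.4, which is exactly why the single-link cable is the one to avoid here.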

*A word of warning, however:
DVI-DL is likely what you want anyway. The HDMI input on monitors is often identical to HDTV HDMI inputs. It's there for movie playback and uses a different standard than DVI. Setup in your Catalyst Control Panel in this case would be done in the HDTV section (i.e. NTSC_1080p120 or whatever).

Once set up, the main difference with HDMI in the HDTV format is that you can't adjust the resolution in a game; it's locked at 1920x1080.

DVI SETUP:
This is what you want, some points:
#1: Read your MANUAL for your graphics card to see what outputs are supported and look for the "DVI DL" (dual link). Here's a link to show what the outputs look like though your manual should suffice:
http://en.wikipedia.org/wiki/Digital_Visual_Interface#C...

#2: Be aware that an HD6870 is an okay graphics card but 3D is going to HALVE the frame rate you can get in 2D. So if you get 50FPS in a game in 2D it's 25FPS in 3D. Bottom line is you can get 3D but you have to DRASTICALLY drop the visual quality in your games to do so (it depends on the game).

#3: Your Dual-Link cable simply connects to the graphics card and monitor like a normal cable. (not "both" DVI outputs).

#4: Your graphics cards need to be physically set up correctly, with the latest drivers and Crossfire profiles installed from AMD. (See BOTH your motherboard and graphics card manuals.)

#5: You may wish to experiment with RadeonPro which is a great tool to:
a) force VSYNC when not supported in game (i.e. Witcher #1)
b) force Anti-Aliasing when not supported in game (i.e. Mass Effect #1)
c) force VSYNC to HALF of normal (i.e. 30FPS instead of 60FPS). This feature is a little confusing so ignore it for now. I did use it to force 30FPS for The Sims 3 because for some reason (not sure why) the game didn't stutter as much.
d) force VSYNC at 50FPS (I had games that couldn't hit 60FPS but could hit 50FPS at max quality so I did this instead. Some monitors support 50Hz playback, probably because of PAL. Mine does so at 1280x720 or 1920x1080 only.)

*I've never used 3D with RadeonPro but I assume it would work.
**I'm using an NVIDIA card now and my Control Panel can do things I had to use RadeonPro for with my HD5870. If you need to VSYNC/AA/HALF VSYNC, see if your Catalyst Control Panel can do this first.

#6:
Micro-stutter in some games can be quite bad. A 3x Crossfire usually makes a huge difference. Just FYI.

#7:
AUDIO output on an HDMI signal from a graphics card usually does NOT have full support. I believe it only decodes certain MOVIE audio codecs such as Dolby Digital, MP3 etc, but not game sounds.

HDMI is often confusing in PC's so generally ignore it.
December 24, 2012 8:05:54 PM

The thing is, when I looked up what to do, I found 9 times out of 10 that HDMI 1.4a does not support HD3D at 1920x1080 with 60 Hz to each eye, only 30 Hz. I want to make sure that I am getting the best experience possible. So, even with two 6870s, it will still halve my framerate? I didn't think it would be that taxing on two GPUs of the 6870 caliber. I know my mobo supports triple Crossfire and 3-way SLI. I would go with Nvidia, but their high-end cards are significantly more expensive, and I already have 2 (one on the way) 6870s. So basically, the trash I heard about HDMI is old news and I should be fine using HDMI then? If so, that would be great. I use a separate cable for audio anyway, to my Bose speakers straight from the SB X-Fi digital audio on the mobo. So, as I have some mixed answers here, which would give me the better experience, DVI or HDMI? I will be using a 24-27 inch monitor, not a big 3D TV or anything.
December 25, 2012 12:22:55 AM

Are you using Direct HDMI? Or using a DVI-HDMI Converter?

HDMI is going to give you better experience. (True 1080p)
Get a GTX 660. I have one and I run my TV through that.

Here is my Experience with the TVs.
I have a 42 inch HDTV.

I connected it with DVI and I got maximum resolution 1360x768 with no sound.

I connected it with HDMI 1.3 and I got maximum resolution 1920x1080 with sound on my TV speakers.
{1.3 is ideal for HDTVs} {I got a res of more than 1080p too, but didn't try that}

December 28, 2012 7:21:26 AM

elitehunter6 said:
The thing is, when I looked up what to do, I found 9 times out of 10 that HDMI 1.4a does not support HD3D at 1920x1080 with 60 Hz to each eye, only 30 Hz. I want to make sure that I am getting the best experience possible. So, even with two 6870s, it will still halve my framerate? I didn't think it would be that taxing on two GPUs of the 6870 caliber. I know my mobo supports triple Crossfire and 3-way SLI. I would go with Nvidia, but their high-end cards are significantly more expensive, and I already have 2 (one on the way) 6870s. So basically, the trash I heard about HDMI is old news and I should be fine using HDMI then? If so, that would be great. I use a separate cable for audio anyway, to my Bose speakers straight from the SB X-Fi digital audio on the mobo. So, as I have some mixed answers here, which would give me the better experience, DVI or HDMI? I will be using a 24-27 inch monitor, not a big 3D TV or anything.


Q&A:

#1: The thing is, when I looked up what to do, I found 9 times out of 10 that HDMI 1.4a does not support HD3D at 1920x1080 with 60 Hz to each eye, only 30 Hz.

ANSWER: Incorrect.
HDMI v1.4a/b/c supports 3D at 1920x1080 up to 60FPS per eye. See the chart I linked. An HDMI cable is basically a copper wire and the QUALITY of it (version number) determines how much information it can reliably transfer. You can actually use a LOWER version number for 3D when not using any audio.

*It's academic anyway, as you really should be using the DVI-DL cable.

#2: So, even with two 6870s, it will still halve my framerate?

ANSWER: You will get HALF the framerate that you would get in 2D with the same physical setup (2xHD6870 in Crossfire). As for a single HD6870 vs. 2xHD6870, the scaling varies from unsupported to somewhere between 50% and 85% or so. It depends on the game, AMD's driver support, your CPU and other minor things.

#3: So basically, the trash I heard about HDMI is old news and I should be fine using HDMI then?

ANSWER: YES and NO.
If you read my post above, which is quite detailed, I explain that your PC monitor likely has an HDMI input that is the same as a normal HDTV's input. It's a different format, requiring you to set up via the HDTV interface. You're generally okay, except you can't change the resolution in a game from 1920x1080.

*Long story short, there's no reason to do so if you have the DVI-DL option which you almost definitely do. Using DVI you won't run into overscan and resizing issues. (You still need to make sure you have ASPECT RATIO assigned to prevent incorrect stretching).
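The aspect-ratio point can be put in numbers: correct scaling uses one uniform scale factor (the smaller of the two axes' ratios) and letterboxes the leftover space, instead of stretching each axis independently. A minimal sketch:

```python
# Sketch: why an ASPECT RATIO scaling setting avoids incorrect stretching.
# It applies the SAME scale factor to both axes (the smaller of the two
# candidates) and fills the leftover space with black bars.

def fit_preserve_aspect(src_w, src_h, dst_w, dst_h):
    """Largest size that fits the destination without changing aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# e.g. 1280x720 game output on a (hypothetical) 1920x1200 16:10 panel:
print(fit_preserve_aspect(1280, 720, 1920, 1200))   # (1920, 1080)
```

Here the 16:9 image lands at 1920x1080 with 60-pixel bars top and bottom; plain "fill the panel" scaling would instead distort it to 1920x1200.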

SUMMARY:
- Forget about HDMI and use DVI-DL
- 3D is interesting but the micro-stutter (varies a lot between games) and lower framerate might make 2D preferable in some games
- ***Because of Micro-stutter, some games may be more enjoyable with CROSSFIRE DISABLED. I assume you can do that on a per-game basis.***
- Investigate RadeonPro (force vsync or anti-aliasing when not supported, 50FPS instead of 60FPS if available.. )
http://www.radeonpro.info/en-US/
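The framerate arithmetic from #2 can be sketched as below. The 0.70 Crossfire scaling factor is a hypothetical mid-range value picked from the 50-85% range quoted above; real scaling varies per game and driver profile.

```python
# Sketch of the 2D -> 3D framerate estimate above. The default scaling factor
# is a HYPOTHETICAL mid-range pick from the quoted 50-85% Crossfire range.

def estimated_3d_fps(single_card_2d_fps, crossfire_scaling=0.70):
    """2D framerate with a second card added, then halved for 3D output."""
    crossfire_2d_fps = single_card_2d_fps * (1 + crossfire_scaling)
    return crossfire_2d_fps / 2

# e.g. a game that runs at 50 FPS in 2D on one HD6870:
print(estimated_3d_fps(50))   # 42.5 -> still under a 60-FPS-per-eye target
```

Note the single-card case (scaling 0.0) reproduces the earlier point that 50 FPS in 2D becomes 25 FPS in 3D, so even with the second card you should expect to drop quality settings for 3D.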
December 28, 2012 7:27:22 AM

Sumukh_Bhagat said:
Are you using Direct HDMI? Or using a DVI-HDMI Converter?

HDMI is going to give you better experience. (True 1080p)
Get a GTX 660. I have one and I run my TV through that.

Here is my Experience with the TVs.
I have a 42 inch HDTV.

I connected it with DVI and I got maximum resolution 1360x768 with no sound.

I connected it with HDMI 1.3 and I got maximum resolution 1920x1080 with sound on my TV speakers.
{1.3 is ideal for HDTVs} {I got a res of more than 1080p too, but didn't try that}


The reason for your experience is this:
a) the DVI input is likely a PC input (probably paired with a 3.5mm stereo audio input). It connects to a small circuit to allow the screen to be used like a monitor.

b) the HDMI input is a normal HDTV input which allows up to 1080p (1920x1080)

*PC inputs and HDMI inputs use different FORMATS. The big advantage of the PC input is that it sidesteps the audio problem: it's usually difficult to get proper audio over HDMI from a desktop (the HDMI output of graphics cards, unlike laptops', has only movie support). In general you hook a LAPTOP to the normal HDMI input (after switching the audio output to HDMI), while desktops use DVI or VGA plus the 3.5mm audio input fed from the onboard sound or sound card.

(The entire audio situation is a mess with desktop PC's for HDMI. Laptops work fine, provided you toggle the audio output in the System Tray.)
January 13, 2013 11:31:44 AM

Best answer selected by elitehunter6.