ASUS P8Z68-V PRO DVI Out Question

September 9, 2011 12:46:25 AM

I'm considering getting a P8Z68-V PRO, but I'm concerned about the limitation of the single-link DVI-D connection on the board, because my monitor's resolution is higher than 1920x1200.

Since I know that Virtu routes even the dedicated GPU (a GTX 560 Ti in this case) through the on-board DVI connection, I was wondering whether the opposite could work. In other words, running the Virtu features (i-Mode, Quick Sync) through the DVI port on the dedicated card.

If no one here knows the answer, what forum would be a good place to seek it out?
October 5, 2011 6:59:08 PM

Yes, Virtu can also use a dedicated (add-on) graphics card as the display output for those features.

The BIOS must be set to PCI-E as the initial graphics device, and to run in "D" mode, the DVI cable must be connected to the PCI-E graphics card. See section 5.3 of the P8Z68-V Pro user manual, available at Asus.com.

At resolutions above 1920x1200@60Hz you're probably only going to see whatever gains the PCI-E card itself offers. Encoding video at or below that resolution should (I think) still see the gains offered by both. I'm not entirely sure, but it makes sense from a technology point of view, since output renders are a separate function from screen draws. I'll be able to put this to the test on my own system in about a week (I have the hardware lying around but haven't had time to build it yet).
October 12, 2011 1:51:48 PM

Okay, I'd be interested in hearing your results. So far I haven't been able to even get the system to recognize the onboard graphics... so I must be doing something wrong.
October 14, 2011 4:03:12 PM

Works like a charm!

After setting up the system, all features seem to perform as advertised. Be sure to flash the latest ASUS BIOS update first thing, though.

The new BIOS was a learning curve for me, but it's an absolute joy to use compared to the older menu-driven style.

Once I worked out a proper load/install order (Win7 has a bad habit of automatically installing its own drivers ahead of the newer OEM versions I had on a USB flash drive), I had no issues at all. I used the full set of the latest ASUS motherboard drivers from the web.

Running in "D" mode:
With Virtu you must follow a setup order: set the BIOS to PCI-E with CPU graphics enabled, connect the DVI cable to the PCI-E card, load the Intel graphics driver, load the Virtu driver, THEN load the nVidia (or ATI) video drivers LAST. After this, all is well. I run dual Samsung T260HD/HDTV monitors for 3D animation and visual effects work.

Virtu seemed to add a slight bit of overhead when launching a render output file, but it "screamed through" the process nimbly once it started. The delay was only a few seconds, and only at the launch of the render. After a few sample renders, I turned off the on-board Intel HD 3000 graphics of the i7-2600K in the BIOS. Frame output ran roughly 25-40% slower than with it enabled at 1920x1200@60 (the native resolution of the T260HD monitor), so the on-board graphics really made a huge difference. Next I tried some video with audio output and still saw a very nice improvement in render output times (about 20%) with it enabled vs. disabled. Sweet! CPU temps would climb by 15-20°C while rendering, so I didn't put too much strain on it just yet. I added a Corsair H60 closed-loop water cooler on the CPU and now all is well temperature-wise: temps climb only 3-5°C during a render, more in my preferred range. All in all, I'm quite happy.
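Those throughput percentages are easy to misread, so here's a quick back-of-the-envelope conversion (just arithmetic, not from any benchmark in this thread): a given drop in frame output translates into a larger increase in total render time.

```python
# Convert a drop in frame throughput into the increase in render time.
# If frames/sec falls by `drop`, time per frame rises by 1/(1-drop) - 1.

def render_time_increase(throughput_drop):
    """throughput_drop: fraction, e.g. 0.25 for '25% slower frame output'."""
    return 1.0 / (1.0 - throughput_drop) - 1.0

for drop in (0.25, 0.40):
    print(f"{drop:.0%} slower output -> ~{render_time_increase(drop):.0%} longer render")
# 25% slower output -> ~33% longer render
# 40% slower output -> ~67% longer render
```

So "25-40% slower frame output" means renders take roughly a third to two-thirds longer in wall-clock time.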

The hardware combination of the Intel i7-2600K, ASUS P8Z68-V Pro and 16GB of Corsair Vengeance 1600 XMP RAM appears to be a winning A/V solution.
Since this was a real "across the board" upgrade, I also added a 60GB Force 3 SSD for caching and two WD 1TB Caviar Black 7200rpm/64MB HDDs.
It all runs very well on a GS600 600W PSU with the ASUS nVidia 560 Ti 1GB driving dual screens.

It requires a bit more pre-planning and careful familiarization with the component mix and its setup than most earlier-generation systems, but if you're willing to do your homework and prepare things properly, then this is definitely the way to go, with just under $1000.00 invested in upgrades.

NOTE: I noticed the SATA port pairing described in the manual conflicts with the nomenclature printed on the motherboard itself.
The manual says the ports pair horizontally; the printing near the ports says they pair vertically.
The printed nomenclature on the board is the correct arrangement for best SATA throughput. I tested both ways and went with that.

If you reply here with your exact setup, I can assist you better. It's probably just something you overlooked in the setup.

P.S. Edited for clarity.
October 15, 2011 2:46:42 AM

I think I got it working! The enable option for the iGPU was buried for sure. :D 
October 20, 2011 2:16:17 PM

Glad you're up and going.

As a footnote, I did upgrade my PSU to 1000W after I noticed some frame tearing beginning to show up in the graphics output. At first I thought this might be an outdated DirectX component, but all the SDKs and runtimes were in place and up to date. After monitoring all three rail voltages in real time with a multimeter, with the added peripheral hardware installed (CPU cooler, SSD and 2 additional SATA III drives), it became obvious that the 600W PSU was only marginal with all of these in mixed, semi-constant use.

I went with a 1000W modular PSU. General performance is now much more stable after many hours of operation, and frame output is smoother, the proper "delicious eye candy" it was designed to be. I also feel the process as a whole benefits from using both the iGPU (with Virtu) and the nVidia 560 Ti together; the 570 and 580 cards I tested offered limited benefit over the 560 Ti for my work. I'm now trying to get the "Universal MVP" version of the Virtu software, which is said to improve graphics speed and push frame rates up by reducing the redundant frame information generated during renders to both screen and output files. I feel this will better bring together the best of both worlds for my hardware investment.
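For anyone wondering whether their own PSU is marginal, a rough power budget is a reasonable first pass. This is only a sketch: the wattage figures below are ballpark assumptions for parts of this era, not measured values, and the 1.3x headroom factor is a common rule of thumb.

```python
# Rough PSU sizing: sum estimated peak draws, then add headroom.
# All wattages below are ballpark assumptions, not measurements.

parts = {
    "i7-2600K (loaded)": 95,
    "GTX 560 Ti": 170,
    "motherboard + RAM": 50,
    "SSD": 5,
    "2x 3.5in HDD": 20,
    "fans, pump, USB devices": 30,
}

peak_watts = sum(parts.values())
recommended = peak_watts * 1.3  # ~30% headroom for aging and load spikes

print(f"Estimated peak draw: {peak_watts} W")
print(f"Recommended PSU:     {recommended:.0f} W or more")
```

Note that on paper a 600W unit looks comfortable for this build, which is exactly why measuring actual rail voltages under load, as described above, can catch a marginal unit that a simple budget misses.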

Cheers!
October 23, 2011 8:05:42 AM

That's interesting,

I currently run a 700W PSU, and I have 5 HDDs and 1 SSD. Is there a software tool that will monitor voltage usage? Because (although I admire your tenacity) buying a multimeter and crawling around poking at things doesn't sound like a fantastic way to spend my afternoon ;)

Have you noticed a difference with the "Universal MVP" version? I'm just using the "green" one right now.
October 23, 2011 8:14:35 AM

Also, I couldn't find a way to get the Universal MVP software, does the ASUS Z68 V-Pro support it?
October 23, 2011 10:30:44 PM

Yeah, I had a busy day. I did QC work on OEM builds for Intel back in the day, so this was just one more step in confirming the build as built. I had reason to suspect I wouldn't be comfortable with 600W, and the monitoring simply confirmed how marginal it was. Better to know before spending $200.00 on a new PSU.

I've contacted LucidLogix via their website to acquire the MVP version. No word back yet. As far as I know, any Sandy Bridge system can use it.

ASUS has a utility for monitoring the motherboard called ASUS Probe II. It can be found on the OEM CD included with the Z68-V Pro. It monitors board-level voltages but not PSU-level ones; to get a good picture of real current demand, these must be measured at the PSU. You should also have a solid grounding in electrical theory as it applies to PC design, so it's not something I would recommend for novice builders.
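If a monitoring utility can export its readings as text, a tiny script can flag out-of-spec rails automatically. A sketch: the "rail: volts" log format below is made up for illustration (Probe II has no such documented export that I know of), but the ±5% tolerance on the +3.3V/+5V/+12V rails comes from the ATX power supply design guide.

```python
# Flag voltage-rail readings outside ATX's +/-5% tolerance.
# Input is a hypothetical "rail: volts" log, one reading per line.

NOMINAL = {"+3.3V": 3.3, "+5V": 5.0, "+12V": 12.0}
TOLERANCE = 0.05  # ATX design guide: +/-5% regulation on these rails

def out_of_spec(log_text):
    """Return the names of rails whose reading is outside tolerance."""
    bad = []
    for line in log_text.strip().splitlines():
        rail, value = line.split(":")
        rail = rail.strip()
        if abs(float(value) - NOMINAL[rail]) > NOMINAL[rail] * TOLERANCE:
            bad.append(rail)
    return bad

sample = """
+3.3V: 3.31
+5V: 5.02
+12V: 11.21
"""
print(out_of_spec(sample))  # ['+12V'] -- 11.21 V is below the 11.4 V floor
```

A sagging +12V rail under load, like the sample above, is the classic symptom of a marginal PSU.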
December 14, 2011 3:55:19 AM

What about a multiple monitor setup for this board? Recent build and I'm a bit agitated that I can't use both DVI interfaces on my graphics card.

December 14, 2011 11:36:00 AM

Issue is resolved, thanks.
December 14, 2011 12:05:48 PM

tattooedbones said:
What about a multiple monitor setup for this board? Recent build and I'm a bit agitated that I can't use both DVI interfaces on my graphics card.


I don't see any reason why you can't use both DVI outputs from a single PCI-E video card.
What you can't do under Virtu is use the motherboard's video output and the video card's output at the same time; it must be either in "I" mode or "D" mode.

"D" mode allows standard output from both DVI ports of a PCI-E graphics card. Just turn off the internal video in the Virtu control panel.

If you're having issues:

- Make sure the video card is installed in the primary PCI-E slot 1 on the motherboard.
- Connect two monitors to the DVI outputs of the PCI-E graphics card.
- Enter the BIOS and make sure PCI-E is set as the primary graphics source.

Then choose to enable or disable the Intel internal graphics of the CPU.

If enabled:
- Make sure the Virtu software is up to date and installed right after the BIOS settings are made.
- Turn off the internal GPU in the Virtu control panel.
- Install your graphics drivers.
- Set the resolution to a dual-screen resolution in the graphics controls.

Option:
You can disable the Internal Intel graphics in the bios if you are not using it.
You also do not need Virtu installed if you are disabling the internal GPU.
December 15, 2011 1:40:09 AM

Hi Omi3D. Thanks for the reply! I tried something I didn't think would work and, lo and behold, it did... I merely swapped the positions of monitor 1 and monitor 2, and for some reason THAT works fine? I actually tried it with 3 different monitors, and it just won't work the way I had it... Hey, I'm not going to question it, but I hope nothing funky happens in the future. Thanks again.
December 15, 2011 6:23:25 PM

tattooedbones said:
Hi Omi3D. Thanks for the reply! I tried something I didn't think would work and, lo and behold, it did... I merely swapped the positions of monitor 1 and monitor 2, and for some reason THAT works fine? I actually tried it with 3 different monitors, and it just won't work the way I had it... Hey, I'm not going to question it, but I hope nothing funky happens in the future. Thanks again.


Glad you got it solved!
December 15, 2011 6:27:21 PM

For those still following this thread:

LucidLogix has announced that the Universal version of Virtu is now available from them.
I've written 3 emails asking for this version with no reply in sight from the LucidLogix sales department. :whistle:

Those interested can go to their website at www.lucidlogix.com
December 18, 2011 6:33:13 PM

SavingPrincess said:
concerned with the limitation of the Single-Link DVI-D connection on the board because of my monitor's resolution being higher than 1920x1200.
How does a single-link DVI-D connection differ from any other kind of DVI-D connection, and what is its limitation?
December 18, 2011 8:25:02 PM

There are single-link and dual-link DVI connections. Dual link adds a second set of TMDS data pairs, roughly doubling the bandwidth so it can drive resolutions above 1920x1200 at 60Hz. (The analog question is a separate distinction: DVI-I carries an analog signal alongside the digital one, while DVI-D is digital only.) Neither is to be confused with a dual-head connection, which drives up to 2 monitors, usually from a single card.

Your question asks more than I have time to answer here, so I'll just refer you to the following link, which sums it up well.

http://en.wikipedia.org/wiki/Digital_Visual_Interface

Read that and you'll have the answers.
December 19, 2011 2:16:36 AM

Omi3D said:
There are single-link and dual-link DVI connections. Dual link adds a second set of TMDS data pairs, roughly doubling the bandwidth.
Anyway, all I really need to know is which is the standard connection used on current motherboards and video cards. I've never seen a distinction between types.
December 19, 2011 12:07:12 PM

There really is NO "set type" of connection per se.

You can use single-link or dual-link cables. With a single-link cable you may be limited in which resolutions a particular card and monitor combination can run, since single link tops out around 1920x1200 at 60Hz. A dual-link cable works everywhere a single-link one does, because any extra pins simply aren't provisioned on a single-link device, be it a video card or a monitor.

The highest usable resolution is usually determined by the monitor; the video card is usually capable of more resolution than the monitor. Dual-head cards go higher still, to support the wider combined desktop of two screens.

Just for clarity's sake: the standard connection is purely digital; DVI stands for Digital Visual Interface.

There is one scenario to watch for: if both your monitor and video card support a dual-link connection but you use a single-link cable, you have limited the maximum resolution to less than the two devices are capable of. Everything still operates normally, and the link still syncs at the highest resolution that cable can carry, just not as high as it could have with a dual-link cable.
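The single-link ceiling can be sketched with a quick pixel-clock check. A sketch only: the 165 MHz per-link limit is from the DVI specification, but the blanking figures below are rough reduced-blanking-style estimates, not exact video timings.

```python
# Rough check: does a mode fit on single-link DVI (165 MHz max pixel clock)?
SINGLE_LINK_MAX_PCLK_MHZ = 165.0  # per-link TMDS limit from the DVI spec

def fits_single_link(width, height, refresh_hz,
                     hblank_px=160, vblank_lines=35):
    """Estimate the required pixel clock, including blanking, and compare
    it with the single-link limit. Blanking values are rough
    reduced-blanking assumptions, not exact CVT timings."""
    pclk_mhz = (width + hblank_px) * (height + vblank_lines) * refresh_hz / 1e6
    return pclk_mhz <= SINGLE_LINK_MAX_PCLK_MHZ

print(fits_single_link(1920, 1200, 60))  # True: ~154 MHz, fits single link
print(fits_single_link(2560, 1600, 60))  # False: ~267 MHz, needs dual link
```

This is why 1920x1200@60 (with reduced blanking) is usually quoted as the practical single-link maximum, and why 2560x1600 panels require dual link.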

December 19, 2011 6:15:26 PM

Hi Omi,

I appreciate the care and detail you have troubled yourself to explain here, but this seems unduly complicated.

I am getting an Asus Z68 motherboard and it has two monitor outputs.

I have two monitors. They are 1980x1050 and 1600x1200. I will be replacing the latter with a 1920x1200 or higher monitor later.

I just want to plug them in and have them work.

Can I expect that to happen, or is computing still not mainstream enough that I am asking too much?
December 19, 2011 8:26:58 PM

I see. Well now, perhaps that should have been the question in the beginning?

If you are referring to the VGA and DVI motherboard outputs, then NO, it's an either/or situation. The CPU has only one GPU for the internal video.

The Intel on-chip graphics supports only one monitor at a time. The presence of two outputs is a choice between older analog/TV monitors and newer digital ones. Pick one. If you want to use two monitors, you will need a PCI-E video card with dual heads.

Side note: for dual monitors it's strongly suggested that two identical monitors be used, that is, the same make and model.
There are many reasons for this which, for brevity's sake, I'll just say matter to both ease of use and the longevity of each.
December 22, 2011 11:46:46 AM



Hi Omi,

I am hoping you can help me.

I have searched for hours and cannot resolve this problem. I have the P8Z68-V Pro/Gen3 board with the 1101 BIOS flash.

I am running 2 GTX 560 Ti Direct II cards, and basically as soon as I install the nVidia drivers the machine just reboots in a loop.

If I take the drivers off, it will boot into Win 7 (Pro).

I have tried updating the BIOS, gone down to one card, and I am set to PCI-E.

I cannot understand what is wrong.

I hope you can help

thanks
December 22, 2011 12:48:54 PM

Omi3D,
If you're still around, would you mind posting your configuration?
I'm starting out on a new build and so far have finalized an Intel 2600K, ASUS P8Z68-V Pro and 16GB of Corsair Vengeance 1600 XMP RAM, the same as you.
I will add a 1000 W PSU but I'd like to know what case and CPU cooler you are using.

I'm a casual gamer, my primary hobby is video transcoding and photoshopping.
Are you happy with the 560 video card?

Thanks in advance.
December 22, 2011 1:09:16 PM

khanshirus said:
HI Omi

I am hoping you can help me

I have searched for hrs and cannot resolve this problem. I have the P8Z68-vPro/Gen3 board with the 1101 flash.

I am running 2 GTX 560 TI direct II cards and basically as soon as I install the nvida drivers the machine just reboots in a loop ..

If I take the drivers off it will boot into Wins 7 (pro).

I have tried updating bios, gone down to one card, and i am set to pcie/

I cannot understand what is wrong.

I hope you can help

thanks


Hi,
You don't mention the rest of your system specs, which would help in assessing the issue.
I'm assuming you have a Z68 board. A few ideas come to mind.

Mainly this sounds like a bus contention issue with the cards. Here are a few suggestions.

- Turn "off" any internal video in the BIOS for any SLI/CrossFire setup.
- Set the BIOS to PCI-E as the primary video bus.
- Start from scratch: remove all video drivers (nVidia, Virtu) and leave only a single card in place, in PCI-E slot 1.
- Boot to this card without any drivers loaded (generic MS driver at base 640x480x60).
- Add the video drivers at this point. Set the resolution to the base resolution.
- Boot this configuration at least 2 times to ensure the OS has seen and preserved the changes.
- If that appears normal, proceed to add the OEM (not reference nVidia) video drivers.
- If all is still normal, install the second card in PCI-E slot 2 (with A/C power disconnected) and proceed with the SLI setup as outlined in your motherboard manual.

That usually solves most issues like this (barring any true hardware flaws in the devices themselves).

Note:
You may also want to test each card individually first. That is to say, use each card in a single-card configuration to verify that it operates as expected and has no discrete hardware issue.

Addendum: I meant to say a good power supply, not Z68 board... busy day.
2x 560 Ti will need at least an 800W PSU (80 Plus Silver, SLI-ready, or better), toward 1000W if you have many HDDs and multiple monitors.

**Sorry for the mistype... my mind can get way ahead of me sometimes.
December 22, 2011 1:52:42 PM

sanjeevnuts said:
Omi3D,
If you're still around, would you mind posting your configuration?
I'm starting out on a new build and so far have finalized Intel 2600K, Asus P8Z68-V Pro and 16Gb of Corsair Vengence 1600 XMP ram , the same as you.
I will add a 1000 W PSU but I'd like to know what case and CPU cooler you are using.

I'm a casual gamer, my primary hobby is video transcoding and photoshopping.
Are you happy with the 560 video card?

Thanks in advance.


I'm very happy with the 560 Ti card. I also have 570s and 580s, but they behaved somewhat strangely, not consistently I mean.
I had much better consistency with the 560 Ti, probably because it's closer in clock to the stock 850MHz of the on-chip Intel HD 3000 in the i7-2600K.
As a whole, I feel it best matches the rest of the system's capabilities (not too much stronger or weaker) and "plays nicer" with it all.

The case I'm using is the Antec 300 gamer case (Armour Black) with an SSD mount. I chose this case not because it's cheap ($70.00 US) but because it has a 140mm top-mounted fan located very close to the rear of the case, where the radiator for the Corsair H60 CPU cooler is mounted. I could buy any case on the planet, but that one, oddly, had the top-mounted fan closest to the rear for what I was going for in my system.

I chose the H60 for some very good reasons:
- The H100: overkill for this, plus a variable-speed pump (no way, my friend).
- The H70: radiator a bit too thick, though a nice system, plus a variable-speed pump (nope, gotta go).
- The H60: nice size, very capable for a moderate OC, and NO variable-speed pump (hooray).

Since all the monitoring is motherboard-based and thermally throttled, I wanted a pump with a single speed. That way the rpm changes stay thermally consistent without the electrical cycling that would probably ruin a set-variable pump rather quickly.
I now feel this assessment was correct, because those variable units have been seen having pump issues.

They are NOT bad units, mind you. Just not as durable (IMO) when thermally ramped up and down all the time.
If I weren't using the built-in motherboard thermal controls (i.e. with them disabled), they would be just fine, though I would then be limited in thermal control range. If the H70 (and others) had a single-speed pump, it would have been a strong choice for me.

I wanted a "fire and forget" cooling solution.

The system config is listed here on another post.
http://www.tomshardware.com/forum/forum2.php?config=tom...
March 9, 2012 6:37:41 AM

Omi3D said:
I see. Well now, perhaps that should have been the question in the beginning?

If you are referring to the VGA & DVI MB outputs then NO, it's an either/or situation. The CPU has only 1 GPU for the internal video.

The Intel graphics on-chip supports only one monitor at a time. The presence of two outputs is a choice for either older analog/TV or newer digital monitors. Pick one. If you want to use two monitors then you will need a PCI-E video card with dual heads.

Side Note: For dual monitors its strongly suggested that two identical monitors be used. That is to say the same make & model of monitors.
This is due to many reasons which, for brevity sake are quite important to both the ease of usefulness and longevity of each.



Hmm, odd. I am running this mobo with two monitors right now, one plugged into DVI-D and one plugged into the HDMI port on the mobo, and I don't have a PCI Express video card.
March 15, 2012 3:59:28 PM

If you are:
Are you truly spanning the desktop across both monitors, or just showing 2 identical desktops?

At what resolution are you driving the desktop?

It was my understanding that the iGPU in the Intel Sandy Bridge CPU cannot drive 2 monitors at the same time, probably due to a voltage supply limitation as described by Intel. I could find no mention of 2-monitor support with only the iGPU, and a few places that caution against it on SB chips. Perhaps some BIOS-level revision addressed this after the fact, I don't know.

EDIT: Yes, this was addressed in more recent BIOS revisions, so two of the three available outputs are supported simultaneously.

Still, I myself would not want to do this, for several other reasons: thermal load, frequency shift, power needs, etc. Better to use a dual-head card, which is designed for typical dual-screen use. But as always, feel free to experiment.

I'll go the tried-and-true way and not add extra thermal stress to the CPU.
If you overclock, it's a reasonable conclusion NOT to do this.

If you don't overclock, then by all means enjoy it.