
Lucid Virtu, Intel Z68, discrete gfx and dual monitor

Last response: in Graphics & Displays
May 14, 2011 11:51:11 PM

Hi everyone

The Z68 chipset is finally here, and so are motherboards featuring it. With the whole Z68/Sandy Bridge business comes the ability to use a discrete GPU for gaming and such, while harnessing the power of Sandy Bridge's built-in graphics and Quick Sync for encoding/decoding purposes. And with that comes Lucid (Logix) Virtu - GPU virtualization (it should have been called "abstraction" in my eyes, though) - and a lot of questions. It's all very new, but I hope we can answer a couple of questions during the next few days and help eager Z68 buyers (like myself) :)


#1: Does the motherboard have to feature a built-in/onboard VGA/HDMI/DVI display port in order for me to make use of Virtu's switching between the discrete and Sandy Bridge GPUs and its assigning of appropriate tasks to each?


As far as I can tell, the motherboard needs to feature a built-in/onboard display port of some sort. This rules out, for example, the Gigabyte GA-Z68X-UD4-B3 and GA-Z68X-UD7-B3, but seems to keep the Asus P8Z68-V PRO in the game.
http://livetechnoguide.com/intels-z68-approaches-asus-p...

#2: Which port do I connect my monitor(s) to - the onboard/built-in ports or those on the discrete GPU?
I seem to have found claims of both. This http://livetechnoguide.com/intels-z68-approaches-asus-p... sort of says it has to be the built-in/onboard port, whereas this http://benchmarkreviews.com/index.php?option=com_conten... suggests you can actually choose.

#3: How about dual monitor setups?
If you can use the discrete graphics card's display ports it seems to be no problem - most cards come with 2 DVI ports. But if you are forced to route all display signals through the motherboard's onboard/built-in ports, I think you could get into trouble - say you have 2 DVI monitors and the motherboard only features 1 VGA and 1 DVI port...
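For what it's worth, the way reviews describe the two Virtu modes and where the cable goes in each can be pictured as a tiny lookup. The mode names ("i-mode"/"d-mode") are Lucid's own terms; the `display_port_for` helper is purely my illustration, not a real Virtu API:

```python
# Purely illustrative sketch -- "i-mode" / "d-mode" are Lucid's mode
# names; display_port_for() is a made-up helper, not a real Virtu API.

def display_port_for(mode: str) -> str:
    """Which output the monitor cable goes into in each Virtu mode."""
    ports = {
        # i-mode: monitor on the motherboard's onboard output; the
        # discrete card renders demanding work and its finished
        # frames are copied over to the integrated GPU for display.
        "i-mode": "onboard (motherboard VGA/DVI/HDMI)",
        # d-mode: monitor on the discrete card; the integrated GPU
        # is only engaged for Quick Sync transcodes.
        "d-mode": "discrete (ports on the graphics card)",
    }
    return ports[mode]

print(display_port_for("i-mode"))  # onboard (motherboard VGA/DVI/HDMI)
```

If that picture is right, a dual-DVI-monitor setup would only be comfortable in d-mode, since few boards offer two digital onboard outputs.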

Please provide any information you have stumbled upon regarding these matters. Preferably with a source :) 

I can't wait to buy a z68 system and really look forward to understanding this whole thing.

Best regards
Wuhtzu
June 15, 2011 9:14:28 PM

The ASRock installation guide is the best documentation I could find. Intel has documentation at the following link (you can search the Intel site):

http://downloadmirror.intel.com/19993/eng/Lucid%20Virtu...

However, it is less detailed. I am looking at my first build for a system for everyday use; the only power use is consumer HD camcorder video editing with PowerDirector 9, and I do not play games requiring fancy GPUs (I wish Tom's Hardware would also deal with simpler issues like mine and not only those for gamers!). But I do want the best video playback I can get. So I too was enthusiastic about the prospect of coupling an Intel Core i7-2600K processor with a Z68 motherboard. I was thinking about buying the lowest-level GPU that would give me the best video graphics possible when I watch my edits, DVDs or Blu-ray videos, which according to the following article appears to be the AMD HD 6570 (an HQV 2.0 score of 201 is as good as the best AMD GPUs):

http://www.tomshardware.com/reviews/radeon-hd-6570-rade...

Intel HD Graphics 3000 apparently does not give you the best video playback possible (HQV 2.0 score of 159), and according to this article there are apparently some issues with 24p, which I do record in low light on my HD camcorder:

http://www.tomshardware.com/reviews/a8-3500m-llano-apu,...

My thinking then was to use d-mode so I could get the best possible HD playback, and only use the processor's Quick Sync when I convert an AVCHD file to MPEG-2 for putting the edited video on a Blu-ray disc (or for any other file transcode, for that matter) within PowerDirector 9, if I was unhappy with the time conversion takes. However, for videos where I want the best quality, my thinking was to avoid all hardware acceleration by not using APP on the AMD GPU and avoiding Intel Quick Sync as well, since it appears CPU encoding/transcoding is of better quality than with any of these hardware accelerators (see the following article):

http://www.tomshardware.com/reviews/video-transcoding-a...

So I am looking into whether the Virtu program allows the flexibility I am looking for: to choose when I want to use Quick Sync (which probably won't be often), but to let me use the GPU for video playback, essentially whenever I am not using Quick Sync. That is the only reason I can see for going through all of this vs. getting a P67 board instead, or just throwing up my hands and buying AMD components like a six-core Phenom II with a cheap motherboard until Intel gets this done without a software patch.
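The routing I am hoping Virtu would allow can be written down as a simple table. To be clear, this is entirely hypothetical - the task names and the `pick_engine` helper are my own invention, not anything Virtu or PowerDirector actually exposes:

```python
# Hypothetical routing table for the workflow above: playback on the
# discrete card, Quick Sync only when transcode speed matters, plain
# CPU encoding when quality matters most. Task names and pick_engine()
# are invented for illustration only.

def pick_engine(task: str) -> str:
    routes = {
        "playback": "discrete GPU (d-mode)",
        "transcode-fast": "Quick Sync (integrated GPU)",
        "transcode-best-quality": "CPU only (software encoder)",
    }
    # Default to software encoding, the quality-safe choice.
    return routes.get(task, "CPU only (software encoder)")

print(pick_engine("transcode-fast"))  # Quick Sync (integrated GPU)
```

Whether Virtu gives the user that kind of per-task control, rather than choosing automatically, is exactly my open question.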


My concerns, however, are the following:

1. It is not clear whether Virtu lets you control the choice of GPU, and it is not clear that the program chooses the best option. See this link, which suggests the software chooses the GPU when it should not, and vice versa:

http://www.bit-tech.net/hardware/motherboards/2011/05/1...

2. Will this software have conflicts with other software?

I am trying to find out if I can control the choice of graphics options. No documentation I could find explains this. The ASRock document has a picture showing a green "On" button, a graphics-card icon and an integrated-graphics icon. So maybe it is possible if you can select these, but it may just be a status indicator. Without the choice, I am concerned I may be forced into using hardware accelerators when I don't want them (where I want the highest quality), or have to watch video on integrated graphics, which does not perform as well as the $70 graphics card I am looking at getting.

If anyone can illuminate this issue I'd appreciate it. Also, can someone explain whether software encoding/transcoding is what you get if you disable AMD's APP on their GPU and don't use Quick Sync in PowerDirector 9? I've written support and can't get a straight answer from them. Perhaps there should be a forum or a yearly article on Tom's Hardware to help us multimedia people who don't do games! Thanks!
July 22, 2011 9:22:47 PM

I was told Virtu is only automatic, but you can make manual choices on a Z68 board by changing the primary graphics setting and disabling onboard graphics in the BIOS, and by connecting the same monitor to both the graphics card and the onboard graphics.
July 23, 2011 12:01:49 AM

I reached a point where I can have the onboard and discrete graphics enabled at the same time in the BIOS, and I can successfully install both Nvidia's driver and the Intel HD driver, and then install Lucid Logix's Virtu driver (http://www.lucidlogix.com/driverdownloads-virtu.html).

All that gives me this control panel: http://images.bit-tech.net/content_images/2011/05/what-...

Which is quite simple and doesn't do much. I'm actually not really sure what I get from pressing "ON". Maybe if I press "OFF" I lose the ability to make use of Quick Sync, etc...
July 23, 2011 4:19:24 AM

I saw the control panel with the On button, the processor-graphics logo and the GPU-graphics logo, and asked Gigabyte (for their Z68X-UD3H-B3 board) and Intel (for their new Z68 board) whether there was a manual option to select the choice of graphics with Virtu. They both told me there is no manual option, so if you are finding a way to get manual selection I'd appreciate knowing it. Based on their feedback I have not installed Virtu, because I am not enamored with the automatic Virtu control. What I have been doing instead is connecting both the GPU and the integrated-graphics outputs to the same monitor, selecting the graphics option I want in the BIOS, and choosing the appropriate input on my monitor to effect manual control. Since I primarily use the GPU and know what I want to do ahead of time, it's really not that bad, at least until Virtu perhaps adds manual control and I can install it to avoid the BIOS adjustments.
July 23, 2011 3:08:37 PM

I have been very curious about the Z68 chipset and its ability to work with both the IGP and a discrete graphics card.

From what I read, only one can be active at any given time. Therefore, a user can use either the IGP or a discrete graphics card, but NOT BOTH at the same time.
That's where Lucid Virtu comes in. It virtualizes the video layer and switches to either engine as needed.

Therefore, accordingly, only one engine is active at any given time.
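The way reviews usually describe i-mode, the "switching" is really a frame hand-off: the discrete GPU renders, and the virtualization layer copies the finished frame into the integrated GPU's framebuffer, which drives the monitor. A toy sketch of that idea (every name here is invented for illustration, nothing is a real Virtu interface):

```python
# Toy model of the i-mode hand-off as described in reviews: the
# discrete GPU renders each frame, then the virtualization layer
# copies it into the integrated GPU's framebuffer, which actually
# drives the monitor. All names are made up for illustration.

igpu_framebuffer = []  # what the monitor ultimately shows

def dgpu_render(frame_id: int) -> dict:
    # Heavy 3D work happens on the discrete card...
    return {"id": frame_id, "rendered_by": "discrete GPU"}

def virtu_copy_to_display(frame: dict) -> None:
    # ...then the finished frame is handed to the iGPU for output.
    igpu_framebuffer.append(frame)

for f in range(3):
    virtu_copy_to_display(dgpu_render(f))

print(len(igpu_framebuffer))  # 3 frames shown via the onboard port
```

If that model is accurate, both engines are powered at once and the copy step is where the reported performance overhead comes from.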

In my simplified mindset, I thought about running F@H on the discrete card and using the IGP for the desktop, based on the above assumption. Is this possible? With and/or without Lucid Virtu?

Curiously, how does Quick Sync work? Does the program specifically ask the IGP to work on some tasks and switch back to the discrete card as necessary?

In theory, Z68 is VERY interesting. I would love to see some concrete evidence of how they work together seamlessly.

Best.

JL
August 31, 2011 3:40:48 PM

OK, I'm doing some stuff with this Lucid Virtu on an Asus P8Z68-V Pro. I had no install problems at all with the AMD CCC for my 5870 CrossFire setup, the Intel drivers and Lucid.

I made sure I installed the Intel GPU driver first, then CCC 11.8 and the CAP 2 profile. Then I installed Lucid, left the HDMI cable in the first 5870, and everything seems to be working!

I also did the latest Lucid Virtu update from their website.

I imagine, like John Lennon, that it's doing something here? I'm in d-mode, I believe; haven't tried i-mode but I will.

It says that in d-mode it boosts your GPU performance, but it doesn't give you the option to switch between the Intel GPU for Windows and then the big boys, with a boost and profile, for gaming?

From what I can tell so far, it's not really doing much at all? Another gimmick, maybe.

The same goes for SRT, which I also installed. It seems to be working fine... but what is it doing?

I think, reading around the internet at overclocking reviews and features of this new Z68 chipset, everybody is a bit disappointed! The features it offers are a bit of an "Emperor's New Clothes" kind of scenario. Are we going to get our asses kicked by the Bulldozer 990FX guys in a few weeks? I'm a bit worried, as I dropped my 990FX onto a friend and went 2600K Intel, lost PCIe 16x/16x and gained... hmmm.