
Lucid Demonstrates XLR8 Frame Rate Boosting Technology

By Benjamin Kraft | Source: Tom's Hardware US

Lucidlogix shows off its virtual GPU technology.

We paid Lucidlogix a visit at Computex to get a run-through of its current portfolio. We were given a demo of Virtu MVP "discrete edition," which, as the name suggests, is a solution for desktops with a discrete add-in graphics card.

This version of Virtu can do everything previous versions of MVP could, and it also has an integrated mode (i-mode), in which the monitor is attached to the outputs of the integrated GPU.

If the system is only showing 2D output or doing tasks that can be handled by the integrated GPU (such as Quick Sync transcoding), the add-in graphics card is turned off completely for a 0 Watt mode. As soon as 3D mode is activated or a heavier workload comes up, Virtu automatically wakes up the add-in card seamlessly.

 

There are two ways to achieve this:

•       A graphics card with a special chip that does the monitoring and waking (currently connected internally from the graphics card to the motherboard via USB), or

•       A motherboard that can switch the x16 PCIe slot on and off.
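Either way, the software side behaves the same. As a rough illustration only (Lucid has not published its implementation, and the class and function names below are our own invented stand-ins), the auto wake/sleep described above boils down to a small polling loop in the spirit of this Python sketch:

    import time

    class DiscreteGpu:
        """Hypothetical stand-in for the USB wake chip / PCIe slot switch."""
        def __init__(self):
            self.powered = False

        def wake(self):
            self.powered = True   # fans spin up, card comes online

        def sleep(self):
            self.powered = False  # "0 Watt mode": card fully off, fans stopped

    def workload_is_heavy() -> bool:
        """Placeholder: detect a 3D app or a load the integrated GPU can't handle."""
        return False  # stub so the sketch runs

    def virtu_power_loop(gpu: DiscreteGpu, poll_seconds: float = 1.0) -> None:
        """Gate the discrete card on and off as the workload changes."""
        while True:
            if workload_is_heavy() and not gpu.powered:
                gpu.wake()    # e.g. a game was just launched
            elif not workload_is_heavy() and gpu.powered:
                gpu.sleep()   # back to 2D output or Quick Sync work
            time.sleep(poll_seconds)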

Lucidlogix demonstrated this system using a standard Z68 board and a slightly modified Gigabyte GV-N680SOC-2GD (we reported on this card in its monstrous five-fan configuration) that was equipped with the special chip.

UPDATE: To be clear, Lucidlogix is just a software solutions provider. This demonstration of Virtu MVP was running on hardware modified by Gigabyte with a simple USB control on the GPU AIC that connects to the motherboard. It is then Lucid's software that manages and controls it.

During the demo, the system was running normally, but the graphics card was powered down completely with the fans stopped. When a game was launched, the fans spun up and you got the GTX 680's performance with no flickering or other disruptive signs.

In theory, you could even remove the discrete card from the active system in 2D mode, as shown in this video:

Lucidlogix Desktop GPU Power Control
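On Linux, you can approximate that software "unplug" yourself with the kernel's standard PCI hotplug interface. This is an analogous mechanism shown for illustration, not what the demo used, and the device address below is just an example:

    # Software "unplug"/"replug" of a PCIe GPU via Linux sysfs (run as root).
    GPU_BDF = "0000:01:00.0"  # example bus address; check lspci on your system

    def remove_gpu() -> None:
        """Detach the device from the live PCI tree, as if it were pulled."""
        with open(f"/sys/bus/pci/devices/{GPU_BDF}/remove", "w") as f:
            f.write("1")

    def rescan_bus() -> None:
        """Re-enumerate the bus so the card shows up again."""
        with open("/sys/bus/pci/rescan", "w") as f:
            f.write("1")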

Lucid is now working on applying some of its technical know-how to single-GPU configurations, which is interesting to hear, given that we have known this technology as a way to automatically switch between, or virtualize, two GPUs.

Lucid is calling this XLR8. It uses the two MVP technologies we already know about: Virtual V-Sync, to reduce screen tearing, and HyperFormance, to make 3D rendering more efficient.
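Lucid hasn't documented the internals, but Virtual V-Sync behaves much like a triple-buffered present: the game renders as fast as it likes, while only the most recently completed frame is flipped to the display at each refresh. A toy Python model of that idea, which is our reading rather than Lucid's code:

    class VirtualVSync:
        """Toy model: tear-free output without capping the game's render rate."""
        def __init__(self):
            self.latest_complete = None  # newest fully rendered frame

        def on_game_present(self, frame) -> None:
            # The game may call this hundreds of times per second;
            # nothing reaches the screen from here.
            self.latest_complete = frame

        def on_display_refresh(self):
            # At each vblank (e.g. 60 Hz) only a complete frame is flipped,
            # so no torn, half-drawn frame can ever be shown.
            return self.latest_complete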

New to the mix is DynamiX. On weaker GPUs, it is meant to improve frame rates by dynamically scaling back quality. While static elements such as the HUD and GUI remain at full quality, the rest of the scene is rendered at reduced quality in an attempt to hit a minimum frame rate target set by the user.

It's a simple interface with two sliders: one for the minimum frame rate the user wants, the other for the minimum quality the user is willing to settle for. This was demoed with Diablo III on a Dell XPS 13 Ultrabook with a Sandy Bridge Core i5 and its integrated GPU. The quality reduction was very evident, but responsiveness was much better, with frame rates doubling from 14 FPS stock to 28 FPS with XLR8. Also, as promised, the HUD remained unaffected.
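Mechanically, that description matches a classic closed-loop quality scaler: measure the recent frame rate, lower scene quality (never the HUD) when you fall below the FPS floor, and raise it again when there is headroom, clamped at the quality floor. A minimal sketch under those assumptions, with invented slider values and step size:

    def dynamix_step(measured_fps: float,
                     scene_quality: float,
                     min_fps: float = 28.0,     # user slider 1 (value invented)
                     min_quality: float = 0.5,  # user slider 2 (value invented)
                     step: float = 0.05) -> float:
        """Return the scene quality scale (0..1) to use for the next frame.

        HUD/GUI elements stay at full quality regardless; only the 3D scene
        is scaled, matching what the Diablo III demo showed.
        """
        if measured_fps < min_fps:
            # Too slow: trade visual quality for frame rate, down to the floor.
            return max(min_quality, scene_quality - step)
        if measured_fps > min_fps * 1.2:
            # Comfortable headroom: claw quality back up toward full.
            return min(1.0, scene_quality + step)
        return scene_quality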

 

The XLR8 technology trades graphical quality for performance, but the point, according to Lucid, is to take games to playable levels on lower-end hardware. It's still just a tech demo, but it already works with DX 9, 10 and 11. In another demo, on an Asus laptop with a GeForce 610M, XLR8 took Battlefield 3 from 14 to 26 FPS.

Finally, Lucid also showed remote rendering and gaming. The current problem with remote desktop software is that it uses a (slow) virtualized VGA device. Lucidlogix's solution uses the actual GPU for rendering, so output on the remote screen is much smoother. Because the rendering happens on the server side, it works with any remote connection software. Lucid showed it on an Android tablet and a smartphone, but it works for pretty much any client, even desktop-to-desktop.

Here's a video showing that in action:

Lucidlogix GPU Server utilization
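The key difference from ordinary remote desktop is where the pixels come from: frames are rendered on the server's real GPU, encoded as video, and streamed, so the client only needs a video decoder. A hedged sketch of that server-side pipeline (every function name here is our placeholder, not Lucid's API):

    import time

    def render_on_gpu() -> bytes:
        """Placeholder for a real GPU render pass (not a virtualized VGA device)."""
        return b"\x00" * (1280 * 720 * 4)  # pretend RGBA frame

    def encode_video(frame: bytes) -> bytes:
        """Placeholder for a hardware video encode (e.g. H.264) of the frame."""
        return frame[:1024]  # pretend compressed packet

    def remote_render_loop(send_to_client, fps: int = 30) -> None:
        """Server side: render on the actual GPU, encode, stream to any client."""
        while True:
            send_to_client(encode_video(render_on_gpu()))
            time.sleep(1 / fps)  # client only needs a video decoder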

Read more from @MarcusYam on Twitter.

Comments
  • Anonymous, June 11, 2012 9:35 PM (+2)
    "dynamically scaling back quality"...didn't Carmack fail at this recently (Rage)?
  • army_ant7, June 11, 2012 9:44 PM (+13)
    Lucidlogix is making me love them more now! I remember far back that it was just the concept of using different GPU's in tandem that intrigued me, but practical efficiency technologies are really wowing me!
  • fb39ca4, June 11, 2012 9:48 PM (-4)
    Oh look a lot of fancy marketing terms! So exciting!!!
  • aldo_gg, June 11, 2012 9:53 PM (+5)
    Wow. I was dreaming on remote rendering and gaming for 1/3 of my life
  • army_ant7, June 11, 2012 9:55 PM (+2)
    fb39ca4: "Oh look a lot of fancy marketing terms! So exciting!!!"


    Well, you gotta name your "children" something cool (that would be up for discussion (its coolness)). :-))
  • frank the tank, June 11, 2012 10:27 PM (+6)
    Isnt XLR8 a moniker for PNY products? (video cards/ RAM)
  • Darkerson, June 11, 2012 10:57 PM (+2)
    This is some pretty neat stuff they are working on.
  • geminireaper, June 11, 2012 10:58 PM (-8)
    whats the news here? This is already being done in laptops. my wifes i5 has an intel and an nvidia gpu. Under light load it uses the intel but when I load a 3d intensive game the nvidia one takes over.
  • memadmax, June 11, 2012 11:10 PM (+1)
    Do the authors of these articles even bother reading what they just typed anymore??????
  • memadmax, June 11, 2012 11:11 PM (0)
    Also, this should be a feature built into all motherboards by now...
  • warmon6, June 11, 2012 11:34 PM (+6)
    geminireaper: "whats the news here? This is already being done in laptops. my wifes i5 has an intel and an nvidia gpu. Under light load it uses the intel but when I load a 3d intensive game the nvidia one takes over."


    Your in a small box.......

    Yes it's the same concept as what nvidia has done but a few things:

    1. it's in a desktop

    2. In theory, this software could be used with any intel motherboard (maybe AMD in the future) and ANY gpu combination once graphic card makers get on board with this.
  • Energy96, June 11, 2012 11:52 PM (-8)
    warmon6: "Your in a small box....... Yes it's the same concept as what nvidia has done but a few things: 1. it's in a desktop 2. In theory, this software could be used with any intel motherboard (maybe AMD in the future) and ANY gpu combination once graphic card makers get on board with this."


    I fail to see why anyone would ever really care about this in a desktop setting. The only advantage I see is a small power savings in 2D mode where most discrete cards already have a fairly small consumption rate anyway. If it were even $50 a year I would be surprised. I didn't even see any difference in my already small electricity bill when I added SLI GTX 580's to my system. On battery power? sure that's great, plugged in to the wall? /shrug who cares.
  • A Bad Day, June 12, 2012 12:51 AM (+3)
    Hot swappable GPU.

    Mind blown.
  • ashesofempires04, June 12, 2012 1:08 AM (0)
    frank the tank: "Isnt XLR8 a moniker for PNY products? (video cards/ RAM)"


    Yes, yes it is. A PNY XLR8 GTX 670.


    Cue the trademark lawsuit in 3...2...1...
  • darkavenger123, June 12, 2012 1:30 AM (-1)
    Err...what's the news here?? I am currently using a Lucid capable Z77 board running i3+ GTX 460 with the monitor attached to the mobo HDMI (using i3) output. Supposely to save the power in i-mode...but according to my test, it does nothing....my system still at 70W+ whether in i-mode or d-mode(connected to the GTX-460) when idle or low load.

    But at least the Virtual Vsync and Hyperformance seems to work. The only news here i saw is some 'modification' done by Gigabyte on the mobo....does this means all the current claims are false since it won't work save power and turn down the discrete GPU completely?? This means only those with whatever 'modifications' can shut down the discrete GPU?? I e-mailed LUCID but they never reply me.
  • darkavenger123, June 12, 2012 1:31 AM (0)
    Just to add, yes it can kick in to d-mode or hyperformance as long as i define the apps in the virtu software to tell it which apps to use i-mode/d-mode/hyperformance. Does this means this new edition is intelligent enough to kick start itself without me specifying the specific apps?
  • techguy911, June 12, 2012 2:54 AM (0)
    Energy96: "I fail to see why anyone would ever really care about this in a desktop setting. The only advantage I see is a small power savings in 2D mode where most discrete cards already have a fairly small consumption rate anyway. If it were even $50 a year I would be surprised. I didn't even see any difference in my already small electricity bill when I added SLI GTX 580's to my system. On battery power? sure that's great, plugged in to the wall? /shrug who cares."



    You don't get it: you can't use Quick Sync while a video card is in the PCIe slot. With software you can switch between Ivy Bridge and any add-on card, and if you're using the add-on video card in a game, software can give you access to the onboard GPU for more FPS.
  • bucknutty, June 12, 2012 3:43 AM (-5)
    All that computer equipment and they cant afford a propper desk? Could you imagine trying to game at that computer station?
  • A Bad Day, June 12, 2012 3:45 AM (+4)
    bucknutty: "All that computer equipment and they cant afford a propper desk? Could you imagine trying to game at that computer station?"


    It's call budget restraints. Lucid doesn't have Intel's finance health, otherwise they would've been involved in much more ambitious projects.
  • dreadllokz, June 12, 2012 8:12 AM (+1)
    Keep the good job! This kind of tech should be used in next gen consoles!