
Nvidia Optimus Allows 'Hot' Removal of GPU

By - Source: Tom's Hardware US | 41 comments

Nvidia yanks the GPU from a live, running computer system.

Nvidia Optimus is a hardware and software technology that determines, via software, when applications require discrete graphics and supplies power to the GPU in response. This is done automatically, without the need to manually switch between integrated and discrete graphics.

For notebook users this is a particularly useful technology, as the additional power draw of the GPU is called upon only when needed. This should extend battery life, making all notebook users happy.
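
For readers who like to think in code, the toy sketch below illustrates the idea; the class, method, and profile names are hypothetical and do not come from Nvidia's actual driver.

    # Hypothetical sketch of Optimus-style GPU switching; not Nvidia's actual driver code.
    # Apps matched against a profile list get the discrete GPU; everything else runs on
    # integrated graphics while the discrete GPU stays completely powered off.
    DGPU_PROFILES = {"game.exe", "video_transcoder.exe", "cuda_app.exe"}  # assumed profile list

    class HybridGraphics:
        def __init__(self):
            self.dgpu_powered = False

        def route(self, app_name, running_apps):
            """Pick a renderer for app_name and manage dGPU power as a side effect."""
            if app_name in DGPU_PROFILES:
                if not self.dgpu_powered:
                    self.dgpu_powered = True   # bring up power rails, PCIe link, frame buffer
                return "discrete"
            if not any(a in DGPU_PROFILES for a in running_apps):
                self.dgpu_powered = False      # nothing needs the dGPU: cut its power entirely
            return "integrated"

    gfx = HybridGraphics()
    print(gfx.route("browser.exe", ["browser.exe"]))            # integrated; dGPU stays off
    print(gfx.route("game.exe", ["browser.exe", "game.exe"]))   # discrete; dGPU powers up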

Nvidia has released a video showing Optimus in action that should help drive home how slick this new technology is. Rather than just putting the GPU into an idle or low-power state, the system completely powers down the graphics hardware. With the GPU off, the system doesn't even mind if the dedicated graphics part is physically removed while the machine is running.


"Few people ever get to see this demo because it requires a completely open notebook system – no chassis – just the motherboard, CPU, GPU, Hard drive, and monitor, so it is not exactly portable. This demo is really killer with engineering teams that design notebooks," said Matt Wuebbling, senior product manager of notebooks at Nvidia.

"They practically fall out of their chairs when they see it," he continued. "Why? Because with Optimus when the GPU is not needed it is completely powered off automatically and seamlessly WHILE the rest of the system is up and running – the power to the PCI Express bus, the frame buffer memory, the GPU - everything. This is in contrast to switching the GPU to a low-power state or to ‘idle’, which would still draw power."

Comments
This thread is closed for comments.
  • 10
    vant , March 3, 2010 9:58 PM
    Makes sense.
  • -2
    jay236 , March 3, 2010 10:02 PM
    Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?
  • 9
    maxsp33d , March 3, 2010 10:15 PM
    I wonder if we'll ever reach a point when we can do this with any part of the system (i know SATA hdds are hot-swappable)
  • 3
    IzzyCraft , March 3, 2010 10:18 PM
    jay236: Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?

    Depends how hot the chip gets; if the temp variance is small it wouldn't matter too much.
  • 40
    eklipz330 , March 3, 2010 10:23 PM
    ATi's megatron is gonna be pretty pissed about this...
  • 2
    darthvidor , March 3, 2010 10:28 PM
    maybe soon we'll see a PCI-e video daughter-card with multiple GPU sockets for SLI on demand
  • 4
    festerovic , March 3, 2010 10:31 PM
    Optimus was the brand name of shitty products from Radio Shack back in the day...
  • 2
    NapoleonDK , March 3, 2010 10:34 PM
    Very cool.
  • 0
    mindless728 , March 3, 2010 10:51 PM
    pretty cool
  • -3
    haunted one , March 3, 2010 10:57 PM
    I'd love it if this technology could be applied to allow GPU upgrades on notebooks.
  • 9
    tipoo , March 3, 2010 10:57 PM
    Then they should have called it OFFtimus.

    Ha! Get it? Get it?! See, I...Oh forget it.
  • -1
    jojesa , March 3, 2010 11:01 PM
    eklipz330: ATi's megatron is gonna be pretty pissed about this...

    ROFLMAOWTIME
  • 2
    echdskech , March 3, 2010 11:05 PM
    does anyone know if/when they will apply the tech to desktops?
  • 12
    Shadow703793 , March 3, 2010 11:17 PM
    Kudos for nVidia for this, HOWEVER WHERE is ME FERMI!?!?!?
  • 0
    Anonymous , March 3, 2010 11:24 PM
    Isn't dynamic clocking almost the same (only for the saving energy part)? I think it will delay for a second or half a second to start rendering something.
    Although this is great when you would have 4 separate GPUs and 3 of them aren't in use. I am not sure how stable it will be for overclockers though...
  • 1
    maigo , March 3, 2010 11:58 PM
    maxsp33d: I wonder if we'll ever reach a point when we can do this with any part of the system (i know SATA hdds are hot-swappable)

    Some server boards can hot swap cpus/ram
  • -7
    ohim , March 4, 2010 12:08 AM
    another BS to put as a sticker on your video card when you buy it, i see no reason at all to remove my video card when my PC is running.
  • 2
    overclockingrocks , March 4, 2010 12:11 AM
    that's pretty neat to watch. when this hits notebooks I can imagine it'll make for great battery life no matter what you're doing
  • 0
    saint19 , March 4, 2010 12:15 AM
    maigo: Some server boards can hot swap cpus/ram

    Yeah, that's true, but only server mobos, not home mobos.
  • 0
    bin1127 , March 4, 2010 12:42 AM
    TH must be loving this tech when doing your benchmarks.