
Nvidia Optimus Allows 'Hot' Removal of GPU

Nvidia Optimus is a hardware and software technology that detects when an application requires discrete graphics and supplies power to the GPU in response. This happens automatically, with no need to manually switch between integrated and discrete graphics.

For notebook users this is a particularly useful technology, as the additional power requirements of the GPU are called upon only when needed. This should extend battery life, making all notebook users happy.

Nvidia has released a video showing Optimus in action that should help drive home how slick this new technology is. Rather than just put the GPU into an idle or low-power state, the system completely powers down the graphics hardware. With the GPU off, the system doesn't even mind if the dedicated graphics part is physically removed while running.

"Few people ever get to see this demo because it requires a completely open notebook system – no chassis – just the motherboard, CPU, GPU, Hard drive, and monitor, so it is not exactly portable. This demo is really killer with engineering teams that design notebooks," said Matt Wuebbling, senior product manager of notebooks at Nvidia.

"They practically fall out of their chairs when they see it," he continued. "Why? Because with Optimus when the GPU is not needed it is completely powered off automatically and seamlessly WHILE the rest of the system is up and running – the power to the PCI Express bus, the frame buffer memory, the GPU - everything. This is in contrast to switching the GPU to a low-power state or to ‘idle’, which would still draw power."
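The behavior Wuebbling describes can be sketched in pseudocode. This is a hypothetical illustration of the decision logic, not Nvidia's actual driver implementation; the application whitelist, class names, and function names are all assumptions made for the example.

```python
# Hypothetical sketch of Optimus-style power gating (NOT Nvidia's real driver
# code): a routing layer watches running applications and fully powers the
# discrete GPU down -- frame buffer, PCIe link and all -- when nothing needs it,
# rather than merely dropping it to an idle state.

GPU_APP_PROFILES = {"game.exe", "video_encoder.exe"}  # assumed profile list


class DiscreteGpu:
    def __init__(self):
        self.powered = False  # fully off: zero draw from GPU, VRAM, PCIe bus

    def power_on(self):
        self.powered = True

    def power_off(self):
        # Cuts power entirely, in contrast to a low-power/idle state
        # that would still draw some wattage.
        self.powered = False


def update_gpu_state(gpu, running_apps):
    """Keep the dGPU powered only while a profiled app is running."""
    if GPU_APP_PROFILES & set(running_apps):
        if not gpu.powered:
            gpu.power_on()
    elif gpu.powered:
        gpu.power_off()


gpu = DiscreteGpu()
update_gpu_state(gpu, ["browser.exe"])              # integrated graphics only
assert gpu.powered is False
update_gpu_state(gpu, ["browser.exe", "game.exe"])  # dGPU wakes seamlessly
assert gpu.powered is True
update_gpu_state(gpu, ["browser.exe"])              # game exits: GPU fully off
assert gpu.powered is False
```

In this sketch the rest of the system keeps running throughout; only the discrete GPU's power state changes, which is what makes the "hot removal" demo possible.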

  • vant
    Makes sense.
  • jay236
    Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?
  • maxsp33d
    I wonder if we'll ever reach a point when we can do this with any part of the system (i know SATA hdds are hot-swappable)
  • IzzyCraft
jay236: "Wouldn't this shorten the lifespan of the GPU because of the added cooling/heating cycles?"
    Depends how hot the chip gets; if the temp variance is small it wouldn't matter too much.
  • eklipz330
    ATi's megatron is gonna be pretty pissed about this...
  • darthvidor
    maybe soon we'll see a PCI-e video daughter-card with multiple GPU sockets for SLI on demand
  • festerovic
    Optimus was the brand name of shitty products from Radio Shack back in the day...
  • NapoleonDK
    Very cool.
  • mindless728
    pretty cool
  • haunted one
    I'd love it if this technology could be applied to allow GPU upgrades on notebooks.