Nvidia Granted Patent For Hybrid Graphics Systems

Source: USPTO | 35 comments

The U.S. Patent and Trademark Office has approved a patent application by Nvidia that describes switching between integrated and discrete graphics in a single computer system -- or what we know as Optimus.

Of course, this is not new technology, but patents can take an eternity to make their way through the approval process. Nvidia originally filed the patent application in November 2007. For those who keep track, this was about the time Nvidia was releasing its GeForce 8800 series GPUs based on the G92 core.

The patent explains that switching between an IGPU (integrated GPU) and a DGPU (discrete GPU) could deliver greater graphics performance when needed, but save power in general applications: "While in the hybrid graphics mode, the DGPU performs the graphics processing, and the graphics driver transmits the rendered images from the DGPU to the IGPU local memory and, then, to the IGPU DAC. This image transmission allows applications to fully exploit the processing capabilities of the DGPU, while using the display device connected to the IGPU."
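
To make the quoted passage a little more concrete, here is a minimal sketch of that flow, assuming a toy driver object model. Every name in it (Framebuffer, IntegratedGPU, DiscreteGPU, HybridGraphicsDriver, present, scan_out) is invented for illustration; this is not Nvidia's driver API, only a rough picture of "render on the DGPU, copy into IGPU memory, display through the IGPU."

# Illustrative sketch only; all class and method names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Framebuffer:
    """A rendered image sitting in some GPU's local memory."""
    width: int
    height: int
    pixels: bytes = b""


class IntegratedGPU:
    """IGPU: owns the memory the display scans out and drives the display via its DAC."""

    def __init__(self) -> None:
        self.local_memory: Optional[Framebuffer] = None

    def render(self, scene: str) -> None:
        # Power-saving path: render directly into IGPU local memory.
        self.local_memory = Framebuffer(1280, 800, f"igpu:{scene}".encode())

    def scan_out(self) -> bytes:
        # The display device is connected to the IGPU, so only frames in
        # IGPU local memory ever reach the screen.
        return self.local_memory.pixels if self.local_memory else b""


class DiscreteGPU:
    """DGPU: higher performance, but no direct path to the display in this design."""

    def render(self, scene: str) -> Framebuffer:
        return Framebuffer(1280, 800, f"dgpu:{scene}".encode())


class HybridGraphicsDriver:
    """Picks a GPU per frame and moves DGPU output into IGPU memory for display."""

    def __init__(self, igpu: IntegratedGPU, dgpu: DiscreteGPU) -> None:
        self.igpu = igpu
        self.dgpu = dgpu

    def present(self, scene: str, demanding: bool) -> bytes:
        if demanding:
            # Hybrid mode: the DGPU renders the frame, then the driver copies the
            # finished image into IGPU local memory so the IGPU DAC can display it.
            frame = self.dgpu.render(scene)
            self.igpu.local_memory = frame
        else:
            # General-purpose work stays on the IGPU to save power.
            self.igpu.render(scene)
        return self.igpu.scan_out()


if __name__ == "__main__":
    driver = HybridGraphicsDriver(IntegratedGPU(), DiscreteGPU())
    print(driver.present("desktop compositor", demanding=False))  # b'igpu:desktop compositor'
    print(driver.present("3D game frame", demanding=True))        # b'dgpu:3D game frame'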

The first graphics chips to support this technology were launched on September 11, 2008, arriving as the GeForce 9M series GPUs.


Comments
  • 10 Hide
    cuecuemore , August 7, 2012 12:19 AM
    Another patent granted, another blow to technological progress.
  • -2 Hide
    snowzsan , August 7, 2012 12:22 AM
    Not really... I mean, I like AMD cards and whatnot but I don't see how this is holding anyone back.
  • 17 Hide
    alextheblue , August 7, 2012 12:29 AM
    snowzsan: Not really... I mean, I like AMD cards and whatnot but I don't see how this is holding anyone back.

    Mmm.... if they wanted to be pricks they could go after AMD for their implementation of this. Not sure how far they would get, but it is something to keep in mind.
  • 9 Hide
    christarp , August 7, 2012 12:35 AM
    but... AMD has been doing this for years
  • -4 Hide
    markem , August 7, 2012 12:47 AM
    In two years this patent won't even be worth a dollar, as the Intel, ARM and AMD APUs will kill it before it even comes to court.

    Already we are seeing fewer and fewer discrete GPUs, apart from gaming laptops, and even that is a very small market.
  • 1 Hide
    A Bad Day , August 7, 2012 12:58 AM
    What was that one company that was developing software that allowed the motherboard to switch between a discrete Nvidia/AMD GPU and the integrated Intel GPU?
  • 13 Hide
    tntom , August 7, 2012 12:59 AM
    This is not your vague Apple patent. AMD uses a different technology to do graphics switching. So no worry. Although I believe Nvidia's is slightly better than AMD's.
  • 10 Hide
    darkavenger123 , August 7, 2012 1:17 AM
    Will it affect LucidTechnology and their implementation on Intel internal GPU with discrete GPU? I hope not...
  • 5 Hide
    nukemaster , August 7, 2012 1:43 AM
    darkavenger123: Will it affect LucidTechnology and their implementation on Intel internal GPU with discrete GPU? I hope not...

    +1, this is exactly what Virtu does. And it works far better than I had guessed it would.
  • 3 Hide
    _Cubase_ , August 7, 2012 1:53 AM
    cuecuemore: Another patent granted, another blow to technological progress.


    At least Nvidia has the means and intent to actually produce this technology. This is not what I would call patent trolling because it is quite obviously a patent specific to the company's intended development roadmap, and does not prevent any other company's existing technology from being produced or innovated upon.
  • 5 Hide
    chazbeaver , August 7, 2012 2:02 AM
    christarp: but... AMD has been doing this for years

    There are multiple ways of doing everything, and NVIDIA figured out another way of combining graphics systems.
  • -2 Hide
    Anonymous , August 7, 2012 2:06 AM
    I want to see discrete cards with no display port now.
  • 0 Hide
    slrmichael , August 7, 2012 2:16 AM
    Graphics cards seem to be getting crazier and crazier.
  • 2 Hide
    xophaser , August 7, 2012 2:21 AM
    cuecuemore: Another patent granted, another blow to technological progress.

    I don't fully agree here. The original idea of a patent is to protect businesses that come up with something innovative. When somebody copycats an idea, the original company loses the money it spent on R&D while the copycat spends nothing. This patent is legit. Either AMD has some alternative (like the 30 different ways you can make an ab cruncher and not infringe on a patent) or they pay royalties on this patent. Companies pay patent royalties to each other for using the same idea all the time. I don't remember exactly how long, but after a number of years (15-30) the patent expires, the idea becomes general practice, and nobody has to pay anybody.

    Where patents went wrong is patent trolling: lawyers at firms that don't produce anything, buying up ideas and never producing.
  • -1 Hide
    A Bad Day , August 7, 2012 2:36 AM
    krenotenze: I want to see discrete cards with no display port now.


    What's wrong with DisplayPort? I'd rather see monitor manufacturers drop DVI and VGA so GPUs can have better ventilation and more ports.
  • 1 Hide
    crisan_tiberiu , August 7, 2012 3:41 AM
    nVidia CEO: "FU Virtu with your Intel IGPus" :p 
  • 3 Hide
    altriss , August 7, 2012 4:12 AM
    markem: In two years this patent won't even be worth a dollar, as the Intel, ARM and AMD APUs will kill it before it even comes to court. Already we are seeing fewer and fewer discrete GPUs, apart from gaming laptops, and even that is a very small market.


    Yes, only on gaming computers...
    Have you ever heard of graphics acceleration?
    Like in hospital imaging systems?
    Or for research in high-energy particle physics?
    Or for simulating physical systems in the automotive and aeronautics industries?
    Or for data acquisition in various areas?
    And so on...
    Maybe for people who know nothing about computers beyond COD and Angry Birds, discrete GPUs are rare products, but there is a whole world that needs computing power and thus discrete GPUs...
  • -1 Hide
    southernshark , August 7, 2012 4:43 AM
    I'm just happy that today technology isn't about new things, it's about new patents and laws.

    TomsLegalforum perhaps?
  • 2 Hide
    southernshark , August 7, 2012 4:44 AM
    altriss: Yes, only on gaming computers... Have you ever heard of graphics acceleration? Like in hospital imaging systems? Or for research in high-energy particle physics? Or for simulating physical systems in the automotive and aeronautics industries? Or for data acquisition in various areas? And so on... Maybe for people who know nothing about computers beyond COD and Angry Birds, discrete GPUs are rare products, but there is a whole world that needs computing power and thus discrete GPUs...


    I love this response. Somehow, in the last ten or so years, computers became defined as a tool used by someone with a sub-100 IQ, as opposed to the original concept of them being a tool of science. Today the idea of using a computer for something other than idiot-speak is ridiculed. I suspect most of those who make these statements fall into the sub-100 IQ crowd.
  • -3 Hide
    madjimms , August 7, 2012 5:07 AM
    Why do they call it "discreet" graphics? Nothing discreet about a standalone card. They should simply call it standalone.