
GeForce GPU Will be Inside 200 Sandy Bridge PCs

Source: Tom's Hardware US | 40 comments

Even with integrated graphics, there are still lots of discrete graphics!

There's soon going to be a new Intel kid on the block, and her name is Sandy Bridge. She's the next generation of Intel's flagship Core architecture, and she'll be the star on stage at Intel's CES 2011 showing.

One of the big things about Sandy Bridge is its integration of a GPU directly on the same die as the CPU. While its graphics prowess is Intel's best yet, it's not enough to elbow out discrete graphics makers like Nvidia. In fact, Nvidia is boasting that it has scored 200 design wins that will pair its GeForce GPUs with next-generation Core systems.

Part of that is due to the integrated Intel graphics not being a DirectX 11 part. For full DX11 support, OEMs have to turn to graphics solutions from Nvidia (or AMD).

"With the explosion in digital content and entertainment, it's no wonder that consumers love GeForce notebooks," said Jeff Fisher, vice president of the GeForce business unit at Nvidia. "Our momentum reflects the overwhelming need for a better PC experience."

The designs are expected to span a variety of notebook and desktop systems from leading OEMs including Acer, Alienware, ASUS, Dell, Fujitsu-Siemens, HP, Lenovo, Samsung, Sony, Toshiba and more.

We expect CES next month to be packed with Sandy Bridge computers.
  • 1 | joytech22, December 21, 2010 2:05 AM
    About time! The 360 has been doing this for years, and now it's finally hitting PCs too.

    But you can bet your ass these solutions will probably cost *something stupid here*
  • -1 | aznshinobi, December 21, 2010 2:41 AM
    ^ They already have; they were just testing it out. Intel's i5s and, I think, i3s already have an iGPU. AMD also proved they could do it; they just don't have it on a desktop CPU. Their laptop processors do have an iGPU, however.

  • 8 | warfart1, December 21, 2010 3:55 AM
    Quote:
    They already have; they were just testing it out. Intel's i5s and, I think, i3s already have an iGPU. AMD also proved they could do it; they just don't have it on a desktop CPU. Their laptop processors do have an iGPU, however.

    The previous generation of Intel processors with integrated graphics had them on package, not on die. I believe that the CPU cores were 32nm while the graphics cores were 45nm.
  • 1 | Blessedman, December 21, 2010 4:26 AM
    I can understand this design for a business-type drone, but doesn't their entire new lineup have this? I bet Nvidia is laughing at this decision. I can see putting a Larrabee-type GPU on die, so you can have decent graphics plus an amazing co-processor, but why waste the space on ... junk?
  • 1 | toxxel, December 21, 2010 4:29 AM
    I personally won't even think of buying a CPU with integrated graphics, or even use it. I could only see something like this being popular in mobile systems, not desktops. I build a desktop to have the customization to pick and upgrade what I want; integrated graphics disallows that option. More of a casual user if you would find it in a desktop, or someone who could care less.
  • 1 | mtyermom, December 21, 2010 5:43 AM
    toxxel: More of a casual user if you would find it in a desktop, or someone who could care less.


    And there are a LOT more of those desktops than ones like ours.
  • 2 | phatboe, December 21, 2010 5:54 AM
    I agree with toxxel. While I like the idea of an APU/Fusion/integrated GFX core for laptops, I will probably never use one on a desktop computer, as I will always have a dedicated card. It sucks that both AMD and Intel seem set on giving all their CPUs a GFX core when that die space could be used for an extra CPU core, more L3 cache, etc. And seeing as how I mostly stick to Nvidia cards, I doubt either AMD or Intel will make it so that I can pair the GFX abilities on the APU with the dedicated card (like some kind of hybrid SLI/CrossFireX).
  • 1 | Nintendork, December 21, 2010 6:16 AM
    If you read the Fusion articles (on other websites), there's the possibility that the APU can be used to offload physics calculations in games.

    For now, AMD will provide regular desktops without an IGP on die (Zambezi: 8, 6, and 4 cores) and Fusion Llano (based on improved Deneb cores), updated later to Fusion Trinity (Bulldozer cores). So if you don't like having a "useless" IGP, then wait for Zambezi.
  • 0 | juliom, December 21, 2010 7:00 AM
    Is this news or just free publicity?
  • 1 | scrumworks, December 21, 2010 7:01 AM
    Nvidia GPU inside 200 Sandy Bridge PCs? Well that's not much. I bet AMD GPU will be inside hundreds of thousands of Sandy Bridge PCs.
  • 0 | madevil59, December 21, 2010 7:10 AM
    mtyermom: And there are a LOT more of those desktops than ones like ours.


    If you add in a video card, it will disable the GPU in the Intel chip. The i-series was built with a GPU in it. This is just one of the first times it has caught enough of the media's attention for people to read into it.
  • -1 | joytech22, December 21, 2010 7:25 AM
    I just realized that if Intel CPUs had a powerful enough IGP from Nvidia, we'd get free PhysX calculations!
  • 1 | RazberyBandit, December 21, 2010 7:37 AM
    These 200 PCs could just as easily have Radeon GPUs in them.

    What I've managed to take from this is Nvidia has somehow partnered-up with some major PC manufacturers to ensure their GPUs are featured alongside the Sandy Bridge CPU release, which is expected to be a major attraction at CES. Nvidia is sort of hitching a ride on Intel's coattails. It's simply clever marketing and product exposure.
  • 0 | Silmarunya, December 21, 2010 8:00 AM
    phatboe: I agree with toxxel. While I like the idea of an APU/Fusion/integrated GFX core for laptops, I will probably never use one on a desktop computer, as I will always have a dedicated card. It sucks that both AMD and Intel seem set on giving all their CPUs a GFX core when that die space could be used for an extra CPU core, more L3 cache, etc. And seeing as how I mostly stick to Nvidia cards, I doubt either AMD or Intel will make it so that I can pair the GFX abilities on the APU with the dedicated card (like some kind of hybrid SLI/CrossFireX).


    1) The die size 'wasted' on the IGP is not wasted. Today, many calculations can be offloaded to the GPU. If that trend continues, the IGP would in effect be an extra processor core that only takes effect in computationally intensive applications. That's not a waste at all...

    2) You might not use an IGP, but a majority of the market does - with good reason. Modern IGP's can do all the things Average Joe asks of them (video playback, web browsing, casual gaming and word processing) quite well.

    3) Intel has nothing to gain from strong-arming Nvidia GPUs out of the market. First and foremost, it would invite massive antitrust scrutiny, and second, it would mean they'd effectively cede the GPU market to arch-rival AMD.

    FYI, IGPs can already be used in Crossfire/SLI with discrete GPUs (Hybrid SLI/Crossfire, anyone?). However, only the lowest of the low-end GPUs see a noticeable performance benefit; even the 5450 fails to post significant gains. After all, IGPs are terribly weak and their Crossfire/SLI scaling is absolutely pathetic.
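    The scaling argument above is easy to sanity-check with arithmetic. The sketch below is an editorial illustration, not from the thread; the frame rates and the `scaling` factor are made-up assumptions, modeling ideal alternate-frame rendering, where the combined frame rate is at best the sum of what each GPU delivers on its own:

```python
# Back-of-the-envelope model of hybrid Crossfire/SLI (IGP + discrete card).
# Under ideal alternate-frame rendering, combined fps <= igp_fps + discrete_fps.

def hybrid_afr_gain(igp_fps, discrete_fps, scaling=1.0):
    """Best-case percent gain over the discrete card alone.

    scaling (0..1) models how well the pairing actually scales;
    real hybrid setups fall well short of 1.0.
    """
    combined = discrete_fps + scaling * igp_fps
    return 100.0 * (combined - discrete_fps) / discrete_fps

# A ~10 fps IGP next to a 30 fps low-end card: a visible best-case boost.
low_end_gain = hybrid_afr_gain(10, 30)                 # about 33% in the ideal case
# The same IGP next to a 100 fps card, with realistic 50% scaling:
mid_range_gain = hybrid_afr_gain(10, 100, scaling=0.5) # only 5%
print(low_end_gain, mid_range_gain)
```

    The same ~10 fps contribution that lifts a bottom-tier card by a third barely registers next to a mid-range card, which matches the observation that even a 5450 fails to post significant gains.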
  • 0 | marraco, December 21, 2010 8:14 AM
    I hope that processors without an integrated GPU will be cheaper, more overclockable (because of fewer transistors), or more powerful (same reason).

    I do not want to waste money and energy on unneeded transistors, unless Nvidia and AMD manage to add up the power of integrated chipsets and discrete cards. They should work on that, since it would give a for-free performance advantage. But I suspect it will not work without DirectX support.
  • -2 | dEAne, December 21, 2010 8:36 AM
    I can't wait for the reviews of this new chip. I hope the price is affordable too.
  • 1 | woshitudou, December 21, 2010 10:50 AM
    I like Nvidia cards, but the company is the Ryan Seacrest of computing (because they're douches and will backstab you for a promotion). They made a dedicated Intel-hate site and said DX11 means nothing when they didn't have DX11 yet (http://www.tomshardware.com/news/Nvidia-GPGPU-GPU-DirectX-ATI,8687.html), and now they've flip-flopped.
  • 2 | tommysch, December 21, 2010 11:22 AM
    toxxel: I personally won't even think of buying a CPU with integrated graphics, or even use it. I could only see something like this being popular in mobile systems, not desktops. I build a desktop to have the customization to pick and upgrade what I want; integrated graphics disallows that option. More of a casual user if you would find it in a desktop, or someone who could care less.


    Well, you will have to... All the top-end unlocked Sandy Bridges will come with that PoS integrated space-taking crap. It will be disabled, but it will be there, taking precious die space and costing money for absolutely no reason. Even the Core i7-2600K... What's the point of putting such crap on a K model?
  • -2 | digiex, December 21, 2010 12:58 PM
    Cool Intel Processors, Hot Nvidia Graphics, it makes sense.
  • 1 | phatboe, December 21, 2010 1:16 PM
    Quote:
    1) The die size 'wasted' on the IGP is not wasted. Today, many calculations can be offloaded to the GPU. If that trend continues, the IGP would in effect be an extra processor core that only takes effect in computationally intensive applications. That's not a waste at all...


    The die space is wasted. You do realize that on Sandy Bridge, once a discrete GFX card is in use, the IGP on the CPU will simply shut down? So no, it will not be used for offloading calculations; the discrete card will have to take care of that.

    Quote:
    2) You might not use an IGP, but a majority of the market does - with good reason. Modern IGP's can do all the things Average Joe asks of them (video playback, web browsing, casual gaming and word processing) quite well.
    Yeah, I know and agree, but I would hope they'd make a version of Sandy Bridge without the IGP for those few of us who don't want it.

    Quote:
    3) Intel has nothing to gain from strong-arming Nvidia GPUs out of the market. First and foremost, it would invite massive antitrust scrutiny, and second, it would mean they'd effectively cede the GPU market to arch-rival AMD.
    Intel has everything to gain. Intel has already dealt with the antitrust issue for the most part, and it is unlikely that Intel will be brought up again just for integrating its GPU into the CPU.

    As far as ceding to AMD goes, you may have a point, but Intel is still working on Larrabee (they might not have released it, but Intel is still designing a discrete GFX card, despite claiming they won't release it as a discrete card anytime soon).

    Quote:
    FYI, IGPs can already be used in Crossfire/SLI with discrete GPUs (Hybrid SLI/Crossfire, anyone?). However, only the lowest of the low-end GPUs see a noticeable performance benefit; even the 5450 fails to post significant gains. After all, IGPs are terribly weak and their Crossfire/SLI scaling is absolutely pathetic.
    There is no hybrid Crossfire/SLI with Sandy Bridge; as I said before, the IGP will be disabled.