
Intel: Integrated Graphics is Where It's At

Source: Tom's Hardware US | 87 comments

Intel is telling developers to consider integrated graphics solutions. Yes, really.

We love our graphics hardware. The offerings from AMD and Nvidia are always interesting to track from one generation to the next, each offering a significant jump from the previous to give us prettier, smoother games.

So what is Intel going on about when it says developers have good reason to be looking at integrated graphics? Well, for one, Intel is currently the biggest vendor of graphics parts, outpacing anything from Nvidia or AMD -- and that’s completely thanks to the IGPs that come with the Intel chipsets.

As first reported by CNet, in a recent video posted on Intel's site, Aaron Davies, a senior marketing manager in Intel's Visual Computing Software Development group, explained why game developers should be paying close attention to integrated graphics. "Here's your answer: Mercury Research showed that in 2008, for the first time, integrated graphics chipsets outsold discrete (graphics chips), and in 2013, we expect to see integrated graphics chipsets outsell discrete by three to one," Davies said.

Davies reiterated that Intel wants to help developers capture where it believes the market is heading when it comes to mobile gaming and integrated graphics. He noted that with laptop sales surpassing desktop sales in 2008, gaming on integrated graphics is more important than ever.

Intel believes that developers now have a big business opportunity in targeting integrated graphics, which is essentially catering to the lowest common denominator. One thing is for certain: integrated (and eventually embedded) graphics will become even more common with Intel's Westmere-based Core i7 processors and the Pineview-based Atom chips.

What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?

Comments
  • 18 Hide
    Anonymous , April 6, 2009 10:49 PM
    LOL, pathetic. Development is relative to the power of the hardware available, hence L337 3d games didn't exist for the 286. I guess Intel wants gaming systems dumbed-down to 1990s standards to make up for their shortcomings. I think they figured out that Larrabee is gonna flop...
  • -8 Hide
    zerapio , April 6, 2009 10:49 PM
    I think this can trigger the beginning of the end of the need but not the use for high-end discrete graphics cards. If the market for high end graphics cards shrinks enough I could see developers dropping support. Something like what happened to the sound card market.
  • 27 Hide
    roofus , April 6, 2009 10:51 PM
    A TNT card could mop the floor vs Intel integrated graphics and they really expect developers to swallow this swill they are peddling?? Intel, you may be the biggest but your trying to impose your will where you have never proven yourself capable.
  • 12 Hide
    rantarave , April 6, 2009 10:53 PM
    there will always be a market for fast gaming PC's and 500 dollar graphics cards especially now with people doing more with their PC's (picture/video editing format conversions)

    i think we are going to see a spread

    people with exreme low end (intergrated)

    and people with 200+ graphics cards
  • 30 Hide
    brendano257 , April 6, 2009 10:53 PM
    This is sad, Intel might as well say "Please make your games look like crap so they run on our integrated cards" So much for high end graphics if normal users don't appreciate them...
  • 13 Hide
    Hatecrime69 , April 6, 2009 11:03 PM
    I personally read this as: 'why not to bother with larabee' myself..If developers seriously considered intel chipset graphics in their games then quake 3 would still be considered 'high end' graphics
  • 10 Hide
    mindless728 , April 6, 2009 11:14 PM
    instead of developers, they should tell their customers to play games that predate the IGP by 10 years
  • 6 Hide
    engrpiman , April 6, 2009 11:14 PM
    Hatecrime69I personally read this as: 'why not to bother with larabee' myself..If developers seriously considered intel chipset graphics in their games then quake 3 would still be considered 'high end' graphics


    Quake 3 is fun and look at quake live. if developers stopped trying to Pump more graphics and started to pump more fun we might have better games.
  • 2 Hide
    mindless728 , April 6, 2009 11:15 PM
    instead of telling developers to tone down the games, they should be telling their customers to play games that predate the IGP by 10 years or so
  • 1 Hide
    hercules , April 6, 2009 11:21 PM
    woot quake 3 rocked my world... back 1999... I am not to concerned having integrated graphics is a set back in computing... yes making things smaller and smaller seems to be the way to go but sacrificing so much power and ability isn't going to float well they will see this soon enough.
  • 13 Hide
    Mitrovarr , April 6, 2009 11:25 PM
    Intel really shouldn't be bragging about a graphics solution that is bought exclusively by those who don't care about performance. Sure, they sell the most chips, but it's only because most people don't care at all about their 3d acceleration. Anyone who cares gets something better, even if that means an integrated Nvidia or ATI chip.

    Nothing is killing the PC gaming market more than the fact that one of the most common types of PC sold is the cheap Intel-based laptop - a computer that not only can't play games, it can't ever be upgraded to do so. Intel is making a dire mistake by pushing these things on consumers at all. If the entry level computer cannot play games or ever be upgraded to do so, the number of people entering PC gaming will dwindle, and they won't progress to buying high-end 'gaming' processors later.
  • 5 Hide
    falchard , April 6, 2009 11:28 PM
    Yes developers should make more 2D isometric games, becuase that is the standard intel has towards integrated graphics. If they really wanted to go integrated, then they should target a real integrated chip from AMD or nVidia. Integrated from Intel is a joke.
  • 12 Hide
    lexspecialis , April 6, 2009 11:31 PM
    Seriously, Intel should just stick to making processors, because their graphic solutions sucks.
  • 1 Hide
    Dave K , April 6, 2009 11:51 PM
    Which would you rather be selling?

    A) 10 million low end IGP chipsets at an average price around $10.00 (if not less)

    B) 2 million high end boards at an average price of $200.00 each?

    "Lies, Damn Lies, and Statistics"
  • -1 Hide
    SneakySnake , April 6, 2009 11:52 PM
    *cough* *choke* *gag*

    please don't ever post an article about integrated graphics again
  • 8 Hide
    warezme , April 6, 2009 11:58 PM
    the only reason there is a "LOWEST COMMON DENOMINATOR" is because of Intels CRAPPY IGP video. They are the lowest common denominator. The lead(pronounced LED, Pb) in the video industries ass.
  • 4 Hide
    Anonymous , April 7, 2009 12:32 AM
    Intel is talking rubbish. Most games will run fine on good laptops with low power discrete graphics chips. My Lenovo runs FEAR happliy on low setting thanks to it's nVidia 7300. And as for the Atom/netbook, nVidia's Ion has been demonstrated to play CoD4 and others.

    Also all the new chipsets from AMD and nVidia feature integrated DX10 GPUs.

    Intel just can't admit nVidia and AMD are stealing all their low power gaming market, since anyone who casually/seriously plays the newest games will ensure they are running a AMD or nVidia GPU of some form.

    If Intel want to stay in the race they need to spend some serious money and bring out a decent floating point pipelined GPU (not some Pentium 1 multicore crap ie Larrabee).
  • 1 Hide
    radguy , April 7, 2009 12:39 AM
    "What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?"
    NO
  • 5 Hide
    ravenware , April 7, 2009 12:40 AM
    Quote:
    What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?


    No and I am not sure if that is the point that Intel is stressing here.

    The majority of the computing world doesn't need discrete gpu cards, so obviously integrated will sell higher. This also means there is a huge target audience to run games that aren't too demanding on integrated hardware.

    Intel is trying to convince developers to develop lower end capable games since the hardware exists to do so.

    I think this is a good idea for developers. I have installed some old school games on my laptop that I was never able to play before, but they run well on my integrated hardware. Unreal is still fun many years later no matter how dated the graphics look.
  • 0 Hide
    Joe_The_Dragon , April 7, 2009 12:46 AM
    It's said when desktop cards running at X1 pci-e speed are faster then intel video.

    It's said that amd and nvidia boards at the about same price have much better on board video.

    Why can't intel have 64-128 side port ram like amd?