
Financial Analysts Say Intel Killed the Discrete Graphics Card

Source: MarketWatch | 121 comments

With the introduction of Ivy Bridge, Intel has certainly created some tension between integrated graphics, "good enough" graphics, and discrete graphics cards.

The view of the financial community has been generally very positive, which is reflected in Intel's stock price: it is currently trading in the $28 neighborhood and hitting 7-year highs. Over the past 12 months, Intel's market cap has increased from about $110 billion to about $142 billion.

A financial briefing on Intel released today by analyst firm Five Star Equities echoes that sentiment and indicates that the financial community has, for the first time in more than 5 years, high expectations for Intel's opportunity in the chip market. Five Star Equities states that Ivy Bridge essentially kills the discrete graphics card because the CPU's integrated graphics would be good enough for 95 percent of computer users.

"There is a very small market of people who seek out high-performance graphics cards, mostly comprised of hardcore gamers," the report reads. "The improved graphics provided by the Ivy Bridge chips will likely satisfy the needs of the average consumer."

The report also quotes industry analyst Jack Gold, who said that "extreme gamers who want very powerful graphics cards are in a niche market already, and it's shrinking." Gold went on to note that Nvidia may be in trouble "because their graphics chip market is falling off faster than their mobile chip market."

Comments
  • 61 Hide
    southernshark , May 1, 2012 2:03 PM
    Financial Analysts are overpaid.
  • 45 Hide
    jacobdrj , May 1, 2012 2:07 PM
    Brazos? Trinity? Fusion?
  • 45 Hide
    aaron88_7 , May 1, 2012 2:08 PM
    This all makes sense, but I still think a computer looks empty without a massive GPU inside :) 
  • 46 Hide
    snotling , May 1, 2012 2:08 PM
    Financial analysts need discrete graphics cards for their multi-display setups LOL.
    at least the serious ones do!
  • 47 Hide
    Onus , May 1, 2012 2:09 PM
    That 95% of the consumer market also excludes all those who buy professional graphics cards for 2D work. The "average consumer" who watches movies, views Flash-based content, and plays Solitaire can get by with AMD's HD4250 or Intel's HD2000. That's been true for over a year now, and the gaming market is still boisterous enough to have AMD and nVidia developing and releasing new cards.
    I smell some bovine fecal material here...
  • 21 Hide
    Anonymous , May 1, 2012 2:10 PM
    Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?
  • 45 Hide
    Anonymous , May 1, 2012 2:10 PM
    This computing sector expert then went on to opine about the implications of next-generation games like Angry Birds Space.

    What an idiot.
  • 39 Hide
    trumpeter1994 , May 1, 2012 2:11 PM
    Quote:
    Five Star Equities states that Ivy Bridge essentially kills the discrete graphics card because the integrated graphics would be good enough for 95 percent of computer users.


    Hmmmmmm, I guess I'm part of the five percent.

    I bet they also think that 95 percent of computer users believe that OS X is immune to malware.
  • 30 Hide
    Pyree , May 1, 2012 2:13 PM
    Not really. You don't need to be a hardcore PC gamer to need a discrete card. You just need to be a PC gamer, as most modern games will not run well on IB's IGP.

    Let's face it: if you don't game or do any GPU-intensive task, you will not get a discrete GPU. This was true before IB's HD 4000 and is true after it. I don't see how that distribution changes, so how exactly will HD 4000 kill the discrete GPU market?
  • 31 Hide
    Goldengoose , May 1, 2012 2:13 PM
    Great, we've just got over "desktops are dying!" and now it's changed to "standalone graphics cards are dying! Integrated is the way to go!"

    As long as someone wants to buy, someone will sell. End of.
  • 37 Hide
    830hobbes , May 1, 2012 2:14 PM
    How has Intel killed discrete graphics when AMD's Fusion processors still have significantly better graphics than HD4000? Shouldn't it be "AMD killed discrete graphics"?

    I'm not even on the AMD bandwagon. Intel makes better processors right now. I just don't understand how it's HD4000 that "killed discrete graphics" (which aren't even close to dead anyway).
  • 24 Hide
    830hobbes , May 1, 2012 2:17 PM
    Also, it's not like developers wouldn't use more powerful graphics power if they could. If anything is killing discrete graphics, it's that they have to develop for outdated console hardware that integrated graphics has caught up to.
  • -7 Hide
    proxy711 , May 1, 2012 2:20 PM
    Quote:
    Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?

    Huh? Intel tried multiple times to get into the discrete graphics market. After spending millions on Larrabee and its delays, they realized releasing Larrabee a generation too late was a bad idea and scrapped it.

    I don't see canceling a multi-million-dollar project they heavily promoted as a good way to pick their battles. If they were only picking battles they could win, they wouldn't have even tried to make Larrabee.
  • 25 Hide
    Anonymous , May 1, 2012 2:30 PM
    Why post this article? NVIDIA and AMD will be making cards as long as people are buying laptops, desktops, and servers. A standalone GPU will always be bigger, stronger, faster than a dinky integrated GPU.

    In other news, electric cars killed all gas guzzlers.
  • 26 Hide
    brickman , May 1, 2012 2:31 PM
    That title gave me a good laugh. What do financial analysts know about computers? Other than using financial programs, and they're not even good at that.
  • 20 Hide
    Marfig , May 1, 2012 2:38 PM
    Quote:
    brickman: That title gave me a good laugh. What do financial analysts know about computers? Other than using financial programs, and they're not even good at that.


    Indeed. Citing financial analysts to predict technological change doesn't even make sense. It's like asking a butcher to tell us all about fashion designers.

    There is, however, a tendency in the press to pay attention to, or actually solicit, the opinions of people not qualified to give them, financial analysts being one of the glaring examples. And this one just makes it evident why we shouldn't listen.
  • 6 Hide
    omnimodis78 , May 1, 2012 2:43 PM
    I've been hearing and reading about the death of the dedicated graphics card for over a decade now. It's getting old. Heck, I remember reading how the PC would be dead in a few years (that was around the time of the smartphone revolution), and I'm pretty sure that both the GPU market and the PC market are doing just fine. Shrinking, sure, but market demand for any and all products shrinks and grows in a cyclical pattern. If Intel had been successful with Larrabee, I'm sure we'd be listening to a different analyst with different opinions...
  • 7 Hide
    bavman , May 1, 2012 2:48 PM
    This title made me lol. Maybe for the average consumer who watches movies and browses the internet on their PC, but the gaming segment will always require power that integrated graphics can't provide.
  • 9 Hide
    hetneo , May 1, 2012 2:49 PM
    Well, this guy would be correct if people bought PCs on a "good enough" criterion. Fortunately for GPU makers, people still tend to spend $2000+ on a PC they will use to send the occasional email and play Solitaire. Intel could kill the GPU market if it cared enough to make decent drivers and if people were content to spend $500-$750 on a PC that is good enough for what they need.