
High time for Intel to get serious about graphics

March 1, 2008 12:15:28 AM

Posted by Tom Krazit

When a high-ranking executive at your strongest partner openly thinks your technology "barely works," perhaps it's time to make that a higher priority.

A series of internal Microsoft e-mails discussing Intel's 915 and 945 integrated graphics chipsets in unfavorable terms made its salacious way around the Internet this week. Microsoft is currently being sued over its Windows Vista upgrade programs, which were designed with pressure from Intel, but over the objections of the PC industry, to include support for a graphics chipset that couldn't run Vista's Aero interface.

In February 2007, just after Vista launched, Microsoft's Steve Sinofsky told CEO Steve Ballmer that the 945 chipset, required for the "Vista Premium Ready" logo, could barely run Vista. And everyone (inside the PC industry, at least) knew the widely used 915 chipset that was awarded the "Vista Capable" logo couldn't even think about running the advanced display driver model used to deliver the fancy Aero interface, considered one of the major selling points of Vista.

Juicy stuff, for sure, but it's old news that Intel and Microsoft have been engaged in "coopetition" for years. The real lesson is just how poorly even Microsoft regards the current state of integrated graphics.

Intel likes to mention that it's the world's leading supplier of graphics technology. The only reason it can claim that mantle, however, is that people like bargains, and the way they get those bargains is through integrated graphics chipsets.

Around 75 percent of the notebooks, and around 60 percent of the desktops, sold last year used integrated graphics chips. The rest use discrete graphics chips made by Nvidia and AMD that offer far more powerful performance for games and video.

The integrated graphics chips, usually thought of as "good-enough graphics," really aren't that good. Intel has had loads of problems with its graphics chipsets and their support for PC games or other intense graphical programs. Most of that software will run, but not in an ideal fashion, and lots of people expect that shiny new PC to be able to run PC games without fits and starts or jerky gameplay.

Intel has put the 915 and 945 chipsets behind it, but challenges remain. It still encountered problems with the release of the 965 chipset, and the G30 series has yet to make it into notebook PCs. This area represents arguably Intel's most glaring weakness at present.

The company has shown it's getting more serious about graphics, hiring more engineers and focusing some of its design prowess on projects like Larrabee. And it tried to take a big step forward in the performance of its 965-series integrated graphics chipsets by adding support for functions like transform and lighting. It had lots of problems delivering drivers for that chipset, however, and when those drivers arrived, they didn't deliver a uniform boost in performance.

Nvidia and AMD are way ahead when it comes to understanding how to build graphics chips. Nvidia has been doing this for years, and AMD recognized the growing importance of graphics when it acquired (for far more than it should have paid, however) ATI Technologies in 2006.

Graphics chips and CPUs like the Core 2 Duo are two very different beasts, but the wholesale embrace of multicore processor designs means that at some point, graphics technology becomes just a core on the main chip. AMD is well underway with planning for its Fusion processor and Nvidia seems to be eyeing broader uses for its high-powered graphics chips.

This is Intel's next great challenge, now that it has thankfully derailed the March of Itanium and soothed the burns from the Netburst architecture.

It needs to somehow get up to speed with Nvidia and the former ATI when it comes to graphics knowledge while keeping an eye on the rest of its business. Intel has found it difficult in recent years to break into new areas, such as flat-screen TVs or cell phones, that have very different processing requirements and architectures than the CPU.

But those other bets were just that, bets. This time, Intel has no choice. Intel can't afford to fall behind as the PC industry changes; it's one thing to swing and miss when trying something new, it's quite another to miss the mark on your home turf.

By the time Windows 7 rolls around, Intel will need to do better than "barely works."

http://www.news.com/8301-13579_3-9883439-37.html
March 1, 2008 2:34:18 PM

I won't hold my breath for Intel on a decent GPU. I will, however, wait for those new CPU/GPUs to release. Now that is going to innovate computing.
March 2, 2008 1:46:49 AM

Why should they? They are mainstream, get the majority of the sales, and make a bunch of money off their graphics chips. Anything else is a loss of money.
March 2, 2008 2:29:46 AM

I still remember the good days of the Intel 810 IGP on my old P3.
That was one of the best IGPs I owned. I could play Episode 1 Racer without a hiccup, when it was just released.
Even if Intel has enough power to surprise us all, I think it will be too hard for them to battle against today's IGPs from Nvidia and ATI. Especially the latter.
March 2, 2008 2:39:40 AM

I heard that Intel is getting serious in the graphics card department. They should focus on what gives them more money.
March 2, 2008 5:46:09 AM

If Intel can get any sort of video chip to match the mid-range cards of the day from ATI or Nvidia (e.g. 8600, HD3850) then they're in business; mid-range/performance budget cards are hot. Picture a 45nm/32nm GPU at ~3GHz, integrated cache, highly overclockable/scalable with proper driver updates, and highly programmable, and we may see a monster in the works...
March 2, 2008 4:11:47 PM

Dear God almighty. This after the fact that at Intel's last technology show they announced and showed info about Larrabee, which is known to be a discrete GPU. Not to mention that their next step for IGPs will be to put them on CPUs, thus giving better performance.

Don't worry, Intel is serious about graphics. And they might get too serious and give Nvidia & ATI a run for their money.
March 2, 2008 5:05:30 PM

jimmysmitty said:
Dear God almighty. This after the fact that at Intel's last technology show they announced and showed info about Larrabee, which is known to be a discrete GPU. Not to mention that their next step for IGPs will be to put them on CPUs, thus giving better performance.

Don't worry, Intel is serious about graphics. And they might get too serious and give Nvidia & ATI a run for their money.


At this point, given Intel's track record on graphics solutions, anything shown of Larrabee should be taken with a huge single crystal of salt. And I'm biased toward Intel. =) Let's wait until they demo something.

I am curious... why do you think that a GPU on-die will outperform an off-die GPU? There are only so many transistors to go around on-die... so the off-die solution should have much more capability. So there's a tradeoff between GPU/CPU communication speed and raw horsepower. I haven't seen any data showing which one will win out.
March 2, 2008 5:17:49 PM

ryman554 said:
I am curious... why do you think that a GPU on-die will outperform an off-die GPU? There are only so many transistors to go around on-die... so the off-die solution should have much more capability. So there's a tradeoff between GPU/CPU communication speed and raw horsepower. I haven't seen any data showing which one will win out.


As far as I can see, on-die IGPs are a way to reduce cost, not increase performance. And they'll only be put on the crappiest CPUs since there's no point wasting transistors on an 'enthusiast' CPU for an IGP that almost no-one will ever use... even if the core has half-decent performance, the shared memory will cripple it at higher resolutions.
March 2, 2008 5:22:51 PM

Exactly, MarkG. Even my lowly 7300LE is better than an IGP.
March 2, 2008 6:59:37 PM

jimmysmitty said:
Dear God almighty. This after the fact that at Intel's last technology show they announced and showed info about Larrabee, which is known to be a discrete GPU. Not to mention that their next step for IGPs will be to put them on CPUs, thus giving better performance.

Don't worry, Intel is serious about graphics. And they might get too serious and give Nvidia & ATI a run for their money.


No way my friend, Intel have got engineering experience by the bucketload, but almost no software engineering experience (graphics drivers). They might be able to build a snazzy graphics card, or a decent mid-range card, but they won't be able to write drivers for it. ATI & Nvidia have been in this game for years, and are light-years ahead of Intel in that respect.
March 2, 2008 7:28:57 PM

Don't worry. With the amount of resources Intel has, it's easier for them to write a graphics driver than to write an explanation of how they used "rebates" to compete... :sarcastic: ...
March 2, 2008 7:46:46 PM

spoonboy said:
No way my friend, Intel have got engineering experience by the bucketload, but almost no software engineering experience (graphics drivers). They might be able to build a snazzy graphics card, or a decent mid-range card, but they won't be able to write drivers for it.


Except:

http://www.theinquirer.net/en/inquirer/news/2006/08/21/...

(OK, it's the Inquirer, but I've heard the same from other sources)
March 2, 2008 8:03:41 PM

Quote:
No way my friend, Intel have got engineering experience by the bucketload, but almost no software engineering experience (graphics drivers). They might be able to build a snazzy graphics card, or a decent mid-range card, but they won't be able to write drivers for it.

That's not really true... Intel do have a track record already of writing drivers for their IGPs, which get bundled into motherboard drivers. Most of the complaints have been that their IGP hardware is weak, not that their drivers or even hardware are incompatible or inefficiently used, although you'll hear some driver complaints with games here and there, just as with ATI/Nvidia.
March 2, 2008 8:30:26 PM

ryman554 said:
At this point, given Intel's track record on graphics solutions, anything shown of Larrabee should be taken with a huge single crystal of salt. And I'm biased toward Intel. =) Let's wait until they demo something.

I am curious... why do you think that a GPU on-die will outperform an off-die GPU? There are only so many transistors to go around on-die... so the off-die solution should have much more capability. So there's a tradeoff between GPU/CPU communication speed and raw horsepower. I haven't seen any data showing which one will win out.


I don't think that a discrete GPU will be beaten by an on-die IGP, but the regular IGPs that used the CPU as their main processing source will not be as good. Plus we have to wait and see. I consider the fact that it will be an actual GPU, and since GPUs have different structures than CPUs it should increase performance.

spoonboy said:
No way my friend, intel have got engineering experience by the bucketload, but almost no software engineering experience (graphics drivers). They might be able to build a snazzy grahics card, or a decent mid-range card, but they wont be able to write drivers for it. ATI & Nvidia have been in this game for years, and are light-years ahead of intel in that respect.


That is true, but don't forget that Intel does have the R&D to do it. I think that if Intel wants to release a nice GPU they will work on the drivers as well. Don't forget they have been making chipset drivers and other drivers that are some of the best, so they could easily create good GPU drivers.

I guess we will have to wait and see. I am excited, as this might push Nvidia and ATI to make even better chips since it will be a threat to their market.
March 2, 2008 9:01:06 PM

Quote:
Imagine an integrated 3870 X2 ahh... the affordability


While that article is not old news, it deals with old news. The e-mails just came out in a class action suit against Microsoft over the Vista Capable and Vista Ready logos on OEM PCs sold with Windows XP in the months prior to Vista's release.

Really nothing new to gamers. We've had examples of developers saying that they hated being forced to get their games to work on the inferior IGPs Intel had available. We've heard those kvetches for years.

Now, Intel has its own nascent GPU division and might put out some very competitive products (which I hope utilize Crossfire, thus making it a new standard). We should see by the end of this year or the first half of 2009.

MarkG said:
As far as I can see, on-die IGPs are a way to reduce cost, not increase performance. And they'll only be put on the crappiest CPUs since there's no point wasting transistors on an 'enthusiast' CPU for an IGP that almost no-one will ever use... even if the core has half-decent performance, the shared memory will cripple it at higher resolutions.


Currently, the 3870 X2 powers down from an 850 to a 300 core clock when not running games. Probably only one GPU is active too. That's one form of power saving mode, but the power saving mode offered by upcoming IGPs and future CPU-integrated IGP cores promises even more.

The only thing that Nvidia's doing right now that I see as innovative is that they're enabling the IGP in all their chipsets, so even enthusiast triple SLI boards will have an IGP capable of power saving mode. That's what the tech is there for: to replace enthusiast GPUs with crappy IGPs only while surfing the net and watching movies and anime.


ryman554 said:
I am curious... why do you think that a GPU on-die will outperform an off-die GPU?


It doesn't have to outperform. It only has to provide a power saving mode for high-end discrete GPUs, and it can provide hybrid SLI or Crossfire for IGP chipset GPUs or low-end discrete GPUs. It's also a good deal in a notebook, where discrete GPUs aren't competition. Keep in mind that more notebooks are sold than desktops, and the enthusiast market is small compared to the budget and mainstream.

As far as it goes, the future for entry-level PCs, notebooks, and even mainstream PCs with hybrid Crossfire is to have a GPU integrated with several CPU cores. AMD's Swift is expected to have a single 4xxx series core with three 45nm Phenom cores. Intel has its own technology in the pipeline for this too.

What I'd like to see are future CrossfireX systems with two discrete cards, on which are two dual-core GPUs (4-core CrossfireX), running alongside a power-saving-mode CPU (when not playing games or doing 3D graphics). The CPU could be 8-core, where one core is a GPU capable of basic 3D for Vista and the internet. That's the enthusiast end. The budget end should be a single Vista-capable GPU core with 3 CPU cores, running in hybrid Crossfire with a chipset IGP and/or a $50 add-in card of similar generation.

Am I the only one to see the value to gamers in power saving mode? Or the value for the mainstream of hybrid Crossfire and SLI?
March 3, 2008 4:30:19 AM

There is an Intel developer forum slated for April 2. So perhaps in about a month, we will have some more information on how they are doing with Larrabee.
March 3, 2008 7:04:13 AM

"multicore processor designs means that at some point, graphics technology becomes just a core on the main chip"

For integrated designs like laptops, we will see this happen; getting a single chip with the functions of CPU, GPU, memory controller, and any other northbridge functions will take a package with a much denser array of pins, at least. Then again, reconsider the northbridge off the CPU package. What if we upgrade that, so we maintain two chips? A full-blown GPU with its own socket that can be pulled out, so you can upgrade your memory controller and PCI Express generation, all while keeping the same motherboard. ATI, Nvidia, Intel, Via, SIS, and Matrox could all make products to pop into this socket if everyone keeps to open platform standards.

Then again, like others mentioned, a card is cheap. If they are still profiting off producing a complete independent package, then as long as someone makes a slot with enough bandwidth/throughput to handle the latest and greatest, everyone is happy. Additionally, it is rare that a single product is remanufactured to keep power use down while maintaining identical speed. If an Athlon 1.4 at 180nm were manufactured at 45nm today, how much power would it require? How much faster could it be ramped up? The same goes for graphics. Say we took a top-of-the-line card of any past generation and shrunk it down to today's manufacturing specs; it would take what, 1/3 of the power it originally required? The speed would be great for a handheld. Certainly it would run XP and play games from a couple of generations ago. Planting a fast GPU on a CPU is only held back by a business plan, demand, and profitability.



March 3, 2008 10:19:47 AM

jimmysmitty said:
I don't think that a discrete GPU will be beaten by an on-die IGP, but the regular IGPs that used the CPU as their main processing source will not be as good. Plus we have to wait and see. I consider the fact that it will be an actual GPU, and since GPUs have different structures than CPUs it should increase performance.

That is true, but don't forget that Intel does have the R&D to do it. I think that if Intel wants to release a nice GPU they will work on the drivers as well. Don't forget they have been making chipset drivers and other drivers that are some of the best, so they could easily create good GPU drivers.

I guess we will have to wait and see. I am excited, as this might push Nvidia and ATI to make even better chips since it will be a threat to their market.


I simplified somewhat (OK, a lot, lol), but what I was driving at is that yes, they do have chipset and IGP drivers, but the IGP units are not on the same level of complexity as add-in graphics boards, so writing drivers for them is correspondingly not as difficult a task. I don't think it's simply a matter of scaling up the IGP architecture, adding new features like DX10 and SM 4.0 support, and modifying drivers to suit to produce a good product; rather, it's altogether far more complicated than that. That's why there are only two real players left in the add-in board market (for gaming, I mean) and why so many other companies have fallen by the wayside in this field over the years.

If drivers were so straightforward a matter, then Nvidia and ATI would have top-performing, near-flawless ones within a handful of revisions after a product's release, which is not the case. You could point to the RV670 and G82 & G84 to say yes, it happens, but their performance and compatibility owe a great deal to driver development in the previous lines (R600 & G80), which was far from unproblematic (Nvidia Vista support, ATI compatibility, general performance and Vista support). Driver writing for add-in cards must be a whole field of expertise in itself, with engineers drawing on vast experience from previous generations: how to get the best performance out of DX9 and OpenGL, and a developing understanding of how to make DX10 work efficiently, not just setting their targets at getting Windows to run. Even if they throw huge gobs of money at it, it's my opinion that Intel are years away from producing anything to rival Nvidia and ATI even at the midrange.
March 4, 2008 5:16:17 AM

spoonboy said:
That's why there are only two real players left in the add-in board market (for gaming, I mean) and why so many other companies have fallen by the wayside in this field over the years. If drivers were so straightforward a matter, then Nvidia and ATI would have top-performing, near-flawless ones within a handful of revisions after a product's release, which is not the case.


What I want to see from Swift is cheaper gaming notebooks, and what I want to see from Larrabee is for Crossfire to become the multiple-GPU standard, including hybrid Crossfire. What I hope to see are discrete Intel GPUs taking on Nvidia in OpenGL (considering who they hired) but losing to ATI in Direct3D (considering who Intel ticked off over Vista).

Competition is good for the marketplace, and even SIS and Via IGPs have a market, so I don't think that discrete Intel cards will be totally worthless, though I hope they aren't good enough to drive Nvidia or ATI out of business (but I hope they take market share away from Nvidia).
March 4, 2008 8:00:54 AM

You guys seem to be missing the point of a GPU core in the CPU.
Take a look at how well the ATI cards do F@H. Today's GPU is more of a GPGPU (general-purpose graphics processing unit). A GPGPU can be configured like an FPGA. If properly coded for, it can run certain types of programs at 2 or 3 times the speed of a CPU. In games, it could be the physics engine, or the AI engine.
In Photoshop, it could do a lot of FP. In film conversion, it could outperform any other core.
The first chipmaker to have one will have the crown (if the software supports it).
March 4, 2008 10:18:38 AM

yipsl said:
What I want to see from Swift is cheaper gaming notebooks, and what I want to see from Larrabee is for Crossfire to become the multiple-GPU standard, including hybrid Crossfire. What I hope to see are discrete Intel GPUs taking on Nvidia in OpenGL (considering who they hired) but losing to ATI in Direct3D (considering who Intel ticked off over Vista).

Competition is good for the marketplace, and even SIS and Via IGPs have a market, so I don't think that discrete Intel cards will be totally worthless, though I hope they aren't good enough to drive Nvidia or ATI out of business (but I hope they take market share away from Nvidia).


Why in the holy hell would we use AMD's Crossfire to start a new graphics interconnect standard? I think Intel is big enough to make one of its own.
March 4, 2008 10:25:17 AM

endyen has a point.

For an integrated system to work, though, it needs a really powerful bus ... in order to make best use of the specialised resources available.

If AMD really capitalised on its fortunate position with regard to HyperTransport ... it is nearly there.

Intel, I imagine, has made sure Nehalem will capitalise on this line of reasoning.




March 4, 2008 11:17:36 AM

I wonder why the stories on Vista vs. the 915 are so focused on the lack of Aero (which, by the way, the 915 may be able to handle just fine, assuming your system has enough RAM; Intel just never wrote drivers to support Aero on the 915). Aero is a gimmick. If people were mainly shelling out hundreds of dollars to do the Vista upgrade just for Aero (which can be simulated using some XP themes), then we have some pretty silly consumers on our hands.

That being said, Intel's graphics division is poor in terms of game performance. However, they're not targeting the high-end graphics market. They go for cheap and mediocre performance, and it sells more graphics chips than Nvidia or ATI. Most laptops only require a cheapo graphics chip. Almost all servers need just the bare minimum to host a terminal (if they need graphics at all). So the market for anything above the performance you get from an Intel IGP is a sizeable chunk of desktops, a small part of mobile, and game consoles.

And if Windows 98 -> XP and XP -> Vista are any indication of Microsoft's transitions, systems will need faster processors and loads more memory, not better graphics processors than Intel's normal development path provides. Plus, I highly doubt emerging markets (where a ton of the recent growth has been) are desiring the latest graphics awesomeness :)
March 4, 2008 11:23:37 AM

Aero ... how apt ... just like the chocolate bar ... full of bubbles of nothing.

March 6, 2008 12:31:28 PM

Reynod said:
Aero ... how apt ... just like the chocolate bar ... full of bubbles of nothing.


Then don't use it.

Word, Playa.
March 6, 2008 3:15:23 PM

I would ask the microsoft guy why an OS is such a bloated hardware hog? If running an OS "right" requires a $1500 PC, the problem is with the OS not the hardware.