
Onboard Integrated Graphics Gone by 2013

March 5, 2009 10:00:32 PM

I'll be glad to see the IGP be gone. All they do is give users barely adequate performance that ignorant users will blame on the OS.
Score
-1
March 5, 2009 10:05:40 PM

They will only disappear when CPUs can render HD content at a reasonable rate.

With AMD and nVidia stepping up the performance on their onboard chips we may see them stay around much longer as they are cost effective and efficient at HD playback.
Score
5
March 5, 2009 10:18:27 PM

You mean all the IGPs on all my motherboards will disappear in 2013? omg.. omg this is terrible, what's next? CPUs will disappear? Our children? The Earth?

No, seriously though, good news. IGPs are decent, but technology is booming and they are not; they're simply falling behind, restricted by size, heat, etc. They can never be strong enough.

RIP IGP, you did good for the past 20+ years. RIP.
Score
0
March 5, 2009 10:25:18 PM

Sounds like this is just moving the IGP onto the CPU. This is kinda shitty since now enthusiasts will have to pay for the graphics on the CPU instead of just buying a MB without an IGP. Even with the ability to run discrete and onboard graphics together, this could dramatically raise temperatures and hinder overclocking.
Score
4
Anonymous
March 5, 2009 10:42:51 PM

If GPU cores are integrated into the CPU, won't heat output rise a lot...?
Score
3
March 5, 2009 10:55:45 PM

Aside from laptops, netbooks or nettops where there is limited space, the only reason I can see for either IGP or embedded graphics is just in case the discrete graphics card isn't working. Even then, it's "just in case" and is mostly restricted to when building a system or upgrading the graphics card.
Score
1
March 5, 2009 11:16:02 PM

Saw this article over at AnandTech too. I thought it was a crock then and I still do. If anything, IGPs are more popular now than before, especially for HTPCs. Just how good are the graphics of a CPU-embedded GPU?
We don't know.
Well, anyway, my opinion is that we'll still see IGPs for a while. Personally, I'm gonna use an HD 4670 for my HTPC, 'cause I'll actually play games on it.
Score
2
Anonymous
March 5, 2009 11:39:28 PM

I think onboard/on-chip graphics are good for mobiles, UMDs, mini-notebooks, laptops, and office PCs.

They're not for gaming PCs.
But perhaps there will be some form of SLI/RAID pairing a graphics card with an onboard chip!
Perhaps your setup could benefit from an onboard graphics processor helping the PCIe graphics card get more traction in some games!

I don't think this is negative, since the on-chip/on-die graphics solutions will be able to be disabled, and the extra metal could actually help cool the CPU when not in use (larger surface area, with relatively less power draw per unit area)!
I think cards like Radeons will be around for many more years. I think only onboard graphics solutions like a GPU on the mobo will disappear.
Either way, the CPU will require only a slightly larger cooler.
Score
1
March 5, 2009 11:44:35 PM

I can see CPU shopping getting a lot more complicated, because you will have to compare GPU specs along with the existing CPU specs. With hybrid graphics it may very well affect add-on video card performance. On the other hand, the parallel processing capabilities of the GPU could become available as a standard feature.
Score
2
March 6, 2009 1:30:28 AM

Well, when an IGP can display HD and Aero without massive lag at decent resolutions, I'll consider shoving it all on one die.
Score
2
March 6, 2009 3:20:50 AM

IzzyCraft: Well when IGP can display HD and aero without massive lag on decent resolutions i'll consider shoving it all on one die.


Yes! That's exactly what I was thinking.
This industry has a habit of over-promising and under-delivering.
Score
0
March 6, 2009 4:27:29 AM

gm0n3y: Sounds like this is just moving the IGP onto the CPU. This is kinda shitty since now enthusiasts will have to pay for the graphics on the CPU instead of just buying a MB without an IGP. Even with the ability to run discrete and onboard graphics together, this could dramatically raise temperatures and hinder overclocking.

Just as there are northbridges without graphics (actually with the IGP disabled), there will be enthusiast CPUs without graphics. What made you think otherwise?
Score
-1
March 6, 2009 4:31:07 AM

idonthaveausernamelskdjflsfsd: If GPU cores are integrated into the CPU, won't heat output rise a lot...?

The idea is that IGPs will move to the same process node as the CPU and therefore get a size reduction, which will alleviate the heat issues. IGPs are currently manufactured on the next-to-latest process node.
Score
-1
March 6, 2009 5:29:53 AM

gm0n3y: Sounds like this is just moving the IGP onto the CPU. This is kinda shitty since now enthusiasts will have to pay for the graphics on the CPU instead of just buying a MB without an IGP. Even with the ability to run discrete and onboard graphics together, this could dramatically raise temperatures and hinder overclocking.

You must be thinking of the GMA 950, with only 25 million transistors. (Of course that will suck.)
I think (hope) putting the CPU and IGP on the same die will actually increase performance. There is an opportunity for ultra-high-bandwidth communication between the two chips on the same die. More so if they are on the same chip.
Think of it like this: if you can put a quad-core Nehalem (731 million transistors) and an ATI R700 (956 million transistors) on one die, then some GRAM, that will be a monster machine already. I think it's all "possible" with the
Score
1
March 6, 2009 8:35:45 AM

eddieroolz: I'll be glad to see the IGP be gone. All they do is give users barely adequate performance that ignorant users will blame on the OS.

Generally I'd agree, but a student and I (IT staff) had to conclude a couple of weeks ago that an IGP might be sufficient even for some less mundane things than word-processing work.
We were testing rendering speed on some HP dx2300 machines equipped with an E2180 and 2 GB of DDR2-667 memory. We found that the difference in rendering a bike lift going from fully closed to fully open was almost non-existent: in Direct3D mode, Inventor 2009 took 2 minutes and 1 second using the onboard IGP provided by the i945-based board, and only 1 second less using a 6600GT card. We repeated this test several times with no difference. OpenGL was even slower on the 6600GT than Direct3D was on the IGP.

Obviously the blame is to be put on Autodesk for not working properly with gaming cards even when the features are available and active, but the point is that even some serious work can be done on an IGP if the processor is up to the task. 'Normal' people won't even know how to check whether their system uses the GPU or the CPU for those tasks anyway, so they'll probably see the upgrade to a discrete card as a waste. Unless they're a gamer, of course.
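To put those numbers in perspective, the 6600GT's one-second advantage on a two-minute render works out to well under one percent. A quick back-of-the-envelope sketch in Python, using only the times reported above:

```python
# Back-of-the-envelope comparison of the render times measured above
# (bike lift render, Direct3D mode, Inventor 2009 on the dx2300s).
igp_seconds = 2 * 60 + 1          # i945 onboard IGP: 2 minutes 1 second
gt6600_seconds = igp_seconds - 1  # 6600GT: one second less

speedup = igp_seconds / gt6600_seconds
advantage_pct = (speedup - 1) * 100
print(f"6600GT speedup: {speedup:.4f}x ({advantage_pct:.2f}% faster)")
# prints "6600GT speedup: 1.0083x (0.83% faster)"
```

In other words, for this particular workload the discrete card bought essentially nothing.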
Score
2
March 6, 2009 10:01:18 AM

I agree with you prodigit.

You don't really need 3D graphics in the office environment when you're primarily using e-mail, the internet, and Office apps.

I do think it's funny that the IGP is "gone" when it's essentially just being replaced by an integrated solution coupled with another function.
Score
2
March 6, 2009 11:30:42 AM

gm0n3y: Sounds like this is just moving the IGP onto the CPU. This is kinda shitty since now enthusiasts will have to pay for the graphics on the CPU instead of just buying a MB without an IGP. Even with the ability to run discrete and onboard graphics together, this could dramatically raise temperatures and hinder overclocking.


Well, manufacturers are likely to continue to sell CPUs without on-board graphics, just like they do with motherboards. That will cater to the enthusiasts / gamers who don't want it at all.

But later, I think that at least some kind of minimal graphics capability will always be on-die. By then, the manufacturing process will be good enough to make the heat issues minimal, even if you don't configure it to power off when a discrete board is present.

That way, even if your discrete graphics card blows up, you can at least continue working.

You really don't need your GTX 285 to write a document in Word or browse the Internet. Even to play old games like Unreal and Unreal 2 you only need a GeForce 2 MX.

In my opinion this move makes a lot of sense, especially with Intel providing it in the form of Larrabee and AMD in the form of Fusion. This could save a lot of energy, because they could apply their SpeedStep and Cool'n'Quiet to minimize heat and power consumption. As of today, discrete graphics cards are only beginning to apply these technologies. Why on earth isn't a GTX 285 or 295 able to shut down part of its RAM, part of its streaming processors, and so on, and just keep the equivalent of a GeForce 6200 active? That would be fine for most people browsing the net, word processing, and watching DVDs. Then it could up the power to something like an 8400GS for Blu-ray. Full power would only be available for intensive tasks like gaming, video encoding and so on.
Score
0
March 6, 2009 12:43:24 PM

gm0n3y: Sounds like this is just moving the IGP onto the CPU. This is kinda shitty since now enthusiasts will have to pay for the graphics on the CPU instead of just buying a MB without an IGP. Even with the ability to run discrete and onboard graphics together, this could dramatically raise temperatures and hinder overclocking.


I heard the exact same argument when sound/networking/video moved to the motherboard, but it turns out enthusiasts like saving money and time on parts they are not so enthusiastic about, just like the rest of the world.
Score
-1
March 6, 2009 12:49:29 PM

I have to say I won't miss them IF they do disappear. I used to exclusively look for and purchase motherboard models that came without IGPs, which used to be really easy because most did not have them. Later on, though, finding something like a business-class motherboard without onboard graphics became pretty much impossible.

Someone earlier posted that they did good for the past 20+ years? Have they really been around that long? It seems to me they didn't really become a selling point until about 7 years ago. I'm really guessing, but I remember it was towards the end of the PIII days and the beginning of the P4. Finding a P4 board without onboard graphics could be a pain, like ASUS's P4GE-V: that V was for onboard video. Maybe this was more on the Intel chipset side; I know the later high-end nVidia chipset boards came without onboard graphics, and I loved that! Heck, I am trying to remember some of the old old 486 boards, but I do not remember onboard graphics on those. I do not remember them on the early Pentium boards either, like the Socket 7 stuff. I remember it was sweet if you had an S3 video card so you could run 256 colors! ;) lol. Maybe he meant applications other than motherboards, though; I am stuck thinking motherboards myself.
Score
0
March 6, 2009 1:03:31 PM

enewmen: There is an opportunity for ultra-high-bandwidth communication between the two chips on the same die. More so if they are on the same chip.


Just curious (and wondering), but isn't that statement backwards? Wouldn't the communication actually be better if they are on the same die as opposed to just being on the same chip?

Please let me know if I am wrong and thinking backwards, and if I am, could someone explain how it is better to be on the same chip rather than the same die? Thanks!
Score
0
March 6, 2009 1:17:16 PM

So Intel and AMD have plans to start integrating graphics into their chips and a market research firm is predicting that there will be a decline in integrated graphics on chipsets? Wow, they must be geniuses!
Score
0
March 6, 2009 2:27:48 PM

The only thing I'm worried about is sockets. Intel is already screwing the pooch by releasing a completely different socket for the Core i5 chips. Imagine releasing additional sockets just for IGP and non-IGP CPUs; consumer confusion would be even worse. It's much better to have the IGP on a chipset no one knows or cares about than to have to buy a completely different motherboard depending on whether or not your CPU has a GPU inside. Plus, Intel is horrible with upgrade choices. Their CPUs with GPUs will probably be low-end CPUs with little to no upgrade options, since they would most likely use different sockets from the high-end CPUs (as they already do with Nehalem).
Score
0
March 6, 2009 5:24:35 PM

You won't even need it in the CPU; MS's next version of DirectX will work with no display adapter.
Score
0
March 7, 2009 6:42:28 AM

thegh0st: Just curious (and wondering), but isn't that statement backwards? Wouldn't the communication actually be better if they are on the same die as opposed to just being on the same chip? Please let me know if I am wrong and thinking backwards, and if I am, could someone explain how it is better to be on the same chip rather than the same die. Thanks!

Good question; it helps me give a more intelligent answer.
Here is an article about how external bandwidth sucks and how Fiber-to-the-processor can fix this.
http://asia.stanford.edu/events/Spring05/slides/050421-...

This link is simpler, explaining how the Tera-scale chip works and how performance is increased by having on-die memory.
http://www.tgdaily.com/content/view/33657/118/
Score
0
March 8, 2009 3:55:30 AM

Aero isn't the problem; my ancient integrated nVidia 6150 does Aero perfectly with no "lag". HD is another point, but who in their right mind wants their PC to display HD video? How many people even have a 1080p-capable monitor besides hardcore users? At 1680x1050, my IGP does just fine with CPU support. I have pretty good vision, and I don't think you can appreciate HD on a screen under about 30"; just my opinion.

If I'm going to watch Blu-ray, I want it on a big screen, at least 50", but better yet my projector. And I'm not going to force an "HTPC" to drive it: $$$ for software to decode it, a Blu-ray drive, case, mobo, etc., etc. Don't get me wrong, I actually love MS WMC for recording ATSC HDTV for free; it looks fantastic on my 720p projector at 100". But I'd have to spend at least $200 to add Blu-ray support. For what? A handful of movies I want to see each year on Blu-ray? I'm waiting for 1080p projectors to come down in price for that anyway.

Score
1
March 9, 2009 7:21:11 PM

IGPs really do need to go away. They haven't really been cutting it for some time now.


www.tech-pros.com
Score
-1
March 9, 2009 8:24:34 PM

Stop posting useless info and linking to your own blog. Guerrilla marketing is annoying.
Score
0