Intel Integrated Graphics--the facts and the myths

bourgeoisdude

Distinguished
Dec 15, 2005
1,240
25
19,320
I was inspired to write this because of the many "oldschool" hardware fanatics who belittle folks for using integrated graphics. Since the i915G/GL chipsets (GMA 900), Intel has finally been producing decent onboard video. I personally own (well...I gave it to my Mom anyway) a Dell Dimension 4700 with the Intel 915 graphics, and I must say I was impressed with it.

Anyway, to help persuade all of those Voodoo fanatics out there, I decided to post this detailed info about integrated graphics. Most of these specs are from Intel's website.

Intel GMA 900 specs:

333MHz GPU core clock (not the RAMDAC)
Full DirectX 9.0(a) and OpenGL 1.4 support
"Dynamic Video Memory Technology" (DVMT) 3.0 supports up to 128MB of video memory
Native support for 16:9 widescreen formats (HDTV) and dual DVI*

*915G(L) only. Most vendors decided not to include the two DVI ports, stripping them from Intel's "recommended" layout.

Chipsets that include GMA 900:

Intel® 915G(L) Express Chipset
Intel® 915GV Express Chipset
Intel® 910GL Express Chipset
Mobile Intel® 915GM Express Chipset
Mobile Intel® 915GMS Express Chipset
Mobile Intel® 910GML Express Chipset


Intel GMA 950 specs:

400MHz GPU core clock (not the RAMDAC)
Full DirectX 9.0(a) and OpenGL 1.4 support
Supports up to 224MB of video memory
Native support for 16:9 widescreen formats (HDTV) and dual DVI

Additional specs, taken directly from Intel's website:
Up to 10.6 GB/sec memory bandwidth with DDR2 667 system memory
1.6 GPixels/sec and 1.6 GTexels/sec fill rate
Up to 4 pixels per clock rendering
Microsoft DirectX 9 Hardware Acceleration Features:
Pixel Shader 2.0
Volumetric Textures
Shadow Maps
Slope Scale Depth Bias
Two-Sided Stencil
Microsoft DirectX 9 Vertex Shader 3.0 and Transform and Lighting supported in software through highly optimized Processor Specific Geometry Pipeline (PSGP)
Texture Decompression for DirectX and OpenGL
OpenGL 1.4 support plus ARB_vertex_buffer_object and EXT_shadow_funcs extensions and TexEnv shader caching
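
Those throughput figures in the list above fall straight out of the clocks. Here's a quick back-of-the-envelope check (just a sketch, assuming dual-channel DDR2-667 with a 64-bit bus per channel and the 400MHz, 4-pixel-per-clock core from the list):

#include <cstdio>

int main() {
    // Memory bandwidth: two 64-bit (8-byte) channels of DDR2-667
    const double transfers_per_sec = 666.67e6;  // DDR2-667 is ~666.67 MT/s per channel
    const double bytes_per_transfer = 8.0;      // 64-bit channel width
    const int channels = 2;                     // dual channel
    const double mem_bw = transfers_per_sec * bytes_per_transfer * channels;
    std::printf("Memory bandwidth: %.2f GB/s\n", mem_bw / 1e9);  // ~10.67, quoted as "up to 10.6 GB/sec"

    // Fill rate: 400MHz core, 4 pixels (and 4 texels) per clock
    const double core_clock = 400e6;
    const int pixels_per_clock = 4;
    std::printf("Fill rate: %.1f GPixels/s\n", core_clock * pixels_per_clock / 1e9);  // 1.6
    return 0;
}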


In short, the GMA 950 is pretty decent, even for more modern games. Now, with these specs, why do people still bash integrated graphics? The specs of some older integrated graphics show why:

Intel 810 Graphics:

"Integrated Intel® 3D with Direct AGP" (exact specs not given, but below are 'unofficial' specs)

Up to 4MB video memory
Full DirectX 6.3 support
NO GPU, uses the CPU for graphics processing (uckk!!!)

Intel 815 Graphics:
Same as above except it has "133/100 MHz 4MB display cache" onboard.

Intel 820/840 Graphics: N/A (I can't find any valid info, please inform me if you know--I think this was the one with up to 11MB shared memory though)

Intel 845G/GL Graphics: Full AGP 4X bandwidth (capable), up to 48MB shared video memory. Still only DirectX 6.3 compliant!

Intel 860/865G/GL/GV/PE Graphics: same as 845 series except memory bus allows it to utilize more bandwidth.

Intel 7XX graphics: LOL, don't ask!

The 845 and 865 series are still being produced by Intel for "Basic" PCs. With only 48MB shared and only DirectX 6.3 compliance--avoid these at all costs.

Please note that while the 845 and 865 supposedly support DirectX 7 and DirectX 8.0b respectively, they do not technically support "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second"--the definition of hardware T&L support, which is required for DX7 compliance.
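
If you want to check for yourself what a given chip actually claims, DirectX 9 exposes this through the device caps. Below is a minimal sketch (plain Direct3D 9, linked against d3d9.lib, error handling mostly omitted) that prints whether hardware T&L is reported and which shader versions the driver exposes; on the 845/865-class parts you'd expect the T&L line to come back "no":

// Minimal sketch: query Direct3D 9 device caps for hardware T&L and shader versions.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Direct3D 9 not available\n"); return 1; }

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
        std::printf("Hardware T&L: %s\n", hwTnL ? "yes" : "no (software vertex processing)");
        std::printf("Vertex shader: %u.%u\n",
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));
        std::printf("Pixel shader:  %u.%u\n",
                    (unsigned)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                    (unsigned)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }
    d3d->Release();
    return 0;
}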

This is a work in progress, feedback is needed.
 

yourbestfriend

Distinguished
Feb 24, 2006
165
0
18,680
Good job--some "knowledge" gets thrown around that is not entirely true.

My onboard sound offloads about 1/3 of the work from the CPU and has EAX 1.0 and 2.0 support.
 

maxxum

Distinguished
Jan 30, 2006
161
0
18,680
I have the 945G Chipset Family graphics (it’s the GMA 950, I believe) on my office PC (Optiplex GX520). I can’t complain; it’s pretty good, all things considered. It doesn’t stand up well against my home PC, but that’s not what it’s for. It plays DVDs well enough and can handle 10 different programs open at the same time without too much complaining. Looking at these specs, it looks like it’s Vista “ready” too :)
 

bourgeoisdude

Distinguished
Dec 15, 2005
1,240
25
19,320
Thanks for the info - I've got the 845 at work in a Dell and it's fine for the office.

My job uses Dells with that chipset too; I'm using one to post on this forum. You're right.

For gaming of any sort, however, it stinks. The Intel GMA 900 and 950 aren't Radeon X1900s or GeForce 7900 GTXs, but they will actually put up a score in 3DMark05...
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Onboard graphics suck ass. If you like playing the original UT, or possibly Quake (does Intel have an OpenGL driver yet? :lol: ), then maybe you'll be satisfied, but anyone who plays modern (recently released) games will vomit when they see the framerates they get, or a Windows error message babbling about the game failing to launch.

We don't own aftermarket cards for nothing. Tim Sweeney from Epic Games has even recently been quoted saying that Intel graphics are holding back the development of cool programming/games because they suck so bad :roll: , and Intel owns over 3/4 of the graphics market share currently.

If you're impressed by the quality and framerates of onboard graphics, then you probably also don't know what you're looking at on the screen (quality-wise).
 

stillerfan

Distinguished
Mar 11, 2006
21
0
18,510
I read that article too, and he makes a good point. But for office work, integrated should be fine unless you play games at work.

And no, I don't use integrated at home for gaming - I have a barely usable 6600 GT. :cry:
 

bourgeoisdude

Distinguished
Dec 15, 2005
1,240
25
19,320
GeneticWeapon said:
Onboard graphics suck ass. If you like playing the original UT, or possibly Quake (does Intel have an OpenGL driver yet? :lol: ), then maybe you'll be satisfied, but anyone who plays modern (recently released) games will vomit when they see the framerates they get, or a Windows error message babbling about the game failing to launch.

We don't own aftermarket cards for nothing. Tim Sweeney from Epic Games has even recently been quoted saying that Intel graphics are holding back the development of cool programming/games because they suck so bad :roll: , and Intel owns over 3/4 of the graphics market share currently.

If you're impressed by the quality and framerates of onboard graphics, then you probably also don't know what you're looking at on the screen (quality-wise).

Thank you for your negative insults. You are the reason I wrote this thread, yet you refuse to read it.

Unlike most folks, I play games at the good ol' 1024x768 or 800x600 (I must be a blind, bumbling buffoon). You must see everything at 1600x1200 at supercalafragisexy detail settings, but Bob Smith couldn't care less.

BTW--have you ever tried the Intel GMA 900 or the GMA 950? Well, I'll show you the 3DMark scores later.

GeneticWeapon said:
...anyone who plays modern (recently released) games will vomit when they see the framerates they get, or a Windows error message babbling about the game failing to launch.

The Star Wars KOTOR games run at over 30 FPS on my GMA 900, and Star Wars: Empire at War, which came out a month and a half ago, runs at over 44 FPS. Republic Commando works too (yeah, I'm a Star Wars fan :wink: ). I was also able to play Halo PC with the patches at a decent framerate with low quality settings. Return to Castle Wolfenstein, The Sims 2 (or so I've been told), and Quake 3 work with it as well. I'll try to post some screenshots. Not that you care.
 
Interesting stuff.

Are there any benchmark articles for these solutions out there?

Yep, ExtremeTech looked at both the GMA 900 and 950. I posted those reviews; I'm surprised, offended, saddened, and a little disheartened you didn't remember. :cry:

:tongue:

AnandTech put the GMA 950 against the X200, and I'm not sure if they bothered for the GF6100/6150 reviews. Needless to say, a lot of WNR (would not run) and crashing.

All in all, the GMA series is not a good choice for integrated; I'd suggest the X200 or GF6100/6150 over it any day. Also, don't forget the S3 Chrome solutions either. All are superior in performance and features.

I doubt dual DVI is native to the GMA 950, as that would require dual integrated TMDS transmitters, and I believe you'll find all the Intel dual-DVI solutions use an external SDVO transmitter chip.

Oh yeah, and for fun, here's another posting about how much 'fun' it is to get the GMA 950 working with the very latest and greatest game:

http://www.elderscrolls.com/forums/index.php?showtopic=337851
 
Simply put, integrated graphics are for companies that buy in bulk and need capable 2D (and the most basic 3D) that is extremely cheap and uniform, and for individuals who couldn't tell the difference between GF3 quality and X1300/GF7300 quality.

Anyone who cares about graphics, which most people frequenting this forum do, would not use integrated graphics for their needs unless it's for mobile use, and even then they'd weigh the option of a discrete chip like the X300+/GF6200+ (preferably an X1400 as a base).

It's not about being a snob, it's about true usability. We here are enthusiasts, and even on the low end, we want something capable.
Will a P3-667 with 128MB PC100, a 10GB 4K RPM HD, a CD-ROM, and a Kyro 400 be "good enough" for most day-to-day non-gaming activities? Damn sure!
Would our mothers or girlfriends probably be more than happy with it? Sure. However, there is no feature that the GMA 900/950 offers above a basic Intel Extreme that makes it of any significance in the grand scheme of things. It may be able to support HDTV, but then again, so could my old Matrox, ATi, and nV cards from 5+ years ago, considering 1280x720. What the GMA 950 won't do is accelerate or clean up HD video. Nor will you ever be gaming at those resolutions (forget 1080p altogether).

So while you've done a great job of putting together a workplace dumb-terminal graphics feature list, and I say that both sarcastically and appreciatively (I will still point people here from time to time), it still doesn't elevate the integrated chips' situation beyond a cheap, underpowered solution for people who don't need good, they just need cheap. Are they as bad as they used to be? NOPE! But are they still bad compared to the stuff that's out there? YEP.

Think of it like that 4K RPM HD: will it work? Sure. But is it worth any savings over a cheap 7200 RPM drive? Nope. Same goes for PC2100 versus PC3200.

It'll never be good, and the statements about its Vista abilities (its whole raison d'être) don't bode well for even the business-applications side of the argument.
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
bourgeoisdude said:
Star Wars: Empire at War, which came out a month and a half ago, runs at over 44 FPS.
Maybe with the settings completely dumbed down......and I mean completely...

I was playing that game (full game, multiple campaigns) on my bad-ass X800 GTO, and my framerates suffered big-time during cinematic widescreen battles.

Integrated graphics must be pretty bad when the person defending them has to lie about the FPS they're achieving :roll:







/thread.
 

bourgeoisdude

Distinguished
Dec 15, 2005
1,240
25
19,320
Actually no...I "dumbed down" my Windows processes to run only 16 (no antivirus), and I ran at low settings except for having one notch up on the texture details...yeah "it don't look great" but it works.

Remember--I'm posting this for the not-so-enthusiasts, not the "can't live without supercalafragisexy 1600x1200" folks. BTW, my sucky 6600 in my sig claims to run at only 39 FPS with settings like the Intel one--the only case I've seen that happen (???). Of course, one time it also told me I had a 5.8GHz CPU online, too (even though I don't OC), so I suppose the FPS thingie in the game is a buggy POS...

Honestly, I suppose I'm not even a gamer, as I couldn't care less about Half-Life, WoW, COD, and most of that other stuff. I play SWEAW, C&C Generals Zero Hour, Halo, Final Fantasy 7, and other not-so-involving games. I have a very addictive personality--FF7 nearly ruined my social life when I first started playing it. Why am I rambling? I need to go to sleep now...
 

cleeve

Illustrious
GeneticWeapon said:
Star Wars Empire At War ...
I was playing that game(full game, multiple campaigns) on my bad-ass X800GTO, and my framerates suffered big-time during cinematic widescreen battles.

I dunno, something might be amiss with your setup. I run Empire at War on my secondary machine, an AthlonXP 2500+ with a Radeon 9700 PRO... at 1024x768 it runs smooth as silk. Detail not maxed of course, but still very, very smooth.
 

bourgeoisdude

Distinguished
Dec 15, 2005
1,240
25
19,320
Agreed, Cleeve. EAW may be a newer game, but heck it claims to fully support GeForce 256 cards, so if you have framerate problems with the X300SE then something else could be wrong.

By the way, someone mentioned the GeForce FX 5200...my brother's old 256MB AGP version (don't remember the brand) had noticeably slower framerates than the GMA 900 did. Again, that's two different PCs--the FX 5200 was on some "Syntax" :roll: mobo my bro had with a Sempron 2800+ CPU. The Dimension 4700 specs (for mine):

-80GB Seagate SATA HDD
-2GB DDR2 RAM (PC2-3200) in dual channel (whatever Dell uses)
-2.8GHz Prescott 800MHz FSB CPU with HT disabled
-Intel GMA 900 graphics with the latest Intel drivers
-Antivirus disabled, PCI modem removed, "selective startup" mode selected (msconfig)
-All USB peripherals removed

This brings me to another point--I will probably create a thread tonight running 3DMark05 on both my PCs--once with everything checked in msconfig and all USB devices connected, and once without. If I get drastically different results, I will post them. My guess, based on experience, is that the Dell will see a huge difference due to the integrated video chipset, while the "rig in my sig" will see little if any difference--but we'll see.
 

ocularis

Distinguished
Feb 7, 2006
35
0
18,530
Integrated graphics are great budget-savers for the corporate world. When you only run Firefox or another browser, Word, Excel, and an email program, you don't need a fancy 7900GT, much less a mid-range 6600GT. And having lots of video memory is useless unless you are running lots of 3D games with textures. Well, what game isn't 3D now, besides Solitaire and Pinball? :)

People who play lots of FPS games must realize that they will need to buy a decent graphics card to go with their system. For any user that doesn't care, integrated will work fine.

We have a game night at work playing Quake 3 on integrated Intel graphics (Dell GX620s and GX270s). The game runs fine, even at 1024x768 with all the details.
 

ltcommander_data

Distinguished
Dec 16, 2004
997
0
18,980
I've always found Intel's integrated graphics solutions important from a mass-market perspective. People have already pointed out that they may not be the most playable, but they are a good indicator of the level that the gaming industry is operating at. After all, the top-end graphics solutions may offer crazy performance, but programmers still have to accommodate the lower-end discrete and integrated graphics users. By moving the lower end up in terms of both performance and features, everyone benefits, because the mean is moved higher, allowing more programming freedom. That's why I thought it was great when the GMA series was launched with "full" support for DirectX 9. It's a step forward at least.

I'm actually quite interested in what the G965 will bring to the table. Intel will finally be doing away with the tiling method, and they will have real hardware pixel shaders (which they already have now), plus hardware vertex shaders and T&L support. With SM3.0 support, it'll be a DirectX 9.0c part. The G965 also looks to bring competition for ATI's AVIVO, with hardware decoding for H.264 among others, as well as some encoding support. Clock speeds have also worked in Intel's favour, with the GMA 950 clocking in at 400MHz on the 130nm process, which is comparable to the 7800GT on the 110nm process. The G965 will be made on the 90nm process, so decent 500MHz clock speeds are likely. All in all, it may finally be a competitive low-end integrated part from Intel, and it looks to be great for Media Centre PCs.

http://www.hkepc.com/bbs/viewthread.php?tid=554771

One thing I've wondered is whether the GMA 950 has dual-core driver support. ATI and nVidia have both advertised it to great effect, and I would think Intel would have the same, considering they actually make the dual-core chips. While the GMA 900 was important in terms of feature support, "decent" performance didn't come until the higher clock speeds of the GMA 950. If Intel released decent drivers, I'm sure it'd actually be a pretty good low-end part. I'm mainly concerned because the majority of the Centrino Duo laptops seem to be shipping with the GMA 950. Since the GMA 950 doesn't have hardware vertex shaders, it has to offload them to the CPU, but the Core Duo offers a perfect opportunity to dedicate one core to vertex and other graphics tasks and the other to the actual game logic and physics. With Digital Media Boost--mainly expanded SSE decoder width, revised FPU/SSE units, and SSE3 support--the Core Duo would offer a much greater boost over the previous Dothan core.
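
To illustrate the idea (this is only a toy sketch of the general split, not how Intel's actual driver works): game logic stays on one core while a second core chews through the per-frame vertex transform work that the GPU can't do itself.

// Toy sketch: per-frame vertex transforms on a second thread/core while the
// main thread runs game logic. Purely illustrative, NOT Intel's driver code.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Vec3 { float x, y, z; };

// Stand-in for the software vertex pipeline (what the driver's PSGP path would do).
static void transform_vertices(std::vector<Vec3>& verts, float dx) {
    for (auto& v : verts) v.x += dx;   // pretend this is a full matrix transform
}

// Stand-in for game logic / physics for one frame.
static void run_game_logic(int frame) {
    std::printf("frame %d: game logic done\n", frame);
}

int main() {
    std::vector<Vec3> vertices(100000, Vec3{0.f, 0.f, 0.f});

    for (int frame = 0; frame < 3; ++frame) {
        // "Core 2": software vertex processing for this frame.
        std::thread geometry(transform_vertices, std::ref(vertices), 0.1f);
        // "Core 1": game logic runs in parallel.
        run_game_logic(frame);
        geometry.join();   // wait for geometry before "submitting" the frame
    }
    std::printf("done, sample x = %f\n", vertices[0].x);
    return 0;
}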
 

bobbydamm

Distinguished
Nov 21, 2005
143
0
18,680
Hmmm....
That 950 set seems to recall a certain era...
A little intel/ATi corporate dealing
Those specs look a little too familiar
Could it be an 8500 chipset?
Didn't Intel use the old Rage 2c for the 810?
(Some old boards actually had either 4mb or 8mb Rage 2 built right on it? Maybe The P2/Cele chipsets from that time.)
Circa '99 .....
 

ltcommander_data

Distinguished
Dec 16, 2004
997
0
18,980
As far as I know, the GMA 950 has nothing to do with the Radeon 8500. First of all, the 8500 was a DirectX 8.1-generation core, while the GMA 950 is DirectX 9.0-generation. Also, the GMA 950 doesn't have hardware T&L or vertex shaders; those are both emulated by the driver and CPU. Because of this, the entire rendering method is different, with the GMA 950 using an advanced form of the tiling that Intel licensed and modified from STMicro, while ATI and nVidia graphics cards use immediate-mode rendering. I believe the fruits of the Intel/ATI partnership are some Xpress 200 boards being labeled as Intel rather than ATI. The deal is mostly temporary anyway, because Intel somehow had a supply shortage of their low-end integrated graphics boards, mainly the 915G models. It was probably because they were trying to actively produce the 845G, 865G, 915G, and 945G and associated submodels all at the same time. They have since cancelled the 845G and 915G, focusing instead on the 865G for S478 and the newer 945G.
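
For anyone wondering what "tiling" versus "immediate mode" actually means, here's a very rough conceptual sketch (grossly simplified and purely illustrative, not Intel's or ATI's real pipelines): an immediate-mode renderer rasterizes each triangle across the whole framebuffer as it arrives, while a tile (zone) renderer first bins triangles into small screen tiles and then processes one tile at a time, so each tile's color and depth data can stay in fast on-chip memory.

// Conceptual sketch only: immediate-mode vs. tile-based (zone) rendering order.
#include <cstdio>
#include <vector>

struct Triangle { int minX, minY, maxX, maxY; };   // bounding box stand-in

const int SCREEN_W = 640, SCREEN_H = 480, TILE = 32;

// Immediate mode: rasterize each triangle against the whole framebuffer as submitted.
void render_immediate(const std::vector<Triangle>& tris) {
    for (size_t i = 0; i < tris.size(); ++i) {
        // ...shade pixels anywhere on screen; color/depth traffic goes to main memory
        std::printf("IMR: triangle %zu rasterized\n", i);
    }
}

// Tile-based: bin triangles per tile first, then rasterize one tile at a time,
// so each tile's color/depth can live in small on-chip memory.
void render_tiled(const std::vector<Triangle>& tris) {
    const int tilesX = SCREEN_W / TILE, tilesY = SCREEN_H / TILE;
    std::vector<std::vector<size_t>> bins(tilesX * tilesY);

    for (size_t i = 0; i < tris.size(); ++i)                       // binning pass
        for (int ty = tris[i].minY / TILE; ty <= tris[i].maxY / TILE; ++ty)
            for (int tx = tris[i].minX / TILE; tx <= tris[i].maxX / TILE; ++tx)
                bins[ty * tilesX + tx].push_back(i);

    for (int t = 0; t < tilesX * tilesY; ++t)                      // per-tile pass
        for (size_t idx : bins[t])
            std::printf("TBR: tile %d, triangle %zu\n", t, idx);
}

int main() {
    std::vector<Triangle> tris = { {0, 0, 40, 40}, {100, 100, 130, 160} };
    render_immediate(tris);
    render_tiled(tris);
    return 0;
}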
 
ltcommander_data said:
Clock speeds have also worked in Intel's favour, with the GMA 950 clocking in at 400MHz on the 130nm process, which is comparable to the 7800GT on the 110nm process.

That's because it has far more transistors.
And 400MHz on 130nm is nothing new: the R9600 Pro did it (and far, FAR more), and then with low-k the R9600 XT was stock at 500MHz on 130nm. So this is special why?
 

INeedCache

Distinguished
Dec 15, 2005
521
0
18,980
What some of you need to remember is that not everyone in these forums is an enthusiast. Some people are just coming here looking for answers or to improve their knowledge. Simply saying "integrated graphics suck" doesn't help anyone unless there is further clarification. If someone is putting together or buying a machine for simple home use of the internet, email, etc., or a business workstation, integrated graphics are fine, and spending more money would simply be a waste. Most people get put off by snobs, so we should be careful not to become such.
 
Most belittle onboard video strictly for its mediocre gaming performance, but where gaming in FEAR, BF2, Quake 4, etc. is not a concern, I am the first in line to try to save the $150-$200 on a video card!

My Toshiba Satellite laptop (a Celeron-M at a mere 1.4 or 1.5GHz, I think, with an even older Extreme Graphics solution) actually did quite well playing the older Call of Duty at modest resolutions/settings on a long trip to the Middle East...