So I built my i3-2100 PC with little intention of gaming on it. I figured that if I was going to have any kind of enjoyable gaming experience, I would need to buy a $100-150 video card.
So while testing out my PC, I downloaded Steam and then some game demos: Sonic Generations, Ignite, Portal: First Slice, and Team Fortress 2.
I was using a 20" 1600 x 900 monitor. Now, I understand I'm not running these games at 1080p or 2560 x 1600, but I was surprised at how well they all ran. Team Fortress 2 ran fine at 1366 x 768 with details all on high and 2X (I think) AA.
I then tried Sonic Generations. Since a high frame rate is favorable in a Sonic game, I left all settings at max and turned the resolution down to 800 x 600 with 2X AA. I probably could have done a higher resolution, but the 2X AA made up for it.
Ignite ran at 1080p (although the monitor scaled it down) with AA and every detail except shadow detail on max, and it held a very playable frame rate. I would have turned shadow detail up, but the game wouldn't start with it maxed.
Now, I agree Sonic Generations looked good at a high resolution with high AA, but the difference wasn't extremely noticeable while actually playing the game.
I think most people here dislike onboard graphics because it performs pretty much strictly worse than almost any standalone graphics card.
Basically, the reputation is deserved.
If you want to game at ultra low resolutions then even onboard graphics are probably good enough.
In fact, just earlier today I recommended someone get an A8 processor with its onboard ATI 6600-series graphics for an ultra-low-budget gaming system. There aren't any onboard graphics setups that I know of that are better than the A8's; most aren't even half as good.
In any event, I think most people around here consider 1280 x 1024 the bare minimum acceptable resolution, if not more than that.
My laptop plays games decently at 1600 x 900 on its Intel onboard chip too, but it's not like I'm trying to play Crysis on it or anything. There is no Flash game it can't handle! Well, maybe there is, but I program games in Flash and play many of them, and it handles those just as well as my serious gaming computer does.
It would probably get 1 FPS in Crysis, though.
Really, if you want superior performance you just don't get it from onboard graphics.
On my desktop, my WEI for graphics score is like 8.0. On my laptop it is by far my lowest score at 3.9.
If you don't care about getting the absolute most out of games, you will often be fine with onboard graphics, but it just isn't an option for a serious gamer who wants the maximum out of their gaming experience.
IGPs have come a long way since the old GMAs, which were destroyed by even the cheapest dedicated GPUs of older generations.
But now chips like the HD 3000 are actually adequate for gaming. The reason they are still despised is that people want true immersion: the ability to turn every graphics setting up to the highest it will go and still maintain a high frame rate.
My laptop has a GT 555M and an HD 3000 in it, and sometimes I completely forget to flip the Optimus switch until an hour or so in-game.
Dedicated GPUs also have better support for programs, such as hardware video decoding for HD video.
Do you want to understand why people think that onboard graphics are bad or do you want to defend the cause of the unloved?
BTW, the $200 figure is inflated by 2x. A 6770 is about half that and does much better than the HD 3000 on the 2500K.
The 6850 is about $150 and is a very good budget card.
Not everybody needs a 6950, but unless you like playing at the bare minimum, it isn't the worst idea to set a lower cutoff: the A8's 6600-series graphics for very low budgets, or a 6770 for those who want a 2500K.
If playing games from back in 1999 is your thing, then by all means use an X300 card, but it won't even get half a FPS in Crysis with everything on the lowest settings.
It doesn't entirely come down to true immersion, either. 60 FPS is the point past which additional frames become much harder to notice (assuming no micro-stutter), and stuttering can detract massively from the gaming experience. The hardware most likely to miss that frame rate in any game at any setting? Onboard graphics.
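The 60 FPS point is just frame-time arithmetic: at 60 FPS each frame has about 16.7 ms to render, and a single frame that blows that budget shows up as a hitch even when the average FPS looks fine. A minimal sketch of that reasoning (the frame times below are made-up illustration values, not measurements from any card):

```python
# Frame-time arithmetic behind the 60 FPS rule of thumb.
# All frame times here are hypothetical example values.

def frame_budget_ms(target_fps):
    """Milliseconds available per frame at a given target FPS."""
    return 1000.0 / target_fps

def average_fps(frame_times_ms):
    """Average FPS over a run of per-frame render times (in ms)."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def stutter_frames(frame_times_ms, target_fps=60):
    """Frames that exceeded the budget, i.e. perceptible hitches."""
    budget = frame_budget_ms(target_fps)
    return [t for t in frame_times_ms if t > budget]

# 59 quick frames plus one 100 ms hitch: the average still comes out
# just above 60 FPS, yet that one long frame is exactly the kind of
# micro-stutter the average hides.
times = [15.0] * 59 + [100.0]
print(round(frame_budget_ms(60), 1))   # 16.7 ms per frame at 60 FPS
print(round(average_fps(times), 1))    # 60.9 average FPS
print(len(stutter_frames(times)))      # 1 hitch
```

This is why a steady average FPS number alone doesn't guarantee a smooth experience; the worst individual frame times matter too.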
I personally have a long history of limping along with subpar graphics cards and years-old games, but at least in the things I do play, I can get a high enough FPS that I don't notice it being abnormally low, so I can focus on the game.
The Llano APU is stronger than the Intel HD 2000.
You can play games casually on integrated graphics, but when it comes to recent games like The Witcher 2 that devour graphics cards, it just won't cut it.
Spend an extra $200 or more and you see the game on screen the way it's meant to be played. Lowering graphics settings works for some games, and you do get to play them, but the experience really isn't the same.
Raiddinn: no, I do want to understand the massive amount of hate toward IGPs and Intel HD in general.
People made it sound like nothing would be playable on the i3 using integrated graphics, so I was surprised to get 100% smooth frame rates in Portal at 1366 x 768 with 2X AA and details on high.
One thing about Portal: it's built on the Source engine, and that's one hell of an engine. For such a great engine, it surprisingly doesn't take much to run. Left 4 Dead 1 & 2 are also built on that engine. To me, it's one of the greatest game engines to date. Half-Life 2 is an amazing game, especially for the time it was released.
There is really nothing wrong with IGPs, per se; they've come a long way.
Intel's HD 3000 is significantly better than any past IGP, and even some low-end GPUs.
My laptop (i5-2410M + HD 3000) handles Skyrim on full low settings at 1366x768 (pretty challenging resolution for IGPs) at 40-50 FPS (with occasional dips). That wouldn't have been possible a year ago.
If you're looking for full HD high performance gaming, obviously this isn't for you. But for the average casual gaming session, the newer integrated graphics perform quite satisfactorily.
Yeah, I completely agree, integrated graphics have come a long way since the GMA 950 days. It started around when the Radeon HD 3200 was released, went up from there to the Nvidia GeForce 320M in the 2010 Macs, and on to Intel HD Graphics (which really isn't an improvement over the 320M, but Intel graphics are more power-efficient, so I'll take what I can get).
It makes more sense when you consider that many laptops still have only integrated graphics, and if you end up with one, it's good to know it won't be like the GMA 950 days, when you couldn't play ANY games at all lol.