Intel: Integrated Graphics is Where It's At

Anonymous
April 6, 2009 10:49:17 PM

LOL, pathetic. Development is relative to the power of the hardware available; that's why l33t 3D games didn't exist for the 286. I guess Intel wants gaming systems dumbed down to 1990s standards to make up for their shortcomings. I think they figured out that Larrabee is gonna flop...
April 6, 2009 10:49:36 PM

I think this could trigger the beginning of the end of the need for, though not the use of, high-end discrete graphics cards. If the market for high-end graphics cards shrinks enough, I could see developers dropping support. Something like what happened to the sound card market.
April 6, 2009 10:51:53 PM

A TNT card could mop the floor with Intel integrated graphics, and they really expect developers to swallow this swill they're peddling?? Intel, you may be the biggest, but you're trying to impose your will where you have never proven yourself capable.
April 6, 2009 10:53:12 PM

There will always be a market for fast gaming PCs and 500-dollar graphics cards, especially now with people doing more with their PCs (picture/video editing, format conversions).

I think we are going to see a spread:

people with extreme low end (integrated)

and people with $200+ graphics cards
April 6, 2009 10:53:53 PM

This is sad. Intel might as well say, "Please make your games look like crap so they run on our integrated cards." So much for high-end graphics if normal users don't appreciate them...
April 6, 2009 11:03:25 PM

I personally read this as 'why not to bother with Larrabee' myself... If developers seriously considered Intel chipset graphics in their games, then Quake 3 would still be considered 'high-end' graphics.
April 6, 2009 11:14:36 PM

Instead of developers, they should tell their customers to play games that predate the IGP by 10 years.
April 6, 2009 11:14:51 PM

Quote:
Hatecrime69: I personally read this as: 'why not to bother with larabee' myself..If developers seriously considered intel chipset graphics in their games then quake 3 would still be considered 'high end' graphics


Quake 3 is fun, and look at Quake Live. If developers stopped trying to pump more graphics and started to pump more fun, we might have better games.
April 6, 2009 11:15:34 PM

Instead of telling developers to tone down the games, they should be telling their customers to play games that predate the IGP by 10 years or so.
April 6, 2009 11:21:56 PM

Woot, Quake 3 rocked my world... back in 1999... I am not too concerned; having integrated graphics is a setback in computing... Yes, making things smaller and smaller seems to be the way to go, but sacrificing so much power and ability isn't going to float well. They will see this soon enough.
April 6, 2009 11:25:43 PM

Intel really shouldn't be bragging about a graphics solution that is bought exclusively by those who don't care about performance. Sure, they sell the most chips, but it's only because most people don't care at all about their 3d acceleration. Anyone who cares gets something better, even if that means an integrated Nvidia or ATI chip.

Nothing is killing the PC gaming market more than the fact that one of the most common types of PC sold is the cheap Intel-based laptop - a computer that not only can't play games, it can't ever be upgraded to do so. Intel is making a dire mistake by pushing these things on consumers at all. If the entry level computer cannot play games or ever be upgraded to do so, the number of people entering PC gaming will dwindle, and they won't progress to buying high-end 'gaming' processors later.
April 6, 2009 11:28:54 PM

Yes, developers should make more 2D isometric games, because that is the standard Intel holds integrated graphics to. If they really wanted to go integrated, then they should target a real integrated chip from AMD or nVidia. Integrated from Intel is a joke.
April 6, 2009 11:31:24 PM

Seriously, Intel should just stick to making processors, because their graphics solutions suck.
April 6, 2009 11:51:49 PM

Which would you rather be selling?

A) 10 million low end IGP chipsets at an average price around $10.00 (if not less)

B) 2 million high end boards at an average price of $200.00 each?

"Lies, Damn Lies, and Statistics"
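For what it's worth, the comparison above is easy to check with quick arithmetic, using the commenter's hypothetical unit counts and prices:

```python
# Revenue comparison for the two hypothetical scenarios above.
igp_units, igp_price = 10_000_000, 10.00   # A: low-end IGP chipsets
hi_units, hi_price = 2_000_000, 200.00     # B: high-end boards

igp_revenue = igp_units * igp_price        # $100 million
hi_revenue = hi_units * hi_price           # $400 million

print(f"A (IGP):      ${igp_revenue:,.0f}")
print(f"B (high-end): ${hi_revenue:,.0f}")
# A ships 5x the units, but B brings in 4x the revenue -- which is
# exactly why raw unit-share statistics can mislead.
```

So even with a five-to-one unit advantage, the discrete side wins on revenue in this toy example, which is the commenter's point about statistics.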
April 6, 2009 11:52:56 PM

*cough* *choke* *gag*

please don't ever post an article about integrated graphics again
April 6, 2009 11:58:14 PM

The only reason there is a "lowest common denominator" is Intel's crappy IGP video. They are the lowest common denominator. The lead (pronounced "led", Pb) in the video industry's ass.
Anonymous
April 7, 2009 12:32:49 AM

Intel is talking rubbish. Most games will run fine on good laptops with low-power discrete graphics chips. My Lenovo runs FEAR happily on low settings thanks to its nVidia 7300. And as for the Atom/netbook, nVidia's Ion has been demonstrated playing CoD4 and others.

Also all the new chipsets from AMD and nVidia feature integrated DX10 GPUs.

Intel just can't admit that nVidia and AMD are stealing their low-power gaming market, since anyone who casually or seriously plays the newest games will make sure they are running an AMD or nVidia GPU of some form.

If Intel wants to stay in the race, they need to spend some serious money and bring out a decent floating-point pipelined GPU (not some Pentium 1 multicore crap, i.e. Larrabee).
April 7, 2009 12:39:23 AM

"What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?"
NO
April 7, 2009 12:40:50 AM

Quote:
What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?


No, and I am not sure that is the point Intel is stressing here.

The majority of the computing world doesn't need discrete GPU cards, so obviously integrated will sell in higher volumes. That also means there is a huge target audience for games that aren't too demanding to run on integrated hardware.

Intel is trying to convince developers to develop lower-end-capable games, since the hardware exists to run them.

I think this is a good idea for developers. I have installed some old-school games on my laptop that I was never able to play before, and they run well on my integrated hardware. Unreal is still fun many years later, no matter how dated the graphics look.
April 7, 2009 12:46:20 AM

It's sad when desktop cards running at x1 PCI-E speed are faster than Intel video.

It's sad that AMD and nVidia boards at about the same price have much better onboard video.

Why can't Intel have 64-128MB of SidePort RAM like AMD?
April 7, 2009 12:46:57 AM

Similar to Apple offering integrated solutions. Certainly with mobile computing currently on the upswing, requiring smaller, integrated hardware, the short-term future portends a substantial marketing (translated: "revenue stream") opportunity. Perhaps discrete graphics chipsets will become a boutique market, perhaps not. Who's to say holographic requirements won't demand a paradigm shift in graphics performance that only the discrete suppliers can address, even ramping up to supplant CPU subsystems? The future offers opportunities for those prepared and willing to change.
April 7, 2009 12:52:07 AM

Why doesn't Intel just ask developers to include an options switch in the game that a user can click, which in turn disables everything they just paid for in their game, so Intel can say, "Oh yeah, we are compatible with that game!"
April 7, 2009 12:54:38 AM

Apple uses the nVidia 9400M, not the crappy Intel stuff. The integrated 9400 performs about as well as a mobile X1600.
April 7, 2009 1:09:54 AM

What a joke: "Here's your answer: Mercury Research showed that in 2008, for the first time, integrated graphics chipsets outsold discrete (graphics chips), and in 2013, we expect to see integrated graphics chipsets outsell discrete by three to one." You would need to play a lot of Chinese whispers with drunk people before that translated to "PC gamers are choosing IGPs over discrete cards." This is like when they counted PS3s toward total Blu-ray player sales, except a lot more laughable.

Firstly, gamers make up a tiny part of all computer sales; secondly, the statement above is just as likely explained by more mobos having IGPs on them (for HTPCs etc.) as by a decrease in graphics card sales.

And another thing: if the EU is going after MS for bundling IE with Windows, what are they going to do when Intel starts bundling GPUs with their CPUs? At least there is physical money involved in the GPU market.
(BTW, I think the EU/MS/IE thing is BS.)
April 7, 2009 1:29:18 AM

[What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?]

$500 discrete graphics cards will disappear. They will be replaced with better, cheaper new Nvidia/ATI cards.

Now, if Intel really cuts down the graphics card market, it will only kill its own PC gamer market.

Intel: consoles don't need x86 compatibility. Destroy good graphics, and you will kill yourself.
April 7, 2009 2:17:14 AM

Quote:
Well, for one, Intel is currently the biggest vendor of graphics parts, outpacing anything from Nvidia or AMD -- and that’s completely thanks to the IGPs that come with the Intel chipsets.


Haha, Intel graphics suck big time, and Intel chipsets cry for more juice. They only got more market share because there are a lot of noobs and brainwashed fanboys who don't know what they are buying and believe Intel's advertising.
April 7, 2009 2:32:01 AM

Honestly I think this is Intel's plan, to destroy Nvidia through killing the gaming market on the PC.
April 7, 2009 2:32:14 AM

Quote:
engrpiman: Quake 3 is fun and look at quake live. if developers stopped trying to Pump more graphics and started to pump more fun we might have better games.

Or maybe people should stop buying every ****ing FPS that comes out with good graphics and mediocre gameplay (AKA 75% of the games on the market).
April 7, 2009 2:41:09 AM

Chances are quite slim that discrete is just going to disappear... Some enthusiasts may refuse to pay extra for embedded graphics, and then Intel loses the interest of tech-savvy customers. Besides, we all know how much heat GPUs create. If I understand correctly, Intel is going to smash a super-hot i7 and a GPU onto one chip? 32nm I like, though... :)
April 7, 2009 2:46:13 AM

Dear. God. No.
Developers make games for gamers. People who play games casually buy old games their systems can handle, or play free online games with lower graphics requirements. Asking developers to divert major resources from cutting-edge game development is so transparently self-serving for Intel, and so obviously self-destructive for developers, that I can't imagine anybody will fall for the hype.
April 7, 2009 2:48:32 AM

That means Rockstar should have made GTA 4 like GTA 2 and marked it "crappy-graphics proof" so the game could run on Intel integrated chipsets XD
April 7, 2009 2:51:28 AM

The only reason I buy a fast Intel processor is to complement my high-end graphics card and avoid bottlenecks. If we have to depend on a lousy integrated graphics chip, what's the need for a fast CPU? It would also kill the RAM market: since we won't play demanding games, we won't need fast RAM either. How about high-end gaming mobos? We won't need them either. Throw out the high-power PSUs too, since we won't need to power discrete graphics cards.

In the end, Intel is killing itself and the whole industry.

INTEL, think before you talk!!
April 7, 2009 2:52:56 AM

For those noobs and fanboys: if you want to run a game on an IGP, go get an AMD/ATI or nVidia integrated chipset instead of Intel's crappy chipsets.
Anonymous
April 7, 2009 2:58:13 AM

Intel is forgetting one Extremely Big Issue here.

They might be selling more Graphics Chipsets, BUT........

(HERE'S THE KICKER)......

Most people DISABLE the Integrated Video and use a Video Card of their choice.

WAKE UP INTEL and smell the truth!
April 7, 2009 3:42:09 AM

IGPs are lame. Besides, the desktop sales vs laptop sales figure doesn't take into account millions of computers self-assembled at home.
April 7, 2009 3:42:31 AM

What kind of gaming do they expect?
Intel integrated graphics can barely even run HL2; I tried, and even then with lots of lag.
April 7, 2009 3:44:04 AM

I have not seen ANYTHING that's a threat to discrete graphics, even on notebooks.
What does Intel have? The GN40 chipset??
"If I understand correctly, Intel is going to smash a super hot i7 and a GPU on 1 chip? 32nm I like though..." -> What can this be? I didn't even hear rumors..
April 7, 2009 3:55:10 AM

Intel is going to smash a crap into the i7 ???
April 7, 2009 4:09:33 AM

Yes, and the number of gamers using those underpowered chips is how much?

It's a lame excuse to use overall sales numbers to represent gamers; there are so many more non-gamers than gamers that Intel's point is moot.

Has anyone noticed lately that Intel sounds desperate, with the lawsuits and these idiotic marketing schemes?
April 7, 2009 4:58:42 AM

Don't you guys remember Intel saying the reason they didn't include IGPs on the P45/P43 is that people are just going to add discrete anyway? Just a ploy by Intel to circumvent Nvidia's Ion platform.
Anonymous
April 7, 2009 5:00:10 AM

Intel!! Wrong!! I just wish Intel would get back to improving their CPUs and stop mucking up the graphics industry. Integrated graphics does have a place - fill in the blanks.
April 7, 2009 5:04:30 AM

I don't care if integrated graphics crap outsells discrete 500 to 1; I will never use integrated graphics. It does not compete.
April 7, 2009 5:14:39 AM

Quote:
Aaron Davies, a senior marketing manager in the Intel Visual Computing

The guy is from Marketing... who cares what he says.
Anonymous
April 7, 2009 5:57:57 AM

Obviously the answer is simple: the definitions of integrated and discrete are going to change.

The motherboard will no doubt go through some changes. In the short term, I could easily see integrated being the standard, with discrete for extra goodies. Or gaming goes to consoles and integrated is used for simpler PC titles?

But one day someone might realize the simplest way to interact with a computer is to use the existing processing power of the brain, and the brain becomes the motherboard. Hmmm, cyborg goodness.
April 7, 2009 5:59:56 AM

I think it would be great if developers made their games more scalable, letting us play them on the road on poor-performance IGPs while at home we play them in full uber-HD or whatnot.

It is not impossible to make game engines scalable to begin with, and certainly some games could fall back on raw CPU crunching instead of running everything on the GPU.

There will always be $500 graphics cards, no doubt about that... On the other hand, IGPs are capable these days, and since they have a huge market share, there could be a lot of potential customers for those who dare to build a scalable engine and game.
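As a rough illustration of the scalability that post describes, here is a hypothetical Python sketch; the preset names, VRAM thresholds, and render scales are invented for illustration and not taken from any real engine:

```python
# Hypothetical sketch: pick a quality preset from a reported VRAM budget,
# so the same game scales from a shared-memory IGP up to a discrete card.
# All thresholds and names below are illustrative assumptions.

PRESETS = [
    # (min VRAM in MB, preset name, render scale)
    (1024, "ultra",  1.00),
    (512,  "high",   1.00),
    (256,  "medium", 0.85),
    (0,    "low",    0.70),   # shared-memory IGPs land here
]

def choose_preset(vram_mb: int) -> tuple[str, float]:
    """Return (preset name, render scale) for the given VRAM budget."""
    for min_vram, name, scale in PRESETS:
        if vram_mb >= min_vram:
            return name, scale
    return "low", 0.70

print(choose_preset(128))   # shared-memory IGP -> ('low', 0.7)
print(choose_preset(1024))  # high-end discrete -> ('ultra', 1.0)
```

A real engine would also scale shader complexity, texture resolution, and draw distance, but the idea is the same: one game, many hardware tiers.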
April 7, 2009 6:55:27 AM

Is Intel run by morons? They make the fastest CPUs on the market, and yet they try to convince people to downgrade everything to run on their crappy IGPs? And I believe the so-called success of their IGPs is due to laptops and people who don't have a clue what they are buying, since they buy the brand: "Look, it has an Intel IGP, it must be good, since it's Intel, no?" (This going through the heads of casual buyers.)
April 7, 2009 8:40:36 AM

Integrated chips outsell discrete simply because people buy more computers. Homes still have only a select few gaming rigs (usually one, with the kids on consoles), but now sit with 3-4 laptops that aren't even used for gaming in the first place.
Essentially, the number of gaming systems (desktop or laptop) has remained the same. Only the number of other systems has risen as prices have fallen.
April 7, 2009 9:00:08 AM

Intel just pisses everyone off more day by day......
April 7, 2009 9:38:18 AM

Article: What do you think? Could this mark the beginning of the end of the need for high-end, $500 discrete graphics cards?

- I don't think high-end graphics cards will be obsolete or dropped any time soon. This is just another way for Intel to earn more money. I do agree with Intel that PC games could run on an Intel IGP chip, but with major graphics cutdowns.

The way I see it, we have pros and cons to this:

Pros:

- Game-enabled PCs could cost a lot less and be available to a wide variety of users with different budgets. This means more people would buy PCs for video games instead of consoles, for about the same price. BUT this would mean Intel would have to step up its game, making IGP chips or the upcoming on-CPU graphics controllers capable of delivering PS3-level graphics for PC games running on Intel chipsets.
- This would infringe on the console market (a pro for me) and make more and more titles available for the PC. It would also increase PC game sales tenfold.
- Everyone could afford a gaming PC.
- Game developers could build a graphics preset into their games which allows users to play modern games on Intel IGP chips at decent framerates and visual quality, again making modern PC gaming available to everyone.

Cons:

- The level of game graphics quality could drop or stagnate due to these hardware limitations (let's face it, an Intel X4500 has about the same graphics power as a GeForce4 MX 440).
- High-end graphics companies and their products could become a thing of the past (unlikely, though).
- Intel may decide to INCREASE the price of motherboards/chipsets with integrated graphics, thus getting us nowhere. (Trust in Intel; Intel will take as much of your money as it can.)

Other thoughts:

I'd like to see an Intel integrated graphics solution for the i7, slapped onto the X58 chipset boards, coupled with an i7-940 and 6GB of high-end DDR3 memory (say, 7-7-7-18 1600MHz DDR3), run in 3DMark.

Until now, the onboard solution's main drawback was (and still is) no onboard memory, which means the chip must share with the system. This is very bad for performance. The GPU gets leftover bandwidth and RAM; couple that with low memory speeds (say, DDR2 667MHz) and you get the equivalent of a graphics card with a 250MHz core, a 32- or 64-bit memory interface, and, say, 256MB of memory running at 667MHz. That can be outperformed by an extremely low-end GeForce 7200 PCI-E.

If you slap on some high-performance DDR3 memory, performance should increase. For example: my sister owns an Acer notebook with an integrated nVidia 7000M chipset. Graphics performance out of the box was horrible, unable to play even old games like Black & White 1.
I had an idea and swapped the single 2GB 533MHz module for a pair of Nanya 1GB DDR2-800 CL5 modules. This (I hoped) would deliver faster memory for the onboard GPU and double the memory bus (dual-channel, 128-bit). The 3DMark01 score jumped from 2800 points to 5781 points. Impressive, right? Imagine it reaching ~9000 points with DDR3 memory.
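The theoretical numbers back up that RAM-swap anecdote. A quick back-of-the-envelope calculation (peak bandwidth = transfer rate x 8 bytes per 64-bit channel x channel count; real-world throughput is lower):

```python
def bandwidth_gb_s(mt_per_s: int, channels: int) -> float:
    """Peak theoretical bandwidth of 64-bit DDR channels, in GB/s (decimal)."""
    return mt_per_s * 8 * channels / 1000

# Before: one DDR2-533 module, single channel.
before = bandwidth_gb_s(533, 1)   # ~4.3 GB/s
# After: two DDR2-800 modules, dual channel.
after = bandwidth_gb_s(800, 2)    # 12.8 GB/s

print(f"before: {before:.1f} GB/s, after: {after:.1f} GB/s")
# Roughly 3x the theoretical bandwidth; for a shared-memory IGP that is
# plausibly enough to explain a ~2x jump in 3DMark01.
```

This is only peak theoretical bandwidth, of course; latency, sharing with the CPU, and the IGP's own limits keep the real gain smaller.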

Now, if Intel would sit its corporate butt down and build an onboard chip which could take advantage of the i7's 192-bit memory interface and high-speed DDR3 memory, and also bump up the IGP's clock rate, things could get interesting.

That said, I leave you to talk amongst yourselves.