Info on Intel's future GPU

cryogenic

Distinguished
Jul 10, 2006
449
1
18,780
http://www.theinq.com/default.aspx?article=37548

Sounds interesting. We might see a great leap in graphics quality if it is as good as it sounds.

Actually, it is not as good as it sounds; putting x86 cores to work on graphics processing simply isn't the best idea out there, because x86 isn't the best architecture for stream processing.

Also, the Inq didn't give any FLOPS info or estimates, so my guess is that this architecture will struggle to match mainstream video card performance by 2009.

One thing it might excel at could be non-real-time raytrace rendering...
 

Harrisson

Distinguished
Jan 3, 2007
506
0
18,990
Tbh it's good news, even if the first Larrabee will be slower than G90 and R700 (as I'm expecting). Intel has the potential to catch up with the big names in discrete graphics if they invest enough money and manpower. Competition would increase, prices would drop - all the better for us :p

On the other hand, for some reason I doubt Intel will invest as much as is required to catch up with nvidia/ATI, since they are many years behind... I'm afraid it will end up as another VIA Chrome clone, and the program will be shafted like it was with the i740.
 

MarkG

Distinguished
Oct 13, 2004
841
0
19,010
From the article:

Both sides will pull the functionality into the core itself, and GPUs will cease to be. Now do you see why Nvidia is dead?

Yeah, because everyone, just everyone, wants to have to upgrade their $1000 CPU every time they need faster graphics. Not to mention being crippled by a slow CPU-to-memory bus rather than a GPU-to-memory bus with vastly more bandwidth.

CPUs and GPUs have such different requirements that I can't see how anyone other than CPU manufacturers believes that merging the two chips together would be a good thing. At the low end, sure, stick a cheap GPU into a cheap CPU and save some manufacturing costs on something that will never be upgraded... but not for the rest of us.
 

cxl

Distinguished
Mar 16, 2006
200
0
18,680
http://www.theinq.com/default.aspx?article=37548

Sounds interesting. We might see a great leap in graphics quality if it is as good as it sounds.

Actually, it is not as good as it sounds; putting x86 cores to work on graphics processing simply isn't the best idea out there, because x86 isn't the best architecture for stream processing.


I guess you can expect specific SSE extensions...
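
For a sense of what that could build on, here is a minimal sketch of the 4-wide packed-float math SSE already does today (hypothetical C, made-up pixel data; a graphics-oriented extension would presumably widen and extend this kind of thing):

/* Hypothetical sketch: modulating four pixels at once with the packed
   single-precision SSE ops x86 already has. Pixel values are made up. */
#include <xmmintrin.h>
#include <stdio.h>

int main(void)
{
    float red[4]   = { 0.2f, 0.5f, 0.8f, 1.0f };  /* four pixels' red channel */
    float light[4] = { 0.9f, 0.7f, 0.5f, 0.3f };  /* per-pixel light factor   */
    float out[4];

    __m128 r = _mm_loadu_ps(red);             /* load 4 floats into one register */
    __m128 l = _mm_loadu_ps(light);
    _mm_storeu_ps(out, _mm_mul_ps(r, l));     /* 4 multiplies in one instruction */

    for (int i = 0; i < 4; i++)
        printf("pixel %d: %.2f\n", i, out[i]);
    return 0;
}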

One thing it might excel at could be non-real-time raytrace rendering...

Actually, I believe this is targeted at *real-time* raytracing. Expect the return of software rendering.
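
To give an idea of what software rendering boils down to, here is a bare-bones sketch of the core test a ray tracer runs per pixel (hypothetical C; just one primary ray against one sphere, no shading or secondary rays):

/* Hypothetical sketch: the per-pixel core of a software ray tracer --
   one primary ray tested against one sphere. A real-time renderer would
   run millions of these (plus shading, reflections, etc.) every frame. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Returns 1 and writes the hit distance to *t if the ray (origin o,
   direction d) hits a sphere of radius r centred at c; 0 otherwise. */
static int ray_sphere(Vec3 o, Vec3 d, Vec3 c, double r, double *t)
{
    Vec3 oc = { o.x - c.x, o.y - c.y, o.z - c.z };
    double a = dot(d, d);
    double b = 2.0 * dot(oc, d);
    double k = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * a * k;
    if (disc < 0.0) return 0;            /* ray misses the sphere */
    *t = (-b - sqrt(disc)) / (2.0 * a);  /* nearest intersection  */
    return *t > 0.0;
}

int main(void)
{
    Vec3 eye    = { 0, 0, 0 };
    Vec3 dir    = { 0, 0, 1 };   /* ray straight down +z      */
    Vec3 centre = { 0, 0, 5 };   /* unit sphere 5 units ahead */
    double t;
    if (ray_sphere(eye, dir, centre, 1.0, &t))
        printf("hit at distance %.2f\n", t);   /* prints 4.00 */
    return 0;
}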

Mirek
 

Pippero

Distinguished
May 26, 2006
594
0
18,980
It sounds more like a competitor to Cell and Fusion than to nVidia.
There was a time when rendering was entirely done by the CPU.
Then GPUs came and showed that having custom chips specifically designed to do the job would give better performance.
I'm still very skeptical that these hybrid CPUs with extended vector capabilities could make GPUs obsolete.
 

Lacostiade

Distinguished
Mar 8, 2006
101
0
18,680
From the article:

Both sides will pull the functionality into the core itself, and GPUs will cease to be. Now do you see why Nvidia is dead?

Yeah, because everyone, just everyone, wants to have to upgrade their $1000 CPU every time they need faster graphics. Not to mention being crippled by a slow CPU-to-memory bus rather than a GPU-to-memory bus with vastly more bandwidth.

CPUs and GPUs have such different requirements that I can't see how anyone other than CPU manufacturers believes that merging the two chips together would be a good thing. At the low end, sure, stick a cheap GPU into a cheap CPU and save some manufacturing costs on something that will never be upgraded... but not for the rest of us.

Well, you're talking about the current situation. He isn't speaking about what will happen in two years' time (i.e. Fusion on die), but about maybe 5-6 years in the future, when Fusion will evolve into an on-core CPU/GPU initiative. One day we won't be able to find a reason to buy a discrete gfx card; integrated gfx will be enough for all of us. Believe it or not, it will happen!

See this and read till the end. http://arstechnica.com/news.ars/post/20061119-8250.html
 

MarkG

Distinguished
Oct 13, 2004
841
0
19,010
One day we won't be able to find a reason to buy a discrete gfx card; integrated gfx will be enough for all of us.

Again: why would you want to be forced to upgrade both CPU and GPU at the same time when you could just upgrade one? Obviously that means more $$$$ for CPU companies, but where's the benefit to the end user?

And with graphics, no amount of power will ever be enough. There's always more you can do to make them more realistic and complex.
 
One day we won't be able to find a reason to buy a discrete gfx card; integrated gfx will be enough for all of us. Believe it or not, it will happen!

Sure, for the average non-gamer. But gamers, who typically want the best performance, will always opt for a discrete GPU. Combining the CPU & GPU will create a price barrier for people who want the best graphics they can buy. Most games right now are GPU-limited, not CPU-limited.

Take the following as an example:

Just for the sake of argument, suppose Intel and nVidia were one company that designs combo CPUs/GPUs. Should there be several different versions of combo chips to fill all the market segments? For example:

E6300/7300LE
E6300/7300GT
E6400/7600GS
E6400/7600GT
E6600/7900GS
E6600/7900GT
E6300/8800GTS
E6300/8800GTX
E6600/7300LE
E6600/7300GT
Etc.

I think you start to get the picture. If Intel/nVidia were to design combo chips to fill all or most of the different market segments, it would have to produce too many product lines. Four different CPUs and six different GPUs of various performance levels means that Intel/nVidia would need to design 24 different CPU/GPU combos to cover all possible market segments.
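
Just to spell that count out, a throwaway sketch (hypothetical C, using four CPUs and six GPUs as in the argument above):

/* Hypothetical sketch: pairing every CPU with every GPU gives
   cpus x gpus distinct SKUs -- 4 x 6 = 24 here. Part names are
   just examples from the lists above. */
#include <stdio.h>

int main(void)
{
    const char *cpus[] = { "E6300", "E6400", "E6600", "E6700" };
    const char *gpus[] = { "7300LE", "7300GT", "7600GT",
                           "7900GS", "7900GT", "8800GTS" };
    int ncpu = sizeof cpus / sizeof cpus[0];
    int ngpu = sizeof gpus / sizeof gpus[0];

    for (int i = 0; i < ncpu; i++)
        for (int j = 0; j < ngpu; j++)
            printf("%s/%s\n", cpus[i], gpus[j]);

    printf("total SKUs: %d\n", ncpu * ngpu);   /* prints 24 */
    return 0;
}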

A simpler solution would be to match a weak CPU to a weak GPU, and more powerful CPUs to more powerful GPUs. The combo chips may look something like the following:

E6300/7300GT
E6400/7600GT
E6600/7900GT
E6700/8800GTS
X6800/8800GTX

In the above example Intel/nVidia would need to produce fewer combo chips, but it's to the disadvantage of the consumer. A hardcore gamer who wants the best possible GPU would be forced to dish out big $$$$ for the X6800/8800GTX combo chip. That'll be a powerful chip, but the X6800 part will go to waste. A person who wants to buy a powerful CPU for a very large database would also be forced to buy the X6800/8800GTX combo chip. In this case the 8800GTX part will go to waste, since a powerful GPU core has no effect on the performance of a database program.
 

cryogenic

Distinguished
Jul 10, 2006
449
1
18,780
http://www.theinq.com/default.aspx?article=37548

Sounds interesting. We might see a great leap in graphics quality if it is as good as it sounds.

Actually, it is not as good as it sounds; putting x86 cores to work on graphics processing simply isn't the best idea out there, because x86 isn't the best architecture for stream processing.


I guess you can expect specific SSE extensions...

One thing it might excel at could be non-real-time raytrace rendering...

Actually, I believe this is targeted at *real-time* raytracing. Expect the return of software rendering.

Mirek

All I can say is *lol*. Cell is barely able to raytrace at low resolution without any reflections, refractions, global illumination, radiosity, HDR, antialiasing, etc... not to mention subsurface scattering and volumes... the hardware that can do real-time complex raytracing has a very long way to go. It will come eventually, but this Intel chip will not be it.
 

Harrisson

Distinguished
Jan 3, 2007
506
0
18,990
A simpler solution would be to match a weak CPU to a weak GPU, and more powerful CPUs to more powerful GPUs. The combo chips may look something like the following:

E6300/7300GT
E6400/7600GT
E6600/7900GT
E6700/8800GTS
X6800/8800GTX

In the above example Intel/nVidia would need to produce fewer combo chips, but it's to the disadvantage of the consumer. A hardcore gamer who wants the best possible GPU would be forced to dish out big $$$$ for the X6800/8800GTX combo chip. That'll be a powerful chip, but the X6800 part will go to waste. A person who wants to buy a powerful CPU for a very large database would also be forced to buy the X6800/8800GTX combo chip. In this case the 8800GTX part will go to waste, since a powerful GPU core has no effect on the performance of a database program.
Actually, weak CPU/weak GPU, etc. is exactly how it will be done, IMO. In notebooks it's not like you can upgrade the CPU/GPU now anyway. In desktops it would work both to the advantage and disadvantage of consumers. Also, I'm sure there will be both combo and separate chips, so you can go custom with any combination you like, including powerful CPUs with the weakest integrated graphics for workstations, etc.

Another way to do it is shown by AMD: a GPU of your choice can be inserted into a "socket" which is connected to the CPU. Not sure how it would translate into gaming performance, but interesting nonetheless.
 

Lacostiade

Distinguished
Mar 8, 2006
101
0
18,680
Now let me point something out. A CPU/GPU combo will benefit gamers. Why? Look at this: http://www.graphicshardware.org/presentations/pharr-keynote-gh06.pdf

It's an interesting paper, and if you analyze it properly, you will find that CPU/GPU combos would allow for high CPU-to-GPU bandwidth (duh!), which, when utilized, would allow PCs to step into the era of programmable graphics. As for GPU sockets, there was an interview where Bob Drebin (a guy from AMD's gfx division) said that a socketed GPU would need different programming models. I'm sure that somewhere inside AMD, Intel and nvidia there is a group working on an API that would utilize such CPU/GPU combos.

How AMD/nvidia/intel/via would market such products remains a mystery, but I'm sure they would find a solution for the product-line problems you guys mentioned.
 

Major_Spittle

Distinguished
Nov 17, 2006
459
0
18,780
There is a lot of money in the GPU market, and Intel is not getting it with IGCs right now. I would bet that by the middle of next year Intel will start giving AMD and NVidia some competition in the gamer GPU market.

Intel is big enough that it can change the rules if it is losing, and it looks like it is going to change the rules of how video rendering is done to benefit itself and gamers.
 

mazzapan

Distinguished
Jan 29, 2007
32
0
18,530
http://www.theinq.com/default.aspx?article=37548

Sounds interesting. We might see a great leap in graphics quality if it is as good as it sounds.

Give me proof, validation and verification from a neutral third party... hacked Photoshop JPEGs don't count. Verified CPU-Z, plz!
 

Lacostiade

Distinguished
Mar 8, 2006
101
0
18,680
There is a lot of money in the GPU market, and Intel is not getting it with IGCs right now. I would bet that by the middle of next year Intel will start giving AMD and NVidia some competition in the gamer GPU market.

Intel is big enough that it can change the rules if it is losing, and it looks like it is going to change the rules of how video rendering is done to benefit itself and gamers.

IGCs - integrated gfx cards, you mean?

Intel is making lots of money out of them. Someone (can't remember who he was) once accused Intel of being a hurdle to gaming: most people use integrated gfx, and Intel's isn't good for gaming (neither is AMD's, nvidia's or ATI's).
 

Major_Spittle

Distinguished
Nov 17, 2006
459
0
18,780
IGC = Integrated graphics chips

"There is a lot of money in the GPU market and Intel is not getting it" = they have no performance graphics solutions right now.
 

WR

Distinguished
Jul 18, 2006
603
0
18,980
FPUs used to be on separate cards. Then CPU makers started integrating them, and nowadays no one really buys accelerator cards even though they're years faster... except HPC people who can afford thousands for a single card. Consumers exhibit little interest in technical computing, and even the uptake of Ageia PhysX by enthusiasts seems to be lukewarm at best.

Graphics so far, though, is different. So many people pay extra for graphics cards to play games at adequate to fluid frame rates because it is a form of entertainment. Will there be a day when game developers stop catering to discrete graphics users and say integrated is good enough and easier to market to?
 

killz86

Distinguished
Dec 8, 2005
403
0
18,780
From what I have seen and heard for a long time, the Inq is wrong a lot of the time and only very rarely right. Not trying to flame or anything like that, but that's what I have seen over the years on this forum. Plus, I don't see why companies don't stick with what they do best. If Intel wants to make GPUs, why not just make an agreement with Nvidia, like ATI/AMD did? I mean, it would be better, plus they would make more profit in the end.

Woot, this will be my 100th post lol 8O
 

dragonsprayer

Splendid
Jan 3, 2007
3,809
0
22,780
This is no surprise - this is C2D #2. Intel is doing what they did for C2D:

They are going to reinvent what AMD and nvidia do - why do you think they did not buy ATI, and why do you think they couldn't care less about nvidia?

This is the next step in total Intel domination! Next will be an OS!

IFB!

News! Flash forward to 2010: "Intel releases 248-shader DX11 integrated GPU solution motherboard!"

AMD-ites should be wary.

----------- :evil: -------------------- :twisted: ------------

From what I have seen and heard for a long time, the Inq is wrong a lot of the time and only very rarely right. Not trying to flame or anything like that, but that's what I have seen over the years on this forum. Plus, I don't see why companies don't stick with what they do best. If Intel wants to make GPUs, why not just make an agreement with Nvidia, like ATI/AMD did? I mean, it would be better, plus they would make more profit in the end.

Woot, this will be my 100th post lol 8O

Are you praying they will fail?