Now it's the CGPU's era, goodbye GPU... a war is coming.

tolemi

Distinguished
Oct 20, 2006
34
0
18,530
Hello guys... I was just roaming around a bit and came across the news that Intel is coming out with Larrabee, which is in fact a combination of CPU and GPU, a CGPU. We've had an unbalanced war between Nvidia and ATI for a while now, and this is actually what I was looking for. On Nvidia's side, dusting off the same old chip each time and putting it in a new box; on ATI's side, a bad recipe for every new dish... it all just gives me a headache. This time I think we can all hope for some true competition with Intel, so let's see what happens. Think about it, Nvidia and ATI... what have you done to us?

What do you guys think?

 

Jaevric

Distinguished
Mar 23, 2008
66
0
18,640
I think it's a lot easier to claim to be revolutionizing the graphics industry than it is to actually carry through with the claim. Larrabee isn't due out for quite a while, and just because Intel claims it's going to replace a separate video card doesn't make it so.

Until we see something more substantial than "Integrated graphics are the wave of the future!" I'm not terribly worried, or impressed by their claims.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
Lol, and in 2 years, shouldn't those things be 10x as fast already?
 

kenratboy

Distinguished
Feb 13, 2007
160
0
18,680
Remember in 2005 when Intel was going to have $800, 60" TVs that would wipe out the industry?

...Neither do I.

I'll believe it when I see it.
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790
I personally believe that Larrabee is something that will fare well in laptops, but for a desktop environment I feel that a separate card will be the better solution. Hopefully this doesn't do away with GPUs as a whole, which I don't think it will, but alas... I digress!

Best,

3Ball
 

tipoo

Distinguished
May 4, 2006
1,183
0
19,280
Intel says its IGP will be 10x faster by 2010. Nvidia's will be too, so what!

[Image: intel10xspeed.jpg]
 


I think this kind of thing is definitely coming; it's been on the cards for a while, really.
Whether it's Larrabee, Fusion, or Nehalem, we'll end up with at least some of the graphics work being done on the CPU.
We have quad cores mostly using only a fraction of their potential, and Intel has chips running in the labs with many more than 4 cores. As far as Larrabee goes, I'm a bit confused: some sites/reviews say it's easy to do with little overhead, while others I've seen say it needs seriously awesome CPU power to run it.
Anyone know which is correct?
Mactronix
 

Zorg

Splendid
May 31, 2004
6,732
0
25,790
I think this is a non-event. If you want to talk about something that is going to blow the graphics industry wide open and give the CPU manufacturers serious leverage to compete, maybe you should be asking about ray tracing. No, it's not happening tomorrow, but the train is blasting down the tracks. It's going to be a new gaming world.

Real Time Ray-Tracing: The End of Rasterization?
 

Zorg

Splendid
May 31, 2004
6,732
0
25,790
I'm not saying that it can't be done on GPUs or PowerPCs, and probably well. I'm just saying that it lends itself well to 8 core+ CPUs, and it will change the industry. It's a lot more earth shaking than the CGPU/Fusion introduction, which I think is primarily for low power situations.
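
To make that concrete, here's a minimal sketch in C++ (my own toy example with a hypothetical one-sphere scene; it assumes nothing about Intel's or anyone's actual implementation) of why ray tracing maps so well onto 8-core-plus CPUs: every primary ray is independent, so each thread can render its own band of rows with no locking at all.

#include <cstdio>
#include <thread>
#include <vector>

struct Vec { double x, y, z; };
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Shade one pixel: fire a ray from the origin through the image plane
// and test it against a single hard-coded unit sphere at (0, 0, -3).
static unsigned char shade(int px, int py, int w, int h) {
    Vec d = { (px - w / 2.0) / h, (py - h / 2.0) / h, -1.0 };  // ray direction
    Vec c = { 0.0, 0.0, -3.0 };                                // sphere center
    double b = dot(d, c);
    double disc = b * b - dot(d, d) * (dot(c, c) - 1.0);       // ray-sphere discriminant
    return disc > 0.0 ? 255 : 0;                               // white where the ray hits
}

int main() {
    const int w = 256, h = 256;
    std::vector<unsigned char> img(w * h);
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;  // fall back if the core count is unknown
    std::vector<std::thread> pool;
    // Each thread renders its own interleaved set of rows. No pixel is
    // shared between threads, so there is no locking, and the work
    // scales almost linearly with the core count.
    for (unsigned t = 0; t < n; ++t)
        pool.emplace_back([&img, w, h, n, t] {
            for (int y = (int)t; y < h; y += (int)n)
                for (int x = 0; x < w; ++x)
                    img[y * w + x] = shade(x, y, w, h);
        });
    for (auto& th : pool) th.join();
    std::printf("P5\n%d %d\n255\n", w, h);           // binary PGM header
    std::fwrite(img.data(), 1, img.size(), stdout);  // pixel data
    return 0;
}

Rasterization, by contrast, funnels everything through shared state like the depth buffer, which is a big part of why ray tracing keeps coming up whenever many-core CPUs are discussed.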
 
I think that even if Fusion or Larrabee comes about, it'll still need a discrete GPU. The way Intel and all its fanboys are talking, you won't need a super uber GPU alongside it. Well, let's put it this way: say Larrabee comes in at 50% of today's top cards (and that's being generous), while Intel's top CPUs maintain only a 10 to 20% lead in IPC, or real-world performance. I see a huge disparity between a 20% lead on the CPU side and a 50% deficit on the GPU side (and that's being conservative). Until Larrabee can close that gap, and assuming Intel does maintain its CPU lead, I see Intel once again in a lowly third place in graphics (and again, that's being generous).

That's just pure numbers for now, sure, and we don't really know how these will perform. But there's been stuff out there using CUDA and Nvidia cards doing the same thing Larrabee showed at IDF, and it wasn't even close. And with the advent of the G200, which looks like it'll be twice as fast as the 9800GTX, the math gets worse: half a 9800GTX against double a 9800GTX leaves Larrabee at a quarter of the G200's speed, already at least 75% behind, and losing fast.
 

RizzyWho

Distinguished
Aug 19, 2006
124
0
18,680
You think the G200 will be twice as fast? I doubt it... maybe 25-50%.

As for the "CGPU", I think it's just more things to go wrong in the development of CPUs.

Stick to what you know.
 
Well, with all this generosity we're seeing about Intel taking over the graphics market, I thought I'd be a bit generous too. But it is possible for the G200 to do this, or at least match the GX2. The rumors point to a billion-plus transistor chip, so yeah, it's possible.
 

sacre

Distinguished
Jul 13, 2006
379
0
18,780
Wow, we're all so skeptical about what Intel has in store.

I myself believe this is just another load of bull, but what if it does happen? What if this new chip actually comes out and opens the door to faster CGPUs? What Intel is doing may be the "foot in the door", so to speak. Once there, Intel may have something brewing in the back, ready to be pushed through the door to create some serious competition.

I personally believe all this GPU/CPU MHz/GHz multiple-core stuff is going to be a waste, because I believe "laser" processors will be released soon and the technology will be integrated into GPUs, giving them a massive performance boost without all the heat.

Ugh, all this multi-core stuff is annoying me... doesn't it annoy you? Instead of creating a whole new way of making CPUs, they just stack CPUs together and hope for the best...

Just like Nvidia with their damn "GX2s"...

We need a change in technology soon.
 
It depends on usage. In graphics you can get close to a 1:1 performance increase from extra hardware without any multithreading effort from developers. ATI/AMD's MCM approach may have us thinking differently about multi-chip GPUs too. There's a lot changing in the graphics arena, let alone the CPU/GPU possibility.
 

Zorg

Splendid
May 31, 2004
6,732
0
25,790
The "laser" processors you're talking about are multi-core. As a matter of fact, that is exactly what they are doing with the lasers: using them as the I/O in order to get better inter-CPU communication. Commercial availability of true photonic CPUs with the capabilities of current CPUs is many years off.

Here's a link for you.
Sun Microsystems Awarded $44 Million Department of Defense Contract to Develop Microchip Interconnect System

A little old but a good read.
Photonic chips go 3D TRN 072804

Near term use of photonic chips
Photonic Chips, The Secret of Future Optical Communications
Photonic Chips

Also old. Just to show that Intel has its fingers in the pie.
BetaNews | Intel Builds New Laser Based Processor

Go have a drink, you seem a little stressed.
 

Zorg

Splendid
May 31, 2004
6,732
0
25,790
I never said they were dead, I said they are VIA. :lol:

You never dealt with the VIA chipsets?

They work, but they don't necessarily inspire confidence.