
Now it is the CGPU's era, goodbye GPU... war is coming...

April 20, 2008 3:26:38 PM

Hello guys... I was just roaming around a bit with this info that Intel is coming out with its Larrabee, which is in fact a combination of CPU and GPU, I mean a CGPU. Now that we have had an unbalanced war between Nvidia and ATI... this is actually what I was looking for. From Nvidia's side, dusting off the old chip each time with a new box, and on the other side ATI's bad recipe for the new food... it just gives me a headache. This time I think we can all hope for true competition with Intel, and let's see what happens. Think, Nvidia & ATI... what have you done to us?

What do you guys think?

April 20, 2008 3:28:51 PM

Hahaha... dude, you've got a nice sense of humour... dunno what thunderman is going to predict this time?
April 20, 2008 4:38:41 PM

I think it's a lot easier to claim to be revolutionizing the graphics industry than it is to actually carry through with the claim. Larrabee isn't due out for quite a while, and just because Intel claims it's going to replace a separate video card doesn't make it so.

Until we see something more substantial than "Integrated graphics are the wave of the future!" I'm not terribly worried. Or impressed by their claims.
April 20, 2008 5:05:01 PM

Pfft, Larrabee. Intel is saying 10x the performance of their current IGPs; that's still lower than current-gen Nvidia cards!
April 20, 2008 5:15:40 PM

tipoo said:
Pfft, Larrabee. Intel is saying 10x the performance of their current IGPs; that's still lower than current-gen Nvidia cards!



Lol, and in 2 years, shouldn't those things be 10x as fast already?
April 20, 2008 5:28:44 PM

Remember in 2005 when Intel was going to have $800, 60" TVs that would wipe out the industry?

...Neither do I.

I'll believe it when I see it.
April 20, 2008 5:29:47 PM

I personally believe that Larrabee is something that will fare well in laptops, but for a desktop environment I feel that a separate card will be the better solution. Hopefully this doesn't do away with GPUs as a whole, which I don't think it will, but alas... I digress!

Best,

3Ball
April 20, 2008 5:36:42 PM

Intel says its IGP will be 10x faster by 2010. So will Nvidia's; so what!

April 20, 2008 7:22:46 PM

tolemi said:
Hello guys... I was just roaming around a bit with this info that Intel is coming out with its Larrabee, which is in fact a combination of CPU and GPU, I mean a CGPU. Now that we have had an unbalanced war between Nvidia and ATI... this is actually what I was looking for. From Nvidia's side, dusting off the old chip each time with a new box, and on the other side ATI's bad recipe for the new food... it just gives me a headache. This time I think we can all hope for true competition with Intel, and let's see what happens. Think, Nvidia & ATI... what have you done to us?

What do you guys think?


I think that this kind of thing is definitely coming; it's been on the cards for a while really.
Whether it's Larrabee, Fusion or Nehalem, we will end up with at least some of the graphics work being done on the CPU.
We have quad cores for the most part only using a fraction of their potential, and Intel has chips running in the labs with many more cores than four. As far as Larrabee goes, I'm a bit confused: some sites/reviews say it's easy to do with little overhead, while others I have seen say it needs seriously awesome CPU power to run it.
Anyone know which is correct?
Mactronix
April 20, 2008 7:50:32 PM

tolemi said:
What do you guys think?
I think this is a non-event. If you want to talk about something that is going to blow the graphics industry wide open and give the CPU manufacturers serious leverage to compete, maybe you should be asking about ray tracing. No, it's not happening tomorrow, but the train is blasting down the tracks. It's going to be a new gaming world.

Real Time Ray-Tracing: The End of Rasterization?
April 20, 2008 8:28:35 PM

I'm not saying that it can't be done on GPUs or PowerPCs, and probably well. I'm just saying that it lends itself well to 8+ core CPUs, and it will change the industry. It's a lot more earth-shaking than the CGPU/Fusion introduction, which I think is primarily for low-power situations.
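
Just to make the "lends itself well to 8+ core CPUs" point concrete: each pixel's primary ray in a ray tracer is computed independently, so a frame splits across however many cores you have with almost no shared state. This is only a toy sketch I knocked together (nothing to do with Intel's or anyone's actual renderer), handing bands of rows to std::thread workers for a simple ray-sphere silhouette:

// Toy illustration: per-pixel ray tracing is embarrassingly parallel.
// Each thread traces a horizontal band of the image; no locking is needed
// because no two threads ever write the same pixel.
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Vec3 { double x, y, z; };

static double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// True if a ray from 'origin' along 'dir' hits a unit sphere at the world origin.
static bool hitsSphere(const Vec3& origin, const Vec3& dir) {
    double b = 2.0 * dot(origin, dir);
    double c = dot(origin, origin) - 1.0;
    return b * b - 4.0 * dot(dir, dir) * c >= 0.0;
}

static void traceRows(int y0, int y1, int width, int height,
                      std::vector<unsigned char>& image) {
    for (int y = y0; y < y1; ++y) {
        for (int x = 0; x < width; ++x) {
            // Primary ray through pixel (x, y) from a camera sitting at z = -3.
            Vec3 origin{0.0, 0.0, -3.0};
            Vec3 dir{(x - width / 2.0) / width, (y - height / 2.0) / height, 1.0};
            image[y * width + x] = hitsSphere(origin, dir) ? 255 : 0;
        }
    }
}

int main() {
    const int width = 640, height = 480;
    std::vector<unsigned char> image(width * height, 0);

    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;

    // One band of rows per core; scaling is close to linear because the
    // bands share nothing but read-only scene data.
    std::vector<std::thread> workers;
    int rowsPerBand = height / static_cast<int>(cores);
    for (unsigned i = 0; i < cores; ++i) {
        int y0 = static_cast<int>(i) * rowsPerBand;
        int y1 = (i + 1 == cores) ? height : y0 + rowsPerBand;
        workers.emplace_back(traceRows, y0, y1, width, height, std::ref(image));
    }
    for (auto& t : workers) t.join();

    std::printf("Rendered %dx%d silhouette on %u threads\n", width, height, cores);
    return 0;
}

The interesting part is what the sketch leaves out: once you add secondary rays, shadows and reflections, the per-pixel work grows but stays just as independent, which is exactly why the many-core CPU crowd likes it.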
April 20, 2008 8:37:24 PM

I think if Fusion or Larrabee comes about, it'll still need a discrete GPU. The way Intel is talking, and all its fanbois, they don't need a super uber GPU to run it. Well, let's just say this: if Larrabee comes in at, say, 50% of today's top cards (and that's being generous), while Intel's top CPUs maintain only a 10 to 20% lead in IPC, or real-world applications, I see a huge disparity between that 20% on the CPU side and the 50% on the GPU side (and that's being conservative). Until Larrabee can close this gap, and assuming Intel does maintain its CPU lead, I see Intel once again in a lowly third place in graphics (and again, that's being generous). That's just pure numbers for now, sure, and we don't really know how these will perform. But there's been stuff out there using CUDA and nVidia cards doing the same thing Larrabee did at IDF, and it wasn't even close. And with the advent of the G200, which looks like it'll be twice as fast as the 9800GTX, Larrabee (at roughly half a 9800GTX) would be sitting at about a quarter of the G200, so it's already at least 75% behind, and losing fast.
April 20, 2008 8:52:48 PM

You think the G200 will be twice as fast? I doubt it... maybe 25-50%.

But the "CGPU", I think it's just more things to go wrong in the development of CPUs.

Stick to what you know.
April 20, 2008 9:04:51 PM

Well, with all this generosity we're seeing about Intel taking over the graphics market, I thought I'd be a bit generous too. But it is possible for the G200 to do this, at least as good as the GTX2. The rumors have been a billion+ transistor chip, so yeah, it's possible.
April 20, 2008 9:20:46 PM

Wow, we're all so sceptical about what Intel has in store.

I myself believe this is just another load of bull, but what if this does happen? What if this new chip actually comes out and opens the door to faster CGPUs? What Intel is doing may be the "foot in the door", so to speak. Once there, Intel may have something brewing in the back, ready to be pushed through the door to create some serious competition.

I personally believe all this GPU/CPU MHz/GHz multiple-core crap is going to be a waste, because soon I believe "laser" processors will be released and the technology will be integrated into GPUs, giving them a massive performance boost without all the heat.

Ugh, all this multiple-core stuff is annoying me... doesn't it annoy you? Instead of creating a whole new way of making CPUs, they just stack CPUs together and hope for the best...

Just like Nvidia with their damn "GX2's" ...

We need a change in technology soon.
April 20, 2008 9:26:07 PM

It depends on usage. In graphics you can see a 1 to 1 performance increase, or close to it, without the software having to be explicitly multithreaded. ATI/AMD's MCM may have us thinking differently about multi-chip GPUs too. There's a lot changing in the graphics arena, let alone the CPU/GPU possibility.
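
As a back-of-envelope illustration of that near 1-to-1 scaling (my own made-up numbers, not anything ATI or Intel has published): if almost all of the frame work is independent per-pixel stuff, Amdahl's law says throughput grows nearly linearly with extra cores or chips, and it only falls off as the serial share grows. A quick C++ sketch:

// Back-of-envelope Amdahl's law: speedup = 1 / (serial + parallel/n).
// With a tiny serial fraction (typical of per-pixel graphics work),
// doubling the hardware roughly doubles throughput.
#include <cstdio>

int main() {
    const double serialFractions[] = {0.01, 0.10, 0.25};  // hypothetical workloads
    for (double s : serialFractions) {
        std::printf("serial fraction %2.0f%%:", s * 100.0);
        for (int n = 1; n <= 8; n *= 2) {
            double speedup = 1.0 / (s + (1.0 - s) / n);
            std::printf("  %d chips -> %.2fx", n, speedup);
        }
        std::printf("\n");
    }
    return 0;
}

With a 1% serial share, 2 chips give about 1.98x and 4 give about 3.88x, which is why graphics can sit so close to that 1-to-1 line; a workload with a 25% serial share tops out well short of it.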
April 21, 2008 2:25:42 AM

sacre said:
I personally believe all this GPU/CPU MHz/GHz multiple-core crap is going to be a waste, because soon I believe "laser" processors will be released and the technology will be integrated into GPUs, giving them a massive performance boost without all the heat.

Ugh, all this multiple-core stuff is annoying me... doesn't it annoy you? Instead of creating a whole new way of making CPUs, they just stack CPUs together and hope for the best...
The "laser" Processors that you are talking about are multiple core. As a matter of fact that is what they are doing with the lasers. They are using the lasers as the I/O in order to have better inter-CPU communications. The commercial availability of true photonic CPUs with capabilities of current CPUs is many years off.

Here's a link for you.
Sun Microsystems Awarded $44 Million Department of Defense Contract to Develop Microchip Interconnect System

A little old but a good read.
Photonic chips go 3D TRN 072804

Near term use of photonic chips
Photonic Chips, The Secret of Future Optical Communications
Photonic Chips

Also old. Just to show that Intel has its fingers in the pie.
BetaNews | Intel Builds New Laser Based Processor

Go have a drink, you seem a little stressed.
April 21, 2008 3:57:35 AM

Intel doesn't have the success in the graphics department that ATI and NV do.

The Chrome 430 GT will pwn Larrabee.
April 21, 2008 4:05:42 AM

Currently, Chrome is about, what, 5x faster than anything Intel has ever made?
April 21, 2008 4:07:00 AM

Haha, and by the time Larrabee comes out, there will be a newer Chrome too.
April 21, 2008 4:09:00 AM

I'm thinking Chrome just may be a major competitor with Intel, at least against the first gen anyway.
April 21, 2008 4:40:45 AM

Since VIA has their sticky fingers all over it, I'm a little wary.

Or have you guys forgotten already?
April 21, 2008 4:49:37 AM

Well, there IS Isaiah; it's not like they're just dead.
April 21, 2008 5:05:33 AM

I never said they were dead, I said they are VIA. :lol: 

You never dealt with the VIA chipsets?

They work, but they don't necessarily inspire confidence.
April 21, 2008 5:14:26 AM

LOL, I like that: they don't inspire confidence. Good one.
April 21, 2008 5:22:24 AM

Don't you agree?
April 21, 2008 5:33:59 AM

Yes, their chipsets are trouble, thus my response. Chrome, for what it does, as well as Isaiah, for what it'll do, doesn't look that bad though.
April 21, 2008 5:52:14 AM

Yeah, it always looks good on paper. We don't really know what is in the minds of the guys at Intel. Only time will tell, the rest is just conjecture.
April 21, 2008 8:05:34 AM

I was reading up on just exactly what Larrabee is, and I ran across a couple of old articles on Ars Technica. Like I said these are old, but probably not too far off the mark.

Anyway, it appears that Larrabee is going to be a hybrid raster and ray-tracing GPU: primarily raster, but with some ray-tracing capability as well.

I guess this is where Intel gets its foot in the door with ray tracing, so apparently Larrabee is a big deal. I guess we will see where it takes us.


Clearing up the confusion over Intel's Larrabee

Clearing up the confusion over Intel's Larrabee, part II
April 21, 2008 1:27:59 PM

I would like to think this is the way of the future; it would make sense for laptops and big-box computers (like from Best Buy, etc.). But what happens when your CPU's graphical abilities become weak? Do you replace the whole CPU? That seems pretty stupid. For this reason alone, these CGPUs will never take off.
April 21, 2008 3:05:37 PM

It's still not clear, to me at least, how this whole thing is going to come together. We still need to wait for a while until things start to take shape. I don't think the initial GPU is going to be anything stunning, but they are going in the right direction IMO. But what do I know? Right, not much.
April 21, 2008 4:29:07 PM

Zorg said:
It's still not clear, to me at least, how this whole thing is going to come together. We still need to wait for a while until things start to take shape. I don't think the initial GPU is going to be anything stunning, but they are going in the right direction IMO. But what do I know? Right, not much.


Thanks for the links Zorg,
That's cleared a few things up. So as I see it, it's no big shakes: it can only do simple straight-on ray tracing at the minute, with the goal being able to do global illumination. :hello: DX10.1 already does this, doesn't it? The whole technology isn't as programmable or versatile regarding effects etc. as programmable shaders are.
I can see it working as a composite card, with rasterisation still having a place. To be honest, it seems to me to be like what my understanding of ATI's new R700 will be (minus the ray tracing, obviously), namely many scalable cores (eventually).
Mactronix
PS: most of the above is what I gathered from Zorg's links and this one: http://www.hothardware.com/Articles/Intel_Showcases_Dun...
April 21, 2008 4:45:16 PM

In all honesty, I'm thinking it's more a way to help eliminate the IGP. It will be superior by far to an IGP with a discrete card, but we have that already with both AMD and nVidia, being able to use discrete and IGP together. If you've noticed, both ATI and nVidia have, to an extent, ramped their IGPs up, and will continue to do so. CPU-oriented people, and some here in graphics, don't see the importance of the better IGPs, but when it comes to gaming, it has a huge seeding effect for first-time buyers/gamers on their brand new rigs. Even at their own game, ray tracing on a CPU/GPU, Intel loses. The only way Intel will be competitive is if they actually make a GPU that is. Otherwise, all this ray tracing just may be overrun by lowly IGPs with discrete add-ins, using CUDA or something similar.
April 21, 2008 5:09:33 PM

Either way, I hope ATI and Nvidia survive all this... but it could be a great thing. If Intel can push out a graphics processor that is at least semi-powerful and low cost, that could help the PC gaming market. Joe Average could go and play a game and get, say, Xbox 360 graphics on his PC, decide he likes PC gaming better, tell his friends, etc. Who knows, maybe it'll help pull the PC gaming industry out of the hole and create competition = better prices for us (hopefully).
April 21, 2008 5:19:27 PM

True, competition is good. It's just the way Intel slammed the door open with this whole thing, and what some of their people have said (no longer needing rasterization). That's their mistake, and all the AIBs, and M$ et al., won't like this "new" direction controlled by Intel.
April 21, 2008 5:26:31 PM

You know as well as I do, though, AMD/ATI are already making their own chipsets, and Nvidia is too. I don't expect them to take it lying down. Here's an interesting thought... if Intel can indeed do what they claim and make these graphics solutions as fast as they say... does anyone else think the idea of Nvidia and AMD/ATI getting together to try and crush Intel could be a possibility? That would be a nice fight to see.
April 21, 2008 5:27:34 PM

ohiou_grad_06 said:
Either way, I hope ATI and Nvidia survive all this... but it could be a great thing. If Intel can push out a graphics processor that is at least semi-powerful and low cost, that could help the PC gaming market. Joe Average could go and play a game and get, say, Xbox 360 graphics on his PC, decide he likes PC gaming better, tell his friends, etc. Who knows, maybe it'll help pull the PC gaming industry out of the hole and create competition = better prices for us (hopefully).


I think it's really premature talking about ATI and Nvidia surviving anything. First off, it's going to be great for us end users, as it's a different approach to the same solution that will bring about competition and hopefully force a price war.
I was really impressed with the latest IGPs, and it would be a shame if Intel took the route of refusing to put other companies' graphics solutions on their boards.
I'm not really up on whose chips go on whose boards; I know AMD/ATI, but where does Nvidia stand on this?
Can anyone see Intel closing up shop and only supplying Intel chipsets with their own cards/IGPs on? It wouldn't take much, and if they do, I can't see them losing. Hey, they could even float the idea after these Larrabees have debuted, provided they perform OK. Or to put it another way, how much crap would it take for you to not want a Nehalem chip in your PC?
mactronix
April 21, 2008 5:39:24 PM

I don't want one anyway. I know they are a little slower on things, but I'm still for old AMD. Been using their stuff and building with it for 10 years or so. No reason to change it.
April 21, 2008 5:43:29 PM

ohiou_grad_06 said:
I don't want one anyway. I know they are a little slower on things, but I'm still for old AMD. Been using their stuff and building with it for 10 years or so. No reason to change it.


Agreed, I have been using them in my systems also for a while now, but my next system will have Intel in it. Sure, for games there isn't much in it, but when you take video encoding and other CPU-intensive things into account, it just sways me that way. I guess it's down to what you need. :) 
mactronix
April 21, 2008 5:49:53 PM

Even so, a 10 to 20% difference in CPUs can be trounced in gaming with a decent GPU from ATI or nVidia. And how far will these apps coming from CUDA and the like go? There's tons of potential without having the uber CPU. And somehow I think Intel's going to find that out.
April 21, 2008 11:04:26 PM

You know everyone here wants the uber CPU anyway. Ray tracing is in its infancy, so current comparisons are of no value. It may take off, it may not, but I hope so. I'm certainly no expert, but it appears to have real promise. In either case it's got ATI and Nvidia scrambling.

If Intel tries some strong-arm technique of locking down the boards, and I don't think they will, then that would be a big mistake.
April 21, 2008 11:07:38 PM

You mean incentives?
April 21, 2008 11:48:58 PM

What?
April 22, 2008 12:26:21 AM

Incentives, to the AIBs
April 22, 2008 12:37:17 AM

Zorg said:
I never said they were dead, I said they are VIA. :lol: 

You never dealt with the VIA chipsets?

They work, but they don't necessarily inspire confidence.



Unfortunately, I have...and I do agree....they are worthless chipsets.

Zorg said:
Yeah, it always looks good on paper. We don't really know what is in the minds of the guys at Intel. Only time will tell, the rest is just conjecture.


Time and money...I am curious to see how far Intel's deep pockets will get them on this one.
April 22, 2008 1:10:25 AM

JAYDEEJOHN said:
Incentives, to the AIBs
I still don't get what you are saying. I'm talking about pulling some kind of monopoly play that will leave the AIBs high and dry, like the recent 7xx problem with the 45nm CPUs, except with the VGAs instead. I don't think Intel is that stupid though.
April 22, 2008 2:51:09 AM

I see what you're saying. What I was saying was, give an AIB an incentive to sell/make their boards over any others.
April 22, 2008 7:31:40 AM

JAYDEEJOHN said:
Even so, a 10 to 20% difference in CPUs can be trounced in gaming with a decent GPU from ATI or nVidia. And how far will these apps coming from CUDA and the like go? There's tons of potential without having the uber CPU. And somehow I think Intel's going to find that out.


I'm not sure what you are trying to say here, because at first glance it seems you are saying that somehow a good GPU can make a mediocre CPU overclock itself or something?
Sure, at the top end of things CPU-wise the difference isn't going to be noticed by most in games, and of course it all depends on what you are doing/playing. Take the flight sim games; there are plenty more out there that, to a lesser degree, rely on the CPU quite heavily. Taken on a price-for-price basis, the equivalent Intel chip will be better/faster than its AMD rival, and that's before overclocking is taken into account. It's the mainstream where companies make the money, and these are the people who are more likely to be using systems that will run into performance restrictions.
Mactronix



April 22, 2008 10:35:56 AM

Jaevric said:

Until we see something more substantial than "Integrated graphics are the wave of the future!" I'm not terribly worried. Or impressed by their claims.


I clearly remember an Intel spokesman claiming that upcoming Intel integrated graphics on the motherboard would make graphics cards obsolete. This was in the '90s, just before Intel's first IGP. It didn't happen that way. Intel has hopes for ray tracing vs. rasterization for their discrete GPUs to compete against Nvidia and AMD. We'll see how that works.

For the most part, AMD has a better concept of integrating a GPU core into a CPU, but it's still aimed at notebooks and business PCs. Their Swift is supposed to be a multi-core (3 or 4) CPU with a single GPU core based on the R770. It might work alongside a 780G or higher IGP and a low-end discrete GPU in hybrid Crossfire, or it might replace the motherboard's IGP.

At any rate, both Nvidia and AMD provide hybrid SLI or hybrid Crossfire for the masses, while Intel is primarily hoping to compete in the notebook market. Most OEM PCs have PCIe x16, so people stuck with Intel IGPs upgrade anyway, whereas Nvidia and AMD provide a choice. Intel would be wise to follow suit, with their discrete GPUs working alongside Intel IGPs or CPU/GPUs.

Nvidia and AMD are showing that hybrid graphics work. It's up to Intel to show that their claims aren't just hype during development. In my experience, Intel makes good CPUs, but their attempts at graphics really miss the mark and need even more improvement than AMD's CPU division. :lol: 
April 22, 2008 12:02:33 PM

Jaevric said:

Until we see something more substantial than "Integrated graphics are the wave of the future!" I'm not terribly worried. Or impressed by their claims.


Yeah, look at what Intel gives everyone in their standard new desktop PC: the "Intel Graphics Accelerator 945" integrated chip. Intel needs some work in the graphics industry.