
GPU dead? I think not!!!!

May 24, 2008 6:48:05 AM

Having a Gelsinger moment, Intel? Intel started this foolishness by announcing the GPU dead. Of course everyone else points at nVidia's response, but how about this response? http://www.tgdaily.com/content/view/37611/140/ The position Intel has taken is foolish. They expound about the GPU being dead, yet here (my link) we see the CPU getting trounced. And this is only the beginning. Also, Intel claims the GPU is dead, and then goes on to make what? GPUs!!! Stop with the foolish comments, Intel. A lot of us aren't buying it. And who knows? Maybe someday they'll actually put a CPU on a GPU. My point is, anyone who accepts this statement by Intel and then goes on about nVidia's response had better rethink their opinion. Gelsinger said that after all these years the way we do 3D is obsolete, yet others are using new ways to put those "obsolete" GPUs to work, and creaming quads in the process. Like Rob Enderle said, Intel is missing the mark on this one, and they should know better.


May 24, 2008 6:59:54 AM

JAYDEEJOHN said:
Having a Gelsinger moment, Intel? Intel started this foolishness by announcing the GPU dead. Of course everyone else points at nVidia's response, but how about this response? http://www.tgdaily.com/content/view/37611/140/ The position Intel has taken is foolish. They expound about the GPU being dead, yet here (my link) we see the CPU getting trounced. And this is only the beginning. Also, Intel claims the GPU is dead, and then goes on to make what? GPUs!!! Stop with the foolish comments, Intel. A lot of us aren't buying it. And who knows? Maybe someday they'll actually put a CPU on a GPU. My point is, anyone who accepts this statement by Intel and then goes on about nVidia's response had better rethink their opinion. Gelsinger said that after all these years the way we do 3D is obsolete, yet others are using new ways to put those "obsolete" GPUs to work, and creaming quads in the process. Like Rob Enderle said, Intel is missing the mark on this one, and they should know better.


They aren't making a GPU; they are making a co-processor. If it turns out to handle GPU-style code well, then so be it.
May 24, 2008 7:26:55 AM

Intel says a hybrid approach is wrong. nVidia says it's the way to go. We will see: http://www.techpowerup.com/60926/NVIDIA_Acquires_RaySca... Like I've stated, Intel took the wrong approach, and is going to try to force-feed their way of doing things. In the meantime, there are apps out there going the hybrid route, and winning from it. Enderle is right: Intel is missing the mark. More and more, we are going to see this happening. And when does Larrabee come out? Too little, too late to make the impact Intel wants to make. If they want to truly make a competing GPU, then fine. But all this garbage about the current GPU approach being dead has to go. One needs the other, as stated by Jen-Hsun, not "one is dead, long live the CPU," as Intel has said.
May 24, 2008 12:20:57 PM

There's no big deal: Intel is simply doing its homework. What could we expect from the biggest CPU maker? It sounds like vaporware: "Oh! Look at how fantastic our next product's features are going to be! Everything else will be obsolete by the time it launches...". Probably, I would do the same - just as anybody else who wants to introduce something new to the market. There's no reason to flame Intel right now. They have some of the best engineers, so let them give it a try. If it turns out to be the wrong approach there will be thousands of others (huh, probably just 2, in this case). But what about Nvidia? "The CPU doesn't need to become any faster"? Oh, p-l-e-a-s-e. Right. That's why you need a 2.6 - 3.0 GHz Core 2 CPU to fully utilize your GeForce, right? I want to see Huang's face if Nehalem gives an astonishing boost to his cards and SLI (ah, erh... sorry. Just Crossfire for Nehalem). Maybe the 9 series are being ultra-bottlenecked by current CPUs and they weren't just a gimmick!!! Tell me an application that really benefits from the GPU, except games and video-related ones. Well, the next Photoshop. How many people have a discrete graphics card in their rigs, let alone a reasonable one? Even if things like CUDA ever take off, what will be the benefit to Joe Average? Will he spend $100+ for it? The difference is that the CPU can speed up "everything" and that's what (almost) everyone looks at:

Joe A. : "How many Gigahurtz/Cores does this computer have?"
Geek Squad attendant: "Well, it has four cores, you know."
Joe A. : "Four??! GEEZ! MY WINDOWS WILL BOOT INSTANTLY!!! FINALLY!"
Geek Squad attendant: "ACTUALLY, you can buy this nice GFX card, since many applications are optimized to work faster with it."
Joe A. : "Does it have FOUR cores?! Well, I don't need to play my movies any faster... I will miss the better scenes!"
Geek Squad attendant: "well.... I was talking about the whole computer being faster..."
Joe A. : "You really want my money, heh? O-K, smart-ass! So, the next thing you're going to tell me is to buy a sound-card, so that EXCEL will run faster, right?"
Geek Squad attendant: "... let's talk about that quad-core again."


Well, a 9800GX2 doesn't do me much good if I can't even run Windows XP without waiting 5 minutes for it to become responsive. Nvidia says that a "hybrid approach" is the way to go because it is the *ONLY* way they can go. Besides, it's not a fight like "David vs Goliath" - it's more like "God vs Ant". If Intel loses the first round: give it a rest, spend some millions, and go to round 2. If Nvidia loses the first round: investors crying desperately while buying Intel shares, "WTF! GRAPHICS CARDS ARE DEAD!!!...", panicking. Nvidia can become the next AMD if it fails by the time Larrabee launches: Intel will simply have another chance.

Nvidia is not the little-good-hearted boy vs the monopolist giant - just as Intel isn't the godsend who wants to make the world a better place. Both of them think with their wallets and their marketshare graphs. Should we care about who will win?
May 24, 2008 1:55:21 PM

This has been going on for a while, JayD. In fact I think it started when NVidia decided not to license SLI to Intel for their chipsets. It was, in a way, selfish of NVidia.

After that, both sides started to trash-talk each other. Intel demoed the ability of Nehalem to run a game and physics with no problem, and then NVidia had to go and show how a GPU ran the game better than Intel's setup did.

It's going to be interesting to see what happens. And yes, Larrabee may seem like a GPU, but it's not. It's an x86 CPU that is designed to run GPU-style code.

The thing I find interesting is how NVidia thinks GPUs with hybrid programming run apps better, but we have yet to see a CPU that was changed over to just run GPU-based code. I wonder how nice a CPU that was made only to run GPU code at 3 GHz would be.

Anywho. You are right in the end. They both need each other. Neither will die unless one comes up with a killer architecture that will do both and blow everything to hades.
May 24, 2008 2:32:06 PM

Like F@H, like Photoshop, and there'll be others. This is just the beginning, like I said. These aren't something to come, these are here. And they stomp CPUs. And what about USB 3? Will everyone have to pay for that? I can't say whether it was right or wrong for nVidia to charge for their SLI, but I can say it's wrong to turn around and then charge for USB 3, and claim nVidia is wrong.
May 24, 2008 2:48:42 PM

They're stepping up with the 4xxx series.
May 24, 2008 3:21:18 PM

Quote:
To Intel
USB 1.1 to 2.0 was a big leap. We saw quite a big performance increase. Now with USB 3 it's not really that big, right? The data is no longer constrained by the bus, right?

To NVidia
Couldn't you have made your SLI compatible with more than your own motherboards? You're seriously shooting yourself in the foot if you have no support from Intel, the most popular chipset maker, or rather...

To AMD/Ati
You think my love for this company will save you from my criticism? Well apparently so... nah, seriously you're in no condition to be pushed down...


SLI works on Intel chipsets; it's a driver check that disables the feature if it doesn't detect an Nvidia-based chipset. I have always been a diehard Nvidia fan, but look what happened to AMD when they opened their mouth. Does Nvidia really think it will not suffer when the sleeping giant awakes again?

Word, Playa.
May 24, 2008 3:36:29 PM

That's the problem. Intel is the fearful darling here. But so is nVidia concerning graphics. Notice what Intel's best does? Skulltrail? Why's that? Hmmmmmm. This is going to get interesting. Jen-Hsun is no joke. He isn't like Otellini; he made this company and drives it. He's not a Hector either. I see more and more apps going GPGPU driven. By the time Larrabee does come, it'll have to fight that as well, besides whatever nVidia's going to do with their raytracing, and implementations from that as well. Intel hasn't had to face a company as inventive, single-minded and compelling as nVidia. They have influence. You can't say that about ATI or AMD. They never did. nVidia does, just like Intel does. Sure they're huge, but they're also going against something they've never seen before, or had to tackle, so this is going to get very interesting.
May 24, 2008 6:02:51 PM

^Well said!
[:turpit:2]
May 25, 2008 12:05:43 AM

JAYDEEJOHN said:
Like F@H, like Photoshop, and there'll be others. This is just the beginning, like I said. These aren't something to come, these are here. And they stomp CPUs. And what about USB 3? Will everyone have to pay for that? I can't say whether it was right or wrong for nVidia to charge for their SLI, but I can say it's wrong to turn around and then charge for USB 3, and claim nVidia is wrong.


I agree. I think nVidia deserves to get paid for SLI, but they were the ones who wanted to keep it to themselves to get more money in chipset sales. And the thing is that Intel's chipsets normally give better bandwidth to memory and are also the best for OCing. Not to say nVidia's are useless, but they only offer SLI and that's the only real benefit. I think that's what stirred Intel. I mean, even AMD decided to let Intel have Crossfire because they knew getting some money was better than getting none, especially since Core 2 is the best.

Quote:
To Intel
USB 1.1 to 2.0 was a big leap. We saw quite a big performance increase. Now with USB 3 it's not really that big, right? The data is no longer constrained by the bus, right?

To NVidia
Couldn't you have made your SLI compatible with more than your own motherboards? You're seriously shooting yourself in the foot if you have no support from Intel, the most popular chipset maker, or rather...

To AMD/Ati
You think my love for this company will save you from my criticism? Well apparently so... nah, seriously you're in no condition to be pushed down...


Um, USB 3.0 is supposed to be 4.8 Gbit/s (600 MB/s) compared to USB 2.0 at 480 Mbit/s (60 MB/s), so 10x the bandwidth. So I think it will be much faster than you think.
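For anyone who wants to sanity-check that arithmetic, here is a quick sketch. The 480 Mbit/s and 4.8 Gbit/s figures are the commonly quoted raw signaling rates (real-world throughput is lower), and dividing by 8 converts bits per second to bytes per second:

#include <cstdio>

int main() {
    // Commonly quoted raw signaling rates, in bits per second.
    const double usb2_bits = 480e6;  // USB 2.0 "Hi-Speed"
    const double usb3_bits = 4.8e9;  // USB 3.0 "SuperSpeed", often quoted as 4.8 Gbit/s

    // Divide by 8 to convert bits/s to bytes/s, then scale to MB/s.
    printf("USB 2.0: ~%.0f MB/s\n", usb2_bits / 8 / 1e6);  // ~60 MB/s
    printf("USB 3.0: ~%.0f MB/s\n", usb3_bits / 8 / 1e6);  // ~600 MB/s
    printf("Ratio:   ~%.0fx\n", usb3_bits / usb2_bits);    // ~10x
    return 0;
}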

I agree with what you say to nVidia. They are really cutting themselves short there. I have seen quite a few people who wanted SLI but decided to get just one nVidia GPU, or went with a 3870X2/CF 3870s, so they could have an Intel chipset to OC easier and better with.

AMD/ATI, it's all in the management. The acquisition of ATI caused some turmoil around the R600 that made it underperform compared to what it should have.

JAYDEEJOHN said:
They're stepping up with the 4xxx series.


From what we have seen so far, yes. I for one hope so. I am sick of seeing nVidia's top GPU over the $500 mark, keeping people from top-tier performance because they can't afford to pay that much. If it's true, a single HD4870 can outpace a 9800GX2 and HD3870X2 easily, but I will reserve judgement until it comes out.

JAYDEEJOHN said:
That's the problem. Intel is the fearful darling here. But so is nVidia concerning graphics. Notice what Intel's best does? Skulltrail? Why's that? Hmmmmmm. This is going to get interesting. Jen-Hsun is no joke. He isn't like Otellini; he made this company and drives it. He's not a Hector either. I see more and more apps going GPGPU driven. By the time Larrabee does come, it'll have to fight that as well, besides whatever nVidia's going to do with their raytracing, and implementations from that as well. Intel hasn't had to face a company as inventive, single-minded and compelling as nVidia. They have influence. You can't say that about ATI or AMD. They never did. nVidia does, just like Intel does. Sure they're huge, but they're also going against something they've never seen before, or had to tackle, so this is going to get very interesting.


Well, Intel already has a head start on raytracing, as they bought a company a while ago and have been working on it for some time. So even with what nVidia plans, they may be the ones having to fight back against Intel in that respect. I understand nVidia, but I feel like they are getting in way over their head by trying to push into a CPU-style area. I doubt we will ever have a setup where we have 3 GPUs, one for CPU work and 2 for graphics. I mean, unless nVidia has a way to force multiple instruction streams through a GPU without problems.

I hope this doesn't have adverse effects on us consumers. I personally like it the way it is: 2 CPU companies and 2 GPU companies. Easy to choose from.
May 25, 2008 10:00:41 AM

I hope NV can get an X86 license ...

If AMD can stay afloat long enough, push their platform approach as good value for money - mobo, chipset, cpu, gpu - and get a bit more market share ... it would even up the balance.

At present Intel has crushed every other player in the market and finally NV has woken up to realise they are next for the chopping block.

Intel wants everything.

Intel is Skynet.

Sickum Sarah.
May 25, 2008 10:30:16 AM

Ever heard of Tegra?
May 25, 2008 9:35:00 PM

Here's more on the Adobe demo: "While graphic professionals will undoubtedly have a lot of interest in the new Adobe products, the primary message was that Adobe finally recognizes the need for powerful GPUs, with the same level of importance that the industry has placed on CPUs. Using a very large 2 GB file for demonstration, the CPU became less utilized and takes a distant back seat to the more critical GPU, and everyday functions such as free transforms and manipulating angles become lightning-fast operations with the right tools." It's looking good so far. http://benchmarkreviews.com/index.php?option=com_conten... "Video games are now seeing an added dependence on physics processing, just as well as on the GPU and CPU. Last-generation GPUs from NVIDIA's GeForce graphics processor series have already begun the shift of processing dependency of video games onto the GPU, so it won't be very much longer before the CPU really offers no added level of performance to video games." Check this link out; it gives a few hints as to the needs of a GPU, its future, its abilities, and who knows what's after this. Maybe we don't need those top CPUs to run our games anymore.


May 25, 2008 10:20:58 PM

If a sleeping giant is awoken, and is off the mark, the giant will still miss what it's aiming at. "After catching Jen-Hsun alone in the parking lot after the event, he wasn't at all apprehensive of what I might ask or hurried in his response. My question: "When do you foresee the next NVIDIA GPU being produced using a smaller fabrication process?" To my complete surprise, Mr. Huang answered: "We count our blessings every time a product is designed." He went on to explain that the upcoming (unreleased) GeForce GPU is already riding the edge of technical boundaries (after the 17 June 2008 NDA is lifted, you will get a completely detailed explanation here), and that a reduced die process doesn't always translate into better performance or higher efficiency. It caught me a little off guard to hear the top NVIDIA rank give me a completely genuine answer, without the slightest tone of marketing gloss.

It's difficult to gauge a person and their agenda in only one day, but if Jen-Hsun Huang is as genuine towards his goals as he states, the industry is going to be tossed on its ear in a short matter of time. The biggest problem I see with so many large companies is that they have no voice. Take for example Intel and AMD: ask yourself to answer who they are and what they stand for. See my point? I couldn't even begin to tell you their mission statement... but my insider observation lends me to believe it's all about profit. That's not to say that NVIDIA isn't taking their cut - they are publicly traded after all, but every single thing they do inside their design rooms is to solve a consumer problem. I've been building and selling computers for almost nine years now, and only once have I ever really felt that the big-name manufacturers were working for the everyman (that was back when AMD introduced the Athlon series of processors). AMD might be trying to re-live that moment with affordable low-end solutions or neutered tri-core processors, but it's too little perhaps too late. Intel on the other hand seems to be back on their Pentium 4 collision course where it's now about cores like it used to be about clock speed. I don't believe Intel is answering any real consumer need by producing ever-expanding cores for a limited software canvas. It seems that NVIDIA is filling the long-needed void, and offers a revolutionary change for computing with the CUDA programming architecture and graphics that feature extremely efficient processors. "
I know it's in its infancy, but that too is part of the point here. Given time, we will see some amazing things coming from GPUs, not CPUs. nVidia is already there on the cutting edge; eventually ATI and even Intel will be too, but for now it's nVidia doing more for computing than Intel, AMD or anyone else at the moment. I like what I see, and feel Intel should get in line with this.
May 26, 2008 4:40:28 AM

I thought the reason why Crossfire is enabled on Intel chipsets is because it doesn't require a special bridge chip like SLI, so practically you won't be seeing a special "AMD/ATI" badge on Intel chipsets... correct me if I'm wrong.
May 26, 2008 10:35:47 AM

We will soon be able to see how quads hold up against GPUs!!! Super Pi is getting a CUDA client soon; watch for this.
May 26, 2008 11:19:23 AM

JAYDEEJOHN said:
We will soon be able to see how quads hold up against GPUs!!! Super Pi is getting a CUDA client soon; watch for this.


I second that. It will wipe out the test :)  We will be rocking our GPUs and our AMD CPUs :)  Intel is trying to jump the gun. It failed before, it will fail now.
May 26, 2008 12:02:39 PM

All those Super Pi records will be zilch. Time for a new record!!!!
May 26, 2008 12:41:51 PM

Folks - How does CUDA compare to the programming model of Folding for ATI's R600/R500 models?


May 26, 2008 12:47:29 PM

ATM, no one knows exactly how it compares. The only marks we have are from the yet-to-be-released G280 vs the 3870, where the G280 tripled its performance. How that stacks up to performance in games and other CUDA-enabled apps, time will tell.
May 26, 2008 1:34:40 PM

JAYDEEJOHN said:
ATM, no one knows exactly how it compares. The only marks we have are from the yet-to-be-released G280 vs the 3870, where the G280 tripled its performance. How that stacks up to performance in games and other CUDA-enabled apps, time will tell.



I meant more broadly: in terms of language complexity, flexibility, capabilities and limitations of each approach - as well as the comparative speeds. :) 
May 26, 2008 1:42:46 PM

CUDA is easier to work with. The ATI approach is, I believe, done in assembler, whereas CUDA is written in C/C++, so I'd imagine the flexibility edge goes to CUDA, along with the room to grow and the ability to follow.
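For readers wondering what "CUDA in C" actually looks like, here is a minimal sketch of a kernel that adds two arrays. It is only an illustration of the programming model being discussed, not code from any of the apps mentioned in this thread, and the array size and launch configuration are arbitrary:

#include <cstdio>
#include <cuda_runtime.h>

// Each CUDA thread handles one array element.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host data.
    float* ha = new float[n];
    float* hb = new float[n];
    float* hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and host-to-device copies.
    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}

Expressing the same operation through ATI's low-level path would have to sit much closer to the hardware, which is the flexibility gap the post above is describing.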
May 26, 2008 3:23:55 PM

dattimr said:
There's no big deal: Intel is simply doing its homework. What could we expect from the biggest CPU maker? It sounds like vaporware: "Oh! Look at how fantastic our next product's features are going to be! Everything else will be obsolete by the time it launches...". Probably, I would do the same - just as anybody else who wants to introduce something new to the market. There's no reason to flame Intel right now. They have some of the best engineers, so let them give it a try. If it turns out to be the wrong approach there will be thousands of others (huh, probably just 2, in this case). But what about Nvidia? "The CPU doesn't need to become any faster"? Oh, p-l-e-a-s-e. Right. That's why you need a 2.6 - 3.0 GHz Core 2 CPU to fully utilize your GeForce, right? I want to see Huang's face if Nehalem gives an astonishing boost to his cards and SLI (ah, erh... sorry. Just Crossfire for Nehalem). Maybe the 9 series are being ultra-bottlenecked by current CPUs and they weren't just a gimmick!!! Tell me an application that really benefits from the GPU, except games and video-related ones. Well, the next Photoshop. How many people have a discrete graphics card in their rigs, let alone a reasonable one? Even if things like CUDA ever take off, what will be the benefit to Joe Average? Will he spend $100+ for it? The difference is that the CPU can speed up "everything" and that's what (almost) everyone looks at:

Joe A. : "How many Gigahurtz/Cores does this computer have?"
Geek Squad attendant: "Well, it has four cores, you know."
Joe A. : "Four??! GEEZ! MY WINDOWS WILL BOOT INSTANTLY!!! FINALLY!"
Geek Squad attendant: "ACTUALLY, you can buy this nice GFX card, since many applications are optimized to work faster with it."
Joe A. : "Does it have FOUR cores?! Well, I don't need to play my movies any faster... I will miss the better scenes!"
Geek Squad attendant: "well.... I was talking about the whole computer being faster..."
Joe A. : "You really want my money, heh? O-K, smart-ass! So, the next thing you're going to tell me is to buy a sound-card, so that EXCEL will run faster, right?"
Geek Squad attendant: "... let's talk about that quad-core again."


Well, a 9800GX2 doesn't do me much good if I can't even run Windows XP without waiting 5 minutes for it to become responsive. Nvidia says that a "hybrid approach" is the way to go because it is the *ONLY* way they can go. Besides, it's not a fight like "David vs Goliath" - it's more like "God vs Ant". If Intel loses the first round: give it a rest, spend some millions, and go to round 2. If Nvidia loses the first round: investors crying desperately while buying Intel shares, "WTF! GRAPHICS CARDS ARE DEAD!!!...", panicking. Nvidia can become the next AMD if it fails by the time Larrabee launches: Intel will simply have another chance.

Nvidia is not the little-good-hearted boy vs the monopolist giant - just as Intel isn't the godsend who wants to make the world a better place. Both of them think with their wallets and their marketshare graphs. Should we care about who will win?


"Tell me an application that really benefits from the GPU, except games and video-related ones" - well, um, hate to break it to you but thats what you buy a good gpu for. :hello: 
May 26, 2008 3:55:05 PM

I remember Intel's move into the video card market...

It was the i740...

Compared to the rest of what was on the market, this chip became a laughing stock; even the demo that came with it was stuttery...

Now with Nvidia's discrete graphics cards and stream processors, it's a wonder how Intel will catch up in graphics power...

It's already known that GPUs are much more powerful than CPUs, but not as flexible...

Folding@home seems to be what Nvidia is waiting for to show that its GPUs are king...

Also, Intel's Crossfire boards have supported SLI from the 975X chipset to the current ones, but there is no driver support due to licensing issues with Intel...

So maybe Nvidia has peed off Intel to the point that it's going to retaliate with a new technology...

Look forward to Labaree (or however it is spelt) - stupid name but hey ho...
May 26, 2008 4:33:00 PM

@Harry

Larrabee (sp?) will become vaporware. Or a flop!! Like Windows Vista.

- It has a freaking good roadmap with features and abilities.
- It's not too far off its delivery schedule.
- It's a magic solution to a problem that doesn't exist. (CPUs are the bottleneck, not GPUs.)
- It's a flagship product.
- Up to 80 cores!! (10 GHz, anybody? I've heard that story before.)
- It's really revolutionary.
- Last but not least, believe my word (Intel), because I have never missed, or lied, or....

May 26, 2008 6:36:47 PM

radnor said:
@Harry

Larrabee (sp?) will become vaporware. Or a flop!! Like Windows Vista.

- It has a freaking good roadmap with features and abilities.
- It's not too far off its delivery schedule.
- It's a magic solution to a problem that doesn't exist. (CPUs are the bottleneck, not GPUs.)
- It's a flagship product.
- Up to 80 cores!! (10 GHz, anybody? I've heard that story before.)
- It's really revolutionary.
- Last but not least, believe my word (Intel), because I have never missed, or lied, or....


LoL Intel must have pissed in your cornflakes.

Word, Playa.
May 26, 2008 6:45:27 PM

JAYDEEJOHN said:
All those Super Pi records will be zilch. Time for a new record!!!!


Hmm, so you honestly believe that a GPU will handle a moderately simple, branchy calculation faster than a CPU? Lol, we'll see those epic GPUs with their 128k of onboard cache running loops around CPUs? If there isn't code morphing there will be no way a GPU will outperform a CPU. But it's Nvidia, so we all know their magic numbers are legit and their drivers can be trusted.

Word, Playa.
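The branching worry above is a real architectural point: threads in the same warp that take different sides of a data-dependent branch force the GPU to execute both paths one after the other, with the inactive threads masked off. A tiny hypothetical CUDA kernel (purely illustrative, not taken from any real benchmark) showing that kind of divergence:

#include <cuda_runtime.h>

// Threads in the same 32-wide warp that disagree on the branch condition
// force the warp to run BOTH paths serially, masking off the threads that
// did not take each path. Branch-heavy, serial code such as a pi
// calculation maps poorly onto this execution model.
__global__ void divergent(const int* in, int* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    if (in[i] % 2 == 0) {
        // Some threads in the warp do one chunk of work...
        out[i] = in[i] * 3;
    } else {
        // ...while the rest do another; within a warp these two branches
        // execute one after the other, not in parallel.
        out[i] = in[i] + 7;
    }
}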
May 26, 2008 11:12:08 PM

spoonboy said:
"Tell me an application that really benefits from the GPU, except games and video-related ones" - well, um, hate to break it to you but thats what you buy a good gpu for. :hello: 


Yeah, you're absolutely right: that's what I buy a good GPU for. :wahoo:  That's exactly the point: that's the only thing they're good at up to now. :sweat:  I guess the subject of this topic is "the GPU slowly replacing the CPU" and I don't see this happening anytime soon. How many people have an Intel CPU? How many people have an NVIDIA GPU? Geez! We can't even take full advantage of SSE4.1 yet, man. It's not like everyone will start using CUDA - even if the GPU is a lot better at some specific tasks. It's completely absurd to compare the CPU against the GPU *right now*. Actually, it's pathetic to compare Intel's CPU marketshare vs NVIDIA's GPU marketshare. Everyone's flaming Intel because of its IGPs and previous experiences. Just as Huang stated that "Larrabee is just a PowerPoint slide and, being so, it can't compete with us," how can the green team say they're going to open a can of whoop-ass on a slide? It's plain stupid. How can we talk about NVIDIA's slides predicting Larrabee's performance? Suddenly, everyone starts arguing about how Intel will fail, fail, fail... Come on! We're talking about a new tech that we had never heard of and probably not even imagined until some time ago. How can we say how it will perform? We don't even know for sure if GT200 will be better than RV770, and yet we are discussing a completely new, different and unseen approach. Just because the GPU hasn't had its time, it doesn't mean that it ever will. Who will care about it if something better shows up?

Anyway, everything in this topic is just speculation. If you buy into NVIDIA's multi-purpose-GPU crap *right now*, you'll also buy into Intel's have-as-many-cores-as-you-can crap. By the time we become able to fully utilize the GPU's capabilities and more than four CPU cores, our current tech will be obsolete.

But the magic word is still the same: Windows. Could it be able to take advantage of a powerful GPU? Just ask Microsoft what they think about it and I'll tell you what the future is going to look like. :hello: 

May 27, 2008 7:50:17 AM

Here's a little on Jen-Hsun: He spoke at NV Tech and he said, "Best CPU for the money? An X2 4800+ (I think that was the model) is the best price/performance." Nvidia's first motherboard to feature Hybrid SLI and Hybrid Power as well as 3-way SLI and ESA? An AMD motherboard (the 780a). Funny. It's like they launched the 780i knowing that the masses use Intel, and then waited to refine the platform for AMD users.

AMD might own ATI, but Jen-Hsun is hell-bent on firing back at Intel when they open their mouths.

Someone asked him why he didn't think an Intel integrated GPU would stand a chance against Nvidia. He didn't bother with anything technical, he just said, "Say in this (right) pocket, I have something good (an Intel CPU). Now let's say in this (left) pocket, I have a bunch of crap (Intel integrated GPU solution). If I put my good stuff in my left pocket, now there's crap all over my good stuff."

I LOL'd.
__________________
This was posted elsewhere, by someone with a lot of respect amongst all communities on the net. You can say what you want, and run it down, but to me, we all win in this one, except Intel, who's just not getting it. The GPU is faster now in a few apps; tomorrow it'll be a few more. That leaves the CPU to do what it does best. You, me or anyone else, at this point in time, can't say what, nor how well, a GPU will work on your favorite app. So yes, it's all speculation, with some things coming to fruition. Right now, it looks good. You think Joe A. or whoever would be more impressed by the hundreds of cores on a GPU vs the 4 of a CPU? Like I said, we've spent the very beginning of PC history to date doting on all the capabilities of the CPU. Been there, done that. It's the GPU's day now; some may not like it, but to declare it dead is foolish, especially at this time.
May 27, 2008 12:33:14 PM

What if he has an NV GPU in one pocket and nothing in the other pocket, because Intel pushes their own chipsets and refuses to license NV for chipsets at all ... gives them nothing?

That equates to crap in both pockets, doesn't it?

That reduces NV to just pushing their discrete GPU technology ... and a loss of market share.

Let's face it, AMD doesn't want NV chipsets ... they have ATI and a platform to push ... leveraging the bottom end of the market with a complete solution - GPU, CPU, chipset.

Please explain if that isn't true ...

I can't help but think the NV boss is off his tablets ... why push something like this?? Makes no sense to me ...
May 27, 2008 1:12:37 PM

Reynod said:
What if he has an NV GPU in one pocket and nothing in the other pocket, because Intel pushes their own chipsets and refuses to license NV for chipsets at all ... gives them nothing?

That equates to crap in both pockets, doesn't it?

That reduces NV to just pushing their discrete GPU technology ... and a loss of market share.

Let's face it, AMD doesn't want NV chipsets ... they have ATI and a platform to push ... leveraging the bottom end of the market with a complete solution - GPU, CPU, chipset.

Please explain if that isn't true ...

I can't help but think the NV boss is off his tablets ... why push something like this?? Makes no sense to me ...


No? It makes perfect sense to me.

Nvidia is only running as fast as ATI is. So, more profit, less development. Even after the 4xxx series is out, I bet the G80 and G92 will still be selling. So it's nice to have competition that is running, but not as fast. The chip industry is like the pharmaceutical industry: the first pill costs billions, the second one less than 2 cents. So Nvidia has been milking the market for the last 1.5 years. Nice cash flow.

Nvidia can do almost the whole platform, except for the x86 CPU. This kind of measure will almost guarantee an AMD CPU in almost every game station. Where AMD would have gained zero (an Nvidia + Intel platform) is now a win situation. AMD won't ditch Nvidia. Nvidia is just taking power away from the Intel platform, and if CUDA is what it is meant to be, it will be a nice way to sell Opterons as well!!! Which will have an Nvidia platform!!!

Intel is going after the GPU market.
Nvidia is going after the workstation/server market.
Nvidia CUDA-enabled software will steal workstation/server market share from Intel. That's beyond discussion. It's being proven and it's being done. I've read the code and was thrilled by it. When a CUDA SuperPi comes, it will open the eyes of many of you enthusiasts.

Nvidia's CEO is being bold and aiming high, because he has the threat upon him of being absorbed or put out of business. Honestly, I think he will do it.

Historically, betting on AMD is not that bad. ATM they don't have the flagship product, but it's hardly a bad lineup.
Betting on Intel has already cost Microsoft a lot, just as an example for starters.
AMD isn't floating in money, so it can't develop platforms for all the markets (low, medium, high, server/workstation).
So Nvidia is going for it.

I think Nvidia's CEO has done his homework. Now it's just a matter of executing his roadmap.



May 27, 2008 1:18:46 PM

Adobe isn't a push, it's a reality. Super Pi isn't either. Hopefully Maya is next, etc. Maybe it won't matter what CPU you have for video editing. Like I said, what about Skulltrail? Why does it exist? Everyone who has had an AMD CPU has always asked, why do they have to do it that way? Well, we are entering into a mainly graphics-driven future, with other apps hugely benefitting from GPUs. ATI owners also asked, why is that a TWIMTBP'd game? Sound familiar? Jen-Hsun has said we should co-exist, the CPU and GPU, but Intel still hasn't changed. nVidia said it's willing to, but instead we have a VP from Intel stating the GPU is dead. Now who's doing what? I have pointed out Intel's intention of spending billions of dollars going into this "dead" field. Who's telling who the truth here? Think what you want, but serious people know what's going on, and admire this man. To me the real question is, will nVidia actually be able to share the GPU market with Intel?
May 27, 2008 1:46:36 PM

radnor said:
@Harry

Larrabee (sp?) will become vaporware. Or a flop!! Like Windows Vista.

- It has a freaking good roadmap with features and abilities.
- It's not too far off its delivery schedule.
- It's a magic solution to a problem that doesn't exist. (CPUs are the bottleneck, not GPUs.)
- It's a flagship product.
- Up to 80 cores!! (10 GHz, anybody? I've heard that story before.)
- It's really revolutionary.
- Last but not least, believe my word (Intel), because I have never missed, or lied, or....


You better hope you are right. But don't doubt Intel's R&D department. Everyone laughs at Intel's IGPs. Even I do. But Intel obviously made something decent enough to penetrate the business machine market and gobble up millions in sales, since it does enough for a business machine. And Larrabee is nothing like their IGPs. It is out-of-the-box, completely, 100% new.

I actually hope Larrabee is good enough to make nVidia sweat. ATI right now has been put by AMD into a place that I don't like, which is a "price/performance/efficiency" area. So we may never see another top-of-the-hill GPU from ATI. Sure, R700 looks great. But what will happen when nVidia's G100 series comes out?

spud said:
Hmm, so you honestly believe that a GPU will handle a moderately simple, branchy calculation faster than a CPU? Lol, we'll see those epic GPUs with their 128k of onboard cache running loops around CPUs? If there isn't code morphing there will be no way a GPU will outperform a CPU. But it's Nvidia, so we all know their magic numbers are legit and their drivers can be trusted.

Word, Playa.


I think you are the only one not assuming that just because nVidia says it will be better, it will be. That's my main problem. GPUs are not nearly as flexible as CPUs. Therefore one GPU will not be able to handle all of the code a CPU can. Sure, they have proven pretty darn fast at certain things, but at others they are extremely slow.

My other problem is that GPUs take up tremendous amounts of power to do the same job. Some may not care. Others will care a lot.

But in the end I don't see nVidia's CUDA as a CPU killer. It won't be able to do that. I think both parties need to stop acting like idiots and just agree to disagree.
May 27, 2008 2:09:05 PM

jimmysmitty said:


I think you are the only one not assuming that just because nVidia says it will be better, it will be. That's my main problem. GPUs are not nearly as flexible as CPUs. Therefore one GPU will not be able to handle all of the code a CPU can. Sure, they have proven pretty darn fast at certain things, but at others they are extremely slow.

My other problem is that GPUs take up tremendous amounts of power to do the same job. Some may not care. Others will care a lot.

But in the end I don't see nVidia's CUDA as a CPU killer. It won't be able to do that. I think both parties need to stop acting like idiots and just agree to disagree.
Exactly, that's the point here. All these things that are coming to the GPU are great, great for us!! There will be more. But I see an attitude just like Intel's, where IT'S the only way to go, and that's wrong. Just like the TWIMTBP thing, it makes it hard to share, but they need to. Is all this exciting? To me and others, yes. Like I said, there'll be more. I find it funny to watch the two camps myself; meanwhile I'm hoping more and more things get done in this direction. Why? Because I hate Intel? No. I want it for me, for us. Do I prefer nVidia? No. ATI actually. But they're dropping the ball on this one, and it's huge, or else Intel wouldn't be investing in it. After all, Intel isn't stupid, maybe bullheaded like nVidia, but they're just going to have to learn to get along, and in the meantime, we benefit.
May 27, 2008 2:36:23 PM

jimmysmitty said:
You better hope you are right. But don't doubt Intel's R&D department. Everyone laughs at Intel's IGPs. Even I do. But Intel obviously made something decent enough to penetrate the business machine market and gobble up millions in sales, since it does enough for a business machine. And Larrabee is nothing like their IGPs. It is out-of-the-box, completely, 100% new.


In our country we have a saying: "The mountain gave birth to a rat." I think Intel's roadmap is a good one. I just think it's too good an offering. My guess is it will give birth to a rat. Or Larrabee.

GPUs are fairly limited, but they excel at their job. They can do more jobs than just gaming. A GPU is a hell of a floating-point processor. Traditional CPUs are indeed the jack of all trades, and they won't ever go away (my guess); same with GPUs.
I'll give you this example. An Intel CPU can do the job of a Killer NIC NPU, no problem. But isn't the Killer NIC NPU much more efficient at doing that job? Without a doubt it is.

It's like cars, mate. You can't expect a Prius to have half the torque or the acceleration of my Lancia. Hell, I eat BMW 330ds in a 250 m dash. But I can't expect my Lancia to be half as effective (miles per gallon, or litres per 100 km) as a Prius. No way in hell.

Specialization is always better. But you can't specialize in everything. That's just silly. Intel is thinking it's going to brute-force everything. Because hey, just put 80 (individually weak) cores on one wafer and let it roll?
Again, that's just silly.

May 27, 2008 3:56:08 PM

JAYDEEJOHN said:
Adobe isn't a push, it's a reality. Super Pi isn't either. Hopefully Maya is next, etc. Maybe it won't matter what CPU you have for video editing. Like I said, what about Skulltrail? Why does it exist? Everyone who has had an AMD CPU has always asked, why do they have to do it that way? Well, we are entering into a mainly graphics-driven future, with other apps hugely benefitting from GPUs. ATI owners also asked, why is that a TWIMTBP'd game? Sound familiar? Jen-Hsun has said we should co-exist, the CPU and GPU, but Intel still hasn't changed. nVidia said it's willing to, but instead we have a VP from Intel stating the GPU is dead. Now who's doing what? I have pointed out Intel's intention of spending billions of dollars going into this "dead" field. Who's telling who the truth here? Think what you want, but serious people know what's going on, and admire this man. To me the real question is, will nVidia actually be able to share the GPU market with Intel?


Actually, they're not spending billions on the "same thing". It's just the same "field", as you stated. So, they called the GPU dead. Larrabee can't be called a GPU. It's the same market, but a completely different approach, so I can't see anything wrong with Intel's statement - as long as they can prove it. You're simply defending the GPU at all costs because it has proven that it is a lot better at some tasks, but what if Larrabee can do the same just as well and even more? And what about Nehalem? What if it doubles Core 2's performance AND the number of cores? Would the GPU be so much better in this scenario? What about Sandy Bridge? What about FUSION? I hope you are right about everyone benefiting from this clash, and that's what I want too, but I don't see this "Huang's Magical Aura" of yours. I'm not impressed with this guy, but maybe that's just me. Stupid comparison: Hitler spoke very well too. It's not Huang's words that will be rendering your graphics. He can speak like a god: what matters is the execution, and Intel has been close to perfect in this department lately. NVIDIA does "bad things" just as Intel does. I wouldn't give a **** if both ended and someone else replaced them.

PS: NVIDIA simply doesn't make a CPU because it doesn't have an x86 licence, not because the GPU is the future, nor because the world should be a happy place with CPU + GPU coexisting in harmony.
May 27, 2008 4:56:26 PM

Very interesting! I do remember when I got all excited about the PhysX cards, and that seems to have fizzled away now, though.

I'm not too excited about one software vendor doing it. I'm afraid it will be like F@H, where you have to have a certain brand, a certain model, and a certain driver for it all to work.

It'll be great for people that want to custom-build systems just for photoshopping, but if the above is true, then many may be turned away.
May 27, 2008 6:52:14 PM

TechnologyCoordinator said:
Very interesting! I do remember when I got all excited about the PhysX cards, and that seems to have fizzled away now, though.

I'm not too excited about one software vendor doing it. I'm afraid it will be like F@H, where you have to have a certain brand, a certain model, and a certain driver for it all to work.

It'll be great for people that want to custom-build systems just for photoshopping, but if the above is true, then many may be turned away.


Agreed! ;)  One of the best posts in this thread, BTW. :bounce: 
May 28, 2008 5:40:45 AM

dattimr said:
Actually, they're not spending billions on the "same thing". It's just the same "field", as you stated. So, they called the GPU dead. Larrabee can't be called a GPU. It's the same market, but a completely different approach, so I can't see anything wrong with Intel's statement - as long as they can prove it. You're simply defending the GPU at all costs because it has proven that it is a lot better at some tasks, but what if Larrabee can do the same just as well and even more? And what about Nehalem? What if it doubles Core 2's performance AND the number of cores? Would the GPU be so much better in this scenario? What about Sandy Bridge? What about FUSION? I hope you are right about everyone benefiting from this clash, and that's what I want too, but I don't see this "Huang's Magical Aura" of yours. I'm not impressed with this guy, but maybe that's just me. Stupid comparison: Hitler spoke very well too. It's not Huang's words that will be rendering your graphics. He can speak like a god: what matters is the execution, and Intel has been close to perfect in this department lately. NVIDIA does "bad things" just as Intel does. I wouldn't give a **** if both ended and someone else replaced them.

PS: NVIDIA simply doesn't make a CPU because it doesn't have an x86 licence, not because the GPU is the future, nor because the world should be a happy place with CPU + GPU coexisting in harmony.

Larrabee will be a GPU working primarily using rasterization. I don't have the quote, but it came from an Intel engineer, in response to all the hype Gelsinger stirred up about Larrabee and raytracing. If you're talking about a GPGPU, then yes, like nVidia's current GPUs, as well as ATI's, they are, and can be used in that manner. Currently nVidia is working with Via on a small platform for mobile etc., which is a money maker. If all these things come to fruition (Adobe etc.), it's just going to make things better for us. I agree, I don't like a tech driven by one company, as I was pointing towards earlier. What I have heard is that a lot of these things (other than server apps) can be done in DX, whereby M$ will maybe have a single solution that all parties can comply with, which is very doable, and in many ways easier through familiarity. I'm looking forward to it, and hope it goes great from here. Let's let our CPUs do what they do best, but just don't short-sell the GPU, as this is all in its infancy, and there are for sure greater things to come; let the GPU do what it can do best as well.
May 28, 2008 3:06:24 PM

I really intended this thread to die. After this post I hope it does, as well as the demeanor of the two companies. I'll add this: http://www.tomshardware.com/forum/249450-33-four-9800gx... This was a post by one of the users of CUDA, and it starts to show the benefits we can share with this tech. I'm thankful for it, and hope there's more to come. Check out their website, it looks promising, and I hope this issue (CPU vs GPU) is resolved. The link: http://fastra.ua.ac.be/en/specs.html
June 2, 2009 10:29:34 PM

It's all so funny, because Intel is making its processors more capable of GPU-like operations, and Nvidia is making its GPUs more capable of CPU-like operation, since Nvidia's next series will be using MIMD instead of SIMD. So it can do multiple instructions per cluster, like a server array.
June 2, 2009 10:50:30 PM



First time posting on the forum. This thread was linked from a news story about Nvidia and CPUs. Usually sites only link recent threads, so I clicked on it thinking it was a thread related to the news story, and thus not old. Didn't think to look at the date of the comments.
June 3, 2009 2:15:05 PM

Guess I had better post quick, before Random locks the thread :D 

Anyway, with preliminary Larrabee silicon already coming in at GTX 285 performance, according to the news stories, it looks like the Evil Empire might just have something up its sleeve after all.
June 3, 2009 8:14:44 PM

Ummm, since that's speculation, have you seen the speculation on the G300? Pwned LRB.
In essence, it'll be half as fast as the G300, or in other words, not good.
Or in other other words, LRB will be half fast.
June 3, 2009 10:29:54 PM

Put it another way. LRB is supposedly 600 mm², while current ATI is 276 mm² and nVidia around 350 mm². That's at 55nm, so even with Intel's supposedly superior process, it's over twice the size (die area), and only equal to today's cards. Sounds like a losing proposition to me.
June 4, 2009 7:16:32 AM

JAYDEEJOHN said:
Ummm, since that's speculation, have you seen the speculation on the G300? Pwned LRB.
In essence, it'll be half as fast as the G300, or in other words, not good.
Or in other other words, LRB will be half fast.


Is this that thing you get stuck on called TFLOPs? Cuz what happened last time? A 4870 easily out-TFLOPs a GTX280....... yet the GTX280 outperforms the 4870 in games. TFLOPs are useless for gauging gaming performance. Sure, it's great to show such large numbers to lure customers into something they think is the most powerful but truly isn't, but we all know better.

Hell, the 4870 is probably 1.2 times faster than LRB in TFLOPs. Still won't matter come release day if LRB beats the pants off everything in FPS, though.

JAYDEEJOHN said:
Put it another way. LRB is supposedly 600 mm², while current ATI is 276 mm² and nVidia around 350 mm². That's at 55nm, so even with Intel's supposedly superior process, it's over twice the size (die area), and only equal to today's cards. Sounds like a losing proposition to me.


You forget that Intel's part is not exactly a true GPU. It's a set of CPUs with SIMD instructions. A larger die size never maps to performance linearly. And yes, it does seem you are still stuck on the TFLOPs, which have been proven NOT to equate to gaming performance.

Hell, a 4870 has nearly 2x the shaders of a GTX200-series part, yet it does not beat nVidia's high end.

Seriously JDJ, give up on the TFLOPs argument. Until it equates to actual FPS performance, it's a useless number (in gaming terms). It's pretty to look at the amount of power it has, but useless for now and the foreseeable future.

NECRO THREAD FTW!!!!
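For context on where those TFLOPs figures come from: the peak numbers are just paper math, shader count times clock times FLOPs issued per clock. Here is a rough sketch using the commonly quoted 2008 figures; the per-clock factors follow the vendors' own peak-rate claims and are assumptions, not anything a game actually sustains:

#include <cstdio>

// Peak single-precision GFLOPS = shader count * clock (GHz) * FLOPs per clock.
// The per-clock factors follow the vendors' peak-rate claims (2 for ATI's MAD,
// 3 for nVidia's MAD+MUL); they are marketing peaks, not measured throughput.
static double peak_gflops(int shaders, double clock_ghz, double flops_per_clock) {
    return shaders * clock_ghz * flops_per_clock;
}

int main() {
    printf("HD 4870 : ~%.0f GFLOPS\n", peak_gflops(800, 0.750, 2.0));  // ~1200
    printf("GTX 280 : ~%.0f GFLOPS\n", peak_gflops(240, 1.296, 3.0));  // ~933
    return 0;
}

The card with the smaller number here was winning most game benchmarks at the time, which is exactly the TFLOPs-versus-FPS point being argued in these posts.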
June 4, 2009 9:57:56 AM

The TFLOPs argument? TFLOPs are just a number. It depends on the arch, and the app.
Anyway, here's the thing about LRB. If it were coming out today, it'd be comparable to the G250, as that was the last high-end card from last gen. Just putting it into perspective here.
Getting back to TFLOPs, it also depends on how well it does SP or DP, and what either SP or DP is used for. DP in gaming isn't that big a deal, as it's mainly for GPGPU.

PS: In GRID, the 4870 is able to use more of its total ability, and that is why the ATI products walk all over the nVidia products. Like I said, it depends on the apps/games.
!