GPU dead? I think not!!!!

Having a Gelsinger, Intel? Intel started this foolishness by declaring the GPU dead. Of course everyone points at nVidia's response, but how about this response? http://www.tgdaily.com/content/view/37611/140/ The position Intel has taken is foolish. They go on about the GPU being dead, yet here (my link) we see the CPU getting trounced, and this is only the beginning. Also, Intel claims the GPU is dead, then goes on to make what? GPUs! Stop with the foolish comments, Intel. A lot of us aren't buying it. And who knows? Maybe someday they'll actually put a CPU on a GPU. My point is, anyone who accepts this statement from Intel and then goes on about nVidia's response had better rethink their opinion. Gelsinger says the way we've done 3D all these years is obsolete, while others are finding new ways to use those "obsolete" GPUs, and creaming quad cores in the process. Like Rob Enderle said, Intel is missing the mark on this one, and they should know better.
 

spud



They aren't making a GPU; they are making a coprocessor. If it turns out to handle GPU-style code well, then so be it.
 

Intel says a hybrid approach is wrong. nVidia says it's the way to go. We will see. http://www.techpowerup.com/60926/NVIDIA_Acquires_RayScale_Software.html Like I've stated, Intel took the wrong approach and is going to try to force-feed their way of doing things. In the meantime, there are apps out there going the hybrid route, and winning from it. Enderle is right: Intel is missing the mark. More and more, we are going to see this happening. And when does Larrabee come out? Too little, too late to make the impact Intel wants to make. If they want to truly make a competing GPU, then fine. But all this garbage about the current GPU approach being dead has to go. One needs the other, as Jen-Hsun has said, not "the GPU is dead, long live the CPU," as Intel has said.
 

dattimr

There's no big deal here: Intel is simply doing its homework. What could we expect from the biggest CPU maker? It sounds like vaporware: "Oh, look at how fantastic our next product's features are going to be! Everything else will be obsolete by the time it launches..." I would probably do the same - just like anybody else who wants to introduce something new to the market. There's no reason to flame Intel right now. They have some of the best engineers, so let them give it a try. If it turns out to be the wrong approach, there will be thousands of other approaches (huh, probably just two, in this case).

But what about Nvidia? "The CPU doesn't need to become any faster"? Oh, p-l-e-a-s-e. Right. That's why you need a 2.6 - 3.0 GHz Core 2 CPU to fully utilize your GeForce, right? I want to see Huang's face if Nehalem gives an astonishing boost to his cards and SLI (ah, erh... sorry. Just Crossfire for Nehalem). Maybe the 9 series really is being ultra-bottlenecked by current CPUs and it wasn't just a gimmick! Tell me an application that really benefits from the GPU, except games and video-related ones. Well, OK, the next Photoshop. But how many people have a discrete graphics card in their rigs, let alone a reasonable one? Even if things like CUDA ever take off, what will be the benefit to Joe Average? Will he spend $100+ for it? The difference is that the CPU can speed up "everything", and that's what (almost) everyone looks at:

Joe A. : "How many Gigahurtz/Cores does this computer have?"
Geek Squad attendant: "Well, it has four cores, you know."
Joe A. : "Four??! GEEZ! MY WINDOWS WILL BOOT INSTANTLY!!! FINALLY!"
Geek Squad attendant: "ACTUALLY, you can buy this nice GFX card, since many applications are optimized to work faster with it."
Joe A. : "Does it have FOUR cores?! Well, I don't need to play my movies any faster... I will miss the better scenes!"
Geek Squad attendant: "well.... I was talking about the whole computer being faster..."
Joe A. : "You really want my money, heh? O-K, smart-ass! So, the next thing you're going to tell me is to buy a sound-card, so that EXCEL will run faster, right?"
Geek Squad attendant: "... let's talk about that quad-core again."


Well, what good is a 9800GX2 if I can't even run Windows XP without waiting 5 minutes for it to become responsive? Nvidia says that a "hybrid approach" is the way to go because it is the *ONLY* way they can go. Besides, it's not a fight like "David vs Goliath" - it's more like "God vs Ant". If Intel loses the first round: give it a rest and some millions, and let's go to round 2. If Nvidia loses the first round: investors crying desperately while buying Intel shares, "WTF! GRAPHICS CARDS ARE DEAD!!!...", panicking. Nvidia can become the next AMD if it fails by the time Larrabee launches; Intel will simply have another chance.

Nvidia is not the good-hearted little guy versus the monopolist giant - just as Intel isn't a godsend who wants to make the world a better place. Both of them think with their wallets and their market-share graphs. Should we care about who will win?
 
This has been going on for a while, JayD. In fact I think it started when NVidia decided not to license SLI to Intel for their chipsets. That was, in a way, selfish of NVidia.

After that, both sides started to trash talk each other. Intel demoed Nehalem running a game and physics with no problem, and then NVidia had to go and show how much better a GPU ran the game than Intel's demo.

It's going to be interesting to see what happens. And yes, Larrabee may seem like a GPU, but it's not. It's an x86 CPU that is designed to run GPU-style code.

The thing I find interesting is how NVidia thinks GPUs with hybrid programming run apps better, yet we have yet to see a CPU that was changed over to run only GPU-style code. I wonder whether a CPU made only to run GPU code at 3GHz would be nice.

Anywho. You are right in the end. They both need each other. Neither will die unless one comes up with a killer architecture that does both and blows everything else to Hades.
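Since "GPU-style code" and "hybrid programming" keep coming up, here's a rough sketch of what that split actually looks like with CUDA. To be clear, this is just my own toy example (a simple y = a*x + y over a big array), not anything from Intel's or nVidia's labs: the CPU handles the setup and the sequential housekeeping, and the GPU chews through the wide data-parallel part.

// Toy sketch of the "hybrid" division of labor being argued about in this thread.
// The CPU (host) does setup and control flow; the GPU runs the same arithmetic
// across a million elements at once. saxpy (y = a*x + y) is just a stand-in for
// any data-parallel workload (physics, image filters, F@H-style math).
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                           // ~1M floats
    const size_t bytes = n * sizeof(float);

    // CPU side: allocate and fill the input data (the sequential part).
    float *hx = (float*)malloc(bytes);
    float *hy = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // GPU side: copy the data over, launch the wide parallel part, copy back.
    float *dx, *dy;
    cudaMalloc((void**)&dx, bytes);
    cudaMalloc((void**)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);                    // expect 4.0

    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}

The point isn't the math, it's the division of labor: neither side is good at the other's job, which is basically Jen-Hsun's "one needs the other" argument.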
 
Like F@H, like Photoshop, and there'll be others. This is just the beginning, like I said. These aren't things still to come; they're here. And they stomp CPUs. And what about USB3? Will everyone have to pay for that? I can't say whether it was right or wrong for nVidia to charge for SLI, but I can say it's wrong to turn around and charge for USB3 yourself, then claim nVidia is in the wrong.
 

spud

To Intel:
USB 1.1 to 2.0 was a big leap; we saw quite a big performance increase. With USB 3 it's not really that big, right? The data is no longer constrained by the bus, right?

To NVidia:
Couldn't you have made your SLI compatible with more than your own motherboards? You're seriously shooting yourself in the foot if you have no support from Intel, the most popular chipset maker, or rather...

To AMD/ATI:
You think my love for this company will save you from my criticism? Well, apparently so... nah, seriously, you're in no condition to be pushed down...

SLI works on Intel chipsets; it's a driver flag that disables the feature if it doesn't detect an Nvidia-based chipset. I have always been a diehard Nvidia fan, but look what happened to AMD when they opened their mouth. Does Nvidia really think it won't suffer when the sleeping giant wakes again?

Word, Playa.
 
That's the problem. Intel is the fearful darling here, but so is nVidia when it comes to graphics. Notice what Intel's best does? Skulltrail? Why's that? Hmmmmmm. This is going to get interesting. Jen-Hsun is no joke. He isn't like Otellini; he made this company and drives it. He's not a Hector either. I see more and more apps becoming GPGPU driven. By the time Larrabee does come, it'll have to fight that as well, on top of whatever nVidia is going to do with their raytracing and the implementations that come from it. Intel hasn't had to face a company as inventive, single-minded and compelling as nVidia. They have influence. You can't say that about ATI or AMD; they never did. nVidia does, just like Intel does. Sure, Intel is huge, but they're also going up against something they've never seen before, or had to tackle, so this is going to get very interesting.
 


I agree. I think nVidia deserves to get paid for SLI, but they were the ones who wanted to keep it to themselves to get more money in chipset sales. And the thing is that Intel's chipsets normally give better memory bandwidth and are also the best for OCing. Not to say nVidia's are useless, but they only offer SLI, and that's the only real benefit. I think that's what stirred Intel up. I mean, even AMD decided to let Intel have Crossfire because they knew getting some money was better than getting none, especially since Core 2 is the best.

To Intel:
USB 1.1 to 2.0 was a big leap; we saw quite a big performance increase. With USB 3 it's not really that big, right? The data is no longer constrained by the bus, right?

To NVidia:
Couldn't you have made your SLI compatible with more than your own motherboards? You're seriously shooting yourself in the foot if you have no support from Intel, the most popular chipset maker, or rather...

To AMD/ATI:
You think my love for this company will save you from my criticism? Well, apparently so... nah, seriously, you're in no condition to be pushed down...

Um, USB 3.0 is supposed to be 4.8 Gbit/s (600 MB/s), compared to USB 2.0 at 480 Mbit/s (60 MB/s), so 10x the bandwidth. So I think it will be much faster than you think.
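For anyone who wants to double-check those numbers, it's just bits to bytes (divide by 8). Here's a quick throwaway snippet of my own, and keep in mind these are the raw signaling rates, not real-world throughput:

// Sanity check on the USB figures quoted above (bits -> bytes is divide by 8).
// These are raw signaling rates; usable throughput is lower after protocol overhead.
#include <cstdio>

int main() {
    double usb2_mbit = 480.0;            // USB 2.0, Mbit/s
    double usb3_mbit = 4800.0;           // USB 3.0, the 4.8 Gbit/s quoted above
    double usb2_MBps = usb2_mbit / 8.0;  // = 60 MB/s
    double usb3_MBps = usb3_mbit / 8.0;  // = 600 MB/s
    printf("USB 2.0: %.0f MB/s, USB 3.0: %.0f MB/s (%.0fx)\n",
           usb2_MBps, usb3_MBps, usb3_MBps / usb2_MBps);
    return 0;
}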

I agree with what you say to nVidia. They are really cutting themselves short there. I have seen quite a few people who wanted SLI but decided to get just one nVidia GPU, or went with a 3870X2 / CF 3870s, so they could have an Intel chipset to OC easier and better with.

AMD/ATI: it's all in the management. The acquisition of ATI caused some upheaval around the R600, which made it underperform compared to what it should have done.



From what we have seen so far, yes. I for one hope so. I am sick of seeing nVidia's top GPU over the $500 mark, keeping people from top-tier performance because they can't afford to pay that much. If it's true that a single HD4870 can easily outpace a 9800GX2 and an HD3870X2, great, but I will reserve judgement until it comes out.



Well, Intel already has a head start on raytracing, as they bought a company a while ago and have been working on it for some time. So even with what nVidia plans, they may be the ones having to fight back against Intel on that front. I understand nVidia, but I feel like they are getting in way over their heads by trying to push into a CPU-style area. I doubt we will ever have a setup with three GPUs, one acting as the CPU and two for graphics. I mean, unless nVidia has a way to force multiple instruction streams through a GPU without problems.

I hope this doesn't have adverse effects on us consumers. I personally like it the way it is. 2 CPU companies and 2 GPU companies. Easy to choose from.
 
I hope NV can get an X86 license ...

If AMD can stay afloat long enough, push their platform approach as good value for money - mobo, chipset, CPU, GPU - and get a bit more market share ... that would even up the balance.

At present Intel has crushed every other player in the market and finally NV has woken up to realise they are next for the chopping block.

Intel wants everything.

Intel is Skynet.

Sic 'em, Sarah.
 
Here's more on the Adobe demo: "While graphic professionals will undoubtedly have a lot of interest in the new Adobe products, the primary message was that Adobe finally recognizes the need for powerful GPUs, with the same level of importance that the industry has placed on CPUs. Using a very large 2 GB file for demonstration, the CPU became less utilized and takes a distant back seat to the more critical GPU, and everyday functions such as free transforms and manipulating angles become a lightning-fast operation with the right tools." It's looking good so far. http://benchmarkreviews.com/index.php?option=com_content&task=view&id=178&Itemid=46

"Video games are now seeing an added dependence on physics processing, just as well as GPU and CPU. Last-generation GPUs from NVIDIA's GeForce graphics processor series have already begun the shift of processing dependency of video games onto the GPU, so it won't be very much longer before the CPU really offers no added level of performance to video games."

Check the link out; it gives a few hints as to the need for a GPU, its future, its abilities, and who knows what comes after this. Maybe we don't need those top CPUs to run our games anymore.
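To see why something like a free transform scales so well on the GPU, here's a toy per-pixel kernel I sketched (my own illustration, nothing to do with Adobe's actual code): every pixel is independent, so a huge image just becomes hundreds of millions of tiny, identical jobs, which is exactly what a GPU is built for.

// Toy per-pixel adjustment, NOT Adobe's code: one GPU thread per pixel.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void adjust(unsigned char *pix, int n, float gain, float bias) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // global pixel index
    if (i < n) {
        float v = gain * pix[i] + bias;              // simple linear tweak
        pix[i] = (unsigned char)(v < 0.f ? 0.f : (v > 255.f ? 255.f : v));
    }
}

int main() {
    const int n = 4096 * 4096;                       // one 16-megapixel channel
    std::vector<unsigned char> img(n, 100);          // dummy image data

    unsigned char *d_img;
    cudaMalloc((void**)&d_img, n);
    cudaMemcpy(d_img, img.data(), n, cudaMemcpyHostToDevice);

    adjust<<<(n + 255) / 256, 256>>>(d_img, n, 1.2f, 10.0f);

    cudaMemcpy(img.data(), d_img, n, cudaMemcpyDeviceToHost);
    printf("pixel[0] after adjust: %d\n", img[0]);   // expect 130
    cudaFree(d_img);
    return 0;
}

Physics is the same story: lots of independent little calculations, which is why that quote has it shifting onto the GPU too.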


 
If a sleeping giant is awoken and is off the mark, the giant will still miss what it's aiming at. "After catching Jen-Hsun alone in the parking lot after the event, he wasn't at all apprehensive of what I might ask or hurried in his response. My question: "When do you foresee the next NVIDIA GPU to be produced using a smaller fabrication process?" To my complete surprise, Mr. Huang answered: "We count our blessings every time a product is designed." He went on to explain that the upcoming (unreleased) GeForce GPU is already riding the edge of technical boundaries (after the 17 June 2008 NDA is lifted, you will get a completely detailed explanation here), and that a reduced die process doesn't always translate into better performance or higher efficiency. It caught me a little off guard to hear the top NVIDIA rank give me a completely genuine answer, without the slightest tone of marketing gloss.

It's difficult to gauge a person and their agenda in only one day, but if Jen-Hsun Huang is as genuine towards his goals as he states, the industry is going to be tossed on its ear in short order. The biggest problem I see with so many large companies is that they have no voice. Take for example Intel and AMD: ask yourself who they are and what they stand for. See my point? I couldn't even begin to tell you their mission statement... but my insider observation leads me to believe it's all about profit. That's not to say that NVIDIA isn't taking their cut - they are publicly traded, after all - but every single thing they do inside their design rooms is to solve a consumer problem. I've been building and selling computers for almost nine years now, and only once have I ever really felt that the big-name manufacturers were working for the everyman (that was back when AMD introduced the Athlon series of processors). AMD might be trying to re-live that moment with affordable low-end solutions or neutered tri-core processors, but it's too little, perhaps too late. Intel on the other hand seems to be back on their Pentium 4 collision course, where it's now about cores the way it used to be about clock speed. I don't believe Intel is answering any real consumer need by producing ever-expanding core counts for a limited software canvas. It seems that NVIDIA is filling the long-needed void, and offers a revolutionary change for computing with the CUDA programming architecture and graphics cards that feature extremely efficient processors."
I know it's in its infancy, but that too is part of the point here. Given time, we will see some amazing things coming from GPUs, not CPUs. nVidia is already out there on the cutting edge; eventually ATI and even Intel will be too, but for now it's nVidia doing more for computing than Intel, AMD or anyone else at the moment. I like what I see, and I feel Intel should get in line with this.
 

wh3resmycar

I thought the reason xfire is enabled on Intel chipsets is that it doesn't require a special bridge chip like SLI, so practically you won't be seeing a special "AMD/ATI" badge on Intel chipsets... correct me if I'm wrong.
 

radnor



I second that. It will wipe out the tests :) We will be rocking our GPUs and our AMD CPUs :) Intel is trying to jump the gun. It failed before, and it will fail now.
 
ATM, no one knows exactly how it performs. The only marks we have are from the yet-to-be-released G280 vs the 3870, where the G280 tripled the 3870's performance. How that stacks up to performance in games and other CUDA-using apps, time will tell.
 

Amiga500




I meant more widespread, in terms of language complexity, flexibility of the approach, capabilities and limitations of the approach - as well as the comparative speeds. :)
 

spoonboy



"Tell me an application that really benefits from the GPU, except games and video-related ones" - well, um, hate to break it to you but thats what you buy a good gpu for. :hello: