Pat "Jen-sun" Gelsinger does it again LOL

In all fairness, it's a shame this kind of talk has to happen. x86 isn't going to be the end-all either; the Khronos Group will decide this. If people want to use their apps with what's current, they'll have to use CAL or CUDA, which some already do, and more are lined up to do so. Short term, Gelsinger's wrong; long term, he's more than likely wrong again. Until there's actually something out from Intel, it's just talk. Meanwhile, CUDA and CAL are actually here and doing things, real things. Will there be a better approach? More than likely. Will it be x86? No. But all this talk sounds like AMD before they had their quads out, and it doesn't make Intel look good. Someone should grab Pat and pull him aside.
 

spud

Distinguished
Feb 17, 2001


Then Intel flexes its compilers, profilers, marketing, education, and software partners, and yeah, CUDA and CAL just disappear.

Word, Playa.
 
And every app that's gone in that direction, and every person who bought and uses them, they go away too? And the companies too? No big deal, right? Flex, grumble, snort... it's not going to help or stop what's already taking place, only hurt. Intel isn't getting it, and they still don't have it. Till they do, it looks like Hector to me.
 

dattimr

Distinguished
Apr 5, 2008
I was going to post that for you, jaydee. Hahaha. Well...

Honestly, that's not worse than the green "can of whoop-ass" thing. Nvidia is having problems even with the supposedly fallen (not at all, as it turns out) ATI right now. As I said in another post, you simply can't compare the "market share" of x86 vs. CUDA-capable GPUs (especially with the super-mainstream-CUDA-capable GTX 2XX pricing). Even if you could do something 10x faster with CUDA, why would you bother with, hmm, ergh, the 15%-20% of the market who own a GeForce 8 or above? Of course you would bother if you take some "specific market segments" into consideration, but I still can't see much of a benefit for the average desktop user. Sorry, but even Photoshop and a simple video encoder aren't enough for that. The promises have been around for too long.

Larrabee: maybe it'll wipe the floor with every GeForce and Radeon ever made, or maybe it won't. If it's better than anything else, shouldn't we go for it? I just don't care if it's Intel's or not. I'm sure people will also think this way when they look at the benchmarks.

Soon we'll have mainstream Nehalems - just as we'll have CPUs with 8 cores. Would the benefit of using CUDA be so great in that scenario? I don't think so, but that's just my 2 cents.
 
I am laughing more at the people who posted responses. Intel's idea sounds good in theory, and if Larrabee's vector engine works well, it will add a nice twist to the GPU arena (and maybe we won't have to pay an arm and a leg for an nVidia card).

But people were stating that the GTX 280 whoops the 4870... from what I saw it wasn't all that different, and for the price, I think the 4870 has a better chance this round.

One talked about shaders and how nVidia has 256 compared to Larrabee's 8 and 16... does he even realize the difference?

People amaze me. I for one don't see CUDA as a viable replacement for the CPU. It may get a little market niche, but that's it.
 

dattimr

Distinguished
Apr 5, 2008
One talked about shaders and how nVidia has 256 compared to Larrabee's 8 and 16... does he even realize the difference?

Dude, I almost created an account just to flame that one. :ouch:

People amaze me. I for one don't see CUDA as a viable replacement for the CPU. It may get a little market niche, but that's it.

Yeah. That's my vision too. :sleep:
 
We aren't talking a 100% increase, we aren't talking 60%; in some apps we are talking a 1000% increase, so doubling something that is already much slower than what we have isn't getting there. The CPU isn't the way to do certain things. Period. Traditionally it has been, and it's been nothing but promises, hints, rumors. Not any more. The promises of GPUs are real, and here. The traditional use of CPUs for encoding is history, no longer adequate, nor needed. Those special apps, like the supercomputers, various medical uses, etc., they'll never go in a CPU direction until there's actually something more than rumors, promises and hype, which at this point is all Intel can actually give us. The shoe's on the other foot, and it's Intel that has to come through, quit talking, and start doing. Until I see it, it's just rumor and hype.
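Just to illustrate the kind of work I mean (a toy sketch I made up, not code from Badaboom or any real encoder): the same per-pixel operation a CPU grinds through one pixel at a time gets spread across thousands of CUDA threads at once, and that's where those huge speedups come from.

// Toy CUDA sketch: brighten every pixel of a frame in parallel.
// Purely illustrative -- not taken from any real encoder.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void brighten(unsigned char* pixels, int n, int amount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;               // clamp to 8-bit range
    }
}

int main()
{
    const int n = 1920 * 1080;                       // one 1080p luma plane
    unsigned char* d_pixels;
    cudaMalloc(&d_pixels, n);
    cudaMemset(d_pixels, 100, n);                    // fake frame data

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_pixels, n, 30);  // thousands of threads run at once
    cudaDeviceSynchronize();

    cudaFree(d_pixels);
    printf("done\n");
    return 0;
}

An encoder obviously does far more per pixel than that, but the parallelism is the point: a quad-core walks the frame four pixels at a time, the GPU takes the whole frame in a few passes.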
 
Don't think AMD is going to sit still either. And don't think the G200 series is the only GPU capable of using CUDA more effectively than any CPU out there. Seems to me any 8xxx series or newer can use CUDA. And Brook+ is a very viable option as well. By the time Intel actually puts anything out, both will have a decent start in many apps, both wide use and narrow use.
 
Think of it this way. Someone who does a ton of encoding sees Pat "Jen-Hsun's" comments, and instead of getting that app that'll save him 300% or more in time/usage, he just has to wait and won't use it, 'cause Pat said so. Yeah, right.
 

NMDante

Distinguished
Oct 5, 2002
I don't see why you are getting all up in arms over Pat's comments. So he said that CUDA wouldn't last, and that Larrabee will be IA-based, easily programmed using existing x86 instruction sets.

Why get so upset? If CUDA takes off, it will be Intel that has to play catch-up. You act as though he has the power to just make CUDA or CAL fade away.

Why not just wait and see how good or bad Larrabee is before jumping to conclusions?
 

dattimr

Distinguished
Apr 5, 2008


What has CUDA given *us* - desktop users - so far? Do *we* - desktop users - need to do what supercomputers do?

Also, if I remember Anand's "preview" of the CUDA video encoder correctly, they couldn't even tell what quality settings were used, since they were locked. I also remember the article saying that the video encoder was CPU-dependent too. If it encodes 1000x faster with 10x less quality, then I don't care about it.

If a Nehalem octo-core can do the same encoding 10x slower but can do 100,000 other things better - or things only it can do - then I also don't care about CUDA.

Until we see a final Badaboom video encoder release or any other CUDA-capable desktop software, it's just rumor and hype. Period.
 

dattimr

Distinguished
Apr 5, 2008


Right. Both Nvidia and Intel are just spreading FUD right now. Let's wait a little more.
 
Look, this whole thing started with Gelsinger; that's why the response, because he hasn't stopped. I'd ask you to reread that Anand article, where they said they didn't even have the latest or best solution out, which solved a few issues and increased performance. Q3 is when we will see these apps. And if you've ever been sick or known someone who has, then you might be surprised by the medical imagery being used. It's a shame Pat can't keep quiet, like it's hurting him somehow. And we know it'll be a while before Intel brings out Larrabee, so all these apps will have time to grow and flourish. It's funny you say I'm upset; I'm laughing. This response from Gelsinger is laughable, and sad. Either it's worthless, or he should shut up. Time will tell.
 
Gelsinger also took the opportunity to trash-talk the Cell processor, which is a mistake. The last thing Intel would want is the Cell processor being released into the mainstream and desktop markets, especially since Toshiba announced they will be producing laptops with Cell processors. The Cell processor and nVidia's CUDA could put a serious hurting on Larrabee in multimedia applications.

The general-purpose processor isn't going to go away anytime soon, and unless something like Larrabee can prove it has the juice, guys like Gelsinger should hold their tongues.
 

NMDante

Distinguished
Oct 5, 2002
He is just talking. Just like the CEO of nVidia. Just like Henri Richard. Just like anyone who wants to highlight their product.

It will all fall on him if his predictions don't come true. Seriously, who cares? If Intel wants to push developers to use instructions that help Larrabee, that's their business. If nVidia or AMD want the same thing, they have to push their own agendas.

I still don't see what the big deal is. It's not like Pat lied about anything. His comment about the Cell is true. He never said anything about its uses in medical equipment, but about how it was touted as the thing to replace the CPU in everyday machines. It hasn't, due to its complex programming needs. I'm sure it could be a viable competitor if it were easier to program.

I would just take what Pat or anyone else says with two large bags of salt. It's the start of Q3, and it's time to push products - present and future.
 

dattimr

Distinguished
Apr 5, 2008


Both Gelsinger and Huang should shut up. You're just picking on Intel. OK, Gelsinger started the whole thing... but then what? Has Intel failed to deliver in the past 2 years?

I *love* the idea behind Folding@Home, jaydee, but is that enough for everyone to buy a CUDA-capable GPU? Can't supercomputers already do that with CUDA? Can't we do enough with our quad-cores or Radeons?

Perhaps this whole CUDA thing will be the next "Itanish" - from Itanium - thing. Well, actually, they say it's quite easy to use... But will it ever make it to the average user? Who knows? I'm a desktop user, and I shouldn't care about what it can do in a scientific environment. Let scientists worry about it and decide for themselves. If it's better for *them*, then I'm OK with it. But if it's not better for *me*...

Show me some useful CUDA-capable desktop software and I'll give it a try. Do you have any links? CUDA has been around for more than 2 years now, hasn't it? I don't have any benchmarks of Larrabee kicking ass either. Both = FUD.
 

MadHacker

Distinguished
May 20, 2006
All Intel has to do is buy all the companies that use CUDA, or offer them incentives to stay with Intel. Remove the demand commercially and it will fade.

Intel can make that happen.
 


You do realize the one thing you're saying here is kinda wrong, right? Cell is very powerful, especially for what it does in the PS3. But from what I have seen, Cell is not as good in an x86-based environment.

That, and the fact that most game companies hate coding for Cell, as it is very complicated compared to x86. Valve's Gabe Newell himself trash-talked the Cell, saying it was too complicated to code games for, whereas the PC and 360 were simple (they are very similar).

So here is my thing. Where do nVidia and ATI make most of their money? Either media-type PCs that use FireGLs and such, or the desktop PC market. Out of all of those sales, how many people are going to switch to or even use CUDA? I think a small number may use it for Folding@Home, but I doubt it will change over.

Everyone is making assumptions about Larrabee, and you know what that makes people, but we have to wait and see. Intel may have something amazing or they may have an absolute flop. We shall have to see. Although this info may be meant to mislead people, and they could be incorporating their Terascale chip into Larrabee... that would be interesting.
 
So can Toshiba, and IBM, and whoever else sees something worthwhile. I want to see what can be done with this, don't you? Do you want to spend 8 hrs encoding instead of 30 minutes? There are going to be good uses for this. If it works, people will buy it. And you're right, both should just shut up; that's why the comparison between the two. I don't care if it's nVidia or Toshiba or AMD, so far I like what I've seen. It can only get better. They all want it their way, which is why I'm thinking Khronos will have to decide. Money only works when something else isn't worth it; we will see if this is worth it. I'm hoping it will be. Something new, with maybe more abilities that we haven't seen yet.
 
Crysis brings so many bottlenecks, CPU and GPU. And any GPU is bottlenecked when it comes to potential throughput. There's really nothing out there that's being used to full potential anyway. Some areas of Crysis go beyond what any GPU can do, and don't ask all of what the GPU can do in other areas; same for the CPU, regarding Crysis. I agree, where'd that come from? I'm hoping this form of crying "Intel's dad is bigger 'n your dad" and "no, nVidia's dad is bigger 'n yours" crap ends soon, and we see if CUDA takes off, and whether Larrabee will actually be a viable alternative, regardless of Intel's x86 structuring or not. Remember, that's key to begin with. If Larrabee is a dog, it won't matter how pretty all that x86 coding looks to devs if no one wants it. And same with CUDA: if it doesn't come through on its promises, no one's going to bother with it either. I'm looking forward to seeing what it can do. If it lives up to its potential, no amount of money or influence will stop people from buying/using it. At this point, and I may be wrong, but it looks like Intel has the Beta (Sony) and nVidia has the VHS. Beta was better, did more, etc., but it was too late; people already had VHS and liked it too much.
 

dattimr

Distinguished
Apr 5, 2008
Yeah, but it's Intel's, you know. I completely agree about the encoding time thing and seeing what CUDA can really do, jaydee; however, it looks like Nvidia doesn't want to share it at all. The world is based on x86, and that's what I believe in as of now. Of course things change and will keep changing, and that's the way it should be, but I still don't see it happening anytime soon, at least not in the way Nvidia is talking. Their promises to the average desktop user are also late, so Larrabee still has all the time it needs to get to the shelves. Besides, we should remember the 64-bit tragedy. How long has it been around, and almost nobody has even a single damn driver compiled in 64-bit? How many people don't have a 64-bit-capable CPU? Compare that number to the number of people who can't run CUDA. Just because it's "better" doesn't mean that it will ever take off. Unfortunately.