
Pat "Jen-sun" Gelsinger does it again LOL

July 2, 2008 12:10:34 PM

Here's more hype for hope from Intel: http://www.custompc.co.uk/news/602868/intel-cuda-will-b... Either they're really worried, or this is just getting too funny. One thing: familiarity breeds contempt. He sounds like Jen-sun's lil bro, heheh. Makes Hector look good.
July 2, 2008 12:45:06 PM

Geez, can people stop predicting the future?
July 2, 2008 12:46:49 PM

In all fairness, it's a shame this kind of talk has to happen. x86 isn't going to be the end-all either. The Khronos Group will decide this. If people want to use their apps with what's current, they'll have to use CAL or CUDA, which some already are, and there are more lined up to do so. Short term, Gelsinger's wrong; long term, he's more than likely wrong again. Until there's actually something out from Intel, it's just talk. Meanwhile, CUDA and CAL are actually here, and doing things, real things. Will there be a better approach? More than likely. Will it be x86? No. But all this talk sounds like AMD when they didn't have their quads out yet. This doesn't make Intel look good. Someone should grab Pat and pull him aside.
July 2, 2008 1:00:01 PM

JAYDEEJOHN said:
In all fairness, it's a shame this kind of talk has to happen. x86 isn't going to be the end-all either. The Khronos Group will decide this. If people want to use their apps with what's current, they'll have to use CAL or CUDA, which some already are, and there are more lined up to do so. Short term, Gelsinger's wrong; long term, he's more than likely wrong again. Until there's actually something out from Intel, it's just talk. Meanwhile, CUDA and CAL are actually here, and doing things, real things. Will there be a better approach? More than likely. Will it be x86? No. But all this talk sounds like AMD when they didn't have their quads out yet. This doesn't make Intel look good. Someone should grab Pat and pull him aside.


Then Intel flexes its compilers, profilers, marketing, education, and software partners, and yeah, CUDA and CAL just disappear.

Word, Playa.
July 2, 2008 1:36:47 PM

And every app that's gone in that direction, and every person who bought them and uses them, they go away too? And the companies too? No big deal, right? Flex, grumble, snort... it's not going to help or stop what's already taking place, only hurt. Intel isn't getting it, and they still don't have it. Till they do, it looks like Hector to me.
July 2, 2008 1:56:41 PM

I was going to post that for you, jaydee. Hahaha. Well...

Honestly, that's not worse than the green "can of whoop-ass" thing. Nvidia is having problems even with the supposedly fallen (not at all, by now) ATI. As I said in another post, you simply can't compare the "market share" of x86 vs CUDA-capable GPUs (especially with the super-mainstream-CUDA-capable GTX 2XX pricing). Even if you could do something 10x faster with CUDA, why would you bother with, hmm, ergh, the 15%-20% of the market who own a GeForce 8 or above? Of course you would bother if you take some "specific market segments" into consideration, but I still can't see much of a benefit for the average desktop user. Sorry, but even Photoshop and a simple video encoder aren't enough for that. The promises have been around for too long.

Larrabee: maybe it'll wipe the floor with every GeForce and Radeon ever made, or maybe it won't. If it's better than anything else, shouldn't we go for it? I just don't care if it's Intel's or not. I'm sure people will also think this way when they look at the benchmarks.

Soon we'll have mainstream Nehalems - just as we'll have CPUs with 8 cores. Would the benefit of using CUDA be so great in that scenario? I don't think so, but that's just my 2 cents.
July 2, 2008 2:08:54 PM

I am laughing more at the people who posted responses. Intel's idea sounds good in theory, and if Larrabee's vector engine works well, it will add a nice twist to the GPU arena (and maybe we won't have to pay an arm and a leg for an nVidia card).

But people were stating that the GTX 280 whoops the 4870... from what I saw it wasn't all that different, and for the price, I think the 4870 has a better chance this round.

One talked about shaders and how nVidia has 256 compared to Larrabee's 8 and 16... does he even realize the difference?

People amaze me. I for one don't see CUDA as a viable replacement for the CPU. It may get a little market niche, but that's it.
July 2, 2008 2:12:20 PM

Quote:
One talked about shaders and how nVidia has 256 compared to Larrabee's 8 and 16... does he even realize the difference?


Dude, I almost created an account just to flame that one. :ouch: 

Quote:
People amaze me. I for one don't see CUDA as a viable replacement for the CPU. It may get a little market niche, but that's it.


Yeah. That's my view too. :sleep:
July 2, 2008 2:14:59 PM

We aren't talking a 100% increase, we aren't talking 60%; in some apps we are talking a 1000% increase, so doubling something that is already much slower than what we have isn't getting there. The CPU isn't the way to do certain things. Period. Traditionally it has been, and it's been nothing but promises, hints, rumors. Not any more. The promises of GPUs are real, and here. The traditional use of CPUs for encoding is history, no longer adequate, nor needed. Those special apps, like the supercomputers, various medical uses, etc., will never go in a CPU direction until there's actually something more than rumors, promises and hype, which at this point is all Intel can actually give us. The shoe's on the other foot, and it's Intel that has to come through, quit talking, and start doing. Until I see it, it's just rumor and hype.
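To picture why the gap can be that big, here's a minimal, hypothetical CUDA sketch (the kernel, names and sizes below are made up for illustration, not taken from any shipping encoder) of the kind of per-pixel work encoders do. Every pixel gets its own thread, so the whole frame is processed in one parallel pass instead of one CPU loop crawling through it:

Code:
#include <cuda_runtime.h>

// Hypothetical example: one thread per pixel, clamped brightness bump.
__global__ void brighten(unsigned char* pixels, int n, int delta)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = pixels[i] + delta;
        pixels[i] = v > 255 ? 255 : v;   // clamp to the 8-bit range
    }
}

int main()
{
    const int n = 1920 * 1080;                  // one frame of luma samples
    unsigned char* d_pixels;
    cudaMalloc((void**)&d_pixels, n);           // frame buffer on the GPU
    cudaMemset(d_pixels, 0, n);                 // stand-in for real frame data

    int threads = 256;
    int blocks  = (n + threads - 1) / threads;  // 8100 blocks cover the frame
    brighten<<<blocks, threads>>>(d_pixels, n, 16);
    cudaDeviceSynchronize();

    cudaFree(d_pixels);
    return 0;
}

A real encoder obviously does far more per pixel, but the shape is the same: "do this to two million pixels at once" is exactly what a GPU's hundreds of ALUs are built for, and exactly what a handful of CPU cores aren't.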
July 2, 2008 2:20:00 PM

Don't think AMD is going to sit still either. And don't think the G200 series is the only GPU capable of using CUDA more effectively than any CPU out there. Seems to me any 8xxx series or newer can use CUDA. And Brook+ is a very viable option as well. By the time Intel actually puts out anything, both will have a decent start in many apps, both wide use and narrow use.
July 2, 2008 2:22:45 PM

Think of it this way: someone who does a ton of encoding sees Pat Jen-sun's comments, and instead of getting that app that'll save him 300% or more in time/usage, he just has to wait, and won't use it, 'cause Pat said so. Yeah, right.
July 2, 2008 2:32:40 PM

I don't see why you are getting all up in arms over Pat's comments. So he said that CUDA wouldn't last, and that Larrabee will be AI based, easily programmed using existing x86 instruction sets.

Why get so upset? So, if CUDA takes off, it will be Intel who will need to play catch-up. You act as though he has the power to just make CUDA or CAL fade away.

Why not just wait and see just how good or bad Larrabee is before jumping to conclusions?
July 2, 2008 2:33:32 PM

JAYDEEJOHN said:
We aren't talking a 100% increase, we aren't talking 60%; in some apps we are talking a 1000% increase, so doubling something that is already much slower than what we have isn't getting there. The CPU isn't the way to do certain things. Period. Traditionally it has been, and it's been nothing but promises, hints, rumors. Not any more. The promises of GPUs are real, and here. The traditional use of CPUs for encoding is history, no longer adequate, nor needed. Those special apps, like the supercomputers, various medical uses, etc., will never go in a CPU direction until there's actually something more than rumors, promises and hype, which at this point is all Intel can actually give us. The shoe's on the other foot, and it's Intel that has to come through, quit talking, and start doing. Until I see it, it's just rumor and hype.


What has CUDA given *us* - desktop users - so far? Do *we* - desktop users - need to do what supercomputers do?

Also, if I remember Anand's "preview" of the CUDA video encoder, they couldn't even tell what quality settings were used, since they were locked. I also remember the article saying that the video encoder was CPU dependent too. If it encodes 1000x faster with 10x less quality, then I don't care about it.

If a Nehalem octo-core can do the same encoding 10x slower but can do 100,000 other things better - or things only it can do - then I also don't care about CUDA.

Until we see a final release of the Badaboom video encoder or any other CUDA-capable desktop software, they're just rumor and hype. Period.
July 2, 2008 2:37:46 PM

NMDante said:
I don't see why you are getting all up in arms over Pat's comments. So he said that CUDA wouldn't last, and that Larrabee will be AI based, easily programmed using existing x86 instruction sets.

Why get so upset? So, if CUDA takes off, it will be Intel who will need to play catch-up. You act as though he has the power to just make CUDA or CAL fade away.

Why not just wait and see just how good or bad Larrabee is before jumping to conclusions?


Right. Both Nvidia and Intel are just spreading FUD by now. Let's wait a little more.
July 2, 2008 2:45:37 PM

Look, this whole thing started with Gelsinger; that's why the response, because he hasn't stopped. I'd ask you to reread that Anand article, where they said they didn't even have the latest or best solution out, which solved a few issues and increased performance. Q3 is when we will see these apps. And if you've ever been sick or know someone who has, then you might be surprised by the medical imagery being used. It's a shame Pat can't keep quiet, like it's hurting him somehow. And we know it'll be a while before Intel brings out Larrabee, so all these apps will have time to grow and flourish. It's funny you say I'm upset; I'm laughing. This response from Gelsinger is laughable, and sad. Either it's worthless, or he should shut up. Time will tell.
July 2, 2008 2:54:33 PM

Gelsinger also took the opportunity to trash-talk the Cell processor, which is a mistake. The last thing Intel would want is for the Cell processor to be released into the mainstream and desktop markets, especially since Toshiba announced they will be producing laptops with Cell processors. The Cell processor and nVidia's CUDA could put a serious hurting on Larrabee in multimedia applications.

The general-purpose processor isn't going to go away anytime soon, and unless something like Larrabee can prove that it has the juice, guys like Gelsinger should hold their tongues.
July 2, 2008 2:55:24 PM

He is just talking. Just like the CEO of nVidia. Just like Henri Richards. Just like anyone who wants to highlight their product.

It will all fall on him if his predictions don't come true. Seriously, who cares? If Intel wants to push developers to use instructions that help Larrabee, that's their business. If nVidia or AMD want the same thing, they have to push their agendas.

I still don't see what the big deal is. It's not like Pat lied about anything. His comment about the Cell is true. He never said anything about its uses in medical equipment, but about how it was touted as the thing to replace the CPU in everyday machines. It hasn't, due to its complex programming needs. I'm sure it could be a viable competitor if it were easier to program.

I would just take what Pat or anyone else says with two large bags of salt. It's the start of Q3, and it's time to push products - present and future.
July 2, 2008 3:09:59 PM

JAYDEEJOHN said:
Look, this whole thing started with Gelsinger; that's why the response, because he hasn't stopped. I'd ask you to reread that Anand article, where they said they didn't even have the latest or best solution out, which solved a few issues and increased performance. Q3 is when we will see these apps. And if you've ever been sick or know someone who has, then you might be surprised by the medical imagery being used. It's a shame Pat can't keep quiet, like it's hurting him somehow. And we know it'll be a while before Intel brings out Larrabee, so all these apps will have time to grow and flourish. It's funny you say I'm upset; I'm laughing. This response from Gelsinger is laughable, and sad. Either it's worthless, or he should shut up. Time will tell.


Both Gelsinger and Huang should shut up. You're just being picky with Intel. OK, Gelsinger started the whole thing... But then what? Has Intel failed to deliver in the past 2 years?

I *love* the idea behind Folding@Home, jaydee, but is that enough for everyone to buy a CUDA-capable GPU? Can't supercomputers already do that with CUDA? Can't we do enough with our quad-cores or Radeons?

Perhaps this whole CUDA thing will be the next "Itanish" - from Itanium - thing. Well, actually, they say it's quite easy to use it... But will it ever make it to the average user? Who knows? I'm a desktop user and I shouldn't care about what it can do in a scientific environment. Let scientists worry about it and decide for themselves. If it's better for *them*, then I'm OK with it. But if it's not better for *me*...

Show me a useful piece of CUDA-capable desktop software and I'll give it a try. Do you have any links? CUDA has been around for more than 2 years now, hasn't it? I don't have any benchmarks of Larrabee kicking ass either. Both = FUD.
July 2, 2008 3:11:02 PM

All Intel has to do is buy all the companies that use CUDA, or offer them incentives to stay with Intel. Remove the demand commercially and it will fade.

Intel can make that happen.
July 2, 2008 4:10:15 PM

chunkymonster said:
Gelsinger also took the opportunity to trash-talk the Cell processor, which is a mistake. The last thing Intel would want is for the Cell processor to be released into the mainstream and desktop markets, especially since Toshiba announced they will be producing laptops with Cell processors. The Cell processor and nVidia's CUDA could put a serious hurting on Larrabee in multimedia applications.

The general-purpose processor isn't going to go away anytime soon, and unless something like Larrabee can prove that it has the juice, guys like Gelsinger should hold their tongues.


You do realize that one thing you're saying here is kind of wrong, right? Cell is very powerful, especially for what it does in the PS3. But from what I have seen, Cell is not as good in an x86-based environment.

That, and the fact that most game companies hate coding for Cell, as it is very complicated compared to x86. Valve's Gabe Newell himself trash-talked the Cell, saying it was too complicated to code games for, whereas the PC and 360 were simple (they are very similar).

So here's my thing: where do nVidia and ATI make most of their money? Either media-type PCs that use the FireGLs and such, or the desktop PC market. Out of all of those sales, how many people are going to switch to or even use CUDA? I think a small number may use it for Folding@Home, but I doubt it will change over.

Everyone is making assumptions about Larrabee, and you know what that makes people, but we have to wait and see. Intel may have something amazing, or they may have an absolute flop. We shall have to see. Although this info may be meant to mislead people, and they could be incorporating their Terascale chip into Larrabee... that would be interesting.
July 2, 2008 4:12:14 PM

So can Toshiba, and IBM, and whoever sees something worthwhile. I want to see what can be done with this, don't you? You want to spend 8 hours encoding instead of 30 minutes? There are going to be good uses for this. If it works, people will buy it. And you're right, both should just shut up; that's why the comparison between the two. I don't care if it's nVidia or Toshiba or AMD; so far I like what I've seen. It can only get better. They all want it their way, which is why I'm thinking Khronos will have to decide. Money only works when something else isn't worth it; we will see if this is worth it. I'm hoping it will be. Something new, with maybe more abilities that we haven't seen yet.
July 2, 2008 6:35:04 PM

It's amazing how 1 teraflop of "potential" computing power can't run Crysis @ a steady 60fps...
July 2, 2008 6:57:05 PM

wh3resmycar said:
It's amazing how 1 teraflop of "potential" computing power can't run Crysis @ a steady 60fps...


Where does this come from? A 3870X2 has 1 TFlop of potential power and cannot run Crysis maxed out.
July 3, 2008 12:05:04 PM

Crysis brings so many bottlenecks: CPU, GPU. And any GPU is bottlenecked when it comes to potential throughput. There's really nothing out there that's being used to its full potential anyway. Some areas of Crysis go beyond what any GPU can do, and don't ask all of what a GPU can do in other areas; same for the CPU, regarding Crysis. I agree, where'd that come from? I'm hoping this form of crying "Intel's dad is bigger'n your dad" and "no, nVidia's dad is bigger'n yours" crap ends soon, and we see if CUDA takes off and whether Larrabee will actually be a viable alternative, regardless of Intel's x86 structuring or not. Remember, that's key to begin with. If Larrabee is a dog, it won't matter how pretty all that x86 coding looks to devs if no one wants it. And same with CUDA: if it doesn't come through on its promises, no one's going to bother with it either. I'm looking forward to seeing what it can do. If it lives up to its potential, no amount of money nor influence will stop people from buying/using it. At this point, and I may be wrong, but it looks like Intel has the Beta (Sony) and nVidia has the VHS. Beta was better, did more, etc., but it was too late; people already had VHS and liked it too much.
July 3, 2008 12:42:46 PM

Yeah, but it's Intel's, you know. I completely agree about the encoding time thing and about seeing what CUDA can really do, jaydee; however, it looks like Nvidia doesn't want to share it at all. The world is based on x86 and that's what I believe in as of now. Of course things change and will keep changing, and that's the way it should be, but I still don't see it happening anytime soon, at least not in the way Nvidia is talking. Their promises to the average desktop user are also late, so Larrabee still has all the time it needs to get to the shelves. Besides, we should remember the 64-bit tragedy. How long has it been around, and almost nobody has even a single damn driver compiled in 64-bit? How many people don't have a 64-bit capable CPU? Compare that number to the number of people who can't run CUDA. Just because it's "better" doesn't mean that it will ever take off. Unfortunately.
July 3, 2008 1:10:08 PM

I know what you're saying. But this can be used by all sorts of different and competing products. Also, the GPU is being used more and more in things other than desktop, and making headway there. Why is that? x86 isn't everything; it's only one thing. It'll be handy if Intel comes through with Larrabee; if not, it may turn out to be Brook+. Who really knows. Just BECAUSE something is x86 doesn't in itself make it better, and especially so if, say, CUDA products already exist and are filling that niche. Here's a question: since Larrabee will be limited to x86, won't it be useless in server apps most of the time, thus leaving the door open to ATI and nVidia?
July 3, 2008 1:23:01 PM

As of now the problem is just what you said: it fills the needs of just a niche. Perhaps DirectX 11 can change the way we see the GPU. Microsoft still defines a lot of what we'll see or not, so we still have to ask them. Anyway, it looks like Apple is starting something interesting regarding GPU acceleration with "Snow Leopard". Let's wait and see.

But I fear that CUDA might end up just like USB 3.0: it's somebody's, and that's it.
July 3, 2008 2:50:43 PM

Well jaydee, I do know that the 4870 can play Crysis @ 1680x1050 and average in the 50 FPS range. Maybe the 4870X2 (which should have a potential throughput of 2.4 TFlops) will play it even better.

I agree. They both need to stop. nVidia needs to stop talking and so does Intel. nVidia also needs to stop charging so much for a very small increase in performance.
July 3, 2008 3:23:47 PM

Look, it's not that CUDA is a world-shaker; it's not. But the attitude of this whole thing intrigues me. Why are all the CPU/Intel people so up in arms? Is this really a threat? To go so far to put it down shows there's already something to this, and we all know that of the three, Intel is the most behind. Putting it down and throwing money around won't stop people from trying to make money on something that's already here. How much growth and usefulness we will see from it, who really knows. One thing's for sure: there are apps out that prove usefulness now; the rest is all talk, and nothing but talk. So until Intel can actually show they have anything that can compete with this, it's only talk, and coming from Pat Gelsinger, the words he's using are either bad form or over-responsiveness, neither of which puts Intel in a positive light. Also, like we've seen with the 4xxx series, you can bet AMD isn't just sitting still. I'd like to point out also that previous leaks hurt AMD/ATI, and that's something that's changed. Having nothing but words is one thing, even from a company as admired as Intel, but having something that's actually here and working is something altogether different. One in the hand...
July 15, 2008 7:21:24 PM

What the hell are you going on about?

You sound like the kid in Enter the Dragon that keeps staring at Bruce Lee's finger when he's pointing at the moon.
July 15, 2008 7:35:02 PM

^ One of the best parts, next to Bruce Lee snapping Jackie Chan's neck.
July 15, 2008 7:50:55 PM

-1 for Intel's E-peen

Although Larrabee also holds high hopes, Intel shouldn't really underestimate the competition.

However, I guess Nvidia is also the same...:sarcastic: 
July 15, 2008 7:52:19 PM

wh3resmycar said:
It's amazing how 1 teraflop of "potential" computing power can't run Crysis @ a steady 60fps...


Maybe because Crysis is extremely poorly coded? Maybe because all of that computing power can be used more efficiently somewhere else (Folding, for example)?
July 15, 2008 7:57:05 PM

yomamafor1 said:
Maybe because Crysis is extremely poorly coded? Maybe because all of that computing power can be used more efficiently somewhere else (Folding, for example)?


This is what I have been saying all along. My fave example was the memory leak that came with the retail version.

Most people don't listen to me, but heck, even Cerv (or whatever the head of Crytek's name is) said that it's not coded well and is poorly optimized in the last PC Gamer (August issue, I think).
July 15, 2008 8:11:20 PM

How can people not understand that Crysis is poorly written? Comparing the graphics with COD4, they're on par, but COD4 runs a lot better. Comparing the physics with Advanced Warfighter, they're on par, but AW doesn't need Tri-SLI GTX 280s just to run smoothly. Comparing the CPU utilization with Supreme Commander, SC actually uses more cores, yet still runs extremely well.

Crysis is probably, IMO, the most hyped junk in the history of computer gaming; it even surpasses Phenom (if they're comparable). The graphics sucked if you don't have a 2500-buck computer; the gameplay sucked because there's no variance (like TC said); and the storyline sucked because it's so cliché.
July 15, 2008 8:27:43 PM

Yeah, I agree. I think the "one guy takes on an entire alien armada trying to take over Earth" thing has been a bit overdone.

HL was one of the first to do this and did it well. The rest after a while have gotten so bleh.

Crysis is very poorly coded and optimized. Its story was bland, and they are releasing an expansion that looks to be better story-wise (one would hope), with a more optimized engine and better coding.
July 15, 2008 8:58:18 PM

yomamafor1 said:
How can people not understand that Crysis is poorly written? Comparing the graphics with COD4, they're on par, but COD4 runs a lot better. Comparing the physics with Advanced Warfighter, they're on par, but AW doesn't need Tri-SLI GTX 280s just to run smoothly. Comparing the CPU utilization with Supreme Commander, SC actually uses more cores, yet still runs extremely well.

Crysis is probably, IMO, the most hyped junk in the history of computer gaming; it even surpasses Phenom (if they're comparable). The graphics sucked if you don't have a 2500-buck computer; the gameplay sucked because there's no variance (like TC said); and the storyline sucked because it's so cliché.


There is no question that Crysis runs horribly on most systems. However, I don't see how you can say that the graphics are the same as COD4's. Crysis is still ahead in almost every way (graphically; I won't say the gameplay was amazing).
July 15, 2008 9:08:15 PM

They are very comparable. Crysis' Very High setting may be slightly better, but by then I'd be better off running a slideshow.