
PSSSsss.... and so the Larrabee hype starts to deflate

Tags:
  • Graphics Cards
  • Graphics
April 25, 2008 7:37:07 PM
April 25, 2008 9:07:13 PM

When it comes to Larrabee, I'll withhold opinion, good or bad, until it's out. By reserving that opinion, I won't be at all disappointed if it doesn't do everything that some of the pundits and fans desire. As for Nvidia, I think their declared war is nonsense and will only hurt them in the end. I suspect they are starting to realize it and are getting desperate. Some of the rash statements that have been made by Nvidia execs, like "the CPU is dead", remind me of some of the bull that came out of AMD during the past year and a half. Nonsense. A computer can run perfectly fine without an add-in card from either Nvidia or AMD/ATI while using integrated graphics, but a computer cannot run without its CPU, no matter how good the add-in graphics card is.
April 25, 2008 11:32:00 PM

^+1
April 26, 2008 2:39:46 AM

The negligence of the "CPU is dead" statement amazes me. NV bases that claim on the notion that CPUs can't run anything faster than they can now. Calling current technology the top of the mountain is just ignorant. Time and time again, when technology seems to have reached its limits, something new emerges and overtakes the old.

I agree with that, sailer. To add to it, older CPUs bottleneck newer GPUs. GPUs NEED top-notch CPUs to run at their max potential. What NV claims doesn't even make sense.
April 26, 2008 3:36:47 AM

Let's just say they need each other, like Rob Enderle said. He also said this didn't need to happen, and he blames Intel for it. And as in my other post here, Intel started this mess, this miscommunication, if you will. It doesn't hurt nVidia at all? Anyone know a few nVidia fanboys? You think they think less of nVidia? Take a lesson from the AMD fanboys; they're still here, aren't they?
April 26, 2008 7:14:45 AM

If Intel creates cards that can beat nVidia and ATI cards hands down, it's actually good news...
Where there's competition there's improvement.
nVidia and AMD/ATI will be going all out to destroy Intel's cards (if they exist and defeat nVidia and ATI).
And all this will lead to greater cards with greater leaps, and GREATER PRICES!
April 27, 2008 8:29:17 AM

:lol:  Does anyone actually know who is panicking? :bounce:  Nvidia is so dead; their chipset business is almost dead. How are they going to compete with both AMD and Intel when both have their own platforms?
What Tom's posted years ago was true: they don't want Nvidia to be part of it, and now it is happening. Woo-hoo, Nvidia can die. :whistle: 

Now now, Nvidia fanbois, in the olden days the CPU was not powerful enough to be a GPU, but now that they have so many cores it is high time to fully utilise them. AMD has got its own stuff, and so does Intel. :D 
April 27, 2008 8:52:13 AM

Romulus, can you tell me where that pic is from? It seems damn familiar.
April 27, 2008 9:23:02 AM

What I find interesting is that the current CPUs are good enough for anything. And really, they can't be perfected a lot more. The only way CPUs can get better is by going multicore, which is why Intel is following (yes, I'm saying it) AMD by going to a single die, multicore, with an IMC, just like AMD has had for years. Here's the problem as I see it. If CPUs ever get to the point of doing graphics, where there's still tons of growth to be had, of what use will an octo-core be? Or then 16? Then 32? It's been about 3 years since dual core, and not quite 3 years between dual and quad. So in 3+ years we could be looking at 16 cores. Then what? I'm not sure who's panicking at this point. There's no super real need for a 16-core monster on the horizon, and lots of room for better graphics.
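
Just to put rough numbers on that doubling idea, here's a back-of-the-envelope sketch; the base year, base core count, and doubling cadence are assumptions of mine for illustration, not anything from Intel's or AMD's roadmaps:

```python
import math

# Back-of-the-envelope: if mainstream core counts keep doubling every
# `period_years`, when might a 16-core desktop part show up?
# base_year/base_cores assume quad cores arriving around 2007 (an assumption).
def year_for_cores(target_cores, base_year=2007, base_cores=4, period_years=2.0):
    doublings = math.log2(target_cores / base_cores)
    return base_year + doublings * period_years

for period in (1.5, 2.0, 3.0):
    year = year_for_cores(16, period_years=period)
    print(f"doubling every {period} yr -> 16 cores around {year:.0f}")
# doubling every 1.5 yr -> 16 cores around 2010
# doubling every 2.0 yr -> 16 cores around 2011
# doubling every 3.0 yr -> 16 cores around 2013
```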
April 27, 2008 9:40:54 AM

Here's another way of looking at it. Let's say that somehow an uber CPU is made. You click and it's done, no waiting. That's great. But if a super uber killer graphics game comes out, then, like Crysis, it struggles, and you have to turn down the eye candy. Or, you have instant access to all apps, but they haven't done real-life graphics yet. That's closer to where we are. I for one am waiting for those uber graphics; my current CPU works fine for me. And the regular Joe? He wants the games, the eye candy. It's whoever can grab that apple that's going to make the money. In 3 years' time, I'm not so sure we will see an end to rasterisation, but we just may have a 16-core CPU.
April 27, 2008 10:07:38 AM

DarrPiggie, it's from the Crysis demo.
Taken with Fraps.
April 27, 2008 10:10:03 AM

Hey, why don't AMD/ATI do multiple GPU cores in one GPU, like their Phenom or Athlon X2? It would do better...
April 27, 2008 10:27:39 AM

Look for the next best thing, where two chips share memory, which is essentially the same thing.
April 27, 2008 12:31:50 PM

What people don't see is that if Intel actually pulled this off, it would be the greatest upset ever. Think about it: Nvidia needs Intel to allow their chips to work on their boards, and so does AMD. Intel needs Nvidia and ATI to make their boards and chipsets sell. If Intel were to cut out the middle men and totally design a GPU chipset that is built by them, think of how compatible the chip would be compared to Nvidia/ATI. Suppose Intel came out with their own SLI/Crossfire technology and built a GPU as fast as a processor chip at blazing speeds; they'd throw a major rock into the pen Nvidia and ATI have shared for the last few years. There has never been a company powerful enough to come in and be a major competitor in the GPU wars, so Intel is the logical candidate to really make a huge change. Now, will it be better? Who knows, but if they actually pulled it off, the result would be devastating. Good for us maybe, but bad for the other guys unless they continue to get better.
April 27, 2008 2:48:16 PM

Nvidia is dead and so is rasterization; there is an 80-core chip sitting in Intel's lab. All it needs is for that uber processor to be clocked as high as possible. Yeah, we're talking about ray-tracing times 80 cores = uber. Or even better in the future. We don't see the same kind of scaling with rasterization right now, only CF and SLI, so what's the point?
April 27, 2008 7:50:39 PM

Currently, even an 80-core wouldn't be much better than what CUDA and a 9800GTX can do. One thing to point out though: CUDA is here, as is the 9800GTX, while the 80-core is just sitting there in Intel's labs collecting dust. And from what I'm hearing, the G200 will be twice as fast as the 9800GTX, so maybe Intel should be at work on that 160-core.
April 27, 2008 11:06:05 PM

JAYDEEJOHN said:
What I find interesting is that the current CPUs are good enough for anything. And really, they can't be perfected a lot more.

That is where the "CPU is dead" comment comes from, but it is just poor insight on NV's part. The CPUs are good enough for anything right now, but in the future, when software and other hardware catch up and become demanding again, there will be a need for better CPUs. I think the opposite is happening: they aren't dead, they are so far ahead of the rest of the technology that new CPUs aren't being used to their full potential.

I am curious to see the magnitude of performance gain that comes out of Nehalem, given that quad cores are only just now coming onto the stage for gaming.
April 27, 2008 11:15:43 PM

True, but even so, say you have a game that's multithreaded. How are you going to see almost double scaling in games without using a GPU that's capable of rendering it that fast? Other apps, sure, but those are more tools than toys, and we all like our toys more.
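
A quick way to see why CPU-side gains alone won't come close to doubling frame rates: whatever fraction of each frame is spent waiting on the GPU doesn't speed up at all, no matter how many cores you add. A minimal Amdahl-style sketch; the frame fractions and the 2x CPU speedup are made-up numbers for illustration:

```python
# Amdahl-style estimate: only the CPU-bound part of a frame benefits from
# more cores; the GPU-bound part stays fixed. Fractions are illustrative.
def frame_speedup(cpu_fraction, cpu_speedup):
    gpu_fraction = 1.0 - cpu_fraction
    return 1.0 / (gpu_fraction + cpu_fraction / cpu_speedup)

for cpu_fraction in (0.3, 0.5, 0.7):
    s = frame_speedup(cpu_fraction, cpu_speedup=2.0)  # e.g. dual -> quad core
    print(f"{int(cpu_fraction * 100)}% CPU-bound frame: {s:.2f}x overall")
# 30% CPU-bound: 1.18x, 50%: 1.33x, 70%: 1.54x -- nowhere near 2x
```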
April 27, 2008 11:18:45 PM

That's what I'm saying here. There's much more growth, and I believe desire, on the graphics side, which usually means money as well.
April 28, 2008 1:44:11 AM

Quote:
Get back in your hole. GPUs and CPUs are chalk and cheese, incomparable in the way they work.
:pt1cable: LOL, I'd say you crawl back into your own hole when the CPU is able to do ray-tracing way better than the GPU.
:kaola:  Intel is destroying Nvidia. :bounce: 
April 28, 2008 7:57:57 AM

Raytracing does some things well, and some things are best left to rasterisation. No matter how many CPUs you throw at raytracing, there are some elements where it simply won't outdo a GPU. That being said, that doesn't mean raytracing cannot or will not be used in graphics. In some ways it's far superior to raster, though there's a lot of hope as well as potential in a raytraced solution done through GPUs using programs like CUDA, which is open sourced for growth. Time will tell.
April 28, 2008 8:26:39 AM

CUDA is open source? :o  Really, Nvidia is doing open source? Anyway, the CPU will work its way around the GPU; either way, Intel has no choice but to provide its own GPU solution. Yep, Nvidia is still in danger right now, if AMD and Intel no longer want to share the platform or license it to Nvidia.
April 28, 2008 8:47:24 AM

Larrabee will primarily be a plain old GPU, with some physics thrown in, as will future nVidia cards. Intel has bought their own game company, and may be trying to create their own engine that will use this physics/raytracing approach, primarily done through the GPU, with future apps for raytracing. If they can sell enough game devs on this, then yes, nVidia is going to have problems. It's more complicated than just doing raytracing on the CPU alone. It's going to cost Intel a lot of money, and I mean a lot, but if they stick with it, they could change the market drastically, though I'm not sure that's what the market wants, including AIBs, M$, etc. Like I said, who knows. Just don't get your hopes up for a killer GPU solution from Larrabee right away; it'll take a while first just to get a competitive product, then adding all the support from the CPU, and then getting the game devs to actually use it all.
April 28, 2008 2:14:34 PM

What's the hype about Larrabee? I am pretty sure that "thing" would be pwned even before it reaches launch. It would probably crash and burn before it reaches our dirty hands. And don't talk about overclocking that thing. Why am I suddenly feeling disgusted?
May 7, 2008 9:06:41 AM

The heat wall, ppl! Remember? We would all rather be sitting here, I'm sure, with a 10 GHz P4 than with the dual/quad cores we have now. It would do anything faster, including Crysis!
But the heat wall, the frequency wall or whatever you call it, is something Intel can't get over. Intel and AMD are in panic, ppl! What to do? Ah, we can make GPUs and PPUs. Let's do it!
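
For what it's worth, the usual back-of-the-envelope behind the "heat wall" is that dynamic power scales roughly with C * V^2 * f, and hitting higher clocks tends to need higher voltage, so power climbs much faster than frequency. A minimal sketch of that relation; the voltage and frequency figures are illustrative guesses, not real Pentium 4 numbers:

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f (relative to a 3 GHz baseline).
# Voltages/frequencies here are illustrative assumptions, not measured data.
def relative_power(freq_ghz, volts, base_freq=3.0, base_volts=1.3):
    return (volts / base_volts) ** 2 * (freq_ghz / base_freq)

for freq, volts in [(3.0, 1.3), (4.0, 1.4), (10.0, 1.8)]:
    print(f"{freq:4.1f} GHz @ {volts} V -> ~{relative_power(freq, volts):.1f}x the power")
# A hypothetical 10 GHz part at higher voltage would burn several times the
# power of the 3 GHz chips that actually shipped -- hence the wall.
```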
May 7, 2008 10:28:00 AM

Multi-die cards are coming that share memory, so no CF/SLI is needed, with its inherent losses. That is a better solution than going more multi-core on a CPU, since software has to use it. It's going to get interesting, but I want Intel to make a driver first that'll work in DX10 or 11.
May 7, 2008 12:55:23 PM

Apparently they are trying. NV is jumping all over their case, it sounds like.
May 7, 2008 1:32:30 PM

MTLance said:
:lol:  Does anyone actually know who is panicking? :bounce:  Nvidia is so dead; their chipset business is almost dead. How are they going to compete with both AMD and Intel when both have their own platforms?
What Tom's posted years ago was true: they don't want Nvidia to be part of it, and now it is happening. Woo-hoo, Nvidia can die. :whistle: 

Now now, Nvidia fanbois, in the olden days the CPU was not powerful enough to be a GPU, but now that they have so many cores it is high time to fully utilise them. AMD has got its own stuff, and so does Intel. :D 


I usually don't write inflammatory posts, but this one I'll start.

Nvidia is as dead as the G92 and as Crysis. They are all old technologies from your POV.
ATI/AMD must compete on two fronts. In many cases that means twice the resources; in other cases they can manage better, I guess.
Nvidia opened a can of whoop-ass, and I think it hasn't really started yet. A chipset with a nice IGP is not much. I think more will come.
Intel opened a can of hype. Let's see if so much HYPE doesn't become Netbursted. Everything is always fine on paper.
AMD doesn't just have its own stuff. With Larrabee, Triple SLI will use AMD CPUs. And don't forget the PhysX instructions in Nvidia CUDA code.
With Larrabee, CFX will have an AMD CPU AS WELL. I find that hilarious.

Honestly, Intel in VGAs has failed several times now, in completely different eras. History will repeat itself... again.
And don't forget the Vista Capable/Premium scandal; that was Intel's fault.

As for the original post.

That doesn't surprise me. Hype is easy to generate, but you need to manage it afterwards.
It seemed too good to be true to me: radically changing everything so fast AND keeping it backwards compatible.
Nobody will buy a CPU/GPU that DOESN'T use rasterization at this moment. Nothing runs on it.

It might be a fine product, but I think it's too soon to tell.

Edit: Typ0 F3stival!!