
To Larrabee or not to Larrabee: that is the question

Last response: in Graphics & Displays
September 4, 2008 5:14:02 AM

Interesting: this powerhouse that'll revolutionize gfx as we know it http://www.theinquirer.net/gb/inquirer/news/2008/09/03/...


September 4, 2008 6:07:45 AM

Lol the inq...
September 4, 2008 6:52:20 AM

Yeah, gotta love the Inq; you never know what may come outta there
September 4, 2008 8:07:48 AM


Don't understand why they would think this, even allowing that it's the Inq saying it.
Isn't Larrabee a monstrous entity of a processing beast that will simply compute whatever you send its way, graphics or data?

Mactronix
September 4, 2008 11:26:43 AM

That article is pure ****. Xbox "720" out before 2010? PS4 out before that? Yeah, right. Can't believe I even clicked on that link.
September 4, 2008 11:35:01 AM

I thought MS hadn't decided the name for the next-gen Xbox yet. Fake!
September 4, 2008 12:05:05 PM

spathotan said:
Can't believe I even clicked on that link.

You got Rick rolled!
September 4, 2008 1:17:29 PM

The Inq is just so much fun to read. I'm just worried that one day they might actually announce something true, but nobody will believe them.

IN THE INQUIRER WE TRUST!!!
September 4, 2008 2:27:10 PM

INQ=FUDZILLA!LOLOLOLOL
September 4, 2008 2:59:30 PM

Well, it makes perfect sense. Besides, here's the description of the news: "Rumour mill has Intel courting Microsoft". Don't know what all of this INQ bashing is for.

Quote:
That article is pure ****. Xbox "720" out before 2010? PS4 out before that? Yeah, right. Can't believe I even clicked on that link.


Of course it's BS: you read what you wanted to read. "Intel also appears to be heding its bets by wooing Sony by going after the Playstation 4, which won’t be out until at least 2011. The Xbox720 will be out in the market a fair while before that." Have you noticed the "at least 2011" for the PS4? Basically, they only said the Xbox 720 will probably be out before the PS4, nothing more. The problem is they used "that" in the sentence, which gives you a point, but I think it was just a small mistake.

Anyway, that's such a killer deal [if it happens].

September 4, 2008 10:41:11 PM

This thread could only have been worse if someone actually chucked a turd at my screen. Same result, just different cleaning options.
September 4, 2008 10:56:16 PM

dattimr said:
Well, it makes perfect sense. Besides, here's the description of the news: "Rumour mill has Intel courting Microsoft". Don't know what all of this INQ bashing is for.

Quote:
That article is pure ****. Xbox "720" out before 2010? PS4 out before that? Yeah, right. Can't believe I even clicked on that link.


Of course it's BS: you read what you wanted to read. "Intel also appears to be heding its bets by wooing Sony by going after the Playstation 4, which won’t be out until at least 2011. The Xbox720 will be out in the market a fair while before that." Have you noticed the "at least 2011" for the PS4? Basically, they only said the Xbox 720 will probably be out before the PS4, nothing more. The problem is they used "that" in the sentence, which gives you a point, but I think it was just a small mistake.

Anyway, that's such a killer deal [if it happens].


So because I typoed my own sentence, that means I "read what I wanted to read"? Give me a break.
September 5, 2008 1:22:33 AM

Well, look at the PS3 right now: if you have superb hardware but a crappy API for the programmers, you won't get far by default. Cell is a fine thing alright, no one can deny it, but programming for it sounds like a b!7ch.

Intel (according to what I read) wants to prevent that by offering those "big console players" the benefit of going with "sugar daddy" Intel, watching their investment and teaching the programmers. Kinda like imposing, but it's not imposing at all, lol.

So... if M$ and/or Sony don't want Larrabee, Intel has no option but to hold a nice funeral for it. All IMO, of course.

Esop!
September 5, 2008 1:44:09 AM

I've said three things about Larrabee all along: drivers are going to be tough, M$ doesn't give a rip for lil Intel in their world, and it's going to be costly, very costly. Time will tell.
September 5, 2008 3:18:41 PM

^ Well, if they don't give a rip for lil Intel, who do they give a rip for? AMD?

Wintel. The word says it all. They *have* to care - and they will.
September 5, 2008 3:25:03 PM

spathotan said:
So because I typoed my own sentence, that means I "read what I wanted to read"? Give me a break.


Well, actually, "read" was in the past tense, if that was sarcasm. If not, their "at least" is what makes your reply invalid, not the 2010/11 typo. Anyway, there's no reason whatsoever to bash an article that clearly states, right from the start, that it is nothing but a rumour. Besides, they haven't given a specific date, but rather a possibility of one console being released before the other. My intention wasn't to bash you, but to point out that if something is being discussed as a rumour then there's no use in going wild and calling it BS. Those guys have it right most of the time, actually.
September 5, 2008 4:30:27 PM

It's true Intel has been courting M$ for this. Whether it's going to happen is something else. What's running in the current 360?
September 5, 2008 4:52:56 PM

LOL at the idiots slating the Inquirer for this.

Intel NEEDS a console deal to get developers developing specifically for Larrabee.

They are not going to beat ATI or Nvidia on their own turf, so Intel needs to move the battlefield. They are going to do that by getting into a console, forcing ATI/Nvidia to patch their drivers to work with Larrabee-developed games.
September 5, 2008 4:53:58 PM

JAYDEEJOHN said:
It's true Intel has been courting M$ for this. Whether it's going to happen is something else. What's running in the current 360?


A variation of ATI R500 I think...
September 5, 2008 4:58:33 PM

That's currently the problem with Larrabee: it just won't simply plug in to current games. M$ has to consider the entire gaming industry, AMD, nVidia, etc. This isn't just an "Intel will get what they want as always" decision; it's much more than that. If Intel wants to come at this in a different way, they certainly have the resources for it, not to buy their way in, but to earn their way.
September 5, 2008 5:22:04 PM

JAYDEEJOHN said:
That's currently the problem with Larrabee: it just won't simply plug in to current games. M$ has to consider the entire gaming industry, AMD, nVidia, etc. This isn't just an "Intel will get what they want as always" decision; it's much more than that. If Intel wants to come at this in a different way, they certainly have the resources for it, not to buy their way in, but to earn their way.


In the console arena, it's not like that. Intel can actually "buy" its way into that market.

Each console gen has a different API to work with, since it has a different hardware set, and console OSes are not made like regular OSes. Development there works more tightly with the hardware to optimize things. At least, to some degree.

Anyway, if Intel can't get it into the PC industry, I don't give a damn, actually. Moreover, I don't know if Larrabee is going to be a good or a bad thing! lol

Esop!
September 5, 2008 5:29:22 PM

Maybe you are missing something, Yuka! The article said PC games "derive" from console APIs, and the PC market is Intel's main target. So if it makes it into a console, then obviously it will get into the PC market (Intel's main aim).

So it's like:
Intel wins => console bid (mass market), and console game = PC game! Intel double WIN!
September 5, 2008 5:41:55 PM

It still has to be done in a way that's already known vs. a way that isn't. More to do, more to adapt, etc. M$ has to include all the game devs in their decision. This isn't business as usual, so to speak. The PS3, I've heard, is a b14+ch to work with, and so too could this be.
September 5, 2008 5:46:45 PM

I really want to know what Sony can put in that "beast in a BOX" thingy! LOL
September 5, 2008 5:50:12 PM

If it's as easy as Intel claims, and the money's there, it'll happen. Otherwise, it may get tougher.
September 5, 2008 5:56:40 PM

I'm sure if Intel and MS join on this "project" (if it happens), then nothing is impossible!
September 5, 2008 5:58:34 PM

LOL, true that
September 5, 2008 6:05:34 PM

If I remember right, Intel and MS once wanted to join together, but the antimonopoly people stopped them from doing it, didn't they?
September 5, 2008 6:12:52 PM

iluvgillgill said:
Maybe you are missing something, Yuka! The article said PC games "derive" from console APIs, and the PC market is Intel's main target. So if it makes it into a console, then obviously it will get into the PC market (Intel's main aim).

So it's like:
Intel wins => console bid (mass market), and console game = PC game! Intel double WIN!


True, since it works both ways: PC -> console and console -> PC, but yeah, you're right though. Porting hardware is as "easy" as writing a driver for the OS it will run on, and porting software is as easy as having a compiler that can produce code for either hardware.

Hope Intel doesn't screw us up just for the sake of money >_>'

Esop!
September 5, 2008 6:15:40 PM

Yuka said:
Hope Intel doesn't screw us up just for the sake of money >_>'

*ahem* Money!!! That's what BUSINESS is about. They are not a charity giving you the best (for nothing!) :pt1cable: 
September 5, 2008 6:19:17 PM

That's my concern
September 5, 2008 6:20:43 PM

But from the logical side of things, Larrabee is quite powerful going by the theory Intel put out. It's more of a driver kind of thing to make it powerful.
September 5, 2008 6:20:52 PM

It'd be like having to include Apple in all Windows environments: not an option, but a need.
September 5, 2008 7:56:31 PM


Sorry, is it me? Am I missing something here? I just can't get my head around why people think Larrabee needs a console deal in order to get devs to program games it can run. As far as I am aware it can run OpenGL or DX anyway, so what's stopping it being a dedicated GPU in its own right? I know that's not its sole function, but again, I thought that's how it was going to be marketed at the start.
I really don't think Intel needs any help from MS or anyone else for that matter. Yes, there is a graphics chip slot up for grabs in these new consoles, but quite frankly I would think them a bunch of (fill in own expletive) if they didn't pitch for it.
And just because they are pitching for it, people are putting 2+2 together and all of a sudden Intel is desperate to find a use for Larrabee.

Mactronix
September 5, 2008 8:05:00 PM

Your thinking is too simple, my friend. It's a completely different architecture: it no longer uses the vertex and pixel shaders of the current gen. Developers will need to start from scratch to make the game/driver/OS/GPU communicate with each other. It's a high-risk thing to do. We are talking about business here, not just a simple "because I want it" sort of thing; money is the whole deal! If Larrabee fails, who's going to pay for all the losses? YOU?!?!
September 5, 2008 9:01:50 PM

iluvgillgill said:
Your thinking is too simple, my friend. It's a completely different architecture: it no longer uses the vertex and pixel shaders of the current gen. Developers will need to start from scratch to make the game/driver/OS/GPU communicate with each other. It's a high-risk thing to do. We are talking about business here, not just a simple "because I want it" sort of thing; money is the whole deal! If Larrabee fails, who's going to pay for all the losses? YOU?!?!


In simpler terms, Larrabee is going to need KERNEL support from M$ to run in Windows if it doesn't follow the conventional way of doing things as a GPU-CPU combo. It may or may not decode x86 instructions at all and/or OpenGL/DX-specific code (which will need a compiler "upgrade"). I wonder if they'll keep the current way of doing things, which I doubt.

So building an API for it might be an easy task or, if M$ doesn't give a damn, Intel is going to be in trouble IMO. And if there's no API/support for Larrabee, then it's as good as a fried CPU/GPU.

Esop!

EDIT:
Here's a Wikipedia article on what's going to be Larrabee: http://en.wikipedia.org/wiki/Larrabee_(GPU)
September 5, 2008 9:19:48 PM

Intel's plan was to take out AMD and Nvidia.
September 5, 2008 9:24:55 PM


@ iluvgillgill,
I think you need to go do more research on how it's actually meant to work. There will be vertex and pixel shaders.
Your whole reasoning doesn't add up, or I'm misunderstanding your meaning. Do you really believe Intel just decided to make a GPU-type product, then decided to make it work differently from any other just because, and then started trying to persuade someone to programme for it? Do me a favour.

@ Yuka,
It won't need its own KERNEL or API; it will run OpenGL or DX. Quote: "Game developers aren't big on learning new tricks however, especially on an unproven, unreleased hardware platform such as Larrabee. Larrabee must run DirectX/OpenGL code out of the box, and to do this Intel has written its own Larrabee native software renderer to interface between DX/OGL and the Larrabee hardware."


Mactronix
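The "native software renderer" in that quote is basically a translation layer: fixed-API draw calls get dispatched to ordinary software routines running on general-purpose cores. Here is a toy sketch of that idea; every name in it is made up for illustration and has nothing to do with Intel's actual code:

```python
# Toy sketch of an API translation layer: "DX/OGL-style" calls are
# dispatched to plain software routines, loosely mirroring how Larrabee's
# renderer was described as mapping DX/OGL onto its x86 cores.
# All names here are hypothetical.

class SoftwareRenderer:
    def __init__(self):
        self.framebuffer = []  # records submitted primitives
        self.dispatch = {      # "API call" name -> native routine
            "DrawTriangle": self._raster_triangle,
            "Clear": self._clear,
        }

    def submit(self, call, *args):
        """Entry point playing the role of the DX/OGL interface."""
        return self.dispatch[call](*args)

    def _raster_triangle(self, verts):
        # A real renderer would rasterize pixels; we just record the call.
        self.framebuffer.append(("tri", verts))

    def _clear(self):
        self.framebuffer.clear()

r = SoftwareRenderer()
r.submit("DrawTriangle", [(0, 0), (1, 0), (0, 1)])
print(len(r.framebuffer))  # 1
```

The point is only that nothing in the API forces dedicated shader hardware: the same calls can land on general-purpose code, at the cost of the interpreter overhead discussed elsewhere in the thread.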
September 5, 2008 9:34:19 PM

mactronix said:
It won't need its own KERNEL or API; it will run OpenGL or DX. Quote: "Game developers aren't big on learning new tricks however, especially on an unproven, unreleased hardware platform such as Larrabee. Larrabee must run DirectX/OpenGL code out of the box, and to do this Intel has written its own Larrabee native software renderer to interface between DX/OGL and the Larrabee hardware."


Mactronix


Yep, I just finished reading the article from Wikipedia. Should've started from there, lol.

Now, this bit you mention, "Intel has written its own Larrabee native software renderer to interface between DX/OGL and the Larrabee hardware", could be understood as a "driver" for Windows and such. I'm sure there will be code for the kernel to actually USE all those processors (kind of a patch for it, I think), just like AMD did when its X2 came out.

It's an interesting chip indeed, since it *really* makes a "brute force" GPU out of a CPU, lol.

Esop!
September 5, 2008 9:44:05 PM

How do the cores of Larrabee compare to, say, Atom? I'm trying to figure out power usage. Currently, a single Atom is 4 watts at, I believe, under 2 GHz. These will run at 2 GHz, and have at least 24 cores (32 for the high end?) on them. If it's, say, 6 watts per core, that's 192 watts, which is very high. Also, I wonder how much of a slowdown will occur rendering in such a manner? It won't be as slow as, say, trying to render DX10 on DX9, but it will add to latency problems, won't it?
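That back-of-envelope math is easy to check. Note the 6 W/core figure is the poster's guess, not a published Intel number, so the totals are pure speculation:

```python
# Rough TDP estimate from the guess above: total = cores * watts per core.
# The 6 W/core figure is speculative, not an Intel-published number.
watts_per_core = 6
for cores in (16, 24, 32):
    print(f"{cores} cores -> {cores * watts_per_core} W")
# 16 cores -> 96 W
# 24 cores -> 144 W
# 32 cores -> 192 W
```

Under the same guess, a 16-core part would sit at 96 W, exactly half the 32-core figure.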
September 5, 2008 9:48:56 PM

JAYDEEJOHN said:
How do the cores of Larrabee http://en.wikipedia.org/wiki/Larrabee compare to, say, Atom? I'm trying to figure out power usage. Currently, a single Atom is 4 watts at, I believe, under 2 GHz. These will run at 2 GHz, and have at least 24 cores (32 for the high end?) on them. If it's, say, 6 watts per core, that's 192 watts, which is very high. Also, I wonder how much of a slowdown will occur rendering in such a manner? It won't be as slow as, say, trying to render DX10 on DX9, but it will add to latency problems, won't it?


Remember it's a 2-in-1 "solution": GPU+CPU. So 192W for that theoretical "power" is very good indeed.

Now, thinking about thermal dissipation, yeah... it's gonna be kinda hot XD

And if you have an interpreter software layer, it won't matter much. You can always turn one instruction into a set of other instructions, but yeah, it will add "lag" to the process.

Esop!

EDIT: Grammar XD
September 5, 2008 10:05:54 PM

So no FurMark for Larrabee, eh? Heheh. If Intel procures Larrabee for the 720 or whatever it's called, it really won't matter so much then. And if Intel is supposedly upping the price to do so, the costs of Larrabee go higher, and there won't necessarily be a profit at all from it, at least regarding the "720" deal. Not sure how this helps Intel in any way then, actually. Putting it this way: did the Cell help? In that frame of reference, anyway.
September 6, 2008 12:53:11 AM

JAYDEEJOHN said:
So no FurMark for Larrabee, eh? Heheh. If Intel procures Larrabee for the 720 or whatever it's called, it really won't matter so much then. And if Intel is supposedly upping the price to do so, the costs of Larrabee go higher, and there won't necessarily be a profit at all from it, at least regarding the "720" deal. Not sure how this helps Intel in any way then, actually. Putting it this way: did the Cell help? In that frame of reference, anyway.


Actually... yes it will :p 

That's why they might be aiming for the Xbox: they have an interpreter, but if they win the Xbox deal, they'll have a full working API for M$ which might be included in DX11 (or DX11.1, or whatever it's gonna be). Hence it will be "easy" for M$ to port it to Windows whatever-edition, so games run on that thing using Larrabee's optimizations.

It's a complex scenario, but I'd bet on M$ getting seduced, since Larrabee won't change the way PCs as a whole work all that much. Larrabee is only a merge of two things into one. Kind of a "software renderer", but with OpenGL/DX instructions fully supported, lol.

Hardware-wise, I think the north bridge might disappear with Larrabee, or it will be gimped IMO, but that shouldn't matter to this thread :p 

Esop!
September 6, 2008 1:06:48 AM

I agree that what we may see in the "720" may look good. Actually, what's in there now looks better than the best GPU of that era, mainly because of optimisations. So it could end up being a win-win for Intel and thus make for an easier transition, like you said. Time will tell. I know some Intel homeys were excited about this BEFORE this article came out, so there's some truth to it.
September 6, 2008 3:11:00 AM

While I do agree that it would be a major coup for Intel to land Larrabee within a console for the next generation, I really find it to be little more than idle speculation and rumors.

To be honest, I don't think Larrabee has a place in consoles. It's GPUs done with the NetBust philosophy, of providing high amounts of performance by simply saying "to hell" with concerns over TDP. It's easily programmable, but as a result is not terribly efficient for real-world single-precision stuff at all, on either a per-million-transistor or per-watt basis, even compared to CURRENT GPUs like RV770.

Now, in the future, things might change a BIT with Larrabee 2. However, I doubt they'd change the philosophy; it'd be just like going from Willamette to Northwood. This simply won't do in a console...

Now, a lot of people may think that it could work, since huge, heat-spewing behemoths that cost more to produce than they sell for exist now, but there's the trick: they don't compete well. Historically, just three consoles have sold for less than they cost to produce, and coincidentally all three were big and hot: the original Xbox, the Xbox 360 at 90nm, and the PlayStation 3. Not one of them made a profit early on; the 360 didn't start turning a profit until AFTER the switch to 65nm chips a while back, which had the dual advantage of slashing chip costs roughly in half and drastically cutting down the infamous RROD cases.

Meanwhile, in this generation, you have the cheap, lower-powered Wii kicking the respective tushes of both its competitors. Personally, I'm of the opinion that Sony and Microsoft are going to take a note of this, and tone down a bit on the power; over the past couple generations, we saw a big push toward bigger chips that consumed more power, but coincidentally, it never yielded more profits.

Hence, I don't think that in this market, Larrabee has a place. Sure, the Playstation 4 and the Xbox 720 could be as big leaps over the PS3 and 360 as those were over the PS2 and Xbox, but then, it could be readily predicted that they'll struggle just as much against Nintendo's next console as these are doing right now against the Wii.

So, I think that Sony and Microsoft will tone back their aggressiveness; of course the new consoles will be more powerful, duh. However, I have the impression that they won't be as expensive or heat-producing, in the interests of being more competitive. As a result, the visual leap from the current, 7th gen and the next gen won't be as large as it was from the current to the 6th before it.

And of course, I've been calling it the Xbox 720 all along, running with the joke that it, for once, will be able to actually render everything at 720p, rather than relying on 576-640p for a lot of its top-shelf titles, (Grand Theft Auto IV and Halo 3 are examples of those two resolutions I mentioned) and using bilinear filtering to scale them up to 720p for output. :p 
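The upscaling joked about above is plain bilinear interpolation: each output pixel is a weighted average of its nearest low-res neighbours. A minimal one-dimensional sketch of the idea (real console scalers do this in 2-D, per colour channel):

```python
# Minimal bilinear-style upscale in one dimension: each output sample is a
# weighted average of the two nearest input samples. A 2-D scaler applies
# the same idea along both axes.

def upscale_1d(samples, out_len):
    n = len(samples)
    out = []
    for i in range(out_len):
        # map the output index back into input coordinate space
        x = i * (n - 1) / (out_len - 1)
        lo = int(x)
        hi = min(lo + 1, n - 1)
        frac = x - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# stretch a 640-sample scanline to 720 samples
line = list(range(640))
print(len(upscale_1d(line, 720)))  # 720
```

Stretching a 640-wide scanline to 720 this way is computationally cheap, which is exactly why sub-720p rendering plus upscaling was attractive; the catch is that it blurs detail rather than adding it.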
September 6, 2008 6:58:02 AM

Larrabee is a little overkill for a console. The GPUs in consoles are extremely weak; the Nvidia RSX in the PS3 is on par with a 7800 or two 6800 Ultras, not exactly bragging rights. What gets them by is the superior CPUs (IBM's 3.2 GHz tri-core in the 360, and Toshiba's/IBM's Cell processor in the PS3) and, of course, the mastercrafted optimizations that make consoles run so well and get better with age.

You are talking about a theoretical 192W GPU here, in a console? The whole Xbox 360 console only consumes 175W MAX on the Falcon (newer) chips, and ~215W on the old original boxes.

The wattage is just the beginning of this potential nightmare; then you have to worry about heat. The original 360 CPU got so hot that they now use a self-contained waterblock for the Falcon chip. So imagine a stronger CPU/GPU two years from now in that box, at 192W? You're looking at a ~400W console giving off enough heat to make a waterblock and copper worthless. At that point technology is going backwards instead of forwards, as the console is gonna have to be bigger than your 1200W receiver and louder than your blender, due to the four 120mm fans on the quad radiator it's gonna need for that waterblock. And let's not get started with the PS3 using roughly twice as much power as the Xbox 360; the PS3 needs 380W.

Of course this is leaning on the Inquirer :D 
September 6, 2008 7:08:35 AM

This is true, what you're saying, but at 16 cores the power draw is halved, getting closer to reality, if such a 16-core Larrabee could function as well as a traditional GPU at the same TDP.
September 6, 2008 7:10:55 AM

I could not see a full-scale Larrabee in a console if all the theoretical numbers are at least in the ballpark, but I could see a... revised version of it, scaled down. Maybe half of the PC version, somewhere along those lines. If Intel really can pull this off and Larrabee kills Nvidia "overnight" with a monster card, putting it in a console would also make the mythical "end of PC gaming" come to reality, as there would be no reason to buy a PC anymore. We already know next-gen consoles are going to have fully working net browsers and more media/storage options and utility, not to mention the superior hardware optimizing that makes drivers for graphics cards look like crap. Intel would score big with Larrabee, but kill their CPU business at the same time in the long run.

Of course, at this point in the conversation we are beyond theories and rumors; we are walking on absurdity and pipe dreams now :D 
September 6, 2008 7:32:14 AM

Good points all. I'm reading the Anand article now, as I only touched on it originally. I'm seeing some very optimistic possibilities, but Anand himself has always had great love for Intel, so I take these with a grain of salt. Add in the potential disasters (drivers, cache coherency, SW emulation vs. latency, power requirements, drivers, the ring bus leaning towards costly, oh, and did I mention drivers?) and I don't think it's going to be such a walk in the park.