Larrabee too proprietary?

August 2, 2009 7:17:44 PM

OK, I just have to put this up here in the GPU forum, as it may concern all of us.
http://www.xtremesystems.org/forum [...] ost3939099
"All very accurate statements, well putted. Now, if you look at the "breaking points" of the industry of video game:
1) wolfenstein id Software game was very much going down to the metal on the CPU.
2) Later on, Unreal went down to the metal on GPU
3) Max payne, for its time was going down to the metal for CPU and GPU
4) Ridge Race PC V1 did too!

If you are a video game programmer, and you want to create a legendary legacy, it always comes down to good ideas, and going down to the metal, and beat your competitors with better performing software.

Today's video game industry is a little always from this, differenciation between programmers is very difficult, because you have to go through a pipeline like everybody else: DirectX ...

What intel is going to propose next year is to revive the creativity on PC, With Larrabee, you ll be free to have your own rasterization/rendering pipeline the way you want it, you have 100% controle over the pipe, no more limitations imposed by your 3D pipeline.
I expect the demomakers to show the way, they seem very excited by Larrabee. Massively Parrallel computing is exciting only if you have good caches to avoid massive latencies, and fast branching for single threaded performance, to be able to customize EACH pixel.
It will take time , but you can compare DX shaders to a Java layer ... It is time to free the imagination of the programmers."
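(To make the "100% control over the pipe" idea concrete, here is a minimal, purely hypothetical C++ sketch; the Framebuffer struct and the shade() rule are made up for illustration, nothing here is Intel's code. The point is that with a software pipeline, the per-pixel step is just ordinary CPU code you can rewrite however you like, rather than a stage fixed by a 3D API.)

#include <cstdint>
#include <cstdio>
#include <vector>

// A bare-bones software framebuffer: just a width, a height, and the pixels.
struct Framebuffer {
    int width;
    int height;
    std::vector<std::uint32_t> pixels;
};

// The "pixel shader" here is any function you care to write; nothing about
// its inputs, outputs, or math is dictated by an API.
std::uint32_t shade(int x, int y) {
    std::uint32_t r = static_cast<std::uint32_t>(x & 0xFF);
    std::uint32_t g = static_cast<std::uint32_t>(y & 0xFF);
    std::uint32_t b = static_cast<std::uint32_t>((x ^ y) & 0xFF);
    return (r << 16) | (g << 8) | b;  // pack as 0x00RRGGBB
}

int main() {
    Framebuffer fb{256, 256, std::vector<std::uint32_t>(256 * 256)};

    // The whole "pipeline" is this loop: visit every pixel and run whatever
    // code you want on it. Swap shade() out per game, per scene, per pixel.
    for (int y = 0; y < fb.height; ++y)
        for (int x = 0; x < fb.width; ++x)
            fb.pixels[y * fb.width + x] = shade(x, y);

    std::printf("top-left pixel: 0x%06X\n", static_cast<unsigned>(fb.pixels[0]));
    return 0;
}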

Now, everyone agrees that CUDA can be, or already is, seen as proprietary, same as PhysX. Great for the green team, meh for the red. How about games overall? Or possibly the lack, or greater lack, of games?
With Intel's influence, they could easily lure the next blockbuster game into using a non-DX path, leaving owners of nVidia and ATI hardware with no access, or with a poorly rendered proprietary path through software on their own hardware.
Thoughts?

August 2, 2009 7:37:08 PM

http://www.xtremesystems.org/forums/showthread.php?t=231029#post3939099
Fixed Link

I am half and half on this.
While every tool that helps developers more easily make better games is great, I wonder how long it would take before we have another 3Dfx Glide situation on our hands.
That is to say, we would have a tiered visuals setup.

As an example, everyone who has a Larrabee gets extremely lifelike graphics in a certain game, while everyone else who plays the same game can only achieve, say, half the detail (any of you oldies remember the original Unreal?).
While it is great that some people are getting great visuals, it is limited to one particular proprietary hardware set.
If anything happens to the manufacturer of this proprietary hardware (again, take 3Dfx as an example with Glide...), you are out of luck ever seeing those beautiful graphics again.
In this sense, DirectX is great because it is available to everyone and graphics can be reproduced across many manufacturers.

On the other hand, anything that can push visuals to the next level (and possibly influence future standards/designs) is a great thing!
Innovation is a great thing in this industry, even when it is a bad idea.
Just keeping designers/programmers thinking about better ways of doing things can, potentially, lead to great advancements.
Even if it is a terrible idea now, who is to say that later down the line it will not have a massive impact on future GPU designs, DX, or GPGPU standards?
August 2, 2009 7:59:55 PM

Hmm... interesting. However, I would think it would be in Intel's best interest to make this something open, possibly licensing it out (kind of like x86). I doubt many would bend to Intel on this without a fight. After all, we can speak with our wallets...
August 2, 2009 8:16:53 PM

This won't take off, at least not soon, for the same reason C# isn't the standard language for games: There are massive C++ libraries devs already use, and they use DirectX.

What Intel would NEED in order to rebel against DirectX would be an entire engine, with incredible capabilities, cheap to use, easy to use (well documented), and probably a separate dev to at least make a game to showcase it (even if Intel made the engine and funded the dev).

*edit* The answer is Yes.
August 2, 2009 9:43:21 PM

OK, firstly, unless Intel can find no takers, this isn't an if, this is a when.
Personally, I'm not taking a fanboi approach to this, because I care way too much about PC gaming. If this is used in consoles, will anyone still care about PC gaming? Given the actions we've seen lately from nVidia, where adoption of the newest DX standards isn't kept up with, and people defend that, does this mean DX is that susceptible?
If DX11 and the rumors I've been digging hard at are all true, and game devs get the same encouragement we see in this Intel approach but using the full new DX standards, there won't be any improvement Intel and its direction can offer over DX. The coming cards and the wider abilities of DX11 should match the abilities of LRB, if they're used.
I just hope people don't bandwagon for less when there's a very strong potential for more, and that goes both ways: LRB or DX path.
When I hear that DX10.1 isn't important, or that DX11 is all hype and no substance, it gives me the impression that either people don't want any higher eye-candy development, or they're being fanbois. It's already starting; this isn't the first I've heard of it. This is Intel's approach, almost as if they know LRB won't be used much in an open way, but will force change in its own proprietary way.
LRB as a DX renderer may only be so-so, but the way it's being brought to the devs isn't in a DX fashion. As time passes, those who read a lot will stumble across Intel's attempts at pushing this new direction at the devs. I don't have all the links where I've already seen it, but they're there, and it's early on.
I'm of the mind that next-gen cards are going to be monsters. Some have said to slow down, but it makes sense: both nVidia and ATI stand to lose 50% of their discrete market to a much larger, much more influential competitor, so to me, this IS the time to give it your best shot.
August 2, 2009 10:13:53 PM

Can't; I'm out of breath from those long sentences.
Anyway, from the quote:

"Massively Parrallel computing is exciting only if you have good caches to avoid massive latencies"

First of all, this is sheer propaganda. It's x86 groupthink versus everything else. x86 needs cache, a lot of it, because it simply isn't fast enough to hide latencies the way GPUs do.
I'm not an expert, but this is common in GPUs. It's a compromise, sure: it saves cycles by "hiding" the latency, but you also end up with more raw latency than you would have with a cache.

He doesn't, however, tell the fair people at XS that all this "new pipeline ownership" will be done through SW rendering, and there's the tradeoff that goes against the CPU/x86.
That's everything done in SW, there and back, so yes, the latencies are quite large.
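(For anyone wondering what "hiding" latency actually looks like from the software side, here's a toy C++ sketch. It isn't Larrabee code; __builtin_prefetch is a GCC/Clang builtin standing in for the GPU trick of switching to another thread while a fetch is outstanding. The idea is simply that you overlap memory requests for upcoming pixels with work on the current one, instead of hoping a big cache already has the data.)

#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t kPixels = 1u << 20;
    std::vector<std::uint32_t> texture(kPixels, 0x00FF00FFu);  // fake texel data
    std::vector<std::uint32_t> frame(kPixels, 0u);

    const std::size_t kAhead = 64;  // how far ahead to request texels we'll need soon
    for (std::size_t i = 0; i < kPixels; ++i) {
        // Start the fetch for a future pixel now, then keep shading the
        // current one while that memory request is still in flight.
        if (i + kAhead < kPixels)
            __builtin_prefetch(&texture[i + kAhead]);

        std::uint32_t texel = texture[i];
        frame[i] = texel ^ 0x00FFFFFFu;  // stand-in for real per-pixel shading work
    }

    std::printf("first shaded pixel: 0x%08X\n", static_cast<unsigned>(frame[0]));
    return 0;
}

Take that overlap away and the same loop stalls on every texel that misses the cache, which is the cost the quote glosses over once rasterization and everything else run in software too.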

Quote:
"What intel is going to propose next year is to revive the creativity on PC, "

As I've already said, this isn't next year; this is happening now. Wouldn't it sound better to say that they're doing it now? Or is it better for the "wow" factor, to look like they just whipped this up overnight? Remember the delays. LRB should have been out close to now, but I guess it's one thing that's not on that tick-tock.

Maybe that's a good thing. Maybe it means the devs are as slow to move and take a chance on LRB, despite all of Intel's influence, as they are to make a fully compliant mix of DX11 games, or DX10 for that matter.


PS: Here's your pair of graphs.
August 3, 2009 1:28:20 AM

Still have to get past the issue of building entirely NEW libraries.
August 3, 2009 1:36:40 AM

Ok... correct me if I'm wrong, but wasn't the original PS3 supposed to ship with no graphics card, leaving the devs to use the Cell processor any way they wanted, to "get creative" and all that? Then, when the devs didn't want to go through the trouble, they threw an Nvidia card in at the last minute.

Now, I realize Intel has much more influence than Sony... but to make this kind of transition, games will either have to be platform-exclusive (DX11 or LRB) or developers will have to port to both. The third option is that LRB emulates DX11, but if that's the only way it functions, I'd say LRB loses its appeal for improved performance. I just have a really hard time thinking this is going to be any different from the PS3; developers aren't going to want to make games that run on different rendering engines, and development costs are just too high.

We see this happen time and again when a technology isn't available across the board. Is DX10.1 superior to DX10? Yes. Do most games use it? No. Is PhysX an extremely powerful game computation engine? Yes. Are developers rushing to make games PhysX-reliant? Not so quickly. Doing so would mean spending precious development money making a product only half the customer base can use.
August 3, 2009 2:00:33 AM

Companies will not use their imagination unless there's good money in it. If the majority have LRB, then sure, why not, but most of us won't for quite some time, so DX11 will live on. With the current economy, a move like this will not happen. It's possible devs will support both DX and LRB if it doesn't cost them much or if more profit can be made.
August 3, 2009 2:11:14 AM

1ce said:
Ok... correct me if I'm wrong, but wasn't the original PS3 supposed to ship with no graphics card, leaving the devs to use the Cell processor any way they wanted, to "get creative" and all that? Then, when the devs didn't want to go through the trouble, they threw an Nvidia card in at the last minute. [...]


So, I take it you think LRB is either a wannabe nVidia in the wrong club, or it's going to be a killer renderer. I'd add that it'll emulate DX; not just 10 or 10.1 but all of it, as their libraries are filled out and drivers are written.

While what you say about Cell is mostly true, Intel has much more clout than Sony could ever hope for, especially in these markets. They've worked with game devs just recently regarding their IGP, as I posted in the CPU forum a few months back, and before that as well, and also with M$, and they have holdings in a game dev group too, etc., etc.

But where the comparison really falls apart is this: yes, Sony did make the Cell, but what's its customer base versus x86? Who compiles for Cell versus x86? Etc., etc.

I'd have to say, where Sony could make noise, Intel can conduct an orchestra. Can the Cell do both? No. LRB can, and more, and it has many other uses as well. So I'd have to say it poses a much greater threat to PC gaming as we know it than a console that was never meant for the PC and couldn't do it if it wanted to.

My bet is, Sony will be influenced by LRB for the PS4 and drop Cell.