DreamWorks picks Intel's Nehalem and Larrabee over AMD/ATI

dvmoo7

Distinguished
Jul 11, 2006
59
0
18,630
Does DreamWorks know something about Intel's unreleased "Larrabee" graphics tech that most of us don't? Whatever it is, they have decided to completely revamp their entire computing environment around Intel.

Jeffrey Katzenberg, DreamWorks Animation's chief executive, is quoted as saying: "They are radical game-changers for the entire field of computing."

"Our objective is to significantly heighten the movie going experience using DreamWorks Animation?s ground-breaking 3-D filmmaking tools," said Jeffrey Katzenberg, CEO of DreamWorks Animation. "Technology plays a significant role in enabling our artists to tell great stories. By utilizing Intel?s industry-leading computing products, we will create a new and innovative way for moviegoers to experience our films in 3-D."




Here are some related links.

http://www.theinquirer.net/gb/inquirer/news/2008/07/08/dreamworks-shifts-amd-intel


http://www.newstin.co.uk/sim/uk/67058376/en-010-003900960

http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=23661

 

blackwidow_rsa

Distinguished
Aug 16, 2007
846
0
18,990
Larrabee and all these on-die GPUs are just going to cause more confusion in the PC market, especially for 'Joe Sixpack'. We're going to have to weigh the advantages and disadvantages of Larrabee-type CPUs against discrete GPUs.
 

beerzombie

Distinguished
May 4, 2006
15
0
18,510
From what I've gathered, Intel has been focusing on ray tracing as the rendering method for Larrabee, while standard consumer graphics cards are focused on rasterisation and shading techniques. DreamWorks films are all ray traced, and GPUs do not ray trace "out of the box"; a CUDA/CTM/GPGPU application has to be written to ray trace on the GPU.

I'm guessing the former process was: rasterize the scene on the GPU to position everything (quick); if it looks okay, ray trace it crudely on the animator's workstation to check any tweaks (moderate); then, once approved, ray trace at full resolution on the supercomputer (CPUs).
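
For anyone wondering what ray tracing actually boils down to: the core is per-pixel intersection math, branchy scalar code that ordinary x86 handles happily, which is why it has traditionally lived on the CPU. A toy sketch of the idea (a single ray-sphere test rendered to ASCII; purely an illustration, nothing to do with DreamWorks' actual renderer):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance along the ray to the nearest hit, or -1.0 on a miss.
static double hitSphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    if (disc < 0.0) return -1.0;  // ray misses the sphere entirely
    return (-b - std::sqrt(disc)) / (2.0 * a);
}

int main() {
    const int width = 48, height = 24;
    const Vec3 eye = {0.0, 0.0, 0.0};
    const Vec3 center = {0.0, 0.0, -3.0};  // sphere 3 units in front of the camera
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // One primary ray per pixel; a film renderer layers bounces,
            // shadows, and shading on top of this same basic test.
            Vec3 dir = {(x - width / 2) / double(height),
                        (y - height / 2) / double(height), -1.0};
            std::putchar(hitSphere(eye, dir, center, 1.0) > 0.0 ? '#' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```

Rasterization inverts that loop (walk the triangles, not the pixels), which is what GPU hardware is built around; that mismatch is why ray tracing on a GPU needs a purpose-written CUDA/CTM program.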
 


Um, what? Larrabee is a discrete GPU that will run on PCIe. Nehalem will have a GPU, probably based on Larrabee, on the package. But seriously... wherever you got that info, it is way off.



Yes... yes it was... Next, LucasArts will go for a Nehalem & Larrabee setup, and thunderman will be crushed.
 

Warsaw

Distinguished
Jul 8, 2008
251
4
18,795


ATI's new 4800 series does support ray tracing, and from the rumors I've seen at various sites, it's supposedly pretty good. And yes, ATI's cards DO support ray tracing out of the box.

Now, even if it is only DX9-level ray tracing, I feel that ATI would be able to come up with a better ray tracing solution than Intel's.
 


You must not have read up on this. Intel bought a ray tracing company about 1-2 years ago. It will also have Havok to use. So Larrabee will probably have the better ray tracing solution. Don't ever doubt Intel's R&D; their budget for it could easily equal AMD's, ATI's, and nVidia's all together.
 

yomamafor1

Distinguished
Jun 17, 2007
2,462
1
19,790
Waiting for the....

 


We could have asked the same question when they went AMD. Or heck, let's ask how much Apple pays companies to use Macs for picture/movie design.

Oh wait. Could it be that Larrabee and Nehalem offer what they need? Heck, it's possible.

Thing is, we have no idea what Larrabee is capable of. Maybe its ray tracing engine will exceed what an ATI/nVidia GPU can do, considering that Intel has a ray tracing department and is obviously trying to make sure that Larrabee is all set.



LMAO. That's just funny.

Come to think of it... maybe if they had this airing on all the TV stations in areas that get hurricanes and such, people wouldn't stay.
 

kassler

Distinguished
May 25, 2008
257
0
18,780

Larrabee does not exist yet, and I don't know a company that will spend money on something that doesn't exist.
 

NMDante

Distinguished
Oct 5, 2002
1,588
0
19,780


Intel's code-named Nehalem processor for high-end workstations will have up to eight processor cores, while its Larrabee server processor will have between 10 and 100, said Intel spokesman Nick Knupffer. Those are the two chips DreamWorks has agreed to buy.

You know one now: DreamWorks. Of course, DreamWorks can probably pull out of the Larrabee deal if it turns out not to perform the way it expects, I assume.

Although they did say Larrabee was a server processor, which is something it has never been called or described as before. Hmmm?
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
Does it actually matter how they make movies? It's not like you need to switch computer hardware to watch a movie. Besides, those special-effect scenes are pre-rendered, and depend more on the CPU than the GPU. This doesn't mean Larrabee is more powerful than the traditional discrete GPUs we have now, only that moviemakers don't need nearly as much real-time graphics performance as running games like Crysis. CPU performance is what matters to them.
 

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980



Exactly. No one has any idea what Larrabee is capable of. All it is at the moment is PowerPoint performance projections. Grow some objectivity and quit licking Intel's arse.



Fact: Intel are looking to hit the workstation/server market with Nehalem - particularly the 4P/8P end.

Fact: Intel are looking to hit the GPU market through ray tracing and rasterisation (but mainly ray tracing apparently) with Larrabee.


Having a high-publicity company like DreamWorks as a user will do wonders to build the reputation of both Nehalem and Larrabee in their respective markets, regardless of their performance.

I half expect to see a wee Intel logo in the bottom corner of the DreamWorks splash next time I am in the cinema watching one of their films.
 
Sorry, Amiga. I'm not licking any arse, but your question was just stupid. Just because a company switches what they use doesn't mean all of a sudden they are being paid to.

Of course, we cannot truly say whether Larrabee does or does not exist yet, as we are not on the inside at Intel. They could have prototypes they are testing, and could be showing companies prospective performance from those.

Nehalem, on the other hand, does exist, and maybe what Nehalem is showing hits right where they need it for faster production, which would save them money. As we all know, in that industry time is money.

BTW, I am being very objective about Larrabee. I just want something new to shake up the graphics market so that nVidia and ATI start to work harder. Plus, if Larrabee does make them sweat, it means lower prices on better hardware for us.
 


I would guess that, as opposed to saving time, ray tracing DWA's 3D content would take longer but provide much higher detail and realism.

I just can't figure out if DWA is expecting everyone to wear those goofy 3D glasses ... :ouch:
 

harna

Distinguished
Jan 2, 2008
282
0
18,790
You will probably find that, if DreamWorks is struggling for cash, Intel has "chipped in" for a good slice of the publicity. That doesn't mean the AMD hardware can't cut it.
 
Heh... as I said: some company chooses another company, and suddenly they are being paid or helped out.

Ahhh, I love it.

I wouldn't mind the 3D glasses. Freakin' awesomeness. I wouldn't mind having a pair to use while I drive. I wonder what that would do.
 

NMDante

Distinguished
Oct 5, 2002
1,588
0
19,780
I didn't know Intel sold HP systems, since DreamWorks is replacing the AMD HP systems with Intel HP systems.

The computer animation studio behind Shrek, Madagascar, and this summer's kid-friendly hit Kung Fu Panda is replacing its fleet of 1,500 Hewlett-Packard machines powered by AMD chips with Intel-backed HP systems.
Intel and AMD win and lose customers all the time. It's rarely up to them, since computer manufacturers, not chipmakers, are the ones vying for major corporate accounts.
Hmmm... so, it's HP that paid DreamWorks? Twice?

http://www.fool.com/investing/general/2008/07/08/intel-shreks-amd.aspx
 

beerzombie

Distinguished
May 4, 2006
15
0
18,510


That's what Larrabee is addressing: letting the GPU handle the work that was previously done on the CPU. As I already said, Crysis et al. are rasterized graphics (the GPU's forte), while ray tracing has always been the CPU's forte.

ATI (and nVidia's) Ray Tracing are not nearly as mature as x86 methods, who knows what kind of specialized rendering software dreamworks is using. It might be way too much of an effort to convert this to the approaches used to program on a traditional GPU (memory access alignment, etc). It sounds like Larrabee is x86 based, so they may not need to port their software, and the savings in development could easily outweigh the hardware cost difference between Larrabee v traditional GPUs.