Larrabee and Ray-Tracing

fraggelrock

I found this article about Intel's new Larrabee chip:

http://www.tomshardware.com/2008/02/...ions_larrabee/

It sounds like Intel has put some serious money into this thing, but more importantly, it seems like Intel is trying to take over the graphics market owned by nVidia and ATI!!

Could this be true?

The game developers seem to think this chip will be great for 'Ray-Tracing', which is used in Quake 4.

What exactly is Ray-Tracing, and does anyone really think Larrabee will make for on-board graphics that will be better than nVidia or ATI PCI-E cards?

I also read in another forum that the Larrabee chip will be used on graphics cards and would be 4 to 6x faster than anything nVidia or ATI will put out.

Is this real??
 
A small correction: Quake 4 uses OpenGL for its API, with the rendering technique being rasterization. A fellow by the name of Daniel Pohl was the one who modded a bunch of OpenGL games to use ray-tracing. Here is a link to an article he wrote:

http://www.pcper.com/article.php?aid=334&type=expert

He goes into a little detail about what ray-tracing is compared to rasterization.
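To make the difference concrete, here's a toy sketch in C++ of the core ray-tracing idea. This is just the textbook loop, not Pohl's actual code and nothing Larrabee-specific: fire one ray per pixel from the eye, test it against the scene (here a single hard-coded sphere), and shade whatever it hits.

```cpp
// A toy ray tracer: one primary ray per pixel, one hard-coded sphere,
// simple diffuse shading, output as an ASCII PPM image on stdout.
// Build with e.g.:  g++ -O2 raytoy.cpp -o raytoy && ./raytoy > out.ppm
#include <cmath>
#include <cstdio>

struct Vec {
    double x, y, z;
    Vec operator-(const Vec& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(const Vec& o) const { return x * o.x + y * o.y + z * o.z; }
    Vec normalized() const {
        double len = std::sqrt(dot(*this));
        return {x / len, y / len, z / len};
    }
};

// Ray-sphere intersection: solve |orig + t*dir - center|^2 = radius^2
// for the nearest t > 0 (dir must be unit length).
bool hitSphere(const Vec& orig, const Vec& dir,
               const Vec& center, double radius, double& t) {
    Vec oc = orig - center;
    double b = oc.dot(dir);
    double c = oc.dot(oc) - radius * radius;
    double disc = b * b - c;
    if (disc < 0) return false;        // ray misses the sphere entirely
    t = -b - std::sqrt(disc);          // nearer of the two roots
    return t > 0;                      // only count hits in front of the eye
}

int main() {
    const int W = 400, H = 300;
    const double aspect = double(W) / H;
    const Vec sphereCenter{0, 0, -3};
    const double sphereRadius = 1.0;
    const Vec lightDir = Vec{1, 1, 1}.normalized();

    std::printf("P3\n%d %d\n255\n", W, H);     // PPM header
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // The inversion vs. rasterization: we loop over PIXELS and ask
            // "what does this ray hit?", instead of looping over TRIANGLES
            // and asking "which pixels does this triangle cover?".
            double u = (2.0 * x / W - 1.0) * aspect;
            double v = 1.0 - 2.0 * y / H;
            Vec dir = Vec{u, v, -1.5}.normalized();  // primary ray, eye at origin
            double t;
            int shade = 30;                          // dark grey background
            if (hitSphere({0, 0, 0}, dir, sphereCenter, sphereRadius, t)) {
                Vec hit = dir * t;                   // hit point (eye is the origin)
                Vec n = (hit - sphereCenter).normalized();
                double diffuse = n.dot(lightDir);    // simple Lambert shading
                shade = int(255 * (diffuse > 0 ? diffuse : 0));
            }
            std::printf("%d %d %d ", shade, shade, shade);
        }
        std::printf("\n");
    }
    return 0;
}
```

The key design difference is in that inner loop: a ray tracer loops over pixels and asks what each ray hits, while a rasterizer loops over triangles and asks which pixels each triangle covers. Shadows and reflections fall out naturally in a ray tracer by firing more rays from the hit point, which is also exactly why it gets expensive.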

As for ray-tracing usurping the throne from rasterization, it's not going to happen anytime soon. Why? As his article explains, the CPU power needed isn't available at this time. At the moment the dedicated hardware we use for rasterization is much faster than ray-tracing. I don't know a lot about Larrabee itself; I assume it will be dedicated hardware to offload some/most/all of the ray-tracing work from the CPU.
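To put rough numbers on that: at 1024x768 there are 786,432 pixels, so a bare minimum of one ray per pixel at 60 frames per second already works out to about 47 million ray-scene intersection tests every second, and that's before any of the extra shadow or reflection rays that make ray-tracing attractive in the first place.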

I believe Daniel Pohl is now working for Intel on this very thing. When we see it, assuming the performance issues can be overcome, I wouldn't expect it to seriously outperform rasterization. What you can expect is a more accurately rendered 3D image, all else (especially resolution) being equal. Shadows and reflections should also look better.

Anyway, I hope that has satisfied your curiosity.