
Intel Teases New Larrabee Details

Source: Tom's Hardware US | 11 comments

Santa Clara (CA) - Siggraph is just around the corner, so it should not be too surprising that Intel is talking more seriously about Larrabee, a discrete graphics product due for launch in "2009 or 2010". Intel decided to provide a few more slices of information that are likely to fuel a new round of rumors on the Internet.

Intel’s presentation to analysts and journalists covered several interesting details, including the design philosophy and high-level technology approach of Larrabee. However, our two most burning questions were left unanswered, or at best only partially addressed, leaving wide room for speculation: How many cores will Larrabee have, and how will those cores compare to discrete graphics offerings from Nvidia and AMD/ATI? We don’t know for sure, but we received some hints.

According to Intel, the idea of Larrabee was born out of a desire to combine CPU programmability with GPU-style parallelism. While Intel promises that Larrabee, which will be based on a many-core x86 design, will provide "full support of current graphics APIs", the company said it will also offer developers a clean canvas to develop new APIs for new features. The hope is that game developers will take advantage of x86 coding to come up with unique features that cannot run on GPUs.

Intel has developed a 1024-bit-wide bidirectional ring network (512 bits in each direction) for Larrabee, enabling agents to communicate with each other at low latency and resulting in what the company describes as "super fast communication between cores".
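Intel has not disclosed Larrabee's clock speed, so the ring's actual bandwidth is unknown. As a rough sketch, the quoted widths translate into bytes per clock as below; the 2 GHz figure is a purely hypothetical placeholder, not an Intel number.

```python
# Ring bus width as quoted by Intel: 512 bits per direction, bidirectional.
BITS_PER_DIRECTION = 512
DIRECTIONS = 2

# 64 bytes per direction per clock, 128 bytes aggregate.
bytes_per_clock = (BITS_PER_DIRECTION // 8) * DIRECTIONS

# Hypothetical clock chosen only for illustration (Intel has announced none).
clock_hz = 2.0e9
peak_bandwidth_gb_s = bytes_per_clock * clock_hz / 1e9

print(f"{bytes_per_clock} bytes per clock across both directions")
print(f"~{peak_bandwidth_gb_s:.0f} GB/s aggregate at a hypothetical 2.0 GHz")
```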

As previously reported, Larrabee's cores (each one a full x86 core) are based on a modernized dual-issue Pentium design with a short execution pipeline. The design is enhanced with a vector processing unit (VPU; 16 32-bit ops per clock), four-way multi-threading with separate register sets per thread, 64-bit extensions and sophisticated pre-fetching.
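The 16 single-precision ops per clock per core lets one bound peak arithmetic throughput for any given core count. A minimal sketch, again assuming a hypothetical 2 GHz clock that Intel has not confirmed:

```python
# Per Intel: each Larrabee core's VPU executes 16 32-bit ops per clock.
OPS_PER_CLOCK_PER_CORE = 16

def peak_gflops(cores: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS for a given core count and clock."""
    return cores * OPS_PER_CLOCK_PER_CORE * clock_ghz

# Hypothetical 2 GHz clock applied to the rumored 8- to 48-core range.
for cores in (8, 16, 32, 48):
    print(f"{cores:2d} cores -> {peak_gflops(cores, 2.0):6.0f} GFLOPS peak SP")
```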

So, how many cores will this many-core product have? Intel says this is still a secret. However, the presentation charts, which we were not allowed to publish, discuss Larrabee examples with 8 to 48 cores. These numbers are in line with the rumors we have heard so far, and it would not surprise us if an 8-core chip were in fact the entry-level product of this "2009 or 2010" part. Intel has often said that Larrabee is "highly scalable," so 48 cores should be possible. Factor in the four hardware threads per core, and the products discussed could handle 32 to 192 threads simultaneously.
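The 32-to-192-thread range is simple arithmetic from the four hardware threads each core supports:

```python
THREADS_PER_CORE = 4  # four hardware threads per core, per Intel's description

def simultaneous_threads(cores: int) -> int:
    """Hardware threads in flight for a given Larrabee core count."""
    return cores * THREADS_PER_CORE

assert simultaneous_threads(8) == 32    # low end of the chart examples
assert simultaneous_threads(48) == 192  # high end of the chart examples
```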

Larrabee's performance is a "secret" as well, as is the answer to how many Larrabee cores Intel will need to match Nvidia's or AMD's GPUs. But we would hope that Intel would not debut a product as important to the company as Larrabee with performance significantly inferior to what is available on the market at the time of launch.

Scalability may become one of Larrabee's biggest assets. Intel claims that Larrabee cores scale almost linearly in games such as Gears of War, F.E.A.R. and Half-Life 2: Episode Two: 16 cores will provide twice the performance of 8 cores, 24 cores three times the speed, 32 cores four times, and so on. "Almost linearly" translates to "linearly within 10%", Intel said.
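The "linearly within 10%" claim can be expressed as a simple predicate. The frame rates below are hypothetical examples of our own, not Intel's data:

```python
def within_linear(base_cores: int, base_score: float,
                  cores: int, score: float, tol: float = 0.10) -> bool:
    """Check the 'linear within 10%' claim: the measured speedup must be
    within `tol` (relative) of the ideal cores/base_cores ratio."""
    ideal = cores / base_cores
    actual = score / base_score
    return abs(actual - ideal) <= tol * ideal

# Hypothetical frame rates, scaled from an 8-core baseline:
print(within_linear(8, 30.0, 16, 58.0))  # 1.93x vs ideal 2x: within 10%
print(within_linear(8, 30.0, 32, 90.0))  # 3.0x vs ideal 4x: outside 10%
```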

It is interesting to note that Intel said Larrabee will "fully support IEEE standards for single and double precision floating-point arithmetic." AMD's and Nvidia's GPGPUs support double-precision processing as well, but typically suffer dramatic performance hits in double-precision applications. For example, Nvidia told us that the firm's latest Tesla cards can theoretically hit 900 GFLOPS to 1 TFLOPS in single precision but only about 100 GFLOPS in double precision. Intel did not say how Larrabee's performance is affected in double-precision environments.
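Taking Nvidia's own Tesla figures at face value, the implied double-precision penalty works out to roughly 9-10x:

```python
# Theoretical peaks Nvidia quoted for its Tesla cards.
tesla_sp_gflops = 1000.0  # upper end of the 900 GFLOPS - 1 TFLOPS range
tesla_dp_gflops = 100.0   # "just about 100 GFLOPS" in double precision

penalty = tesla_sp_gflops / tesla_dp_gflops
print(f"Implied double-precision penalty: ~{penalty:.0f}x")  # ~10x at the top
```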

This thread is closed for comments
  • jaragon13, August 4, 2008 12:29 PM
    Oh, the future looks bright.
  • jimmysmitty, August 4, 2008 1:07 PM
    Holy jebus... they doubled ATI's highest bus... 1024-bit... pair that with GDDR5 and oohhh... the bandwidth...
  • martel80, August 4, 2008 1:56 PM
    Isn't the idea behind this CPU/GPU somewhat similar to the Cell processor and its vector engines?
  • Anonymous, August 4, 2008 2:16 PM
    Backwards compatibility with current NVDA/ATI gaming solutions... I'm S.O.L.D.
  • 3Ball, August 4, 2008 2:33 PM
    Two please!
  • Pei-chen, August 4, 2008 2:55 PM
    As long as it will play Sims 3 fluently, it'll sell like hot cakes.
  • scooterlibby, August 4, 2008 3:14 PM
    Quoting jimmysmitty: "Holy jebus... they doubled ATI's highest bus... 1024-bit... pair that with GDDR5 and oohhh... the bandwidth..."
    I think you're referring to Nvidia. ATI is still using a 256-bit bus, albeit with GDDR5. The GTX 280 has a 512-bit bus.
  • Anonymous, August 4, 2008 3:22 PM
    I am wondering what kind of power supply will be needed to support something like this.
  • bloodymaze, August 4, 2008 4:26 PM
    Quoting kikasphalt: "I am wondering what kind of power supply will be needed to support something like this."
    10,000 W minimum PSU *lol*
    So all in all it WILL be able to run Far Cry at 60 fps, but you'll need a 15,000 W PSU to do it, hahaha.
  • bloodymaze, August 4, 2008 4:36 PM
    I mean Crysis lol, not Far Cry!
  • jimmysmitty, September 29, 2009 3:20 AM
    Quoting Anonymous: "I am wondering what kind of power supply will be needed to support something like this."
    Actually no. ATI had a 512-bit bus before Nvidia. The HD 2900 series ran a 512-bit ring bus interface, but due to higher power requirements and heat dissipation, ATI dropped it for a 256-bit bus paired with GDDR5 (HD 4800 series and up).
    Nvidia is planning a 512-bit bus for the G300, nearly three years after ATI did it.