
Report: 'Crystal Well' Haswell IGP to Have 64MB L4 Cache

Source: Fudzilla | 23 comments

The "Crystal Well" variant of Intel's upcoming Haswell lineup reportedly features a 64 MB L4 cache dedicated for graphics.

According to a report published by Fudzilla, the "Crystal Well" variant of Intel's Haswell range of processors will feature a massive 64 MB cache dedicated to graphics. While L4 caches are nothing new on graphics cards, one has yet to make an appearance on a CPU, since the high transistor count of cache memory would produce a "huge chip."
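
For a sense of scale, here is a back-of-the-envelope tally of what 64 MB of cache costs in transistors. This is a rough sketch using the textbook figures of six transistors per SRAM bit cell and one transistor (plus a capacitor) per DRAM bit cell; Intel has not disclosed how Crystal Well is actually built:

    #include <stdio.h>

    int main(void) {
        /* 64 MiB expressed in bits. */
        const unsigned long long bits = 64ULL * 1024 * 1024 * 8;
        /* Assumed cell costs: 6T SRAM vs. 1T1C DRAM (textbook figures). */
        printf("SRAM: %.1f billion transistors\n", bits * 6 / 1e9);
        printf("DRAM: %.1f billion transistors\n", bits * 1 / 1e9);
        return 0;
    }

At six transistors per bit, a 64 MB SRAM works out to roughly 3.2 billion transistors, more than the rest of the CPU die itself, which is why skeptics (including commenters below) lean toward a DRAM-based and possibly off-die implementation.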

While Intel certainly has the ability to fabricate processors with die sizes large enough to accommodate a large L4 cache, the size of this cache merits a degree of skepticism. Fudzilla has also received reports that suggest that L1, L2, L3 and L4 memory will instead be "shared between the CPU and GPU."
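
For context, Intel processors describe their cache hierarchy to software through CPUID leaf 4, one subleaf per cache. The sketch below, written in C using GCC/Clang's cpuid.h helpers, walks that enumeration; whether a graphics-dedicated or shared L4 would actually be reported this way is exactly the kind of detail the report leaves open:

    #include <stdio.h>
    #include <cpuid.h>  /* GCC/Clang helpers for the CPUID instruction */

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        /* CPUID leaf 4: each subleaf describes one cache until type == 0. */
        for (unsigned int sub = 0; ; sub++) {
            if (!__get_cpuid_count(4, sub, &eax, &ebx, &ecx, &edx))
                break;                                 /* leaf unsupported */
            unsigned int type = eax & 0x1F;            /* 0 = no more caches */
            if (type == 0)
                break;
            unsigned int level = (eax >> 5) & 0x7;
            unsigned int ways  = ((ebx >> 22) & 0x3FF) + 1;
            unsigned int parts = ((ebx >> 12) & 0x3FF) + 1;
            unsigned int line  = (ebx & 0xFFF) + 1;
            unsigned int sets  = ecx + 1;
            unsigned long long size =
                (unsigned long long)ways * parts * line * sets;
            printf("L%u %s cache: %llu KB\n", level,
                   type == 1 ? "data" : type == 2 ? "instruction" : "unified",
                   size / 1024);
        }
        return 0;
    }

On a current Ivy Bridge part this reports the per-core 32 KB L1 data and instruction caches, 256 KB of L2 per core and the shared L3; a fourth level would simply appear as one more subleaf.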

From what we know, Crystal Well technology has been reserved for Intel's high-end GT3-based processors, so it is likely to appear only on the most expensive Core i7 models. Regardless of whether these reports are true, we can safely expect a large improvement in Intel's onboard graphics when Haswell launches later this year.

Comments
  • 0 Hide
    ikyung , April 14, 2013 6:21 PM
    Haswell-E?
  • 1 Hide
    leo2kp , April 14, 2013 6:26 PM
    *Fudzilla
  • 16 Hide
    Estix , April 14, 2013 6:36 PM
    "Fudzilla has also received reports that suggest that L1, L2, L3 and L4 memory will instead be 'shared between the CPU and GPU.'"

    L1 cache isn't even shared between cores, let alone between the CPU and GPU. I call BS.
  • 3 Hide
    aggroboy , April 14, 2013 6:44 PM
    Don't the highest-end models get paired with discrete graphics anyways?
  • 15 Hide
    sundragon , April 14, 2013 6:48 PM
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity. One could argue most people purchasing a high end i7 desktop chip would have dedicated graphics card. I don't think Intel's focus is for those systems.

    2. However, it makes sense to put a 64MB cache in the mobile chips for ultrabooks where the option for dedicated graphics isn't possible due to reduced battery capacity, heat, and space on the motherboard.

    64MB sounds expensive but it may be necessary to provide enough bandwidth to keep the GPU from starving the CPU. Building it into the die using 22 nm may make it cheap enough to provide enough bang for the price.
  • 1 Hide
    ttcboy , April 14, 2013 7:26 PM
    Those people who can afford to buy or plan to buy this kind of high end CPU surely have enough money or planned to buy dedicated GPU. It is pointless to put integrated GPU in this kind of chip.
  • 0 Hide
    goodguy713 , April 14, 2013 7:28 PM
    this sounds fishy .. just like that amd processor for 800 .. just nonsense
  • 7 Hide
    LordConrad , April 14, 2013 7:52 PM
    I want a TARDIS, I'm tired of waiting for the next big processor or GPU. I want it all now (or yesterday).
  • 1 Hide
    whyso , April 14, 2013 8:15 PM
    Might be good for large cpu calculations that are memory bound.
  • 1 Hide
    CaptainTom , April 14, 2013 10:20 PM
    I love this idea. It would allow all of those $500 notebooks with intel 4000 to have a great integrated solution that uses very little power instead. However, I bet intel limits this to super i7's, and as such completely removes all usefulness of this technology... :( 
  • 1 Hide
    InvalidError , April 14, 2013 10:24 PM
    Quote:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity.

    Not if the L4 is on a separate die from the rest of the CPU.

    With the 386, 486 and Pentiums, L2 cache used to be on the motherboard. With the Pentium Pro, Pentium 2 and early Pentium 3, the L2 cache used to reside on separate chips on the CPU package.

    There is plenty of precedent for on-package/off-die cache.

    Also, a 64MB SRAM is more than 3.2 billion transistors which would make this cache larger than the whole CPU die which does not make much sense cost-wise. Logically, this would indicate that the cache is DRAM-based to keep surface area and cost in check. Since DRAM and high-speed CMOS processes do not play well together, this would also point towards an off-die DRAM chip.

    So my bet is custom on-package DRAM chip.
  • 0 Hide
    utroz , April 14, 2013 10:44 PM
    Obviously they are going to use silicon interposers, or a MCM to be able to have such a large L4 or on chip memory for the gpu.
  • 0 Hide
    psupanova , April 15, 2013 2:38 AM
    Intel Graphics, with their poor driver support, Haswell will be a HasBeen...
  • 0 Hide
    MU_Engineer , April 15, 2013 5:22 AM
    There have been L4 caches used with CPUs before, although it has been off-die. IBM in particular used a lot of L4 cache in its chipsets. Intel even has made CPUs with L4 cache with the Itanium MX2 having 32 MB of on-package (but not on-die) L4 cache.
  • 0 Hide
    ojas , April 15, 2013 5:51 AM
    Quote:
    "Fudzilla has also received reports that suggest that L1, L2, L3 and L4 memory will instead be 'shared between the CPU and GPU.'"

    L1 cache isn't even shared between cores, let alone between the CPU and GPU. I call BS.

    I'm skeptical about L1, L2 being shared, L3 and L4 might, though...but i think it's more likely that only L4 is shared, or that it's exclusive to graphics.

    Quote:
    Don't the highest-end models get paired with discrete graphics anyways?

    Yeah, but if this gives you GT650m performance, while consuming far less power, you could actually game on the go, without plugging it in. Not intended to compete with 660m and above, though, i would guess. Might be able to undercut the discrete mobile market if they're smart.

    Quote:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity. One could argue most people purchasing a high end i7 desktop chip would have dedicated graphics card. I don't think Intel's focus is for those systems.

    2. However, it makes sense to put a 64MB cache in the mobile chips for ultrabooks where the option for dedicated graphics isn't possible due to reduced battery capacity, heat, and space on the motherboard.

    64MB sounds expensive but it may be necessary to provide enough bandwidth to keep the GPU from starving the CPU. Building it into the die using 22 NM may make it cheap enough to provide enough bang for the price.

    1. GT3e is for notebooks and -R series BGA parts only.

    2. Agreed, but this isn't for ultrabooks afaik.

    3. That's exactly why they're doing it, because making it shared GDDR would induce latency issues for the CPU. Best of both worlds.

    Quote:
    Those people who can afford to buy or plan to buy this kind of high end CPU surely have enough money or planned to buy dedicated GPU. It is pointless to put integrated GPU in this kind of chip.

    It's for notebooks and BGA desktop parts (read: AiOs) only.

    Quote:
    Quote:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity.

    Not if the L4 is on a separate die from the rest of the CPU.

    With the 386, 486 and Pentiums, L2 cache used to be on the motherboard. With the Pentium Pro, Pentium 2 and early Pentium 3, the L2 cache used to reside on separate chips on the CPU package.

    There is plenty of precedent for on-package/off-die cache.

    Also, a 64MB SRAM is more than 3.2 billion transistors which would make this cache larger than the whole CPU die which does not make much sense cost-wise. Logically, this would indicate that the cache is DRAM-based to keep surface area and cost in check. Since DRAM and high-speed CMOS processes do not play well together, this would also point towards an off-die DRAM chip.

    So my bet is custom on-package DRAM chip.

    You're pretty much dead on, check this out:
    http://www.anandtech.com/show/6892/haswell-gt3e-pictured-coming-to-desktops-rsku-notebooks
  • -1 Hide
    Pherule , April 15, 2013 9:07 AM
    Intel you retards, it's the smaller, dual cores that need this. i7's use dedicated cards anyway. In fact, to ship an i7 without anything other than a very small, weak, on-die graphics die is stupid.
  • -1 Hide
    j0um , April 15, 2013 5:26 PM
    Imagine the performance if their drivers weren't broken beyond hope!
  • -1 Hide
    Streetguru , April 15, 2013 8:13 PM
    This is cute and all....but who in the history of forever has used Intel's onboard graphics on purpose? This is a waste of time and resources.