Report: 'Crystal Well' Haswell IGP to Have 64MB L4 Cache

The "Crystal Well" variant of Intel's upcoming Haswell lineup reportedly features a 64 MB L4 cache dedicated to graphics.

  1. Haswell-E?
  2. *Fudzilla
  3. "Fudzilla has also received reports that suggest that L1, L2, L3 and L4 memory will instead be 'shared between the CPU and GPU.'"

    L1 cache isn't even shared between cores, let alone between the CPU and GPU. I call BS.
  4. Don't the highest-end models get paired with discrete graphics anyways?
  5. 1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity. One could argue that most people purchasing a high-end desktop i7 would have a dedicated graphics card. I don't think those systems are Intel's focus.

    2. However, it makes sense to put a 64MB cache in the mobile chips for ultrabooks, where dedicated graphics isn't an option due to reduced battery capacity, heat, and motherboard space.

    64MB sounds expensive, but it may be necessary to provide enough bandwidth to keep the GPU from starving the CPU. Building it into the die at 22 nm may make it cheap enough to provide enough bang for the price.
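As a back-of-the-envelope sketch of the bandwidth argument above (an editor's illustration; the resolution, frame rate, and overdraw factor are assumed rule-of-thumb values, not figures from the thread or from Intel):

```python
# Rough estimate of GPU memory traffic for rendering alone.
# All numbers here are illustrative assumptions, not Intel specs.

def framebuffer_bandwidth_gbs(width, height, bytes_per_pixel, fps, overdraw=3):
    """Approximate color-write traffic in GB/s.

    `overdraw` is a rule-of-thumb factor for each pixel being
    written several times per frame (blending, depth, effects).
    """
    bytes_per_frame = width * height * bytes_per_pixel * overdraw
    return bytes_per_frame * fps / 1e9

# A 1080p game at 60 fps with 4-byte pixels: ~1.5 GB/s for color
# writes alone; texture reads typically add several times more,
# which is the traffic a dedicated graphics cache would absorb.
print(framebuffer_bandwidth_gbs(1920, 1080, 4, 60))
```

Even this crude estimate shows why an IGP sharing plain DDR3 with the CPU leaves little headroom, and why a fast on-package cache could relieve the contention.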
  6. People who can afford this kind of high-end CPU surely have enough money for, or already plan to buy, a dedicated GPU. It is pointless to put an integrated GPU in this kind of chip.
  7. This sounds fishy... just like that AMD processor for 800... just nonsense.
  9. I want a TARDIS, I'm tired of waiting for the next big processor or GPU. I want it all now (or yesterday).
  10. Might be good for large cpu calculations that are memory bound.
  11. I love this idea. It would allow all of those $500 notebooks with Intel HD 4000 graphics to have a great integrated solution that uses very little power instead. However, I bet Intel limits this to super i7's, and as such completely removes all usefulness of this technology... :(
  12. sundragon said:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity.

    Not if the L4 is on a separate die from the rest of the CPU.

    With the 386, 486, and Pentium, the L2 cache used to be on the motherboard. With the Pentium Pro, Pentium II, and early Pentium III, the L2 cache resided on separate chips on the CPU package.

    There is plenty of precedent for on-package/off-die cache.

    Also, a 64MB SRAM at six transistors per bit is more than 3.2 billion transistors, which would make this cache larger than the whole CPU die and would not make much sense cost-wise. Logically, this indicates the cache is DRAM-based to keep die area and cost in check. Since DRAM and high-speed logic CMOS processes do not play well together, this also points toward an off-die DRAM chip.

    So my bet is custom on-package DRAM chip.
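The transistor arithmetic in the comment above checks out; a quick sanity check, assuming the standard 6T SRAM cell and 1T1C DRAM cell (these cell assumptions are the editor's, not the poster's):

```python
# Verify the "more than 3.2 billion transistors" claim for a 64MB cache.
# Assumes a standard 6-transistor SRAM cell and a 1-transistor
# (plus capacitor) DRAM cell.

CACHE_BYTES = 64 * 1024 * 1024          # 64 MB
BITS = CACHE_BYTES * 8                  # 536,870,912 bits

sram_transistors = BITS * 6             # six transistors per SRAM bit
dram_transistors = BITS * 1             # one transistor per DRAM bit

print(f"SRAM: {sram_transistors / 1e9:.2f} billion transistors")  # 3.22
print(f"DRAM: {dram_transistors / 1e9:.2f} billion transistors")  # 0.54
```

The six-fold transistor savings is exactly why a DRAM-based (eDRAM) implementation is the plausible choice for a cache this large.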
  13. Obviously they are going to use silicon interposers or an MCM to be able to fit such a large L4 or on-package memory for the GPU.
  14. With Intel's poor graphics driver support, Haswell will be a HasBeen...
  15. There have been L4 caches used with CPUs before, although it has been off-die. IBM in particular used a lot of L4 cache in its chipsets. Intel even has made CPUs with L4 cache with the Itanium MX2 having 32 MB of on-package (but not on-die) L4 cache.
  17. Estix said:
    "Fudzilla has also received reports that suggest that L1, L2, L3 and L4 memory will instead be 'shared between the CPU and GPU.'"

    L1 cache isn't even shared between cores, let alone between the CPU and GPU. I call BS.

    I'm skeptical about L1 and L2 being shared; L3 and L4 might be, though. I think it's more likely that only L4 is shared, or that it's exclusive to graphics.

    aggroboy said:
    Don't the highest-end models get paired with discrete graphics anyways?

    Yeah, but if this gives you GT 650M performance while consuming far less power, you could actually game on the go without plugging in. Not intended to compete with the 660M and above, though, I would guess. Might be able to undercut the discrete mobile market if they're smart.

    sundragon said:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity. One could argue that most people purchasing a high-end desktop i7 would have a dedicated graphics card. I don't think those systems are Intel's focus.

    2. However, it makes sense to put a 64MB cache in the mobile chips for ultrabooks, where dedicated graphics isn't an option due to reduced battery capacity, heat, and motherboard space.

    64MB sounds expensive, but it may be necessary to provide enough bandwidth to keep the GPU from starving the CPU. Building it into the die at 22 nm may make it cheap enough to provide enough bang for the price.

    1. GT3e is for notebooks and -R series BGA parts only.

    2. Agreed, but this isn't for ultrabooks, as far as I know.

    3. That's exactly why they're doing it, because making it shared GDDR would induce latency issues for the CPU. Best of both worlds.

    Anonymous said:
    People who can afford this kind of high-end CPU surely have enough money for, or already plan to buy, a dedicated GPU. It is pointless to put an integrated GPU in this kind of chip.

    It's for notebooks and BGA desktop parts (read: AiOs) only.

    Anonymous said:
    sundragon said:
    1. Obviously 64MB of onboard cache would make it an expensive chip due to size and complexity.

    Not if the L4 is on a separate die from the rest of the CPU.

    With the 386, 486, and Pentium, the L2 cache used to be on the motherboard. With the Pentium Pro, Pentium II, and early Pentium III, the L2 cache resided on separate chips on the CPU package.

    There is plenty of precedent for on-package/off-die cache.

    Also, a 64MB SRAM at six transistors per bit is more than 3.2 billion transistors, which would make this cache larger than the whole CPU die and would not make much sense cost-wise. Logically, this indicates the cache is DRAM-based to keep die area and cost in check. Since DRAM and high-speed logic CMOS processes do not play well together, this also points toward an off-die DRAM chip.

    So my bet is custom on-package DRAM chip.

    You're pretty much dead on, check this out:
    http://www.anandtech.com/show/6892/haswell-gt3e-pictured-coming-to-desktops-rsku-notebooks
  18. Intel you retards, it's the smaller, dual cores that need this. i7's use dedicated cards anyway. In fact, to ship an i7 without anything other than a very small, weak, on-die graphics die is stupid.
  19. Imagine the performance if their drivers weren't broken beyond hope!
  20. This is cute and all... but who in the history of forever has used Intel's onboard graphics on purpose? This is a waste of time and resources.
  21. Pherule said:
    Intel you retards, it's the smaller, dual cores that need this. i7's use dedicated cards anyway. In fact, to ship an i7 without anything other than a very small, weak, on-die graphics die is stupid.

    The GT3/GT3e is only available in Haswell "-R" models which means BGA (soldered) packaging. That means mainly intended for laptops, possibly tablets, all-in-ones, SFFs, NUCs, etc. where discrete graphics may not even be an option. For all of those, everyone considering purchasing will be happy to have more performance per buck.

    Based on comments from people who have seen Intel's GT3(e?) demo, it is supposed to run most games reasonably well at up to 1080p, which would be a considerable upgrade over the HD 4000, which often struggles even at 720p.

    Great news for everyone who does not care about playing the latest heaviest games with all settings maxed out.
  22. Streetguru said:
    This is cute and all... but who in the history of forever has used Intel's onboard graphics on purpose? This is a waste of time and resources.

    You would be surprised about how many people run off Intel's IGPs. I haven't seen recent stats but Intel was already ahead of both AMD and Nvidia for graphics market share years ago even with their crappier past IGPs.

    The majority of PCs out there are not used for gaming or anything else that really requires a GPU so it is perfectly reasonable for Intel (and AMD) to provide IGP options that are just good enough for most people not to bother with discrete graphics... particularly on BGA/-R models that are most likely to end up in devices where discrete GPU may not even be an option.
  23. Just for ultrabooks, I think.