
Sandy/Ivy Bridge vs. Llano & Bulldozer Architecture

Does anyone have any idea, or would like to speculate, on which one will be better for the hardcore gamer? I've heard that Sandy Bridge's on-die GPU will have a tough time competing with discrete GPUs, whereas Bulldozer won't, and also that Sandy Bridge will support DX10 whereas AMD will support DX11.

I know information is very limited on both of these processors right now, but would anyone like to take a guess at which will be better for a desktop PC gamer?
  1. I dunno man, I think AMD is really gonna try to make a run to get out of the whole "budget processor" area. But I have to agree with you; I think Intel is gonna wipe the floor with this generation of processors as well, especially if Intel gets that 22nm shrink out by the end of 2011 while Bulldozer is 28nm and not even out by then.

    I'm an Intel guy fyi, but I really am curious about AMD's Fusion architecture and how it's going to perform.
  2. Quote:
    One thing for sure Intel will continue to kick amd's ass.


    What reliable source do you base your comment on?

    For competing chips in similar price ranges, your comment isn't currently true in all conditions and environments. (So we already know your comment is incorrect and merely speculation.)
  3. Best answer
    mboyer87 said:
    Does anyone have any idea, or would like to speculate, on which one will be better for the hardcore gamer? I've heard that Sandy Bridge's on-die GPU will have a tough time competing with discrete GPUs, whereas Bulldozer won't, and also that Sandy Bridge will support DX10 whereas AMD will support DX11.

    I know information is very limited on both of these processors right now, but would anyone like to take a guess at which will be better for a desktop PC gamer?


    According to the Anandtech preview, Sandy Bridge has around 20% IPC improvement over Westmere on average. It also has an aggressive turbo - up to 5 speed bins on the parts without the GPU. According to the Inquirer article I linked to in another thread:

    Quote:
    The new Socket LGA1155, common to both mainstream desktop Sandy Bridge Core i3/i5/i7 chips and the entry level Xeon Sandy Bridge parts, will support CPUs in two different configurations for PCIe I/O: the 16-lane desktop part, and the 20-lane server and workstation part, all of course at PCIe v2 speed.

    Otherwise, the desktop and enterprise parts are identical, including up to four cores and 8MB cache, and speeds up to 3.4GHz for the parts with GPU turned on, and 3.5GHz for the parts with the disabled GPU. The fine grained Turbo capability gives them another up to 400MHz headroom when all cores are used, with appropriate power and thermal solutions. These sockets, by now well known to the community, are the only ones with the built-in GPU.

    Then we come to the Socket LGA1356, a direct replacement for the current Socket LGA1366. The parts here are 6-core and 8-core Sandy Bridge single-socket and dual-socket capable but midrange positioned Sandy Bridge Xeon - and, ultimately, Core i7 - parts with up to 20MB of L3 cache, three DDR3-1600 memory channels just like the existing LGA1366 Westmeres with one memory speed grade higher, and 24 PCIe v3 lanes on-chip. The single external QPI v2 link runs at up to 8 gigatransfers/sec, or 32GB/sec bidirectional bandwidth, a 25 per cent speed up over the current generation, but also feeding a third more cores on each socket.

    The highest speed 8-core CPUs with up to 150W TDP should, however, be reserved for the high-end Socket LGA2011. With more power and ground lines to support 40 PCIe v3 lanes and four DDR3-1600 memory channels per socket, as well as dual QPI 8 gigatransfers/sec links, the 8-core, 20MB L3 cache Sandy Bridge-based Xeons should have sufficient system bandwidth to feed even the highest workloads. Not to mention enough PCIe bandwidth for two dual-GPU cards with extra lanes still free for a, say, 5GB/sec PCIe high-speed SSD or Infiniband interconnect.

    And, when you add the same resource on the second CPU, it becomes possible to fully feed an 8 GPU system out of a single two processor workstation. And yes, you could even do a quad-socket monster here, if you're using the EX parts, I assume.


    If the above is true, then my guess is that either the LGA1356 or 2011 parts will be excellent for gaming.
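
    To see why those quoted numbers hang together, here's a quick back-of-the-envelope check in Python. The per-transfer data width for QPI (2 bytes per direction) and the PCIe v3 signaling rate and encoding (8 GT/s, 128b/130b) are my assumptions for the sketch, not figures from the article itself:

    ```python
    # Sanity-check the bandwidth figures quoted above.
    # Assumed (not from the article): QPI carries 2 bytes of data per
    # transfer per direction; PCIe v3 runs 8 GT/s per lane with 128b/130b coding.

    # QPI v2 link at 8 gigatransfers/sec
    qpi_gt_per_s = 8
    qpi_bytes_per_transfer = 2                          # 16 data bits each way
    qpi_one_way = qpi_gt_per_s * qpi_bytes_per_transfer # GB/s, one direction
    qpi_bidirectional = 2 * qpi_one_way                 # GB/s, both directions
    print(qpi_bidirectional)                            # 32 GB/s, as quoted

    # PCIe v3 per-lane bandwidth: 8 GT/s with 128b/130b encoding,
    # divided by 8 bits/byte -> roughly 0.985 GB/s per lane per direction
    pcie3_lane_gb_s = 8 * (128 / 130) / 8

    # LGA2011: 40 lanes; two dual-GPU cards at x16 each leave 8 lanes free
    free_lanes = 40 - 2 * 16
    free_bandwidth = free_lanes * pcie3_lane_gb_s       # about 7.9 GB/s
    print(free_bandwidth > 5)                           # True: room for a 5 GB/s SSD
    ```

    So the 32GB/sec QPI figure and the "5GB/sec SSD on the leftover lanes" claim are at least internally consistent.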

    As for BD, I don't think there are any previews out on it yet, so until there are, all we have are simulations and speculation. Ditto for Ivy Bridge, which is just a shrink of Sandy Bridge.

    The Llano demo by AMD earlier in the month showed off its APU ability (where, IIRC, the GPU is used for floating-point work and other tasks), but the gaming benchmarks were not any better than the Sandy Bridge GPU, which is puzzling because it is supposed to be a much more capable GPU. My guess is preliminary silicon and preliminary drivers. IMO, by using Stars cores as the CPU, AMD is trading off CPU performance for a stronger GPU, which is probably not interesting to hard-core gamers anyway.
  4. Best answer selected by mboyer87.
  5. Slightly off-topic, but does anyone know roughly what month the first Bulldozer CPUs will be released in?
  6. Phoenixlight said:
    Slightly off-topic, but does anyone know roughly what month the first Bulldozer CPUs will be released in?


    Was doing some searching around the net for this as well; everything I read said optimistically Q1 2011, but more than likely Q2.
  7. This topic has been closed by Mousemonkey