
Intel Reveals More About Sandy Bridge Core CPUs

September 14, 2010 4:34:14 PM

Are they doing away with heatspreaders this generation?
Score: -3
September 14, 2010 4:37:39 PM

Very interesting concept, putting the CPU and GPU together, though if they ever hope to catch up with AMD/Nvidia graphics performance... they still have a loooooong way to go. AMD has ATI at its disposal and already has a solid advantage. Intel still relies on its CPU for the most taxing tasks.
Score: 6
September 14, 2010 4:39:37 PM

Wake me up when integrated GPUs on CPUs do 60FPS in Crysis with max details at 1080 res.
Score: -17
September 14, 2010 4:55:43 PM

What is the future of PCIe graphics cards? :o
Score: -8
September 14, 2010 4:57:07 PM

So slide 17 says they're keeping the integrated PCI-e from P55. Does this mean that we'll once again have a 16 total PCIe Lane limit or was this issue resolved?

Also, did Intel decouple the PCIe controller from BCLK? One of the biggest issues with Lynnfield was the stupid (and pointless) coupling of the 2 which limited stock v overclocking greatly.
Score: 4
September 14, 2010 5:01:54 PM

zipzoomflyhigh: Seems like they just tweaked Nehalem and added a GPU. It's sad the s1155 won't be backwards compatible with s1156. At least with AMD's new AM3+ socket you will still be able to drop in your old AM3 CPU.

No you won't.
http://www.xbitlabs.com/news/cpu/display/20100826225852...
"Apparently, it was possible for AMD to make Bulldozer microprocessors compatible with existing AM3 infrastructure, but in order to do that, the company would have to sacrifice certain important features of the new core."
So they won't be throwing us old AM3 or AM2+ users a bone.
Score: -2
September 14, 2010 5:05:27 PM

Is this the follow on to Intel's Larrabee project?
Score: 0
September 14, 2010 5:08:37 PM

Get ready to shell out a thousand bucks for one of those. Intel will be Intel :|
Score: 12
September 14, 2010 5:12:46 PM

Sandy Bridge will have USB 3.0, PCIe 3.0 and SATA 6 Gb/s... correct?
Score: 0
September 14, 2010 5:19:45 PM

The Intel X58 Tylersburg based platform will likely be replaced by X68 Waimea Bay which includes a Sandy Bridge-E CPU and a Patsburg PCH.

Waimea Bay / Patsburg = high-end
Cougar Point = low-end

http://vr-zone.com/articles/a-look-into-intel-s-next-ge...

http://www.semiaccurate.com/2010/08/12/intels-patsburg-...

Waimea Bay Highend Desktop (HEDT) Overview
Earlier on, we had unveiled chipset specifications for Intel Mainstream (MS) desktop in 2011, but at that point in time little was known of what would replace the Tylersburg Enthusiast Platform, consisting of the Intel X58 Express chipset + i7-980X "Gulftown" CPU. With LGA1155 Sandy Bridge-MS samples already shipping, it even looked like we'd have the Tylersburg Enthusiast Platform sticking with us for a very long time.

Before you join the uninformed in concluding that there won't be an enthusiast platform on the Sandy Bridge microarchitecture in 2011, we'd like to show you some exclusive knowledge about Intel's next-generation HEDT.

Intel will stick to using differentiated desktop platforms for the purpose of market segmentation in 2011. The LGA1155 Sandy Bridge-MS we've heard about over the past months will join the Cougar Point family of chipsets to corner the Mainstream and Essential market segments. For the highend segment, Waimea Bay will take the lead by joining Sandy Bridge-E processors to the Patsburg Platform Controller Hub (PCH). Because Sandy Bridge on LGA1155 will only be available in 2 core and 4 core versions, those looking forward to having 6 cores or more will have to put their money where it counts: on the Waimea Bay platform.

Intel also has intentions to market their next generation "Taylorsville" 6Gb/s Solid State Drives (SSDs) as a complement to the Waimea Bay HEDT platform.
Score: -3
September 14, 2010 5:34:00 PM

Waimea Bay = Sandy Bridge-E CPU and Patsburg PCH = high-end
So, I guess it's the Sandy Bridge-E CPU that a typical Tom's reader would get excited about - but I don't see any information about it. Does anyone have any info on Sandy Bridge-E?
Score: 0
September 14, 2010 5:35:51 PM

This is info regarding Cougar Point... CPUs and chips for my grandmother's PC - correct?
Score: -3
September 14, 2010 5:42:22 PM

banthracis: So slide 17 says they're keeping the integrated PCI-e from P55. Does this mean that we'll once again have a 16 total PCIe Lane limit or was this issue resolved? Also, did Intel decouple the PCIe controller from BCLK? One of the biggest issues with Lynnfield was the stupid (and pointless) coupling of the 2 which limited stock v overclocking greatly.


16 lanes from the chip, same as P55. However, the chipset PCIe is now real PCIe 2.0, rather than PCIe 2.0 hobbled to PCIe 1.1 data rates.

As for decoupling, I think they actually went further. We have no idea how the non-K SKUs are going to overclock until we get retail samples in reviewers' hands.

truerock: Sandy Bridge will have USB 3.0, PCIe 3.0 and SATA 6 Gb/s... correct?


PCIe 2.0, mixed 6 Gb/s and 3 Gb/s ports. As for USB 3, last I heard no, but Wiki says otherwise. Take that with a grain of salt.
Score: 0
September 14, 2010 5:43:01 PM

Wow, I thought AMD would make the "right decision" and finally change or upgrade sockets. I'm grateful that they let us upgrade from AM2 all the way to AM3. But can we get something that's a BIG step up? I've been waiting for Bulldozer forever. I'm saving for this build. I WANT my Bulldozer with: AM3+, quad-channel memory, 990FX chipset, AMD HD 7800 graphics, USB 3 and SATA 3 on my south bridge. I've saved so much money with AMD since the K5/K6 days. I want to beat the Intel fanboys like in the Athlon XP days. Anyone with me? I want Bulldozer.... I want Bulldozer.... I want Bulldozer.... I want Bulldozer.... If they don't do Bulldozer like that, I'm switching to Intel, and I've never owned an Intel!!!!!!
Score: -7
September 14, 2010 5:50:35 PM

I like that the camera catches the mock-up look of the processor.
Score: 0
September 14, 2010 6:01:22 PM

LORD_ORION: Wake me up when integrated GPUs on CPUs do 60FPS in Crysis with max details at 1080 res.


I'd guess at 7 to 8 years, by which time neither you nor anybody else will give a damn about Crysis.
Score: 12
September 14, 2010 6:09:29 PM

Yes, I've read this article over again... this is definitely Grandma-level tech for low-end PCs. I'm still thinking Intel will not have USB 3.0, PCIe 3.0 and SATA III until the Patsburg Platform Controller Hub is matched with some type of Sandy Bridge-E CPU. Looking like 2011 is going to be the Intel year of PCs for slow people.
Score: -1
September 14, 2010 6:15:49 PM

I'm thinking anything spent on Intel tech in 2011 is going to be almost instantly obsolete. Hopefully 2012 will be a year of tech revolution. Who knows - maybe CPUs with clockspeeds significantly faster than my 7 year old 3.6 GHz Pentium 4.
Score: -9
September 14, 2010 6:19:52 PM

As cool as it is, I'd rather not have an IGP and have the focus entirely on the CPU. That's why I buy graphics cards, for graphics.
Score: 18
September 14, 2010 6:45:12 PM

Enzo Matrix: No you won't. http://www.xbitlabs.com/news/cpu/d [...] s_AMD.html "Apparently, it was possible for AMD to make Bulldozer microprocessors compatible with existing AM3 infrastructure, but in order to do that, the company would have to sacrifice certain important features of the new core." So they won't be throwing us old AM3 or AM2+ users a bone.

If you read carefully, you should know that Bulldozer is AM3-compatible, but it doesn't support it fully. That means Bulldozer performance will be crippled using AM3.
Score: 2
September 14, 2010 6:45:52 PM

truerock: I'm thinking anything spent on Intel tech in 2011 is going to be almost instantly obsolete. Hopefully 2012 will be a year of tech revolution. Who knows - maybe CPUs with clockspeeds significantly faster than my 7 year old 3.6 GHz Pentium 4.


Clock speed has zero correlation to performance across different architectures.

I guarantee you an i7-930 underclocked to 2 GHz will beat a 4.0 GHz P4 in any benchmark.
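The arithmetic behind this claim: sustained performance scales roughly with IPC (instructions per clock) times clock rate, and Nehalem's IPC dwarfs NetBurst's. A toy model - the IPC figures below are illustrative assumptions of mine, not measured values:

```python
# Toy model: sustained throughput ~ instructions-per-clock * clock rate.
# The IPC numbers are rough assumptions for illustration only.
def relative_perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

p4_4ghz = relative_perf(ipc=0.8, ghz=4.0)  # NetBurst: long pipeline, low IPC
i7_2ghz = relative_perf(ipc=2.0, ghz=2.0)  # Nehalem, heavily underclocked
print(i7_2ghz > p4_4ghz)  # True: the lower-clocked chip still wins
```

Even with these conservative numbers the underclocked Nehalem comes out ahead, before counting its extra cores and cache.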
Score: 11
September 14, 2010 6:58:29 PM


Actually, Bulldozer is not supported on AM3, not even partially.

Direct quote from AMD's Director of Product Marketing for Server/Workstation products, on AMD's website.

Quote:
in order to get the best performance, features and lowest price AMD is moving Bulldozer into an AM3+ socket. There will be no AM3 support as that could compromise features, performance and lead to a higher price.


http://blogs.amd.com/work/2010/08/30/bulldozer-20-quest...
Score: 0
September 14, 2010 7:07:15 PM

The processor will probably crash, or give you problems with Windows, like all Intel processors.
Score: -13
September 14, 2010 7:41:31 PM

Sounds awesome, this is going to kick arse.

Could the AMD trolls please vent their unhappiness somewhere else?
It will only be another year before you get a new desktop CPU. :)
Score: 2
September 14, 2010 8:29:33 PM

"The graphics part itself will support DirectX 10.1 features, but anyone who wants DX11 will probably have a discrete part already on his build list."

But you'll go ahead and charge me for the R&D and materials/manufacturing of the 10.1 part anyway, despite admitting that it's actually useless since I would be buying discrete graphics? Thanks, but no thanks. Looks like I'm skipping another generation.

"The views expressed here are mine and do not reflect the official opinion of my employer or the organization through which the Internet was accessed."
Score: 3
September 14, 2010 8:33:32 PM

Dowsire: quad channel memory


Why? Triple channel shows just about zero improvement for any ordinary task - or rather, pretty much any task outside server/rendering-farm applications. Hence the uselessness of the i7/i9 as desktop CPUs (great server CPUs, though!). Quad is even more useless. If you really need that sort of performance, then you shouldn't be looking at desktop builds but at server boards.

"The views expressed here are mine and do not reflect the official opinion of my employer or the organization through which the Internet was accessed."
Score: 3
September 14, 2010 8:49:08 PM

truerock: I'm thinking anything spent on Intel tech in 2011 is going to be almost instantly obsolete. Hopefully 2012 will be a year of tech revolution. Who knows - maybe CPUs with clockspeeds significantly faster than my 7 year old 3.6 GHz Pentium 4.

You still think clock speed is that important?
Score: 1
September 14, 2010 9:43:20 PM

truerock: Waimea Bay / Patsburg = high-end
Cougar Point = low-end


So Patsburg will be targeted at enthusiasts
Whereas Cougar Point will be targeted at hot, older ladies
Score: -1
September 14, 2010 9:47:56 PM

CPU+GPU is the future.
Dynamic Resource Allocation between CPU and GPU is the future.
Smart move for Intel.
AMD needs to catch up.
:) 
Score: -2
September 14, 2010 9:50:21 PM

Intel, please put one of those Core processors in my smartphone so that I can do away with the PC altogether. :D
Score: -3
September 14, 2010 9:57:54 PM

Fusion will achieve a DirectX 11 APU. Too bad for Intel!!
Score: -3
September 14, 2010 10:02:38 PM

the_krasno: Fusion will achieve a DirectX 11 APU. Too bad for Intel!!


Shame... Intel won't be able to do the DX11 slideshow... You're acting like the integrated GPU will be able to effectively render DX11 stuff, when the top-tier graphics cards have trouble with that right now.

Intel also has a much stronger CPU than AMD (at least for now), so I would still call it an overall win for Intel.
Score: -2
September 14, 2010 10:11:12 PM

Quote:
The graphics part itself will support DirectX 10.1 features, but anyone who wants DX11 will probably have a discrete part already on his build list.


So you're saying a CPU that's on the horizon for, what, a year or two from now is going to have a GPU on it that supports an already obsolete version of DX? Granted, yeah, there aren't a ton of DX11 games out yet, but in a year or two when these CPUs hit the shelves, I can't help but think that people will be sorely disappointed in its performance in an "average" DX11 game.


[edit] As Nitrium pointed out, AnandTech already did a review on it, and they announced that it will have what essentially amounts to 6 or 12 CUDA cores, as Nvidia would call them. Nvidia's GTX 470 has 448 CUDA cores and it still has problems pushing the pixels. I can't imagine cutting my GPU down to 12 cores from 448, LOL! Did Intel learn nothing from their "Intel Extreme Graphics"? Every gamer called them Intel extremely bad/crappy/slow/old graphics! [/edit]
Score: 1
September 14, 2010 10:42:31 PM

Quote:
I can't imagine cutting my GPU down to 12 cores from 448, LOL! Did Intel learn nothing from their "Intel Extreme Graphics"?


It is pretty clear that every current discrete GPU will outperform Sandy Bridge. Sandy Bridge is really only good enough for HTPC and office work (which, I guess, is most of the PC market). For gamers, Sandy Bridge will still have to be combined with a separate GPU, so the graphics component of this new CPU will just sit idle, sucking system power for no beneficial reason. No Sandy Bridges currently slated come without integrated graphics.
Score: 0
September 14, 2010 10:54:27 PM

milktea: CPU+GPU is the future. Dynamic Resource Allocation between CPU and GPU is the future. Smart move for Intel. AMD needs to catch up.


RIGHT ON!

The DIYer gamers throw a temper tantrum every time someone suggests this, but it's true nonetheless. I expect drive-bay hard drives to become obsolete eventually too... hard drives will be a card that plugs into your main board... and maybe RAM and hard drive will be integrated onto the same card.

The inescapable eventuality of miniaturization is a complete computer on a single chip (with multiple layers). Either it comes to that, or at some point progress stops.
Score: 0
Anonymous
September 14, 2010 11:11:17 PM

@milktea

Because AMD haven't figured out how to cobble a DX10 card into its CPU, guess they'll just have to live with DX11.....
Score: 0
September 14, 2010 11:26:43 PM

How come anything half decent is always about one year out?
Score: 1
September 14, 2010 11:56:49 PM

Meh....I prefer regular graphics cards!
Score: 1
September 15, 2010 12:01:08 AM

nitrium said:
Quote:
I cant imagine cutting my gpu down to 12 cores from 448 LOL! Did intel learn nothing from their "intel extreme graphics" ?


It is pretty clear that every current discrete GPU will outperform Sandy Bridge. Sandy Bridge is really only good enough for HTPC and office work (which I guess is most of the PC market). For gamers, Sandy Bridge will still have to be combined with a separate GPU, so the graphics component of this new CPU will be just idle and sucking system power for no beneficial reason. No Sandy Bridges currently slated come without integrated graphics.

How much power do you think it's going to take at idle? I bet it's practically nil.
Also, Socket 2011, the 1366 replacement, is not slated to have an IGP - at least, none announced.
Score: 0
September 15, 2010 12:10:35 AM

What I would like to see is an IGP that assists in communication between the CPU and discrete graphics, if present.

For example, if a consumer intends to just use integrated graphics, the IGP on the chip will naturally function as the graphics processor, but if another consumer intends on using a high end dedicated graphics card, then the same IGP on the CPU should be able to feed info to the card faster than the CPU could normally. This would require new architecture for about every component, but it seems the smartest, least alienating decision.
Score: 1
September 15, 2010 12:16:43 AM

victorintelr: Very interesting concept, putting the CPU and GPU together, though if they ever hope to catch up with AMD/Nvidia graphics performance... they still have a loooooong way to go. AMD has ATI at its disposal and already has a solid advantage. Intel still relies on its CPU for the most taxing tasks.

Well what the **** else are they gonna rely on?
Score: 1
September 15, 2010 12:59:42 AM

The real question is: can the GPU stream processors be used for computing, like CUDA, without writing code just for it? The integrated graphics are useless for me. Llano should be capable of this feat.
Score: 0
September 15, 2010 3:21:48 AM

milktea: CPU+GPU is the future. Dynamic Resource Allocation between CPU and GPU is the future. Smart move for Intel. AMD needs to catch up.

Catch up to what? Intel graphics is a huge bottleneck - haven't you noticed the benchmarks on those yet? Good god, man, even consoles have better graphics processors than Intel.

banthracis: Clock speed has zero correlation to performance between different architectures. I guarantee you an i7-930 under clocked to 2ghz will beat a 4.0ghz P4 in any benchmark.


Cool, let's go one step further and limit the i7-930 to PC6400 memory speed as well. Now that your quad core just got its speed crippled, the only three advantages it has are 4 cores, higher cache, and a crippled FSB at 800 MHz instead of 1600/2000 MHz. Now you're right back in line with Core 2, and we have charts showing P4 vs dual core/Core 2/Core 2 Quad. I think you'd find the encoding charts a bit interesting.

banthracis: So slide 17 says they're keeping the integrated PCI-e from P55. Does this mean that we'll once again have a 16 total PCIe Lane limit or was this issue resolved? Also, did Intel decouple the PCIe controller from BCLK? One of the biggest issues with Lynnfield was the stupid (and pointless) coupling of the 2 which limited stock v overclocking greatly.


I'm hearing that SLI/CrossFire will have an 8x limit in Sandy Bridge - is this right? If that's the case, I swear I'm hearing 'Jaws' music from AMD.
Score: 0
September 15, 2010 3:25:42 AM

loomis86: RIGHT ON! The DIYer gamers throw a temper tantrum everytime someone suggests this, but it's true none the less. I expect drive bay hard drives to become obsolete eventually too... hard drives will be a card that plugs into your main board... and maybe RAM and hard drive will be integrated onto the same card. The inescapable eventuality of miniaturization is a complete computer on a single chip (with multiple layers). Either it comes to that, or at some point progress stops.


Let me guess, you game with an iPad, right, my Klingon Federation of Planets friend? A real pipe dream would be the holodeck, where you don't just push buttons but actually have to play an FPS like you would real war or paintball/airsoft.
Score: -1
September 15, 2010 4:36:49 AM

f-14: let me guess you game with an iPad right my Klingon Federation of Planets friend. A real pipe dream would be the holodeck where you don't just push buttons, but actually have to play an FPS like you would real war or paintball/airsoft.


Lemme guess, you get your jollies gaining ten more frames per second. Hot-rodding personal computers with a GPU will one day be as pointless as hot-rodding a 1970 Chevy Nova with a Holley 4-barrel carburetor.
Score: 1
September 15, 2010 5:06:27 AM

I'm impressed.
Score: 0
September 15, 2010 6:02:53 AM

scook, I don't see how you can say that. They do have stronger CPUs, but AMD has the stronger GPUs. So really, by that logic, you could say these things even out. Of course, I don't think it's gonna work that way. But I don't see how anybody can jump to any conclusion yet.

Personally I'm not a fanboy of either; when it comes to CPUs I just go with whoever is best at the time, as I haven't had problems with either AMD or Intel.
Score: 2
September 15, 2010 6:52:28 AM

I'm thoroughly underwhelmed by Sandy Bridge, or more accurately "iCore 2".

It's basically a slight upgrade on the iCore structure with a low-end GPU built in. That's all well and good, and as expected.

However, you have to consider the market segment it has to compete in. It's going to go up against AMD's Llano, which is a Phenom upgrade with a mid-level GPU built in. Both are coming to market at roughly the same time, Q1 2011, and both are built on a 32nm process.

The SB will probably be on par with the Llano overall, but it's my firm estimate that the Llano will be priced far below the SB, and that for the price of an SB APU you could get both a better-performing CPU (iCore or Bulldozer) and discrete graphics (AMD/Nvidia).

AMD has had the Fusion project in the pipeline for years, and the SB increasingly looks like a last-minute, thrown-together chip meant to compete with it.
Score: 0