Sandy Bridge: Socket 1155 and 2011

April 21, 2010 4:29:36 PM

http://www.bit-tech.net/hardware/cpus/2010/04/21/intel-...

Quote:
They will be split into two markets: mainstream and enthusiast. The mainstream models will replace the current LGA1156 'H1' Clarkdale and Lynnfield CPUs (Core i3 and Core i5) and will use LGA1155 'H2' packaging.

Yes, that's right, LGA1155 - one fewer pin than current LGA1156 CPUs. These sockets are NOT compatible, so you cannot use an existing LGA1156 CPU in a new motherboard or vice versa. Not only is the position of pin-1 different, the socket notch has moved from 9mm to 11.5mm from the centre and the entire voltage plane layout has changed.


Quote:
Next is Sandy Bridge 'E' (Enthusiast or Extreme, take your pick) 'Patsburg' platform, which features a huge new LGA2011 socket to replace LGA1366 in Q3 next year (although right now that seems dubiously convenient given the year of launch).


So both Nehalem/Westmere sockets are now going to be obsolete at the end of the year. Note this is unofficial, but the writing's on the wall at this point.
April 21, 2010 6:29:56 PM

We shall see!

Intel is going to have some explaining to do if 1366 also doesn't get it.
April 21, 2010 8:47:54 PM

Raidur said:
We shall see!

Intel is going to have some explaining to do if 1366 also doesn't get it.



It's going to be a completely new platform. I don't think they'll support LGA1366. :(  Why, Intel, why....

BUT, there will be more hexa-cores for 1366. I hear an i7-970 is already on the way and will be released in the near future:
http://www.tweaktown.com/news/14965/intel_readies_up_ch...

Although it won't be much cheaper, maybe $100 less... :pfff: 


edit: http://www.ubergizmo.com/15/archives/2010/04/new_core_i...

Aargh, I can't believe they're still branding C2D (and one i3-based model, the G6950) as Pentiums. Pentium, Celeron names, go away!

And what's with all these duals with Hyper-Threading for $400 (!!) (the upcoming i5-680)?
i5-750 FTW! Make more true i5 quads.....
April 21, 2010 9:26:12 PM

Lol yeah, Pentium has been played out enough. I don't think any of us will miss the name. :) 

Who cares about hexa-cores, I want higher IPC, dammit! Not that I use either of the new chipsets, but I feel for those who do and were expecting Sandy support.

Don't even get me started on i5 dual cores... those CPUs make me laugh. I feel bad for anyone getting sucked into that crap (including those with high budgets buying it because they're afraid to change a SINGLE option in the BIOS: the base clock, for an i3).
April 21, 2010 11:32:48 PM

Huh, if Bulldozer comes out in Q1, then I think AMD will have the performance crown for a bit. Bulldozer will be 8 cores (4 modules), and Intel's 6-core SB is supposedly coming in Q3.
April 22, 2010 12:07:40 AM

Doesn't surprise me if this is true; Sandy Bridge is both a CPU and a GPU, after all. AMD's Fusion, I believe it's called Llano, needs a new socket as well.
April 22, 2010 2:00:21 AM

I doubt Intel will kill off LGA1366 considering the new Xeons use that socket. If Intel does kill it, I expect motherboard manufacturers to support the Xeons on our "gamer" boards. IIRC, ASUS/Gigabyte did this a while back (e.g. E8400 = Xeon E3110).

If both of these fail, I'm going to hate Intel for the next few generations of CPUs. I'll be AMD4Life for a while.
April 22, 2010 12:06:54 PM

To be fair, it should have been suspected a new socket would be needed due to the integrated GPU. It was simply assumed 1366 pins would be enough...
April 22, 2010 3:02:52 PM

Well, that's about it. Intel just lost one customer here...
April 22, 2010 3:23:50 PM

Raidur said:
We shall see!

Intel is going to have some explaining to do if 1366 also doesn't get it.

Here is your answer: why just sell a chip when you can sell a chip and a motherboard? Why do you think Intel kicked Nvidia off of their motherboards? One word: GREED.

Intel does not care about losing one customer when they have billions per quarter coming in as profit.
April 22, 2010 9:03:01 PM

They'll lose a lot of enthusiasts, but yeah, we'd barely dent their funds.
April 22, 2010 10:07:41 PM

Shadow703793 said:
I doubt Intel will kill off LGA1366 considering the new Xeons use that socket.


Bingo. Server CPU sockets are generally expected to have a reasonable lifespan so that motherboards and other related parts can be sold for a long enough time to make the much larger validation costs worthwhile. If Intel rolled out a new server socket every year, they'd tee off just about every third-party manufacturer of compatible boards, heatsinks, complete server systems, and other related parts because of validation costs.

I'd expect Intel to hold onto LGA1366 for at least one more product revision. They typically offer at least three different revisions of their chips on a particular socket before they jump to the next socket, or else people get PO'd.

- Socket 8: 500 nm PPro, 350 nm PPro, 250 nm PII OverDrive
- Slot 2: 250 nm PII Xeon, 250 nm PIII Xeon, 180 nm PIII Xeon
- Socket 603: 180 nm P4 Xeon, 130 nm P4 Xeon*
- Socket 604: 130 nm P4 Xeon, 90 nm P4 Xeon, 65 nm P4 Xeon, 65 nm Core 2 Xeon, 45 nm Core 2 Xeon
- PAC418: 180 nm Merced Itanium, which almost nobody bought.
- PAC611: All Itanium 2s except the latest Itanium 9300s.
- LGA771: 65 nm NetBurst Xeon, 65 nm Core 2 Xeon, 45 nm Core 2 Xeon
- LGA1366: 45 nm Nehalem Xeon, 32 nm Nehalem Xeon.

*Socket 603 and 604 are almost identical.

This would suggest that Intel has at least one more revision left before they consider retiring LGA1366, so Sandy Bridge on LGA1366 is probable.

Quote:
If Intel does kill it, I expect motherboard manufacturers to support the Xeons on our "gamer" boards. IIRC, ASUS/Gigabyte did this a while back (e.g. E8400 = Xeon E3110).


They're basically identical, except all of the Xeons have microcode/fuses left intact that allow ECC memory to be used, and the 5xxx Xeons have microcode/fuses that leave the second QPI link enabled. It shouldn't take much extra BIOS support to run LGA1366 Xeons on boards that can already run LGA1366 Core i7s.

Quote:
If both of these fail, I'm going to hate Intel for the next few generations of CPUs. I'll be AMD4Life for a while.


Wouldn't that be "AMD4AWhile" then? ;) 
April 22, 2010 11:15:01 PM

^ :lol:  Yes, it would be AMD4AWhile. :lol: 
April 24, 2010 12:03:57 AM

This part right here makes it hard to trust for now:

Quote:
A third party gave us this information - Intel won't comment on unreleased products.


When a third party gives info that's supposed to be correct 6+ months before a CPU release, you'd better bet there is a large chance it will be false.

I think a more realistic amount of true info will trickle out until about October, and from then on we will see much more on it.
April 24, 2010 1:20:13 AM

Socket 2011 with an EE CPU and RAM in every slot. It had better be an 8 core CPU or I'm not wasting my money. I'll get a 970 or 980x if SNB doesn't have an 8 core CPU.
April 24, 2010 9:21:14 PM

one-shot said:
Socket 2011 with an EE CPU and RAM in every slot. It had better be an 8 core CPU or I'm not wasting my money. I'll get a 970 or 980x if SNB doesn't have an 8 core CPU.


Sandy Bridge will scale to 8 cores natively. Whether those parts will be out at launch is another question.

Socket 2011 is mainly for the additional memory channels. Four channels would be nice for someone running a private server or doing a lot of virtualization work, since that would allow 32GB max (rough math sketched below).

But for most people, four memory channels at this time will be overkill. I would also guess that it will have either a faster QPI or 2-3 QPI links instead of just one. That would increase memory bandwidth by quite a bit for low-end servers.
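To put that 32GB figure in context, here's a rough back-of-the-envelope sketch of the math in Python. The 2 DIMM slots per channel and the 4GB module size are my own assumptions for illustration, not anything confirmed about the platform:

# Rough sketch of the max-memory arithmetic above.
# Assumed: 2 DIMM slots per channel and 4 GB DDR3 modules
# (the biggest unbuffered sticks commonly available right now).
channels = 4              # quad-channel on Socket 2011
dimms_per_channel = 2     # assumption
gb_per_dimm = 4           # assumption
max_memory_gb = channels * dimms_per_channel * gb_per_dimm
print("Max memory:", max_memory_gb, "GB")  # -> Max memory: 32 GB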
April 24, 2010 9:49:16 PM

loneninja said:
............. after all. AMD's Fusion, I believe it's called Llano, needs a new socket as well.


Says who?
April 25, 2010 1:27:17 AM

FALC0N said:
Says who?

Llano will have an integrated PCIe 2.0 controller, a dual-channel DDR3-1600 memory controller, and 4MB of L3 cache.
The new socket is rumored to be FS1 (Fusion Socket 1). Due to the internal changes in the chip and the need for graphics output, you can't run it in AM2+/AM3 boards, since the PCIe controller is now on the chip.

AM2+/AM3 is hardly a dead socket, however; Bulldozer will initially be supported by them, followed by an updated AM3r2, possibly implementing more memory channels for their 128-bit Bulldozer. Not much is known about the latter; I'm just speculating it will be a late 2011 to early 2012 release.
April 25, 2010 1:39:52 AM

Just cut 1/211 pin(s) off your LGA1156/LGA1366 and you will be Sandy Bridge (LGA1155) ready! :p 
April 25, 2010 5:30:01 AM

noob2222 said:
Llano will have an integrated PCIe 2.0 controller, a dual-channel DDR3-1600 memory controller, and 4MB of L3 cache.
The new socket is rumored to be FS1 (Fusion Socket 1). Due to the internal changes in the chip and the need for graphics output, you can't run it in AM2+/AM3 boards, since the PCIe controller is now on the chip.

AM2+/AM3 is hardly a dead socket, however; Bulldozer will initially be supported by them, followed by an updated AM3r2, possibly implementing more memory channels for their 128-bit Bulldozer. Not much is known about the latter; I'm just speculating it will be a late 2011 to early 2012 release.


My understanding is that FS1 is a notebook socket. Not sure what that tells us about the desktop variant.

It's reasonable to assume AM3r2 will be dubbed AM3+, and it's likely the desktop socket for Fusion. Fusion on the desktop might also be AM3-compatible in some form.
April 25, 2010 10:45:59 PM

FALC0N said:
My understanding is that FS1 is a notebook socket. Not sure what that tells us about the desktop variant.

It's reasonable to assume AM3r2 will be dubbed AM3+, and it's likely the desktop socket for Fusion. Fusion on the desktop might also be AM3-compatible in some form.


From what we know, only Bulldozer (non-Fusion) will be. I don't think AMD had AM3 planned for Fusion, since they originally cut Fusion out of their roadmap and then added it back.

I think having an actual GPU on-die will require a socket change no matter how badly people want it not to happen. It's adding something more to the packaging, and currently the 938 pins in AM3 are for the CPU alone.
April 25, 2010 10:59:59 PM

Well Jimmysmitty, I'm thinking socket AM3+ (the likely name for AM3 v2) will have that desktop Fusion and Bulldozer support. BUT it's possible AMD built Fusion support into AM3 to begin with. Fusion has been on the roadmap for a while, so it would not surprise me if they thought ahead. But even if they didn't, Fusion may still support AM3 in a limited form, such as the CPU running without the on-die GPU functioning, for example.
April 25, 2010 11:00:30 PM

We don't know, but it's highly unlikely that Llano won't need a brand-new socket on the desktop too.
April 25, 2010 11:52:06 PM

FALC0N said:
Well Jimmysmitty, I'm thinking socket AM3+ (the likely name for AM3 v2) will have that desktop Fusion and Bulldozer support. BUT it's possible AMD built Fusion support into AM3 to begin with. Fusion has been on the roadmap for a while, so it would not surprise me if they thought ahead. But even if they didn't, Fusion may still support AM3 in a limited form, such as the CPU running without the on-die GPU functioning, for example.

While it is possible, it is unnecessary. Fusion won't be any faster than the current Phenom II or Bulldozer.

It has a built-in GPU, which produces more heat and draws more power. If they were to make the CPU just as fast as the current Phenom II line, the power draw would be more than the specifications would allow. Fusion is targeted as a CPU/GPU solution that will be more useful than Intel's; graphics-wise it will tear up the i5-661. With that in mind, I don't see it being the cheapest chip available for the AM3 socket, which makes it different from the Intel option, and a waste of money when paired with an actual discrete GPU.

Only time will tell, but I don't see a reason to support Fusion on the AM3 board line.
April 26, 2010 12:06:50 AM

Ummm... why does AMD Socket 939 need so many pins compared to LGA775 and Socket 478?
April 26, 2010 12:41:49 AM

noob2222 said:
While it is possible, it is unnecessary. Fusion won't be any faster than the current Phenom II or Bulldozer.

It has a built-in GPU, which produces more heat and draws more power.........................If they were to make the CPU just as fast as the current Phenom II line, the power draw would be more than the specifications would allow


The GPU would draw more power IF IT'S ACTIVE.

And no, Fusion should be faster than the current-generation Phenom. Why? Low latency with the GPU (much like on-die memory).

It's also a 32nm die shrink. Take a look at what happened from 65nm to 45nm. The difference is HUGE in both power draw and maximum clock speed.

And finally, if it is possible to make it fully functional on AM3, then why not? Flexibility is the hallmark of the AMD platform.
April 26, 2010 1:05:29 AM

WOW. I really LOVE how I posted this same story a week or so ago and everyone called me a troll.

Truth hurts - AMD > Intel.
April 26, 2010 2:11:05 AM

FALC0N said:
The GPU would draw more power IF IT'S ACTIVE.

And no, Fusion should be faster than the current-generation Phenom. Why? Low latency with the GPU (much like on-die memory).

It's also a 32nm die shrink. Take a look at what happened from 65nm to 45nm. The difference is HUGE in both power draw and maximum clock speed.

And finally, if it is possible to make it fully functional on AM3, then why not? Flexibility is the hallmark of the AMD platform.

First point: a CPU is limited just like a GPU; I believe the maximum a CPU can draw at factory settings is 140W. It can't be designed to go over that, which limits the Fusion processor, since it's specced with the GPU on. The die shrink allows them to put the GPU on the Phenom II core and stay within spec.

Point 2 is valid, except you would have two PCIe controllers: one on the motherboard, one on the CPU. I doubt they can implement a way to disable the one on current motherboards, which makes the low-latency PCIe on the CPU a moot point.

Point 3: like I said, only time will tell.
April 26, 2010 2:58:37 AM

noob2222 said:
First point: a CPU is limited just like a GPU; I believe the maximum a CPU can draw at factory settings is 140W. It can't be designed to go over that, which limits the Fusion processor, since it's specced with the GPU on. The die shrink allows them to put the GPU on the Phenom II core and stay within spec.

First, they can make a CPU that uses 300 watts if they want. I don't know where you got this 140-watt limit from. Just because they don't currently doesn't mean they can't.

Power usage will be an important factor in the laptop Fusion units for sure. Fusion isn't initially about performance; it's about value and low-power environments.

However, if they elect to release it on the desktop, they can pump all the power they want out of it. And they can put the CPU and GPU portions of the chip on different power planes. Heat could be an issue, though.
April 26, 2010 3:57:03 AM

FALC0N said:
First, they can make a CPU that uses 300 watts if they want. I don't know where you got this 140-watt limit from. Just because they don't currently doesn't mean they can't.

Power usage will be an important factor in the laptop Fusion units for sure. Fusion isn't initially about performance; it's about value and low-power environments.

However, if they elect to release it on the desktop, they can pump all the power they want out of it. And they can put the CPU and GPU portions of the chip on different power planes. Heat could be an issue, though.


Heat and TDP will be the two primary issues to overcome; heat more than TDP. Most decent GPUs tend to use about 150-200W and run at around 50°C idle in most cases. My GPU hits 70°C under load.

But the GPU in Fusion is more of a low-profile part, probably at around HD 5400-level performance, so maybe it will work.
April 26, 2010 5:17:08 AM

MIT is trying to build a new generation CPU that involves atomic level robotics to fix the TDP 'bug'. Transistors run too hot.
April 26, 2010 6:36:59 AM

werxen said:
MIT is trying to build a new generation CPU that involves atomic level robotics to fix the TDP 'bug'. Transistors run too hot.


Intel has one fix: it's utilizing fiber optics for data transmission instead of electrical signaling. Of course, this means there will be some sort of light-based transistor, but still, it's better than electrons roaming around.
April 4, 2011 2:55:49 PM

Bottom line: Socket 2011 is going to support PCIe 3.0 x16 and sport quad-channel DDR3 RAM, along with probably the normal goodies (6Gbps SATA and USB 3.0), and the real kicker is they'll support 8-12 core CPUs, yeah... SAVE YOUR MONEY! :fou:  :ouch: 