Solved

Final upgrade for entire life, go with LGA 2011 or LGA 1150?

Tags:
  • 1150
  • CPUs
September 15, 2014 5:29:31 AM

I want to upgrade to a new computer one final time for my entire life and never have to deal with it again. I've decided Windows 7 64-bit is "good enough": it supports 64-bit addressing, native IPv6, and DirectX 11, so I don't see the point of upgrading to any future version of Windows ever. I will mainly use the computer 50% for playing hardcore 3D games, 25% for web surfing, and 25% for applications like 3ds Max, Photoshop, AutoCAD, Cadence, Lightroom, etc.

In terms of future proofing, platform longevity, and that sort of stuff, should I go with an LGA 2011 or an LGA 1150?

I'm confused because it seems LGA 2011 is the more "powerful" of the two, geared towards enterprise/server/hardcore stuff compared to LGA 1150; however, LGA 2011 came out in 2011 and is "older" than LGA 1150. Both seem to have preliminary support for Broadwell's 14nm architecture, and neither supports DDR4.

Since I plan on this being my one last and final upgrade for my entire life, I want to get it right.

I understand technology is an ever forward-moving target and there is no such thing as "future proof", just like how it would be equally ludicrous if back in 1995 someone wanted to spend a lot of money on a system that would future-proof them to 2015.

But 14nm is close to the end of the line; any smaller than 8nm and we run into quantum effects. I don't see silicon being pushed much further, and besides, I doubt we will see quantum computers sold to consumers as desktop devices in our lifetime.



However, if the analogy makes any sense, the jump from Windows XP to Windows 7 crosses the "64-bit threshold" and we will never run out of RAM ever again... See how fast we went from 4-bit to 8-bit to 16-bit to 32-bit, but 64-bit is the end of the line...

Same with the switch from IPv4 to IPv6: with IPv6 we will never run out of IP addresses ever.

In a sense, Windows 7 64-bit Pro with sp1 is "good enough", forever.

I also think desktop computing has basically 'matured' where we don't need to swap systems every few years.

I guess it is all speculation, but I think computer systems built today will be able to last a good 10-20 years without being "outdated" compared to newer systems out there, especially since everyone these days is so concerned about mobile, tablets, etc., and small form factor stuff.

And in a sense, XP still has more longevity than Vista; XP holds a much larger market share than Vista.

So in terms of future processors that support or are backwards compatible with current sockets, will 2011 or 1150 hold more of the platform mindshare?

If I (for my own reasons) wish to pick one socket type and stick to it forever, which one should it be?


September 15, 2014 6:01:49 AM

I hope that you are 70-80 years old now, if you don't plan to upgrade again.
September 15, 2014 6:35:39 AM

anonymous1 said:
I hope that you are 70-80 years old now, if you don't plan to upgrade again.


LOL


I see the 2011 socket holding on for another 2-3 years and the 1150 for about 1-1.5 years before Intel puts out a new socket. The 1150 is already 1.5 years old, and I can see it being run for a good 5-6 years, but something will give out well before that if you're a hardcore gamer. Either a GPU upgrade will be bottlenecked by the CPU, or the game itself will get bottlenecked.


If all you want to do is office work and play on the internet, then yes, either one of those sockets should last you the 10-20 years, provided the motherboard does not fail; and by then you will not find a replacement board.
September 15, 2014 6:37:35 AM

There are three revisions of 2011, and you are thinking about the X79 chipset platform. LGA 2011-v3 just came out with a new set of three CPUs, which includes the 5820K, 5930K, and the 5960X, on the X99 chipset. This chipset supports DDR4. If you want something that will last you for some years, you are going to spend some serious money. The 5960X is $1k alone, and 64GB of memory will probably run you another couple grand.

Your view on OSes is skewed. It may be "good enough" for you, but how important is your security? XP may have market share, but that is because people are cheap and do not like to upgrade, or they do not understand the importance of doing so. Have fun with your old OS that is so full of holes and vulnerabilities that you will always have to worry about your data being protected, and eventually the software that you wish to use will no longer be supported.

As anon said above, unless you are 80, plan on upgrading again at some point in the future. If you did get the 5960x I would expect the system to last at least 5-7 years without being a complete slouch.
September 15, 2014 11:18:50 AM

I'm asking a realistic question. I plan to keep the motherboard socket type; even if I upgrade to a new mobo, I will keep the socket and RAM type, etc. I'm tired of Intel changing motherboards every two years, and because Moore's law is dead.
September 15, 2014 11:26:14 AM

Welcome to technology, things change and change often. Go with a 5930k, a supporting motherboard, and all the ram it can handle. They will last you a good while, but eventually, even that will be outdated. Every time people think that hardware cannot possibly get any faster, someone finds a way to do it. To be blunt, your idea of the system lasting 10-20yrs is nothing short of a pipe dream.
September 15, 2014 11:33:59 AM

logainofhades said:
Welcome to technology, things change and change often. Go with a 5930k, a supporting motherboard, and all the ram it can handle. They will last you a good while, but eventually, even that will be outdated. Every time people think that hardware cannot possibly get any faster, someone finds a way to do it. To be blunt, your idea of the system lasting 10-20yrs is nothing short of a pipe dream.


1903-1945: we went from bicycle flying machines to jet engines (42 years)
1967-2014: the Boeing 737 debuted in 1967, and the Boeing 737 is still the most popular airplane in the world (47 years)

I had a Q6600 back in 2007 (more than 7 years ago).
I can still play Crysis on the Q6600 like I did back in 2007.

The change in computers from 2000 to 2007 was a quantum leap.
The change in computers from 1993 to 2000 was also a quantum leap.

Changes from 2007 to 2014? Not so much. Especially on the gaming side, since Crysis 1, which came out in 2007, is essentially no different from the max graphics we have today.

In comparison, from 1995 to 2002 we went from DOOM graphics to Quake III graphics, which is a night-and-day qualitative and quantitative difference.

A state of the art desktop PC today is not going to be very much outdated compared to a computer in 2020 or 2025.


September 15, 2014 11:36:43 AM

Crysis 3 would melt a Q6600 system. It requires an i5, at minimum, to get decent gameplay.
September 15, 2014 11:40:27 AM

logainofhades said:
Crysis 3 would melt a Q6600 system. It requires an i5, at minimum, to get decent gameplay.


On the contrary,


After Crysis 1, the Crytek team sold out, stopped being PC-exclusive, and catered to the consoles. They had to make it run more efficiently in order to run on the Xbox 360.

There are only minor incremental improvements in graphics quality from Crysis 1 to Crysis 3, with huge improvements in efficiency due to polished coding of the game engine.

A Q6600 overclocked with a good GPU can run Crysis 3 quite well.
September 15, 2014 12:17:25 PM

I would disagree. A Phenom II X4 965 is faster than an old Q6600, and even it doesn't do very well in Crysis 3.



September 15, 2014 12:23:03 PM

Which socket is more future proof?

1150 or 2011 v3?

Both are rumored to support 14nm....

Five or ten years from now, for which socket type can you still buy new CPUs?

For example, even today you can find CPUs for LGA 775, even though the Q6600 itself is no longer being sold.

Is DDR4 and having 64GB of RAM really that big of a deal?

Will games ever require more than 32GB of ram?
September 15, 2014 12:26:48 PM

In essence, it's true that 2011-v3 is newer than 1150, but my concern is that it will become outdated sooner than 1150, which is more 'mainstream' and thus holds more mind share and market share and is a more stable platform longevity-wise. However, might 2011-v3 be like Vista, trapped between XP and Windows 7? In that case, once Skylake comes out in 2016, 2011-v3 will have had only a 1.5-year lifespan, while ironically 1150 might actually survive much longer as a viable platform and socket type that new CPUs down the road might still come out for.
September 15, 2014 12:29:33 PM

Given Intel's recent history, I would expect to get at least Broadwell-E on the latest X99 platform. X79 saw two generations of CPUs.
September 15, 2014 12:30:15 PM

There is absolutely no possible way to answer those questions, barring the answer of "maybe" or "perhaps." Future proofing is an illusion. Intel could release a socket and stop supporting it a year from now, or they could support it for 2-3 years. Games could migrate to needing more memory / CPU cores next year. There is just no way to know what is going to happen. However, you can look at current trends and say that Intel will "maybe" support 2011-v3 for at least a couple of years, as they just released it with super high-end components, and it is workstation class, which tends to live a bit longer. DDR4 is a big deal: there is a huge difference in performance between it and DDR3. However, this is in synthetic benchmarks; in the real world you will not see much of a difference.
September 15, 2014 12:30:58 PM

What do you mean? One more generation after that, i.e. LGA 1151? Or will there be an LGA 2011-v4?
September 15, 2014 12:33:31 PM

kira70591 said:
There is absolutely no possible way to answer those questions, barring the answer of "maybe" or "perhaps." Future proofing is an illusion. Intel could release a socket and stop supporting it a year from now, or they could support it for 2-3 years. Games could migrate to needing more memory / CPU cores next year. There is just no way to know what is going to happen. However, you can look at current trends and say that Intel will "maybe" support 2011-v3 for at least a couple of years, as they just released it with super high-end components, and it is workstation class, which tends to live a bit longer. DDR4 is a big deal: there is a huge difference in performance between it and DDR3. However, this is in synthetic benchmarks; in the real world you will not see much of a difference.


If that is the case what is the motivation for users to switch to DDR4 given the price premium?

Might the price premium put negative pressure on DDR3 and thus make DDR3 last longer?

I can still go to Fry's and purchase DDR2 RAM; will DDR3 age as gracefully as DDR2?

Basically, is it going to be a difference that ends up making a real difference?


September 15, 2014 12:36:57 PM

The motivation is that it is the latest and greatest, and many people want the "best" whether or not it gives any noticeable difference. You can see the difference with regard to editing programs, rendering, etc.; however, for games, there is not really going to be a performance increase from using DDR4. The price will eventually fall to DDR3 pricing, but all new technology initially carries a high premium. DDR4 also offers higher performance at a lower voltage, which will let you save power over the life of the system. It may not be much, but it will also give off less heat and run cooler.
September 15, 2014 12:37:30 PM

So when we are all using new quantum computers in 30 years or so, you're going to stick with your LGA board from the dark ages?
If the x86/x64 line comes to an end, as you point out, due to shrinking die sizes, do you think Intel and AMD (and ARM, for that matter) are just going to close up shop and call it a day? No, they will come up with something new, and we will all have to upgrade eventually. PCs as they are now won't be around forever.

Unless you only plan on being alive for the next 10 years, you're going to need new technology at some point just to get by.

Also, the one you have is going to break, and parts will become scarce and not worth spending money on to replace, since newer and better parts will be cheaper.
September 15, 2014 12:43:37 PM

Lee-m said:
So when we are all using new quantum computers in 30 years or so, you're going to stick with your LGA board from the dark ages?

Unless you only plan on being alive for the next 10 years, you're going to need new technology at some point just to get by.

Also, the one you have is going to break, and parts will become scarce and not worth spending money on to replace, since newer and better parts will be cheaper.



A quantum computer for the desktop? Like portable fusion devices.

Never gonna happen.
September 15, 2014 12:48:35 PM

kira70591 said:
The motivation is that it is the latest and greatest, and many people want the "best" whether or not it gives any noticeable difference. You can see the difference with regard to editing programs, rendering, etc.; however, for games, there is not really going to be a performance increase from using DDR4. The price will eventually fall to DDR3 pricing, but all new technology initially carries a high premium. DDR4 also offers higher performance at a lower voltage, which will let you save power over the life of the system. It may not be much, but it will also give off less heat and run cooler.


My question is: what if, ironically, 1150 actually "outlives" 2011-v3?

Spec-wise, yes, 2011-v3 is more future proof, as it is newer, supports more than 32GB of RAM, and has DDR4 support. But what if 1150 actually "outlives" 2011-v3 due to its mainstream catering, which makes it more ubiquitous? After all, both will support 14nm... and any smaller than that and we get quantum effects, meaning we are at the end of the road... and hence why I'm building the last, final computer for a lifetime.

September 15, 2014 12:49:30 PM

iratemonkey said:

A quantum computer for the desktop? Like portable fusion devices.

Never gonna happen.

It was a slight exaggeration to go along with your point. You get the idea.

This is the silliest thing I have ever heard if you're under 65 years old and don't plan on hanging in for a long life.
And x86/x64 and Microsoft won't be around forever.
September 15, 2014 1:04:41 PM

Lee-m said:
iratemonkey said:

A quantum computer for the desktop? Like portable fusion devices.

Never gonna happen.

It was a slight exaggeration to go along with your point. You get the idea.

This is the silliest thing I have ever heard if you're under 65 years old and don't plan on hanging in for a long life.
And x86/x64 and Microsoft won't be around forever.


I'm 29 years old. I work in IT as a sysadmin, and growing up I was always the type to 'catch up to the state of the art', the type that used to upgrade every few months, etc.

But I see the bigger picture: things are slowing down, not speeding up; any improvements are incremental, and even that is slowing down.

All the low-hanging fruit is gone. Back in 1945, the ENIAC did ballistic table calculations at less than 1 kHz, and yet it was of immense national security value and its application was world-shattering...

In 1969, the computer that helped land man on the moon was less powerful than a TI-89 calculator, and yet it achieved one of the greatest feats of mankind...

These days everyone's smartphone is orders of magnitude more powerful than ENIAC or the Apollo Guidance Computer, but can anyone truthfully say that we have made "use" of all that power?

The consumer uses it to surf the web, Twitter, Facebook, etc. We have reached the point where each incremental increase in computing power is met with less and less actual utility... you get my point.

Case in point: on a fundamental level, nothing has changed in computing processors since the invention of the integrated circuit nearly half a century ago... basically we are still chipping away at silicon, and we will reach quantum limits dictated by the laws of physics very soon now...

And for the past few decades there has been nothing in academia or otherwise that leads one to believe there is a viable alternative down the road, like DNA computing or quantum computing or nanocomputing, etc.

The fact of the matter is, computers are now mature and the market is saturated. Same reason why we aren't all driving around in electric cars, or even hybrids for that matter. All companies, organizations, and consumers are on silicon. Even if quantum computers magically existed tomorrow, it would still take years if not decades for everyone to switch over. XP was retired long ago, and yet most banks and gas stations and other systems still use embedded XP and won't change for another ten years or more...

And look at games... most games are now focusing on other ideas like "plot" and "creative story" and less and less on sheer graphics or sheer CPU power, etc.

And look at applications like Adobe Photoshop, Microsoft Office, etc. Why do you think Adobe and Microsoft are switching over to "subscription based"? Because there is not that much difference between Office 2010 vs. Office 2013 vs. Office Next, and subscriptions (as opposed to perpetual licenses) are the only way to continue to milk the average consumer... Same with Adobe products like Photoshop... what is the real difference between Photoshop today and the Photoshop that came out five years ago? Nothing... it has all matured.

Even if a quantum computer were possible today and affordable, we as a society would not be able to change because of the 'network effect' and all the 'momentum' built behind silicon. Think of all the software, all the support, all the programming languages, and all the stuff that would have to change along with it. Never going to happen. We can't even get people to change from IPv4 to IPv6!

And besides, more realistically, what will happen is that when we hit the limit, we will just stop changing... desktop computing will mature and stop changing... and finally we won't have to play perpetual catch-up...

Think of airplanes and the aviation industry. It used to be that each new aircraft was bigger, faster, and speedier than the last. Weren't we supposed to have hypersonic Mach 5 commercial jets by now? Instead, what happened? Airplanes today are no faster than they were 60 years ago! The airline industry by and large matured and became saturated long ago.

The same analogy applies to desktops and computing.





September 15, 2014 1:06:43 PM

I have an old (well, to me) P4 socket 478 computer with a Radeon X850 XT and 2GB of DDR RAM I'll trade you. According to you, nothing in the computer world has changed in the past 10 years, so that will run any game you want.

It was bought brand new back in 2000, so it is only 14 years old, which still falls in your 10-20 year window. It has since been replaced by 6 computers.
September 15, 2014 1:17:07 PM

So in 20 years' time, when your outdated PC can't install a new OS, new software can't be installed, and you can't get a browser that works with whatever new internet tech is out, what will you do? Write all your own software from then on?

What about future upgrades to the internet itself? New software and file formats you won't have software to run and open.

Even something as simple as new versions of .NET won't be supported, work, or even install on Win7 in another 10 years.

What about when Steam 2020 says 'Sorry, we no longer support Windows 7, please upgrade'?

Who says there won't be slow changes to IPv6 over time? Who says it will still be appropriate and used 20-30 years down the line?

You must be the worst sysadmin in the world, mate, no offense.

Good luck. You're going to need it if you're still using the same machine 41 years from now when you're 70.
September 15, 2014 1:18:53 PM

by the way just for sh!ts and giggles what games do you play and what are your system specs?

as for my specs

i5-2600k @ 5.4GHz
2x MSI GTX 580 @ 1000 core / 2470 memory, with a custom BIOS for more voltage
8GB of Mushkin RAM
and a full custom water loop

This build will soon be replaced either this year or the beginning of next year, since it no longer gets me the graphics settings or FPS that I want. By the way, it's only 3 years old; I can't even imagine trying to play a game on this system in another 2 years, let alone 7-17 years like you believe you should.
September 15, 2014 1:23:24 PM

Lee-m said:
So in 20 years' time, when your outdated PC can't install a new OS, new software can't be installed, and you can't get a browser that works with whatever new internet tech is out, what will you do? Write all your own software from then on?

What about future upgrades to the internet itself? New software and file formats you won't have software to run and open.

Even something as simple as new versions of .NET won't be supported, work, or even install on Win7 in another 10 years.

You must be the worst sysadmin in the world, mate, no offense.

Good luck. You're going to need it if you're still using the same machine 41 years from now when you're 70.



I've already decided that for personal use (obviously at work it's different) I will stick forever with Windows 7 Pro 64-bit, which maxes out at 128GB RAM (more than I will ever need), natively supports IPv6, and has DirectX 11 support for 3D applications and games.

Beyond that, I can always switch to Linux in parallel or as a replacement for Windows. And Mantle and OpenGL will save the day when all PC games force DirectX 15 (face it, DirectX 12 is hype and a joke).

Web browser? I'm sticking with Firefox 28.0 and will compile my own to attempt to maintain forwards compatibility. Most sites that complain about an old browser can be fooled by a custom user agent set in Firefox's about:config, which does not require compiling anything...
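For reference, the override described above is just a string preference; a minimal `user.js` sketch (assuming the standard `general.useragent.override` pref key, with the UA string below chosen purely as an illustration) would look like:

```js
// user.js — Firefox reads this file at startup and applies each pref.
// Spoof the reported browser version; this UA string is only an example.
user_pref("general.useragent.override",
          "Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:28.0) Gecko/20100101 Firefox/28.0");
```

The same value can be created by hand in about:config as a new string preference.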

And I'm gonna use Office 2013 forever, until I switch to a Linux open-source option.

So with the OS, browser, and Office set in stone, all I have to worry about is whether my final hardware platform should be LGA 2011 or LGA 1150.

Will I TRULY regret not being able to have more than 32GB of RAM, or not being able to use DDR4, or not being able to have more than QUAD cores going forward?

That is my only real question..

September 15, 2014 1:28:37 PM

faalin said:
by the way just for sh!ts and giggles what games do you play and what are your system specs?

as for my specs

i5-2600k @ 5.4GHz
2x MSI GTX 580 @ 1000 core / 2470 memory, with a custom BIOS for more voltage
8GB of Mushkin RAM
and a full custom water loop

This build will soon be replaced either this year or the beginning of next year, since it no longer gets me the graphics settings or FPS that I want. By the way, it's only 3 years old; I can't even imagine trying to play a game on this system in another 2 years, let alone 7-17 years like you believe you should.



I play PMDG 777/737/747X, Train Simulator, and Ship Simulator. Also OrbiterSim and Space Engine.

PMDG makes add-ons for Microsoft Flight Simulator X (dating back to 2006), and they have hinted at no plans to change to X-Plane or anything else (especially after FLIGHT was canned) for the next ten years AT LEAST.

I also play indie PC games like Defcon, Prison Architect, and other indie titles by other developers, such as Antichamber, that do not require a powerful PC at all.
September 15, 2014 1:31:59 PM

iratemonkey said:

Web browser? I'm sticking with Firefox 28.0 and will compile my own to attempt to maintain forwards compatibility. Most sites that complain about an old browser can be fooled by a custom user agent set in Firefox's about:config, which does not require compiling anything...


Have you even tried to browse the web on a 10-15 year old browser? Even the bloody CSS doesn't work.
Never mind the endless upgrades to JavaScript and everything else. I hope you're going to quit your day job, as you'll need all the time to write your own browser.

Maybe you don't plan on being around that long, I don't know. It seems you're already going insane.

Good luck mate.
September 15, 2014 1:37:27 PM

Lee-m said:
iratemonkey said:

Web browser? I'm sticking with Firefox 28.0 and will compile my own to attempt to maintain forwards compatibility. Most sites that complain about an old browser can be fooled by a custom user agent set in Firefox's about:config, which does not require compiling anything...


Have you even tried to browse the web on a 10-15 year old browser? Even the bloody CSS doesn't work.
Never mind the endless upgrades to JavaScript and everything else. I hope you're going to quit your day job, as you'll need all the time to write your own browser.

Maybe you don't plan on being around that long, I don't know. It seems you're already going insane.

Good luck mate.



Browsers these days will last a good five, ten years at least. We aren't talking about the Netscape and IE 4.0 days anymore...

I still have portable versions of Firefox 3.0 that run just fine when the user agent is configured to tell the websites I visit that it is an up-to-date version of FF.

Plus, FF 28.0 (the final version I'm using) already has "good enough" HTML5 support and passes the Acid test 100%... Everything is slowing down... the web isn't the wild west it used to be. FF 28 crossed the "threshold".
September 15, 2014 1:49:52 PM

We have been running out of IPv4 addresses and using NAT and other methods to preserve IPs for the last ten years or more... the jump from IPv4 to IPv6 will be the final, last jump. The IPv6 address space has 2^128, or approximately 3.4×10^38, addresses. That would be about 100 addresses for every atom on the surface of the Earth.

Same thing with RAM and such. The first 4-bit processor debuted in 1971; then in rapid succession we went from that to 8-bit, 16-bit, 32-bit, and now 64-bit...

32-bit limits RAM to 4GB, which is a really major limitation. With 64-bit we have the potential for 16 exabytes of RAM. Nobody alive today will EVER need 16 exabytes of RAM. There doesn't even exist that much RAM in all of the computers in the world COMBINED at the moment!

So from an architecture standpoint, the jump from Windows XP to Windows 7 is the "last final jump".

It doesn't matter that Windows 9 is coming out, or Windows 10, or the next version of Windows after that... we simply can't change anymore at the fundamental level... 64-bit is IT. IPv6 is IT...

And as for NTFS, we will use that forever too... remember what happened to WinFS? Yeah... I do... same thing that happened when the US tried to convert to the metric system...
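The two numbers quoted above are easy to verify; a quick back-of-envelope check in Python (just the arithmetic, nothing else assumed):

```python
# Back-of-envelope check: IPv6 address count and 64-bit address-space size.

ipv6_addresses = 2 ** 128          # IPv6 addresses are 128 bits wide
print(f"{ipv6_addresses:.1e}")     # ~3.4e+38, matching the figure above

bytes_64bit = 2 ** 64              # bytes addressable with a 64-bit pointer
exbibyte = 2 ** 60                 # 1 EiB
print(bytes_64bit // exbibyte, "EiB")  # 16 EiB, i.e. the "16 exabytes"
```

So the 3.4×10^38 and 16-exabyte figures both hold up.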
September 15, 2014 1:51:17 PM

Good god, my grandmother's eMachine can play those games. Go pick up any decent game made in the past 2 years and I'm sure it will bring your system to a screeching halt.

Mr. iratemonkey, what you've just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.

I will forever save this page and come back in 15 years; I hope to hear that you are still running the same computer and sticking it to the man with your Linux, your Firefox 28.0 with its attempt to maintain forwards compatibility, and your Linux open-source office option.

Good day, sir
September 15, 2014 1:53:44 PM

faalin said:
by the way just for sh!ts and giggles what games do you play and what are your system specs?

as for my specs

i5-2600k @ 5.4GHz
2x MSI GTX 580 @ 1000 core / 2470 memory, with a custom BIOS for more voltage
8GB of Mushkin RAM
and a full custom water loop

This build will soon be replaced either this year or the beginning of next year, since it no longer gets me the graphics settings or FPS that I want. By the way, it's only 3 years old; I can't even imagine trying to play a game on this system in another 2 years, let alone 7-17 years like you believe you should.



This will probably give ammo to the OP, but in all fairness, all you really need is a GPU upgrade. Your CPU is still plenty good.
September 15, 2014 1:57:26 PM

logainofhades said:
faalin said:
by the way just for sh!ts and giggles what games do you play and what are your system specs?

as for my specs

i5-2600k @ 5.4GHz
2x MSI GTX 580 @ 1000 core / 2470 memory, with a custom BIOS for more voltage
8GB of Mushkin RAM
and a full custom water loop

This build will soon be replaced either this year or the beginning of next year, since it no longer gets me the graphics settings or FPS that I want. By the way, it's only 3 years old; I can't even imagine trying to play a game on this system in another 2 years, let alone 7-17 years like you believe you should.



This will probably give ammo to the OP, but in all fairness, all you really need is a GPU upgrade. Your CPU is still plenty good.



All I wanted to know is your opinions: if you wanted to build a computer and never have to change the platform again (note I didn't say never upgrade again), sticking to whichever physical CPU socket TYPE exists right now (while still being able to upgrade the CPU), then would it be LGA 1150 or LGA 2011-v3?

Which one will have more longevity? It would seem LGA 1150 is more mainstream, so in the future it will last longer because more CPUs will have been made for it and the market will be flooded with them, etc. Basically, from a PLATFORM and architectural standpoint, which one should I go with? Besides, Intel has said both will support 14nm, and it can't go much smaller than that... less than 10nm and you run into serious quantum issues!

However, what you would sacrifice would be the ability to use more than 32GB of RAM, DDR4, and more than QUAD cores... But will these actually mean all that much in the next ten years? Or will technological progress slow down so much that it won't matter anyhow?
September 15, 2014 1:59:39 PM

logainofhades said:
faalin said:
by the way just for sh!ts and giggles what games do you play and what are your system specs?

as for my specs

i5-2600k @ 5.4GHz
2x MSI GTX 580 @ 1000 core / 2470 memory, with a custom BIOS for more voltage
8GB of Mushkin RAM
and a full custom water loop

This build will soon be replaced either this year or the beginning of next year, since it no longer gets me the graphics settings or FPS that I want. By the way, it's only 3 years old; I can't even imagine trying to play a game on this system in another 2 years, let alone 7-17 years like you believe you should.



This will probably give ammo to the OP, but in all fairness, all you really need is a GPU upgrade. Your CPU is still plenty good.


The 2600K replaced an i7-950 overclocked to 4.8GHz with 12GB of triple-channel RAM. The 950 now plays with 4 GTX 8800 Ultras, 3 in SLI and a 4th for PhysX.
September 15, 2014 2:02:09 PM

I don't see 1150 outliving 2011-v3. From what I understand, 1150 ends with Broadwell. X99 will most likely end with Broadwell-E, which will come out when Broadwell's successor and a new socket are released.

Best solution

September 15, 2014 2:22:07 PM

All I'm seeing here is a bunch of people getting scared about how their e-peens will shrink if there's not a reason to get new hardware every 2 years. PC progress is certainly slowing down.

From 1990 to 1995, game graphics improved seemingly exponentially in quality, going from little more than static images with a handful of colors to 3D environments. That was a leap of 1000% in 5 years.

From 2000 to 2005, game graphics doubled or tripled in quality, going from simple flat textures with minimal/no lighting to fully modeled environments with dynamic lighting and physics, as well as some of the first real open world games. That was probably a leap of 500% in 5 years.

From 2005 to 2010, game graphics stayed similar, but more effects became dynamic. Polish improved somewhat, 'modern classics' released, but overall it was a refinement of existing technologies. Maybe a 75% increase in 5 years, and most of that was between 2005 and 2007.

From 2010 to present... Uhm... We added a few more polygons and increased lighting complexity a bit, even though game budgets over the years have risen from $5 thousand to $500 million. Game developers are folding left and right due to massive inefficiency and increased cost, and the 'next-gen' consoles are flopping because the stronger hardware over their predecessors isn't helping them much at TV viewing distances. Crysis 3 on minimum looks nearly as good as it does on ultra, and there are threads all across the internet saying BF4 looks virtually no different from low to ultra except for the framerate jump on a low-end card from 5 fps to 50 fps. In the last 4 years, we've seen a 10% increase in quality for an exponential increase in cost.

The 1280x720 resolution lasted like 2 years... 1080p is going on 10 years now and people still only rarely think that 1440p is even worth the switch.
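
For scale, here's a quick back-of-the-envelope on why each resolution jump is so expensive: GPU fill work scales roughly with pixel count (a simplification that holds per-pixel shading cost constant):

```python
# Pixels per frame at common resolutions; fill-rate work scales roughly linearly.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the work of 1080p)")
```

4K works out to exactly 4x the pixels of 1080p, which is why a card that is merely "fine" at 1080p falls over at 4K.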

Meanwhile, tablets with far weaker hardware than any PC are taking off in record numbers, because people have decided they don't need stronger specs than what current PCs offer anymore, so the only direction left to expand is in convenience and portability.

To the OP:
DDR4 is supposed to be the final DDR RAM released, after that it may go to stacked ESRAM or something similar. You may want to go for the 2011v3 socket, because although it costs more, the industry is likely to stick on DDR4 for a while as people become reluctant to switch over to a whole new model, particularly when DDR3 is still (for the moment) faster than it needs to be for any application I've ever used.

I'd also wait for Windows 9 for DX12 if I were in your position. There were rumors for a while that DX11 would be the last DirectX, and now there are rumors DX12 will be the last. I suspect we'll hit DX13 eventually, and after that it'll probably get forced to something new that nobody wants, just to create a silly selling point to keep consumption up. But anyway, it's safer to go for DX12.

I doubt keeping one mobo platform for the rest of your life is viable, but everyone else here is getting hilariously defensive and can't really back up anything they say. You should probably anticipate one or two more mild/moderate switches, imo.
September 16, 2014 7:01:30 AM

Progress has been slow the past few years. That is mostly due to a poor economy and little competition. AMD hasn't been any real competition since the K8 vs P4/PD days. Back when there was competition, we had significant leaps in performance. Now Intel can get away with 5-10% per generation and focus on IGP and efficiency, because on the CPU performance side of things they have virtually zero competition. I'm hoping AMD's next arch, still about 1.5 years out, will be good enough to bring some competition back. My 3570K, nearly 2.5 years later, is still sufficient for me. My GPU is still enough for me as well. The only thing my system could probably use is an SSD. I still want a mini-ITX Xeon rig, though. I don't need one, but I want one. I could kind of use the space savings, too: my full tower takes up quite a bit of room in my small house, and I have multiple rigs on top of that.
September 16, 2014 11:57:16 AM

Well, I know the best answer has already been selected, and I agree with it.

But let's think about resolution. Right now no single GPU can handle a game at 4K... and there are already projects for 64K... so the technology will focus on brute GPU performance more than on the CPU. I think today's CPUs (Intel's) are so complex and so good that they will hold up against the DDR4-generation CPUs; it won't be like the jump from DDR2 to DDR3. There's no real quantum leap like what happened from Phenom I to Phenom II, where the Phenom II still does fine today while the Phenom I is just a piece of crap...
But thinking that a computer will hold up for more than 4 years is just nonsense.

OK, quantum-CPU-based systems could be the answer to feed brutally powerful GPUs rendering games at 64K. But 50 years ago, the idea of personal computers meant for entertainment, with today's performance and at the prices we pay for them... I think no one could have believed it; no one could even have guessed.

I also imagine a scientist back then saying, "For faster computers we'll need the transistor, but no one will be able to afford it even in 100 years" XD
September 16, 2014 6:11:20 PM

davy rockstar said:
Well, I know the best answer has already been selected, and I agree with it.

But let's think about resolution. Right now no single GPU can handle a game at 4K... and there are already projects for 64K... so the technology will focus on brute GPU performance more than on the CPU. I think today's CPUs (Intel's) are so complex and so good that they will hold up against the DDR4-generation CPUs; it won't be like the jump from DDR2 to DDR3. There's no real quantum leap like what happened from Phenom I to Phenom II, where the Phenom II still does fine today while the Phenom I is just a piece of crap...
But thinking that a computer will hold up for more than 4 years is just nonsense.

OK, quantum-CPU-based systems could be the answer to feed brutally powerful GPUs rendering games at 64K. But 50 years ago, the idea of personal computers meant for entertainment, with today's performance and at the prices we pay for them... I think no one could have believed it; no one could even have guessed.

I also imagine a scientist back then saying, "For faster computers we'll need the transistor, but no one will be able to afford it even in 100 years" XD


Okay, I agree that with 4K the resolution wars basically "start over". And while 1080p may for some cross the threshold of "good enough" (it is subjective, and that is my opinion), it is true that the human eye can detect the difference between 720p vs 1080p vs 4K vs 8K. It's somewhere around 12K that the eye itself reaches its absolute limits... In fact, the human eye is so sensitive it can detect a single photon under ideal conditions, something even the best DSLR on the planet cannot do.

But, just like you can buy a 4K movie today even though there is not much actual source material that was shot and processed in raw 4K (upscaling doesn't count, and neither do the demos), in much the same way, games are not going to be "true" 4K anytime soon, if AT ALL.

What do I mean? In order for a game to be TRULY "4K", every component of it has to be 4K or better. I've seen renderings of Watch Dogs in 8K, and it is not true "8K" but upscaled 1080p.

The textures are NOT 4K/8K. In fact, in some of these so-called 1080p games the textures are often subpar on some surfaces and props (like that scene in Watch Dogs where the texture on the vending machine looks like it's back in the 800x600 VGA days), and the 3D models are also not at "4K/8K" resolution...

For a hardcore 3D game to be truly 4K/8K, the textures would have to be orders of magnitude more massive, the polygon counts would also shoot through the roof, and it would tax today's fastest graphics cards on multiple levels... not just the raw resolution of the final render itself...
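
To put rough numbers on "orders of magnitude more massive", here is a sketch of uncompressed texture memory assuming 4 bytes per texel (RGBA8) and ignoring mipmaps and compression; the sizes are illustrative, not taken from any shipping game:

```python
def texture_mb(side, bytes_per_texel=4):
    """Uncompressed memory for one square texture, in MB (RGBA8 assumed)."""
    return side * side * bytes_per_texel / (1024 ** 2)

# Each doubling of texture resolution quadruples the memory footprint.
for side in (1024, 2048, 4096, 8192):
    print(f"{side}x{side}: {texture_mb(side):.0f} MB")
```

A single uncompressed 8192x8192 texture is 256 MB; a scene full of them would swamp any card of this era, which is exactly the multi-level taxing described above.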

This is simply not happening anytime soon, if ever. Skylake is at 14nm, and 10nm will soon be approaching a physical limit to how small things can go... It can't keep halving forever, and below 8nm even the economics will no longer make sense. There would simply be too many errors; Intel would have to throw too much silicon away, and that would make it even more cost-prohibitive... There is also a point of diminishing returns for core counts: past a certain number of cores (say 32 or 64) it doesn't make sense to add any more unless you are running very specialized, narrow workloads. Games these days can't even fully benefit from eight cores; some parts of a game simply cannot be done in parallel, and that limits how many cores can help. We hit the GHz wall long ago, we'll soon hit the nm wall, and core counts will soon hit diminishing returns too... What next? Dual sockets? I don't think so... that doesn't help gaming at all.
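
The "parts that can't be done in parallel" point is Amdahl's law. A quick sketch of it (the 90% parallel fraction is an assumed, generous figure, not measured from any real game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup is capped by the serial fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If 90% of a frame's work parallelizes, speedup can never exceed 10x,
# no matter how many cores you add.
for cores in (2, 4, 8, 16, 32, 64):
    print(f"{cores} cores: {amdahl_speedup(0.9, cores):.2f}x")
```

Even at 64 cores this hypothetical workload only reaches about 8.8x, which is why piling on cores stops paying off long before the socket runs out of room.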

There are limits to what can be done. The speed of light isn't getting any faster, silicon atoms have a finite size, and it is not possible to keep dividing by two ad infinitum... And you can only cram so much information into a limited area/volume before it turns into a black hole.

So, due to the realities of macroeconomics and the laws of physics themselves, until we move to something other than silicon, we aren't going to see TRUE 4K games. Forget about 8K or 12K.

And quantum computers for the consumer desktop will arrive the day we get portable fusion devices powering our homes...


September 17, 2014 11:20:14 AM

Well, actually I agree. About resolution, it's like games such as BioShock that claim you can play at Full HD on PC, but it's really just upscaling; the game looks a little better, just a little. On the other hand, a game from that era that most PC gamers hated, GTA IV, truly renders at Full HD and has physics effects that even today few games have. Despite what everyone says, it's a well-optimized game; maybe it was just launched before its time.

So for now, of course, talking about 4K and 8K is complete nonsense. Maybe in 20 years or so 8K will be a standard, since gaming technology is held back by consoles. Why would developers push far beyond that if most consumers around the world have consoles? Gaming PCs are taking market share here in Mexico and Argentina (Argentina mostly), but the ratio is 1 to 100 at best, maybe 1 to 1000, and I suspect the same happens in Russia, the Arab territories, Asia, and Africa.

Hmm... as for 120FPS or 144FPS refresh rates, for me they're useless. I would prefer a well-implemented V-sync that really works, so I won't spend another 500USD to get 120FPS when 60 is perfectly fine. I also prefer IPS or VA; I wouldn't want to sacrifice an amazing panel technology just for "smoother gameplay". As I said, developers have no reason to design games to take advantage of 120FPS when consoles run games at 30/60...

Now, about processors: Intel makes things smaller to be efficient, not more powerful, and as you said they have had the same speeds since Sandy Bridge and almost the same performance, yet they are light-years ahead in technology compared to AMD, which has faster-clocked CPUs and more cores. How do you explain an Intel Pentium performing better than an FX 8350 in some games?
Well, it's because of the architecture of the cores, the newer transistors, and all that stuff. A well-programmed game should be optimized for at least 2 cores: one to feed the GPU and at least another for all the logic computation. SO IF TRANSISTORS MADE OF SILICON can't get faster and can't get smaller, then what's next? Well, I INTRODUCE YOU TO GRAPHENE!!! It is more durable, thinner, and more flexible than silicon, so it's also the future of mobile devices.
http://www.itproportal.com/2010/02/08/ibm-debuts-100ghz...

Well, it will take 20 years or more to put graphene-based CPUs into the consumer market, but the technology already exists.

As for GPUs, maybe they only need wider bandwidth and faster VRAM to handle bigger resolutions.
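
For a sense of scale on that bandwidth point: the raw write of the final frame is actually tiny. Here's a sketch assuming a single 4-byte write per pixel, which real workloads exceed many times over through overdraw, G-buffer passes, and texture fetches:

```python
def framebuffer_gbps(width, height, fps, bytes_per_pixel=4, overdraw=1):
    """Raw bytes written to the framebuffer per second, in GB/s."""
    return width * height * bytes_per_pixel * fps * overdraw / 1e9

# 4K at 60 fps, one write per pixel: about 2 GB/s, a sliver of a real
# card's VRAM bandwidth. The hundreds of GB/s cards actually have go
# to everything else a frame does besides that final write.
print(f"{framebuffer_gbps(3840, 2160, 60):.2f} GB/s")
```

So the pressure on VRAM bandwidth at 4K comes from the total per-pixel work, not the display output itself, which supports the idea that wider, faster VRAM is the lever that matters.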

In conclusion, I recently upgraded my old 768p TN panel to a 21.5'' 1080p IPS monitor and I'm happy; it's as if I had been blind all my gamer life :p . I can't imagine 4K at that size; right now a 4K monitor below 32'' is nonsense, I would need a magnifier to read XD. But who knows... since everything is marketing and consumption, maybe I'll be pushed into buying a 4K 21.5'' monitor just to satisfy the desire to have one.

Regards, nice talk.