spel565

Is an i7 920 setup (with "next generation GPUs" or such) going to be enough for Eyefinity and decent high-quality gaming for the next 3-5 years... is the CPU enough?
 

Lewis57

Yes, the i7 920 is very powerful, and it overclocks very well if you're into that. And you can always upgrade the CPU later on if you decide you want a more powerful one.
 

Nope. Sandy Bridge will be on a new socket. Though when it comes out, Core i7 980X prices may drop.

The AMD hexa-core is awesome, but currently it will perform no better than the 955 in most games, because most games still do not use more than three CPU cores.
What games do you intend to play, and what will be your graphics card setup? I would go with the Phenom II 955 at most, for a couple of reasons:
1. It's less expensive.
2. Past a resolution of 1920x1080 (which Eyefinity will certainly be if you are using 3 monitors), the demand on the GPU is a lot higher and the demand on the CPU is a lot lower (see the quick math below).
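To put rough numbers on that second point, here's a quick back-of-the-envelope sketch (the resolutions are just the standard single-screen figure and a 3x1 landscape Eyefinity layout):

    # Pixel count: one 1080p screen vs. a 3x1080p Eyefinity wall (5760x1080)
    single = 1920 * 1080
    eyefinity = 3 * 1920 * 1080
    print(single, eyefinity, eyefinity / single)
    # -> 2073600 6220800 3.0
    # Three times the pixels for the GPU to shade every frame, while the CPU's
    # per-frame work (game logic, draw calls) stays roughly the same.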
 

spel565



But is it enough? ;)
...To play at, like, the highest settings on a 3-screen Eyefinity setup for the next 3-5 years? :p
 

huron

I don't think there is any way for anyone to know if it will be OK for Eyefinity for 3-5 years. If I had to hazard a guess, I would say no... computer hardware speeds/performance increase so quickly, and games keep pushing the envelope accordingly, so 3-5 year old tech doesn't usually cut it.
 

Indeed. This is the reason I suggest a less expensive route that covers you now. Three years ago, quad cores were new; now they are starting to become standard. Three years is a lot of time, and "future proofing" is overrated. I always recommend getting the minimum you need now. The rest of the money can be used in the future when you feel like your current system isn't good enough. Think about it: AMD's $1000 CPU in 2006 has less than half the power of their $100 CPU today.
That being said, CPUs at $300 and under are worth their cost in performance per dollar; anything higher is a waste of money, and the best performance/$ is below $200. So you can't go wrong with the i7 920, but you don't need it now, and you never know what the future holds.
 

Raidur

Also, Eyefinity should use less of the CPU. Higher Eyefinity resolutions are going to slow down graphics cards dramatically, and producing fewer frames in a given game because of the resolution usually means the CPU has less work to do.

So I think yes, an overclocked i7 920 could last an Eyefinity setup for 5 years.
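Just to illustrate that bottleneck logic with made-up numbers (these per-frame costs are illustrative, not benchmarks):

    # Frame rate is limited by whichever side is slower each frame: the CPU's
    # game logic or the GPU's rendering. The times below are hypothetical.
    cpu_ms = 8.0             # CPU work per frame, roughly resolution-independent
    gpu_ms_1080p = 10.0      # GPU render time at 1920x1080
    gpu_ms_eyefinity = 28.0  # GPU render time at 5760x1080 (assumed scaling)

    fps_1080p = 1000 / max(cpu_ms, gpu_ms_1080p)          # ~100 fps, GPU-bound
    fps_eyefinity = 1000 / max(cpu_ms, gpu_ms_eyefinity)  # ~36 fps, heavily GPU-bound
    # At Eyefinity resolutions the GPU caps the frame rate well below what the
    # CPU could feed it, so a faster CPU buys you almost nothing.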

I will add, I hope games actually progress a lot over the next 5 years. Crysis is still among the best-looking games and it's 3 years old already! Come on Sony/Microsoft, let's see some damn new consoles so game designers can astound us again.
 

In 2005, for $300, you could get an Athlon 64 X2 at 2GHz. That was high-end.
The best GPU you could get from ATI was the X1300 Pro with 512MB of DDR2 RAM, inferior to even a $30 video card today.
Would you want to be running that setup today?

I don't think a high-end gamer can reasonably expect to keep a system for more than 3 years.
 

Raidur



I was speaking of the CPU, not the GPU.

The 7800 GTX was also out in 2005, and it's still able to play most games today.

Yeah, I bought one of the X2s (4400+) for $450. :)

The progression of CPUs is going to slow down dramatically compared to the last 5 years. I'm sure we can all agree on that.

I do agree, though, that a whole system never lasts more than 3 years. But an extremely fast CPU such as an overclocked i7? I could see that lasting easily today, especially at an extremely GPU-bottlenecking resolution like Eyefinity. An i7 needs very little to run 5870 CrossFire @ 2560x1600 with 8xAA, let alone Eyefinity.

http://www.legionhardware.com/articles_pages/cpu_scaling_with_the_radeon_hd_5970,9.html

Most of the games in that review show the i7 gaining less than ~5-10% going from 2.0GHz to 2.6GHz, and nothing after that. Massive overkill.

Look at the Q6600: that CPU is already almost 3.5 years old, and it's still plenty for the average gamer, more than enough for many.
 

I'm actually going to disagree on that, mainly because that's what people have been saying for the past decade or so.
People have been predicting the end of Moore's law (that the number of transistors on a CPU die will double every, I think it was, 18 months) for decades.
I might be wrong and it might hit me like a wall, but really, with all the research these companies do, even if they run into a literal technological wall, they will find a way around it with a new idea.
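For what it's worth, here's what that doubling rate implies over the 3-5 year window we're talking about (just the textbook exponential, assuming the 18-month figure is right):

    # Transistor-count growth under Moore's law with an assumed 18-month doubling
    def growth(years, doubling_months=18):
        return 2 ** (years * 12.0 / doubling_months)

    print(growth(3))   # ~4x the transistors after 3 years
    print(growth(5))   # ~10x after 5 years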
 

huron

I know someone who works at Intel, and they are doing R&D on chips many years out. They said they are not seeing an end to Moore's law anytime soon, at least not in their camp.

It's always fun to see what the future brings.
 

Raidur



I don't think we're going to hit any kind of wall quite yet. I just don't see IPC increasing as much as it has been. Who knows, though: maybe core optimizations will make up for the smaller IPC gains to come, or maybe we'll still see significant increases in IPC. *shrug*
 

But remember, progress comes in two forms: IPC and clock speed.
The Phenom II's highest clock of 3.4GHz is certainly an improvement over the original's, I believe, 2.2GHz.

The Pentium 4s back in the day lasted a long time because of clock speed improvements; in fact, NetBurst accepted worse IPC in order to achieve much higher clock speeds. IPC increases mostly come with the release of a new architecture, such as Sandy Bridge and Zambezi coming next year. :D
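A rough way to see how those two factors multiply together (the IPC numbers below are purely illustrative, not measured values):

    # Single-threaded performance ~ IPC x clock speed (very rough model)
    def perf(ipc, ghz):
        return ipc * ghz

    phenom_22 = perf(1.00, 2.2)   # baseline IPC at 2.2GHz
    phenom_ii = perf(1.05, 3.4)   # slightly better IPC, much higher clock
    pentium4  = perf(0.60, 3.8)   # NetBurst-style: lower IPC traded for clock

    print(phenom_ii / phenom_22)  # ~1.6x, and most of that is clock speed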