
Future/next-gen proof PC

December 17, 2012 9:42:35 PM

Is it possible today to build a PC that is not just next-gen ready, but future proof for that entire generation? A decade ago the answer was always a resounding no, but nowadays I'm not sure it is.

Back when the 360 was made, a top-end 7900 GT took about 50W to run (if my memory is correct). The 360 runs at about 170W total load, the last I checked prior to the Elite model. The reason this is important, as I'm sure you all know, is the heat generated and the required form factor of a gaming console. There is going to be a limit to the total load a console can run before it becomes either unsightly, too loud or too hot.
The issue is that nowadays a GTX 680 takes ~170W on its own. So with engines like UE4 and CryEngine 3 recommending a GTX 680, will they be able to ask for anything more? At a push, maybe the next-gen consoles will reach the performance of a GTX 680 with some technical wizardry, but if they don't, will PC users really need more than a GTX 680 (or AMD equivalent) to know that they are next-gen future proof (just 1080p gaming)? Of course there may be improvements made for PC versions of games, but those sadly are few and far between.

One reason I ask is that I currently have a GTX 560 Ti. If I was to add a second rather than upgrading the card, how long would that last me? If we consider that an 8800 GT is still a capable gaming card, how long will the powerhouse cards of today last?


December 17, 2012 9:54:02 PM

Well... it depends. For starters, the 680 is a waste of money; it's only 5% better than a 670, for $100 more.

That being said, it entirely depends on what you want to do. As a graphics whore, I'm going to continue to upgrade every so often, because I want to run everything on ultra at 120 frames a second.

If you have a 60 Hz monitor and are happy with medium settings, then a graphics card will run for a fair while without needing an upgrade. I just passed my old 9800 GT off to my best friend, and he loves it - it's still a decently powerful card that'll compete with an Xbox 360.

As for thermal issues, if you look at trends in what technology has been doing, that's not an issue. The trend has been a downward spiral of the best sort - parts have been getting more and more efficient, because they have to be in order to become more powerful.

In conclusion, cute boy:
http://www.youtube.com/watch?v=FK4ip08auGg
December 17, 2012 10:08:23 PM

Well, maybe after Haswell comes out next year. The channel is buzzing that Intel will stop making socketed CPUs, only the type that have to be soldered on. That will slow things down. Several motherboard companies will quit the business, limiting our choices.
December 17, 2012 10:10:36 PM

no. futureproofing is impossible. wouldn't even try
December 17, 2012 10:25:01 PM

DarkSable said:
Well... it depends. For starters, the 680 is a waste of money; it's only 5% better than a 670, for $100 more.

That being said, it entirely depends on what you want to do. As a graphics whore, I'm going to continue to upgrade every so often, because I want to run everything on ultra at 120 frames a second.

If you have a 60 Hz monitor and are happy with medium settings, then a graphics card will run for a fair while without needing an upgrade. I just passed my old 9800 GT off to my best friend, and he loves it - it's still a decently powerful card that'll compete with an Xbox 360.

As for thermal issues, if you look at trends in what technology has been doing, that's not an issue. The trend has been a downward spiral of the best sort - parts have been getting more and more efficient, because they have to be in order to become more powerful.

In conclusion, cute boy:
http://www.youtube.com/watch?v=FK4ip08auGg


Well, with consoles now being the main money-spinner, surely we can estimate their power, and therefore the performance of next-gen games?
The parts are certainly more efficient than they have ever been, but thermals still need to be considered; you aren't going to be able to fit a GTX 680 into a 150W console, and to fit a GTX 680 into a small form factor would require some extreme cooling.
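To put rough numbers on that power-budget point (all wattages below are the ballpark figures quoted in this thread, not measured values), a quick sketch:

```python
# Back-of-the-envelope console power budget check.
# All wattages are rough figures quoted in this thread, not measurements.
CONSOLE_TOTAL_LOAD_W = 170  # Xbox 360 whole-system load (pre-Elite estimate)
GPU_7900GT_W = 50           # high-end PC GPU circa the 360 launch
GPU_GTX680_W = 170          # GTX 680 board power on its own

# Fraction of the console's entire budget each GPU would consume:
share_then = GPU_7900GT_W / CONSOLE_TOTAL_LOAD_W
share_now = GPU_GTX680_W / CONSOLE_TOTAL_LOAD_W

print(f"7900 GT-class GPU: {share_then:.0%} of a {CONSOLE_TOTAL_LOAD_W}W budget")
print(f"GTX 680-class GPU: {share_now:.0%} of a {CONSOLE_TOTAL_LOAD_W}W budget")
```

On these figures a 2005-era flagship fit inside roughly a third of the console's whole-system budget, while a GTX 680 would consume all of it by itself, which is the crux of the argument above.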

Good video, but I wonder if graphical advances will keep slowing down. I think we can all safely say that the graphical leaps this generation have been few and far between, and have been outpaced by GPUs in a matter of months (BF3), hence why top-end GPUs may be at the top of their game (no pun intended) for longer than ever before. Because of that, maybe future proofing isn't as absurd as it once was?

Do you notice a difference between 60 fps and 120 fps? I'm also torn between upgrading my monitor or my case.
December 17, 2012 10:26:36 PM

Short answer... NO.

Longer explanation:
If you could tell me what your computing needs will be for the next 4 years, then perhaps the answer could be yes.

But technology continues to advance, and tomorrow's products will be cheaper and better.

Most likely, from the CPU point of view, a 3570K and a Z77-based motherboard will be a 4-year product. They are that good.

For graphics, who knows? Next year we will see the GTX 780 and AMD 8000-class cards, which will be considerably stronger.
They will not really be needed for gaming on the usual 1080p monitor.

Are you considering 2560 x 1440 class monitors, or perhaps triple-monitor gaming?
What happens if 4K monitors come on the market? Then graphics power needs will jump.

On the other hand, game developers want the largest possible market for their products. They have little incentive to build games that need the strongest of components to run.

My take is to buy what you need today...today.
Later on, sell what you can't use and buy what you need then.
December 17, 2012 10:33:02 PM

TheBigTroll said:
no. futureproofing is impossible. wouldn't even try


As I said, 5 years ago, sure, but now games that graphically push GPUs are few and far between. I think it's gone like this: Crysis -> Metro 2033 -> BF3. Are there any other games that have really taxed the last 5 years of GPUs? Are we in a lull before a wave of GPU nut-busting games comes out, or have GPUs outpaced the developers? That's always a possibility when we consider that the money is in developing for consoles, not for PCs.
December 17, 2012 10:35:40 PM

crysis 3 will rip our rigs. same goes for bf4 next year. and metro last light


December 17, 2012 10:40:23 PM

geofelt said:
Short answer... NO.

Longer explanation:
If you could tell me what your computing needs will be for the next 4 years, then perhaps the answer could be yes.

But technology continues to advance, and tomorrow's products will be cheaper and better.

Most likely, from the CPU point of view, a 3570K and a Z77-based motherboard will be a 4-year product. They are that good.

For graphics, who knows? Next year we will see the GTX 780 and AMD 8000-class cards, which will be considerably stronger.
They will not really be needed for gaming on the usual 1080p monitor.

Are you considering 2560 x 1440 class monitors, or perhaps triple-monitor gaming?
What happens if 4K monitors come on the market? Then graphics power needs will jump.

On the other hand, game developers want the largest possible market for their products. They have little incentive to build games that need the strongest of components to run.

My take is to buy what you need today...today.
Later on, sell what you can't use and buy what you need then.


Maxing out current-gen and next-gen games at 1080p is really all I would want to do. As I don't/wouldn't want a monitor bigger than 24", 2560 and triple-screen gaming are overkill for me.
With 4K monitors, they apparently have to be over 50" for you to notice the increase in resolution. I 100% agree with your idea that devs have little incentive to build a game that demands the sky and stars, hence why I'm tossing up between going SLI or not.
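As a side note on the 4K-at-50" claim, pixel density is easy to compute. The panel sizes below are illustrative picks, and perceived sharpness also depends on viewing distance:

```python
import math

# Pixels per inch (PPI) from resolution and diagonal size.
def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    return math.hypot(h_px, v_px) / diagonal_in

print(f'1080p @ 24": {ppi(1920, 1080, 24):.0f} PPI')
print(f'1440p @ 27": {ppi(2560, 1440, 27):.0f} PPI')
print(f'4K    @ 50": {ppi(3840, 2160, 50):.0f} PPI')
```

Interestingly, 4K stretched across a 50" panel comes out at a lower pixel density than 1080p at 24", so on a big screen the extra pixels go to size rather than sharpness.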

TheBigTroll said:
crysis 3 will rip our rigs. same goes with bf4 next year. and metro last light

The Crysis 3 specs aren't that bad to be honest:

Hi-performance system requirements for PC:
Windows Vista, Windows 7 or Windows 8
Latest DirectX 11 graphics card
Latest quad-core CPU
8GB memory
Nvidia/Intel example setup: Nvidia GTX 680, Intel Core i7-2600K
AMD example setup: AMD Radeon HD 7970, AMD Bulldozer FX4150

Recommended system requirements for PC:
Windows Vista, Windows 7 or Windows 8
DirectX 11 graphics card with 1GB video RAM
Quad-core CPU
4GB memory
Nvidia/Intel example setup: Nvidia GTX 560, Intel Core i3-530
AMD example setup: AMD Radeon HD 5870, AMD Phenom II X2 565

Forgot about Last Light....If it's like 2033....Ouch.
December 17, 2012 10:51:19 PM

recommended setup usually does not mean ultra settings. more like high
December 17, 2012 10:52:37 PM

You want future proof?
Get an i7-3930K or i7-3960X, three or four GTX 680 4GB EVGA Classifieds, 64GB RAM, etc.
Lock, stock and barrel, lol.
December 17, 2012 10:56:27 PM

that's not futureproof. run it 4 years down the road and it would be high settings at most
December 17, 2012 11:04:49 PM

TheBigTroll said:
recommended setup usually does not mean ultra settings. more like high

:??:  That's why it has both recommended and hi-performance setups, I'm guessing?
December 17, 2012 11:07:38 PM

TheBigTroll said:
that's not futureproof. run it 4 years down the road and it would be high settings at most

But you're presuming that games keep graphically advancing at the same pace as they are today. Due to production costs and the dev time tied up in the more profitable consoles, it's reasonable to expect graphical progression to slow down. For top-end GPUs, 1080p benchmarking hasn't been taxing enough, and 1080p is realistically the top resolution 80% of gamers will be playing at.
December 17, 2012 11:09:05 PM

Indeed, and what's worse, in a couple of years it'll be outperformed by a system costing a fraction of the "future-proof" one.
December 17, 2012 11:11:25 PM

mazty said:
But you're presuming that games keep graphically advancing at the same pace as they are today. Due to production costs and the dev time tied up in the more profitable consoles, it's reasonable to expect graphical progression to slow down. For top-end GPUs, 1080p benchmarking hasn't been taxing enough, and 1080p is realistically the top resolution 80% of gamers will be playing at.

I don't see why it's reasonable to expect that now if it wasn't what happened when the last-gen consoles came out. If you were right, though, that'd be a sad thing indeed...

But you're not ;) 
December 17, 2012 11:12:33 PM

TheBigTroll said:
that's not futureproof. run it 4 years down the road and it would be high settings at most



I was just joking around; I'm not a psychic, nor do I recommend going with all that.


As for the rest, it's pointless. Really, just get a top card for now, then later on get another card when you see that your card isn't doing great. It's also not worth speculating about CPUs; the CPU market is now stagnant, and a quad core will still hold for a few years, but you can't see what's ahead in four or five years. Just get your components thinking in the present tense.
December 17, 2012 11:28:46 PM

FinneousPJ said:
I don't see why it's reasonable to expect that now if it wasn't what happened when the last-gen consoles came out. If you were right, though, that'd be a sad thing indeed...

But you're not ;) 


As my original post states, back when the 360 came out there was a lot of power left to be harnessed, as the cards were only using 50W. Now they use 170W+, something a console won't be able to run due to the size, and therefore heat, constraints of a console. With the increasing dev costs of AAA games, will cutting-edge graphics take a back seat, and will devs be more likely to homogenize graphics across all the platforms a game is released on? The last 7 years have shown a small improvement in graphics, and the GPUs seem to be outpacing the demanding games. Will that outpacing increase? Just a thought.
December 18, 2012 1:05:55 AM

from what i know, cards back then used about 100W.

note: a generation of cards ago (the 500 series) used around 250W
December 18, 2012 2:33:10 AM

TheBigTroll said:
from what i know, cards back then used about 100W.

note: a generation of cards ago (the 500 series) used around 250W


http://www.xbitlabs.com/articles/graphics/display/power...

The GTX 580 used around 280W. Still, that is far, far more power-hungry than anything you could put into a console, so unless MS pulls out some magic tech from 2020, I can't see them getting close to the 680 in terms of performance.
December 18, 2012 2:39:42 AM

mazty said:
http://www.xbitlabs.com/articles/graphics/display/power...

The GTX 580 used around 280W. Still, that is far, far more power-hungry than anything you could put into a console, so unless MS pulls out some magic tech from 2020, I can't see them getting close to the 680 in terms of performance.


console GPUs in general are modified GPUs, so a modification can possibly decrease power consumption.