Just how often do we really "need" to upgrade hardware?

KM75

Honorable
I was having this discussion with a couple of my tech friends. Just how important or necessary is upgrading or future-proofing? How often, on average, do you think it's reasonable to upgrade a system that performs its desired tasks reasonably well? By reasonably well, I mean editing that's not sluggish, or gaming that's still pretty at HD resolutions with good frame rates. For the most part, it seems a lot of rigs aren't stressed at all by today's software (with the exception of Crysis 3 lol), so it almost seems like software is currently lagging behind how powerful hardware is becoming.

Haswell just released today, and one of my friends is in a mad rush to get the new architecture CPU and mobo. However, based on what I've read in articles, he wouldn't get a real performance boost over his current system. He would be more energy efficient, but that's not what he's going for, to be honest.

The thing is, when new CPUs and GPUs drop, many folks are in a mad dash to acquire the new tech. And I've heard a few people like my friend justify the cost by saying they're future-proofing so they don't have to upgrade later. But then my other tech friend is more relaxed about it. He uses his PC for music recording and some gaming, but is fine using a console as well. And I have to admit, sometimes seeing everyone want to upgrade to new tech whenever it drops makes me wonder if I'll get left behind by developers when it comes to new software and how well it runs...be it editing software or games.

However, so far my system, despite being based on C2Q-era parts (dual quad-core Xeons), screams through editing software (at least for my tastes) and has eaten up every game I've put in it. Granted, I haven't tried Crysis 3 yet. Doubt I will though =P. But I tend to upgrade every 3 or 4 years.

So what say y'all? Is the upgrade necessary? How often do you yourself upgrade? I'm pretty tempted by the newness of the i3, i5, and i7. And Haswell being 22nm and more energy efficient is tempting. But going by Tom's CPU and GPU hierarchy charts, my stuff is rather close to the top despite not being super new.
 
I upgraded my GPU over Christmas from a 280 to a 660 Ti and went from XP Pro to 7 Pro, keeping the Phenom II 955 / 4GB RAM / Raptor 300GB & Barracuda 1TB drives. I have to say that apart from DX compatibility, for everything I do including current games, I've not really noticed a great difference or any problems running at high visuals @ 1920 x 1200.

I only tend to upgrade when I find something I want my system to do that it can't - which is why I only left Office 2003 this year, the first time it couldn't open an attachment I was sent (even with the compatibility pack). While the stat-monsters will naysay everything apart from the latest and greatest, I'm quite happy to stick with my setup for the foreseeable future - or at least until (hopefully) the new Xbox means that most PC ports need some more horsepower. Say I upgrade around Christmas, keeping the 660 Ti: that will mean my 955-based spec will have served me really well for ~4 years, and it will probably continue to do so as my next Linux server.
 

USAFRet

Titan
Moderator
When something I need to do with new software would be slow on the current hardware...I'd consider changing something.

My current software stack (photo and video editing, games, Office) runs perfectly well on the current hardware (i5 3570k). When and if there is some new piece of software that I absolutely *need*, I'll consider upgrading.

I built this about 8 months ago, and see no need for doing anything major for a long, long time. I might change the existing 6670 GPU to something else. Maybe.

I've changed my main platform 3 times in the last decade: Pentium 4 -> Core 2 Duo ($50 laptop) -> i5 3570K.
 
^ Forgot to add, I'm writing this on an Athlon BE-2400 dual-core machine that is older than the hills. With 4GB of RAM and an equally creaky X1950 Pro to power dual monitors, it more than handles my day-to-day office work and the odd bit of Everquest / FTL. It may take ~2 minutes to boot, but once it's up, the RAM keeps it working just as fast as I need it to.
 

g-unit1111

Titan
Moderator
I change out my hardware about every 2-3 years. Last year I did a complete overhaul of my system: I upgraded from a Phenom II to an i5-3570K and went from dual 550 Tis to dual 7870s, and I most likely won't need to upgrade again for another 3 years or so. I generally switch out the motherboard and CPU around that time to stay current. But if you have Ivy or Sandy Bridge, there's no need for Haswell.
 

Fulgurant

Distinguished


You've hit on the key point right there. Performance standards are subjective. How often you "need" to upgrade depends on your own personal satisfaction with your current rig.

There will always be people who genuinely enjoy upgrading for its own sake (like your friend who's in a mad rush for Haswell, even though he's unlikely to notice any performance benefit from it), and that's fine. If they enjoy the process of watching new releases, buying them, trading/selling their old hardware, and installing/tweaking/testing the newest, hottest thing -- then more power to them. I used to be one of those people, years ago.

The only problem arises when people who don't fit that description -- people who might enjoy high performance (say, in games) but don't necessarily find upgrading/tweaking/testing intrinsically enjoyable -- get talked into needless upgrades because friends or salespeople convince them that older hardware is obsolete simply because something newer has arrived. It's a bit of an injustice, caused by myopia on the part of enthusiasts and, by extension, by the distortion of enthusiasts' advice as it's repeated again and again through the ranks of less-knowledgeable gamers.

But yeah, all of that rambling out of the way, there's no such thing as future proofing, in principle. And there's (almost) no such thing as a last-gen (that is, a previous-to-the-newest-generation) system that's genuinely obsolete.

There's only real-world performance, and the extent to which the user in question prefers more or less of it.
 
It's moving faster on the GPU side than on the CPU side. But it has slowed down in recent years, in part because the consoles have held back gaming graphics.

With the new consoles, things might pick up a bit in the next couple years. But there are also a lot of people moving to tablets and such, so things probably still won't go as fast as they used to. A platform upgrade every 4-5 years would make sense for most people, with a mid-life graphics card upgrade.
 

USAFRet

Titan
Moderator


"....a friend told me...." or "...the guy at BestBuy told me..." is a very, very common problem.
 

KM75

Honorable

This is actually quite surprising. Based on specs, benchmarks, and the GPU hierarchy list Tom's Hardware provides, I would have totally expected a 660 Ti upgrade to shred any of the performance standards set by your system when it had the 280 in it.

I'm wondering if you're bottlenecking the new card. However, from things I've read, the Phenom should be performing quite well with it. I'm no expert though.
 

g-unit1111

Titan
Moderator


I can't tell you how many times I have said on this forum that I think it's completely ignorant to compare PC hardware to console hardware. Console hardware is made for low power consumption and to meet a certain price point, otherwise it won't sell. PC graphics will always be superior to consoles no matter how hard they try. There's a long, long, long list of consoles that failed in the market because, even though they had amazing hardware and pushed the limits of what a console could do, they failed to meet a price point and could not sell, or they couldn't attract developers because the systems were difficult to develop for. Sega's Dreamcast and the Apple / Bandai Pippin are perfect examples of this. I'd be willing to bet some good money that the new Xbox One will be incredibly difficult to develop for because of the mandatory digital rights management on the system. The hardware that's in consoles would most likely compare to what's available in all-in-one PCs, or a slight step above laptop hardware. It will never, ever compare to the hardware you can get in a desktop, no matter how hard the fanboy-gushing websites try to convince you otherwise.

But that said, I definitely agree that GPUs are getting significantly better every generation.
 

I didn't compare console hardware to PC hardware.
 

KM75

Honorable

Um....DRM isn't what kept developers off the Dreamcast. If anything, the Dreamcast had crappy DRM, and developers hated how easy it was to pirate games. Plus it was cheaper and more profitable for them to wait for the PS2 to release.

And yes, console hardware can COMPARE to PC hardware. It may not beat it, but it will compare. You get great graphics at a great price. Price-to-performance ratio is key here. To the vast, vast majority of consumers, top-of-the-mountain graphics are not worth what they cost. Especially when software lags behind the capabilities of top-of-the-mountain hardware. For the games most people play, it makes no sense to spend more than next-gen console money on a PC.

And the big reason for this is that developers focus on console releases before PC releases. In which case, why spend so much money on a system to play a game that doesn't look better in proportion to the price? Remember, price to performance is key. And mid-level hardware is that sweet spot for most people.

Consoles can compare to budget and mid-level PCs because they essentially are PCs. They simply use standardized specifications. It's still mid-level PC hardware upon release...and ends up budget PC hardware as the console ages. But even with that being said, consoles will still look great because developers know how to optimize games for them.

Consoles and budget or mid-level PCs are what most consumers want. You can get great graphics for a good price. I'm able to play FIFA 13, both Batman: Arkham games, the latest Call of Duty games, and Battlefield 3 ALL maxed out, and I used a system that cost me 500 bucks to build.
 

g-unit1111

Titan
Moderator


DRM wasn't a thing when the Dreamcast was around. Sega failed to attract developers to the system. Add to that the heavy competition from Nintendo, Sony, and Microsoft and you've got a system with great hardware - which the Dreamcast had - lost in the mainstream sea.

KM75 said:
And yes, console hardware can COMPARE to PC hardware. It may not beat it, but it will compare. You get great graphics at a great price. Price-to-performance ratio is key here. To the vast, vast majority of consumers, top-of-the-mountain graphics are not worth what they cost. Especially when software lags behind the capabilities of top-of-the-mountain hardware. For the games most people play, it makes no sense to spend more than next-gen console money on a PC.

You can compare it to console hardware, but it's not the same. You're comparing two different things. It'd be like comparing some guy on a karaoke machine to Luciano Pavarotti. Or it'd be like comparing some off-brand generic cola to Classic Coke.

KM75 said:
And the big reason for this is that developers focus on console releases before PC releases. In which case, why spend so much money on a system to play a game that doesn't look better in proportion to the price? Remember, price to performance is key. And mid-level hardware is that sweet spot for most people.

The PC can never be a true game console, and vice versa. The thing is, if PC graphics cards - especially the 780 and 770 - are any indication, GPUs are the one thing that's constantly improving, and I'd like to see more developers develop for the PC first and then the consoles.

KM75 said:
Consoles can compare to budget and mid-level PCs because they essentially are PCs. They simply use standardized specifications. It's still mid-level PC hardware upon release...and ends up budget PC hardware as the console ages. But even with that being said, consoles will still look great because developers know how to optimize games for them.

KM75 said:
Consoles and budget or mid-level PCs are what most consumers want. You can get great graphics for a good price. I'm able to play FIFA 13, both Batman: Arkham games, the latest Call of Duty games, and Battlefield 3 ALL maxed out, and I used a system that cost me 500 bucks to build.

That's true. I'm willing to bet that there are going to be very few people willing to pay the $650 price tag for the GTX 780. I'll probably get one for my system, but I'm waiting until the price drops drastically. But by then the GTX 880 will be out. :lol:
 

KM75

Honorable
g-unit1111 said:
DRM wasn't a thing when the Dreamcast was around. Sega failed to attract developers to the system. Add to that the heavy competition from Nintendo, Sony, and Microsoft and you've got a system with great hardware - which the Dreamcast had - lost in the mainstream sea.
You're making my point. Software protection wasn't the big reason why Sega failed. Sega failed for many reasons. I'd say the biggest was their inability to make profits going back to the Genesis. They made poor financial decision after poor financial decision with hardware releases.
g-unit1111 said:
You can compare it to console hardware, but it's not the same. You're comparing two different things. It'd be like comparing some guy on a karaoke machine to Luciano Pavarotti. Or it'd be like comparing some off-brand generic cola to Classic Coke.
You can make the comparisons. The hardware on both platforms can play games, so while the comparisons aren't perfect, you can compare. The thing is that consoles, while using hardware that's somewhat related to actual PC hardware, use parts specifically made and optimized for a single, standardized purpose. Add to that developers optimizing games for hardware that never changes, and you get something truly worth the value.

I'm a PC gamer and console gamer...but I've been mostly a PC gamer for the last several years. I will say I'm jealous of the bang for the buck that those who solely console game get. The Coke and karaoke analogies you made are off the mark. For what they accomplish, and for the desires of the public, consoles do an amazing thing and control the market. Last I checked, store-brand cola didn't control the soda market, and people didn't enjoy hearing karaoke over Pavarotti. As it stands today, consoles control the market against hardware that can outclass them. Price-to-performance ratio and optimization are the reasons why.
g-unit1111 said:
The PC can never be a true game console, and vice versa. The thing is, if PC graphics cards - especially the 780 and 770 - are any indication, GPUs are the one thing that's constantly improving, and I'd like to see more developers develop for the PC first and then the consoles.
I agree. The PC is more multi-use than a console can ever try to be. Anything you can do on a console, you can do on a PC and then some.

However, I don't think we will ever see the day again where the PC is developed for first and then consoles, at least for most top-tier (or mid-tier) developers. There are many reasons for this, but one of the biggest is that console market share is just too big and will always be so. Consoles are too affordable, too plug-and-play, too "easy to use and relax on your couch" when compared to gaming PCs.

We won't get the '90s back, I'm afraid. That was the last time PC graphics and hardware were leaps and bounds ahead of consoles. And that's part of why there was a market for developers to make games for. People wanted great graphics that consoles couldn't provide. That was the last time high-level graphics could be seen at home and not just in pre-rendered footage or at the arcades. Consoles have caught up, and once they did, they took away many PC gamers...despite the console gaming community already having been big enough.
g-unit1111 said:
That's true. I'm willing to bet that there are going to be very few people willing to pay the $650 price tag for the GTX 780. I'll probably get one for my system, but I'm waiting until the price drops drastically. But by then the GTX 880 will be out. :lol:
And this is where PC gaming hurts itself: high-priced, top-of-the-line hardware that scares people onto consoles.

I read this in another thread, and it's true; people want to feel they have the best in what they buy. If they get a next-gen console, they will feel they have the best graphics until that console generation ends. They don't want to be on a PC, in an ever-changing market, thinking that even though they can get games and have them look good, they won't look the absolute best they can.

You are similar to these folks, except you choose PC and possibly have more discretionary income. For most folks though, a console is top-of-the-line stuff at a great price point with ease of use.
 


It's more the other way around - I wasn't experiencing any slowdowns on the 280 with anything I was playing (I rarely used any form of AA, but had everything else as high as possible). I only changed because I wanted DX11 support :)