Status: Not open for further replies.

hedgehogpie

Honorable
Nov 26, 2012
Hi guys,


Just wanted your opinion about GPU lifespan. I really like to keep things; even my PS1 is still working today, wrapped in plastic to keep the dust off.

I do have a question. I'm building my first gaming rig, and my case choices are the HAF XM, Switch 810, or Phantom 410 (the Phantom 820 is too expensive since I'm not planning on water cooling or overclocking).

Back to the main topic: does an overclocked GPU have a shorter lifespan than a non-overclocked one? And does a reference GPU last longer than a non-reference one? I was picking a brand and chose EVGA. I want my GPU to last a long time; that doesn't mean I won't buy a new one eventually, but it's nice to keep things as a reminder. :)


So does the EVGA 670 reference card last longer than the EVGA 670 FTW (the FTW is factory overclocked)?


Want your opinion. Thanks in advance.
 

simmons33

Honorable
Nov 7, 2012
I've heard that overclocking anything can shorten its lifespan, definitely if you don't know what you're doing. Non-reference GPUs generally have a longer lifespan because of various things: better cooler, capacitors, chokes, and so on. The overclock applied at the factory you don't have to worry about; those cards are usually clocked according to what their cooling solution can handle, and non-reference cards generally have better parts.

Sometimes things just croak. Sad really.

I just use temp monitoring programs for my major components to make sure they're not being damaged by heat.
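
Something like this rough sketch is all I mean; it assumes an NVIDIA card with the driver's nvidia-smi tool installed, and the 80 C warning level is just an example number, not an official limit:

```python
# Minimal GPU temperature watcher. Assumes an NVIDIA card and that the
# driver's nvidia-smi utility is on the PATH; WARN_AT is an arbitrary
# example threshold, not a spec value.
import subprocess
import time

WARN_AT = 80  # degrees C; pick whatever limit your card's documentation suggests

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    temp = int(out.stdout.strip().splitlines()[0])  # first GPU only
    print(f"GPU temperature: {temp} C")
    if temp >= WARN_AT:
        print("Warning: running hot, check case airflow and dust filters.")
    time.sleep(5)  # poll every 5 seconds
```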

And I feel you on wanting things to last forever. Dust is your greatest enemy, followed by your power company if we're talking electronics.

Oh, and should you get a GPU with a built-in boost (e.g. the GeForce 600 series), don't disable it. I've heard doing so causes increased power consumption and instability.
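
If you want to see the boost actually working rather than taking it on faith, you can sample the current graphics clock while a game is running and compare it to the card's base clock. A rough sketch, again assuming nvidia-smi is available; 915 MHz is the reference GTX 670 base clock, so swap in your own card's number:

```python
# Rough check that GPU Boost is engaging: read the current graphics clock
# and compare it to the base clock. Assumes an NVIDIA card with nvidia-smi
# on the PATH; BASE_CLOCK_MHZ is the reference GTX 670 figure and should be
# replaced with your own card's base clock.
import subprocess

BASE_CLOCK_MHZ = 915

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.current.graphics",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
)
current = int(out.stdout.strip().splitlines()[0])  # first GPU only
print(f"Current graphics clock: {current} MHz")
if current > BASE_CLOCK_MHZ:
    print("Boost is active (running above base clock).")
else:
    print("At or below base clock (idle, or boost not engaging).")
```

Run it while the card is under load; at idle the clock will sit well below base anyway.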
 
Yeah, OCing will reduce lifespan. The EVGA is an excellent choice for the warranty, though (depending on your country). And of course, unlike manual overclocking, a factory overclock is covered by the warranty, so I wouldn't worry about it. It's an awesome card, and it's what I'd be buying now if I weren't waiting for next year's stuff.
 

hedgehogpie

Honorable
Nov 26, 2012



I think I'll be getting the EVGA FTW, but I'm not sure whether the EVGA has a built-in boost. If it does, I won't disable it, but the built-in boost is already enabled by default, right? Yeah, here in the Philippines power interruptions happen without warning, so I'll be getting a UPS. Speaking of which, do you have any experience with one? Like, if I have a 750 W PSU, should I also look for a UPS rated at 750 W? And can I plug more than one device into a UPS? I also have an LED TV that draws about 100 W; can I plug it into the same UPS as my PC?
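
Doing some rough math on this myself (every number below is just a guess, not something I've measured): as far as I understand it, you size the UPS on the actual draw of everything plugged into it, converted to VA with some headroom, not on the PSU's 750 W label. Please correct me if this is wrong:

```python
# Rough UPS sizing estimate. Every figure here is an assumed example;
# measure or look up the real draw of your own system before buying.
pc_draw_w    = 400   # guessed gaming-load draw of the PC, well under the 750 W PSU rating
tv_draw_w    = 100   # the LED TV figure mentioned above
power_factor = 0.6   # assumed typical for a cheaper line-interactive UPS
headroom     = 1.25  # about 25% safety margin so the UPS never runs at its limit

total_w  = (pc_draw_w + tv_draw_w) * headroom
total_va = total_w / power_factor

print(f"Estimated load: {total_w:.0f} W, so look for a UPS around {total_va:.0f} VA")
# With these guesses: 625 W of load, which points at roughly a 1000 VA class unit.
```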
 

hedgehogpie

Honorable
Nov 26, 2012



Thanks, I'll be getting the EVGA FTW. Hope it lasts 5 years or longer. I know I'll have a new GPU by then, but it's good to keep it as a keepsake. :)
 


That we know of. Somebody was telling me something about electron tunnelling and electron microscopes etc. Intel say that they can tell, but I guess they would say that.

OP, good choice - hope you have fun on it :)
 

I have many video cards over 5 years old still working. They generally do not die that fast unless you are overclocking them to the limits.
 


The only thing they can detect is overheating. And since it can happen due to many causes, they cannot be certain you were overclocking.
 
I guess the main thing for me is this: how can we be certain they don't know? I didn't know what the guy was talking about with electron tunnelling etc. I suggested that the same effects would be caused by the built-in (standard) clock speed adjustments in CPUs and GPUs, but he was saying that electrons are stripped away inside the silicon in different quantities depending on clock speed, so they can tell from that.

So what it comes down to for me is this: that subatomic-level stuff is way over my head, and I wouldn't have complete confidence that anyone else on here could say they know for 100% certain the manufacturer can't tell (no offence to you). I also know that even if a manufacturer really can't tell, they're not gonna admit it. So I'm left not knowing one way or the other, but inclined to play it safe and just buy hardware that's already fast at stock speeds. To each their own though :)
 
Solution