Which one would be better: Thermaltake Toughpower 600W or 750W?

tolemi

Distinguished
Oct 20, 2006
34
0
18,530
Hi guys, I'm new here. I'm mostly concerned about future-proofing, you know, so which one do you think would be more future-proof: the Thermaltake Toughpower 600W or the 750W? Or is the 750W a waste of money and power? What do you think?
 
Guest

Guest
Depends on what you want to do with the system; no doubt the 750W is more future-proof!

If you plan to go the DX10 card path, the 750W might be a good option. List your specs and your upgrade roadmap/cycle and we'll probably be able to enlighten you!
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
That's SLI, and he never specified if he wanted to go that route. If you are planning on SLI/Xfire and adding tons of stuff, go with the higher one. If not, the 600W is fine.
 

RichPLS

Champion
Yes, I read that... But I did not study it enough to know whether that standard is a general minimum requirement intended to cover all PSUs (including generics) or whether it is an actual, literal figure.
I find it hard to envision a gaming PC with just a pair of hard drives, a TV tuner, and a single graphics card actually drawing 800 watts from the PSU, but it might.

My loaded, overclocked PC with a TV tuner, an overclocked X1800XT, and four hard drives pulls a max of 350 watts from the wall outlet, which means less than 300 watts is actually being supplied from the PSU to the components.

It is hard to believe that if I remove my current overclocked X1800XT and install a stock 8800GTX, it will consume an additional 500 watts from my PSU.
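For what it's worth, here is a quick back-of-the-envelope sketch of that wall-vs.-DC math (the 350 W wall reading comes from the post above; the ~86% efficiency is an assumed typical figure, not a measurement of this unit):

```python
# Rough check of the wall-draw vs. DC-output figures quoted above.
# Assumptions: 350 W measured at the outlet, ~86% PSU efficiency
# (the efficiency is assumed; real units vary with load).

wall_draw_w = 350.0          # measured at the wall outlet
efficiency = 0.86            # assumed conversion efficiency

dc_output_w = wall_draw_w * efficiency
print(f"DC power delivered to components: {dc_output_w:.0f} W")   # ~301 W

# Headroom left on a 600 W unit at this load:
print(f"Headroom on a 600 W PSU: {600 - dc_output_w:.0f} W")      # ~299 W
```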
 

chuckshissle

Splendid
Feb 2, 2006
4,579
0
22,780
I would get the 750W since future graphics cards, like the 8800 for example, are getting a bigger appetite when it comes to power. I have a 600W right now and I don't know if I could use it for an 8800GTX in the future.
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
I'm thinking, though, that if you avoid the first-gen DX10 cards, the next gen will scale back on power quite nicely, since there is already a large backlash against the ridiculous power requirements. However, if you're not planning on that, or you want to run a Peltier + OC + whatever else hogs power like crazy, then the higher the better.

I would personally go with 750W. But 600W is more than enough for the average person.
 

chuckshissle

Splendid
Feb 2, 2006
4,579
0
22,780
I hope so too, and maybe the later DX10 graphics cards will have lower power consumption. It would be nice if I could still use my 600W PSU for next year's upgrade. But if I can't, then I'll just have to run dual PSUs, since I can't afford a 1kW PSU. :(
 

tolemi

Distinguished
Oct 20, 2006
34
0
18,530
For the 600W, input current: 115 VAC / 8 A max.
230 VAC / 4 A max.

For the 750W, input current: 115 VAC / 10 A max.
230 VAC / 5 A max.

Does that mean the 750W is going to waste more input current than the 600W?
By the way, if I don't use the full power, say I only draw 400-450W from the 750W unit, what happens to the rest of the wattage? Will it waste the rest, or will it only use what is required? Because, you know, I'm also concerned about the electricity bill.
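For what it's worth, a rough sketch of how those label ratings relate to actual draw (the 8 A / 10 A figures are maximum ratings, not constant draw; the 85% efficiency below is an assumption):

```python
# The amperage on the label is a maximum rating, not a constant draw.
# Actual input current depends on what the components ask for.
# The 85% efficiency here is assumed for illustration.

mains_voltage = 115.0        # VAC
dc_load_w = 450.0            # example load from the question above
efficiency = 0.85            # assumed conversion efficiency

wall_draw_w = dc_load_w / efficiency           # ~529 W from the outlet
input_current_a = wall_draw_w / mains_voltage  # ~4.6 A

print(f"Wall draw: {wall_draw_w:.0f} W, input current: {input_current_a:.1f} A")
# Well under the 10 A maximum: the 750 W unit only pulls what the load
# needs (plus conversion loss), never the full rated wattage.
```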
 

RichPLS

Champion
Having a power supply larger than required allows it to run cooler, since less current is flowing through the unit. One drawback is that instead of converting the power at 85% efficiency, as it might at high loads, it may only manage 80% efficiency at light loads, creating about 5% extra wasted electricity. Some of that loss is offset by the fact that the PSU runs cooler and transfers less heat into your PC room.

Bottom line: having more power in reserve generally results in the PSU running cooler and quieter than it would at max load, where it generates more heat and noise as the cooling fan speeds up to compensate.
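A minimal sketch of that efficiency trade-off, using the 80% vs. 85% figures above (illustrative numbers, not measurements of these specific units):

```python
# Same 400 W DC load served at two assumed efficiencies: roughly what
# a lightly loaded 750 W unit (80%) vs. a well-loaded 600 W unit (85%)
# might achieve. All figures are illustrative.

dc_load_w = 400.0

for label, eff in [("750 W unit at light load", 0.80),
                   ("600 W unit at moderate load", 0.85)]:
    wall_w = dc_load_w / eff
    waste_w = wall_w - dc_load_w   # lost as heat inside the PSU
    print(f"{label}: {wall_w:.0f} W from the wall, {waste_w:.0f} W as heat")

# ~500 W vs. ~471 W from the wall: the oversized unit costs slightly
# more on the bill at light load, but it only ever draws what the
# components need, not its full 750 W rating.
```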