Overclocking Wattage Increase

May 17, 2010 11:20:53 PM

I've been researching components for the last week as I work ever vigilantly towards creating my first box in almost a decade. I've recently decided that my new system is going to get an ASUS DirectCU 5850, and I've been reading a lot more reviews just to solidify my resolve. In learning about how voltage works with OCing, I've become very curious about what kind of increase in wattage I can expect as I overclock my card. What kind of increase in heat can I expect as well?

Keeping my system relatively quiet (a purr or very low hum, not a roar or a bonfire) is important to me, but I might change my mind as I OC my GPU and CPU. I would definitely like to see what I can get out of this model of the 5850.

I have a Corsair HX650 and I'm trying to keep my system open to upgrade in the future should I want to run two 5850s in CrossFire. I have yet to decide on a motherboard/cpu combo, but the CPU will be an i5-750 or a 1055T. Any insight on the questions I've asked is appreciated.
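For a very rough sense of whether the HX650 leaves headroom for a second card later, I've been doing a simple back-of-envelope sum of load estimates along these lines. The wattage figures below are just assumptions I'm using for illustration, not manufacturer specs:

```python
# Back-of-envelope PSU headroom check. All wattage figures are rough
# assumptions for illustration, not exact manufacturer specs.

PSU_CAPACITY_W = 650  # Corsair HX650

GPU_W = 170   # assumed HD 5850 under load with a modest overclock
CPU_W = 130   # assumed i5-750 / 1055T under load with a modest overclock
REST_W = 80   # assumed motherboard, RAM, drives, fans

def total_draw(num_gpus):
    """Sum the estimated load for a system with num_gpus cards."""
    return GPU_W * num_gpus + CPU_W + REST_W

for gpus in (1, 2):
    draw = total_draw(gpus)
    print(f"{gpus} x HD 5850: ~{draw} W load, "
          f"{draw / PSU_CAPACITY_W:.0%} of the HX650")
```

On those assumed numbers a single card sits around 60% of the PSU and a CrossFire setup closer to 85%, so the real question is how much the overclock pushes the per-card figure up.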
May 20, 2010 12:38:04 AM

What sort of games do you play? Normal overclocking is fine, but raising the wattage will definitely result in higher temperatures. Honestly, unless you want a minimal 2 FPS difference in most games, I would just do a normal overclock, especially on a brilliant card like the HD 5850 (which I'm receiving). You can easily reach HD 5870 speeds without changing your wattage. Try MSI Afterburner, which I find extremely useful.
May 20, 2010 2:40:01 AM

To be honest, I'm just getting back into PC gaming, so I'd imagine it'll mostly be Starcraft II and FPSs. The reason I was asking is that I'm going to be overclocking the card, and I want a rough idea of how much power my system is going to draw, as well as what to expect from the overclocked card itself. I haven't found much information on how power consumption changes after overclocking GPUs, so I was hoping the forum could help me with that. I'm a little energy conscious, and while I want a beast of a machine, I also want to keep its power consumption at a reasonable level.

In researching processors I've found that power consumption increases dramatically as you hit the limits of the overclock (at least for the processors I was interested in). I'm wondering how the GPU fares in comparison; I want to know if there's an optimal point just before the overclock gets out of hand with the power consumption, or if it's a more proportional increase.
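From what I've read so far, dynamic power roughly follows capacitance × voltage² × frequency, which would explain that pattern: a frequency-only overclock grows power more or less linearly, while a voltage bump makes it climb much faster. Here's a minimal sketch of that relationship, where the baseline clock, voltage, and board power are assumptions for illustration rather than measured 5850 figures:

```python
# Rough dynamic-power scaling estimate: P ~ C * V^2 * f.
# Baseline values are illustrative assumptions, not measured HD 5850 data.

BASE_CLOCK_MHZ = 725    # assumed stock core clock
BASE_VOLTAGE_V = 1.088  # assumed stock core voltage
BASE_POWER_W = 151      # assumed stock board power under load

def estimated_power(clock_mhz, voltage_v):
    """Scale the baseline power by (f / f0) * (V / V0)^2."""
    freq_ratio = clock_mhz / BASE_CLOCK_MHZ
    volt_ratio = (voltage_v / BASE_VOLTAGE_V) ** 2
    return BASE_POWER_W * freq_ratio * volt_ratio

# Frequency-only overclock: power rises roughly in proportion to the clock.
print(f"850 MHz @ stock volts: ~{estimated_power(850, 1.088):.0f} W")

# Adding a voltage bump: the squared term is what makes power (and heat) jump.
print(f"900 MHz @ 1.20 V:      ~{estimated_power(900, 1.20):.0f} W")
```

If that simple model holds, the "optimal point" would be roughly wherever the card stops scaling at stock voltage, and the real curve near the limit should be even steeper than this, since leakage also grows with voltage and temperature.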