Frequency scales the power consumption of a digital circuit linearly at full load. However, when you overclock significantly, you often raise not only the frequency but also the voltage, and it is the voltage increase that drives power consumption up quadratically, not the frequency.
Power consumption equals capacitance times frequency times voltage squared: P = C × F × V².
What voltage your graphics card needs to reach a given frequency depends on the individual unit (how high you can go is therefore also limited by the highest voltage you can apply without killing the chip). The unknown variable here, IIRC, is the capacitance. Since you can't know the capacitance without testing, you can't predict how much power a stable overclock will draw unless you test what frequency you can achieve at the highest safe voltage. At that point, it's probably easier to measure your power consumption with a wattage meter anyway.
I haven't touched the voltage despite overclocking to 162.5% of the stock frequency.
It is stable; I have tested it at ~98% GPU load during gaming for several hours.
So, with capacitance untouched and voltage at stock, the power draw should be 62.5% higher. Sources on the internet state that the 8400GS my card is based on draws 30 W at full load, so this overclocked version should draw around 49 W (30 W × 1.625 ≈ 48.75 W).
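The scaling above can be sketched as a quick calculation from the dynamic-power formula P = C × F × V². This is only an estimate under the post's own assumptions: the 30 W baseline is taken from internet sources, voltage is unchanged, and static/leakage power is ignored.

```python
# Estimate overclocked power draw by scaling the dynamic-power formula
# P = C * F * V^2. Capacitance cancels out when taking ratios, so only
# the frequency and voltage ratios matter.

def scaled_power(base_watts, freq_ratio, volt_ratio=1.0):
    """Scale dynamic power: P' = P * (F'/F) * (V'/V)**2."""
    return base_watts * freq_ratio * volt_ratio ** 2

# Stock voltage (ratio 1.0), clock at 162.5% of stock, 30 W baseline:
estimate = scaled_power(30.0, 1.625)
print(f"{estimate:.2f} W")  # → 48.75 W
```

If the overclock had also required a voltage bump, the squared voltage ratio would dominate; e.g. a 10% voltage increase alone raises power by 21%.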
PS: Any good wattage measurement tools other than joulemeter?