This is a question about the power usage of a processor running at a fixed frequency and a fixed voltage. If the processor runs at frequency A and voltage B, will it draw more power at 100% load than at 0% load? If so, the only explanation I can see is that the current is higher under heavier load than it is under lighter load. Does this make sense?
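For context, dynamic CMOS power is commonly modeled as P ≈ α·C·V²·f, where α is the switching activity factor (the fraction of gates toggling per cycle). At fixed V and f, load changes α, and since I = P/V, the average current rises with it. A minimal sketch of that model, where every numeric constant is a made-up illustrative value, not a real chip's parameters:

```python
# Illustrative model of dynamic CMOS power: P = alpha * C * V^2 * f.
# At fixed voltage V and frequency f, a higher load raises the switching
# activity factor alpha, so average power (and thus current I = P / V)
# rises too. All numeric values below are hypothetical.

def dynamic_power(alpha, capacitance, voltage, frequency):
    """Average dynamic power in watts."""
    return alpha * capacitance * voltage**2 * frequency

C = 1e-9   # effective switched capacitance (farads), hypothetical
V = 1.2    # core voltage (volts), hypothetical
f = 3e9    # clock frequency (hertz), hypothetical

idle = dynamic_power(0.05, C, V, f)  # near-idle: few gates toggling
busy = dynamic_power(0.50, C, V, f)  # heavy load: many gates toggling

print(f"idle: {idle:.2f} W ({idle / V:.2f} A)")
print(f"busy: {busy:.2f} W ({busy / V:.2f} A)")
```

Same V and f in both calls; only α differs, yet power and current scale by 10x between the two cases.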