The processor core runs at about 1.8V or less, depending on the type (at least anything recent). The mobo uses 3.3V, 5V, and 12V rails for power, plus -12V and -5V for legacy stuff like serial ports and the ISA bus. Graphics cards use 3.3V or 1.5V signaling depending on the type. A CD-RW drive has 12V and 5V lines running to it, same as a hard drive. All of it is DC.
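Just to keep it all straight, here's a quick Python sketch of which rails feed what (the dict and component names are mine, purely for illustration; exact rails vary by board and drive):

```python
# Rough map of which DC rails feed which components.
# Assignments are approximate and vary by hardware generation.
RAILS_BY_COMPONENT = {
    "cpu_core":   [1.8],          # or lower; stepped down by the mobo's VRM
    "mobo":       [3.3, 5.0, 12.0, -5.0, -12.0],
    "video_card": [3.3, 1.5],     # signaling voltage depends on card type
    "cdrw":       [12.0, 5.0],
    "hard_drive": [12.0, 5.0],
}

for component, rails in RAILS_BY_COMPONENT.items():
    print(f"{component}: {', '.join(f'{v:g}V' for v in rails)} DC")
```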
So what's left to convert wall power into all of those voltages: 3.3V, 5V, 12V, -5V, and -12V? The power supply. It also has to turn the AC into DC, so unlike the 50/60Hz coming out of the wall, the output has no frequency at all. Most decent power supplies have a switch on the back for 120V or 240V input (still labeled 110/220 or 115/230 by some companies).
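And since every rail is DC, a supply's rating is just volts times amps on each rail, summed up. Quick sketch (the amperages below are made-up examples; read the real numbers off your own PSU's label):

```python
# Hypothetical per-rail current ratings in amps -- check your PSU's
# label for the real values.
rail_amps = {
    3.3: 20.0,
    5.0: 30.0,
    12.0: 15.0,
    -5.0: 0.5,
    -12.0: 0.8,
}

# Power per rail is just P = V * I; total is the sum over all rails.
# (Real supplies also cap the combined 3.3V+5V output, so this naive
# sum overstates what you can actually draw at once.)
total_watts = sum(abs(volts) * amps for volts, amps in rail_amps.items())

for volts, amps in rail_amps.items():
    print(f"{volts:g}V rail: {abs(volts) * amps:.1f}W max")
print(f"Total (naive sum): {total_watts:.1f}W")
```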