I want to add a new terabyte hard drive to my computer. The problem is I don't know how many watts I have left to spare. I posted this in another forum and one person recommended using a multimeter.
I have a Celeron 450 2.2GHz, a GIGABYTE GV-R435OC-512I Radeon HD 4350 512MB 64-bit GDDR2 PCI Express 2.0 x16 (said to take up 35 watts), a 320GB SATA HDD, a DVD-RW, and 2 x 1GB sticks of DDR2.
Here's the original link. I'm not sure if they are paranoid or not. When I asked the same question for my graphics card, the person said I have more than enough power. The user wasn't worried.
Hard drives use very little power compared to graphics cards and CPUs. In fact, the power consumed by hard drives is almost insignificant, and the capacity of the disk will not noticeably affect power consumption. You should not have any problems at all.
Exactly how many watts does a SATA HDD draw? And why do graphics card makers recommend a total power supply wattage? Wouldn't it be better to state how many watts the graphics card itself consumes? That would be more helpful in my situation. I have the card running on a 250W power supply.
You're an engineer, aren't you? (Don't worry, I am too)
Honestly though, everyone's right. Modern HDDs consume a maximum of around 10-15W during startup (when the platters spin up), 5-8W during peak loads (constant access), 3-5W at idle, and less than 1W in sleep (powered on but platters not spinning). These are pretty generalized numbers, yes, but they give you a scale for power consumption to compare against your CPU (35W) and GPU (50W). All in all, you can see the power draw is pretty negligible. If you're truly that concerned, buy a P3 Kill-A-Watt (~$20) and measure the total power consumption of your system to make sure it doesn't get close to the max of the PSU.
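If it helps, here's a rough back-of-the-envelope budget you can do yourself. The per-component wattages below are ballpark assumptions (pulled from the figures quoted in this thread plus rough allowances for the board, drives, and RAM), not measured values, so treat the result as a sanity check, not gospel:

```python
# Rough power-budget sketch. Every wattage here is a ballpark
# assumption, not a measured figure for this exact hardware.
components = {
    "CPU (Celeron 450)": 35,       # TDP mentioned in the thread
    "GPU (HD 4350)": 35,           # board power quoted for the card
    "HDD (existing 320GB)": 8,     # peak-access estimate from above
    "HDD (new 1TB)": 8,            # same ballpark for the new drive
    "DVD-RW": 20,                  # rough worst case while burning
    "RAM (2x1GB DDR2)": 6,         # roughly 3W per stick
    "Motherboard + fans": 30,      # rough allowance
}

psu_rating = 250  # watts

total = sum(components.values())
headroom = psu_rating - total

print(f"Estimated peak draw: {total} W")
print(f"Headroom on a {psu_rating} W PSU: {headroom} W")
```

Even with generous worst-case numbers for everything, you end up well over 100W below the PSU's rating, which is why nobody here is worried about one more hard drive.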