As we all know, energy is conserved and doesn't just disappear into thin air. And we also know that it usually ends up as heat.
So let's take a hypothetical scenario and compare:
- 800W convection space heater
- 800W (measured at the plug) PC powerhouse, with dual graphics cards, 5 active HDDs, a powerful processor that draws loads of power, overclocked to draw even more, cooled by standard air cooling with fans and heatsinks (no water or liquid nitrogen to steal our heat here), and a big plasma screen to make matters at the plug even worse
Can the heater (100% efficient) be replaced with the PC system and still heat the room to the same level? Does the PC act like an 80% efficient heater, maybe 50%? Where did the rest of the power go if not into heating, and which components are the "cold power sinks" (ones not converting power to heat) in a standard PC system?
Unlike a heater, a PC gives off heat only as a by-product, so you probably won't get the same heating levels. However, if you can find a way to direct the heat to something that distributes it across the area, the idea of using a powerful PC to warm a room might just work.
And by the way, nothing is 100% efficient. In any case, the "missing" energy is used to drive your PC components.
As OP said right up front, energy cannot be created OR destroyed. It can only be converted from one form to another, and almost always ends up (often after several conversion steps) as heat. There is no such thing as a "cold power sink (not converting power to heat)". Look at any operating computer system and think about where energy is being "used". In fact, it is NOT being "used" in the sense of being consumed and disappearing; it is always being converted from one form to another. A monitor converts some of its input energy to light, and all the rest it consumes becomes heat. In fact, even the light leaving the screen gets absorbed somewhere else and ends up as heat in the absorber.
Bottom line, whatever power is being consumed - that is, actual power flow out of the wall plug, not nameplate maximum consumption - is being converted to heat, and the vast majority of that is released into the room where the machine is located. About the only place I can think where energy leaves the room NOT as heat is the electrical signals sent out on communication lines. That is extremely small, but technically it does "escape" the room.
So you're saying that running any IC-based technology (PC, DVD player, hi-fi, plasma, LCD...) that consumes 500W of power at the plug will give off the same amount of heat as a space heater consuming 500W? In other words, leaving any of those on for a day in an isolated room would heat the room by the same amount?
Yes, if you mean actual power consumption rate in watts, as in volts x amps, measured at the wall outlet. All of the power "consumed" by the computer system can only end up as heat, and it's all released into the room where the equipment is located. So it really does have the same heating impact as a 500 W heater running continuously. Same cost for electrical power used. Same heat load on an air conditioner as if you were heating the room while trying to cool it! The fact that it does not look like a red-orange glowing dedicated heater changes nothing - it's still heat.
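To make that concrete, here's a back-of-the-envelope calculation of what "same heating impact" means for a 500 W load running for a day. The electricity price is an assumed figure for illustration, not from the thread:

```python
# A PC drawing 500 W at the wall delivers the same heat to the
# room as a 500 W heater: all consumed power ends up as heat.
POWER_W = 500          # actual draw measured at the wall outlet
HOURS = 24             # left running for a day
KWH_TO_BTU = 3412.14   # 1 kWh of electrical energy, expressed as heat

energy_kwh = POWER_W / 1000 * HOURS    # 12.0 kWh per day
heat_btu = energy_kwh * KWH_TO_BTU     # roughly 41,000 BTU released into the room

PRICE_PER_KWH = 0.15   # assumed electricity price in USD/kWh (varies by region)
daily_cost = energy_kwh * PRICE_PER_KWH

print(f"{energy_kwh:.1f} kWh/day -> {heat_btu:,.0f} BTU of heat, ${daily_cost:.2f}/day")
```

Whether the 12 kWh comes out of a glowing heating element or a graphics card, the room receives the same energy.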
This is one of the significant factors in the design, construction cost, and operating cost of a data center. They have to build it with a big enough heat removal system (aka big-ass air conditioning) to handle the heat released by the equipment. For years that has been one of the perks of running a computer operations room: the room is kept at 72 F or so for the benefit of the equipment, but the operators get to enjoy it, too!
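A rough sizing sketch shows why this dominates data-center design. All the rack counts, per-rack loads, and the chiller COP below are assumed illustrative numbers; the only physics is that every watt the IT equipment draws is a watt of heat the cooling plant must remove:

```python
# Illustrative sizing: cooling load equals IT electrical load,
# because all consumed power ends up as heat in the room.
RACKS = 20
WATTS_PER_RACK = 5000                 # assumed average IT load per rack
it_load_w = RACKS * WATTS_PER_RACK    # 100 kW of heat to remove

W_TO_BTU_HR = 3.412
cooling_btu_hr = it_load_w * W_TO_BTU_HR
tons_of_cooling = cooling_btu_hr / 12000   # 1 ton of cooling = 12,000 BTU/hr

COP = 3.0                             # assumed chiller coefficient of performance
chiller_power_w = it_load_w / COP     # extra electricity just to move the heat out

print(f"IT load {it_load_w / 1000:.0f} kW -> {tons_of_cooling:.1f} tons of cooling, "
      f"plus ~{chiller_power_w / 1000:.1f} kW to run the chillers")
```

So on top of paying for the power the servers consume, the operator pays again (divided by the chiller's COP) to pump that same energy back out of the building.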