Almost all the energy a computer takes in (over 99%) is 'wasted' as heat.
Talk to anyone who has ever run a datacenter. The heat output of a machine is exactly equal to the amount of energy consumed. Maybe a watt or two leaves the room as electrical energy on a network cable, as sound energy, or as EM radiation.
Where else does the energy go?
Look at this as a freshman-year physics thermodynamics problem. Your system of interest is the computer case. It has 500 watts coming in. It has 0.5 watts leaving as EM radiation, 0.5 watts leaving as sound (i.e., vibrations in the air), and 1 watt leaving on the wires connected to it. The other 498 watts are necessarily heat.
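To make the bookkeeping explicit (same illustrative numbers as above, nothing measured): in steady state the case stores no energy, so

P_heat = P_in − P_EM − P_sound − P_wires = 500 W − 0.5 W − 0.5 W − 1 W = 498 W.

Whatever comes in and doesn't leave by some other channel has to leave as heat.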
Oh, come on. I objected to your claim that all the energy drawn gets transformed into heat. That's patently false unless the system you're studying has either 0% efficiency or is indeed a heater with 100% efficiency; anything in between contradicts what you said. In the specific case I objected to, the problem is even easier to see: the HDD platters spin. That's work done, an energy transformation from electrical to kinetic. Whether that amount is bigger or smaller than what gets transformed into thermal energy is a matter of efficiency. Otherwise the kinetic energy would be free, and Perendev and Bedini would be inclined to post here.
I appreciate that you've taken the literal interpretation of the word "all", but if you're going to be pedantic (which isn't very helpful or interesting in any case), at least be correct, please.
Eventually all the kinetic energy in the spinning platters is converted to heat. It is converted continuously by friction between the moving parts, and by the time the platters stop moving, all of the energy that was kinetic is now thermal.
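To put a rough number on it (assumed figures, not the specs of any real drive): a 3.5" platter of about 20 g spinning at 7200 RPM stores

KE = ½Iω² = ½ · (½ · 0.02 kg · (0.047 m)²) · (754 rad/s)² ≈ 6 J,

which is on the order of one second's worth of the few watts the drive draws. Once the platter is up to speed, that stored kinetic energy stops changing, so from then on essentially all of the drive's input power goes into friction, i.e., heat.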
I'm sorry if I seemed pedantic. It was not my intention.
I took "all" literally because I thought that it was important in that context. The problem was pointed to be heat and I thought pointing out that heat is the result (and directly proportional with) the inefficiency of the design (friction mainly) and power drawn, would be useful. It thus seemed that increasing efficiency and/or reducing power requirements would be a nice way of solving that particular wish in the original article. Therefore I thought that assuming heat was unavoidable was not a good/correct idea and tried to argue that. The tone was already set to "let's make wishes".
If you want to argue that all energy eventually goes to heat... irreversibly, even, then I think you're moving to a different level of discussion altogether.