typically between 70 and 90 percent efficient. A 500-watt power supply with 70 percent efficiency that is drawing its full 500 watts from the wall wastes 150 watts as heat. In practice it rarely draws the full amount for which it is rated, but a portion of everything it does draw is wasted as heat. This heat causes multiple problems.
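The waste figure above follows directly from the efficiency percentage. A minimal sketch of the arithmetic, using the example's numbers (500 watts drawn at 70 percent efficiency):

```python
# Waste calculation from the text's example: a supply drawing
# 500 W from the wall outlet at 70 % efficiency.
draw_watts = 500     # power drawn from the wall outlet
efficiency = 0.70    # fraction actually delivered to the components

delivered = draw_watts * efficiency        # 350 W reaches the components
wasted_as_heat = draw_watts - delivered    # 150 W is given off as heat

print(f"Delivered: {delivered:.0f} W, wasted as heat: {wasted_as_heat:.0f} W")
```

The same two lines apply at any load: whatever fraction of the draw the efficiency rating does not cover becomes heat inside the case.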
Figure 4-2 illustrates the problem. Any energy not used by the computer is given off as heat. The industry standard for power supplies is an internal operating temperature below 40 °C (104 °F); the higher the operating temperature, the greater the stress on the equipment. High-efficiency units typically run well below this threshold.
Energy cannot be created or destroyed. Because no power supply is 100 percent efficient, the energy that does not go to powering components is emitted as heat. This wastes money in three ways:
▲ Energy "lost" to heat: In some of the less efficient power supplies, this can be up to 50% of the total power received from the wall outlet. Money is lost buying energy that is never used as intended. Even 80 PLUS power supplies, some of the most efficient on the market, lose up to 20 cents of every dollar's worth of energy that passes through them.
▲ Energy spent cooling the computer: Heat dumped out of the power supply raises the computer's internal temperature and that of the surrounding air space. This requires that more energy be put into cooling the computer and the room in which it is located.
▲ Increased likelihood of component failure: In general, the hotter the power supply and computer run, the greater the thermal expansion damage to the devices. Eventually, this expansion and contraction due to heating and cooling will cause these parts to fail.
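The "lost energy" bullet can be put in dollar terms. A rough annual-cost sketch comparing a 70 percent unit with a baseline 80 PLUS unit; the continuous-operation assumption and the electricity rate are illustrative figures, not values from the text:

```python
# Rough annual cost of energy a power supply turns into heat.
# The draw, hours, and electricity rate below are assumptions
# for illustration, not figures from the text.
draw_watts = 500        # power drawn from the wall
hours_per_year = 8760   # assumes the machine runs continuously
rate_per_kwh = 0.12     # assumed electricity price, $/kWh

def annual_waste_cost(efficiency: float) -> float:
    """Dollars per year spent on energy the supply emits as heat."""
    wasted_kwh = draw_watts * (1 - efficiency) / 1000 * hours_per_year
    return wasted_kwh * rate_per_kwh

for eff in (0.70, 0.80):
    print(f"{eff:.0%} efficient: ${annual_waste_cost(eff):.2f}/yr wasted as heat")
```

Under these assumptions the 70 percent unit wastes about $158 per year versus about $105 for the 80 percent unit, and that gap widens once the second bullet's cooling costs are added on top.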