Every computing device produces heat in proportion to the electricity it consumes. ENIAC, one of the first room-sized behemoths of the 1940s, used 174 kilowatts to run its vacuum tubes; these days, a pocket calculator offers about as much computing power. Although technology has improved the energy efficiency of computers, heat output is still a serious consideration for architects and others who plan dedicated server rooms and general office spaces.
In any mechanical or electronic system, the energy expended eventually becomes heat. A car engine, for example, converts energy from gasoline into useful motion, but friction with the air, the road and mechanical parts turns that motion into heat. So it is with computers: the microchips in a PC move information back and forth, but in the end the electrical energy becomes heat. If you measured the heat produced and the energy consumed, you would find they balance exactly. Physicists call this principle the conservation of energy.
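The energy balance described above is simple arithmetic: a watt is one joule per second, so a machine's sustained power draw tells you exactly how much heat it sheds over time. A minimal sketch (the function name is ours, for illustration):

```python
# Conservation of energy applied to a PC: every joule of electrical
# energy drawn eventually leaves the machine as heat.

def heat_output_joules(power_watts: float, hours: float) -> float:
    """Heat released over a period, equal to the energy consumed.
    1 watt = 1 joule per second."""
    seconds = hours * 3600
    return power_watts * seconds

# A 100-watt machine running for one hour releases 360,000 joules of heat:
print(heat_output_joules(100, 1))  # 360000.0
```

This is why server-room cooling is sized from the equipment's power draw: the electricity bill and the heat load are two views of the same number.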
Faster, more powerful computers produce more heat than smaller, portable models. For example, a typical notebook computer, used moderately, consumes 40 watts of electricity and produces an equivalent amount of heat. A lightly used desktop machine, by comparison, uses about 100 watts. Mobile devices use far less power and produce correspondingly less heat; power consumption is limited by the small, weight-saving battery. A typical smartphone such as the iPhone 4S consumes only a few watts when making a phone call. Edited from Chron.com.
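The wattages above can be turned into rough daily heat loads, the figure a building planner would actually use. A small sketch using the article's numbers (the 3-watt smartphone figure is our assumption for "a few watts", and 3.412 BTU per watt-hour is the standard conversion used in cooling calculations):

```python
# Approximate heat output over an 8-hour workday for the devices
# mentioned in the text. Wattages are the article's rough figures;
# the smartphone value is an assumed stand-in for "a few watts".
BTU_PER_WATT_HOUR = 3.412  # standard conversion used in HVAC sizing

devices = {"notebook": 40, "desktop": 100, "smartphone (on a call)": 3}

for name, watts in devices.items():
    watt_hours = watts * 8  # energy consumed (and heat released) per workday
    btu = watt_hours * BTU_PER_WATT_HOUR
    print(f"{name}: {watt_hours} Wh, about {btu:.0f} BTU of heat per workday")
```

Scaled up to a room of dozens of desktops, these per-device numbers add up quickly, which is why office and server-room cooling must be planned around them.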