Thread 'A naive question re BOINC, energy use & heat by electricity'

Matt Chambers

Joined: 22 Mar 06
Posts: 9
United States
Message 22188 - Posted: 30 Dec 2008, 21:10:56 UTC

Hello all,

I have a potentially naive question about energy use relating to running computer programs, such as BOINC. It reduces to something like, “If one heats by electricity, is there any cost to running BOINC in the winter?”

A prime side effect of running BOINC on computer systems is the use of electricity, which generates ‘waste’ heat in the process. But if I already heat with electricity in the winter, is there really any extra cost to running BOINC, since the waste heat generated this way simply replaces heat that my electric heating units would otherwise have produced on purpose? It would seem to me that the two are equivalent (beyond any wear-and-tear on the computer, such as wearing out a fan or hard drive), and that computing with BOINC just ‘inconveniences a few electrons’ for a good purpose first.
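
Matt's reasoning lends itself to a back-of-envelope check. The sketch below is purely illustrative: the wattage, runtime, and electricity price are assumed values, not figures from this thread.

```python
# Back-of-envelope sketch of the "free heat in winter" argument.
# All numbers below are assumptions for illustration, not measurements.

cpu_draw_w = 80          # assumed extra power drawn while crunching
hours = 24 * 30          # one month of continuous running
price_per_kwh = 0.12     # assumed electricity price (USD)

extra_kwh = cpu_draw_w * hours / 1000
gross_cost = extra_kwh * price_per_kwh

# With resistive electric heating, each watt-hour the PC dissipates
# into the room is a watt-hour the baseboard heater does not have to
# produce, at the same price per kWh, so the offset is roughly 1:1.
heating_offset = gross_cost
net_cost = gross_cost - heating_offset

print(f"Gross electricity for BOINC: {extra_kwh:.1f} kWh (${gross_cost:.2f})")
print(f"Net cost while heating resistively: ${net_cost:.2f}")
```

Under resistive heating the offset is roughly one-to-one, so the net marginal cost comes out near zero; any gap would come from wear-and-tear, or from the heat being delivered in a different room than you wanted it.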

Unless I am misunderstanding something, this could help promote BOINC adoption in some areas. In schools, for example, the extra electricity drawn during much of the school year would effectively cost less than it otherwise would, serving a dual purpose by providing heat. In the summer, of course, this heat generation would be an additional cost, forcing air conditioning units to work harder, but U.S. schools are typically out of session in the heat of the summer. A seasonal approach to BOINC computing, perhaps with additional nighttime computing in some cases, could thus be advocated as efficient and inexpensive.
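
A seasonal policy like this is easy to state precisely. The following sketch is hypothetical: the month and hour cut-offs are assumptions, and a real BOINC client would be configured through its own computing preferences rather than a script like this.

```python
from datetime import datetime

# Hypothetical seasonal/nighttime crunching policy. The month and hour
# cut-offs below are assumptions for illustration only.
HEATING_MONTHS = {10, 11, 12, 1, 2, 3}               # rough heating season
NIGHT_HOURS = set(range(22, 24)) | set(range(0, 6))  # 10pm-6am

def should_crunch(now: datetime) -> bool:
    """Allow computing all day in the heating season, nights only otherwise."""
    return now.month in HEATING_MONTHS or now.hour in NIGHT_HOURS

print(should_crunch(datetime(2009, 1, 15, 14)))  # winter afternoon -> True
print(should_crunch(datetime(2009, 7, 15, 14)))  # summer afternoon -> False
```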

I am a big fan of charitable, volunteer computing, and, as a future science educator, I would like to bring BOINC projects into schools as examples of cool, real, current science. However, energy consumption is going to be an ever-greater concern, and any extra cost of leaving a computer on or making it work harder will face increasing scrutiny.

Thanks, Matt
Jazzop

Joined: 19 Dec 06
Posts: 90
United States
Message 23767 - Posted: 18 Mar 2009, 16:45:58 UTC - in response to Message 22188.  

You need to account for the difference in efficiency between a CPU and an electric HVAC system as heat generators. The purpose of an electric heater is to generate heat. As such, it is designed to take electricity at a certain voltage and convert it into heat energy as efficiently as possible. It is also coupled to a distribution system (fans/ducts) to spread the heat throughout the building.

A CPU is designed to compute and merely generates heat as a byproduct; in fact, modern CPU designs go out of their way to stay as cool as possible. This makes for a horribly inefficient source of heat. Its heat is simply dumped out the back of the case, not spread around the building for maximum effect. Of course, you also get other products (e.g., data output) in exchange for the electricity you feed into the CPU, but it's hard to compare the value of a piece of data to a joule of heat energy.

To summarize, you will not simply recover the energy lost as CPU heat by cranking down the building's electric heater: each degree of room-temperature rise attributable to the computer costs more in energy and money than you save by lowering the thermostat by one degree.
Nicolas

Joined: 19 Jan 07
Posts: 1179
Argentina
Message 24264 - Posted: 13 Apr 2009, 14:11:56 UTC - in response to Message 23767.  

It's basic conservation of energy. If a CPU is using 80 watts, it will produce 80 watts of heat. Where else would the energy go?

There is no such thing as "efficient generation of heat". When something is inefficient, it generates heat instead of doing something useful (like light or movement).
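
Nicolas's point is just an energy balance, and a trivial one to write down. The figures in this sketch are assumed for illustration:

```python
# Energy balance for a device in a closed room: electrical energy in
# ultimately equals heat out. Fan motion, LED light, and sound are all
# absorbed by the room and degrade to heat as well. (Assumed figures.)
draw_w = 80.0    # assumed draw, identical for a CPU and a small heater
hours = 10.0

heat_from_cpu_kwh = draw_w * hours / 1000
heat_from_heater_kwh = draw_w * hours / 1000

assert heat_from_cpu_kwh == heat_from_heater_kwh
print(f"Either device delivers {heat_from_cpu_kwh:.2f} kWh of heat to the room")
```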

John37309

Joined: 28 Jul 07
Posts: 91
Ireland
Message 24294 - Posted: 14 Apr 2009, 4:06:12 UTC - in response to Message 24264.  
Last modified: 14 Apr 2009, 4:23:42 UTC

Nicolas wrote: "It's basic conservation of energy. If a CPU is using 80 watts, it will produce 80 watts of heat. Where else would the energy go?"

This is not entirely true, Nicolas. Although heat is an unwanted by-product of running a microchip or CPU, there are many other things happening that use energy to complete a task:

Timekeeping - usually derived from a vibrating quartz crystal (the piezoelectric effect)
Capacitance & timing - charging and discharging capacitors leaks energy (usually not inside the chips themselves)
Reactance - http://en.wikipedia.org/wiki/Reactance_(physics)
Reluctance - http://en.wikipedia.org/wiki/Reluctance
Inductance - http://en.wikipedia.org/wiki/Inductance
Impedance - http://en.wikipedia.org/wiki/Electrical_impedance
Gate switching in components - switching logic gates and transistors does use energy, but it's very small.

In most cases, it's the resistance of the copper wires and components that consumes the energy.

Most CPUs have whopping big fans to carry the heat away. The fan itself uses energy in its physical rotation, and fans also produce sound, which uses energy too. Sometimes the fans account for a large percentage of the overall power used.

For PCs in general, there is plenty of other stuff drawing power: LED lights, transformers, speakers, and so on.

But yes, much of the 80 watts a CPU consumes does end up as heat; the exact percentage varies from machine to machine.
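
As a rough illustration of where a desktop's draw goes, here is a made-up power budget; every figure is an assumption, not a measurement:

```python
# Hypothetical desktop power budget (assumed numbers). Whatever the
# split, essentially all of it is eventually dissipated as room heat.
budget_w = {
    "CPU": 80,
    "fans (rotation and sound)": 5,
    "drives": 8,
    "motherboard, RAM, LEDs": 25,
    "PSU conversion loss": 20,   # dissipated directly inside the supply
}

total = sum(budget_w.values())
for part, watts in budget_w.items():
    print(f"{part:28s} {watts:3d} W  ({watts / total:4.0%})")
print(f"{'total draw':28s} {total:3d} W")
```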

John.
| Irelandboinc.com | PaddysInSpace.com
BrianK

Joined: 19 Apr 09
Posts: 1
United States
Message 24398 - Posted: 19 Apr 2009, 8:40:44 UTC

Computers are a very inefficient way of producing heat, for an important reason not yet mentioned:

Electric heating can never be as efficient as a heat pump.

With resistive electric heating, if you draw 100 W from the grid, the most you can heat the room with is 100 W of heat.

With a heat pump, it's entirely possible to deliver 300 W or more of heat to a room while drawing 100 W from the grid. This does NOT violate conservation of energy or the second law of thermodynamics, because you're using energy to "pump" heat from an area of low temperature to an area of high temperature, whereas with electric heating you are simply converting electric power into heat.

The COP, or Coefficient of Performance, is the ratio of heat moved to the work required. Most heat pumps have a COP between 2 and 5, which means they deliver between 2 and 5 watts of heating for every watt they draw. This is a massive efficiency advantage over resistive electric heating.
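
Putting those ratios side by side makes the gap obvious. This sketch uses BrianK's 2-5 COP range; the effective COP of 1 for resistive heating (and for a computer used as a heater) follows from the conservation argument above:

```python
# Heat delivered per 100 W drawn from the grid, by heating method.
GRID_DRAW_W = 100

cop = {
    "resistive heater": 1.0,          # all draw becomes heat, COP = 1
    "computer used as heater": 1.0,   # likewise ~1
    "heat pump (low end)": 2.0,       # COP range cited above
    "heat pump (high end)": 5.0,
}

for method, c in cop.items():
    print(f"{method:24s} -> {GRID_DRAW_W * c:5.0f} W of heat")
```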

If you are still using resistive electric heating, convert to a heat pump and you'll save more than enough energy to justify running BOINC. But don't consider computers a good source of heat by any means.
Nicolas

Joined: 19 Jan 07
Posts: 1179
Argentina
Message 24401 - Posted: 19 Apr 2009, 18:59:59 UTC - in response to Message 24398.  

BrianK wrote: "Computers are a very inefficient way of producing heat, for an important reason not yet mentioned: electric heating can never be as efficient as a heat pump."

Of course; can't argue with that.

Here we use natural gas for heating and cooking, and we also have a heat pump. Electric heating is unheard of, except for, say, the toaster :) But in some countries they actually use electric resistance heaters to heat the house.

