Post by "just lurking":
I'm trying to find out how much a 100-watt incandescent light bulb
raises the room temperature per hour. I Googled for 10 minutes and
struck out.
Layman's terms or a very technical explanation are both fine; my
sister has a physics degree. Thanks!
A 100 W light bulb generates 100 J/s x 3600 s = 360,000 joules per
hour, or 360 kJ/hr.
For a perfectly insulated room of dimensions 4 m x 3 m x 3 m (36 cubic
meters), and for static dry air with a heat capacity of 1.0 kJ/kg/degC
and density 1.2 kg/m^3, this works out to a temperature rise of
360 / (1.0 * 1.2 * 36) = 8.3 degC/hr. (This number seems high to me,
but those are the numbers.)
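If you want to redo the arithmetic yourself, here is a quick Python
sketch of the same estimate, under the same assumptions (a sealed,
perfectly insulated room, dry air only, nothing else absorbing heat):

# Heat from the bulb and resulting temperature rise, per hour.
P = 100.0                 # bulb power, W (= J/s)
Q = P * 3600 / 1000       # heat added in one hour, kJ
volume = 4 * 3 * 3        # room volume, m^3
rho = 1.2                 # density of dry air, kg/m^3
c = 1.0                   # heat capacity of air, kJ/(kg*degC)
mass = rho * volume       # mass of air in the room, kg
dT = Q / (mass * c)       # temperature rise, degC per hour
print(round(dT, 1))       # -> 8.3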
Removing that heat with an air conditioner operating at an efficiency
of 15% would take about 670 watts of AC line power, or 0.67 kW.
Running that for an hour expends 0.67 kW-hr of energy, costing about 7
cents per hour (at roughly 10 cents per kW-hr), plus the penny it
costs to power the light bulb itself.
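Continuing in the same vein, a short sketch of the hourly operating
cost; the 10 cents/kWh electricity rate is my assumption, chosen to
match the figures above:

bulb_kw = 0.1                        # bulb draw, kW
ac_efficiency = 0.15                 # assumed AC efficiency from above
ac_kw = bulb_kw / ac_efficiency      # AC line power needed, ~0.67 kW
rate = 0.10                          # assumed electricity price, $/kWh
cost_per_hour = (ac_kw + bulb_kw) * rate
print(round(cost_per_hour, 2))       # -> 0.08 (7 cents AC + 1 cent bulb)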
Presuming that someone wants the light bulb on at night during the
summer, a period of about 10 hours, the privilege of not stubbing your
toe on the way to the bathroom in the middle of the night costs
about $0.80 a night, or about $24 a month. If the time period is only
2 hrs/night, the cost drops to less than $5/month, about the price of
two Starbucks coffees.
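And the nightly and monthly totals, assuming 30 nights per month:

cost_per_hour = 0.08                 # AC plus bulb, $/hr, from above
for hours in (10, 2):
    nightly = cost_per_hour * hours
    monthly = nightly * 30
    print(hours, round(nightly, 2), round(monthly, 2))
# -> 10 hr: $0.80/night, $24/month; 2 hr: $0.16/night, $4.80/month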
For some people that is an acceptable cost; for others it is not. That
part of the battle is not something a physicist is going to get
enmeshed in.
PD