I geeked out today with a Kill-A-Watt, a simple meter that lets you measure electricity usage of things plugged into ordinary sockets. I have two computers at home - a Linux box with an Athlon at 1000MHz and a Windows box with an Athlon XP at 1733MHz. How much power do they consume? Hours of fun!
My dollar calculations are off; I don't know my real rate. What's interesting are the incremental costs. Running CPU jobs at full-tilt takes another 30W on my Windows box, or about $3/month. AMD's power management bug blows. athcool on my Linux box is saving me about 40W. If I could run VCool on my Windows box it'd save another 30W.
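The wattage-to-dollars arithmetic above is easy to redo with your own numbers. A minimal sketch, assuming a hypothetical rate of $0.14/kWh (the post doesn't give the real rate, so that's a placeholder) and a device running 24/7:

```python
def monthly_cost(watts, rate_per_kwh=0.14):
    """Estimate monthly cost of a constant load.

    rate_per_kwh is an assumed placeholder; plug in your
    utility's actual rate for a real answer.
    """
    kwh_per_month = (watts / 1000.0) * 24 * 30  # kW times hours in a month
    return kwh_per_month * rate_per_kwh

# 30W of extra CPU load, around the $3/month ballpark mentioned above
print(f"${monthly_cost(30):.2f}/month")
```

At that assumed rate, the 40W saved by athcool works out to roughly $4/month, which is why the incremental numbers add up faster than you'd expect.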
Bottom line - configure power saving on your monitor! And turning off your computer really makes a difference.