Did you ever have some little thing that bothers you, but you never do anything about it? Today I'm going to do something about it.
Some people leave their computers running all the time, when they don't have to. There, I've said it.
In some cases, there is a good reason for leaving it on all the time. But much of the time, people are just lazy and don't want to make the effort to shut their PC down and restart it the next morning.
Come on, folks, turn the dang thing off if you're not going to use it, especially with what energy costs these days. You'll not only lengthen the life of your computer, but you'll save money, too.
How much cash would you save? Keep reading. We'll get there.
This story started a while ago, where I work. We're in the process of moving a data center full of computer equipment from one location to another.
One of the parameters we needed to know before the move was how much the electrical load of our equipment would amount to. So we bought a meter that measures the wattage any particular device uses.
We had to measure the electrical draw of all the servers and other equipment and add it up.
After we finished, the meter was still available, so I began to tinker around and test some of the kinds of computer equipment a normal person would have in his home or office, pretty much just out of curiosity.
I figured that if people knew the actual cost of leaving their stuff turned on all the time, they might change their minds about leaving equipment on unnecessarily.
So I started up OpenOffice Calc (my spreadsheet program), got out my latest NYSEG bill, and began to do some figuring. The hardest part was sorting out exactly how much the electricity costs.
There is the actual cost of the electrical power, then the cost of delivering that power to my house, plus various other charges added or subtracted, with tax figured on some of it.
Anyway, one evening I did it. It turns out that each kilowatt-hour costs me just shy of 13 cents, delivered conveniently to my home's electric meter.
So, arbitrarily, I'm going to use that as a basis for my calculations. Some people's cost may actually be slightly higher or lower.
Then I had to take our measuring device, do a lot of plugging and unplugging of different kinds of things, and keep track of it all.
For starters, I measured some light bulbs, just to keep honest people honest. According to our watt meter, a 100-watt light bulb actually drew 97 watts. A compact fluorescent bulb, rated at 14 watts, actually drew 13 watts. I guess that's close enough in my book.
So I picked some representative computer equipment to measure: a "regular" PC with one hard disk, a DVD drive, a floppy drive and a normal amount of RAM, typical of what most people would be using. I'm not testing server computers for this column; they usually need to run 24 hours a day, seven days a week, for good reason.
I measured a 17-inch CRT-style monitor, and a 19-inch LCD-style monitor, both fairly common these days.
For printers, I picked a fairly new inkjet printer, and a popular office-type laser printer.
Then I figured out the number of hours the devices would be turned on but not used, under a few different scenarios.
I also took my wattage measurements while the devices were more or less idling. They typically use more power while they're actually doing work than when they're just sitting there, and I wanted to be fair in my calculations of waste. After all, it's not a waste if you're actually using it.
Let's start with a nice round number: A 100-watt light bulb. It will cost me 30 cents to run it for a day.
Now, if I'm running an office and I burn that light bulb all the time but the office is only actually in use for a single work shift Monday through Friday, I will throw away $82 over the course of a year.
That's for one light bulb.
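For the curious, the light-bulb arithmetic can be sketched in a few lines. The rate and the work schedule are my assumptions based on the figures in this column (the 97-watt measured draw, a rate of just shy of 13 cents per kilowatt-hour, and one eight-hour shift, five days a week, 52 weeks a year):

```python
# Cost of leaving a "100-watt" bulb (measured draw: 97 W) burning around
# the clock versus only during one 8-hour shift, Monday through Friday.
RATE = 0.127   # dollars per kilowatt-hour ("just shy of 13 cents")
WATTS = 97     # what the meter actually read for a 100-watt bulb

hours_per_year = 24 * 365          # 8,760 hours if it's always on
used_hours = 8 * 5 * 52            # 2,080 hours of one work shift
wasted_hours = hours_per_year - used_hours

wasted_dollars = WATTS / 1000 * wasted_hours * RATE
print(f"Wasted per year: ${wasted_dollars:.0f}")   # roughly $82
```

Plug in your own rate from your electric bill and the numbers will shift a bit, but the shape of the waste stays the same.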
If we substitute a fluorescent fixture with two 40-watt tubes in it (fairly common), that would be about $67 wasted over a year's time. Have 10 of these fixtures in your office? Do the math.
On to the computer equipment. For the purpose of consistency, I will continue to use the scenario of being left on all the time, but only used for one work shift a day, Monday through Friday.
A PC by itself will waste $67 a year. But add an older CRT-style monitor and an office laser printer, and it comes out to $125 of wasted electricity in a year. Again, if you're a business owner with more than one, do your own math. In a small office with several PCs, you could be approaching a month's rent.
A CRT-style monitor alone would amount to $44 annual waste. An LCD-style monitor would be $15.
To put this in some perspective, let's say you're a medium-sized company. You have 20 PCs, half with older-style CRT monitors and half with newer LCDs. You have five laser printers. Everybody is leaving their stuff running all the time, and you have one work shift Monday through Friday. You're wasting $2,000 a year in electric costs.
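The office tally above works out like this, using the per-device annual waste figures from this column. One caveat: the laser printer's share isn't stated directly, so I've backed it out myself as $125 (PC plus CRT plus printer) minus $67 (PC) minus $44 (CRT), or about $14:

```python
# Annual waste per device in dollars, left on 24/7 but used only one
# shift Monday-Friday. Figures are from the column; the laser-printer
# number is my own inference ($125 - $67 - $44 = $14).
WASTE = {"pc": 67, "crt": 44, "lcd": 15, "laser_printer": 14}

# The medium-sized company scenario: 20 PCs, half CRTs, half LCDs,
# five laser printers.
office = {"pc": 20, "crt": 10, "lcd": 10, "laser_printer": 5}

total = sum(WASTE[device] * count for device, count in office.items())
print(f"Total wasted per year: ${total}")   # $2000
```

Swap in your own equipment counts and you have a rough waste estimate for your office.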
Now some food for thought. You could continue to give the electric company the two grand, or you could turn off the equipment and give each of your 20 employees a $100 bonus check at the end of the year.
Which would you rather do?
Bruce Endries is former systems manager at The Daily Star. He can be reached by e-mail at email@example.com.