Jeff.Miller@samba.acs.unc.edu (BBS Account) (09/15/90)
I think it is safe to say the jury is still out on whether equipment is better left on or powered up only during use, from the standpoint of reliability; I believe it depends on the equipment and the pattern of usage. But think about this: you say you left 65 monitors on all the time for four years, and had only 11 failures? I would consider that excessive, but that is beside the point. Assuming each draws 50 watts (reasonable for 19" monitors) and that you pay about as much for your power as I pay for mine (not so reasonable: as part of an institution you may pay less, but not less than half, I am sure), you paid US $13,000 (about 7,200 pounds, I think) for power for these monitors over that period. If you had kept them turned off 12 hours a day, you would have saved $6,500, enough to pay to have all of them fixed and perhaps quite a bit more. (Assuming you had them fixed by an independent :-) To say nothing of the environmental impact.

I think everyone in the computer field, and especially the Unix community, should give serious thought to power consumption. It is no joke. If there are 1 million machines out there, and each draws on average 200 watts, that is 200 megawatts. To say nothing of the air conditioning. Leaving them off 12 hours a day would spare a 100 megawatt plant. Not much, unless they want to build it in _your_ backyard.

Just figure about $1.00 per watt per year to see how much a given piece of equipment is costing you. Seen in this light, VAXes and 14" hard drives really do deserve to be melted down. If you have been trying, without success, to convince the powers that be at your site to replace an old system that "works just fine", perhaps you should try flashing a few figures.
--
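For anyone who wants to flash figures of their own, the arithmetic above can be sketched in a few lines of Python. The $0.114/kWh rate is my assumption, chosen because it reproduces the numbers in the post (it makes one watt running year-round cost about $1.00); plug in your own utility's rate.

```python
HOURS_PER_YEAR = 24 * 365      # 8760
RATE_PER_KWH = 0.114           # US dollars per kWh; assumed rate

def annual_cost(watts, hours_per_day=24, rate=RATE_PER_KWH):
    """Yearly electricity cost of equipment drawing `watts`,
    running `hours_per_day` hours per day."""
    kwh = watts * (hours_per_day / 24) * HOURS_PER_YEAR / 1000
    return kwh * rate

# 65 monitors at 50 W each, on around the clock, for four years:
total = 4 * annual_cost(65 * 50)
# Savings had they been switched off 12 hours a day instead:
saved = total - 4 * annual_cost(65 * 50, hours_per_day=12)
```

At that rate, `annual_cost(1)` comes out to roughly $1.00, which is where the rule of thumb comes from, and `total` lands near the $13,000 figure quoted above.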