Jeff.Miller@samba.acs.unc.edu (BBS Account) (09/15/90)
I think it is safe to say the jury is still out on whether equipment is better left on or turned on only during use; from the standpoint of reliability, I believe it depends on the equipment and the pattern of usage.

But think about this: you say you left 65 monitors on all the time for four years? And had only 11 failures? I would consider that excessive, but it is beside the point. Assuming each draws 50 watts (reasonable for 19" monitors) and that you pay about as much for your power as I pay for mine (not so reasonable: as part of an institution you may pay less, but not less than half, I am sure), you paid US $13,000 (about 7,200 pounds, I think) for power for these monitors over that period. If you had kept them turned off 12 hours a day, you would have saved $6,500, enough to pay to have all of them fixed and perhaps quite a bit more. (Assuming you had them fixed by an independent :-) To say nothing of the environmental impact.

I think everyone in the computer field, and especially the Unix community, should give serious thought to power consumption. It is no joke. If there are 1 million machines out there, and each draws on average 200 watts, that is 200 megawatts. To say nothing of the air conditioning. Leaving them off for 12 hours a day would spare a 100 megawatt plant. Not much, unless they want to build it in _your_ backyard.

Just think about $1.00 per watt per year to figure how much a given piece of equipment is costing you. Seen in this light, VAXes and 14" hard drives really do deserve to be melted down. If you have been trying to convince the powers that be at your site to replace an old system that "works just fine" with no success, perhaps you should try flashing a few figures.
--
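(As a rough check of the arithmetic above, a short Python sketch. The electricity rate is not stated in the post; the ~11.4 cents/kWh below is simply what the $1-per-watt-per-year rule of thumb implies.)

    # Back-of-the-envelope check of the figures above. The rate is an
    # assumption: $1 per watt per year works out to about $0.114/kWh.
    monitors = 65
    watts_each = 50.0
    hours_per_year = 365 * 24
    years = 4
    rate_per_kwh = 1.00 / (hours_per_year / 1000.0)          # ~$0.114/kWh
    kwh = monitors * watts_each * years * hours_per_year / 1000.0
    cost = kwh * rate_per_kwh
    print(f"Always on: {kwh:.0f} kWh, about ${cost:,.0f}")   # 113880 kWh, ~$13,000
    print(f"Off 12h/day saves about ${cost / 2:,.0f}")       # ~$6,500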
dmimi@uncecs.edu (Miriam Clifford) (09/16/90)
Does a pc actually draw 200 watts? All varieties of pc? Does it draw that much constantly? Or is the power usage higher when the machine is actually in use than it is when it is idle but on?
dhiman@motcid.UUCP (Ravinder Dhiman) (09/16/90)
dmimi@uncecs.edu (Miriam Clifford) writes:
>Does a pc actually draw 200 watts? All varieties of pc? Does it draw that
>much constantly? Or is the power usage higher when the machine is actually in
>use than it is when it is idle but on?

In answer to your first question, it depends. The 200 watt (or whatever the number may be) figure just means that the power supply is CAPABLE of supplying that amount of power. A computer with a 200 watt power supply will usually be drawing less than the 200 watts available. A computer loaded with relatively high power consumption boards (or other devices) may draw close to the maximum rated capacity of the power supply.

Also, the amount of power consumed by a computer depends on the type and speed of the computer. An IBM PC (or its equivalent) requires less power than a `386 based machine; a 20 MHz `386 will take up a little less power than a 33 MHz `386, and so on. BTW, the above assumes all other things being "equal."

As to your last two questions, unless the computer is providing power to some small electro-mechanical gadget which turns on and off, the power consumption should be pretty constant regardless of whether the computer is sitting "idle" or actually doing something. BTW, even when the computer is "sitting idle," it is doing its own housekeeping (refreshing the RAM, waiting on input indications, etc.).

Hope that answers your questions.
----
Ravi Dhiman                     Motorola, Inc.   M/S IL27-N276
Cellular Infrastructure Div.    Arlington Heights, IL 60004

Disclaimer: My opinions only, not those of my employer.
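(A small illustration of the point about ratings versus actual draw; the component wattages below are made-up examples, not figures from the post.)

    # Illustrative only: component draws are assumed numbers. The PSU
    # rating is a ceiling on what can be supplied, not what is drawn.
    psu_rating_w = 200
    components = {
        "motherboard + CPU": 35,    # assumed
        "hard disk, spinning": 25,  # assumed
        "floppy drive": 5,          # assumed
        "expansion cards": 20,      # assumed
    }
    draw = sum(components.values())
    print(f"Estimated draw: {draw} W of {psu_rating_w} W rated capacity")
    print(f"Headroom for more boards: {psu_rating_w - draw} W")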
davidsen@sixhub.UUCP (Wm E. Davidsen Jr) (09/17/90)
In article <1081@beguine.UUCP> Jeff.Miller@samba.acs.unc.edu (BBS Account) writes:

| I think it is safe to say the jury is still out on whether equipment
| is better left on or turned on only during use, from the standpoint of
| reliability, I believe it depends on the equipment and the pattern of
| usage.

Several years ago one of the people in our "terminal repair" service (which handles all PCs, monitors, etc.) studied the failure rates of equipment and concluded that the failure rate per unit per year was 30-50% lower if the equipment was left on all the time. Turning stuff off over the weekend had no significant effect, but daily power cycles hurt badly.

Of course any system which runs UNIX will probably be on all the time for mail/news etc, but the monitor can go off on weekends.

This was true of equipment in use at GE at that time. You assume responsibility for any projection of that data to your situation.
--
bill davidsen - davidsen@sixhub.uucp (uunet!crdgw1!sixhub!davidsen)
    sysop *IX BBS and Public Access UNIX
    moderator of comp.binaries.ibm.pc and 80386 mailing list
"Stupidity, like virtue, is its own reward" -me
Chuck.Phillips@FtCollins.NCR.COM (Chuck.Phillips) (09/17/90)
>>>>> On 15 Sep 90 10:04:05 GMT, Jeff.Miller@samba.acs.unc.edu (BBS Account) said:
Jeff> But think about this: you say you left 65 monitors on all the time
Jeff> for four years? And had only 11 failures? I would consider that
Jeff> excessive, but it is beside the point.
Jeff> Assuming each draws 50 watts (reasonable for 19" monitors) and that
Jeff> you pay about as much for your power as I pay for mine (not so
Jeff> reasonable: as part of an institution you may pay less, but not less
Jeff> than half I am sure), you paid US $13,000 (about 7,200 pounds, I
Jeff> think) for power for these monitors for that time period. If you kept
Jeff> them turned off 12 hours a day, you would have saved $6,500, enough
Jeff> to pay to have all of them fixed and perhaps quite a bit more.
Jeff> (Assuming you had them fixed by an independent :-)
The last time we had to pay for a monitor, it ran about $2000. If he had
four fewer failures by keeping the monitors on, he came out ahead in $s and
in downtime.
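(A quick sketch of that break-even point, using the $6,500 power savings and the $2000-per-monitor cost from the posts above:)

    # Break-even: how many avoided failures pay for the foregone
    # power savings? Figures are from the posts above.
    power_savings = 6500.0      # claimed savings from 12h/day off
    cost_per_monitor = 2000.0   # what we last paid for a monitor
    breakeven = power_savings / cost_per_monitor
    print(f"Break-even at {breakeven:.2f} avoided failures")  # 3.25
    # Four fewer failures (~$8000) beats the $6,500 power savings.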
Jeff> If there are 1 million machines out there, and each draws on average
Jeff> 200 watts, that is 200 megawatts. To say nothing of the air
Jeff> conditioning. Leaving them off for 12 hours a day would spare a 100
Jeff> megawatt plant.
...unless the energy required to replace the monitors is more than the
extra energy required to keep them from failing (with screenblank running,
of course). There is also the matter of extra garbage, if switching
monitors off daily causes failures.
This isn't a flame; you may be quite correct that turning monitors off _is_
the best thing to do economically and ecologically. Unfortunately, we're
missing some data needed to determine the _total_ relative costs.
Any takers?
--
Chuck Phillips MS440
NCR Microelectronics Chuck.Phillips%FtCollins.NCR.com
2001 Danfield Ct.
Ft. Collins, CO. 80525 uunet!ncrlnk!ncr-mpd!bach!chuckp
mario@cs.man.ac.uk (Mario Wolczko) (09/17/90)
In article <1081@beguine.UUCP>, Jeff.Miller@samba.acs.unc.edu (BBS Account) writes:
> Assuming each draws 50 watts (reasonable for 19" monitors) and that you pay
> about as much for your power as I pay for mine (not so reasonable: as part
> of an institution you may pay less, but not less than half I am sure), you
> paid US $13,000 (about 7,200 pounds, I think) for power for these monitors
> for that time period. If you kept them turned off 12 hours a day, you would
> have saved $6,500, enough to pay to have all of them fixed and perhaps
> quite a bit more. (Assuming you had them fixed by an independent :-)

4 years x 365 d x 24 h x 0.050 kW x 65 monitors x (12h/24h duty cycle) = 56,940 kWh

At approx 5p per kWh (domestic rate; I don't know what the commercial rate is), this is 2,847 pounds, somewhat less than you calculate, but of the same order. That budgets 258.81 pounds (approx $500) per repair, _assuming the reliability is the same_. If the reliability halves (which I think is being _extremely_ optimistic, as we are talking about _at least_ 1000 power cycles per monitor, at least 65,000 in all, and probably many more as many of the monitors are shared), I think the repair costs are likely to be more.

> To say nothing of the environmental impact.
>
> I think everyone in the computer field and especially the Unix community
> should give serious thought to power consumption. It is no joke. If there
> are 1 million machines out there, and each draws on average 200 watts, that
> is 200 megawatts. To say nothing of the air conditioning. Leaving them off
> for 12 hours a day would spare a 100 megawatt plant. Not much, unless they
> want to build it in _your_ backyard.
>
> Just think about $1.00 per watt per year to figure how much a given piece
> is costing you.

This is an incredibly specious argument. How much energy do you think it costs to make the parts for a monitor? What about the cost in raw materials? What about the petrol used by the delivery of the parts (most of which come half way round the world), and the visit of the maintenance engineer? And air conditioning... in the UK? Tee hee...

And why are UNIX users singled out? I've heard the expression "power users", but I thought that meant something different :-).

Also, most countries have surplus power at night, so I don't think you can argue that switching machines off at night would save on power stations. Anyhow, 100 MW is a small fraction (<10%) of _one station_.

If you're going to argue a case, _present real facts_ (such as the difference in MTBF for monitors left on vs those power cycled every day). Otherwise, we gain nothing.

Incidentally, the power consumption of a 3/50 drops from 140 W to 125 W when screenblank becomes active: I just measured it. Switching off the monitor entirely saves 70 W.

Mario Wolczko
   ______      Dept. of Computer Science   Internet:  mario@cs.man.ac.uk
 /~      ~\    The University              USENET:    mcsun!ukc!man.cs!mario
(    __    )   Manchester M13 9PL          JANET:     mario@uk.ac.man.cs
 `-':  :`-'    U.K.                        Tel: +44-61-275 6146 (FAX: 6280)
____;  ;_____________the mushroom project___________________________________
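(The same arithmetic as a Python sketch; the 5p/kWh figure is Mario's quoted domestic rate, and the 11 failures are from the original post.)

    # Mario's savings calculation: 65 monitors, off 12 hours a day.
    monitors = 65
    kw_each = 0.050
    hours = 4 * 365 * 24            # four years of continuous operation
    saved_fraction = 12.0 / 24.0    # fraction of the time they'd be off
    kwh_saved = monitors * kw_each * hours * saved_fraction
    pounds_saved = kwh_saved * 0.05             # 5p per kWh, domestic rate
    print(f"{kwh_saved:.0f} kWh saved = {pounds_saved:.0f} pounds")  # 56940, 2847
    print(f"{pounds_saved / 11:.2f} pounds per repair")  # 258.82 (post rounds to 258.81)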
mroussel@alchemy.chem.utoronto.ca (Marc Roussel) (09/18/90)
In article <4349@cocoa11.UUCP> dhiman@motcid.UUCP (Ravinder Dhiman) writes:
>As to your last two questions, unless the computer is providing power
>to some small electro-mechanical gadget which turns on and off, the power
>consumption should be pretty constant regardless of whether the computer is
>sitting "idle" or actually doing something.

By a "small electro-mechanical gadget which turns on and off", do you perchance mean a disk drive? :-)

Marc R. Roussel
mroussel@alchemy.chem.utoronto.ca
dhiman@motcid.UUCP (Ravinder Dhiman) (09/18/90)
mroussel@alchemy.chem.utoronto.ca (Marc Roussel) writes:
>By a "small electro-mechanical gadget which turns on and off", do you
>perchance mean a disk drive? :-)

Actually, no. My phrase (apologies for the lack of clarity) was in reference to any I/O boards which may interface the computer to the outside world (e.g., the computer activating a relay based on some input, controlling an automated test, etc.).

Come to think of it (thanks to your question), the phrase could be applied to a disk drive installed in a laptop. Some laptops go so far as to shut down (or place into standby) their hard drives if the drives have been idle for some period of time.
----
Ravinder Dhiman                 Motorola, Inc.   M/S IL27-N276
Cellular Infrastructure Div.    Arlington Heights, IL 60004
src@scuzzy.in-berlin.de (Heiko Blume) (09/19/90)
>dmimi@uncecs.edu (Miriam Clifford) writes:
>>Does a pc actually draw 200 watts? All varieties of pc? Does it draw that
>>much constantly? Or is the power usage higher when the machine is actually in
>>use than it is when it is idle but on?

it doesn't. those 200 watts are what the supply must be able to deliver to start the machine in the first place: it has to spin the hard disks up etc, which can take many watts. to give you some numbers: i have a 600MB hard disk that requires up to 4.5 A at 12 V (54 W) when it starts up, but draws only 2.2 A 'maximum operating current' in the worst case. btw: 4.5 A is a lot; it's about half the current most 220 watt power supplies can deliver on 12 V.
--
Heiko Blume c/o Diakite    blume@scuzzy.in-berlin.de   FAX   (+49 30) 882 50 65
Kottbusser Damm 28         blume@scuzzy.mbx.sub.org    VOICE (+49 30) 691 88 93
D-1000 Berlin 61           blume@netmbx.de             TELEX 184174 intro d
scuzzy Any ACU,e 19200 6919520 ogin:--ogin: nuucp ssword: nuucp
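(Heiko's figures worked through; both currents and the 12 V rail are from his post.)

    # Start-up versus steady-state draw for the 600MB disk, 12 V rail.
    volts = 12.0
    startup_amps = 4.5      # spin-up surge
    operating_amps = 2.2    # 'maximum operating current'
    print(f"Start-up:  {volts * startup_amps:.0f} W")    # 54 W
    print(f"Operating: {volts * operating_amps:.1f} W")  # 26.4 W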
ck@voa3.UUCP (Chris Kern) (09/20/90)
You need to consider more than the cost of electricity in deciding whether to keep monitors, or other computer equipment, powered up (assuming, of course, that keeping the equipment on does indeed increase the interval between failures). You also need to consider the cost in lost employee productivity of computer downtime. People are typically more expensive than the computer resources they use -- and much more expensive than the electricity consumed by their computers.
--
Chris Kern                     Voice of America, Washington, D.C.
...uunet!voa3!ck               +1 202-619-2020
ucbked@athena.berkeley.edu (Earl H. Kinmonth) (09/20/90)
In article <42@voa3.UUCP> ck@voa3.UUCP (Chris Kern) writes:
>You need to consider more than the cost of electricity in deciding
>whether to keep monitors, or other computer equipment, powered up

Similarly, if your concern is environmental (minimizing the use of energy), you must consider how much energy will be used if turning the machine on and off increases the failure rate. At the very least you will have one trip to take the machine in for repair and one trip to return it. Even a moderate auto trip consumes energy equivalent to a fair amount of juice.
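(To put a rough number on that repair trip, a sketch with assumed figures: the ~33.7 kWh per US gallon is a standard energy-content conversion for gasoline; the trip length and fuel economy are made up for illustration.)

    # Energy cost of one repair round trip vs. running a 50 W monitor.
    kwh_per_gallon = 33.7        # energy content of gasoline (approx.)
    trip_miles = 30.0            # assumed round trip to the shop
    mpg = 25.0                   # assumed fuel economy
    trip_kwh = trip_miles / mpg * kwh_per_gallon
    monitor_kw = 0.050
    print(f"One trip: ~{trip_kwh:.0f} kWh")                       # ~40 kWh
    print(f"= {trip_kwh / monitor_kw / 24:.0f} days of monitor")  # ~34 days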
sanjay@walt.cc.utexas.edu (Sanjay Keshava) (09/20/90)
In article <42@voa3.UUCP> ck@voa3.UUCP (Chris Kern) writes:
>You need to consider more than the cost of electricity in deciding
>whether to keep monitors, or other computer equipment, powered up
>(assuming, of course, that keeping the equipment on does indeed
>increase the interval between failures). You also need to consider
>the cost in lost employee productivity of computer downtime. People
>are typically more expensive than the computer resources they use
>-- and much more expensive than the electricity consumed by their
>computers.

This is an interesting thread for a change. When I worked at Xerox a few years ago, a memo was circulated asking users to power off their personal workstations before going home. Some figures were presented to show the savings in electricity cost, and they weren't trivial.

However, I too wondered about the cost of increased failures and lost productivity. But in the 5 years I was there, my workstation failed only once, due to known problems with the Seagate 4051 hard disk sticking, and I powered off my workstation daily. In fact, I experienced a severe disk crash, which required reformatting, during one of the few times I left the workstation on overnight. This was due to electricians accidentally disturbing the power lines during non-work hours.

The lost productivity often attributed to waiting 20 minutes for the workstation to boot every morning was discounted because most people take a few minutes to get some coffee, return some calls, etc. when they arrive in the morning.

Sanjay  ->|<-  Student in the UT Graduate School of Business
DARPA: sanjay@ccwf.cc.utexas.edu            | Graduation Date: TBD
CSnet: sanjay%ccwf@relay.cs.net             | Greetings to fellow Anteaters,
UUCP:  ...!ut-emx!ccwf.cc.utexas.edu!sanjay | Trojans, and Longhorns.
karrer@ethz.UUCP (Andreas Karrer) (09/20/90)
AH! I finally found out why people leave their TV running all the time: they are conserving energy!!!

Come on, be reasonable. You will *always* find a so-called expert who assures you that keeping computers on all the time is more energy-efficient than shutting them down every evening. You will likewise *always* find an expert who assures you of the contrary. Solution: use common sense.

P.S. Ever wondered why US-made computers almost invariably have subminiature power switches almost inaccessible at the back end? And why European-made ones often have theirs in front? Any similarity with the fuel consumption of Detroit cars vs. European/Japanese cars is purely coincidental...
+----------------------------------------------------------
Andi Karrer, Communication Systems, ETH Zurich, Switzerland
dwp@willett.pgh.pa.us (Doug Philips) (09/20/90)
In <1897@sixhub.UUCP>, davidsen@sixhub.UUCP (Wm E. Davidsen Jr) writes:
>
> Several years ago one of the people in our "terminal repair" service
> (which handles all PCs, monitors, etc.) studied the failure rates of
> equipment and concluded that the failure rate per unit per year was
> 30-50% lower if the equipment was left on all the time. Turning stuff
> off over the weekend had no significant effect, but daily power cycles
> hurt badly.

I'm curious, and perhaps others are, whether they attributed the failures to electrical damage from the power cycling itself, to the resulting thermal stress, or to something else entirely.

-Doug
---
Preferred: dwp@willett.pgh.pa.us    Daily: {uunet,nfsun}!willett!dwp
tjo@its.bt.co.uk (Tim Oldham) (09/20/90)
In article <6136@ethz.UUCP> karrer@ethz.UUCP (Andreas Karrer) writes:
>P.S. Ever wondered why US-made computers almost invariably have subminiature
>power switches almost inaccessible at the back end? And why European-made
>ones often have theirs in front?

The EC dictates this, I'm told. Our RS/6000 has a bloody great switch on the front, which is incredibly tempting to flick... I'm also told that IBM wanted to put the switch on the back, but were told they couldn't.

Tim.
--
Tim Oldham, BT Applied Systems.  tjo@its.bt.co.uk or ...uunet!ukc!its!tjo
Living in interesting times.
dom@polecat.llnl.gov (Dom Nardy) (09/20/90)
The reason most people keep their computers on constantly is to avoid the component failures caused by thermal stress: the machine warms up at power-on and cools down at shutdown, and these temperature swings shorten component life spans.

Dom
emv@math.lsa.umich.edu (Edward Vielmetti) (09/21/90)
In article <68370@lll-winken.LLNL.GOV> dom@polecat.llnl.gov (Dom Nardy) writes:
The reason most people keep their computers on constantly is to stop
the component failures due to the thermal damage caused by a computer
warming up upon start up and cooling down upon shutdown. These temp
swings shorten component life spans.
I recommend that most people keep their systems powered up because
they might want to use them from home at 3 a.m. or because they are
fast enough that someone else might want to run a job in the
background.
I guess I should start telling people to turn off their Sun 3/50's,
though, since there's not much point to remote access.
--Ed
Edward Vielmetti, U of Michigan math dept <emv@math.lsa.umich.edu>
sjg@sun0.melb.bull.oz (Simon J. Gerraty) (09/25/90)
In article <1081@beguine.UUCP>, Jeff.Miller@samba (BBS Account) writes:
>I think it is safe to say the jury is still out on whether equipment
>is better left on or turned on only during use, from the standpoint of
>reliability, I believe it depends on the equipment and the pattern of
>usage.
>
>I think everyone in the computer field and especially the Unix community
>should give serious thought to power consumption. It is no joke. If there

I agree that power consumption is a serious issue. However, I have a 15" mono screen on my Sun at home which has been on for most of the last 12 months. I find when I turn it off, even just overnight, that when turned back on it flashes and does all sorts of other horrible-looking things for about 10 minutes. I seriously doubt that it is going to last too much longer, but it seems to be better off left running.

Does anyone else have war stories about these 15" screens?
--
Simon J. Gerraty  <sjg@sun0.melb.bull.oz.au>
#include <disclaimer>  /* imagine something *very* witty here */