[comp.misc] Leaving AT on overnight

german@uxc.cso.uiuc.edu (08/10/87)

The machines I mentioned earlier were pretty much fully loaded as well.
PC/AT_1: 1664k, 2 x 20M disk, mono/printer adpt., 3Com Ethernet, IBM TRN, external 3.5
PC/AT_2:  640k, 2 x 20M disk, mono/printer adpt., EGA, ProNET-10, IBM PC Net
PC/AT_3:  512k, 1 x 30M disk, EGA, IBM 3270 emulation card, IBM PC Net

If anything seems to have trouble with being left on all the time, it would
be an Enhanced Graphics Monitor.  I have had a couple of failures over the
years on machines where the monitors were turned down (brightness) but
never turned off.

         Greg German (german@uxc.CSO.UIUC.EDU) (217-333-8293)
US Mail: Univ of Illinois, CSO, 1304 W Springfield Ave, Urbana, IL  61801
Office:  181 Digital Computer Lab.

mellman@ttrdd.UUCP (Thomas Mellman) (08/13/87)

In article <768@custom.UUCP>, boykin@custom.UUCP writes:
> 
> In article <1246@bloom-beacon.MIT.EDU>, voyager@athena.mit.edu (Center for Space Research) writes:
> > 
> > Could anyone give me some info on the bad, good, not-so-bad points
> > related to [leaving his AT on overnight].
> 
> Without a doubt, leave the machine on.  Leave it on 24 hours a day,
> 7 days a week.  The electronics will last significantly longer
> that way.  Most computers die as they're being powered up or down.
> ...
> As for the bearings in the disk drive, they won't be any worse for
> the wear either.

I don't know about that.  My disk drive specs rate my drive's life at
11,000 hours.  I guess I kind of hoped that I could expect more than
1 and 1/4 years out of it.
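
For scale, a quick check (a minimal C sketch; it assumes the 11,000-hour
rating counts only hours actually spent running, and the 8-hour schedule
is just an assumed office workload, not anything from the spec sheet):

    /* Convert a drive's rated service hours into calendar years
     * under two usage patterns.  The 8h/day figure is an assumed
     * schedule for comparison, not from any spec sheet. */
    #include <stdio.h>

    int main(void)
    {
        double rated_hours = 11000.0;   /* rating quoted above */

        printf("on 24h/day: %.2f years\n", rated_hours / (24.0 * 365.0));
        printf("on  8h/day: %.2f years\n", rated_hours / (8.0 * 365.0));
        return 0;
    }

Run continuously, the rating is used up in about a year and a quarter; at
eight hours a day it stretches to nearly four years.  That is exactly the
tension with the leave-it-on advice.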

As far as that old argument that the electronics don't like power cycling:
how many terminals (as a model for PCs, one that's been around a lot longer)
has anyone come across whose electronics have failed?  Usually they
just get so obsolete that nobody wants to use them anymore, or the
keyboard becomes unusable.  I think I can say that I've never seen a
terminal that didn't work.

NU013809@NDSUVM1.BITNET (Greg Wettstein) (08/14/87)

As the computer coordinator for the College of Pharmacy I have been dealing
with the on-vs-off controversy for about 8 years now.  It has been my
experience that computer equipment and peripherals which are left on
continuously have far fewer problems than those which are power-cycled
regularly, even once a day.
     
The first personal computer I dealt with, after a series of Data
General minicomputers, was a Zenith Z-158 with one of the first 10-megabyte
hard disks that ZDS started putting in their computers.  It was left on
24 hours a day for 2 to 2 1/2 years before the (by current standards fairly
primitive) hard disk gave out.
     
I have another Zenith Z-158 with a Seagate ST-225 hard disk in one of
our hospitals which has been running continuously now for 2 years with
nary a glitch.
     
On the other hand, our research group on campus has an IBM PC which is
now on its third hard disk in about three years.  The first disk lasted
only about a year, as it was cycled on and off every time somebody wanted
to use it.  I managed to convince them that they should at least let it
run all day and turn it off at night if they wanted to.  This was done
with the second hard drive, and that one lasted slightly longer than a
year.  I put a Seagate ST-225 in when the second drive burned out and
strongly advised leaving it running 24 hours a day.  That drive has been
running flawlessly for over a year now.
     
A good friend of mine who teaches in the Electrical Engineering department
here at NDSU was the individual who started me on my crusade to keep
equipment running all the time.  He told me that, as far as he knows,
there have been no definitive studies indicating that semiconductor
devices (integrated circuits) "wear out".  In the vacuum-tube era (which
I experienced firsthand) this was not the case.  Those devices depended
on emission from thorium-oxide-coated filaments, which had a finite
lifespan that could be preserved by powering the device down when it was
not in use for extended periods.  What appears to be the case with
semiconductor devices is that while they do not "wear out", they can be
damaged by the voltage spikes and power surges induced by power cycling.
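
To make that trade-off concrete, here is a toy model in C.  It charges a
constant risk per running hour plus an extra risk at each power-on; both
rates below are invented purely for illustration, since I know of no
definitive measurements:

    /* Toy reliability model: steady per-hour risk while running,
     * plus an extra per-event risk at each power-on transient.
     * Both rates are assumptions for illustration, not data. */
    #include <stdio.h>

    int main(void)
    {
        double per_hour  = 1.0e-6;   /* assumed risk per running hour */
        double per_cycle = 5.0e-4;   /* assumed extra risk per power-on */
        double years     = 3.0;

        double hours_24x7   = years * 365.0 * 24.0;  /* left on always */
        double hours_cycled = years * 365.0 * 8.0;   /* assumed 8h workday */
        double power_ons    = years * 365.0;         /* one switch-on a day */

        /* expected number of failure-causing events over the period */
        printf("always on:    %.3f\n", per_hour * hours_24x7);
        printf("cycled daily: %.3f\n",
               per_hour * hours_cycled + per_cycle * power_ons);
        return 0;
    }

With these made-up rates the daily power-on risk dominates; with a small
enough per-cycle risk the conclusion flips, which is why honest people end
up on both sides of this argument.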
     
One of the individuals posting a message in this chain indicated that he
had never heard of a terminal wearing out.  I wish that our organization
had his luck.  Our first attempt at word processing was with a multi-user
product called T.I.P.S., marketed by IPTC of Palo Alto, California.  This
word processor supported our five secretaries and was run under RDOS on
a Data General Nova/4X.  Each secretary had an ADM-5a as a terminal.  I
advised them to leave the terminal on all the time and simply turn down
the display whenever they wanted to leave the terminal for longer than
5 minutes.  I couldn't convince them that this was a good idea, and the
terminals got turned off every night and sometimes a couple of times a
day.  To make a long story short, none of those terminals lasted much over
a year; all died of various electronics problems.  At the time, we actually
purchased six terminals; the sixth went on the Nova as the operator
console.
     
That terminal ran continuously, 24 hours a day, for three years until we
sold the Nova and went to PCs.  I moved the terminal down to my lab
office and it's still running 24 hours a day, 2 years after we took it off
the Nova.
     
These results are not scientific or statistically rigorous, but they do
serve as examples.  My experience has taught me to worry more about
environmental considerations such as excessive heat and, the biggest
problem (especially here on the prairie), large amounts of dust and dirt
getting into the equipment.
     
                                   As always,
                                   G.W. Wettstein
                                   NU013809@NDSUVM1
     
     
When not at the above E-MAIL address I can be found in Davenport, North
Dakota, roping the hind feet of Hereford cattle . . . very rapidly.......
     
     

guardian@laidbak.UUCP (08/14/87)

In article <202@ttrdd.UUCP> mellman@ttrdd.UUCP (Thomas Mellman) writes:
>In article <768@custom.UUCP>, boykin@custom.UUCP writes:
>> 
>> Without a doubt, leave the machine on.  Leave it on 24 hours a day,
>> 7 days a week.  The electronics will last significantly longer
>> that way.  Most computers die as they're being powered up or down.
>> ...
>> As for the bearings in the disk drive, they won't be any worse for
>> the wear either.
>
>I don't know about that.  My disk drive specs rate my drive's life at
>11,000 hours.  I guess I kind of hoped that I could expect more than
>1 and 1/4 years out of it.

I've seen the average life of an AT be about two years before breakdown.
Most of the PCs I've installed have had hard disk failures or general
problems within about two to three years.  If you wish to build a unit that
will survive for a while, then I would get the specs on the hard disk (much
as boykin stated above) and also check to see how much heat the boards give
off.

I've had my PC for about three years, running off and on for about 4-8
hours a day (as I have needed it) and have had no problems.  I suspect,
though, that I should be seeing problems with it by the end of the year.
The hard disk does not sound as quiet as it did when I purchased it.


Harry Skelton
guardian@laidbak.UUCP

ayac071@ut-ngp.UUCP (William T. Douglass) (08/14/87)

In article <202@ttrdd.UUCP> mellman@ttrdd.UUCP (Thomas Mellman) writes:
>As far as that old argument that the electronics don't like power cycling:
>how many terminals (as a model for PCs, one that's been around a lot longer)
>has anyone come across whose electronics have failed?  Usually they
>just get so obsolete that nobody wants to use them anymore, or the
>keyboard becomes unusable.  I think I can say that I've never seen a
>terminal that didn't work.

Well, let me enlighten you.  After 5 years working in a S/370 environment
with 3278/3178/3179 terminals, I have witnessed dozens of dying or dead
tubes (terminals).  In fact, one of the local remote-job-entry sites decided
to power off all terminals in its building over the weekend during summer
months, specifically because of concerns about heat-related damage.  This
was stopped after 3 weeks because the technicians determined that powering
up the terminals weekly was causing more failures than just letting them
run constantly.

I say, let 'er run.

Bill Douglass
ayac071@ngp.UUCP

karl@ddsw1.UUCP (08/15/87)

In article <202@ttrdd.UUCP> mellman@ttrdd.UUCP (Thomas Mellman) writes:
(Question was asked about leaving systems on overnight)

>> 
>> Without a doubt, leave the machine on.  Leave it on 24 hours a day,
>> 7 days a week.  The electronics will last significantly longer
>> that way.  Most computers die as they're being powered up or down.
>> ...
>> As for the bearings in the disk drive, they won't be any worse for
>> the wear either.
>
>I don't know about that.  My disk drive specs rate my drive's life at
>11,000 hours.  I guess I kind of hoped that I could expect more than
>1 and 1/4 years out of it.
>
>As far as that old argument that the electronics don't like power cycling:
>how many terminals (as a model for PCs, one that's been around a lot longer)
>has anyone come across whose electronics have failed?  Usually they
>just get so obsolete that nobody wants to use them anymore, or the
>keyboard becomes unusable.  I think I can say that I've never seen a
>terminal that didn't work.

Hmmm... In my experience you'll get longer life from your system if you
leave it on all the time.  This comes from the 'school of hard knocks' -- I
have been through multiple failures, usually immediately after power-on.
Our office systems here are on 24 hours a day now, and we have yet to
have a failure in *ANY* component -- the only things that are ever powered
down are the monitors, and then only to prevent screen burn.

I've noticed that most drive failures occur during power-on.  We have
*never* had a customer report a failure during operation -- but we have had
several failures during power-up!

Case in point: my ST-4051.  It runs fine -- as long as you leave it running.
Turn it off, and it might not spin back up.  Defective?  Definitely!
Something that would never happen if we never turned it off?  True as well.

Case in point #2: a TRS-80 Model IV.  The unit had been in continuous
operation for about 4 years and never had a failure.  Now that we only use
it occasionally, and power it down when not in use, it has become very
flaky and sometimes refuses to work properly.

What does all this mean??  You're better off leaving the unit *ON*.  Now,
this assumes that you have adequate cooling for the system (i.e., your
office doesn't go to 90 degrees Fahrenheit during the night).
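
The cooling caveat matters because failure rates climb quickly with
temperature.  A common rule of thumb -- an approximation only, not a law --
is that the rate roughly doubles for every 10 degrees C of rise.  A quick
C sketch (the 25 C baseline is an assumption; 90 F is about 32 C):

    /* Rule-of-thumb thermal derating: failure rate roughly doubles
     * per 10 C rise above baseline.  Both the rule and the 25 C
     * baseline are approximations used only for illustration. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double base    = 25.0;                    /* assumed baseline, C */
        double temps[] = { 25.0, 32.0, 40.0 };    /* 32 C is about 90 F */

        for (int i = 0; i < 3; i++)
            printf("%4.0f C: %.1fx the baseline failure rate\n",
                   temps[i], pow(2.0, (temps[i] - base) / 10.0));
        return 0;
    }

(Compile with -lm.)  By this rule, a 90-degree office night runs at about
1.6 times the baseline failure rate.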


-- 

Karl Denninger				UUCP : ...ihnp4!ddsw1!karl
Macro Computer Solutions		Dial : +1 (312) 566-8909 (300-1200)
"Quality solutions at a fair price"	Voice: +1 (312) 566-8910 (24 hrs)

magore@watdcsu.waterloo.edu (Mike Gore, Institute Computer Research - ICR) (08/17/87)

In article <5911@ut-ngp.UUCP> ayac071@ngp.UUCP (Bill Douglass) '> ' writes:
[in reply to]
In article <202@ttrdd.UUCP> mellman@ttrdd.UUCP (Thomas Mellman) '>> ' writes:
>>As far as that old argument that the electronics don't like power cycling,
[munch ...]
>I say, let 'er run.
...
>Bill Douglass
>ayac071@ngp.UUCP

	As another poster has pointed out, the hard disk is the problem.
I have also found that a hard disk reaching its MTBF is the greatest
risk from a cost point of view...  It's more like a game of odds, taking
into account that most semiconductor failures will happen during the
first month or so of use.  In fact, many people should _want_ such
problems to show up in those first months of use, so that warranty
repairs can still be made without cost.  Thermal cycling tends to detach
bonds or metallization layers that were defective during manufacturing
early in a product's lifetime -- on average...  It's interesting to see
what happens when the fan in a power supply passes its MTBF and dies [they
must have clocks in them :)].  What happens next could be termed
"short-term burn-in" ... :)


# Mike Gore 
# Institute for Computer Research. ( watmath!mgvax!root - at home )
# These ideas/concepts do not imply views held by the University of Waterloo.

npollack@polyslo.UUCP (Neal Pollack) (08/19/87)

>In article <202@ttrdd.UUCP>, mellman@ttrdd.UUCP (Thomas Mellman) writes:
>
>As far as that old argument that the electronics don't like power cycling:
>how many terminals (as a model for PCs, one that's been around a lot longer)
>has anyone come across whose electronics have failed?  Usually they
>just get so obsolete that nobody wants to use them anymore, or the
>keyboard becomes unusable.  I think I can say that I've never seen a
>terminal that didn't work.

Surely this person does not work around a large facility :-)

We have over 100 Terminals/Workstations/Macs/PCs in our facility.
DEC VT100 terminals fail more than anything.  THE ELECTRONICS QUIT.
Specifically, the CRT driver board loves to smoke DURING SWITCH-ON.
They NEVER quit while running, but when you switch them on, watch out.
Tektronix graphics terminals flame out power supplies; XEROX 8000 Star
workstations burn up CRTs, disk drives, PC boards, and power supplies.

In short, I see a LARGE number of failures, FREQUENTLY, because of our huge
inventory of equipment.  The only terminals that seem to live forever
HERE (at our site (disclaimer)) are the HP2622 and the ADM3a.

SO, please do not tell me that terminals never fail.  I will show you
thousands of dollars' worth of repair bills.

FLAME OFF, settle down, drink gator-aid.

SUMMARY: The simple point is, equipment that we leave on continuously
has not failed.  The terminals and workstations that go on and off a lot
seem to quit regularly.

Take it for what it's worth.



-- 
| Neal S. Pollack		|    ...!ihnp4!csun!polyslo!npollack	      |
| Computer Systems Lab		|    ...!{csustan,csun,sdsu}!polyslo!npollack |
| Cal Poly State Univ.		|    ...!ucbvax!voder!polyslo!npollack	      |
| San Luis Obispo, CA  93407	|    Voice: (805)-546-2147                    |