[news.groups] comp.sys.next, etc.

dave@micropen (David F. Carlson) (11/02/88)

Having seen pictures of the NeXT motherboard, I am particularly bothered by
the same "mistake" that Jobs et al. made with the Mac:  no parity checking.
With 16 meg of memory and my research or corporate financials in the machine,
I would rather *know* about hard or soft memory errors than not know.  Why
would anyone trust a computer that takes the ostrich approach to fault tolerance?

Is there any good answer for omitting parity checks on memory?



-- 
David F. Carlson, Micropen, Inc.
micropen!dave@ee.rochester.edu

"The faster I go, the behinder I get." --Lewis Carroll

jhh@ihlpl.ATT.COM (Haller) (11/02/88)

In article <575@micropen>, dave@micropen (David F. Carlson) writes:
> 
> Is there any good answer for omitting parity checks on memory?
> 

The MTBF of a machine without parity is larger than that of a machine with
parity.  The additional parts all have non-zero failure rates, yet parity
adds nothing to the reliability of the system, unlike EDC (error detection
and correction), which allows one to continue using the computer even in
the face of a single-bit memory failure.  Parity circuitry, in the simplest
designs, also slows down access to memory.
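A rough back-of-the-envelope version of that parts-count argument, in C.
The per-chip failure rate and the 1Mbit-by-1 DRAM organization are assumed
figures for illustration, not NeXT's actual numbers.

    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures, for illustration only. */
        double lambda = 1.0e-6;  /* failures/hour per DRAM chip (assumed) */
        int data_chips = 128;    /* 16 MB built from 1Mbit x 1 DRAMs */
        int parity_chips = 16;   /* one extra chip per eight, 128/8 */

        /* Chips in series: failure rates add, so MTBF = 1/(n * lambda). */
        double mtbf_plain  = 1.0 / (data_chips * lambda);
        double mtbf_parity = 1.0 / ((data_chips + parity_chips) * lambda);

        printf("MTBF without parity: %.0f hours\n", mtbf_plain);
        printf("MTBF with parity:    %.0f hours\n", mtbf_parity);
        return 0;
    }

With those assumed numbers, the 16 parity chips cut the memory array's MTBF
from about 7800 hours to about 6900, roughly an 11 percent loss, and that is
before counting the parity generator/checker logic itself.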

So what do you gain with parity?  A more timely notification that something
in memory is incorrect.  That lets you place more trust in the results the
computer produces, but it does not increase reliability at all.
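To make "notification" concrete, here is a sketch of even parity over one
byte in C; it shows why parity can flag a single-bit error but has no way
to repair it.  The bit pattern and the flipped bit are arbitrary choices
for the example.

    #include <stdio.h>

    /* Even parity: the stored check bit makes the total number of 1-bits
       (data plus check bit) even.  XOR-folding counts 1-bits mod 2. */
    unsigned parity_bit(unsigned char b)
    {
        b ^= b >> 4;
        b ^= b >> 2;
        b ^= b >> 1;
        return b & 1;
    }

    int main(void)
    {
        unsigned char data = 0x5A;           /* value written to memory */
        unsigned stored = parity_bit(data);  /* check bit stored with it */

        data ^= 0x08;                        /* one bit flips: soft error */

        if (parity_bit(data) != stored)
            printf("parity error: the byte is bad, but which bit?\n");

        /* Two flipped bits cancel and go undetected; EDC spends more
           check bits to locate, and so correct, a single-bit error. */
        return 0;
    }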

John Haller att!ihlpl!jhh  or jhh@ihlpl.att.com