[sci.electronics] Ethernet Cabling

dad@cs.brown.edu (David A. Durfee) (01/18/91)

I have recently gotten involved in the specification of a thin
ethernet installation.  A loose end still remains which I would
like to clear up.

When I was involved in thick ethernet installations a few years
back, I recall a recommendation to keep cable lengths in multiples
of one wavelength (approx. 76 ft) for best results.  In fact, I notice
that thick ethernet premade cables continue to be sold only in multiples
of a wavelength.
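As a sanity check on the ~76 ft figure, here is a quick calculation.  The
assumptions are mine, not from any spec I have in front of me: a 10 MHz
fundamental for the Manchester-encoded 10 Mbps signal, and a velocity
factor of roughly 0.77 for thick-Ethernet coax.

```python
# Rough check of "one wavelength ~= 76 ft" for 10 Mbps Ethernet.
# Assumed values: 10 MHz fundamental, velocity factor ~0.77 for the coax.
C = 299792458.0          # speed of light, m/s
VELOCITY_FACTOR = 0.77   # assumed fraction of c for signal propagation
FREQ = 10e6              # assumed fundamental frequency, Hz

wavelength_m = VELOCITY_FACTOR * C / FREQ
wavelength_ft = wavelength_m / 0.3048
print(round(wavelength_ft, 1))  # comes out close to 76 ft
```

So the 76 ft number is at least consistent with a one-wavelength rule of
thumb at the 10 MHz signalling rate, if the velocity factor is in that
neighborhood.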

A. Since thin ethernet is effectively the same technology, wouldn't the
   same recommendation apply?
B. Where does this recommendation come from?  

I understand so-called "microwave theory" concerning reflected waves
due to impedance mismatches to some degree.  My question is what problem
is being solved here.  It was my understanding that even multiples of
quarter wavelengths would reduce problems due to the insertion of a cable
which has a different characteristic impedance than the rest of the
net (that is because the reflections from the two ends of the cable
would cancel).  BUT reflections due to poor connectors/tees would have
less effect if odd multiples of quarter wavelengths are used (I don't
recall why).  These requirements appear to conflict.
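The even-quarter-wavelength (i.e. half-wavelength) part of this can be
illustrated with the standard lossless-line input-impedance formula: a
half-wave section presents its load impedance unchanged, while a
quarter-wave section transforms it.  The impedances below are purely
illustrative (a hypothetical 93-ohm section spliced into a 50-ohm net),
not values from any installation I know of:

```python
import cmath
import math

def zin(z0_section, z_load, beta_l):
    """Input impedance of a lossless line of characteristic impedance
    z0_section, electrical length beta_l radians, terminated in z_load."""
    t = cmath.tan(complex(beta_l, 0))
    return z0_section * (z_load + 1j * z0_section * t) / \
           (z0_section + 1j * z_load * t)

Z_NET = 50.0   # rest of the net (thin ethernet is 50-ohm coax)
Z_SEC = 93.0   # hypothetical mismatched inserted section

# Half-wave section (beta*l = pi): input impedance equals the load,
# so the mismatched section is electrically "invisible" -- the two
# end reflections cancel.
print(abs(zin(Z_SEC, Z_NET, math.pi)))

# Quarter-wave section (beta*l = pi/2): input impedance becomes
# Z_SEC**2 / Z_NET, the worst-case transformation of the mismatch.
print(abs(zin(Z_SEC, Z_NET, math.pi / 2)))
```

This only holds at the one frequency for which the section is a
half wave, which is presumably part of why the question of what the
multiples-of-a-wavelength rule actually buys you is not obvious.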

Do any of you out there really know the answer?  I've called the tech
support lines of several companies that make ethernet equipment,
but nobody was able to shed light on my questions.