[net.lan] DEC's broadband Ethernet transceivers

darrelj@sdcrdcf.UUCP (Darrel VanBuer) (01/04/85)

DEC has announced Ethernet transceivers for use on broadband networks.  They
implement the standard 10 Mbps in 18 MHz of bandwidth.  There are
configurations for either single or double cable broadband.  Maximum
distance is 3800 meters.  Available in the spring.

The bad news is that a DECOM transceiver costs $4250 (and the accessory for
single cable broadband another $4500--but I can't tell from the press
release whether that's per node or per net).  By way of comparison, baseband
Ethernet transceivers cost as little as $200.
The announcement also mentioned the existing DELNI clustering device,
which allows up to 8 stations to share one transceiver (a DELNI costs about
$2000, so it clearly pays for a group of machines within 50 meters of a
common point!).
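
To put some arithmetic behind that last claim (list prices only, and
taking at face value that eight stations really can share one of these
transceivers):

    # Rough cost comparison using the list prices quoted above:
    # eight stations each with their own DECOM broadband transceiver
    # versus eight stations sharing a single DECOM through a DELNI.
    DECOM = 4250          # broadband transceiver, per unit
    DELNI = 2000          # 8-port sharing device
    BASEBAND_XCVR = 200   # cheap baseband transceiver, for contrast

    stations = 8
    dedicated = stations * DECOM              # 8 * 4250 = 34000
    shared = DECOM + DELNI                    # 4250 + 2000 = 6250
    baseband_total = stations * BASEBAND_XCVR # 1600: already less than a DELNI
    print(dedicated, shared, baseband_total)

    # Sharing only pays when the transceiver itself is the expensive
    # part -- on cheap baseband transceivers the DELNI is marginal.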

Comments: the cost is negligible for supermini or mainframe systems, but
rather steep for machines in the PC class (it costs about $1000 to put an
IBM PC on baseband Ethernet--and that includes the Ethernet controller and
outboard transceiver; broadband costs $600 to $4000 more).
-- 
Darrel J. Van Buer, PhD
System Development Corp.
2500 Colorado Ave
Santa Monica, CA 90406
(213)820-4111 x5449
...{allegra,burdvax,cbosgd,hplabs,ihnp4,orstcs,sdcsvax,ucla-cs,akgua}
                                                            !sdcrdcf!darrelj
VANBUER@USC-ECL.ARPA

ian@mulga.OZ (Ian Richards) (01/07/85)

> DEC has announced Ethernet transceivers for use on broadband networks.  They
> implement the standard 10 Mbps in 18 MHz of bandwidth.  There are
> configurations for either single or double cable broadband.  Maximum
> distance is 3800 meters.  Available in the spring.
> Darrel J. Van Buer, PhD

Is the 3800 metres correct? If so, how do they do it?
If they are still using CSMA/CD, then the maximum distance between
nodes is limited by the contention period (46.4 microseconds), which in
turn dictates the minimum frame size (512 bits). The DEC/Xerox/Intel
"standard" suggests that after you allow for all kinds of switching
and hardware delays you have about 23 of the 46 microseconds left over
for propagation time, which at 0.7c is pretty close to 5000 metres.
This of course is the "round trip" time, which means a maximum node
separation of 2500 metres. Now if you use a broadband cable, surely
this ought to be halved again.
This is because the worst case contention problem is two nodes near
each other but furthest from the head end. One begins transmission.
Its signal propagates via the head end and all the way back until it
almost reaches the other node, which then begins a transmission. This
signal (which is now a collision) must propagate all the way via the
head end and back to the first station so it can detect the collision.
The first station must continue transmitting for all of this time.
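
To spell the arithmetic out (these are my own back-of-the-envelope
figures from above, not anything taken from the DEC announcement):

    # Budget: ~23 us of the contention period left for propagation,
    # on cable with a 0.7c velocity factor.
    C = 3.0e8              # speed of light, m/s
    velocity_factor = 0.7
    prop_budget = 23e-6    # seconds

    round_trip = prop_budget * velocity_factor * C   # ~4800 m of cable
    baseband_sep = round_trip / 2                    # ~2400 m node to node
    # On broadband the worst-case collision signal makes the trip via
    # the head end twice, so the usable radius from the head end is
    # halved again:
    broadband_radius = baseband_sep / 2              # ~1200 m to head end
    print(round_trip, baseband_sep, broadband_radius)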

Now perhaps they can avoid the double propagation delay by doing the
collision detection on the transmission channel/cable. That brings it
back to the baseband situation. (But how does a station upstream from
a transmitting station find out that if it transmits it will cause a
collision?) Perhaps they can get rid of some of those aforementioned
delays, or perhaps their signals go faster than 0.7c. I would be
interested to hear the answer if anyone knows.

Ian Richards					decvax!mulga!ian

mccallum@nbires.UUCP (Doug McCallum) (01/14/85)

	> Is the 3800 metres correct? If so, how do they do it?
I am assuming that DEC followed the proposals put forth in the IEEE 802.3
committee in their implementation of broadband CSMA/CD.
The proposed addition of an AUI Compatible Broadband to the IEEE 802.3
standard allows lengths of up to 4 km with 0.87c cable.  There are
other differences in the spec as well.
	> If they are still using CSMA/CD then the maximum distance between
	> nodes is limited by the contention period (46.4 microseconds) which in
	> turn dictates the minimum frame size (512 bits). The DEC/Xerox/Intel
CSMA/CD doesn't impose the contention period; the "Ethernet" spec does.  By
changing many of the parameters to better fit the existing broadband media,
it is possible to get a CSMA/CD system with a larger area, different data
rates, etc.  The IEEE 802.3 standard is making allowances for CSMA/CD on
different media and data rates.
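
As a rough illustration of how much the parameters matter, here is a toy
model (mine, not anything from the 802.3 drafts -- it spends the whole
slot time on propagation and ignores the fixed transceiver and repeater
delays that the real budgets subtract first, so it gives loose upper
bounds only):

    C = 3.0e8  # speed of light, m/s

    def max_round_trip_m(slot_bits, bit_rate, velocity_factor):
        """Upper bound on round-trip cable length for a CSMA/CD system,
        ignoring all fixed delays."""
        slot_time = slot_bits / bit_rate          # seconds
        return slot_time * velocity_factor * C    # metres of cable

    print(max_round_trip_m(512, 10e6, 0.70))  # ~10.8 km on 0.7c cable
    print(max_round_trip_m(512, 10e6, 0.87))  # ~13.4 km on 0.87c CATV plant
    print(max_round_trip_m(512,  5e6, 0.87))  # halve the rate, double the reach
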
	> back to the baseband situation. (But how does a station upstream from
	> a transmitting station find out that if it transmits it will cause a
	> collision?) Perhaps they can get rid of some of those aforementioned
The signal being used in the proposed broadband 10 Mbps system (proposed
by DEC, M/A-COM and possibly others) uses 3 CATV channels and splits these
into a data channel and an out-of-band channel for collision detection.
If two stations start broadcasting at the same time, a collision can be
detected by comparing what is received with what is transmitted.  The
comparison need only be done on the header through the source address.

I really don't know much more than that.  The above came from some notes
I took at the last IEEE 802 meeting.
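
To sketch the compare-on-readback idea (this fragment is mine, not
something from those notes -- it just assumes a header of a 6-byte
destination and 6-byte source address and calls anything that comes back
different a collision):

    HEADER_LEN = 6 + 6   # destination + source address

    def collided(transmitted, received):
        """A station compares what it hears back from the head end with
        what it sent; a mismatch within the header means a collision."""
        return transmitted[:HEADER_LEN] != received[:HEADER_LEN]

    sent  = bytes.fromhex("ffffffffffff" "08002b010203") + b"data"
    heard = bytes.fromhex("ffffffffffff" "08002b990203") + b"garble"
    print(collided(sent, sent))    # False: heard our own header intact
    print(collided(sent, heard))   # True:  someone else's bits mixed in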

		Doug McCallum
		{allegra,ucbvax,ut-sally}!nbires!mccallum

hes@ecsvax.UUCP (Henry Schaffer) (01/14/85)

A DEC technical support person told me that the 3800 meters
maximum distance means 1900 meters maximum from each transceiver to
the head end of the broadband cable.  --henry schaffer