david@elroy.Jpl.Nasa.Gov (David Robinson) (11/09/87)
As most people know, there are a couple of Ethernet standards: Version 1,
Version 2, and IEEE 802.3.  When you install a new Ethernet board, you must
have a transceiver whose level matches the type the board supports;
fortunately, a lot of modern boards have jumpers to support different
versions.  I have found that you can run both Version 1 and Version 2
transceivers on the same cable.

The question: What is the difference between the versions, and what effect
does running two different types of transceivers have on the physical wire?
I have heard people say that it is best to have all the same type, but with
no "real" evidence to base that comment on.  Could someone mail me either a
description of the differences or pointers to the available documentation
that would answer these questions?  As always, if there is enough interest
I will summarize to the net.

--
David Robinson     elroy!david@csvax.caltech.edu    ARPA
                   david@elroy.jpl.nasa.gov
                   ames!elroy!david                 UUCP
Disclaimer: No one listens to me anyway!
nguyen@amd.AMD.COM (Quinn Nguyen) (11/11/87)
In article <4800@elroy.Jpl.Nasa.Gov>, david@elroy.Jpl.Nasa.Gov (David Robinson) writes:
> As most people know, there are a couple of Ethernet standards: Version 1,
> Version 2, and IEEE 802.3.  When you install a new Ethernet board
> ...
> The question: What is the difference between the versions, and what effect
> does running two different types of transceivers have on the physical
> wire?  I have heard people say that it is best to have all the same type,
> but with no "real" evidence to base that comment on.

Ethernet Version 2 and ANSI/IEEE 802.3 (10BASE5) signals are physically the
same.  Version 1 and 2 signals are the same in the coax.  The main
differences are on the drop cable:

  - SQE (Signal Quality Error) generation after a packet transmission is
    required for Version 2 and 802.3, but not for Version 1.

  - A half-step idle signal (differentially 0) with transformer isolation is
    required for Version 2 and 802.3, but not for Version 1.

If a MAU (transceiver) does not generate SQE and only accepts and generates
full-step signals (Version 1), it may have problems connecting to a
Version 2 DTE, or vice versa.  Other issues are device-to-device line static
isolation, grounding, etc., which are not relevant to transceiver connection
in terms of functionality.

Hope this may help.
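To put that compatibility rule in concrete terms, here is a minimal C
sketch.  It is purely illustrative and not from any spec or from Quinn's
post; the enum, struct, and function names are all invented.  It encodes
the two drop-cable differences per version and flags likely mismatches:

    #include <stdbool.h>
    #include <stdio.h>

    enum version { V1, V2, IEEE_802_3 };

    struct mau_traits {
        bool generates_sqe;   /* SQE test after each transmission */
        bool half_step_idle;  /* differential-0 idle, transformer isolated */
    };

    static const struct mau_traits traits[] = {
        [V1]         = { false, false },  /* full step, no SQE */
        [V2]         = { true,  true  },
        [IEEE_802_3] = { true,  true  },
    };

    /* Per the post above, a DTE built for one version may misbehave
     * when its MAU's SQE or idle-signal behavior differs. */
    static bool likely_compatible(enum version dte, enum version mau)
    {
        return traits[dte].generates_sqe == traits[mau].generates_sqe
            && traits[dte].half_step_idle == traits[mau].half_step_idle;
    }

    int main(void)
    {
        printf("V2 DTE on V1 MAU:    %s\n",
               likely_compatible(V2, V1) ? "ok" : "possible trouble");
        printf("802.3 DTE on V2 MAU: %s\n",
               likely_compatible(IEEE_802_3, V2) ? "ok" : "possible trouble");
        return 0;
    }

Running it reports "possible trouble" for a V2 DTE on a V1 MAU and "ok"
for an 802.3 DTE on a V2 MAU, matching the caveat above.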
gc@bnl.ARPA (Graham Campbell) (11/13/87)
In article <4652@amd.AMD.COM> nguyen@amd.AMD.COM (Quinn Nguyen) writes:
>In article <4800@elroy.Jpl.Nasa.Gov>, david@elroy.Jpl.Nasa.Gov (David Robinson) writes:
>> As most people know, there are a couple of Ethernet standards: Version 1,
>> Version 2, and IEEE 802.3.  When you install a new Ethernet board
>> ...
>> The question: What is the difference between the versions, and what
> ...
>Other issues are device-to-device line static isolation, grounding, etc.,
>which are not relevant to transceiver connection in terms of functionality.

However, if you interpret "functionality" to include "does it function",
then the other issues are very relevant.  We have had the experience where
an IEEE 802.3 transceiver would work but a Version 2 transceiver would not.
The difference apparently was in the shielding and the extra pins used in
the connectors for the shielding.  From my description you can tell that I
do not understand the problem very well, and I would appreciate a reference
to the complete differences (if it exists).

Graham
--
Graham Campbell (gc@bnl.arpa, gc@bnl.bitnet, ...!phillabs!sbcs!bnl!gc)
ron@TOPAZ.RUTGERS.EDU (Ron Natalie) (11/14/87)
On the coax there is no difference electrically between Version I,
Version II, and IEEE 802.3.  There is an encoding difference in the frame:
802.3 uses the two bytes following the source address as a length field,
while the older Ethernet standards use them as a type field for determining
what protocol to use for the rest of the packet.  Most IP networks these
days are constructed using the old Ethernet interpretation, regardless of
what kind of transceiver they use.

The difference between the Version I transceiver and the Version II is the
presence of the so-called "heartbeat" signal, or SQE.  What this does is
blip the collision-detect line after each transmission.  This is added
protection for detecting broken transceivers and cabling that may be
jabbering on the net.  The IEEE 802.3 transceiver is similar to the
Version II transceiver, but has one additional signal state on the
collision-detect line for something like "MAU not ready" (MAU is what they
call the transceiver).  I'm not sure what anybody does with this (if
anything).

Of course, as stated earlier, the various standards call for different
sizes of conductors and grounding considerations, although the essential
signal conductors are the same.

-Ron
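To make the type-versus-length distinction concrete, here is a small C
sketch (my own illustration, not Ron's; the function name is made up).
Legal 802.3 data lengths never exceed 1500 bytes, so type codes such as
0x0800 for IP are chosen above that range, which is what lets the two
framings coexist on one cable:

    #include <stdint.h>
    #include <stdio.h>

    #define ETHERTYPE_IP 0x0800  /* IP type code, old Ethernet framing */

    /* Returns 1 for old-style Ethernet (type field), 0 for 802.3
     * (length field).  Bytes 12-13 follow the 6-byte destination and
     * source addresses and are big-endian on the wire. */
    static int uses_type_field(const uint8_t *frame)
    {
        uint16_t field = ((uint16_t)frame[12] << 8) | frame[13];
        return field > 1500;  /* larger than any legal 802.3 length */
    }

    int main(void)
    {
        uint8_t ip_frame[64]   = { [12] = ETHERTYPE_IP >> 8,
                                   [13] = ETHERTYPE_IP & 0xff };
        uint8_t dot3_frame[64] = { [12] = 0x00, [13] = 0x2e }; /* len 46 */

        printf("ip_frame:   %s\n",
               uses_type_field(ip_frame) ? "type" : "length");
        printf("dot3_frame: %s\n",
               uses_type_field(dot3_frame) ? "type" : "length");
        return 0;
    }

This is why, as Ron notes, IP networks could keep the old interpretation
regardless of transceiver type: the choice is made per frame in the
driver, not on the wire.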
rpw3@amdcad.UUCP (11/19/87)
In article <8711141736.AA21062@topaz.rutgers.edu> Ron Natalie writes:
+---------------
| On the coax there is no difference electrically between Version I,
| Version II, and IEEE 802.3.
+---------------

Weeeelll... almost.  There WAS a teensy change in the electrical spec on
the coax between Ethernet Version 1.0 (Sep '80) and Version 2.0 (Nov '82),
having to do with tightening the specs on the drive current (or at least
changing the way the A.C. versus D.C. components were specified).  See
Section 7.3.2 "Coaxial Cable Signaling" in each version (p. 61 in Ver 1.0,
p. 72 in Ver 2.0).  The net effect was to change the shape of the AC/DC
schmoo slightly.  Very slightly.

There IS one significant change to that section in the 2.0 spec.  The
following sentence is added:

    "The transceiver shall be able to produce its specified output
    current onto the coaxial cable with at least one other transceiver
    transmitting simultaneously."

That sentence made it official that receiver-based collision detection
shall be possible, by requiring that the current source in a transceiver's
transmitter not wimp out until the cable voltage was AT LEAST twice the
normal max peak voltage.  All practical "current sources" have a "maximum
compliance voltage" above which they quit being true current sources.  (A
"perfect" current source would increase its voltage without limit, even to
the point of arcing over if you tried to disconnect it!)  All of the
current sources in the popular Version 1.0 transceivers had plenty of
compliance; the 2.0 spec just made it official.

Why all the trouble?  Well, if you are going to build a repeater, it's
important that you be able to create "carrier" on the "other" side of the
repeater whenever you see carrier on "this" side (or a "jam" on the other
side whenever you see "collision" on this side).  But in the case of
several transceivers transmitting at once, the current sources will
saturate and all the A.C. signal will disappear into the large D.C.
offset.  It is important that this not happen at a voltage lower than the
repeater could reliably detect as a collision when it itself was not
transmitting.

Furthermore, I've been told that the tightening of the A.C. versus D.C.
specs I mentioned above helped solve a possible ambiguity: in the case
where a repeater is at one end of a maximally-loaded cable and there is a
collision between two wimpy transmitters at the far end, the tightened
spec plus the tightened voltage compliance guaranteed that the repeater
would see it as a collision, and not interpret it as a single nearby
macho transmitter.

Again, it's no big deal.  All (?) of the 1.0 vendors' transceivers worked
(and work) just fine on a mixed 1.0/2.0/IEEE 802.3 cable.  It just needed
to be said explicitly in the spec.

Rob Warnock
Systems Architecture Consultant

UUCP:    {amdcad,fortune,sun,attmail}!redwood!rpw3
ATTmail: !rpw3
DDD:     (415) 572-2607
USPS:    627 26th Ave, San Mateo, CA 94403
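The saturation point can be checked with round numbers.  The following C
sketch is my own back-of-the-envelope illustration: the -41 mA average
drive current per transmitter and the -1.5 V collision threshold are
nominal figures, not exact spec limits, and the doubly-terminated 50-ohm
coax is modeled as a 25-ohm load:

    #include <stdio.h>

    int main(void)
    {
        const double i_avg       = -0.041; /* nominal avg drive current, A */
        const double r_eff       = 25.0;   /* two 50-ohm terminators in parallel */
        const double v_threshold = -1.5;   /* rough collision-detect level, V */

        /* Each simultaneous transmitter adds its average current into
         * the same 25-ohm load, pushing the DC level more negative. */
        for (int n = 1; n <= 3; n++) {
            double v_dc = n * i_avg * r_eff;
            printf("%d transmitter(s): %+.2f V DC -> %s\n",
                   n, v_dc, v_dc < v_threshold ? "collision" : "no collision");
        }
        return 0;
    }

One transmitter sits near -1 V; two push past -2 V.  The point of the 2.0
compliance wording is that the second transmitter's current actually shows
up on the cable: if the sources saturated near -1 V, the DC level would
never cross the threshold and receiver-based collision detection would
fail.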
raj@limbo.uci.EDU.UUCP (11/20/87)
I got this from the little booklet that comes with our ST-500 transceivers
that we bought from Cabletron.  (Cute: each one comes with a really
complete book telling all sorts of things about transceivers and such.)
We always make sure our transceiver cables have pin 1 connected to ground.
(Our Computing Facility used to connect pin 4 to ground to agree with
802.3, but then the cables couldn't be used on V2.0, so we always get
pin 1 as ground now.)  We've used the same transceiver cables on V1.0,
V2.0, and 802.3.  We have mixtures of all versions on the same Ethernet
with no problems, although we're now trying to always go with 802.3 in
the future.  Hope all of this helps.

                 V1.0               V2.0               IEEE 802.3
-----------------------------------------------------------------------
Transceiver      (3) 22 AWG pairs   (4) 20 AWG pairs   (4) 20 AWG
cable            (1) 20 AWG inner   Inner & outer      pairs & outer
                 shield             shield common      shield
                 common at          at backshell       Inner & outer
                 backshell and      and pin 1          shield isolated
                 pin 1                                 from each other
                                                       Outer shield at
                                                       backshell, inside
                                                       at pin 4
                                                       Indented male
                                                       connector for
                                                       better electrical
                                                       connection.

Transceiver      Full step          Half step          Half step
                 No heartbeat       Heartbeat          Heartbeat (SQE)

Grounding        Pin 1              Pin 1              Pin 1, 4, 11 & 14
                                                       Ground indents on
                                                       male connector.

Jabber latching  No requirements    No requirements    Redundant collision
Repeater                                               protection using
                                                       jam sequence;
                                                       segments excessive-
                                                       collision segment
                                                       from network.

Vendors          Xerox, U-B, DEC,   U-B, 3COM, DEC,    Gould, Harris,
                 Cabletron          HP, Xerox,         Micom-Interlan,
                                    Cabletron          Intergraph,
                                                       Cabletron
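For anyone scripting a cable audit, the booklet's table can be captured
in a small lookup.  This C sketch is just one possible encoding of the
rows above (the struct and field names are mine, not Cabletron's):

    #include <stdio.h>

    struct enet_version {
        const char *name;
        const char *signaling;   /* idle signaling on the drop cable */
        int has_heartbeat;       /* SQE test after transmission */
        const char *ground_pins; /* drop-cable pins tied to ground */
    };

    static const struct enet_version versions[] = {
        { "V1.0",       "full step", 0, "1" },
        { "V2.0",       "half step", 1, "1" },
        { "IEEE 802.3", "half step", 1, "1, 4, 11 & 14" },
    };

    int main(void)
    {
        for (size_t i = 0; i < sizeof versions / sizeof versions[0]; i++)
            printf("%-10s  %-9s  heartbeat=%-3s  ground pins: %s\n",
                   versions[i].name, versions[i].signaling,
                   versions[i].has_heartbeat ? "yes" : "no",
                   versions[i].ground_pins);
        return 0;
    }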
kermit@BRL.ARPA (Chuck Kennedy) (11/21/87)
Here are the appropriate references for anyone that's interested:

The Ethernet: A Local Area Network: Data Link Layer and Physical Layer
Specifications.  DEC, Intel, and Xerox Corporations, 1980.

The Ethernet: A Local Area Network: Data Link Layer and Physical Layer
Specifications, Version 2.0.  DEC, Intel, and Xerox Corporations, 1982.

Carrier Sense Multiple Access with Collision Detection (CSMA/CD) Access
Method and Physical Layer Specifications (Standard 802.3-1985 /
International Standard 8802/3).  The Institute of Electrical and
Electronics Engineers, Inc., 1985.