[comp.dcom.modems] What is special about "AT"

shaw@paralogics.UUCP (Guy Shaw) (02/04/91)

In article <3763.27aaed7c@hayes.uucp> tnixon@hayes.uucp writes:

> [. . .]                     But even the specifics related to 
>detecting the speed and format from the "AT" predated Hayes' first 
>Smartmodem. [. . .]

I am curious - is there some property of the prefix "AT" that makes it
more suitable for the automatic detection of settings than other prefixes?
By settings, I mean such things as speed, parity, character size,
start/stop bits, etc.

I suppose that the completely naive approach is to keep cycling through
settings until you see a proper 'A'.  But then it would not matter what
prefix was chosen, as long as it is agreed upon in advance.  But, if I
understand correctly, some characters are better than others because
they produce a greater number of distinct results for all combinations
of sending and listening settings, and this information can be used to
quickly narrow down the number of possible settings to try when listening
for the next character.

Why did TIPs on the Arpanet use "@"?
Why did Hayes (or whoever it might have been before them) choose "AT"?

Is there any literature on this subject?  Or, is this really just one
of those black arts that gets passed down informally from mentor to
acolyte, or, even worse, gets inherited in the form of code and tables
in ancient source code that works but no one dares tamper with?
If the answer is RTFM, please do tell me which FM.  I have a few books
on communications, but this does not seem to be one of the hot topics.
I have not seen it discussed in this newsgroup, but then, I have been
reading this newsgroup for only about a year.
--
Guy Shaw
Paralogics
paralogics!shaw@uunet.uu.net  or  uunet!paralogics!shaw

tnixon@hayes.uucp (02/05/91)

In article <424@paralogics.UUCP>, shaw@paralogics.UUCP (Guy Shaw) writes:

> I am curious - is there some property of the prefix "AT" that makes it
> more suitable for the automatic detection of settings than other prefixes?
> By settings, I mean such things as speed, parity, character size,
> start/stop bits, etc.

Actually, there are any number of character pairs that would work 
equally well.  The key characteristics are (a) the low-order bit of 
the first character must be "1", so that it is possible to measure 
the duration of the start bit; and (b) the second character must 
have a different parity from the first, so that the parity bits of 
the two characters can be examined to determine the parity being 
used by the DTE: even (if both characters have even parity), odd 
(if both have odd parity), zero (if both parity bits are zero), or 
mark/none (if both parity bits are 1).  Note that the "A/" sequence 
to repeat a command also has these characteristics, as does the 
lower-case "at" sequence, which is also acceptable.  Mixed-case "aT" 
or "At" is not acceptable, because the numbers of 1 bits in the two 
characters have the same parity, making it impossible to accurately 
determine the parity in use.
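
In rough C, the parity deduction might look like this (a sketch 
only, not Hayes firmware; the names are invented, and it assumes 
each octet arrives with its parity bit in bit 7):

    #include <stdio.h>

    enum dte_parity { EVEN, ODD, SPACE, MARK_OR_NONE };

    /* Deduce the DTE's parity from the raw "A" and "T" octets.
       The mapping is specific to A/T: 'A' has an even count of
       1 bits in its data bits, 'T' an odd count. */
    static enum dte_parity deduce_parity(unsigned char a,
                                         unsigned char t)
    {
        int pa = (a >> 7) & 1;        /* parity bit of the 'A' */
        int pt = (t >> 7) & 1;        /* parity bit of the 'T' */

        if (pa == 0 && pt == 1) return EVEN;
        if (pa == 1 && pt == 0) return ODD;
        if (pa == 0 && pt == 0) return SPACE;
        return MARK_OR_NONE;          /* both parity bits are 1 */
    }

    int main(void)
    {
        /* "AT" sent with even parity: 'A' = 0x41, 'T' = 0xd4 */
        printf("%d\n", deduce_parity(0x41, 0xd4));   /* 0 = EVEN */
        return 0;
    }

Feed it "AT" as sent with odd parity (0xc1, 0x54) and it returns 
ODD, and likewise for the other two cases.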

> I suppose that the completely naive approach is to keep cycling through
> settings until you see a proper 'A'.  

Most modem implementations do not cycle through speeds, but use a 
one-shot timer to actually measure the duration of the start bit 
of the "A" to determine the data rate.  The remaining bits are then 
clocked in, and checked to see if they are an "A" or "a"; if not, 
the character is ignored.  If so, what is often done today is to set 
up a UART with 8/N/1 format at the detected speed, in order to read 
in the remainder of the command line (i.e., the processor-intensive 
mechanism of clocking in individual bits is avoided as much as 
possible, especially when in online command state and trying to keep 
the carrier and error control protocol active in the background).
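
In skeletal C, the timing step might look like this (read_line() 
and microseconds() are invented stand-ins for a hardware timer 
capture; real firmware would do this at interrupt level):

    extern int  read_line(void);     /* invented: sample RXD; 1=mark */
    extern long microseconds(void);  /* invented: free-running clock */

    struct rate { long bps; long usec; };     /* nominal bit width */

    static const struct rate rates[] = {
        {  300, 3333 },
        { 1200,  833 },
        { 2400,  417 },
        { 9600,  104 },
    };

    long detect_rate(void)
    {
        long t0, width, lo, hi;
        int i, n = (int)(sizeof rates / sizeof rates[0]);

        while (read_line() == 1)     /* idle line is mark; wait for */
            ;                        /* the edge of the start bit   */
        t0 = microseconds();
        while (read_line() == 0)     /* time the space run: exactly */
            ;                        /* one bit, since bit 0 of the */
        width = microseconds() - t0; /* 'A' is a 1 and ends the run */

        for (i = 0; i < n; i++) {    /* match with rough tolerance  */
            lo = rates[i].usec * 3 / 4;
            hi = rates[i].usec * 3 / 2;
            if (width > lo && width < hi)
                return rates[i].bps;
        }
        return 0;                    /* no standard rate recognized */
    }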

> Why did TIPs on the Arpanet use "@"?

I really don't know why, but I imagine it is because among graphical 
7-bit characters which can be produced on a standard terminal 
keyboard, only "@" has all zero bits in the lower-order positions.  
This may be easier to use to detect the rate when you DO have to 
cycle through data rates to try to get a valid character.
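
A hunt loop of that sort might be sketched like this (not any real 
TIP's code; the uart_* calls are invented for the example):

    extern void uart_set_rate(long bps);  /* invented primitives */
    extern void uart_flush(void);
    extern int  uart_read_byte(void);     /* blocking 7-bit read */

    static const long try_rates[] = { 110, 300, 1200, 2400, 9600 };

    /* Re-program the UART at each candidate rate in turn until the
       target character (e.g. '@' or CR) comes through intact. */
    long hunt_rate(unsigned char target)
    {
        int i, n = (int)(sizeof try_rates / sizeof try_rates[0]);

        for (;;)
            for (i = 0; i < n; i++) {
                uart_set_rate(try_rates[i]);
                uart_flush();
                if (uart_read_byte() == target)
                    return try_rates[i];
            }
    }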

> Why did Hayes (or whoever it might have been before them) choose "AT"?

My understanding is that it was chosen from among the multitude of 
other possibilities because it could be considered an abbreviation 
for "attention".

> Is there any literature on this subject?  Or, is this really just one
> of those black arts that gets passed down informally from mentor to
> acolyte, or, even worse, gets inherited in the form of code and tables
> in ancient source code that works but no one dares tamper with?
> If the answer is RTFM, please do tell me which FM.  I have a few books
> on communications, but this does not seem to be one of the hot topics.
> I have not seen it discussed in this newsgroup, but then, I have been
> reading this newsgroup for only about a year.

You probably wouldn't find a discussion like this in a general book 
on communications.  You MIGHT find it in a book on modem design, 
such as John Bingham's book (I can't remember the exact title; 
something like "Theory and Practice of Modem Design") -- but even 
then I think his book dwells more on modulation than controller 
firmware design.  At Hayes, this technology is preserved and 
protected by the Systems Engineering department (where I work) and 
promulgated through internal standards documents and training 
courses for engineers.

	-- Toby

-- 
Toby Nixon, Principal Engineer    | Voice   +1-404-449-8791  Telex 151243420
Hayes Microcomputer Products Inc. | Fax     +1-404-447-0178  CIS   70271,404
P.O. Box 105203                   | UUCP uunet!hayes!tnixon  AT&T    !tnixon
Atlanta, Georgia  30348  USA      | Internet       hayes!tnixon@uunet.uu.net

marc@aria.ascend.com (Marco S Hyman) (02/05/91)

In article <424@paralogics.UUCP> shaw@paralogics.UUCP (Guy Shaw) writes:
    I am curious - is there some property of the prefix "AT" that makes it
    more suitable for the automatic detection of settings than other prefixes?
    By settings, I mean such things as speed, parity, character size,
    start/stop bits, etc.

There are many pairs that are as good as AT.  What you are looking for is a
character pair where the two characters have different parity and the first
bit transmitted of the first character (the low-order bit of the "A") is a
one bit.  This allows the receiver to autobaud by measuring the length of
the start bit (a zero bit) and to determine parity (odd, even, mark, or
space) by comparing the high-order (parity) bits of the two characters.
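
You can check any candidate pair against those two rules in a few 
lines of C (an invented helper, just for illustration):

    #include <stdio.h>

    static int ones(unsigned char c)   /* 1 bits in the 7 data bits */
    {
        int n = 0;
        for (c &= 0x7f; c; c >>= 1)
            n += c & 1;
        return n;
    }

    static int pair_ok(unsigned char first, unsigned char second)
    {
        return (first & 1)                    /* low bit must be 1 */
            && (ones(first) & 1) != (ones(second) & 1);
    }

    int main(void)
    {
        printf("AT %d  at %d  A/ %d  aT %d  At %d\n",
               pair_ok('A', 'T'), pair_ok('a', 't'),
               pair_ok('A', '/'), pair_ok('a', 'T'),
               pair_ok('A', 't'));
        /* prints: AT 1  at 1  A/ 1  aT 0  At 0 */
        return 0;
    }

("At" fails even though the bit counts differ, 2 vs. 4, because 
both counts are even.)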

Why AT? Maybe Toby knows.  (ATtention, ATlanta, ???)

// marc
-- 
// work: marc@ascend.com		uunet!aria!marc
// home: marc@dumbcat.sf.ca.us		{ames,sun}!pacbell!dumbcat!marc    

vjs@rhyolite.wpd.sgi.com (Vernon Schryver) (02/06/91)

In article <3768.27ad9916@hayes.uucp>, tnixon@hayes.uucp writes:
> 
> > Why did TIPs on the Arpanet use "@"?
> 
> I really don't know why, but I imagine it is because among graphical 
> 7-bit characters which can be produced on a standard terminal 
> keyboard, only "@" has all zero bits in the lower-order positions.  
> This may be easier to use to detect the rate when you DO have to 
> cycle through data rates to try to get a valid character.

My recollection of those ancient days is that you used CR or LF ("NEWLINE"
in the newspeak of the era) to autobaud to the TIP.  (Real autobaud from
110 up, not the sad imitation in modern modems or the total miss in
UNIX getty.  It's a good thing hosts and terminals & modems are running
with "locked" speeds in these degenerate days).  "@" was used only after
you got the link to the TIP working, for example when you wanted to tell
the TIP to drop a connection.  Or at least that's my fading recollection.

The additional 1 bits in CR can make CR easier than "@" for recognizing
the correct speed; at least that was so on other systems of the era whose
innards suffered my attentions.

In other words, "@" was to "+++", as CR or LF was to "AT" or BREAK.

My ignorant guess is that "@" was not commonly used in text, was available
on all common terminals (model 33 and 35 ttys as well as 37s and the novel
glass ttys), and had some modest mnemonic content.  It seems likely that
similar reasoning was behind the choice of "+++".



Vernon Schryver,   vjs@sgi.com