[net.micro] 1200 baud modem problems with break

msc@qubix.UUCP (Mark Callow) (01/17/84)

I do not understand what the modem has to do with the ability or
inability to send a break.  A break, as its name implies, is a
long pulse (at least .25 seconds) on the data line.  It looks
the same as a momentary physical break in the line.

As long as your computer or terminal can generate a break
there shouldn't be any problem in it being passed through
the modem.

There is no such thing as a break character.  If such a thing
existed, it would be a bit pattern to be transmitted just like 'a'
or '0' etc.
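To put some numbers on why a break cannot be a character (my arithmetic, not from the post above): a 0.25-second break spans far more bit times than any character frame, at any common speed.

```python
# Sketch: a break is a line condition measured in wall-clock time,
# while a character is a fixed frame of bit times.  Illustrative
# arithmetic only.

def bit_times(duration_s, baud):
    """Number of bit periods that fit in `duration_s` at `baud`."""
    return duration_s * baud

break_s = 0.25                  # minimum break length cited above
frame_bits = 1 + 8 + 2          # start + 8 data + 2 stop bits

for baud in (110, 300, 1200):
    span = bit_times(break_s, baud)
    print(f"{baud:5d} baud: break spans {span:.0f} bit times "
          f"(longest frame is {frame_bits})")
```

Even at 110 baud the break is more than twice the longest frame; no bit pattern of character length can reproduce it.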
-- 
From the Tardis of Mark Callow
msc@qubix.UUCP,  decwrl!qubix!msc@Berkeley.ARPA
...{decvax,ucbvax,ihnp4}!decwrl!qubix!msc, ...{ittvax,amd70}!qubix!msc

smh@mit-eddie.UUCP (Steven M. Haflich) (01/18/84)

	I do not understand what the modem has to do with the ability or
	inability to send a break.  A break, as its name implies, is a long
	pulse (at least .25 seconds) on the data line.  It looks the same as a
	momentary physical break in the line.

	As long as your computer or terminal can generate a break there
	shouldn't be any problem in it being passed through the modem.

True, one imagines that a modem simply and instantaneously translates
the state of the RS232 Transmitted Data line (a digital signal) to a
frequency-encoded (analog) audio signal quite independently of the
frequency of bit transitions, and any modem which does this ought to be
able to send BREAK.  Many 300 baud modems indeed work this way.  Notice
that specs for 300 baud modems usually specify baudrate as 0-300 baud.
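One way to picture that "works down to 0 baud" property is a toy model of 103-style FSK (a sketch using the Bell 103 originate-side frequencies; not any particular modem's circuit):

```python
# Sketch of 103-style FSK: the transmitted tone tracks the state of
# the Transmitted Data line instant by instant, with no memory of
# bit boundaries -- so a line held at SPACE (a break) is just a
# steady 1070 Hz tone, valid at any speed including zero baud.

MARK_HZ, SPACE_HZ = 1270, 1070      # Bell 103 originate frequencies

def fsk_tone(line_state):
    """Map the instantaneous RS-232 line state to a tone frequency."""
    return MARK_HZ if line_state else SPACE_HZ

# An idle line, a break, and a character all reduce to a frequency
# sequence, here sampled once per bit time:
idle = [fsk_tone(1)] * 4            # steady MARK
brk  = [fsk_tone(0)] * 4            # steady SPACE: a break
char = [fsk_tone(b) for b in (0, 1, 0, 0, 0, 0, 0, 1, 0, 1)]
```

Because the encoder is memoryless, a long SPACE is no harder to send than a long MARK; that symmetry is exactly what the 1200-baud filtering described below does not guarantee.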

Getting 1200 baud down a voice-grade line is entirely another matter.
Very sophisticated filtering and phase-manipulation techniques are
required both sending and receiving -- those messy analog circuits are
why 1200 baud modems have been so expensive.  Basically, the modem
transmitter is a filter which converts Transmitted Data into control
voltage(s) for an oscillator.  This filter is very frequency dependent
and is tuned to transitions at 1200 baud.  (Ever notice that the
baudrate specification for dual-speed modems is often given as "0-300,
1200 baud"?  The Racal Vadic 3451 specs in front of me give:  103 mode:
0-300 baud; 212 mode: 1182-1212 baud.)  Anyway, this filter can do
unanticipated things when presented with a non-1200-baud signal.
Clearly a modem must operate reasonably for arbitrarily long MARKs (idle
state) but it is entirely possible that this filter might not pass a
long SPACE -- which, as has been pointed out, translates into the BREAK
"character".

The filtering on most modems does indeed seem able to pass long MARKs.
Although I am not familiar with the modems in question, I suspect the
nature of the filtering used to achieve low cost may be the reason for
inability to send BREAK.

Steve Haflich, MIT

david@intelca.UUCP (01/19/84)

I already said this once, but...

What Mark Callow is saying is true enough for an asynchronous (e.g. 103)
modem, but not for your average synchronous modem (e.g. 212).  A synchronous
modem receives characters asynchronously from the terminal/host, then
synchronously reserializes them for modulation purposes (PSK for 212).
A break is clearly a special case which must be explicitly considered by
the modem designer...
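A toy model of that reserialization step may make the special case obvious (my sketch, not the 212's actual logic): the modem recovers whole characters from the async side, then re-emits them back-to-back as framed bits on its own synchronous clock.

```python
# Toy model of async-to-sync reserialization in a 212-type modem.
# Characters arrive asynchronously, are buffered whole, and are then
# re-emitted as start/data/stop frames on the modem's own clock.

def frame(byte):
    """One async frame: start bit (0), 8 data bits LSB first, stop bit (1)."""
    data = [(byte >> i) & 1 for i in range(8)]
    return [0] + data + [1]

def reserialize(chars, idle_bits=0):
    """Concatenate frames into a synchronous bit stream, padding
    with MARK (1) when the character buffer runs dry."""
    stream = []
    for c in chars:
        stream.extend(frame(c))
    stream.extend([1] * idle_bits)   # idle line is steady MARK
    return stream

# A break -- SPACE held longer than any frame -- has no character
# code, so nothing in this scheme can produce it; the designer must
# add an explicit escape for it.
```

Note that every path through the model emits a stop bit (MARK) after at most nine SPACEs, which is precisely why break must be handled explicitly.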

-- 
David DiGiacomo, Intel, Santa Clara, CA
{pur-ee,hplabs,ucbvax!amd70,ogcvax!omsvax}!intelca!david

borg@inuxh.UUCP (01/20/84)

I believe that a "break" is defined in CCITT standard V.22 as 23 bit
times of 'mark' followed by 20 bit times of 'space'.  Note that 'mark'
and 'space' refer to the -3 to -20 or +3 to +20 volt signals, respectively.
This definition is adequate for all speeds, since it is defined
in terms of bit duration.
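Since the definition is in bit times, the wall-clock length of such a break scales with speed (my arithmetic, taking the 23-mark/20-space figures above at face value):

```python
# Arithmetic on a bit-time-based break definition: the same bit
# counts give different wall-clock durations at different speeds.
# Uses the 23-mark / 20-space figures quoted above at face value.

MARK_BITS, SPACE_BITS = 23, 20

def break_ms(baud, bits):
    """Duration in milliseconds of `bits` bit times at `baud`."""
    return 1000.0 * bits / baud

for baud in (300, 1200):
    print(f"{baud:4d} baud: {break_ms(baud, SPACE_BITS):.1f} ms of SPACE "
          f"after {break_ms(baud, MARK_BITS):.1f} ms of MARK")
```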

borg@inuxh.UUCP (01/20/84)

Steve,
	The 212A standard is not frequency, but PHASE encoded.  I'm not
certain of the exact implementation of the 212A from what used to be the
Bell System, but the received async data is buffered and reserialized
into a SYNCHRONOUS bit stream.  I don't think the inability of a particular
212 implementation to transmit a break is filter related.  It's
most likely a problem in the digital circuitry used to deserialize the
incoming async data and generate the synchronous equivalent.
That is also the reason for the bit rate specs (1187 to 1212 baud?).
The clocks used for deserialization must be set to something, and for
any clock value there is a limit to the range of speeds that will be correctly
received. (Try it on your own system.  Change the UART clock from the
normal value to something higher and lower.)  The 212A also does some
tricky stop bit deletion to enable it to handle terminals running at the
high end (1219 baud) of the operating range.  The receiving 212 then reassembles
a properly formatted character.
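The stop-bit-deletion trick checks out with simple arithmetic (my sketch; the 1219-baud figure is from the post above): a terminal overspeeding at 1219 baud delivers frames slightly faster than the modem's 1200-baud synchronous clock can re-emit them, and occasionally shaving a stop bit lets the modem keep up.

```python
# Why a 212-type modem deletes stop bits: arithmetic sketch.
# A 10-bit frame (start + 8 data + stop) from a 1219-baud terminal
# arrives faster than the modem's 1200-baud clock can resend it.

FRAME_BITS = 10                      # start + 8 data + 1 stop

def frame_time_ms(baud):
    return 1000.0 * FRAME_BITS / baud

in_ms = frame_time_ms(1219)          # terminal side (overspeed)
out_ms = frame_time_ms(1200)         # modem's synchronous side
deficit_ms = out_ms - in_ms          # modem falls behind per frame

# One deleted stop bit recovers a full bit time at 1200 baud:
bit_ms = 1000.0 / 1200
frames_per_deletion = bit_ms / deficit_ms

print(f"deficit per frame: {deficit_ms:.3f} ms; "
      f"delete one stop bit every ~{frames_per_deletion:.0f} frames")
```

So the modem only needs to drop a stop bit every handful of frames, and the receiving 212 can reinsert it when reassembling the character.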