[comp.dcom.modems] Why no PC modems without UART

gandrews@netcom.COM (Greg Andrews) (02/09/91)

In article <1991Feb9.022024.10932@wsrcc.com> wolfgang@wsrcc.com 
(Wolfgang S. Rupprecht) writes:
>
>Why didn't Telebit take this opportunity to design the UARTS out of
>the modem?  They could have replaced the two back to back UARTS with
>a latch pair, or better yet a FIFO pair and made one extremely
>delay tolerant PC bus modem.
>

Indeed, why wouldn't other modem manufacturers do the same thing...?

Ever try to emulate a UART chip in software?  Ever wonder why the smart
multiport boards don't do it either?  It's a pain in the butt!


-- 
.-------------------------------------------.
| Greg Andrews      |   gandrews@netcom.COM |
`-------------------------------------------'

urlichs@smurf.sub.org (Matthias Urlichs) (02/10/91)

In comp.dcom.modems, article <23314@netcom.COM>,
  gandrews@netcom.COM (Greg Andrews) writes:
< 
< Ever try to emulate a UART chip in software?  Ever wonder why the smart
< multiport boards don't do it either?  It's a pain in the butt!
< 
However, when you know what's on the other side of the pseudo-UART (a Telebit
modem, in this case), you can forget about almost every feature.
All you need is bits&parity (baud rate for the "use speed of last AT command"
modes only), handshake lines, and send+receive buffer.

Seems reasonable to me.

However, it still has to be implemented&programmed&tested, and it might be
more cost-effective to just stick two cheapo UARTs (one for the PC, and one
for the modem -- you already have the software for the latter from your
standalone modem PROMs) onto the board instead of the latches you'd otherwise
need.

-- 
Matthias Urlichs -- urlichs@smurf.sub.org -- urlichs@smurf.ira.uka.de     /(o\
Humboldtstrasse 7 - 7500 Karlsruhe 1 - FRG -- +49+721+621127(0700-2330)   \o)/

enger@seka.scc.com (Robert M. Enger) (02/10/91)

In article <23314@netcom.COM>, gandrews@netcom.COM (Greg Andrews) writes:
|> 
|> Indeed, why wouldn't other modem manufacturers do the same thing...?
|> 
|> Ever try to emulate a UART chip in software?  Ever wonder why the smart
|> multiport boards don't do it either?  It's a pain in the butt!
|> 

Dear Greg:

I believe one of (perhaps THE) main purposes of a UART is to perform
the serial<-->parallel conversion between the (serial) data communication
channel, and the host's (parallel, byte-wide) i/o data paths.  Is this correct?

It is also my understanding that many of the fancier modems are already
operating on byte-wide data internally.  Some compression algorithms use
coding schemes based on reducing the bits required to transmit frequently
occurring symbols (characters or bytes).  The modem must also be able 
to recognize x-on and x-off symbols in the data stream, etc.  Thus it would
seem that most modern modems MUST view the host's IO stream in a byte-wide
(character oriented) fashion.

If these statements are true, then it WOULD seem desirable to eliminate
the UART from the data path between the modem card and the host.
Indeed, it would be nicer still to 'widen' the data path between the 
modem and the host.  That is, truly parallel-based internal modems
could utilize the full width of the I/O bus available in the host.
Still fancier optimizations are possible.  Modems could perform DMA
transfers to memory, become bus-masters, or employ other techniques to
lessen the per-character interrupt load on the host.

While these techniques seem feasible, they all require custom drivers
(and/or application programs) to operate the card.  This may render
some 'classic' pc-caliber software packages unusable.  So, the marketing
weenies will probably advise management against developing such products.

Comments?
Bob Enger
Contel Federal Systems
enger@seka.scc.com


-- 

Robert M. Enger
CONTEL Federal Systems
enger@seka.scc.com  (Internet)

gandrews@netcom.COM (Greg Andrews) (02/10/91)

In article <1991Feb9.164603.17731@europa.asd.contel.com> 
enger@seka.scc.com writes:
>
>I believe one of (perhaps THE) main purposes of a UART is to perform
>the serial<-->parallel conversion between the (serial) data communication
>channel, and the host's (parallel, byte-wide) i/o data paths.  Is this correct?
>

Yes, it adapts a parallel computer bus to a serial RS232 'bus'
(if you're willing to consider RS232 a bus).

>
>It is also my understanding that many of the fancier modems are already
>operating on byte-wide data internally.  
>

Internally?  Yes, if they have error correction and aren't merely bit pumps.

>
>Some compression algorithms use coding schemes based on reducing the bits 
>required to transmit frequently occurring symbols (characters or bytes).
>The modem must also be able to recognize x-on and x-off symbols in the 
>data stream, etc.  Thus it would seem that most modern modems MUST view
>the host's IO stream in a byte-wide (character oriented) fashion.
>

Again, yes, if the modem is using error correction at the moment.  If it
isn't, then it's just operating as a bit pump.

>
>If these statements are true, then it WOULD seem desirable to eliminate
>the UART from the data path between the modem card and the host.
>Indeed, it would be nicer still to 'widen' the data path between the 
>modem and the host.  That is, truly parallel-based internal modems
>could utilize the full width of the I/O bus available in the host.
>Still fancier optimizations are possible.  Modems could perform DMA
>transfers to memory, become bus-masters, or employ other techniques to
>lessen the per-character interrupt load on the host.
>
>While these techniques seem feasible, they all require custom drivers
>(and/or application programs) to operate the card.  This may render
>some 'classic' pc-caliber software packages unusable.  So, the marketing
>weenies will probably advise management against developing such products.
>
>Comments?

In theory, yes it would seem desirable to get rid of the UART and deal
with the data directly on a byte level.  If necessary, the modem could
attach start, stop, and/or parity bits to the characters just like the
UART chip does.

Of course, this instantly makes the modem completely incompatible with
all existing communications software written for the PC.  The reason
internal modems have a UART in the first place is so they will appear
to be a standard serial port, guaranteeing compatibility with everyone's
BIOSes, comm programs, and port drivers.

IMHO, a modem manufacturer will have enough to worry about making their
MNP4/MNP5/V.42/V.42bis and even their PEP/HST/Ping-Pong/V.32/V.22bis
(pick one) code work.  Expecting them to write SCO/Interactive/AT&T/DOS
drivers and make them work with the entire installed base is probably
asking for more software development and support than they can provide,
especially if that modem must be available at the prices demanded today.

Heavy software development and support require that the product make
more money than a UART interface modem in order to break even or reap a
profit.  Incompatibility with even part of the market (how many MS-DOS
comm programs bypass DOS/INT14/BIOS drivers and access the UART directly?)
means the modem will have lower sales.  Lower sales means the modem must
be priced higher to make money (another technique for reducing sales 
figures)...etc...etc.

The way I see it, there aren't enough advantages to make it worth anyone's
while just yet.  But don't take my word for it...perhaps we're theorizing
about a product that someone is developing in their garage right now. :-)

-- 
.-------------------------------------------.
| Greg Andrews      |   gandrews@netcom.COM |
`-------------------------------------------'

gandrews@netcom.COM (Greg Andrews) (02/10/91)

In article <6v#qh2.o+3@smurf.sub.org> urlichs@smurf.sub.org 
(Matthias Urlichs) writes:
>< 
>However, when you know what's on the other side of the pseudo-UART (a Telebit
>modem, in this case), you can forget about almost every feature.
>

Sure *I* know what's on the other side of the UART.  It's the *software*
that has no idea what modem is there (or if there's even a modem there).
If the comm software or tty driver assumes that it's a Telebit modem,
then it must be a specially written driver, right?  Yes, special hardware
can be supported by special software, but then the hardware isn't very
compatible anymore...

>
>All you need is bits&parity (baud rate for the "use speed of last AT command"
>modes only), handshake lines, and send+receive buffer.
>
>Seems reasonable to me.
>

A favorite software program had a "fortune cookie" function that sometimes
said "'Easy To Use' is easy to say".  A UART isn't really as simple as that.

>
>However, it still has to be implemented&programmed&tested, and it might be
>more cost-effective to just stick two cheapo UARTs (one for the PC, and one
>for the modem -- you already have the software for the latter from your
>standalone modem PROMs) onto the board instead of the latches you'd otherwise
>need.
>

Agreed, for the reasons I mentioned in my last posting.

-- 
.-------------------------------------------.
| Greg Andrews      |   gandrews@netcom.COM |
`-------------------------------------------'

larry@nstar.rn.com (Larry Snyder) (02/10/91)

gandrews@netcom.COM (Greg Andrews) writes:

>Ever try to emulate a UART chip in software?  Ever wonder why the smart
>multiport boards don't do it either?  It's a pain in the butt!

Several of the multiport vendors have separate versions of their boards,
one with 8250's and another with 16550A chips.

-- 
   Larry Snyder, NSTAR Public Access Unix 219-289-0287 (HST/PEP/V.32/v.42bis)
                        regional UUCP mapping coordinator 
  {larry@nstar.rn.com, ..!uunet!nstar!larry, larry%nstar@iuvax.cs.indiana.edu}

enger@seka.scc.com (Robert M. Enger) (02/11/91)

Greg:

There has been some discussion on this list lately regarding the
failings of the UART chips used on some modems.  One person commented
that even the famous T2500 used such and such unacceptable UART chip.
Obviating UART chips altogether might be valued by some customers.

Another advantage might be speed.  While simple-minded operating systems
might not have much else to do (might not be able to do much else) when
the telecom program is running, some of the fancier OSs do.  Thus, a
board that reduced the interrupt load on the host might be valued because
it would conserve system resources so that they could be devoted to 
other processes.

Even on a simple-minded OS, if one's communications program is trying
to do some form of end-to-end integrity check (host to host checksum, etc)
then the time saved by not having to service per-character I/O interrupts
may provide a faster end-to-end throughput, especially on lower-powered CPUs.

I hope this doesn't devolve into a range war.  I have no bone to pick.
I don't even own a computer.  I dial in from a dumb terminal at home :-)

Bob

-- 

Robert M. Enger
CONTEL Federal Systems
enger@seka.scc.com  (Internet)