[comp.dcom.telecom] Data Access Lines

John Higdon <john@bovine.ati.com> (05/25/90)

Chip Rosenthal <chip@chinacat.unicom.com> writes:

> dBm is commonly used to specify a level referenced to a "digital
                                                           ^^^^^^^
> milliwatt" signal.  This is a 1004Hz sine wave of 1mW power into
> 600ohms.

What was it before digital technology? I've always heard it referred
to as simply the "milliwatt". Also, to be technically pure, dBm can be
a reference to one milliwatt into any impedance, as long as it's a
milliwatt. The 600 ohms comes into play because everyone knows that
when you measure 0.775 volts across 600 ohms, you have a
milliwatt. If you measure 0.949 volts across 900 ohms, you still have
a milliwatt. And it is still 0 dBm.
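[The point above -- that 0 dBm means one milliwatt into *any* impedance,
with only the voltage changing -- can be checked with a short sketch;
Python is used here purely for illustration. --Ed.]

```python
import math

def power_mw(volts, ohms):
    """Power in milliwatts dissipated by `volts` RMS across `ohms`."""
    return (volts ** 2 / ohms) * 1000.0

def dbm(milliwatts):
    """Level in dBm: decibels relative to one milliwatt."""
    return 10.0 * math.log10(milliwatts)

# 0.775 V across 600 ohms and 0.949 V across 900 ohms are both ~1 mW,
# so both read ~0 dBm -- the impedance only changes the voltage needed.
for v, r in [(0.775, 600), (0.949, 900)]:
    p = power_mw(v, r)
    print(f"{v} V across {r} ohm: {p:.3f} mW = {dbm(p):+.2f} dBm")
```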


        John Higdon         |   P. O. Box 7648   |   +1 408 723 1395
    john@bovine.ati.com     | San Jose, CA 95150 |       M o o !

"S. E. Grove" <seg@pacbell.com> (05/26/90)

In article <8084@accuvax.nwu.edu> jgro@apldbio.com (Jeremy Grodberg)
writes:

>              Technical Standards for Data Access Lines
>        Attenuation Distortion (slope)                -1 to +3 dB
>        C-Message Noise                               20 dBrnC
>        Impulse Noise                                 59 dBrnCO
>        Relative Delay (1000 to 2604 Hz)              200 usec.

SLOPE means no more than 1 dB higher than the reference level at 1 kHz,
measured with 900 ohms impedance, and no more than 3 dB greater loss
than the reference. This is measured in the range of 300 Hz to 3000 Hz,
unless you ask and pay for higher line conditioning; C4 is measured up
to 3400 Hz, if I remember.  All my books are in a locker and therefore
not available.

C-message noise: 0 dBrn is equivalent to -65 dB (again I am relying on
memory, and I haven't lined up data circuits for a living for ten
years); the C refers to a weighting filter, calculated to reduce the
effect of noise at frequencies that don't affect the data.

Impulse noise is noise of very short duration, sometimes unnoticeable
to the human ear, but at a speed of 9600 bps a real deterrent.

Relative Delay has to do with the delay of various frequencies
reaching the terminating modem in reference to the fastest frequency,
within the bandwidth. The fastest frequency can vary, though it is
usually around 2400 Hz. This could affect the shape of the analog
envelope of the signal and make it harder for the detection circuits
in the modem to determine space or mark.

At 9600 bps you are dealing with a 209-type data set, which uses
tribits (000, 001, 010, 011, etc.) to reduce the actual line symbol
rate, and eight-phase 'phase modulation'.
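[A minimal sketch of the tribit idea described above: each group of
three bits selects one of eight carrier phases, so the symbol rate is
one third the bit rate. The sequential phase assignment here is
hypothetical; real modems use Gray-coded constellations. --Ed.]

```python
# Hypothetical mapping of tribits to eight phases in 45-degree steps.
PHASES = {format(n, "03b"): n * 45.0 for n in range(8)}

def encode(bits):
    """Split a bit string into tribits and map each to a phase (degrees)."""
    return [PHASES[bits[i:i + 3]] for i in range(0, len(bits), 3)]

print(encode("000001010011"))  # [0.0, 45.0, 90.0, 135.0]
```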


Stephen Grove Pac Bell, Comm. Tech. ESS; Sonoma County, Calif.

grayt@uunet.uu.net (Tom Gray) (05/26/90)

In article <8293@accuvax.nwu.edu> John Higdon <john@bovine.ati.com> writes:

>Chip Rosenthal <chip@chinacat.unicom.com> writes:

>> dBm is commonly used to specify a level referenced to a "digital
							    ^^^^^^^	
>> milliwatt" signal.  This is a 1004Hz sine wave of 1mW power into
>> 600ohms.

>What was it before digital technology? I've always heard it referred
>to as simply the "milliwatt". Also, to be technically pure, dBm can be
>a reference to one milliwatt into any impedance, as long as it's a
>milliwatt. The 600 ohms comes into play because everyone knows that
>that when you measure 0.775 volts on across 600 ohms, you have a
>milliwatt. If you measure 0.949 volts across 900 ohms, you still have
>a milliwatt. And it is still 0 dBm.

The digital milliwatt is defined in the CCITT standards. It is a
sequence of eight PCM codes which, when repeated in sequence, produce a
1 kHz tone.  The digital milliwatt is a means of defining the
relationship between the analog and digital domains. Note that the
digital milliwatt, or digital test sequence, produces a 1 kHz tone when
decoded, not 1004 Hz.

1004 Hz digital tones are commonly used since the sample pattern does
not repeat until 2000 PCM samples have elapsed. This produces a more
exhaustive test of PCM decoders than the eight samples of the strictly
defined 1 kHz DTS.
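[The repeat length falls out of a small calculation: at the standard
8000 Hz PCM sampling rate, a tone's sample pattern repeats after
8000/gcd(tone, 8000) samples. --Ed.]

```python
from math import gcd

SAMPLE_RATE = 8000  # Hz, standard PCM sampling rate

def pattern_length(tone_hz):
    """Return (samples before the PCM sample pattern repeats,
    whole tone cycles contained in that pattern)."""
    g = gcd(tone_hz, SAMPLE_RATE)
    return SAMPLE_RATE // g, tone_hz // g

print(pattern_length(1000))  # (8, 1): the 8-sample digital milliwatt
print(pattern_length(1004))  # (2000, 251): 2000 samples per repeat
```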

Because of roundoff error in the PCM sequences for the tones, the 1 kHz
DTS will produce a level that is approximately 0.1 dB different from the
1004 Hz tone. Thus for accurate level alignment of a PCM decoder the
strict 1 kHz DTS must be used. The 1004 Hz sequences are suitable
for production or field testing of a PCM circuit, but truly accurate
alignments must use the DTS.

I agree with Mr. Higdon in that dBm refers to one milliwatt into any
impedance at any frequency. Indeed, even fibre optic transmitters and
receivers are specified in dBm, and their operating frequencies are
rather higher than 1 kHz.

Mike.Riddle@f27.n285.z1.fidonet.org (Mike Riddle) (05/28/90)

Jeremy Grodberg asked about technical specifications for Data Access
Lines, and received many good replies.  One of his basic reasons for
asking, however, has not been addressed.
  
Jeremy wants to use a Telebit 9600 bps modem, and his version of Ma
Bell said that only < 2400 was guaranteed on a voice line.  My
understanding is that a 9600 bps modem actually operates at 2400 baud,
with 4 levels, creating a 9600 bps signal.  This method was used
precisely because of the inherent bandwidth of a "normal" voice line.
It seems to me that whoever told him 9600 wouldn't work on a "normal"
line either didn't understand 9600 bps methodology or was trying to
sell up.
 
Can anyone smarter than I (that's most of you) comment on this aspect
of his problem?


   Ybbat (DRBBS) 8.9 v. 3.11 r.3
 * Origin: [1:285/27@fidonet] The Inns of Court 402/593-1192 (1:285/27.0)

   Through FidoNet gateway node 1:16/390
   Mike.Riddle@f27.n285.z1.fidonet.org

john@bovine.ati.com (John Higdon) (05/29/90)

Mike Riddle <Mike.Riddle@f27.n285.z1.fidonet.org> writes:

> My understanding is that a 9600 bps modem actually operates at 2400 baud,
> with 4 levels, creating a 9600 bps signal.  This method was used
> precisely because of the inherent bandwidth of a "normal" voice line.
> It seems to me that whoever told him 9600 wouldn't work on a "normal"
> line either didn't understand 9600 bps methodology or was trying to
> sell up.

I don't have the reference in front of me and can't give a detailed
explanation of PEP (Packetized Ensemble Protocol), but it is somewhat
more complex than that. PEP (I don't know anything at all about the
theory of v.32) tries for as many as 512 separate carriers (each
operating very slowly) over the line. During training and negotiation,
carriers that are unusable because of line quality are locked out.
This is why PEP can be so variable in terms of throughput. If line
conditions change significantly, the modems will renegotiate.

1200 and 2400 bps modems don't operate at 1200 and 2400 baud,
respectively, but rather at a slower baud rate and carry 4 or 8 bits
per baud. This is accomplished by introducing a phase (and in the case
of 2400, amplitude) component.
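[The bit-rate/symbol-rate relationship underlying all of this can be
sketched in a few lines; the 600 baud figure and state counts below are
illustrative examples, not a claim about any particular modem. --Ed.]

```python
def bits_per_symbol(n_states):
    """Bits carried by each symbol when a symbol can take one of
    n_states distinguishable phase/amplitude values (exact power of 2)."""
    return n_states.bit_length() - 1  # integer log2 for powers of two

def bps(baud, n_states):
    """Bit rate = symbol rate (baud) * bits carried per symbol."""
    return baud * bits_per_symbol(n_states)

# e.g. 600 baud with 4 or 16 distinguishable states per symbol:
print(bps(600, 4))   # 1200 bps
print(bps(600, 16))  # 2400 bps
```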

BTW, most people don't understand 9600 bps methodology.


        John Higdon         |   P. O. Box 7648   |   +1 408 723 1395
    john@bovine.ati.com     | San Jose, CA 95150 |       M o o !

chip@chinacat.unicom.com (Chip Rosenthal) (05/29/90)

John Higdon <john@bovine.ati.com> writes:

>Chip Rosenthal <chip@chinacat.unicom.com> writes:
>> dBm is commonly used to specify a level referenced to a "digital milliwatt"
>> signal.  This is a 1004Hz sine wave of 1mW power into 600ohms.

>What was it before digital technology? I've always heard it referred
>to as simply the "milliwatt".

Of course, you are correct.  dBm is power relative to a milliwatt.  I
slipped into that thinking because the bench work I've always done was
with digital equipment.

>Also, to be technically pure, dBm can be a reference to one milliwatt
>into any impedance, as long as it's a milliwatt.

Right.  The 600ohms is a common impedance, and would be the required
termination if you were to feed the digital milliwatt pattern into,
say a CODEC, and want to really get a milliwatt of power delivered.

>And it is still 0 dBm.

I stand, if not corrected, then at least clarified and unconfused :-)


Chip Rosenthal                           
chip@chinacat.Unicom.COM                 
Unicom Systems Development, 512-482-8260 

rpw3%rigden.wpd@sgi.com (Rob Warnock) (05/30/90)

In article <8371@accuvax.nwu.edu> John Higdon <john@bovine.ati.com>
writes:

| Mike Riddle <Mike.Riddle@f27.n285.z1.fidonet.org> writes:

| > My understanding is that a 9600 bps modem actually operates at 2400 baud,
| > with 4 levels, creating a 9600 bps signal...

| I don't have the reference in front of my and can't give a detailed
| explanation of PEP (Packetized Ensemble Protocol), but it is somewhat
| more complex than that. PEP (I don't know anything at all about the
| theory of v.32) tries for as many as 512 separate carriers (each
| operating very slowly) over the line...

Excerpts (scraps, really, the original is almost 300 lines) from a
document posted to comp.dcom.modems 6 Mar 90 by Mike Ballard & Cerifin
Castillo of Telebit (write to <modems@telebit.uucp> for more info):

	Telebit Corporation      Revision 1.01        01 DECEMBER 1989
           A BRIEF TECHNICAL OVERVIEW OF TELEBIT MODEMS 
	...
	This technique (DAMQAM) divides the voice bandwidth into 511 
	individual channels each capable of passing 2, 4, or 6 bits per 
	baud based on the measured characteristics of the individual 
	frequencies associated with each channel.  On a typical phone
	connection, the modem uses a subset of about 400 of those channels.

	Each time the modem connects to a circuit established on the dialup 
	Public Switched Telephone Network (PSTN), the TELEBIT modem
        measures the quality of the connection, and determines the usable 
        subset of the 511 carriers.  The aggregate sum of bits modulated 
        on this subset of carriers multiplied times the baud rate yields 
        a bit per second rate that on a local telephone connection 
        (i.e. round trip through your local telco) is 18031 bps.  This
        18031 bps is then reduced by about 20% to allow for the CRC overhead, 
        to about 14400 bps of data throughput. 
	...
	The modem operates at 7.35 and 88.26 baud, transparently changing 
	baud rates to accommodate the pace and quantity of data traffic. 
	When in "interactive mode" the modem sends data using 11 msec 
	packets (which run at 88.26 baud). Each packet contains 15 bytes 
	of data. In "file transfer mode" the modem uses 136 msec packets 
	(that transfer at 7.35 baud) that contain 256 bytes of data. 
	The TrailBlazer decides which packet size to use on an ongoing 
	dynamic basis. No intervention from the user is required. 

So the rate never exceeds 88.26 baud. Your local telco ought to be able to
do *that* at least...  ;-}   ;-}
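[As a rough sketch of how the aggregate rate quoted above is built up:
sum the bits per baud over the usable carriers, then multiply by the
baud rate. The per-carrier bit loading below is invented for the
example; the real modem assigns it from measured line quality. --Ed.]

```python
BAUD = 7.35  # file-transfer-mode baud rate from the Telebit overview

# Hypothetical bit loading: {bits per baud: number of carriers}
carriers = {6: 350, 4: 40, 2: 10}

bits_per_baud_total = sum(bits * n for bits, n in carriers.items())
aggregate_bps = BAUD * bits_per_baud_total
print(f"{aggregate_bps:.0f} bps raw, "
      f"~{aggregate_bps * 0.8:.0f} bps after ~20% overhead")
```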


Rob Warnock, MS-9U/510		rpw3@sgi.com		rpw3@pei.com
Silicon Graphics, Inc.		(415)335-1673		Protocol Engines, Inc.
2011 N. Shoreline Blvd.
Mountain View, CA  94039-7311

David Tamkin <0004261818@mcimail.com> (05/31/90)

Mike Riddle wrote in volume 10, issue 391:

|Jeremy [Grodberg] wants to use a Telebit 9600 bps modem, and his version
|of Ma Bell said that only < 2400 {actually, <= 2400 --DWT} was guaranteed
|on a voice line.

|My understanding is that a 9600 bps modem actually operates at 2400 baud
|with four levels, creating a 9600 bps signal.  This method was used
|precisely because of the inherent bandwidth of a "normal" voice line.  It
|seems to me that whoever told him 9600 wouldn't work on a "normal" line
|either didn't understand 9600 bps methodology or was trying to sell up.

John Higdon commented in volume 10, issue 394:

:1200 and 2400 bps modems don't operate at 1200 and 2400 baud
:respectively, but rather at a slower baud rate and carry four or eight
:bits per baud.  This is accomplished by introducing a phase (and in the
:case of 2400, amplitude) component.

1200 bps and 2400 bps modems operate at 600 baud with two or four bits
of information in every baud.

In volume 10, issue 395, Rob Warnock quoted an official description of
PEP and observed:

+So the rate never exceeds 88.26 baud. Your local telco ought to be able
+to do *that* at least.

And I think that's the problem: Jeremy's telco promises that ordinary
lines will support 600 baud (regardless of bps counts attained through
artifice or cunning) but not the 2400 baud possibly required for 9600
bps.  {I won't venture a guess whether he needs 2400 baud modulation
for 9600 bps as Mike said or only 88.26 baud as Rob quoted.} The reps
are told that voice lines can handle 2400 bps (the presumed speed
limit for 600 baud) but reliability at [the higher baud rates possibly
needed for] higher data rates requires premium service.

If PEP is modulated only at 7.35 or 88.26 baud, it should be no
difficulty for the local lines to carry it, unless shoving so many
bits into so few bauds requires so many carrier pitches that local
telco lines might not be reliably able to discriminate that fine.


David Tamkin  P. O. Box 7002  Des Plaines IL  60018-7002  +1 708 518 6769
MCI Mail: 426-1818   CIS: 73720,1570   GEnie: D.W.TAMKIN  +1 312 693 0591

thomas%mvac23.uucp@udel.edu (Thomas Lapp) (06/01/90)

David Tamkin wrote:

> If PEP is modulated only at 7.35 or 88.26 baud, it should be no
> difficulty for the local lines to carry it, unless shoving so many
> bits into so few bauds requires so many carrier pitches that local
> telco lines might not be reliably able to discriminate that fine.

Aren't we forgetting the fact that some of those 511 channels that the
Telebit uses are outside the frequency range that the telco promises?
I think the telco says that you'll have decent output at something like
300 Hz to 3000 Hz (or is it 2700Hz?).  I thought I read that the
Telebit tries to use more of the frequency spectrum (like maybe up to
4000 Hz?).  So if the channels outside the promised range are unusable,
the telco isn't doing anything wrong, and the modem may not (at 88.26
baud) be able to use enough channels in-band to send at the higher bit
rates.

Just a thought.

 - tom

internet     : mvac23!thomas@udel.edu  or  thomas%mvac23@udel.edu
uucp         : {ucbvax,mcvax,psuvax1,uunet}!udel!mvac23!thomas
Europe Bitnet: THOMAS1@GRATHUN1         Location: Newark, DE, USA
Quote   : The only way to win thermonuclear war is not to play.

friedl@uunet.uu.net (Stephen J. Friedl) (06/01/90)

> If PEP is modulated only at 7.35 or 88.26 baud, it should be no
> difficulty for the local lines to carry it, unless shoving so many
> bits into so few bauds requires so many carrier pitches that local
> telco lines might not be reliably able to discriminate that fine.

PEP is modulated at 7.35 or 88.26 baud PER CARRIER, and to get the
baud for the whole signal one must multiply by the number of carriers
in use.  A PEP line is easily thousands of baud for a clean line, and
for phone line requirements, the 7.35 or 88.26 number is meaningless.


Stephen J. Friedl, KA8CMY / Software Consultant / Tustin, CA / 3B2-kind-of-guy
+1 714 544 6561  / friedl@mtndew.Tustin.CA.US  / {uunet,attmail}!mtndew!friedl

Rob Warnock <rpw3%rigden.wpd@sgi.com> (06/05/90)

In article <8574@accuvax.nwu.edu> mtndew!friedl@uunet.uu.net (Stephen
J.  Friedl) writes:

| > If PEP is modulated only at 7.35 or 88.26 baud, it should be no
| > difficulty for the local lines to carry it, unless shoving so many
| > bits into so few bauds requires so many carrier pitches that local
| > telco lines might not be reliably able to discriminate that fine.

| PEP is modulated at 7.35 or 88.26 baud PER CARRIER, and to get the
| baud for the whole signal one must multiply by the number of carriers
| in use.  A PEP line is easily thousands of baud for a clean line, and
| for phone line requirements, the 7.35 or 88.26 number is meaningless.

Sorry, you have a slight misunderstanding of the term "baud". The
signaling rate in "baud" is defined as "the reciprocal of the smallest
signalling interval", that is, the peak number of "symbols" or state
changes per second.  All of the sub-carriers change at the same time.
Thus the PEP protocol is indeed 7 or 88 baud.

However, each sub-carrier is only using about (3000 - 300) / 511 =
5.28 Hz of bandwidth. (Pushing a 7 baud signal through a 5 Hz pipe is
quite good!  The theoretical maximum is 2 baud/Hz: one state for each
half-cycle of bandwidth.) Since each sub-carrier encodes 2, 4, or 6
bits per baud, or 14.7, 29.4, or 44.1 bits/second, respectively, at
the 7.35 baud rate, that is 2.78, 5.57, or 8.35 bits/second per Hertz
of bandwidth. From the Shannon limit:

	BitsPerSecond < Bandwidth * log2((S/N) + 1)

That implies that the signal-to-noise has to be at least:

	S/N (dB) > 10 * log(2^(bps/Hertz) - 1)

or:

	S/N > 7.69 dB (min.), for 2 bits/baud (a 14.7 bit/s sub-channel)
	S/N >16.67 dB (min.), for 4 bits/baud (a 29.4 bit/s sub-channel)
	S/N >25.12 dB (min.), for 6 bits/baud (a 44.1 bit/s sub-channel)
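[Those minima can be reproduced from the Shannon formula above; a quick
numerical check, using the 511-sub-carrier and 7.35 baud figures already
given. --Ed.]

```python
import math

SUBCHANNEL_HZ = (3000 - 300) / 511  # ~5.28 Hz per PEP sub-carrier
BAUD = 7.35

def shannon_min_snr_db(bits_per_hz):
    """Minimum S/N in dB from C = B * log2(S/N + 1), solved for S/N."""
    return 10.0 * math.log10(2.0 ** bits_per_hz - 1.0)

for bits in (2, 4, 6):
    eff = bits * BAUD / SUBCHANNEL_HZ  # bits/second per Hertz
    print(f"{bits} bits/baud: {shannon_min_snr_db(eff):5.2f} dB min S/N")
```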

Of course, these are theoretical minima, and don't account for noise
due to adjacent sub-channel interference, or loss due to imperfect
coding, so the line has to be a good deal better than this. Still, if
only 400 channels could get the highest rate, that's still 17,640
bits/second (before subtracting the 20% CRC and packetizing
overhead).

In case anyone is still confused, note that sending 6 bits/baud means
that you have to be able to send any one of 64 (= 2^6) "symbols" at
each state change. Symbols can be encoded as amplitude difference,
frequency difference (although not in this case), or phase difference.
The PEP scheme, which is actually called DAMQAM or Dynamically
Adaptive Multicarrier Quadrature Amplitude Modulation at this level,
uses a combination of amplitude and phase modulation on each
sub-carrier.

Note that if you only used AM, 64 symbols means 64 different voltage
levels, which means that (*very* crudely speaking) to avoid error the
noise level has to be less than 1/2 the difference between two
adjacent levels, so the noise doesn't turn one into the other, or
1/128 the maximum level. Thus, you need an S/N of 20*log(128), or 42 dB.
(The "20" is because we are comparing *amplitude*, not *power*, as
above.) That this doesn't match the 25 dB "Shannon limit" given above
is due to (1) my example was crude indeed, (2) pure AM is not nearly
as efficient as QAM, and (3) the Shannon limit -- a *minimum* bound --
assumes that you are employing "perfect" encoding. The actual S/N
needed is somewhere between the two, and closer to the upper. Anyway,
you get the idea...

So the limit to PEP operation is the signal-to-noise ratio of each of a
large number of very narrow, slow channels, any of which can be
down-graded or dropped from use if *that particular* sub-channel is
too noisy. Non-linearities and phase slopes which would blow away a
higher baud-rate modem are shrugged off, since they have much less
effect on a 5 Hz (sub)channel.

In case anyone's curious about the fact that the quantizing into
levels by PCM (T-carrier) puts an upper limit of something like
20*log(128/0.5) = 48 dB on the S/N if 7 bits/sample are being used,
note that at 7.35 baud there are 8000 / 7.35 = 1088 samples/baud.  A
lot of the quantizing noise can thus be averaged out.
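[A quick numerical check of those last two figures -- the ~48 dB
quantizing ceiling for 7-bit PCM and the ~1088 samples spanned by each
7.35-baud symbol. --Ed.]

```python
import math

LEVELS = 2 ** 7          # 7 bits/sample in T-carrier PCM
SAMPLE_RATE = 8000       # Hz
BAUD = 7.35

# Crude S/N ceiling: full scale vs. half a quantizing step.
snr_db = 20.0 * math.log10(LEVELS / 0.5)

# Samples spanned by one symbol at the slow PEP baud rate.
samples_per_baud = SAMPLE_RATE / BAUD

print(f"{snr_db:.1f} dB ceiling, {samples_per_baud:.0f} samples/baud")
```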


Rob Warnock, MS-9U/510		rpw3@sgi.com		rpw3@pei.com
Silicon Graphics, Inc.		(415)335-1673		Protocol Engines, Inc.
2011 N. Shoreline Blvd.
Mountain View, CA  94039-7311