[net.audio] Disbelief in information theory

sjc@angband.UUCP (Steve Correll) (01/21/85)

>   When I studied signal theory briefly about 12 years ago, there was
>   a theorem stating that it was *impossible* to push more information
>   through a signal than the bandwidth of the signal, e.g., one can't
>   send more than k bits per second through a k Hz bandlimited channel.
>
>   Telephone voice-grade channels are 2700 Hz limited, filtering to allow
>   signals only from 300 Hz to 3000 Hz.  So how do 4800 and 9600 bps
>   modems work over dialup circuits?...
>
>   The answer seems to be that the theory that generated that theorem
>   wasn't completely correct.  Maybe the Nyquist theorem shouldn't be
>   regarded as gospel, either.

I suspect your theorem is the one due to Shannon:

   channel_capacity = bandwidth * log_base_2(1 + signal_to_noise_ratio)

You left out the part to the right of the "*". Fast modems rely on
"log_base_2(1 + signal_to_noise_ratio)" being greater than 1.

Slow modems use signaling elements with only two states, so that one
baud conveys one bit, whereas fast modems use more than two states, so
that one baud conveys more than one bit; roughly speaking, they use
multi-valued logic instead of binary logic.
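The bookkeeping is just a base-2 logarithm of the number of states (a
sketch; the 16-state case is an illustration of mine, not a particular
modem standard):

   import math

   def bits_per_baud(states):
       # bits conveyed by one signaling element with that many states
       return math.log2(states)

   print(bits_per_baud(2))     # 1.0 -- a slow, two-state modem
   print(bits_per_baud(16))    # 4.0 -- e.g. 2400 baud * 4 bits = 9600 bps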

In practice, modems use frequency- and phase-shift encoding, but here's
a thought experiment using amplitude encoding to show how you might
trade away S/N and thereby transmit more information through a given
channel.  Let's assume your channel limits amplitude to 1 volt, and
that its bandwidth lets you vary the signal by up to 1 volt per second.

You could send a single two-valued element each second, using 1 volt to
represent "1" and 0 volts to represent "0", and thus transmit 1 bit of
information per second.  Alternatively, you could send a four-valued
element each second, using 1 volt to represent "3", 0.66 volts for "2",
0.33 volts for "1", and 0 volts for "0", thus transmitting 2 bits of
information per second.
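Here is that four-level scheme spelled out as a sketch, with a
nearest-level decoder standing in for the "best possible decoder"
mentioned below; the names are mine and purely illustrative:

   LEVELS = [0.0, 0.33, 0.66, 1.0]        # volts for symbols 0..3

   def encode(symbol):                    # symbol is 0, 1, 2, or 3
       return LEVELS[symbol]

   def decode(voltage):                   # pick the nearest legal level
       return min(range(4), key=lambda s: abs(voltage - LEVELS[s]))

   assert decode(encode(2)) == 2          # clean channel: symbol survives
   assert decode(0.40) == 1               # 0.40 V is nearer 0.33 than 0.66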

What prevents you from using arbitrarily many signal levels, and
transmitting an infinite quantity of information through this meagre
channel at one element per second? Noise.

In the first instance, given the best possible decoder, noise must
exceed 0.5 volts to change a "0" to a "1" and corrupt your signal. In
the second, noise need exceed only 0.17 volts to corrupt your signal.
With infinitely many signal levels, an infinitely small amount of noise
would corrupt. Somewhere between these extremes, the signal meets the
actual noise, and that (intuitively speaking) explains the limit set by
the theorem above.
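The arithmetic behind those margins, as a sketch: with the levels
spread evenly across 0 to 1 volt, the margin is half the spacing
between adjacent levels, and it shrinks toward zero as the levels
multiply.

   def noise_margin(levels):
       # half the spacing between adjacent levels on a 0-to-1-volt channel
       return (1.0 / (levels - 1)) / 2.0

   print(noise_margin(2))      # 0.5 volts, the two-level case
   print(noise_margin(4))      # about 0.17 volts, the four-level case
   print(noise_margin(1000))   # about 0.0005 volts -- hopeless on a real line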

Information theory holds aloof above the fray; she favors neither
digital nor analog, but lays down the laws that partisans in both
camps must obey. Peace!
-- 
                                                           --Steve Correll
sjc@s1-c.ARPA, ...!decvax!decwrl!mordor!sjc, or ...!ucbvax!dual!mordor!sjc
