[comp.dcom.modems] Telebit in Byte's test, Ventel PEP, V.32

ray@csri.toronto.edu (Raymond Allen) (08/25/88)

In article <15303@shemp.CS.UCLA.EDU> casey@cs.ucla.edu.UUCP (Casey Leedom) writes:
>  Standards are hard enough to get everyone to agree on that it makes
>sense to design extensible ones that can grow with technology.  Don't you
>think we've had enough standards that tie our hands as better technology
>comes along?
>
>  Standards can be wonderful things, but when they're blindly applied
>with no thought to the future, they become millstones in a very short
>time.
>
>Casey

Actually, V.42 does contain plenty of room for adding future error-correction
and/or data-compression methods.  But your point is well taken.  It is worth
noting, however, that many of the industry's largest modem manufacturers have
worked together (and sometimes *not* together, from my understanding) to
make V.42 what it is today.

With regard to your point about blind application, it is worth noting that
provision for the use of a 32-bit CRC (CCITT calls it the "Frame Check
Sequence") was added to the V.42 specification.  This is nice from a
technological standpoint, since it will reduce the undetected bit error rate
to about 10^-239 (note: :-) ), but from an implementation standpoint it is a
real b**ch: hardware exists that will calculate the 16-bit CRC, but the
32-bit one would have to be done in software, and this has to be done for
each character transmitted -- a rather large overhead.  It seems that (as
usual) standards committees have reduced awareness of real-world
considerations.
-- 
Ray Allen  | Someday I'm going to have to change my .signature.
utcsri!ray | All of the above is usually my own opinion.

henry@utzoo.uucp (Henry Spencer) (08/27/88)

In article <8808251940.AA00343@ellesmere.csri.toronto.edu> ray@csri.toronto.edu (Raymond Allen) writes:
>provision for the use of a 32-bit CRC (CCITT calls it the "Frame Check
>Sequence") was added to the V.42 specification.  This is nice from a
>technological standpoint, since it will reduce the undetected bit error rate
>to about 10^-239 (note: :-) ), but from an implementation standpoint it is a
>real b**ch: hardware exists that will calculate the 16-bit CRC, but the
>32-bit one would have to be done in software, and this has to be done for
>each character transmitted -- a rather large overhead.  It seems that (as
>usual) standards committees have reduced awareness of real-world
>considerations.

Ho ho.  Not so.  The standards committees are more aware of real-world
considerations than you are, in this case.  The fact is, 16-bit CRCs *ARE
NOT ENOUGH* for some types of modern modems, and this problem will only get
worse.  Things like RF modems will often send a substantial number of bits
as a single transition, which means that one noise hit can foul up rather
a lot of bits.  A 16-bit CRC is only guaranteed to catch error bursts of 16
bits or less; a longer burst slips through with probability of roughly 1 in
2^16, and in this sort of environment 16-bit CRCs do fail with significant
frequency.  There was an uproar in the IBM mainframe world a while ago
when clear proof was produced that IBM's 16-bit CRC did not dependably
detect errors in bulk transmission over high-speed networks.
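
If you want to convince yourself, here is a small demonstration in C (a
sketch, assuming the common CRC-16/CCITT convention: polynomial 0x1021, MSB
first, preset to 0xFFFF; the frame contents are made up).  Any error pattern
that is a multiple of the generator polynomial is algebraically invisible to
the CRC, and the generator itself is a 17-bit burst:

    #include <stdio.h>

    /* Bit-at-a-time CRC-16/CCITT: polynomial x^16+x^12+x^5+1 (0x1021),
       MSB first, preset to 0xFFFF.  The algebra below holds for any
       choice of preset or final XOR. */
    static unsigned int crc16(const unsigned char *p, int len)
    {
        unsigned int crc = 0xFFFF;
        int i, k;
        for (i = 0; i < len; i++) {
            crc ^= (unsigned int)p[i] << 8;
            for (k = 0; k < 8; k++)
                crc = (crc & 0x8000) ? ((crc << 1) ^ 0x1021) & 0xFFFF
                                     : (crc << 1) & 0xFFFF;
        }
        return crc;
    }

    int main(void)
    {
        unsigned char msg[64];
        int i;

        for (i = 0; i < 64; i++)
            msg[i] = (unsigned char)i;    /* arbitrary frame contents */

        printf("clean frame:   CRC = %04x\n", crc16(msg, 64));

        /* Inject a 17-bit error burst equal to the generator
           polynomial itself, 0x11021, shifted 7 bits into byte
           alignment (0x881080).  Because the error pattern is
           divisible by the generator, the CRC cannot see it. */
        msg[20] ^= 0x88;
        msg[21] ^= 0x10;
        msg[22] ^= 0x80;

        printf("damaged frame: CRC = %04x  (identical -- undetected!)\n",
               crc16(msg, 64));
        return 0;
    }

Two frames differing in 17 bits, one CRC.  A 32-bit generator pushes the
shortest undetectable burst out to 33 bits and drops the escape probability
for longer bursts from about 1 in 2^16 to about 1 in 2^32.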

In this case, the standards committee is being farsighted, and is accepting
implementation problems today for the sake of reliable functioning tomorrow.
(Well, they may have *done* it because 32 bits sounded sexier, but the net
result is favorable regardless of real motives.)
-- 
Intel CPUs are not defective,  |     Henry Spencer at U of Toronto Zoology
they just act that way.        | uunet!attcan!utzoo!henry henry@zoo.toronto.edu