[net.audio] Digital audio adapters on VCRs

karn@petrus.UUCP (Phil R. Karn) (11/22/85)

Recently I got the chance to play with a borrowed Sony PCM-F1 digital audio
processor box. This unit consists of two halves. One part digitizes two
audio channels and produces a "video" signal that can be recorded on a VCR.
The other half accepts an encoded video signal and reproduces the original
audio signals.  The sampling rate is 44.056 kHz at 14 bits/sample, and
approximately 34% redundancy is added in the form of error-correcting codes.
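For a rough sense of what the VCR has to carry, those figures imply a data rate well over a megabit per second. This is just back-of-the-envelope arithmetic; the PCM-F1's exact framing overhead isn't stated here, so the 34% figure is taken at face value:

```python
# Rough data-rate estimate for the PCM-F1 format described above.
# The 34% redundancy figure is the one quoted in the article; actual
# framing overhead may differ.
sample_rate = 44056      # samples/second per channel
bits_per_sample = 14
channels = 2

raw_rate = sample_rate * bits_per_sample * channels   # audio bits/second
total_rate = raw_rate * 1.34                          # with ~34% ECC overhead

print(f"raw audio rate:   {raw_rate:,} bit/s")        # about 1.23 Mbit/s
print(f"with redundancy: ~{total_rate:,.0f} bit/s")   # about 1.65 Mbit/s
```

No wonder a noisy, ringing video channel shows up immediately as bit errors.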

When the PCM-F1 was configured in loopback mode (video output feeding video
input) I was completely unable to tell the difference between the PCM
processed signal and the raw input signal. However, there was significant,
audible degradation of the signal if it was recorded and played back on my
VHS VCR (an RCA VKT-385), even at the 2-hour rate.  A look at the video
signal on a scope revealed that while the video signal-to-noise ratio was
quite reasonable, there was considerable intersymbol interference that
caused severe timing jitter and was almost certainly the cause of the
high bit error rate and audible degradation.  Each bit transition overshot
by almost 50% of the peak-to-peak signal amplitude, and from the asymmetry
of the ringing there was obviously considerable phase distortion.
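The mechanism is easy to demonstrate: convolving a bit stream with a channel response that has too much high-frequency peaking produces exactly this kind of overshoot and ringing on every transition. The impulse response below is invented purely for illustration (it is not a measured VCR response; its taps were chosen to give roughly the overshoot described above):

```python
# Illustrative sketch: a channel with high-frequency "peaking" rings on
# each bit transition, producing overshoot and intersymbol interference.
# The impulse response h is hypothetical, not measured from any VCR.

bits = [0, 1, 1, 0, 1, 0, 0, 1]
samples_per_bit = 8
signal = [float(b) for b in bits for _ in range(samples_per_bit)]

# A crude peaked response: a strong main tap followed by a decaying
# alternating tail.  The taps sum to 1.0 so long runs settle correctly.
h = [0.0, 1.45, -0.7, 0.4, -0.2, 0.05]

out = [sum(h[k] * signal[n - k]
           for k in range(len(h)) if 0 <= n - k < len(signal))
       for n in range(len(signal))]

overshoot = max(out) - 1.0   # how far the waveform exceeds the "1" level
print(f"peak overshoot: {overshoot:.0%} of the bit amplitude")
```

With enough of this ringing from adjacent bits landing on top of each other, the zero crossings the bit clock recovers from shift around, which is the timing jitter seen on the scope.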

I'm wondering if this sort of crummy response is inherent in home VCRs, or
if it is an intentional "feature" added to make the picture look
subjectively better. I know that high frequency "peaking" is often added to
video equipment to provide "edge enhancement" and wonder if this is what's
going on in my VCR. If this is so, there unfortunately seems to be no way to
turn it off.

One further observation. If I calibrated the record levels on the PCM-F1 so
that a 0 dB signal on my CD player resulted in just saturating the PCM-F1's
A/D converters, then the background noise on even my quietest digitally
recorded classical CDs was still sufficient to randomly toggle the lower 3
or 4 bits of the PCM-F1's A/D output (you can see this by watching the
encoded video signal on a TV).  This tells me that the dynamic range of
current CDs is limited by external factors like microphone and preamp hiss,
room noise and AC hum, and that even 14-bit quantization is sufficient to
record these signals with no loss in dynamic range.
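The numbers bear this out. Using the standard rule of thumb that an ideal N-bit quantizer gives about 6.02*N + 1.76 dB of SNR for a full-scale sine, and treating each toggling low-order bit as roughly 6 dB of noise floor (a crude estimate, not a measurement):

```python
def quant_dynamic_range_db(bits):
    # Ideal SNR of an N-bit quantizer for a full-scale sine wave:
    # 6.02*N + 1.76 dB (standard rule of thumb).
    return 6.02 * bits + 1.76

print(f"14-bit quantizer: {quant_dynamic_range_db(14):.1f} dB")
print(f"16-bit quantizer: {quant_dynamic_range_db(16):.1f} dB")

# If source noise toggles the lowest ~4 bits of a 14-bit converter, its
# floor sits only about (14 - 4) * 6.02 = 60 dB below full scale --
# far above the quantizer's own ~86 dB floor.
noise_floor_db = -6.02 * (14 - 4)
print(f"estimated source noise floor: {noise_floor_db:.0f} dB re full scale")
```

In other words, the recording's own noise floor, not the 14-bit converter, is the limiting factor, which is the conclusion above.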

Phil

karn@petrus.UUCP (Phil R. Karn) (11/24/85)

Blush. I guess I owe RCA at least a partial apology.

The major cause of the problem turned out to be a ground loop formed by the
ground on the Tek scope I was monitoring the video signal with and the
ground of my cable TV feed. Disconnecting either the scope or the CATV feed
from the wall cleared things up dramatically. It turns out that there is
about 2v p-p of harmonic-rich AC potential between my CATV feed and the
local power outlet.

What's strange is that the effect of the hum was noticeable only on
playback, even though it was there during E-E (electronics-to-electronics)
loopback during stop mode. I guess only when the hum was combined with the
playback degradation of the VCR was it enough to cause the PCM-F1's error
correction circuits to have problems. Or perhaps the video clamper in the
PCM-F1 wasn't as effective on a noisy playback signal.

The overshoot on playback is still there, of course, and I'd still like to
know if there is an easy way to eliminate it by flattening out the VCR's
phase and frequency response.  Clicks and pops do still occasionally occur,
and I'm sure that cleaning up the "eye" pattern would make these less
frequent.

Phil