[net.audio] CD principles

jeff@tesla.UUCP (09/21/83)

The phase-shift-vs.-frequency characteristics and the reasons for
square-wave distortion were summarized very well in the "Computers &
Electronics" article on CD players a couple of months ago.  It seems to me
that, yes, those phase shift characteristics are completely trivial compared
to the phase-vs.-frequency characteristics of a symphony orchestra in
a concert hall.  A few years ago, speaker manufacturers tried to make much
of this by setting tweeters back from woofers by a few inches.  Just
another "new speaker design!" gimmick.  I doubt that ANY distortion
(phase, etc.) in a CD player is audible at all.

Certainly, however, most of the software available is not up to the
rest of the system.

Jeff

pmr@drufl.UUCP (Rastocny) (09/22/83)

It amazes me how many people believe what they read and how few
believe their own ears.  If you don't believe that a 180 degree phase
shift is audibly confusing, switch the polarity of the speaker wires
on one of your speakers.

This is slightly different from what is happening in the D/A conversion,
but the correlation can be made.  Things just don't sound quite right
with this much phase shift.
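The wiring experiment above can be sketched numerically.  This is just an
illustration of the relative 180 degree shift it produces (the 440 Hz tone
and sample rate are arbitrary choices of mine, not from the posting):

```python
import numpy as np

# Sketch: wiring one speaker backwards inverts that channel's signal,
# so the two channels become perfectly anti-correlated.
fs = 48000                        # sample rate, Hz (arbitrary)
t = np.arange(fs) / fs            # one second of samples
left = np.sin(2 * np.pi * 440 * t)

right_normal = left.copy()        # both speakers wired the same
right_swapped = -left             # polarity reversed on one speaker

print(np.corrcoef(left, right_normal)[0, 1])   # ~ 1.0
print(np.corrcoef(left, right_swapped)[0, 1])  # ~ -1.0
```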

The best way to compare analog and digital playback integrity is to
listen to violins.  If you know what a violin sounds like, analog comes
closer to reproducing one correctly.  Next, listen to an instrument
with little harmonic content above 2 kHz, like low-level passages of a
bass drum, to eliminate a dynamic-range contest.  Digital now sounds
more accurate (phase is more linear in this region).

Phil Rastocny
AT&T Information Systems Laboratories
drufl!pmr

michaelk@tekmdp.UUCP (Michael Kersenbrock) (09/23/83)

Since a CD player handles both channels the same, shouldn't
the experiment have you swap leads on *both* speakers before looking
for the effect?  Further, just swap leads on *both* tweeters.

Mike Kersenbrock
Tektronix Microcomputer Development Products
Aloha, Oregon

P.S.- The above doesn't really have much to do with the "real"
problem of group delay distortion (which, due to other effects, isn't
a "real" problem at high frequencies anyway).

dyer@wivax.UUCP (Stephen Dyer) (09/24/83)

Switching the polarity of the wires to one speaker demonstrates
a 180 degree phase shift RELATIVE to the other speaker.
The filtering necessary during digital recording causes phase
shift, but it's applied uniformly to both channels.
Very few stereo components maintain "absolute phase", anyway.
Many well regarded power amps, for example, invert their inputs.

/Steve Dyer
decvax!wivax!dyer

howard@metheus.UUCP (09/29/83)

To Phil Rastocny (drufl!pmr):

I don't recall ANYONE claiming that you couldn't hear the difference when you
phase-shift one of two related signals which are being put onto separate
speakers, or even onto one speaker.  Consider two identical sine waves.  If
we add them IN phase we get a louder sine wave, if we add them OUT of phase we
get nothing.  Clearly these will be distinguishable by the ear if the original
sine waves were audible.
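The in-phase/out-of-phase sum described above can be checked directly (a
minimal sketch; the frequency and sample rate are my own arbitrary choices):

```python
import numpy as np

# Two identical sine waves: summed in phase they double in amplitude,
# summed 180 degrees out of phase they cancel completely.
fs = 48000                        # sample rate, Hz (arbitrary)
t = np.arange(fs) / fs            # one second of samples
s = np.sin(2 * np.pi * 440 * t)   # a 440 Hz sine wave

in_phase = s + s                  # added IN phase
out_of_phase = s + (-s)           # added OUT of phase (one inverted)

print(np.max(np.abs(in_phase)))      # ~ 2.0: twice the amplitude
print(np.max(np.abs(out_of_phase)))  # 0.0: nothing left
```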

However, this has NOTHING to do with most of the preceding discussion about
phase shifts.  It is still true that phase-shifting harmonics with respect
to one another IN A SINGLE SIGNAL produces NO (or extremely little) perceptible
change in sound.  Try reversing the leads to BOTH your speakers at once, thus
shifting the phase of ALL the signal, and tell me you can tell the difference
THEN.  I think you'll find you can't.

Next time you want to make snide remarks about people's submissions to the net,
try understanding them first.  You may find there's no need after all.

	Howard A. Landman
	tektronix!ogcvax!metheus!howard

rlr@pyuxn.UUCP (09/30/83)

I believe research has shown that one cannot tell the difference between
a tone with a certain harmonic content (say a square wave, which contains
all odd harmonics in amounts inversely proportional to their harmonic
number) and another tone with equivalent harmonic content in which the
phases of some of the harmonic components are reversed (the resulting
waveform would no longer be square in shape).  It is my understanding
that THIS is the kind of phase shift we are talking about here.  It does
not imply anything about one speaker producing the square wave and the
other producing the altered tone (i.e., phase shift in one of the two
speakers but not the other).  It implies that if a range of frequencies
is shifted uniformly in both speakers, the listener cannot distinguish
this from the original sound.
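The claim above can be sketched numerically: reversing the phase of some
harmonics changes the waveform's shape but leaves its harmonic content
(magnitude spectrum) identical.  The harmonic count and sample grid here
are arbitrary choices of mine, not from any posting:

```python
import numpy as np

N = 1024
t = np.arange(N) / N
harmonics = range(1, 20, 2)   # odd harmonics, amplitudes 1/n: a square wave

square = sum(np.sin(2 * np.pi * n * t) / n for n in harmonics)
# Reverse (shift 180 degrees) some of the harmonics:
altered = sum((-1 if n % 4 == 3 else 1) * np.sin(2 * np.pi * n * t) / n
              for n in harmonics)

mag_square = np.abs(np.fft.rfft(square))
mag_altered = np.abs(np.fft.rfft(altered))

print(np.allclose(mag_square, mag_altered))  # True: same harmonic content
print(np.allclose(square, altered))          # False: different wave shape
```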

This was from a course in electronic music and acoustics many years ago, so
I don't remember where this is documented.

mat@hou5d.UUCP (M Terribile) (10/01/83)

Regarding the question of whether phase distortion (i.e., dispersion and
group delay) can be heard, and whether harmonic content is all that matters:

I would not be surprised in the least to hear that in the range of 3kHz
and up there is no audible difference.

On the other hand, since we can begin to distinguish between sound events
that happen a few milliseconds, or a few tens of milliseconds, apart,
I would not be surprised if such distortion became audible when the
frequencies involved are 300 Hz or below.

With CD players, the noticeable phase distortion takes place at
frequencies above 10 kHz.  I would be more concerned with the fact that
a few people have SOME perception of sounds over 20 kHz and that CDs are
limited to not much more than that.

					Mark Terribile
					hou5d!mat