[net.audio] linear- vs. minimal-phase: Long and Technical

keithe@sri-unix (11/18/82)

I can't let the comment that came across this newsgroup a few days
ago go unchallenged.


	Subject: phase response of digital recorders
	Newsgroups: net.audio


	At the recent AES show, I spoke with an engineer from
	Soundstream and a product manager (ex-engineer?) from Sony.
	Both of them claimed that the low pass filters that they used
	in the input and output stages of their digital recorders were
	minimum phase (in what freq range? hmmmm) and were associated
	with appropriate all pass filters to make the phase shift 
	( nearly?) zero for all frequencies. As discussed in a recent
	article in JAES, a constant phase shift is probably not
	acceptable for audio reproduction (especially if the shift is
	180 degrees).

  -->   For those who are not familiar with the terminology, minimum
  -->   phase means that the circuit has a linear phase shift vs
  -->   frequency curve.

Well, not in my book...

In conventional filter design, Minimal Phase means something a WHOLE
LOT DIFFERENT from Linear Phase (excuse the shouting).

A minimal-phase filter is one in which each frequency is shifted the
minimum amount possible while still attaining the prescribed magnitude
alteration of the signal. The phase shift here means the phase of the
signal measured from the input of the network to the output of the
network. The good ol' Butterworth class of filters is an example of
such a filter (so are the Chebyshev and Cauer-Chebyshev, i.e.
elliptic, families).
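
(If you'd rather poke at this with a computer than with a signal
generator, here is a rough sketch of the textbook check that a
Butterworth lowpass is minimum phase: every pole in the left half of
the s-plane and no right-half-plane zeros. The sketch uses Python with
NumPy/SciPy, which is simply my choice of tools, and the order and
cutoff are numbers I made up for illustration.)

    import numpy as np
    from scipy import signal

    # 4th-order analog Butterworth lowpass; the 1 kHz cutoff is just an example.
    b, a = signal.butter(4, 2.0 * np.pi * 1000.0, btype='low', analog=True)

    poles = np.roots(a)    # all four poles lie in the left half-plane
    zeros = np.roots(b)    # a Butterworth lowpass has no finite zeros at all
    print("pole real parts:", np.real(poles))
    print("minimum phase?", bool(np.all(np.real(poles) < 0) and zeros.size == 0))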

This phase shift can be measured by connecting a CW (continuous-wave)
sine-wave generator to the network and, at each frequency of interest,
measuring the input-to-output phase shift. (Be careful if you're
measuring networks which shift the phase by more than 2 PI radians.
You'll only find this out by measuring the phase shift over a
sufficiently wide frequency range to see the effect creeping up on you
when you plot the results.)

If you do this test, and plot the results of phase-shift versus
frequency, you will see that you get a decidedly curved plot.
The signal at frequencies near the cut-off frequency will undergo the
most serious phase-shift, with other frequencies less affected.
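
(Here is the same sort of measurement done numerically: a rough
sketch, again in Python/SciPy with a sample rate and cutoff I picked
out of thin air, that sweeps a digital Butterworth and unwraps the
phase so the 2-PI jumps warned about above don't spoil the plot.)

    import numpy as np
    from scipy import signal

    fs = 44100.0                                 # assumed sample rate
    b, a = signal.butter(6, 20000.0, fs=fs)      # 6th-order lowpass near 20 kHz
    w, h = signal.freqz(b, a, worN=2048, fs=fs)  # complex response vs. frequency (Hz)

    phase = np.unwrap(np.angle(h))               # remove the 2*PI jumps before plotting

    # The phase shift piles up fastest near the cutoff; low frequencies
    # are barely touched.
    for f in (1000.0, 10000.0, 19000.0, 20000.0):
        i = np.argmin(np.abs(w - f))
        print(f"{w[i]:8.0f} Hz   phase = {np.degrees(phase[i]):8.1f} deg")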

A linear-phase filter is significantly different, and usually a lot
more work to accomplish. It is a filter which, when tested in the
manner described above, yields a phase-shift versus frequency curve
that tries its damnedest to approximate a straight line. The slope of
the line is unimportant, just as long as it is STRAIGHT. This is
because the parameter of interest is not really the absolute
phase-shift, but rather the first derivative of phase-shift with
respect to frequency: this is what is called "Group Delay" (actually,
you have to put a minus sign in front of the derivative to call it
group delay, but that's one of those things that we used to say
'comes out in the wash').
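
(One more sketch in the same vein, with the same caveat that the tap
count, cutoff, and sample rate are just example values: a symmetric
FIR filter is linear phase, so its group delay comes out dead flat,
while the Butterworth's group delay wanders, worst near the cutoff.)

    import numpy as np
    from scipy import signal

    fs = 44100.0                                   # assumed sample rate
    fir = signal.firwin(101, 20000.0, fs=fs)       # 101-tap symmetric (linear-phase) lowpass
    iir_b, iir_a = signal.butter(6, 20000.0, fs=fs)

    freqs = np.linspace(100.0, 18000.0, 200)       # stay inside the passband
    _, gd_fir = signal.group_delay((fir, [1.0]), w=freqs, fs=fs)
    _, gd_iir = signal.group_delay((iir_b, iir_a), w=freqs, fs=fs)

    # Symmetric 101-tap FIR: group delay is (101 - 1)/2 = 50 samples everywhere.
    print("FIR group delay:", gd_fir.min(), "to", gd_fir.max(), "samples")
    # Butterworth: group delay varies, climbing as the cutoff approaches.
    print("IIR group delay:", gd_iir.min(), "to", gd_iir.max(), "samples")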

Now that I've bored you to tears with this, let me add a caveat: it
appears that the engineer and the product manager mentioned in the
earlier news item really meant that the filters were "linear phase,"
but used the term "minimal phase" as a contraction for "minimal
phase distortion". This would make sense because of the mention of
"appropriate all pass filters" associated with the filters.
It is the all-pass networks, by the way, which are responsible for
making the overall filtering system [system = filter + allpass] into
a linear-phase one; without altering the magnitude response, they add
phase shift at the frequencies that need it, so that when combined
with the phase shift created by the filter (predominantly near the
cutoff frequency) the total comes out as a straight line.
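
(To see why an all-pass is the right tool for this job, here is a
last little sketch, same assumed Python/SciPy setup with a coefficient
chosen arbitrarily, of a first-order digital all-pass section: its
magnitude is exactly 1 at every frequency, yet it still shifts phase.)

    import numpy as np
    from scipy import signal

    c = 0.5            # all-pass coefficient, |c| < 1 for stability (example value)
    b = [c, 1.0]       # numerator:   c + z**-1
    a = [1.0, c]       # denominator: 1 + c*z**-1

    w, h = signal.freqz(b, a, worN=1024)
    print("max deviation of |H| from 1:", np.max(np.abs(np.abs(h) - 1.0)))  # ~1e-16
    print("phase shift at w = pi/2:", np.angle(h[512]), "radians")          # nonzero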

From what I understand, the ear/brain combination uses phase 
information as a method of locating the source of the sound, but not 
in identifying or recognizing the sound. So an audio reproduction 
system which suffers (severe) phase distortion will "sound ok," but 
the imaging of the sound sources will range from muddled to indistinct. 
That may be acceptable for some types of music (rock? - no flames), 
but would be unacceptable for renditions of orchestral music. Or you 
could just go monaural and forget the whole problem?

With great apologies for the length...

Keith Ericson
Teklabs