pd (12/16/82)
I have some comments on the Audiophile vs. Digital Recording debate. The audiophile's position, which I have heard often, is "It doesn't sound right, so there's something wrong with it." To which the digital enthusiast retorts: "All audio signals can be expressed as a sum of sine waves with frequencies up to 20 kHz, and digital techniques can accurately reproduce sine waves up to 20 kHz. So digital recordings can accurately reproduce all audio signals." At which point the audiophile retreats, baffled and beaten.

However, Fourier transforms attempt to MODEL a REALITY. They are merely a way of mathematically representing a REAL phenomenon. Fourier never claimed (like the good scientist he was) that his frequency-domain representation of an audio signal, for example, was a complete explanation of reality. A real scientist's attitude would be not to reject the audiophile's perception of poor quality in digital recordings because it doesn't fit within Fourier- and z-transform models, but to devise repeatable, comprehensive experiments to gather data on the audiophiles' dislike of digital recordings, and then attempt to explain it! Who knows, we might actually learn something!

Prem Devanbu
ark (12/17/82)
I have yet to see documentation of a carefully constructed double-blind experiment in which a panel of "experienced listeners" consistently found a well-made digital recording to sound inferior to the best analog recordings.
karn (12/17/82)
A properly designed digital audio system records waveforms in the time domain by sampling them at a rate greater than twice the highest frequency component in the input signal. It doesn't matter what the sampling rate is, so long as it is at least the Nyquist rate (twice the highest frequency) and the anti-aliasing input filter doesn't distort or roll off the highest-frequency desired input signal. A 200 kHz sampling rate would be a complete waste of bits. The human ear CANNOT distinguish between a 20 kHz sine wave and a 20 kHz non-sinusoidal wave unless your hearing extends to at least 40 kHz, because the lowest component on which they differ is the second harmonic at 40 kHz. This is an established fact. For this reason, a digital audio system's inability to reproduce 20 kHz square waves means nothing.

As far as phase balance errors between channels are concerned, I find this hard to accept as a real problem. Even if both channels are not sampled simultaneously, that is fine so long as the D/A converters reconstructing the output signals update their outputs with the same timing. If not, it would be an inexcusable design error which could easily have been avoided. I'm sure that analog tape heads introduce more phase balance error simply by being out of azimuth alignment than any production digital recording system does. To show the effect of azimuth alignment on channel-to-channel phase delay, try this experiment: play a high-frequency tone (a head alignment tape always has one) with the output channels combined (mono), and adjust the head azimuth. You will hear "beats" as the adjustment is made; this is the result of the two tracks going alternately in and out of phase as the angle of the playback head changes. The proper adjustment is to peak the tone on the "center" beat. I usually find I can get to the proper point by adjusting azimuth first on music known to have been recorded on a well-aligned deck, peaking the high frequencies by ear, then making the fine adjustment with the high-frequency alignment tape. Only a small change in azimuth is sufficient to cause a 10 degree phase imbalance between channels at 10-15 kHz.

Back to digital audio: my point is that almost any shortcoming of digital audio, imagined or otherwise, is far more likely to actually occur in conventional analog recorders. Try aligning even a good-quality analog deck at high speed with good tape, and you'll be amazed at how imprecise they really are compared with digital's known performance. Again, I challenge anybody to produce a properly controlled, double-blind study that shows 50 kHz, 16-bit digital audio to be inferior to the best possible analog recording techniques.

Phil Karn
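A quick numerical sketch of that 40 kHz point, for anyone who would rather see it than take it on faith. It assumes numpy and scipy are at hand; the 1 MHz simulation rate and the 25 kHz filter cutoff are arbitrary choices for the illustration, not parameters of any real recorder. Band-limiting a 20 kHz square wave so that everything at 40 kHz and above is removed should leave nothing but the 20 kHz fundamental, i.e., a plain sine wave:

# Minimal sketch: a 20 kHz square wave with its ultrasonic harmonics removed
# is just a 20 kHz sine.  Simulation rate and filter cutoff are arbitrary.
import numpy as np
from scipy import signal

fs = 1_000_000          # 1 MHz "analog" rate for the simulation
f0 = 20_000             # 20 kHz fundamental
t = np.arange(0, 0.01, 1 / fs)

square = signal.square(2 * np.pi * f0 * t)   # the "non-sinusoidal" 20 kHz wave

# Low-pass just above 20 kHz, i.e. strip everything the ear would need
# 40 kHz-plus hearing to detect (the harmonics at 40, 60 kHz and up).
sos = signal.butter(8, 25_000, btype="low", fs=fs, output="sos")
square_bandlimited = signal.sosfiltfilt(sos, square)

def spectrum_peaks(x):
    """Frequencies (Hz) of spectral lines above 5% of the strongest line."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs[spec > 0.05 * spec.max()]

print("square wave components (kHz):      ", spectrum_peaks(square)[:5] / 1e3)
print("band-limited square components (kHz):", spectrum_peaks(square_bandlimited)[:5] / 1e3)

The unfiltered square should show lines at 20, 60, 100 kHz and up; the band-limited version should show only the 20 kHz line, which is exactly what any properly filtered 20 kHz waveform looks like to a listener.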
burris (12/17/82)
With only a very few exceptions, any audio signal CAN be represented by Fourier analysis. A Fourier series will exactly represent any PERIODIC function, and this is the key to the whole argument: most audio signals change in frequency and amplitude very slowly with respect to even the lowest frequency to be reproduced, so with the proper sampling and filtering an audio signal is, for all practical purposes, a periodic function.

The problem of a 20 kHz square wave is a valid one. To reproduce something that is a REASONABLE approximation of a 20 kHz square wave, at least the fundamental plus the first three odd harmonics (3, 5, and 7 times 20 kHz) should be present at the correct amplitudes. That would require a response of 140 kHz if trying to use sine waves to reconstruct a 20 kHz square wave. There IS a perceptible difference between a 20 kHz sine wave and a 20 kHz square wave, but I'm not sure there would be a perceptible difference if even just the first odd harmonic were present; that would still require a response of 60 kHz. A digital format with the capability to vary the slope rate would stand a better chance, especially if the clock rate were high enough. The main question is what the audible waveshape of a square wave would look like after it passed through the amplifier, recorder, and reproduction chain and your ears, even using the best of analog equipment.

Dave Burris
ihlpb!burris
BTL - Naperville
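A short numpy sketch of the bandwidth arithmetic above, using the standard square-wave Fourier series (amplitude 4/(pi*n) at odd multiples n of the fundamental). The time grid is an arbitrary choice for the illustration:

# Sketch: how much bandwidth each partial-sum approximation of a 20 kHz
# square wave actually needs.  Pure numpy; nothing models a real recorder.
import numpy as np

f0 = 20_000                       # fundamental, Hz
t = np.linspace(0, 2 / f0, 2000)  # two periods, finely sampled

def square_partial_sum(n_odd_harmonics):
    """Fundamental plus the first n_odd_harmonics odd harmonics (3f, 5f, ...)."""
    orders = [1] + [2 * k + 1 for k in range(1, n_odd_harmonics + 1)]
    wave = sum((4 / (np.pi * n)) * np.sin(2 * np.pi * n * f0 * t) for n in orders)
    return wave, orders[-1] * f0   # waveform and its highest frequency component

for n in (0, 1, 3):
    wave, top = square_partial_sum(n)
    print(f"fundamental + {n} odd harmonic(s): response needed out to {top/1e3:.0f} kHz, "
          f"peak amplitude {wave.max():.2f} (ideal square = 1.00)")

With no odd harmonics beyond the fundamental the top component is 20 kHz, with one it is 60 kHz, and with three it is 140 kHz, which is where the figures in the note above come from.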
FtG (12/17/82)
This digital discussion reminds me of the great hi-fi debate of so many years ago. It seems the "experts" knew that there was no point in reproducing high frequencies since human beans couldn't hear them. Well... somebody performed an experiment using live musicians (as opposed to dead ones?) separated from the listeners by a system of baffles. By adjusting the baffles, the high frequencies could be damped out, and the listeners noticed a significant loss of fidelity. The experts hadn't taken into account the huge difference between detecting a single tone and "sensing" a full range of harmonics. Thus hi-fi was born. (Coming next week, Uncle Ferd will tell you the story of stereo!!!) It seems that the digital people should have known about these experiments and taken into account the complex nature of hearing. In any case, I can easily imagine the experts being wrong again.

FtG
thomas (12/17/82)
At the risk of being flamed off the net (watch out for the Geisthounds!), I will add my two cents to this debate.

1. How many of you can hear 20 kHz anyway? Can you hear the whine of a TV set (15.75 kHz)? If so, can you hear the ultrasonic burglar alarms in some retail stores (about 18 kHz; K-Mart has them around here)? If you answer no to either of these questions, then you have nothing to worry about when it comes to 20 kHz signals, whether sine or square.

2. You certainly can't record a 20 kHz square wave on most analog recording media, either. Most serious listeners have cassette decks these days; how many cassette tapes have a frequency response above 20 kHz? How can you say "digital is inferior because it won't reproduce a 20 kHz square wave" when your current medium won't either? If somebody can show that the human ear can distinguish between a sine wave and a square wave at the upper limit of hearing, then that is a motivation to improve things. Note that feeding a sine wave and a square wave into a speaker and listening for the difference doesn't count as a proper comparison (as somebody pointed out): because of non-linearities in the reproduction system, a square wave can actually cause some lower frequencies to be generated by the speaker. A valid comparison would involve generating sine and square pressure waves in the air.

3. Even listening comparisons must be done carefully. I used to work for a small digital audio firm, and we once got one of a competitor's recorders in (a large recording firm had a large backlog because editing on that competitor's system was very difficult, and they asked for help in getting a couple of recordings edited). The converters in this recorder were so badly adjusted that they degraded the recording and reproduction by more than 20 dB: although the theoretical S/N ratio was 96 dB, the actual measured S/N was 70 dB (a quick numerical sketch of those two figures follows this note). Needless to say, this made music played on that system sound worse than it could have. It would be a shame if sloppy manufacturing practices ruined the potential of digital recording.

4. For those of you who claim that listening is the real test, I have done a lot of listening to REAL digital recordings (not the record pressings, but the actual bits themselves). They have been, in general, the best reproductions of music I have ever heard. Instruments such as trumpets and drums, which suffer badly in most traditional recording media, sound just like trumpets and drums. The imaging is fantastic (arguing against those who claim that digital recording will necessarily introduce huge phase shifts). When I closed my eyes, I could imagine that I was sitting in the ideal seat in an otherwise empty concert hall, listening to a flawless performance (since mistakes can be edited out). It was the best listening experience I have ever had. I, for one, can't wait for home digital (but when I get mine, I'm going to take it to my friendly local digital music company and get those D/A converters adjusted, that's for sure).

Digitally yours,
Spencer
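A rough sketch of where the numbers in point 3 come from, assuming numpy is available. The 96 dB figure is the 6 dB-per-bit rule of thumb for 16 bits (an exact calculation for a full-scale sine gives closer to 98 dB), and the extra converter noise below is a made-up level chosen only to land near the 70 dB that was measured, not data from the machine in question:

# Sketch: quantization noise floor of a 16-bit converter, and how added
# analog noise from a misadjusted converter eats into it.  The noise level
# is a hypothetical value picked to give roughly 70 dB, not a measurement.
import numpy as np

rng = np.random.default_rng(0)
fs, f0, bits = 50_000, 997, 16
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * f0 * t)            # full-scale test tone

def snr_db(clean, degraded):
    err = degraded - clean
    return 10 * np.log10(np.mean(clean ** 2) / np.mean(err ** 2))

# Ideal 16-bit quantization over a -1..+1 full scale.
step = 2.0 / 2 ** bits
quantized = np.round(x / step) * step
print(f"ideal 16-bit S/N: {snr_db(x, quantized):.1f} dB")        # about 98 dB

# Misadjusted converter modeled as extra noise ahead of an ideal quantizer.
noisy = np.round((x + rng.normal(0, 2.2e-4, x.size)) / step) * step
print(f"misadjusted converter S/N: {snr_db(x, noisy):.1f} dB")   # about 70 dB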
mat (12/18/82)
I for one CAN hear TV raster, though I don't know if it is the 15.75 kHz scan frequency itself or some lower frequency that is being generated. In any case it is one of the most offensive sounds that I know. Yes, my terminal bothers me a little too! I can also hear SOME of the ultrasonic store alarms, and can 'feel' some others. This notwithstanding, I find most digital recordings just fine. Some of them seem a bit on the bright side after listening to 10 or 15 year old recordings... but is that the fault of the digital process or of the older, less accurate recordings?

Every medium introduces SOME effects, either as a result of the medium itself or as a result of the paraphernalia that accompanies the medium. There are some people who feel that Avery Fisher Hall is too 'bright' and prefer Carnegie Hall. There are some who find the audible delay in sound propagation in Carnegie Hall disconcerting and prefer the more 'accurate' Avery Fisher. Are recordings made in anechoic chambers with perfect mikes? What about reflections from the recording artist's body, the mike stands, etc.? And how many of us listen at the ideal locations for proper phasing relative to our speakers... in an anechoic chamber? Yes, there ARE bad things that can happen to the hi-fi signal... but not all things that happen must be bad, and some of them can thoroughly mask many of the 'distortions' that get introduced. --NEXT PLEASE!