[net.audio] connecting cables and damping

ms (03/22/83)

In response to uiucdcs!emrath:

And that's the same type of literature that says 1 watt
is sufficient amplifier power, under the most ideal conditions
with 100% efficiency.  But that's not the world stereo
equipment lives in.

I don't claim to have golden ears either, but my equipment *is*
very good, and I use heavy cable (Monster Cable type).  Your
listening test wasn't very well controlled; you didn't even
do a simple A/B comparison.  Maybe the following comparison is
too simple, but I think it illustrates one of the main reasons
for using heavy gauge wire.  Heavier (thicker) wire can carry more
current with less loss, which is why high-tension power lines and
jumper cables are so big.  It is now more or less agreed that an
amplifier's current capability is more important than its power;
therefore, heavy cable lets the amplifier deliver that current
most efficiently, especially over long runs.
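
To put rough numbers on the loss argument, here is a small Python
sketch (mine, not from any of the posts) using standard wire-table
resistance values for copper; the 25-foot run and 8-ohm load are
assumptions chosen purely for illustration:

import math

# Copper resistance in ohms per 1000 ft at room temperature
# (standard wire-table values).
OHMS_PER_1000FT = {18: 6.385, 16: 4.016, 14: 2.525, 12: 1.588}

def cable_loss(gauge, run_ft, load_ohms=8.0):
    """Level drop at the speaker caused by cable resistance.
    A speaker run is a round trip, so wire length = 2 * run."""
    r = OHMS_PER_1000FT[gauge] * (2 * run_ft) / 1000.0
    db = 20 * math.log10(load_ohms / (load_ohms + r))  # voltage divider
    return r, db

for gauge in (18, 16, 14, 12):
    r, db = cable_loss(gauge, run_ft=25)
    print(f"{gauge:2d} Ga., 25 ft run: {r:.3f} ohm, {db:+.2f} dB")

Going from 18 to 12 gauge cuts the series resistance, and hence the
loss, by roughly a factor of four.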

What about the difference between resistance and impedance and
their interaction with damping?  I know that speakers are really
measured by impedance, NOT resistance.  Is it not possible that,
if two speakers have widely varying impedance, within or between
themselves, a difference in cable gauge will let the amplifier
"see" the load in a different manner that affects the sound?
That's just a question up for grabs; I don't know the answer.
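
One way to make the question concrete: a speaker's impedance swings
with frequency, so a fixed cable resistance forms a frequency-dependent
voltage divider.  A sketch, with an impedance curve I invented purely
for illustration (real "8 ohm" speakers commonly dip toward 4 ohms
and peak well above nominal):

import math

# Hypothetical impedance curve (Hz -> ohms) for an "8 ohm" speaker.
impedance_at_hz = {30: 16.0, 100: 6.0, 400: 4.0, 1000: 8.0, 10000: 12.0}

def divider_ripple_db(r_cable):
    """Spread (max - min) of the cable's voltage-divider loss across
    the curve - the response error the cable resistance adds."""
    loss = [20 * math.log10(z / (z + r_cable))
            for z in impedance_at_hz.values()]
    return max(loss) - min(loss)

print(f"0.639 ohm cable: {divider_ripple_db(0.639):.2f} dB ripple")
print(f"0.080 ohm cable: {divider_ripple_db(0.080):.2f} dB ripple")

So a higher-resistance cable does make the amplifier drive the load
differently at different frequencies; whether a dB or less of ripple
is audible is exactly the argument here.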

Also, I think that damping is frequency dependent, i.e., amplifier
specs give damping at several specific frequencies, or over a
frequency range.  So, the technical literature says 4 is enough,
but at what frequency?  Is it a misconception that damping matters
most at the very low frequencies?  Is 4 enough at 30 Hz?
(And yes, I do have transmission-line subwoofers that go down to
30 Hz, powered by a Dynaco 416 at 200 watts/ch - not that many
sources go that low, but I do like direct-to-disk and audiophile
records that have honest bass and wide dynamic range.)
Does simply changing the resistance change the damping?
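
On that last question: damping factor is just the load impedance
divided by the source impedance the speaker sees, and the cable
resistance sits in series with the amplifier's output impedance, so
changing the resistance does change the effective damping at the
speaker terminals.  A sketch (the amplifier and cable figures are
assumed, not measured):

def effective_damping(z_load, z_out_amp, r_cable):
    """Damping factor seen from the speaker terminals: cable
    resistance adds in series with the amp's output impedance."""
    return z_load / (z_out_amp + r_cable)

# An amp rated at damping factor 100 into 8 ohms has an output
# impedance of 8/100 = 0.08 ohm.
z_out = 8.0 / 100
print(effective_damping(8.0, z_out, r_cable=0.639))  # ~11.1, cable dominates
print(effective_damping(8.0, z_out, r_cable=0.080))  # ~50.0

The cable's contribution is essentially flat with frequency, so any
frequency dependence in a damping spec comes from the amplifier's own
output impedance.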

I'm pretty much middle of the road on this issue; I wouldn't buy
cables just because they use some fancy or esoteric theory.  But
it seems entirely reasonable to let more amplifier current reach
the speakers, so I simply use very heavy gauge wire.  There is
sometimes a gap between theory and reality, between what can be
heard and what can be measured.  And just because *you* can't hear a
difference (in a poorly controlled test) doesn't mean the cables
aren't worth it to a lot of other people who can hear a difference
with superior equipment.

burris (03/23/83)

The power supply voltage and the load impedance together determine
the maximum output power to the load (assuming there is enough gain
to reach this amplitude).  This maximum power can be reached under
"real-life" conditions given two stipulations (a numerical sketch
follows the list):

 1. The current capability of the power supply must be large
    enough to drive the lowest load impedance at full output,
    to prevent clipping caused by voltage drops in the supply.

 2. The output transistors must be capable of handling the current
    required to drive the load to full output without being
    destroyed.
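
A sketch of both stipulations in Python (the 40-volt rails are my
assumption, chosen to give a 100 W / 8 ohm amplifier; devices are
taken as ideal, with no losses):

def max_sine_power(v_rail, z_load):
    """Ideal max undistorted sine power: the output can swing to
    about v_rail peak, so P = Vpeak^2 / (2 * R)."""
    return v_rail ** 2 / (2 * z_load)

def peak_current(v_rail, z_load):
    """Peak current the supply and output transistors must handle."""
    return v_rail / z_load

for z in (8.0, 4.0, 2.0):
    print(f"{z:3.0f} ohms: {max_sine_power(40.0, z):5.0f} W max, "
          f"{peak_current(40.0, z):4.1f} A peak")

Halving the impedance doubles both the power available and the
current demanded, which is why the supply and the output devices,
not the voltage gain, set the limit into low-impedance loads.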

Dave Burris
ihlpf!burris
BTL - Naperville

emrath (03/24/83)

I tried very carefully to compare the sound from my speakers with a damping
factor of 100 (or whatever my amps are capable of - I haven't bothered to
measure them) versus a damping factor of 2.  I convinced myself that even if
somebody could hear the difference in an A/B comparison, this difference is
of NO significance to ME, and it will be forgotten within 10 seconds of
switching.  This is all strictly my opinion, and I fully agree that some
people will find special speaker cables to be worth their price.

Notes:
  1)	I always assume that a stated damping factor is a minimum over the
	audio frequency range (20 Hz - 20 kHz), unless otherwise specified.
  2)	As for losses, using a 0.639-ohm wire (100 feet of 18 Ga. copper @ 68
	deg. F) with 8-ohm speakers causes less than a 10% drop in current,
	which is a level loss of less than 1 dB (i.e. inaudible), and gives a
	damping factor slightly greater than 10 (assuming the output impedance
	of the amp is much lower than that of the wire - it generally would
	be).  These figures are checked in the sketch below.
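
A quick check of the figures in note 2 (taking 0.639 ohm as the
total wire resistance, as stated, and neglecting the amplifier's
output impedance):

import math

r_wire, z_spkr = 0.639, 8.0   # figures from note 2

drop = 1 - z_spkr / (z_spkr + r_wire)                # vs. lossless cable
loss_db = 20 * math.log10(z_spkr / (z_spkr + r_wire))
damping = z_spkr / r_wire

print(f"current drop:   {drop:.1%}")        # ~7.4%  (under 10%)
print(f"level loss:     {loss_db:.2f} dB")  # ~-0.67 (under 1 dB)
print(f"damping factor: {damping:.1f}")     # ~12.5  (a bit over 10)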

By the way, I am currently using a homemade amplifier (not the amps mentioned
above) which is 1 Watt per channel.  The amps clip at 4.9 volts into 8 ohms
at 50 Hz, both channels driven (same phase).
I built my own LED peak indicators.  Yellow triggers at +/-1.33 V, red at
+/-4.0 V.  (A 4-volt 0-peak sine wave is 1 W rms into 8 ohms.)
Attack time is less than 50 microsecs. An overvoltage of this duration
guarantees that the LED comes full on for at least 25 millisec (which is
quite visible). I sometimes turn the volume up to where the yellows are
flashing, but I seldom turn it up to where the reds are activated. In
testing, I found that I could raise the volume 12 dB above the level that
never triggers the reds before I could perceive the distortion, even though
I knew damn well the thing was clipping!  I originally built this thing
as a headphone amp, but it worked so well with my speakers (Pioneer CS-63s;
kind of old, but rather efficient) that I have been living with it for
almost a year.
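
The quoted thresholds check out; a short verification (taking the
4.9-volt clipping figure as a 0-peak value, which the post does not
state explicitly):

import math

def sine_power_rms(v_peak, z_load=8.0):
    """RMS power of a sine of the given 0-peak amplitude into a
    resistive load."""
    return (v_peak / math.sqrt(2)) ** 2 / z_load

red = sine_power_rms(4.0)       # 1.00 W, as stated in the post
yellow = sine_power_rms(1.33)   # ~0.11 W
print(f"red:    {red:.2f} W")
print(f"yellow: {yellow:.2f} W ({10 * math.log10(yellow / red):+.1f} dB re red)")
print(f"clip:   {sine_power_rms(4.9):.2f} W")  # ~1.5 W of headroom over red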