[sci.electronics] Defining the Analog/Digital Distinction

harnad@mind.UUCP (Stevan Harnad) (10/27/86)

Tom Dietterich (orstcs!tgd) responds as follows to my challenge to
define the A/D distinction:

>	In any representation, certain properties of the representational
>	medium are exploited to carry information. Digital representations
>	tend to exploit fewer properties of the medium. For example, in
>	digital electronics, a 0 could be defined as anything below .2 volts
>	and a 1 as anything above 4 volts. This is a simple distinction.
>	An analog representation of a signal (e.g., in an audio amplifier)
>	requires a much finer grain of distinctions--it exploits the
>	continuity of voltage to represent, for example, the loudness
>	of a sound.
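
As an illustration of the convention just quoted, here is a minimal Python
sketch. The 0.2 V and 4 V cutoffs are Dietterich's; the function names and
the handling of the in-between region are assumptions of mine, not anything
in the original posts.

    def digital_value(voltage):
        """Read a voltage under the quoted convention: below 0.2 V is a 0,
        above 4 V is a 1; the gap in between is treated here as undefined
        (an assumption about how the forbidden region is handled)."""
        if voltage < 0.2:
            return 0
        if voltage > 4.0:
            return 1
        return None                    # neither a valid 0 nor a valid 1

    def analog_value(voltage):
        return voltage                 # the voltage itself carries the information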

So far so good. Analog representations "exploit" more of the properties 
(e.g., continuity) of the "representational" (physical?) medium to carry
information. But then is the difference between an A and a D representation
just that one is more (exploitative) and the other less? Is it not rather that
they carry information and/or represent in a DIFFERENT WAY? In what
does that difference consist? (And what does "exploit" mean? Exploit
for whom?)

>	A related notion of digital and analog can be obtained by considering
>	what kinds of transformations can be applied without losing
>	information. Digital signals can generally be transformed in more
>	ways--precisely because they do not exploit as many properties of the
>	representational medium. Hence, if we add .1 volts to a digital 0 as
>	defined above, the result will either still be 0 or else be undefined
>	(and hence [un]detectable). A digital 1 remains unchanged under
>	addition of .1 volts. However, the analog signal would be
>	changed under ANY addition of voltage.
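
Continuing the sketch above (again my own illustration, assuming the same
0.2 V / 4 V convention): the 0.1 V perturbation described in the quote leaves
the digital readings alone, while any perturbation at all changes an analog
value.

    def digital_value(v):                      # same 0.2 V / 4 V convention as before
        return 0 if v < 0.2 else (1 if v > 4.0 else None)

    assert digital_value(0.05) == 0            # a valid digital 0
    assert digital_value(0.05 + 0.1) == 0      # a 0.1 V bump: still reads as 0
    assert digital_value(4.5 + 0.1) == 1       # a digital 1 is likewise unchanged
    assert (1.30 + 0.1) != 1.30                # but ANY added voltage changes an analog level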

"Preserving information under transformations" also sounds like a good
candidate. But it seems to me that preservation-under-transformation
is (or ought to be) a two-way street. Digital representations may be
robust within their respective discrete boundaries, but it hardly
sounds information-preserving to lose all the information between .2
volts and 4 volts. I would think that the invertibility of analog
transformations might be a better instance of information preservation than
the irretrievable losses of A/D. And this still seems to side-step the
question of WHAT information is preserved, and in what way, by analog
and digital representations, respectively. And should we be focusing on
representations in this discussion, or on transformations (A/A, A/D,
D/D, D/A)? Finally, what is the relation between a digital
representation and a symbolic representation?
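
To make the invertibility point concrete, a small Python sketch (mine; the
8-bit quantizer is an arbitrary stand-in for an A/D step, not anything
described in the posts): a fixed analog gain can be undone exactly, while
quantization maps many distinct inputs to one code and so cannot be inverted.

    def analog_gain(x, g=2.0):
        return g * x                            # invertible: dividing by g recovers x

    def adc_8bit(x, full_scale=5.0):
        """Arbitrary 8-bit quantizer, for illustration only."""
        code = round(x / full_scale * 255)
        return max(0, min(255, code))

    x1, x2 = 1.230, 1.234                       # two distinct "analog" values
    assert analog_gain(x1) / 2.0 == x1          # the analog transformation is reversible
    assert adc_8bit(x1) == adc_8bit(x2)         # both collapse to the same code:
                                                # the difference is irretrievably lost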

Please keep those definitions coming.

Stevan Harnad
{allegra, bellcore, seismo, packard}  !princeton!mind!harnad
harnad%mind@princeton.csnet
(609)-921-7771

jj@alice.UUCP (10/27/86)

> From allegra!princeton!mind!harnad
> 
> Tom Dietterich (orstcs!tgd) responds as follows to my challenge to
> define the A/D distinction:
> ...
> Please keep those definitions coming.
> 
> Stevan Harnad

What a pleasant little bit of sophistry.  Mr. Harnad asks for a definition
of "digital" and "analog", both words used in a precise way in a particular
literature.  He also asks that we not use other words from that literature
to write the definition.

In other words, we are asked to define something precisely, in a language
that does not have precise values.


I suggest the first chapter of Rabiner and Gold, all of Wozencraft and Jacobs,
and perhaps a good general text on signal processing for starters.  That will
define the language.  Then the definition can be made.

Philosophy is wonderful; it doesn't have to have anything to do
with reality.
-- 
WOBEGON WON'T BE GONE, TEDDY BEAR PICNIC AT 11.
"If you love that Politician, use your Loo, use your Loo"

(ihnp4;allegra;research)!alice!jj

harnad@mind.UUCP (Stevan Harnad) (10/30/86)

[Could someone post this to sci.electronics, to which I have no access, please?]

------
(1)
ken@rochester.arpa writes:

>	I think the distinction is simply this: digital deals with a finite set
>	of discrete {voltage, current, whatever} levels, while analog deals
>	with a *potentially* infinite set of levels. Now I know you are going
>	to say that analog is discrete at the electron noise level but the
>	circuits are built on the assumption that the spectrum is continuous.
>	This leads to different mathematical analyses.

It sounds as if a problem of fact is being remedied by an assumption here.
Nor do potential infinities appear to remedy the problem; there are perfectly
discrete potential infinities. The A/D distinction is again looking
approximate, relative and scale-dependent, hence, in a sense, arbitrary.

>	Sort of like infinite memory Turing machines, we don't have them but
>	we program computers as if they had infinite memory and in practice
>	as long as we don't run out, it's ok. So as long as we don't notice
>	the noise in analog, it serves.

An approximation to an infinite rote memory represents no problem of
principle in computing theory and practice. But an approximation to an
exact distinction between the "exact" and the "approximate" doesn't seem
satisfactory. If there is an exact distinction underlying actual
engineering practice, at least, it would be useful to know what it
was, in place of intuitions that appear to break down as soon as they
are made precise.

--------
(2)
cuuxb!mwm (Marc Mengel) writes:

>	Digital is essentially a subset of analog, where the range of
>	properties used to represent information is grouped into a
>	finite set of values...
>	Analog, on the other hand, refers to using a property to directly
>	represent an infinite range of values with a different infinite
>	range of values.

This sounds right, as far as it goes. D may indeed be a subset of A.
To use the object--transformation--image vocabulary again: When an object
is transformed into an image with only finite values, then the transform
is digital. (What about combinations of image values?) When an
infinite-valued object is transformed into an infinite-valued (and
presumably covariant) image, then the transform is analog. I assume
the infinities in question have the right cardinality (i.e.,
uncountable). Questions: (i) Do discrete objects, with only finite or
countably infinite properties, not qualify to have analogs? (ii) What does
"directly represent" mean? Is there something indirect about finiteness?
(iii) What if there are really no such infinities, physically, on either
the object end or the image end?

May I interject at this point the conjecture that what seems to be
left out of all these A/D considerations so far (not just this module)
is that discretization is usually not the sole means or end of digital
representation. What about symbolic representation? What turns a
discretized, approximate image of an object into a symbolic
representation, manipulable by formal rules and semantically
interpretable as being a representation OF that object? (But perhaps
this is getting a little ahead of ourselves.)

>	This is why slide-rules are considered analog, you are USING distance
>	rather than voltage, but you can INTERPRET a distance as precisely
>	as you want. An abacus, on the other hand, also USES distance, but
>	where a disk MEANS either one thing or another, and it takes
>	lots of disks to REPRESENT a number. An abacus, then, is digital.

(No comment. Upper case added.)
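
For what it is worth, a small Python sketch of the contrast (my own
illustration, not from the quoted post): a slide rule multiplies by adding
continuous logarithmic distances and is read off a continuous scale, while
an abacus holds a number as a finite collection of discrete bead columns.

    import math

    def slide_rule_multiply(a, b):
        """Add two log-scaled distances; the product is read off a
        continuous scale, as precisely as the eye allows."""
        distance = math.log10(a) + math.log10(b)
        return 10 ** distance

    def abacus_digits(n):
        """Represent a non-negative integer as discrete decimal columns;
        each column means exactly one thing."""
        return [int(d) for d in str(n)]

    print(slide_rule_multiply(2.0, 3.1))   # ~6.2, limited only by reading precision
    print(abacus_digits(62))               # [6, 2]: two discrete columns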

--------
(3)
<bcsaic!ray> writes:

>	(An) analog is a (partial) DUPLICATE (or abstraction) 
>	of some material thing or some process, which contains 
>	(it is hoped) the significant characteristics and properties 
>	of the original.

And a digital representation can't be any of these things? "Duplicate"
in what sense? An object's only "exact" double is itself. Once we move
off in time and space and properties, more precise notions of
"duplicate" are needed than the intuitive ones. Sharing the SAME
physical properties (e.g., obeying the same differential equations
[thanks to Si Kochen for that criterion])? Or perhaps just ANALOGS of
them? But then that gets a bit circular.

>	A digital device or method operates on symbols, rather than 
>	physical (or other) reality.  Analog computers may operate on 
>	(real) voltages and electron flow, while digital computers 
>	operate on symbols and their logical interrelationships.

On the face of it, digital computers "operate" on the same physical
properties and principles that other physical mechanisms do. What is
different is that some aspects of their operations are INTERPRETABLE
in special ways, namely, as rule-governed operations of symbol tokens
that STAND FOR something else. One of the burdens of this discussion
is to determine precisely what role the A/D distinction plays in that
phenomenon, and vice versa. What, to start with, is a symbol?

>	Digital operations are formal; that is they treat form rather 
>	than content, and are therefore always deductive, while the 
>	behavior of real things and their analogs is not.

Unfortunately, however, these observations are themselves a bit too
informal. What is it to treat form rather than content? One candidate
that's in the air is that it is to manipulate symbols according to
certain formal rules that indicate what to do with the symbol tokens
on the basis of their physical shapes only, rather than what the tokens or
their manipulations or combinations "stand for" or "mean." It's not clear
that this definition is synonymous with symbol manipulation's always
being "deductive." Perhaps it's interpretable as performing deductions,
but as for BEING deductions, that's another question. And how can
digital operations stand in contrast to the behavior of "real things"?
Aren't computers real things?

>	It is one of my (unpopular) assertions that the central nervous 
>	system of living organisms (including  myself) is best understood 
>	as an analog of "reality"; that most interesting behavior 
>	such as induction and the detection of similarity (analogy and 
>	metaphor) cannot be accomplished with only symbolic, and 
>	therefore deductive, methods.

Such a conjecture would have to be supported not only by a clear
definition of all of the ambiguous theoretical concepts used
(including "analog"), but by reasons and evidence. On the face of it,
various symbol-manipulating devices in AI do do "induction" and "similarity
detection." As to the role of analog representation in the brain:
Perhaps we'd better come up with a viable literal formulation of the
A/D distinction; otherwise we will be restricted to figurative
assertions. (Talking too long about the analog tends to make one
lapse into analogy.)

--------
(4)
lanl!a.LANL.ARPA!crs (Charlie Sorsby) writes:

>	It seems to me that the terms as they are *usually* used today
>	are rather bastardized... when the two terms originated they referred
>	to two ways of "computing" and *not* to kinds of circuits at all.
>	The analog simulator (or, more popularly, analog computer) "computed"
>	by analogy.  And, old timers may recall, they weren't all electronic
>	or even electrical.

But what does "compute by analogy" mean?

>	Digital computers (truly so) on the other hand computed with
>	*digits* (i.e.  numbers). Of course there was (is) analogy involved
>	here too but that was a "higher-order term" in the view and was
>	conveniently ignored as higher order terms often are.

What is a "higher-order term"? And what's the difference between a
number and a symbol that's interpretable as a number? That sounds like
a "higher-order" consideration too.

>	In the course of time, the term analog came to be used for those
>	electronic circuits *like* those used in analog simulators (i.e.
>	circuits that work with continuous quantities). And, of course,
>	digital came to refer to those circuits *like* those used in digital
>	computers (i.e. those which work with discrete or quantized quantities).

You guessed my next question: What does "like" mean, and why does
the underlying distinction correlate with continuous and discrete
circuit properties?

>	Whether a quantity is continuous or discrete depends on such things
>	as the attribute considered, to say nothing of the person doing the
>	considering, hence the vagueness of definition and usage of the
>	terms. This vagueness seems to have worsened with the passage of time.

I couldn't agree more. And an attempt to remedy that is one of the
objects of this exercise.

--------
(5)
sundt@mitre.ARPA writes:

>	Coming from a heavily theoretical undergraduate physics background, 
>	it seems obvious that the ONLY distinction between the analog and
>	digital representation is the enumerability of the relationships
>	under the given representation.

>	First of all, the form of digital representation must be split into
>	two categories, that of a finite representation, and that of a 
>	countably infinite representation.  Turing machines assume a countably
>	infinite representation, whereas any physically realizable digital
>	computer must inherently assume a finite digital representation.

>	Second, there must be some predicate O(a,b) defined over all the a
>	and b in the representation such that the predicate O(a,b) yields
>	only one of a finite set of symbols, S(i) (e.g. "True/False").
>	If such a predicate does not exist, then the representation is
>	arguably ambiguous and the symbols are "meaningless".

>	Looking at all the (a,b) pairs that map the O(a,b) predicate into
>	the individual S(i):
>	ANALOG REPRESENTATION: the (a,b) pairs cannot be enumerated for ALL S(i)
>	COUNTABLY-INFINITE DIGITAL REPRESENTATION: the (a,b) pairs cannot be
>	enumerated for ALL S(i).
>	FINITE DIGITAL REPRESENTATION: all the (a,b) pairs for all the S(i)
>	CAN be enumerated.

>	This distinguishes the finite digital representation from the other two 
>	representations. I believe this is the distinction you were asking
>	about. The distinction between the analog representation and the
>	countably-infinite digital representation is harder to identify.
>	I sense it would require the definition of a mapping M(a,b) onto the
>	representation itself, and the study of how this mapping relates to
>	the O(a,b) predicate. That is, is there some relationship between
>	O(?,?), M(?,?) and the (a,b) that is analogous to divisibility in
>	Z and R?  How this would be formulated escapes me.
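
One way to read the enumerability criterion is the following Python sketch
(mine; the choice of equality as the predicate O and the two-symbol alphabet
are assumptions made only for illustration): for a finite digital
representation, the (a,b) pairs behind each outcome of O can be listed
exhaustively, which is exactly what cannot be completed when a and b range
over a continuum.

    from itertools import product

    symbols = ['0', '1']              # a finite digital representation

    def O(a, b):                      # a sample predicate with a finite set of outcomes
        return a == b

    # For a finite representation, the (a, b) pairs mapping to each
    # outcome S(i) of O can be enumerated exhaustively:
    pairs = {True: [], False: []}
    for a, b in product(symbols, repeat=2):
        pairs[O(a, b)].append((a, b))
    print(pairs)

    # For an analog representation, with a and b ranging over the reals,
    # no such exhaustive enumeration is possible; that is the proposed
    # dividing line.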

You seem to have here a viable formal definition of something
that can be called an "analog representation," based on the
formal notion of continuity and nondenumerability. The question seems to
remain, however, whether it is indeed THIS precise sense of
analog that engineers, cognitive psychologists and philosophers are
informally committed to, and, if so, whether it is indeed physically
realizable. It would be an odd sort of representation if it were only
an unimplementable abstraction. (Let me repeat that the finiteness of
physical computers is NOT an analogous impediment for Turing-machine
theory, because the finite approximations continue to make sense,
whereas both the finite and the denumerably infinite approximations to
the A/D distinction seem to vitiate the distinction.)

It's not clear, by the way, that it wasn't in fact the (missing)
distinction between a countable and an uncountable "representation" that
would have filled the bill. But I'll assume, as you do, that some suitable
formal abstraction would capture it. The question remains: Does that
capture our A/D intuitions too? And does it sort out all actual (physical)
A/D cases correctly?

--------

The rest of Mitch Sundt's reply pertains also to the 
"Searle, Turing, Categories, Symbols" discussion that
is going on in parallel with this one:

>	we can characterize when something is NOT intelligent,
>	but are unable to define when it is.

I don't see at all why this is true, apart from the fact that
confirming or supporting an affirmation is always more open-ended
than confirming or supporting a denial.

>	[Analogously] Any attempt to ["define chaos"] would give it a fixed
>	structure, and therefore order... Thus, it is the quality that
>	is lost when a signal is digitized to either a finite or a
>	countably-infinite digital representation.  Analog representations
>	would not suffer this loss of chaos.

Maybe they wouldn't, if they existed as you defined them, and if chaos
were worth preserving. But I'm beginning to sense a gradual departure from
the precision of your earlier formal abstractions in the direction of
metaphor here...

>	Carrying this thought back to "intelligence," intelligence is the
>	quality that is lost when the behavior is categorized among a set
>	of values. Thus, to detect intelligence, you must use analog
>	representations (and meta-representations). And I am forced to
>	conclude that the Turing test must always be inadequate in assessing
>	intelligence, and that you need to be an intelligent being to
>	*know* an intelligent being when you see one!

I think we have now moved from equating "analog" with a precise (though
not necessarily correct) formal notion to a rather free and subjective
analogy. I hope it's clear that the word "conclude" here does not have
quite the same deductive force it had in the earlier considerations.

>	Thinking about it further, I would argue, in view of what I just
>	said, that people are by construction only "faking" intelligence,
>	said, that we have achieved a complexity whereby we can perceive *some*
>	of the chaos left by our crude categorizations (perhaps through
>	multiple categorizations of the same phenomena), and that this
>	perception itself gives us the appearance of intelligence. Our
>	perceptions reveal only the tip of the chaotic iceberg, however,
>	by definition. To have true intelligence would require the
>	perception of *ALL* the chaos.

Thinking too much about the mind/body problem will do that to you
sometimes.

Stevan Harnad
{allegra, bellcore, seismo, rutgers, packard}  !princeton!mind!harnad
harnad%mind@princeton.csnet
(609)-921-7771