[comp.ai.digest] Uncertainty and Imprecision

RUSPINI@IU.AI.SRI.COM (Enrique Ruspini) (03/23/88)

It is usually very difficult (as well as annoying) to engage in
another round of the "acrimonious" debate on approximate reasoning
since often the discussion deals with side-issues or supposed
"paradoxes" or "inconsistencies" of theories that require, for proper
understanding, a good deal of sophistication.

The contribution of Star to AILIST (10 Mar 88) provides, however,
reasonable grounds for discussion, as Star makes reasonably succinct points
describing the bases for his subjectivist preferences. Unfortunately,
that conciseness is not matched by solid scientific arguments.

Before analyzing his four purportedly unique characteristics of the
subjectivist approach, let me just say that it is plain wrong to
consider fuzzy sets as an alternative to either Dempster-Shafer or
classical probability. It is actually an approach that complements
probabilistic reasoning by providing another type of insight into the
state of real-world systems.

If, for example, we say that the probability of "economic recession"
is 80%, we are indicating that there is either a known (if we are
thinking of probabilities in an objective sense) or believed (if we
take a subjectivist interpretation) tendency or propensity of an
economic system to evolve into a state called "recession".

If, on the other hand, we say that the system will move into a state
that has a possibility of 80% of being a recession, we are saying that
we are *certain* that the system will evolve into a state that
resembles a state of recession at least to a degree of 0.8 (on a
pre-agreed scale). Note the stress on certainty combined with
imprecision about the nature of the state, as opposed to a description
of a believed or previously observed tendency.
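
To make the contrast concrete, here is a small program sketch of the
two readings of the same 80% figure. The states and membership grades
below are invented for illustration only:

    # Probabilistic reading: a known or believed propensity to land
    # in the crisp state "recession".
    p_recession = 0.80                  # P(state == "recession")

    # Possibilistic reading: we are *certain* some state will occur;
    # what is graded is how much each candidate state resembles a
    # recession, on a pre-agreed 0..1 scale.
    membership = {
        "mild_slowdown":  0.5,
        "near_recession": 0.8,
        "deep_recession": 1.0,
    }

    # "The next state has possibility 0.8 of being a recession"
    # asserts that the state we will certainly reach resembles a
    # recession to degree at least 0.8: imprecision about which
    # state, not uncertainty about whether one occurs.
    candidates = [s for s, m in membership.items() if m >= 0.8]
    print(candidates)   # ['near_recession', 'deep_recession']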

Clearly, possibilistic and probabilistic approaches have different
epistemological bases. To try to force a unique view on uncertain
reasoning (and on all interpretations of probability) as suggested by
some is a bit similar to trying to understand the physics of light
using solely wave-based models.

Passing now to the purportedly unique distinguishing characteristics
of the "mainstream" approach, let us take them one by one:

> Subjective probability ...is the only approach that has the
> characteristics of

> 1. Being based on a few, acceptable simple axioms.  

The multiple assertions contained in this first statement are all
either wrong or, at best, rather subjective matters.

Whether or not Savage's axioms (Foundations of Statistics) are
acceptable (whatever that means) or sufficient is arguable. Readers
may find it interesting to look at Savage's axioms and decide by
themselves if they are either "simple" or "acceptable". The
insufficiency of such systems to capture enough aspects of rational
behavior, for example, has been criticized (e.g., Kyburg, H., J. Phil.,
1974).  It is interesting to note also that some of these systems
(e.g., Cox) contain axioms that many find objectionable because they
appear to have been introduced solely to validate certain
characteristics of the approach (e.g., Bayes conditionalization). So
much for acceptability.

As for simplicity, it is interesting to note that if one does away
with some of the axioms in systems such as Savage's or Cox's, one
immediately gets (necessarily simpler) systems characterized by
interval- (rather than number-) valued probabilities. See, for
example, the critique of Savage's axioms in Suppes, P., "The
Measurement of Belief," J. Roy. Stat. Soc., 1974.
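
To illustrate what interval-valued probabilities look like, here is a
small sketch; the two admissible distributions below are invented for
the example and are not taken from Suppes:

    # A set of distributions over {A, B, C} that the weakened axioms
    # fail to narrow down to a single one.
    credal_set = [
        {"A": 0.2, "B": 0.5, "C": 0.3},
        {"A": 0.4, "B": 0.3, "C": 0.3},
    ]

    def interval(event):
        """Lower and upper probability of an event (set of outcomes)."""
        totals = [sum(p[w] for w in event) for p in credal_set]
        return min(totals), max(totals)

    print(interval({"A"}))       # (0.2, 0.4): an interval, not a number
    print(interval({"A", "B"}))  # (0.7, 0.7): sometimes precise anyway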

There are also enough paradoxes around showing that rather rational
folk (including prominent subjectivists) often engage in behavior
which is inconsistent with their own prescriptions (e.g., the
well-known "Ellsberg" and "Allais" paradoxes). Of course, one can
define "rational behavior" any way one wants and declare that even
oneself is not always rational, but the shortcomings of such
word-play are clear: one should prescribe standards for rationality
and find out whether one's recipe always assures compliance with them,
rather than define "rational behavior" solely as whatever comes out of
following one's preferred (and far from uncontroversial) procedures!

To end this point, it is important to note that axiomatizations of
fuzzy sets and D/S exist (including recent developments of
model-theoretic semantics for them --- more below on this), although
I must admit that I find it rather silly to defend or attack
approaches on such bases: the systems being studied (including those
used to reason) are often very complex, and it is ridiculous to
expect that formal systems will capture such complexity using
straightforward formalisms. Under such conditions, simplicity should
be a reason for suspicion and concern.

> 2. Being able to connect directly with decision theory
> (Dempster-Shafer can't).

This is nonsense. Interval probability approaches (of which D/S is an
example) may be applied to decision analysis by simply extending the
techniques proposed by subjectivists. The difference (and I suspect
that this is what Star means by "connection") is that, by producing
intervals of expected values for each decision, they sometimes fail
to say whether one decision is better or worse than another. There is
nothing wrong with this, though: all that the results tell you is that
there are neither rational bases nor factual data to assure you that A
is preferable to B, a true fact of life. Insisting that one approach
is better because it always produces a needed decision (the famous
"pragmatic necessity") even when the factual support is not there
leaves one wondering about such a choice of standards (if something
must be done, then an interval-based approach followed by coin
flipping will produce results that are just as "reliable" or
"rational").
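
A small sketch of the computation I have in mind; the decisions,
utilities, and probability bounds are invented for illustration:

    # P("boom") is only known to lie in [0.3, 0.5]; P("bust") is its
    # complement.
    P_BOOM = (0.3, 0.5)

    utility = {
        "invest": {"boom": 100, "bust": -60},
        "hold":   {"boom":  20, "bust":  10},
    }

    def expected_interval(action):
        """Lower/upper expected utility as P(boom) spans its bounds."""
        # Expected utility is linear in p, so the extremes occur at
        # the endpoints of the interval.
        eus = [p * utility[action]["boom"]
               + (1 - p) * utility[action]["bust"] for p in P_BOOM]
        return min(eus), max(eus)

    for a in utility:
        print(a, expected_interval(a))
    # invest ~(-12, 20)
    # hold   ~( 13, 15)

The two intervals overlap: the available knowledge simply does not
determine which action is better, and no amount of formal machinery
should pretend otherwise.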

It is interesting to note that if the word "belief" is replaced by
"temperature" in some of these curious epistemological arguments, then
we may convince ourselves that we always know the temperature of
everything (or act as if we do) and decide to do away with
thermometers.  Readers may also wonder about the curious circularity
of support in this pair of arguments: "We always know degrees of
belief about any proposition (or act as if we do) because in the end
we always do something" and "We do something that is best because we
always have rationally derived degrees of belief."

It is also interesting to wonder whether Bayesian decision theory
(based primarily on the notion of expected utility as the unique
performance measure for decisions) is sufficient for the variety of
complex problems found in modern applied science (e.g., "What is the
sense of choosing business strategies that are only good in the
long-run when a (single) unfortunate turn of events may leave us
broke?").

> 3. Having efficient algorithms for computation.

I do not know where Star got this notion. Efficient algorithms are
available for both D/S and fuzzy sets. I was present at the last AAAI
in Seattle, where Judea Pearl noted the great computational and
formal similarities between some of the Bayesian and D/S network
algorithms (e.g., some of Pearl's algorithms and the procedures for
evidence combination of Dempster/Kong). It is difficult
to believe that if a prominent Bayesian makes such an assessment, one
class of procedures could be efficient while the other is not (of
course, efficiency per se is of little value if your method gives you
the wrong solution!).
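
For readers unfamiliar with the mechanics, here is a small sketch of
Dempster's rule of combination over a toy frame of discernment (the
mass assignments are invented); a single combination step is just a
product over pairs of focal elements:

    from itertools import product

    def combine(m1, m2):
        """Dempster's rule: intersect focal elements, renormalize."""
        combined = {}
        conflict = 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb   # mass falling on the empty set
        k = 1.0 - conflict            # renormalization constant
        return {s: w / k for s, w in combined.items()}

    frame = frozenset({"flu", "cold", "allergy"})
    m1 = {frozenset({"flu", "cold"}): 0.6, frame: 0.4}
    m2 = {frozenset({"flu"}): 0.7, frame: 0.3}
    print(combine(m1, m2))
    # {"flu"}: 0.70, {"flu", "cold"}: 0.18, frame: 0.12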

As for fuzzy sets, their very purpose is to simplify the analysis of
complex systems by trading unneeded precision for increased depth of
understanding. AIListers may be interested to know more about an
operational subway control system in Sendai, Japan, designed by
Hitachi over a reported eight-year period, which uses fuzzy logic.
Works describing this effort (Proceedings IFSA 1987) also indicate
the reasons why fuzzy logic was chosen over the alternatives
(anybody with a background in control theory will shudder at the
complexity of handling --even if they were known-- the required
probabilities, covariance matrices, etc. involved in a classical
stochastic control approach to large, complex control systems!).
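
To give the flavor of the approach --this is only a toy rule, not the
actual Hitachi design-- a fuzzy control rule looks roughly like this:

    def mu_too_fast(speed_kmh):
        """Membership of 'too fast', rising from 60 to 80 km/h."""
        return min(1.0, max(0.0, (speed_kmh - 60.0) / 20.0))

    def mu_near_station(dist_m):
        """Membership of 'near the station', rising as the distance
        falls from 400 m to 100 m."""
        return min(1.0, max(0.0, (400.0 - dist_m) / 300.0))

    def brake_command(speed_kmh, dist_m):
        # Rule: IF too fast AND near the station THEN brake hard.
        # AND is taken as the minimum of the two memberships; the
        # firing strength directly scales the braking output.
        return min(mu_too_fast(speed_kmh), mu_near_station(dist_m))

    print(brake_command(75, 250))   # 0.5: moderate braking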

> 4. Being well understood.

I do not know whether Star is criticizing other theories (which have
been studied for only a few years) for not being as well understood
as classical probability. Setting aside whether or not probability
(particularly in its subjective interpretation) is "well understood"
(a matter much disputed among philosophers of probability), I do not
find it particularly surprising that recent technological
developments are not as well developed or understood as techniques
that have been around for a long while (one can only wonder why this
makes the old techniques more acceptable for dealing with new classes
of problems).

If one looks at the applicability and current rate of progress in new
approaches, however, one sees a different story. Both fuzzy sets and
D/S are advancing strongly at the theoretical and applied level.
Dempster/Shafer has been found to have solid foundations rooted in
classical probability and epistemic logic (Ruspini, Proc. 1987 IJCAI;
SRI Technical Note 408). Recently, formal semantics have been
developed for both fuzzy sets and D/S (Ruspini, IPMU 1988). Readers
may want to contrast this vigorous advance with the subjectivists,
who, after 40 or 50 years, have failed to generate convincing
arguments supporting their conceptual reliance on algorithms that
assume that we always know the probabilities of events (or act as if
we do) even when rational or empirical bases to support such
knowledge are admittedly absent!

I do not know where Star is looking ("Look at what people are doing
with Dempster-Shafer belief functions or fuzzy sets.") but I, for
one, have looked long enough, and, with a considerable background in
the formal sciences, I do not see anything that brings to mind the
images that these approaches evoke in Star's mind ("mainstream
approaches" versus "more experimental").

To repeat myself, it is ridiculous to expect to find strong formalisms
around newly evolving theories (Would anybody expect Galilean
mechanics to have been formalized before being seriously considered?).
The state of development of both D/S and fuzzy sets is neither
unfounded nor merely "experimental" (a questionable epithet) nor
outside the "mainstream" (another convenient but inaccurate
qualifier), however.
One should be very leery, on the other hand, of questionable practices
that purport to derive "rational" decisions in the absence of
knowledge: a miracle that I have described elsewhere (Comp.
Intelligence, February 1988) as epistemological alchemy.

I would like to stress that my comments should not be construed as
negating the value of classical probability in the study of
uncertainty.
Furthermore, I believe that it is important to continue to study the
concept of belief and the problems associated with its quantification
and recognize the positive contributions that subjectivists have made
(and, undoubtedly, will continue to make) to the art and science of
probabilistic reasoning. 

I believe, however, that, while striving to improve existing
methodologies, we should keep an open mind towards novel concepts
while realizing that the former approaches might not be (at least
not yet!) as comprehensive and efficient as some zealously purport
them to be.

-------