[comp.ai.digest] Re^2: definition of information

rar@ADS.COM (Bob Riemenschneider) (06/17/88)

Date: Thu, 16 Jun 88 13:49 EDT
From: Bob Riemenschneider <rar@ads.com>
To: ailist@ai.ai.mit.edu, bnevin@CCH.BBN.COM
cc: rar@ads.com
In-Reply-To: bnevin@CCH.BBN.COM's message of 13 Jun 88 19:49:00 GMT
Subject: Re^2: definition of information

=>   From: bnevin@CCH.BBN.COM (Bruce E. Nevin)
=>
=>   My understanding is that Carnap and Bar-Hillel set out to establish a
=>   "calculus of information" but did not succeed in doing so.  

I'm not sure what your criteria for success are, but it looks pretty good to 
me.  They didn't completely solve the problem of laying a foundation for
Carnap's approach to inductive logic.  (But it certainly advanced the 
state of the art--see, e.g.,  the second of Carnap's _Two Essays on Entropy_,
which was, obviously, heavily influenced by this work.)  Advances have been
made since the original paper as well: see the bibliographies for Hintikka's 
paper and Carnap's later works on inductive logic (especially "A System of 
Inductive Logic" in _Studies in Inductive Logic_, vols. 1 and 2).
[Disclaimer: There are very serious problems with Carnap's approach
to induction, which I have no wish to defend.]

=>   Communication theory refers to a physical system's capacity to transmit 
=>   arbitrarily selected signals, which need not be "symbolic" (need not mean
=>   or stand for anything).  To use the term "information" in this connection
=>   seems Pickwickian at least.  "Real information"?  Do you mean the
=>   Carnap/Bar-Hillel program as taken up by Hintikka?  Are you saying that
=>   the latter has a useful representation of the meaning of texts?

The Carnap and Bar-Hillel approach is based on the idea that the information
conveyed by an assertion is that the actual world is a model of the
sentence (or: "... is a member of the class of possible worlds in which the
sentence is true", or: "the present situation is a situation in which the
sentence is true", or: <fill in your own, based on your favorite 
model-theory-like semantics>).  This is certainly the most popular formal
account of information.  They, and Hintikka, count state descriptions
to actually calculate the amount of information an assertion conveys, but 
that's just because Carnap (and, I suppose, the others) is interested in
the logical notion of probability.  If you start with a probability measure
over structures (or possible worlds, or situations, or ... ) as given, you
can be much more elegant--see, e.g., Scott and Krauss's paper on probabilities
over L-omega1-omega-definable classes of structures.  (It's in one of those 
late-60's North-Holland "Studies in Logic" volumes on inductive logic, maybe 
_Aspects of Inductive Logic_.)  I don't recall what, if anything, you said 
about the application you have in mind, but, as the dynamic logic crowd 
discovered, L-omega1-omega is a natural language for talking about computation
in general.
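[To make the state-description counting concrete, here is a minimal sketch
in modern notation.  The function names (cont, inf) follow Carnap and
Bar-Hillel's measures; the equal-weight measure over state descriptions is
one of several they consider, and the two-atom example is my own
illustration, not theirs.]

```python
import itertools
import math

def state_descriptions(atoms):
    """Enumerate all state descriptions: one complete truth
    assignment to the atoms per possible world."""
    for values in itertools.product([True, False], repeat=len(atoms)):
        yield dict(zip(atoms, values))

def m(sentence, atoms):
    """Logical probability of a sentence: the proportion of state
    descriptions in which it holds, each weighted equally.
    (Equal weighting is an assumption; Carnap studied a whole
    family of such measures.)"""
    worlds = list(state_descriptions(atoms))
    return sum(1 for w in worlds if sentence(w)) / len(worlds)

def cont(sentence, atoms):
    """Content measure: the fraction of state descriptions the
    sentence excludes -- how much of logical space it rules out."""
    return 1 - m(sentence, atoms)

def inf(sentence, atoms):
    """Information measure: -log2 m(s), so that the information of
    independent assertions is additive."""
    return -math.log2(m(sentence, atoms))

atoms = ["p", "q"]
conjunction = lambda w: w["p"] and w["q"]   # the assertion "p & q"
# m = 1/4 of the four state descriptions; cont = 3/4; inf = 2 bits
```

On the equal-weight measure, "p & q" is true in one of four state
descriptions, so it excludes three quarters of logical space and carries
two bits of information.  Starting instead from an arbitrary probability
measure over structures, as in the Scott and Krauss paper, amounts to
replacing m above while keeping cont and inf as defined.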

=>   Bruce Nevin
=>   bn@cch.bbn.com
=>   <usual_disclaimer>

							-- rar