bnevin@CCH.BBN.COM (Bruce E. Nevin) (06/10/88)
Date: Thu, 9 Jun 88 08:07 EDT
From: Bruce E. Nevin <bnevin@cch.bbn.com>
Subject: definition of information
To: ailist@ai.ai.mit.edu
cc: bn@cch.bbn.com

It is often acknowledged that information theory has nothing to say
about information in the usual sense, as having to do with meaning.
It is only concerned with a statistical measure of the likelihood of
a particular signal sequence with respect to an ensemble of signal
sequences, a metric misleadingly dubbed "amount of information" by
Hartley, Shannon, and others.

Can anyone point me to a coherent definition of information respecting
information content, as opposed to merely "quantity of information"?

Bruce Nevin
bn@cch.bbn.com

<usual_disclaimer>
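To make the contrast concrete, here is a minimal sketch of the
statistical measure in question (Python, purely illustrative; the
function and variable names are invented).  Note that it sees only
signal probabilities, never meanings:

    import math

    def amount_of_information(message, p):
        """Bits needed to encode `message`, given signal probabilities `p`."""
        return sum(-math.log2(p[signal]) for signal in message)

    # Two ensembles with identical statistics: one of meaningful
    # words, one of meaningless marks.  The measure cannot tell
    # them apart.
    p_words = {"yes": 0.5, "no": 0.5}
    p_marks = {"#": 0.5, "%": 0.5}

    print(amount_of_information(["yes", "no", "no"], p_words))  # 3.0 bits
    print(amount_of_information(["#", "%", "%"], p_marks))      # 3.0 bits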
rar@ADS.COM (Bob Riemenschneider) (06/14/88)
Date: Sat, 11 Jun 88 16:57 EDT
From: Bob Riemenschneider <rar@ads.com>
To: ailist@ai.ai.mit.edu, bnevin@cch.bbn.com
cc: rar@ads.com
In-Reply-To: bnevin@CCH.BBN.COM's message of 9 Jun 88 22:48:00 GMT
Subject: Re: definition of information

=> It is often acknowledged that information theory has nothing to say
=> about information in the usual sense, as having to do with meaning.
=> ...
=>
=> Can anyone point me to a coherent definition of information respecting
=> information content, as opposed to merely "quantity of information"?
=>
=> Bruce Nevin
=> bn@cch.bbn.com

Actually, much the same formalization applies to "real" information.
See

   R. Carnap and Y. Bar-Hillel, "An Outline of a Theory of Semantic
   Information", Technical Report 247, Research Laboratory of
   Electronics, MIT, October 1952.  (Reprinted in Y. Bar-Hillel,
   _Language and Information_, Addison-Wesley, 1964.)

   J. Hintikka, "On Semantic Information", in: J. Hintikka and
   P. Suppes (eds.), _Information and Inference_, Reidel, 1970.

for starters.  I'm not sure what you mean by `respecting information
content', but this approach *is* based on analysis of the logical
consequences of messages.

							-- rar
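For concreteness, a toy sketch of the Carnap/Bar-Hillel measures just
cited, assuming (as in their simplest case) an equal measure over the
state descriptions of a small propositional language.  Python,
illustrative only, with invented names; details hedged:

    import itertools, math

    ATOMS = ["P", "Q", "R"]   # three atomic sentences
    STATES = list(itertools.product([True, False], repeat=len(ATOMS)))

    def m(sentence):
        """Logical measure: fraction of state descriptions in which
        `sentence` holds.  `sentence` is a predicate on a dict of
        truth values."""
        worlds = [dict(zip(ATOMS, s)) for s in STATES]
        return sum(sentence(w) for w in worlds) / len(worlds)

    def cont(sentence):      # content measure: cont(h) = 1 - m(h)
        return 1 - m(sentence)

    def inf(sentence):       # information measure: inf(h) = -log2 m(h)
        return -math.log2(m(sentence))

    # The stronger the claim (the more worlds it excludes), the more
    # semantic information it carries -- this depends on the logical
    # consequences of the message, not on signal statistics.
    print(cont(lambda w: w["P"]),
          inf(lambda w: w["P"]))                 # 0.5, 1.0
    print(cont(lambda w: w["P"] and w["Q"]),
          inf(lambda w: w["P"] and w["Q"]))      # 0.75, 2.0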
bnevin@CCH.BBN.COM (Bruce E. Nevin) (06/14/88)
Date: Mon, 13 Jun 88 09:43 EDT
From: Bruce E. Nevin <bnevin@cch.bbn.com>
Subject: Re: definition of information
In-Reply-To: Your message of Sat, 11 Jun 88 13:57:20 PDT
To: Bob Riemenschneider <rar@ads.com>
cc: ailist@ai.ai.mit.edu, bnevin@cch.bbn.com

My understanding is that Carnap and Bar-Hillel set out to establish a
"calculus of information" but did not succeed in doing so.

Communication theory refers to a physical system's capacity to
transmit arbitrarily selected signals, which need not be "symbolic"
(need not mean or stand for anything).  To use the term "information"
in this connection seems Pickwickian at least.

"Real information"?  Do you mean the Carnap/Bar-Hillel program as
taken up by Hintikka?  Are you saying that the latter has a useful
representation of the meaning of texts?

Bruce Nevin
bn@cch.bbn.com

<usual_disclaimer>
golden@FRODO.STANFORD.EDU (Richard Golden) (06/15/88)
Date: Tue, 14 Jun 88 10:52 EDT
From: Richard Golden <golden@frodo.STANFORD.EDU>
To: AIList-REQUEST@AI.AI.MIT.EDU
Subject: Re: Definition of Information

In AILIST Digest V7 #26 Bruce Nevin asks:

   Can anyone point me to a coherent definition of information
   respecting information content, as opposed to merely "quantity
   of information"?

This question is really related to an earlier discussion concerned
with viewing probability theory as a measure of belief.  We can think
of a knowledge structure as being represented by a probability
distribution which assigns some "degree of belief" (i.e., a
probability) to some set of events (i.e., a sample space).  Let X be
an event which occurs with probability p(X).  Then clearly an
equivalent "knowledge structure" may be constructed which assigns a
"degree of surprise" (i.e., -log p(X)) to the same set of events.

The simple point I am making is that the SAMPLE SPACE and the
STRUCTURE OF ITS ELEMENTS are a necessary component of any technical
definition of information, and that information CONTENT (for the most
part) resides in this SAMPLE SPACE.

Richard Golden (golden@psych)
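A sketch of the belief/surprise equivalence described above (Python,
purely illustrative; names invented): the two representations are
interconvertible, so whatever content there is must reside in the
sample space, i.e., in what the events are taken to be.

    import math

    def surprise(beliefs):
        """Recode a degree-of-belief distribution as degrees of surprise."""
        return {event: -math.log2(p) for event, p in beliefs.items()}

    def belief(surprises):
        """Recover the beliefs: the two knowledge structures are equivalent."""
        return {event: 2.0 ** -s for event, s in surprises.items()}

    # The same numbers over two different sample spaces: identical
    # "quantity of information", entirely different content.
    weather = {"rain": 0.25, "sun": 0.75}
    market  = {"crash": 0.25, "rally": 0.75}

    print(surprise(weather))   # {'rain': 2.0, 'sun': 0.415...}
    print(surprise(market))    # the same numbers, over different events

    restored = belief(surprise(weather))
    print(all(math.isclose(restored[e], weather[e]) for e in weather))  # True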