[sci.math] Definition wanted: a kind of entropy

cme@lectroid.sw.stratus.com (04/11/90)

I know how to compute the bits of entropy of the outputs of a
communications channel given the probability distribution of its
alphabet.  I want to compute bits of information from a different
source.
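
By that I mean the usual Shannon formula, H = - sum_{i} p_{i} log2( p_{i} ),
taken over the symbol probabilities.  For concreteness, a quick sketch of
that computation (the distribution here is just a made-up example):

    import math

    def entropy_bits(probs):
        # Shannon entropy, in bits, of a probability distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Made-up 4-symbol alphabet with a skewed distribution.
    print(entropy_bits([0.5, 0.25, 0.125, 0.125]))   # -> 1.75 bits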


Let there be a boolean array f of length N filled with random bits,
and a communications channel carrying symbols x_{i} drawn from an
alphabet of N characters.  Let there be a machine which takes that
channel as input and outputs y_{i} = f( x_{i} ).
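
To make the setup concrete, here is a toy version of it (N, the random
table f, and the channel symbols are all arbitrary values, picked only
for illustration):

    import random

    N = 8
    f = [random.randint(0, 1) for _ in range(N)]   # boolean array of random bits
    xs = [random.randrange(N) for _ in range(5)]   # symbols x_i on the channel
    ys = [f[x] for x in xs]                        # machine output y_i = f(x_i)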

I want to compute the number (fraction) of bits of information about
x_{i} which I learn from y_{i}.  y_{0} gives me 0 bits of information
because, with f unknown and random, it is a fair coin flip no matter
what x_{0} is.  y_{1} gives me a small amount of information because
if y_{1} != y_{0} then we know that x_{1} != x_{0}.

I don't know how to compute that amount of information.
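
The best I can do so far is brute force on a toy N: enumerate every
table f and every pair (x_{0}, x_{1}), tabulate the joint distribution
of (x_{0}, x_{1}, y_{0}, y_{1}), and take
I((X_{0},X_{1});(Y_{0},Y_{1})) = H(X) + H(Y) - H(X,Y).  Whether that
joint mutual information is even the right quantity is part of what
I'm asking; the sketch below is only meant to pin the setup down.

    import math
    from collections import Counter
    from itertools import product

    def entropy_bits(counts):
        # Entropy in bits of a distribution given as raw counts.
        total = sum(counts)
        return -sum(c / total * math.log2(c / total) for c in counts if c)

    N = 4   # small enough to enumerate all 2^N tables f exactly

    # Joint counts over (x0, x1, y0, y1), uniform over every table f
    # and every pair of channel symbols.
    joint = Counter()
    for f in product((0, 1), repeat=N):
        for x0, x1 in product(range(N), repeat=2):
            joint[(x0, x1, f[x0], f[x1])] += 1

    def marginal(keep):
        # Collapse the joint counts onto a subset of the coordinates.
        m = Counter()
        for key, c in joint.items():
            m[keep(key)] += c
        return m

    H_x = entropy_bits(marginal(lambda k: k[:2]).values())   # H(X0,X1)
    H_y = entropy_bits(marginal(lambda k: k[2:]).values())   # H(Y0,Y1)
    H_xy = entropy_bits(joint.values())                      # H(X0,X1,Y0,Y1)

    print(H_x + H_y - H_xy)   # I((X0,X1);(Y0,Y1)) in bits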

If I get good direct replies, I'll post them.

Thanks,

	Carl