pd (10/07/82)
What is the relationship between physical entropy and informational entropy? Can someone please answer this question (over mail or netnews) in connection with the following kinds of questions: a) Is it possible (e.g.) to store a physical entropy change in a memory chip in the form of informational entropy in the chip itself? If that can happen, is it a violation of the second law? b) Does the entropy of the Universe change with the discovery of a major physical law (or should I say a refinement of a scientific theory which is falsifiable in more situations, a la Karl Popper)? I have been trying to answer these questions, and I find I do not have the mathematical sophistication to read some of the texts on the subject. Would some kind soul take pity on me and explain the known answers in layman's terms?
Yours in Curiosity,
Prem Devanbu
leichter (10/09/82)
Yes, physical and informational entropy are closely related, although I don't know of any single simple argument that shows why; instead, you have to look at a number of different experiments and see that a relation seems to exist.

Probably the classic experiment (all of these are thought experiments!) is Maxwell's demon. Take a box full of gas at a uniform temperature, and divide it down the middle by a wall with a door in it. Controlling the door is a demon who opens it to let fast particles go from left to right, but not right to left, and slow particles go from right to left, but not left to right. After a while, the right-hand side is hot and the left-hand side is cold, in apparent contradiction of the second law.

Now, you can make specialized arguments about particular means by which the demon could measure the speeds of the particles to reach his decision, and show that employing them would require him to expend more energy than he can gain from the heat difference; but you want a general result that tells you no method can work, whether you have analyzed it or not. (Otherwise, we have to throw out the second law!) So... what you notice is that the fundamental thing the demon is doing is GATHERING INFORMATION: initially, he only knows the average speed of the particles in the two halves, but he somehow determines which particles are fast and which are slow. (If you take fast and slow to be divided so that half the particles are in each subdivision on average - i.e. divide the distribution in half - then what the demon is getting is essentially one bit of information: which half of the distribution is the particle in?) Thus, it appears that gaining information is inextricably tied to expending energy, since the second law tells us that the demon must be expending energy. More detailed calculations even tell you how much energy he must expend to get one bit of information.

There are other thought experiments that lead to the same conclusion. More important, they give you the same number for the energy per bit. Thus, it is reasonable to claim that the information content of a bit is a real, physical quantity, not a property of a particular situation.

These are the general arguments. In fact, the numbers you get for entropy in physics and information theory turn out exactly the same (in situations where both have meaning) mainly by definition: the information-theory quantity called entropy was DEFINED by analogy to the physical one. What the above argument shows is that the definition is not arbitrary; it has physical content.

As to reducing entropy by storing information in a chip: of course you do. Initially, the state of the chip was random; all you could say was that, on average, half the bits are on and half are off. After you store your information, you know the exact state of the system. This doesn't contradict the second law unless you managed to store that information without expending any energy! In fact, thermodynamics gives you a lower bound on how much energy ANY chip you could build would require to store a bit.
							-- Jerry
					decvax!yale-comix!leichter
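[To put numbers on that lower bound, here is a minimal sketch in Python. It assumes the standard k*T*ln(2) figure for the minimum energy cost of one bit at temperature T and the textbook Shannon definition of entropy; the temperature is an illustrative choice, not something from the post above.]

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T   = 300.0          # room temperature, K (illustrative)

    # Minimum energy the demon (or any memory chip) must dissipate
    # to gain or erase one bit of information at temperature T.
    energy_per_bit = k_B * T * math.log(2)
    print(f"minimum energy per bit at {T} K: {energy_per_bit:.3e} J")

    # The information-theoretic entropy of a fair 50/50 split
    # ("which half of the speed distribution is this particle in?")
    # is exactly one bit; multiplying by k*ln(2) converts it to
    # physical entropy units (J/K), showing the two definitions
    # differ only by a constant factor.
    p = [0.5, 0.5]
    H_bits = -sum(p_i * math.log2(p_i) for p_i in p)   # Shannon entropy, bits
    S_phys = H_bits * k_B * math.log(2)                # same quantity in J/K
    print(f"Shannon entropy: {H_bits} bit -> physical entropy: {S_phys:.3e} J/K")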
atlas1 (10/12/82)
re: "does storing information in a chip reduce its entropy?" The Second Law of Thermodynamics ("In all physical processes of a system, the entropy of the system must increase, or remain the same") only applies to CLOSED systems. (caveat: I do not believe that the ramifications of relativity have been fully thrashed out - it may be that the universe cannot be considered a closed system in this sense). Thus, asking about the entropy of "a chip" is not subject to the limit imposed by the Second Law, because the system considered is not closed; something else must store the data. This action MUST require the expenditure of energy, the release of heat, etc. This release will increase the entropy of this data-storage agent, and the closed system (chip + agent) will satisfy the Second Law (assuming the agent is self-contained, including power, etc.). Most confusions and paradoxes concerning entropy stem from ignoring some part of a closed system, and only considering the obvious (visible) parts. Tom Roberts ihuxf!atlas1