Thursday, August 1, 2024

WORK: the tie that binds
Shannon and physical entropy

We have a transmitter/receiver A linked by a transducer to a receiver/transmitter B.

For Shannon, as for physicists, entropy is a measure of disorder -- which is to say, of the more probable, more abundant combinations or configurations.
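For reference, the two standard formulas (textbook definitions, not from the post itself): the Shannon entropy of a source that emits symbol i with probability p_i, and the Boltzmann entropy of a macrostate realizable by Ω equally likely microstates:

    H = -\sum_i p_i \log_2 p_i \qquad \text{(bits per symbol)}

    S = k_B \ln \Omega

Each grows with the number of configurations available, which is the formal basis of the analogy.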

For physicists, entropy gain accompanies loss of ability to do work, in particular to repeat a cyclic task perpetually.

We see that process for information in the party game Telephone. But we wish to be more precise, so consider a low-entropy (or high-information) message transmitted from A to B. When the message is transmitted again from B to A, the process is reversed and, with high probability, State A has been replicated at A. Call this plain relay Scenario 1.
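As a toy model of that round trip (my own sketch, not the author's; the channel model and flip probability are assumptions), treat each leg as a binary symmetric channel:

    import random

    random.seed(42)
    P_FLIP = 1e-5  # assumed per-bit flip probability; purely illustrative

    def transmit(bits, p_flip=P_FLIP):
        # Binary symmetric channel: each bit flips independently with p_flip.
        return [b ^ (random.random() < p_flip) for b in bits]

    state_a = [random.randint(0, 1) for _ in range(1000)]  # the message held at A
    at_b = transmit(state_a)      # A -> B
    replica = transmit(at_b)      # B -> A
    print(replica == state_a)     # usually True when p_flip is small

With 2,000 bit-transmissions at this flip probability, the round trip reproduces State A about 98 percent of the time.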

The machine has performed work in both directions.

Now error-correction coding vastly lowers the probability of a loss of information on any one cycle. Yet random quantum fluctuations in the circuitry and in the electromagnetic signal mean that -- with enough repetitions -- entropy (noise) must overwhelm and obliterate the information (signal). In fact, it seems likely that the physical machinery would deteriorate from entropy before the signal was destroyed; only under an assumption of magically durable machinery would the signal be obliterated first.
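To see the two claims together (again my own illustration; the repetition code, noise level, and message length are arbitrary choices), protect each cycle with a 5-fold repetition code and majority-vote decoding, then iterate:

    import random

    random.seed(1)
    P_FLIP = 0.02  # a deliberately noisy channel
    R = 5          # repetition factor of the error-correcting code

    def channel(bits):
        return [b ^ (random.random() < P_FLIP) for b in bits]

    def encode(bits):
        return [b for b in bits for _ in range(R)]  # repeat each bit R times

    def decode(bits):
        # Majority vote over each block of R received bits.
        return [int(sum(bits[i:i + R]) > R // 2) for i in range(0, len(bits), R)]

    original = [random.randint(0, 1) for _ in range(200)]
    state = original
    for cycle in range(1, 10001):
        state = decode(channel(encode(state)))  # one corrected leg of the cycle
        if state != original:
            print(f"information lost at cycle {cycle}")
            break

Any single cycle is very reliable (roughly a 1.5 percent chance of any decoded bit error here), but iterating re-encodes whatever errors slip through, and the message is typically corrupted within a few dozen cycles. Raising R postpones the failure; with nonzero P_FLIP it cannot prevent it.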

To make the analogy between information and physical entropy a bit sharper, we can have receiver B include an interpreter that, upon receiving signal X, converts it to Y. B then sends Y back to A, which also has an interpreter, converting Y back to X. Call this Scenario 2.
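One concrete way to realize such interpreters (an assumption on my part; the post does not specify the mapping) is any bijection paired with its inverse, for instance XOR against a fixed key, which is self-inverse:

    KEY = 0b10110010  # illustrative key shared by the two interpreters

    def interpret_at_b(x: int) -> int:
        return x ^ KEY  # B converts received signal X to Y

    def interpret_at_a(y: int) -> int:
        return y ^ KEY  # A converts Y back to X (the same XOR undoes the map)

    x = 0b01100111
    assert interpret_at_a(interpret_at_b(x)) == x  # lossless absent channel noise

Because the map is invertible, the interpreters add no Shannon entropy of their own; any extra entropy in this scenario must come from the added physical processing.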

Because more work is, presumably, done in Scenario 2, more quantum fluctuations are expected, and hence the Shannon entropy would rise faster than under Scenario 1.

Two observations:
1. Shannon entropy can be strongly affected by physical entropy.

2. Theoretically, a closed physical system is exactly reversible (State A to State B to State A with no loss of energy), but in practice this happens with probability 0. (If you observed a shattered glass tumbler reassemble itself on your floor, you would assume, perhaps after the fact, that you had been "seeing things" and write it off as a mental aberration.) Yet, in the case of Shannon information, the cycle is highly reversible in theory, as Shannon showed with his channel capacity theorem. In fact, modern telecommunications shows that reversibility extends, with high probability, well beyond one cycle.
Shannon's theorem, however, by assuming random errors, does imply that both physical and Shannon entropy eventually win out over work and information. Note that work is the connecting link between Shannon entropy and physical entropy.
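For reference, the theorem alluded to above (standard statements): every channel has a capacity C, and for any transmission rate R < C there exist codes driving the block-error probability as close to zero as desired; for the band-limited additive-white-Gaussian-noise channel,

    C = B \log_2\!\left(1 + \frac{S}{N}\right)

with B the bandwidth and S/N the signal-to-noise ratio. The guarantee is probabilistic: the error probability can be made arbitrarily small but never zero, which is why unboundedly many cycles must still fail eventually.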

The inevitable loss of signal in noise tells us that history is retrodictable only within limits. That is, Shannon entropy can be said, in a sense, to spread into the past.

This sets up an interesting situation in which highly ordered past states are lost in a fog of degraded present information -- i.e., competing entropies and so-called arrows of time.
