The story (or legend) goes like this: Shannon asked von Neumann what he thought of his new measure, and von Neumann suggested that Shannon call it entropy.
Shannon's entropy is actually a toy version of Boltzmann's entropy. It is a toy version because it only considers the configurational entropy of discrete objects, without actually describing microstates. A more interesting connection, which almost no one knows, is that Birkhoff's ergodic theory legitimised Shannon's entropy: his version of ergodicity is likewise a toy version of Boltzmann's. Gibbs's contribution comes from a different angle, and why von Neumann omitted it is an interesting question.
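To make the "toy version" claim concrete, here is the standard counting argument (a sketch added here, not part of the original post): place N distinguishable objects into k categories with occupation numbers n_i = N p_i. Boltzmann's configuration count and its Stirling limit give

\[
W = \frac{N!}{n_1!\, n_2! \cdots n_k!}, \qquad
\frac{1}{N}\ln W \;\longrightarrow\; -\sum_{i=1}^{k} p_i \ln p_i = H(p)
\quad \text{as } N \to \infty,
\]

so Shannon's H is the per-object configurational entropy, with no microstate dynamics behind the probabilities p_i.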
Shannon's entropy should perhaps be called the von Neumann-Boltzmann-Shannon entropy, not just Shannon entropy, maybe with Birkhoff added to the team.
Cite as
@misc{suezen20sew,
title = {Shannon's Entropy: Why it is called Entropy?},
howpublished = {\url{https://science-memo.blogspot.com/2020/11/shannons-entropy-why-it-is-called.html}},
author = {Mehmet Süzen},
year = {2020}
}
Postscripts
- Ergodicity is an intricate subject: Boltzmann's and Birkhoff's approaches differ. A numerical sketch of the time-average vs. ensemble-average point appears after this list.
- Jaynes studied the connection extensively, and his interpretation was similar: he called the von Neumann-Shannon expression "a more primitive concept" and used statistical mechanical ideas to build a mathematical tool for statistical inference. See his papers I and II.
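A small numerical illustration of the Birkhoff point (my own sketch, with an arbitrary assumed transition matrix; nothing here is from the original post): for an ergodic Markov chain, the visit frequencies along a single long trajectory converge to the stationary ensemble distribution, so the Shannon entropy computed from trajectory frequencies and from ensemble probabilities agree.

import numpy as np

# Birkhoff's pointwise ergodic theorem: time averages along one trajectory
# converge to ensemble averages under the stationary measure. Here the
# observables are the state indicators, so time averages are visit frequencies.
rng = np.random.default_rng(0)

P = np.array([[0.6, 0.3, 0.1],   # assumed example of an ergodic
              [0.2, 0.5, 0.3],   # 3-state transition matrix
              [0.3, 0.3, 0.4]])

# Ensemble (stationary) distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Frequency (trajectory) distribution from one long run of the chain.
n_steps = 200_000
state, counts = 0, np.zeros(3)
for _ in range(n_steps):
    counts[state] += 1
    state = rng.choice(3, p=P[state])
freq = counts / n_steps

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

print("ensemble pi :", np.round(pi, 4), " H =", round(shannon(pi), 4))
print("trajectory  :", np.round(freq, 4), " H =", round(shannon(freq), 4))
# Both rows match to roughly three decimals: frequency probabilities and
# ensemble probabilities, hence their Shannon entropies, coincide here.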
Hi! Can we say that under the hypothesis of ergodicity the Shannon entropy coincides with the Gibbs entropy, as the frequency probability coincides with the ensemble probability?
It depends on how the configurational entropy is coarse-grained with Gibbs, and on its relation to the corresponding fine-grained probabilities. It is not only a matter of ensemble vs. local trajectory averages for the probabilities. See the reference below; a toy numerical sketch of the coarse-graining point follows it.
Gibbs vs. Shannon entropies
Richard L. Liboff
Journal of Statistical Physics, volume 11, pages 343–357 (1974)
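A toy sketch of the coarse-graining point in the reply above (my own illustration, with an assumed binning of microstates into cells; this is not Liboff's construction): lumping fine-grained probabilities into coarse cells can only lower the Shannon entropy, by the chain rule, so fine-grained (Gibbs-style) and coarse-grained entropies agree only under extra conditions on the within-cell probabilities, not from ergodicity alone.

import numpy as np

rng = np.random.default_rng(1)

# Fine-grained probabilities over 12 microstates (a random assumed example),
# coarse-grained into 4 cells of 3 microstates each by summing within cells.
fine = rng.dirichlet(np.ones(12))
cells = fine.reshape(4, 3).sum(axis=1)

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

print("fine-grained entropy  :", round(shannon(fine), 4))
print("coarse-grained entropy:", round(shannon(cells), 4))
# Chain rule: H_fine = H_cells + sum_c p_c * H(fine | cell c) >= H_cells,
# with equality only if each cell's mass sits on a single microstate.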