Showing posts with label thermodynamics. Show all posts

Saturday, 25 February 2023

Loschmidt's Paradox and Causality:
Can we establish Pearlian expression for Boltzmann's H-theorem?

Boltzmann (Wikipedia)
  • Post covers the paper: H-theorem do-conjecture, M. Süzen, arXiv:2310.01458 (2023) 

Preamble

Probably the most important human achievement is the ability to produce scientific discoveries, which help us objectively understand how nature works and build tools no other species can. Entropy is an elusive concept and one of the crowning achievements of the human race. We question here whether causal inference and Loschmidt's paradox can be reconciled. 


Mimicking analogies are not physical

Before we even try to understand what a physical entropy is, we should be clear that there is only one kind of physical entropy from thermodynamics, formulated by Gibbs and Boltzmann ($S_{G}$ and $S_{B}$). Other entropies, such as Shannon's information entropy, are analogies to physics, mimicking concepts.

Why is counting microstates associated with time?

The following definition of entropy is due to Boltzmann; Gibbs' formulation is technically different, but the two are equivalent.

Definition 1: The entropy of a macroscopic material is associated with the large number of different states, $\Omega$, its constituent elements can take. This is $S_{B}$, Boltzmann's entropy.  

Now, as we know from basic thermodynamics classes, the entropy of an isolated system cannot decrease; hence time's arrow. 

Definition 2: Time's arrow is identified with the change in entropy of material systems, $\delta S \ge 0$.

We put aside the distinction between open and closed systems, and between equilibrium and non-equilibrium dynamics, and concentrate on the question: how does counting a system's states come to be associated with time's arrow? 

Loschmidt's Paradox: Irreversible occupancy on discrete states and causal inference

The core idea can probably be explained via a discrete lattice and its occupancy over a chain of dynamics. 

Conjecture 1: Occupancy of $N$ items on $M$ discrete states, $M>N$, evolving with dynamical rules $\mathscr{D}$ necessarily increases $\Omega$, compared to the number of configurations if it were $M=N$. 
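The counting in Conjecture 1 can be sketched numerically. This is a minimal illustration, assuming at most one item per state, so that the number of occupancy configurations is the binomial coefficient $C(M,N)$; the function name occupancy_states is ours, not from the paper:

```python
from math import comb

# Number of ways to occupy M discrete states with N indistinguishable
# items, at most one item per state: C(M, N). For M = N there is exactly
# one configuration, so any M > N strictly increases the count.
def occupancy_states(M, N):
    return comb(M, N)

N = 10
for M in (10, 12, 20, 50):
    print(M, occupancy_states(M, N))
# M = N = 10 gives a single configuration; M = 20 already gives 184756.
```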

This conjecture might explain the entropy increase, but irreversibility of the dynamical rule $\mathscr{D}$ is required to address Loschmidt's Paradox, i.e., how to generate irreversible evolution given time-reversal dynamics. Do-calculus may provide a language to resolve this, by inducing interventional notation on Boltzmann's H-theorem from a Pearlian view. The full definition of the H-function is a bit more involved, but here we summarise it in condensed form with a do-operator version of it.

Conjecture 2 (H-Theorem do-conjecture): Boltzmann's H-function provides a basis for entropy increase; it is associated with the conditional probability of a system $\mathscr{S}$ being in state $X$ given an ensemble $\mathscr{E}$, hence $P(X|\mathscr{E})$. Then, an irreversible evolution from time-reversal dynamics should use the interventional notation $P(X|do(\mathscr{E}))$. The information on how time-reversal dynamics leads to time's arrow is then encoded in how the dynamics provides interventional ensembles, $do(\mathscr{E})$.
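To make the notational distinction concrete, here is a generic Pearlian illustration, not the paper's construction: a hypothetical confounder $Z$ influences both an ensemble variable $E$ and the state $X$, so conditioning on $E$ and intervening with $do(E)$ give different answers. All variable names and probabilities below are illustrative assumptions:

```python
import random

random.seed(0)

# Hypothetical structural causal model (illustrative only):
#   Z -> E, Z -> X, E -> X.
# Conditioning on E = 1 keeps the bias of the confounder Z, while
# do(E = 1) overrides the mechanism generating E, breaking the Z -> E arrow.
def sample(do_e=None):
    z = random.random() < 0.5
    e = z if do_e is None else do_e      # the intervention replaces the mechanism
    x = random.random() < (0.9 if (e and z) else 0.2)
    return z, e, x

n = 100_000
obs = [sample() for _ in range(n)]
e_obs = [(z, e, x) for z, e, x in obs if e]
p_cond = sum(x for _, _, x in e_obs) / len(e_obs)

intv = [sample(do_e=True) for _ in range(n)]
p_do = sum(x for _, _, x in intv) / n

print(f"P(X=1 | E=1)     ~ {p_cond:.2f}")   # close to 0.9: observing E=1 implies Z=1 here
print(f"P(X=1 | do(E=1)) ~ {p_do:.2f}")     # close to 0.55: under do(), Z stays 50/50
```

The gap between the two numbers is exactly what the interventional notation is designed to express.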

Conclusion

We provided some hints on why counting states would lead to time's arrow, i.e., irreversible dynamics. In light of the development of a mathematical language for causal inference in statistics, the concepts are converging. Understanding Loschmidt's Paradox via do-calculus can establish an asymmetric notation. Loschmidt's question is a long-standing problem in physics and philosophy, with great practical implications in different physical sciences.

Further reading

Please cite as follows:

 @misc{suezen23lpc, 
     title = {Loschmidt's Paradox and Causality: Can we establish Pearlian expression for Boltzmann's H-theorem?}, 
     howpublished = {\url{https://science-memo.blogspot.com/2023/02/loschimidts-do-calculus.html}}, 
     author = {Mehmet Süzen},
     year = {2023}
}  

@article{suzen23htd,
    title={H-theorem do-conjecture},
    author={Mehmet Süzen},
    journal={arXiv preprint arXiv:2310.01458},
    url = {https://arxiv.org/abs/2310.01458},
    year={2023}
}

Saturday, 18 February 2023

Insights into Bekenstein entropy with intuitive mathematical definitions:
A look into the thermodynamics of black holes

Jacob Bekenstein (Wikipedia)
Preamble

The thermodynamics of black holes has emerged as one of the most interesting areas of research in theoretical physics [Wald1994], especially after LIGO's massive success. The striking results of Jacob Bekenstein [Bekenstein1973], proposing a formulation of entropy for a black hole, were one of the most important turning points in building explanations for the thermodynamics of gravitational systems. Bekenstein entropy is defined as a so-called phenomenological relationship, and it is a surprisingly easy concept to understand using basic dimensional analysis. In this post, we will show how to understand the entropy of a black hole using only basic dimensional analysis, fundamental physical constants, and the basic definition of entropy. 

Dimensions and scales

Dimensional analysis appears in many different areas of physics and engineering, from fluid dynamics to relativity. The starting point is to understand the concept of dimensions. Every quantity we measure in real life has a dimension. It means a quantity $\mathscr{Q}$ we obtain from a measurement $\mathscr{M}$ has a numeric value $v$ and an associated unit $u$: $\mathscr{Q}=\langle v, u \rangle$ given $\mathscr{M}$. There are three distinct fundamental unit types: length (L), time (T), and mass (M).

Intuitive Bekenstein entropy (BE) for a black hole: Informal mathematical definition

Black holes are astronomical objects that are not directly observable due to their mass being condensed in a small area. The primary object we will use is the Planck length $L_{p}$: it is the smallest physically possible patch of space-time, and it is associated with the states of a black hole on its horizon. We won't define the Planck length here in detail, but with knowledge of the fundamental physical constants and the dimensional analysis we mentioned, one can obtain a constant value for this length. 
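The dimensional-analysis route can be sketched in a few lines: $L_{p}=\sqrt{\hbar G/c^{3}}$ is the combination of $\hbar$, $G$, and $c$ with the dimension of length, as the comment below checks term by term:

```python
from math import sqrt

# Planck length from fundamental constants: L_p = sqrt(hbar * G / c^3).
# Dimension check: [hbar] = M L^2 T^-1, [G] = M^-1 L^3 T^-2,
# [c^3] = L^3 T^-3, so hbar * G / c^3 has dimension L^2.
hbar = 1.054571817e-34   # reduced Planck constant, J s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m / s

L_p = sqrt(hbar * G / c**3)
print(f"Planck length ~ {L_p:.3e} m")   # ~1.616e-35 m
```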

Definition: Finite entropy $S_{f}$ of an object is associated with the number of states $\Omega$ a system can attain.

Combining this definition for a black-hole entropy: 

Definition: The finite entropy of a black hole, $S_{f}^{BH}$, is associated with the number of its states $\Omega$, the number of elements on its surface area $A$. The surface is discretised into small patches $a_{p}=L_{p}^{2}$. Then, intuitively, $\Omega$ yields $A$ divided by $a_{p}$.
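To get a feel for the magnitudes, here is a rough sketch, assuming a non-rotating (Schwarzschild) black hole of one solar mass, counting Planck-sized patches on its horizon:

```python
from math import pi

# Rough numerical sketch of the definition above for a Schwarzschild
# black hole of one solar mass: Omega ~ A / a_p, with a_p = L_p^2.
G = 6.67430e-11          # m^3 kg^-1 s^-2
c = 2.99792458e8         # m / s
hbar = 1.054571817e-34   # J s
M_sun = 1.989e30         # kg

r_s = 2 * G * M_sun / c**2   # Schwarzschild radius, ~2.95 km
A = 4 * pi * r_s**2          # event-horizon area
a_p = hbar * G / c**3        # Planck patch, L_p^2
Omega = A / a_p
print(f"r_s ~ {r_s:.0f} m, Omega ~ {Omega:.1e} Planck patches")
```

The astronomically large patch count is what makes the association with entropy plausible at all.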
  
Bekenstein entropy is not thermodynamic entropy alone and family of Bekenstein entropies

The unit analysis tells us that $A$ has the dimension of length squared. We intentionally omit any equality in the above definition of $S_{f}^{BH}$ because, in practice, Bekenstein entropy is not thermodynamic entropy alone. The formulation usually presented as BE in general uses equality for the above approach. However, it is not strictly thermodynamic alone; that is why we specify the definitions as finite entropy and only express the relationship as an association. Similarly, introducing other constants can yield different Bekenstein entropies, so that new constants would yield a family of Bekenstein entropies.

Why does surface area define the states of a black hole?

This is an amazing question, and Bekenstein's main contribution is to associate the states of a black hole with its event horizon, i.e., the point-of-no-return layer from which ordinary matter can't escape. The justification is that all other properties of the black hole define this surface. Here is the intuitive definition of the states of a black hole.

Definition: A surface area $\mathscr{A}$ is formed by the set of physical properties forming ensembles, such as charge density and angular momentum. These ensembles indirectly sample thermodynamic ensembles. 

Even though the intuition is there, this might still remain an open question.

Conclusion

We conveyed, intuitively, the primary idea Bekenstein tried to convey in his 1973 paper. However, we identify its thermodynamic limit as an open research area. The thermodynamic limit implies taking the infinite limit of both the area and the discretised patches simultaneously; even though it sounds as if the values might diverge, the simultaneous limit would converge to a finite value for physical matter. 

Primary Papers
  • [Bekenstein1973] J. D. Bekenstein, Black Holes and Entropy, Phys. Rev. D 7, 2333 (1973).
Primary Book
  • [Wald1994] R. M. Wald, Quantum Field Theory in Curved Spacetime and Black Hole Thermodynamics, University of Chicago Press (1994).

Please cite as follows:

 @misc{suezen23ibe, 
     title = {Insights into Bekenstein entropy with intuitive mathematical definitions}, 
     howpublished = {\url{https://science-memo.blogspot.com/2023/02/bekenstein-entropy.html}}, 
     author = {Mehmet Süzen},
     year = {2023}
  }

Postscript A: 

Information can’t be destroyed


Proposals in which information is destroyed into thin air are a red flag for any physical theory; this includes theories of evaporating black holes. Bekenstein's insight in this direction is that surface area is associated with entropy. Black-hole information in this context is quite different from Shannon's entropy. For an evaporating black hole, the area approaching zero is not the same as the information going to zero: the surface area is a function of the physical properties of the stellar object, which are bound by conservation laws in their interaction with the surroundings. Hence, the information is preserved even if the area goes to zero.


Postscript B: 

What is the holographic principle? Its origins from the Bekenstein entropy perspective

The word embedding applies in this context as well. Embedding implies some sort of dimensionality projection: a projection to a lower-dimensional space or, at the other end, to a higher-dimensional space. Holography is no different. Imagine taking 2D snapshots of rotating 3D objects; generating this in reverse is the end effect of holographic reconstruction, an N-dimension to (N-1)-dimension projection. This is the basis of the holographic principle: the entropy of a black hole does not appear as all the states of its constituent matter, as it normally would for ordinary matter; it manifests as an (N-1) projection on its surface. This kind of holographic entropy was first noted by Bekenstein, whereby he assigned the event-horizon area as a representation of the states of the black-hole volume. This projection to (N-1) dimensions was generalised beyond Bekenstein's approach, explaining how the universe might be a hologram entirely, by Gerard 't Hooft and Leonard Susskind. The holographic principle is probably one of the most important developments in theoretical physics in recent times.



Tuesday, 15 November 2022

Differentiating ensembles and sample spaces: Alignment between statistical mechanics and probability theory

Preamble 

Sample space is a primary concept introduced in probability and statistics books and papers. However, there needs to be more clarity about what constitutes a sample space in general: there is no explicit distinction between the unique event set and the replica sets. The resolution of this ambiguity lies in the concept of an ensemble. The concept was first introduced by the American theoretical physicist and engineer Gibbs in his book Elementary Principles in Statistical Mechanics. The primary utility of an ensemble is as a mathematical construction that differentiates between samples and how they form extended objects. 

In this direction, we provide the basics of constructing ensembles from sample spaces in a pedagogically accessible way, clearing up a possible misconception. This usage of ensemble prevents the overuse of the term sample space for different things. We introduce some basic formal definitions.

    Figure: Gibbs's book
 introduced the concept of
ensemble (Wikipedia).

What did Gibbs have in mind in constructing statistical ensembles?

A statistical ensemble is a mathematical tool that connects statistical mechanics to thermodynamics. The concept lies in defining microscopic states for molecular dynamics; in statistics and probability, this corresponds to a set of events. Though these events are different at a microscopic level, they are sampled from a single thermodynamic ensemble, a representative of varying material properties or, in general, a set of independent random variables. In dynamics, micro-states sample an ensemble. This simple idea helped Gibbs build a mathematical formalism of statistical mechanics companion to Boltzmann's theories.

Differentiating sample space and ensemble in general

The primary confusion in probability theory about what constitutes a sample space is that there is no distinction between primitive events and events composed of primitive events; we call both sets sample space. This terminology is easily overlooked in general, as we concentrate on the event set, not the primitive event set, in solving practical problems.   

Definition: A primitive event $\mathscr{e}$ is a logically distinct unit of experimental realisation that is not composed of any other events.

Definition: A sample space $\mathscr{S}$ is a set formed by all $N$ distinct primitive events $\mathscr{e}_{i}$.  

By this definition, regardless of how many fair coins are used, or whether a single coin is tossed in a sequence, the sample space is always $\{H,T\}$, because these are the most primitive distinct events the system can have, i.e., the outcomes of a single coin. However, the statistical ensemble can be different. For example, for two fair coins, or coin tosses in a sequence of length two, the corresponding ensemble of system size two reads $\{HH, TT, HT, TH\}$. The definition of an ensemble then follows. 

Definition: An ensemble $\mathscr{E}$ is a set of ordered tuples of primitive events $\mathscr{e}_{i}$. These event sets can be sampled with replacement, but order matters, i.e., $(e_{i}, e_{j}) \ne (e_{j}, e_{i})$, $i \ne j$.

Our two-coin example's ensemble should formally be written as $\mathscr{E}=\{(H,H), (T,T), (H,T), (T,H)\}$; as order matters, the members $HT$ and $TH$ are distinct. Obviously, for a single toss, the ensemble and the sample space are the same. 
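The two definitions can be checked mechanically; a small sketch building the size-two ensemble from the single-coin sample space:

```python
from itertools import product

# Sample space: the primitive events of a single coin.
sample_space = {"H", "T"}

# Ensemble of system size two: all ordered tuples of primitive events,
# sampled with replacement, where order matters.
ensemble = set(product(sample_space, repeat=2))

print(sorted(sample_space))   # ['H', 'T']
print(sorted(ensemble))       # ('H','T') and ('T','H') appear as distinct members
```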

Ergodicity makes the need for differentiation much clearer: Time and ensemble averaging 

The above distinction makes building time and ensemble averaging much easier. Ensemble averaging is obvious: we know the ensemble set, and we average a given observable over that set. Time averaging could then be achieved by curating a much larger set, resampling with replacement from the ensemble. Note that the resulting time-average value would not be unique, as one can generate many different sample sets from the ensemble. However, bear in mind that the definition of how to measure convergence to the ergodic regime is not unique.
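As a sketch of both averages for the two-coin ensemble, with the number of heads as the observable (the "trajectory" below is just i.i.d. resampling, an assumed stand-in for actual dynamics):

```python
import random
from itertools import product

random.seed(42)

# Ensemble of two coins; observable = number of heads in a configuration.
ensemble = list(product("HT", repeat=2))
def observable(cfg):
    return cfg.count("H")

# Ensemble average: uniform average over the ensemble set.
ensemble_avg = sum(observable(c) for c in ensemble) / len(ensemble)

# Time average: resample with replacement from the ensemble and average
# along the resulting (surrogate) trajectory.
trajectory = random.choices(ensemble, k=100_000)
time_avg = sum(observable(c) for c in trajectory) / len(trajectory)

print(ensemble_avg)         # 1.0
print(round(time_avg, 2))   # close to 1.0, but trajectory-dependent
```

Rerunning with a different seed yields a slightly different time average, which is exactly the non-uniqueness noted above.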

Conclusion

Even though the distinction we made may sound obscure, this alignment between statistical mechanics and probability theory may clarify the conception of ergodic regimes for general practitioners.

Further reading

Please Cite:

 @misc{suezen22dess, 
     title = {Differentiating ensembles and sample spaces: Alignment between statistical mechanics and probability theory}, 
     howpublished = {\url{https://science-memo.blogspot.com/2022/11/ensembles-probability-theory.html}}, 
     author = {Mehmet Süzen},
     year = {2022}
}  

Postscript

  • If multiple events come from a set of primitive events, the compositional outcomes are considered to be an ensemble, not a sample space. A sample space is a set that we sample from, one or multiple times, to build an ensemble. The ensemble notion within a pure ML context was also noticed by the late David J. C. MacKay in his book Information Theory, Inference and Learning Algorithms, Cambridge University Press (2003).


Tuesday, 13 May 2014

Is ergodicity a reasonable hypothesis? Understanding Boltzmann's ergodic hypothesis

Ergodic vs. non-ergodic
trajectories (Wikipedia)
Many undergraduate physics students barely study the ergodic hypothesis in detail. It is usually manifested as ensemble averages being equal to time averages. While the concept of the statistical ensemble may be accessible to students, when it comes to ergodic theory and theorems, where higher-level mathematical jargon kicks in, it may be confusing for the novice reader, or even for practicing physicists and educators, what ergodicity really means. For example, a recent preprint titled "Is ergodicity a reasonable hypothesis?" defines ergodicity as follows:
...In the physics literature "ergodicity" is taken to mean that a system, including a macroscopic one, visits all microscopic states in a relatively short time...[link]
Visiting all microscopic states is not a precondition for ergodicity from the statistical physics standpoint. This form of the theory is the manifestation of the strong ergodic hypothesis, because of the Birkhoff theorem, and may not reflect the physical meaning of ergodicity. However, the originator of the ergodic hypothesis, Boltzmann, had a different thing in mind in explaining how a system approaches thermodynamic equilibrium. One of the best explanations is given in the book by J. R. Dorfman, titled An Introduction to Chaos and Nonequilibrium Statistical Mechanics [link]; in section 1.3, Dorfman explains what Boltzmann had in mind:
...Boltzmann then made the hypothesis that a mechanical system's trajectory in phase-space will spend equal times in regions of equal phase-space measure. If this is true, then any dynamical system will spend most of its time in phase-space region where the values of the interesting macroscopic properties are extremely close to the equilibrium values...[link]
That said, Boltzmann did not suggest that a system should visit ALL microscopic states. His argument only suggests that states close to equilibrium are more likely to be visited.

Postscript (June 2022)

The sufficiency of Sparse Visits: Physical states are rarely fine-grained

Due to the ergodic theorems of Birkhoff and von Neumann, attaining ergodicity appears to require visiting all possible states or regions. This requirement is not correct for physics. The key concepts here are coarse-graining and the sufficiency of sparse visits. Most physical systems have equally likely microstates.


The generated dynamics would rarely need to visit all accessible states or regions. Physical systems are rarely fine-grained and have a degree of sparseness, reducing their astronomically large number of states to a handful of coarse-grained regions. In summary, visiting all physical states or regions in time averages is not strictly needed for the physics definition of ergodicity.


Only a collection of regions, or multiple states with higher probability, needs to be covered to achieve thermodynamic equilibrium: a concept of “sufficiency of sparse visits”. This approach makes physical experiments over a finite time consistent with thermodynamics.
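A small sketch of sparse visits, assuming a toy system of N coins explored by single-flip dynamics: the walk touches a vanishing fraction of the $2^{N}$ microstates, yet spends most of its time in macrostates (head counts) near the equilibrium value $N/2$:

```python
import random

random.seed(7)

N, steps = 40, 200_000
state = [random.randint(0, 1) for _ in range(N)]
visited = set()
near_equilibrium = 0

for _ in range(steps):
    state[random.randrange(N)] ^= 1          # flip one randomly chosen coin
    visited.add(tuple(state))
    if abs(sum(state) - N / 2) <= 0.1 * N:   # within 10% of the N/2 macrostate
        near_equilibrium += 1

print(f"fraction of microstates visited: {len(visited) / 2**N:.1e}")
print(f"fraction of time near equilibrium: {near_equilibrium / steps:.0%}")
```

The visited fraction stays below one in a million of the $2^{40}$ microstates, while the trajectory spends most of its steps in the near-equilibrium band.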




Sunday, 23 October 2011

Simulation of rare events: a review

A recent review on rare-event simulation, from both the stochastic and Hamiltonian dynamics points of view, is presented by a French group [ article ].

Friday, 25 February 2011

Quantum mechanical Szilard engine

A recent article on the analysis of the Quantum Szilard Engine (QSZE) has appeared [link]. A Japanese team derived an explicit analytical expression for the work done by an arbitrary number of molecules.

Saturday, 29 January 2011

The need for physics in cell biology

Physics and cell biology used to be very distinct fields, but nowadays more and more theoretical-physics-oriented researchers have moved into this domain and the life sciences, especially from the statistical mechanics and complex systems point of view, and even people from string theory! A recent article that reviews this perspective has appeared [link].
Along similar lines, the article [link] explores the role of physics in biology in generic terms. In a humorous way, current funding policies are summarized: "all science is either biology or tool-making for biology or not fundable".

Thursday, 23 December 2010

Quantum Monte Carlo: Fast force computation

A recent article [doi] by an Italian group proposes the use of algorithmic differentiation for force computations in the Quantum Monte Carlo technique.

Sunday, 24 October 2010

Pair potentials are not that bad: Water & Silica

One of the challenges in modeling water or silica is figuring out the correct description of the inter-atomic potential. Commonly used models for water incorporate 3-body interactions and charges; B. Guillot has a review on the subject [doi]. However, there are attempts to use only pair potentials for water [doi], as well as for amorphous silica [doi].
(c) Copyright 2008-2024 Mehmet Suzen (suzen at acm dot org)

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.