LOG#157. Superstatistics (II).


Following the previous article, I will now discuss some interesting points from the paper:

Generalized information entropies depending only on the probability distribution, by O. Obregón and A. Gil-Villegas.

You can find the paper here: http://arxiv.org/abs/1206.3353. The Boltzmann factor for a generalized superstatistics reads


    \[ \boxed{B (E)=\displaystyle{\int_0^\infty d\beta f(\beta) e^{-\beta E}}}\]

for a normalized distribution function f(\beta) of the fluctuating inverse temperature \beta. In the case of a gamma distribution,


    \[ f(p_l;\beta)=\dfrac{\beta^{\frac{1}{p_l}-1}}{\Gamma\left(\frac{1}{p_l}\right)\left(p_l\beta_0\right)^{1/p_l}}\,e^{-\beta/(p_l\beta_0)}\]

the Boltzmann factor can be evaluated in closed form


    \[ \boxed{B(p_l;E)=\left(1+p_l\beta_0E\right)^{-1/p_l}}\]

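As a quick numerical sanity check (a sketch: the shape/scale parametrization and the parameter values below are my own illustrative choices, not taken from the paper), a gamma distribution with shape 1/p_l and scale p_l\beta_0 indeed reproduces this closed form:

```python
import math

# Sketch: verify numerically that a gamma distribution of the inverse
# temperature beta, with shape a = 1/p_l and scale b = p_l*beta_0,
# reproduces B(p_l; E) = (1 + p_l*beta_0*E)**(-1/p_l).
# The values of p_l, beta_0 and E are illustrative choices.

def gamma_pdf(beta, a, b):
    """Gamma density with shape a and scale b."""
    return beta ** (a - 1) * math.exp(-beta / b) / (math.gamma(a) * b ** a)

def boltzmann_numeric(p_l, beta_0, E, n=200000, beta_max=50.0):
    """Trapezoidal estimate of int_0^infty f(beta) exp(-beta*E) dbeta."""
    a, b = 1.0 / p_l, p_l * beta_0
    h = beta_max / n
    total = 0.0
    for i in range(1, n):  # both endpoints contribute ~0 here
        beta = i * h
        total += gamma_pdf(beta, a, b) * math.exp(-beta * E)
    return h * total

p_l, beta_0, E = 0.5, 1.0, 2.0
closed = (1 + p_l * beta_0 * E) ** (-1.0 / p_l)
numeric = boltzmann_numeric(p_l, beta_0, E)
print(closed, numeric)  # the two values agree to several decimal places
```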
However, as the paper notes, the Boltzmann factor cannot generally be obtained in analytical closed form for an arbitrary distribution function. For instance, the log-normal distribution function


    \[ f(p_l;\beta)=\dfrac{1}{\sqrt{2\pi}\beta (\ln (p_l+1))^{1/2}}\exp{\left(-\dfrac{\left[\ln \frac{\beta (p_l+1)^{1/2}}{\beta_0}\right]^2}{2\ln(p_l+1)}\right)}\]

and the F-distribution


    \[ f(p_l;\beta)=\dfrac{\Gamma \left(\frac{8p_l-1}{2p_l-1}\right)}{\Gamma \left(\frac{4p_l+1}{2p_l-1}\right)}\dfrac{1}{\beta_0^2}\left(\dfrac{2p_l-1}{p_l+1}\right)\dfrac{\beta}{\left(1+\frac{\beta}{\beta_0}\frac{2p_l-1}{p_l+1}\right)^{\frac{8p_l-1}{2p_l-1}}}\]

admit no closed-form Boltzmann factor, but the paper provides an approximate form valid for small variance. Indeed, Beck and Cohen showed that all superstatistics depending on a constant parameter q coincide for a small enough variance of the fluctuations! That is a powerful result.
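This universality is easy to illustrate numerically. The following sketch (all numerical values are my own illustrative assumptions) compares the Boltzmann factors of a gamma and a log-normal distribution sharing the same mean \beta_0 and the same small variance:

```python
import math

# Sketch of Beck-Cohen universality: two different distributions f(beta)
# with the same mean beta_0 and the same small variance give nearly the
# same Boltzmann factor B(E). All numerical values are illustrative.

def trapz(f, x0, x1, n=100000):
    """Composite trapezoidal rule for int_{x0}^{x1} f(x) dx."""
    h = (x1 - x0) / n
    s = 0.5 * (f(x0) + f(x1))
    for i in range(1, n):
        s += f(x0 + i * h)
    return s * h

beta_0, var, E = 1.0, 0.01, 1.5

# Gamma distribution with mean beta_0 and variance var (shape a, scale b).
a, b = beta_0 ** 2 / var, var / beta_0
def gamma_f(beta):
    return beta ** (a - 1) * math.exp(-beta / b) / (math.gamma(a) * b ** a)

# Log-normal distribution with the same mean and variance.
s2 = math.log(1 + var / beta_0 ** 2)
mu = math.log(beta_0) - s2 / 2
def logn_f(beta):
    return math.exp(-(math.log(beta) - mu) ** 2 / (2 * s2)) \
        / (beta * math.sqrt(2 * math.pi * s2))

B_gamma = trapz(lambda beta: gamma_f(beta) * math.exp(-beta * E), 1e-9, 10.0)
B_logn = trapz(lambda beta: logn_f(beta) * math.exp(-beta * E), 1e-9, 10.0)
print(B_gamma, B_logn)  # nearly equal; both close to (1 + b*E)**(-a)
```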

As we saw in the previous blog post, superstatistics with a gamma distribution function provides the entropic function


    \[ \boxed{-\dfrac{S}{k}=\displaystyle{\sum_{l=1}^\Omega \left(p_l\ln p_l+\dfrac{(p_l\ln p_l)^2}{2!}+\dfrac{(p_l\ln p_l)^3}{3!}+\ldots\right)}}\]

and the first term is just the Shannon entropy. The functional


    \[ \displaystyle{\Phi=\dfrac{S}{k}-\gamma \sum_{l=1}^\Omega p_l-\beta \sum_{l=1}^\Omega p_l^{p_l+1}E_l}\]

is extremized, with \gamma and \beta playing the role of Lagrange multipliers for normalization and mean energy. The condition

    \[ \dfrac{\partial \Phi}{\partial p_l}=0\]

allows us to calculate p_l in terms of S, and the Boltzmann factor


    \[ \boxed{B_\Omega (E)=\left(1+\dfrac{\beta_0 E}{\Omega}\right)^{-\Omega}}\]

is obtained after the equipartition of energy is applied. From this Boltzmann factor, the entropy formula reads off easily


    \[ \boxed{S=k\Omega\left(1-\Omega^{-1/\Omega}\right)}\]

That entropy expands into

    \[ \dfrac{S}{k}=\dfrac{S_B}{k}-\dfrac{1}{2!}e^{-S_B/k}\left(\dfrac{S_B}{k}\right)^2+\dfrac{1}{3!}e^{-2S_B/k}\left(\dfrac{S_B}{k}\right)^3-\ldots\]

This superstatistics entropy and the Shannon entropy differ in the range of a small number of microstates \Omega, i.e. of large probabilities p_l.
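A short numerical sketch (the value of \Omega is an illustrative choice) verifies both the closed form S=k\Omega(1-\Omega^{-1/\Omega}) and the first terms of its expansion in S_B:

```python
import math

# Sketch: for equal probabilities p_l = 1/Omega the entropy
# S/k = sum_l (1 - p_l**p_l) reduces to Omega*(1 - Omega**(-1/Omega)),
# and its expansion in S_B/k = ln(Omega) reproduces the series above.
# The value of Omega is an illustrative choice.

Omega = 1000
p = 1.0 / Omega

S_over_k = sum(1 - p ** p for _ in range(Omega))
closed = Omega * (1 - Omega ** (-1.0 / Omega))

S_B = math.log(Omega)  # Boltzmann/Shannon entropy over k for p_l = 1/Omega
series = S_B - math.exp(-S_B) * S_B ** 2 / 2 + math.exp(-2 * S_B) * S_B ** 3 / 6

print(S_over_k, closed, series)  # all three agree closely
```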

Next, the paper considers the Kaniadakis entropy, defined by

    \[ \boxed{\displaystyle{S_\kappa=-k\sum_{l=1}^\Omega \dfrac{p_l^{1+\kappa}-p_l^{1-\kappa}}{2\kappa}}}\]

Inspired by this entropy function, the authors propose the generalized entropy

    \[ \boxed{\displaystyle{S=-k\sum_{l=1}^\Omega \dfrac{p_l^{p_l}-p_l^{-p_l}}{2}}}\]

whose Taylor expansion for small p_l reads

    \[ -\dfrac{S}{k}=\displaystyle{\sum_{l=1}^\Omega \left(p_l\ln p_l+\dfrac{\left(p_l\ln p_l\right)^3}{3!}+\ldots\right)}\]

The first term is again the Shannon entropy, as in the superstatistics case, but the even terms of the expansion are now absent: only the odd terms survive, in contrast with the previously studied generalized entropy. A similar procedure can be applied to the Sharma-Mittal-Taneja biparametric entropies

    \[ \boxed{S_{\kappa, r}=-k\displaystyle{\sum_{l=1}^\Omega p_l^r\left(\dfrac{p_l^\kappa-p_l^{-\kappa}}{2\kappa}\right)}}\]

where now \kappa and r are not just constants but functions depending on p_l.
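Returning to the Kaniadakis-inspired entropy above: since p_l^{p_l}-p_l^{-p_l}=2\sinh(p_l\ln p_l), its odd-only expansion can be checked numerically (the probability vector below is an arbitrary illustrative choice):

```python
import math

# Sketch: since p**p - p**(-p) = 2*sinh(p*ln p), the entropy
# S/k = -sum_l (p_l**p_l - p_l**(-p_l))/2 expands in x_l = p_l*ln p_l
# with odd powers only: -S/k = sum_l (x_l + x_l**3/3! + x_l**5/5! + ...).
# The probability vector below is an arbitrary illustrative choice.

p = [0.2, 0.3, 0.5]
assert abs(sum(p) - 1.0) < 1e-12  # a normalized distribution

S_over_k = -sum((q ** q - q ** (-q)) / 2 for q in p)

x = [q * math.log(q) for q in p]
odd_series = sum(xi + xi ** 3 / 6 + xi ** 5 / 120 for xi in x)

print(-S_over_k, odd_series)  # the truncated odd series matches -S/k closely
```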

The final part of the paper discusses the axiomatic foundation of information theory (the Khinchin axioms) and how these axioms are modified for non-extensive entropies. The axiom that gets modified is the one concerning the joint probability of independent systems. These issues of non-additivity and non-extensivity are very relevant in complex systems (even those with fractal and multifractal structures). I will discuss the Khinchin axioms and the informational ground of physics here on this blog in the future, but I will stop the discussion of this subject today.

See you in my third and last introductory superstatistics blog post!
