# LOG#157. Superstatistics (II).

Following the previous article, I will now discuss some interesting points in the paper:

Generalized information entropies depending only on the probability distribution, by O. Obregón and A. Gil-Villegas.

You can find the paper here: http://arxiv.org/abs/1206.3353. The Boltzmann factor for a generalized superstatistics reads

$$B(E)=\int_0^\infty f(\beta)e^{-\beta E}\,d\beta$$

for a distribution function $f(\beta)$ of the fluctuating inverse temperature, normalized so that $\int_0^\infty f(\beta)\,d\beta=1$.

In the case of a gamma distribution,

$$f(\beta)=\dfrac{1}{b^c\Gamma(c)}\beta^{c-1}e^{-\beta/b}$$

with mean $\beta_0=bc$, the Boltzmann factor can be evaluated in closed form:

$$B(E)=\left(1+bE\right)^{-c}=\left(1+(q-1)\beta_0E\right)^{-1/(q-1)}\qquad q=1+\dfrac{1}{c}$$
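As a quick numerical sanity check (mine, not from the paper), one can integrate the gamma-weighted Boltzmann factor $B(E)=\int_0^\infty f(\beta)e^{-\beta E}\,d\beta$ directly and compare it with the $q$-exponential closed form $(1+bE)^{-c}$, assuming the standard gamma parametrization $f(\beta)\propto\beta^{c-1}e^{-\beta/b}$; the values of $b$, $c$ and $E$ below are arbitrary choices for illustration:

```python
import math

def gamma_pdf(beta, b, c):
    """Gamma distribution f(beta) with scale b and shape c."""
    return beta ** (c - 1) * math.exp(-beta / b) / (b ** c * math.gamma(c))

def boltzmann_numeric(E, b, c, beta_max=60.0, n=120000):
    """Trapezoidal estimate of B(E) = int_0^oo f(beta) exp(-beta*E) dbeta."""
    h = beta_max / n
    # both endpoints contribute nothing here: f(0) = 0 for c > 1, tail ~ 0
    return h * sum(gamma_pdf(i * h, b, c) * math.exp(-i * h * E)
                   for i in range(1, n))

# illustrative values (my choice, not the paper's): q = 1 + 1/c, beta0 = b*c
b, c, E = 0.5, 3.0, 2.0
closed = (1.0 + b * E) ** (-c)     # the q-exponential closed form
numeric = boltzmann_numeric(E, b, c)
print(closed, numeric)             # the two should agree to several decimals
```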

but, as the paper above notes, the Boltzmann factor cannot generally be obtained in analytical closed form for arbitrary distribution functions. For instance, the log-normal distribution function

$$f(\beta)=\dfrac{1}{\beta s\sqrt{2\pi}}\exp\left(-\dfrac{\left(\ln(\beta/\mu)\right)^2}{2s^2}\right)$$

and the F-distribution

$$f(\beta)=\dfrac{\Gamma\left(\frac{v+w}{2}\right)}{\Gamma\left(\frac{v}{2}\right)\Gamma\left(\frac{w}{2}\right)}\left(\dfrac{bv}{w}\right)^{v/2}\dfrac{\beta^{v/2-1}}{\left(1+\frac{vb}{w}\beta\right)^{(v+w)/2}}$$

admit no closed-form Boltzmann factor, but the paper provides an approximate form for small variance. Beck and Cohen showed that all superstatistics depending on a single parameter $q$ coincide for a small enough variance of the fluctuations! That is a powerful result.
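The small-variance universality can be illustrated numerically (my sketch, not the paper's calculation): fit a gamma and a log-normal distribution to the same mean $\beta_0$ and the same small variance, evaluate $B(E)=\int_0^\infty f(\beta)e^{-\beta E}\,d\beta$ by the trapezoidal rule for both, and observe that they agree with each other far better than either agrees with the ordinary factor $e^{-\beta_0E}$; the values of $\beta_0$, the variance and $E$ are arbitrary choices:

```python
import math

def boltzmann_numeric(logpdf, E, beta_max=3.0, n=30000):
    """Trapezoidal estimate of B(E) = int_0^oo f(beta) exp(-beta*E) dbeta."""
    h = beta_max / n
    return h * sum(math.exp(logpdf(i * h) - i * h * E) for i in range(1, n))

beta0, var = 1.0, 0.01   # common mean and (deliberately small) variance

# gamma distribution with mean beta0 = b*c and variance var = b^2*c
c, b = beta0 ** 2 / var, var / beta0

def log_gamma_pdf(beta):
    return (c - 1) * math.log(beta) - beta / b - c * math.log(b) - math.lgamma(c)

# log-normal distribution matched to the same mean and variance
s2 = math.log(1.0 + var / beta0 ** 2)
mu = math.log(beta0) - s2 / 2.0

def log_lognormal_pdf(beta):
    return (-(math.log(beta) - mu) ** 2 / (2.0 * s2)
            - math.log(beta * math.sqrt(2.0 * math.pi * s2)))

E = 1.0
B_gamma = boltzmann_numeric(log_gamma_pdf, E)
B_logn = boltzmann_numeric(log_lognormal_pdf, E)
# B_gamma and B_logn nearly coincide; both differ visibly from exp(-beta0*E)
print(B_gamma, B_logn, math.exp(-beta0 * E))
```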

As we saw in the previous blog post, the superstatistics with a gamma distribution function provides the entropic function

$$S=k\sum_{l=1}^{\Omega}\left(1-p_l^{p_l}\right)$$

and the first term of its expansion is just the Shannon entropy. The functional

$$\Phi=\dfrac{S}{k}+\alpha\left(1-\sum_{l=1}^{\Omega}p_l\right)+\beta\left(U-\sum_{l=1}^{\Omega}p_lE_l\right)$$

is chosen so as to have $\alpha$ and $\beta$ as Lagrange multipliers enforcing normalization and the mean energy. The condition

$$\dfrac{\partial\Phi}{\partial p_l}=0$$

allows us to calculate $p_l$ in terms of $S$, and the generalized Boltzmann factor is obtained after the equipartition of energy is applied. From this Boltzmann factor, the entropy formula above reads off easily.

That entropy expands into

$$S=-k\sum_{l=1}^{\Omega}p_l\ln p_l-\dfrac{k}{2!}\sum_{l=1}^{\Omega}\left(p_l\ln p_l\right)^2-\dfrac{k}{3!}\sum_{l=1}^{\Omega}\left(p_l\ln p_l\right)^3-\cdots$$

This superstatistics-like entropy and the Shannon entropy differ in the range of a small number of microstates $\Omega$, i.e., of large probabilities $p_l$.
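A short numerical sketch (mine, assuming the entropic form $S=k\sum_l(1-p_l^{p_l})$ quoted above, with $k=1$) makes the last remark concrete: for equiprobable microstates the two entropies visibly disagree at small $\Omega$ and approach each other as $\Omega$ grows:

```python
import math

def shannon(ps):
    """Shannon entropy (k = 1): -sum p ln p."""
    return -sum(p * math.log(p) for p in ps if p > 0)

def obregon(ps):
    """Entropy depending only on the distribution (k = 1): sum (1 - p^p)."""
    return sum(1.0 - p ** p for p in ps if p > 0)

# equiprobable microstates: the two entropies converge as Omega grows
for omega in (2, 4, 16, 256):
    ps = [1.0 / omega] * omega
    print(omega, shannon(ps), obregon(ps))
```

For $\Omega=2$ the ratio of the two entropies is around $0.85$, while for $\Omega=256$ it is already close to $1$.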

Next, the paper considers the Kaniadakis entropy, defined by

$$S_\kappa=-k\sum_{l=1}^{\Omega}p_l\dfrac{p_l^{\kappa}-p_l^{-\kappa}}{2\kappa}$$

Inspired by this entropy function, the authors propose the generalized entropy

$$S=k\sum_{l=1}^{\Omega}\dfrac{p_l^{-p_l}-p_l^{p_l}}{2}$$

whose Taylor expansion for small $p_l\ln p_l$ reads

$$S=-k\sum_{l=1}^{\Omega}p_l\ln p_l-\dfrac{k}{3!}\sum_{l=1}^{\Omega}\left(p_l\ln p_l\right)^3-\dfrac{k}{5!}\sum_{l=1}^{\Omega}\left(p_l\ln p_l\right)^5-\cdots$$

The first term is again the Shannon entropy, like that of superstatistics, but the even terms are missing from the expansion: only the odd terms survive, in comparison with the previously studied generalized entropy. A similar procedure can be applied to the Sharma-Mittal-Taneja biparametric entropies
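The odd-terms-only claim can be checked numerically (my sketch, assuming the form $S=k\sum_l(p_l^{-p_l}-p_l^{p_l})/2=-k\sum_l\sinh(p_l\ln p_l)$ with $k=1$, on an arbitrary example distribution): since the quadratic term is absent, adding the cubic term alone to the Shannon term should nearly close the gap with the exact value:

```python
import math

def shannon(ps):
    """Shannon entropy (k = 1): -sum p ln p."""
    return -sum(p * math.log(p) for p in ps if p > 0)

def s_pm(ps):
    """Kaniadakis-inspired entropy (k = 1): sum (p^-p - p^p)/2 = -sum sinh(p ln p)."""
    return sum((p ** (-p) - p ** p) / 2.0 for p in ps if p > 0)

def cubic_term(ps):
    """Next odd term of the expansion: -(1/3!) sum (p ln p)^3."""
    return -sum((p * math.log(p)) ** 3 for p in ps if p > 0) / 6.0

ps = [0.5, 0.3, 0.2]                     # an arbitrary example distribution
exact = s_pm(ps)
series1 = shannon(ps)                    # truncation after the Shannon term
series3 = shannon(ps) + cubic_term(ps)   # ...plus the cubic (odd) term
print(exact, series1, series3)
```

The residual after including the cubic term is of fifth order in $p_l\ln p_l$, orders of magnitude smaller than the gap left by the Shannon term alone.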

where now the two entropic parameters are not just constants but functions depending on the probabilities $p_l$.

The final part of the paper discusses the axiomatic foundation of information theory (the Khinchin axioms) and how they are modified in the case of non-extensive entropies. The axiom that gets modified is the one related to the joint probability of independent systems. These issues of non-additivity and non-extensivity are very relevant in complex systems (even those having fractal or multifractal structures). I will discuss the Khinchin axioms and the informational ground of physics here on this blog in the future, but I will stop the discussion of this subject today.

See you in my third and last introductory superstatistics blog post!
