What is it about?
The notion of extensivity in thermodynamics means that, for a system with a very large number of particles, the entropy grows in proportion to the number of particles. Assuming this property and using basic principles of Statistical Physics, one can derive the famous Boltzmann formula for entropy. However, the Boltzmann entropy is obtained only if fluctuations in the physical system are negligible and particle interactions are short-ranged. When fluctuations in thermodynamic quantities persist, even for systems with many particles, generalized entropies such as the Tsallis entropy arise instead. This paper derives the Tsallis entropy (which depends on one parameter) from the extensivity principle for the case where fluctuations remain constant in the thermodynamic limit. It goes further and derives the generalized entropy proposed by Hanel and Thurner, which depends on two parameters and contains the Tsallis entropy and the stretched-exponential entropy as limiting cases. Finally, a novel form of generalized entropy is derived that applies when fluctuations depend on the particle number but tend rapidly to a constant value as the phase-space volume increases. This form can be useful in a generalized Statistical Mechanics of biological systems.
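For orientation, the standard textbook forms referenced above (not formulas taken from the paper itself) are the Boltzmann-Gibbs-Shannon entropy and the one-parameter Tsallis entropy, the latter recovering the former in the limit of its parameter:

```latex
S_{\mathrm{BG}} = -k_B \sum_i p_i \ln p_i,
\qquad
S_q = k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = S_{\mathrm{BG}} .
```

Here the $p_i$ are the probabilities of the microstates and $q$ is the single parameter mentioned in the summary; for $q \neq 1$ the Tsallis entropy is non-additive for independent systems, which is why it can remain extensive in the presence of the fluctuations or long-range correlations discussed above.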
Why is it important?
In recent decades, the classical Boltzmann-Gibbs-Shannon entropy has been generalized to incorporate, for example, long-range force effects in physical systems. One notable example is the Tsallis entropy, which is used, among other things, to describe gravitational systems. Several derivations of this entropy already exist. This paper provides a guided derivation of generalized entropies that yields even more general entropy forms. This is useful for Statistical Physics, but also for other disciplines where entropy measures are used, such as Machine Learning and Artificial Intelligence.
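As a concrete illustration of an entropy measure beyond Boltzmann-Gibbs-Shannon, the sketch below (not code from the paper) computes the standard Tsallis entropy of a discrete probability distribution, using the textbook form S_q = (1 − Σ p_i^q)/(q − 1) with the Boltzmann constant set to 1:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1.

    In the limit q -> 1 this reduces to the Shannon (Boltzmann-Gibbs)
    entropy -sum_i p_i * ln(p_i).
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability states contribute nothing
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Uniform distribution over 4 states
p = [0.25, 0.25, 0.25, 0.25]
print(tsallis_entropy(p, q=1.0))  # Shannon entropy: ln 4 ≈ 1.386
print(tsallis_entropy(p, q=2.0))  # (1 - 4 * 0.25**2) / 1 = 0.75
```

Such one-parameter entropy measures are what make generalized entropies attractive outside physics as well, e.g. as tunable uncertainty measures in machine-learning loss functions.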
Perspectives
This first-principles derivation of generalized entropies, which leads beyond the common entropy forms (Boltzmann-Gibbs, Tsallis, and related ones), is a breakthrough in Thermodynamics and Statistical Mechanics. Many natural phenomena are still modeled with traditional entropies such as Boltzmann-Gibbs, which is sufficient under many conditions; under other conditions, however, they require generalized entropy descriptions that go beyond our present understanding of Statistical Physics. This will give insights into features of such phenomena that are still unexplained. Generalized entropies are also useful for Machine Learning and AI.
Patrick Linker
Universitat Stuttgart
Read the Original
This page is a summary of: Generalized Entropies Obtained from Extensivity Principle, Proceedings of the Bulgarian Academy of Sciences, February 2026, Prof. Marin Drinov Academic Publishing House,
DOI: 10.7546/crabs.2026.02.01.