https://doi.org/10.1140/epjb/e2009-00168-5
Combinatorial entropies and statistics
School of Aerospace, Civil and Mechanical Engineering, The University of New South Wales at ADFA, Northcott Drive, Canberra, ACT, 2600, Australia
Corresponding author: r.niven@adfa.edu.au
Received: 21 November 2008
Revised: 4 March 2009
Published online: 16 May 2009
We examine the combinatorial or probabilistic definition (“Boltzmann's principle”) of the entropy or cross-entropy function H ∝ ln W or D ∝ −ln P, where W is the statistical weight and P the probability of a given realization of a system. Extremisation of H or D, subject to any constraints, thus selects the “most probable” (MaxProb) realization. If the system is multinomial, D converges asymptotically (for number of entities N → ∞) to the Kullback-Leibler cross-entropy D_KL; for equiprobable categories in a system, H converges to the Shannon entropy H_Sh. However, in many cases W or P is not multinomial and/or does not satisfy an asymptotic limit. Such systems cannot meaningfully be analysed with D_KL or H_Sh, but can be analysed directly by MaxProb. This study reviews several examples, including (a) non-asymptotic systems; (b) systems with indistinguishable entities (quantum statistics); (c) systems with indistinguishable categories; (d) systems represented by urn models, such as “neither independent nor identically distributed” (ninid) sampling; and (e) systems representable in graphical form, such as decision trees and networks. Boltzmann's combinatorial definition of entropy is shown to be of greater importance for “probabilistic inference” than the axiomatic definition used in information theory.
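To illustrate the asymptotic limit stated above, the following minimal Python sketch (not taken from the paper; the function names and the example distributions p and q are illustrative choices) checks numerically that −(1/N) ln P of a multinomial realization approaches the Kullback-Leibler cross-entropy D_KL = Σᵢ pᵢ ln(pᵢ/qᵢ) as N → ∞:

# Numerical check (illustrative sketch, assuming a multinomial system):
# -(1/N) ln P should converge to D_KL as the number of entities N grows.
from math import lgamma, log

def neg_ln_P_per_N(counts, q):
    # P = [N! / prod(n_i!)] * prod(q_i ** n_i); lgamma avoids factorial overflow.
    N = sum(counts)
    ln_W = lgamma(N + 1) - sum(lgamma(n + 1) for n in counts)  # ln of statistical weight W
    ln_P = ln_W + sum(n * log(qi) for n, qi in zip(counts, q))
    return -ln_P / N

def D_KL(p, q):
    # Kullback-Leibler cross-entropy of observed frequencies p relative to priors q.
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]   # observed category frequencies (hypothetical example)
q = [1/3, 1/3, 1/3]   # equiprobable prior probabilities
for N in (10, 100, 1000, 100000):
    counts = [round(pi * N) for pi in p]  # a realization with frequencies p
    print(N, round(neg_ln_P_per_N(counts, q), 6), "vs", round(D_KL(p, q), 6))

The agreement improves as N grows because Stirling's approximation to the factorials in the multinomial weight W becomes exact in that limit; for non-multinomial weights, or systems with no such limit, this convergence is precisely what fails, motivating the direct MaxProb analysis reviewed in the paper.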
PACS: 02.50.Cw – Probability theory / 02.50.Tt – Inference methods / 05.20.-y – Classical statistical mechanics / 05.90.+m – Other topics in statistical physics, thermodynamics, and nonlinear dynamical systems / 89.20.-a – Interdisciplinary applications of physics
© EDP Sciences, Società Italiana di Fisica, Springer-Verlag, 2009