Groups, entropies and Einstein's likelihood functions
Gabriele Sicuro, in collaboration with P. Tempesta (UCM and ICMAT)
Centro Brasileiro de Pesquisas Físicas, Rio de Janeiro, Brazil
22nd of June, 2016

Outline
1 Groups, composability, entropy
  The road of Shannon and Khinchin
  Composability and formal groups
2 Einstein's principle and composable entropies
  Einstein's likelihood principle
  Likelihood function for composable entropies

Section 1 Groups, composability, entropy

The axioms of Claude Shannon and Aleksandr J. Khinchin
Aleksandr J. Khinchin, Mathematical foundations of information theory (1957)

In the context of information theory, Shannon and Khinchin introduced a set of axioms for an entropy $S[p] \equiv S(p_1, \dots, p_W)$ to be a measure of information:

Shannon and Khinchin's axioms
1 The function $S[p]$ is continuous with respect to all its arguments.
2 The function $S[p]$ is maximized by the uniform distribution $p_i = W^{-1}$ for all $i$.
3 Adding an impossible event to a probability distribution does not change the value of its entropy (expansibility): $S(p_1, \dots, p_W, 0) = S(p_1, \dots, p_W)$.
4 Given two subsystems $A$ and $B$ of a statistical system, $S(A \cup B) = S(A) + S(B|A) = S(B) + S(A|B)$. If the subsystems are statistically independent, the additivity property holds: $S(A \cup B) = S(A) + S(B)$.
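The fourth axiom lends itself to a direct numerical check. Below is a minimal Python sketch (with illustrative distributions and variable names) verifying the chain rule $S(A \cup B) = S(A) + S(B|A)$ for the Shannon entropy on a randomly generated joint distribution:

    import numpy as np

    rng = np.random.default_rng(1)

    def shannon(p):
        """Shannon entropy S[p] = -sum_i p_i ln p_i (with k = 1)."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    # a generic (not necessarily independent) joint distribution over A x B
    pAB = rng.random((4, 6)); pAB /= pAB.sum()
    pA = pAB.sum(axis=1)                     # marginal distribution of A
    # conditional entropy S(B|A) = sum_i p_A(i) * S(B | A = i)
    S_B_given_A = sum(pA[i] * shannon(pAB[i] / pA[i]) for i in range(len(pA)))
    print(np.isclose(shannon(pAB.ravel()), shannon(pA) + S_B_given_A))  # True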

Is there life beyond the SK axioms?
It is well known that the SK axioms allow one entropic form only, namely the Boltzmann-Gibbs-Shannon entropy
$S_{BG}[p] = -k \sum_{i=1}^{W} p_i \ln p_i$, $k > 0$.
However, other entropies have been successfully introduced for the study of complex systems, in particular to obtain an extensive quantity when the dimension of the accessible phase space does not scale exponentially with the size of the system. The first three axioms seem indispensable for a meaningful entropy, so it is the additivity axiom we should reconsider. How can we generalize the 4th SK axiom?

The composability axiom
Tempesta, Annals of Physics 365, pp. 180-197 (2016)

When composing two statistically independent subsystems $A$ and $B$, we want the entropy of the composed system to depend only on the entropies $S(A)$ and $S(B)$, and possibly on a set of parameters $\eta$:
$S(A \cup B) = S(B \cup A) = \Phi_\eta[S(A), S(B)]$.
We also require that
$S((A \cup B) \cup C) = S(A \cup (B \cup C))$.
Finally, if $S(B) = 0$, then $S(A \cup B) = S(A)$.

The composability axiom
Tempesta, Annals of Physics 365, pp. 180-197 (2016)

We have therefore replaced the 4th SK axiom with the composability axiom.

The composability axiom
An entropy $S$ is composable if there is a continuous function $\Phi_\eta$ such that, for two statistically independent subsystems $A$ and $B$ of a system $\Omega$,
$S(A \cup B) = \Phi_\eta[S(A), S(B)]$. (1)
The function $\Phi_\eta$ must satisfy the following properties:
Symmetry: $\Phi_\eta[x, y] = \Phi_\eta[y, x]$, (2)
Associativity: $\Phi_\eta[x, \Phi_\eta[y, z]] = \Phi_\eta[\Phi_\eta[x, y], z]$, (3)
Null-composability: $\Phi_\eta[x, 0] = x$. (4)
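As an illustration, the one-parameter law $\Phi[x, y] = x + y + a\,xy$ (with $a = 1 - q$ this is the Tsallis law appearing later in the talk) satisfies all three requirements. A minimal numerical sketch checking them on a grid of points:

    import itertools
    import numpy as np

    a = 0.3  # sample parameter; a = 1 - q for the Tsallis law discussed later

    def phi(x, y):
        """Candidate composition law Phi[x, y] = x + y + a*x*y."""
        return x + y + a * x * y

    pts = np.linspace(0.0, 2.0, 7)
    for x, y, z in itertools.product(pts, repeat=3):
        assert np.isclose(phi(x, y), phi(y, x))                  # symmetry, Eq. (2)
        assert np.isclose(phi(x, phi(y, z)), phi(phi(x, y), z))  # associativity, Eq. (3)
        assert np.isclose(phi(x, 0.0), x)                        # null-composability, Eq. (4)
    print("symmetry, associativity and null-composability hold on the grid")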

A useful paper?
Bochner, Annals of Mathematics 47, pp. 192-201 (1946)

Formal groups
Bochner, Annals of Mathematics 47, pp. 192-201 (1946)

The structure required by the composability axiom suggests that the theory of formal groups can be fruitfully adopted to discuss composable entropies.

Commutative one-dimensional formal group
Let $R$ be a ring with identity and $R[[x, y]]$ the ring of formal power series in the variables $x, y$ with coefficients in $R$. A commutative one-dimensional formal group is a symmetric formal power series
$\Phi[x, y] = x + y + \text{higher-degree terms} \in R[[x, y]]$
such that
$\Phi[x, 0] = \Phi[0, x] = x$,
$\Phi[x, \Phi[y, z]] = \Phi[\Phi[x, y], z]$.

Lazard formal groups
Hazewinkel, Formal Groups and Applications (1978)

Lazard proved that any one-dimensional commutative formal group law can be mapped into a Lazard formal group law
$\Phi[x, y] = G(F(x) + F(y))$.
Here we have introduced the formal group exponential
$F(t) := \sum_{k=0}^{\infty} f_k \frac{t^{k+1}}{k+1}$, $f_0 = 1$,
over the ring $\mathbb{Z}[b_1, b_2, \dots]$ of integral polynomials in infinitely many variables, and the formal group logarithm $G$, defined by $G(F(t)) = t$. This means that the composition law of our entropy must be sought among the Lazard group laws (in particular, those having a series $G$ convergent on the real line).
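The Lazard form makes the group properties automatic: since $F(x) + F(y)$ is symmetric and associative, so is $\Phi[x, y] = G(F(x) + F(y))$ whenever $G(F(t)) = t$ and $F(0) = 0$. A minimal numerical sketch, using the sample pair $F(t) = \ln(1 + a t)/a$, $G(t) = (e^{a t} - 1)/a$ (this is the Tsallis pair of the next slides, with $a = 1 - q$):

    import numpy as np

    a = 0.3  # sample parameter; a = 1 - q in the Tsallis case

    F = lambda t: np.log1p(a * t) / a    # F(t) = ln(1 + a*t)/a
    G = lambda t: np.expm1(a * t) / a    # G(t) = (exp(a*t) - 1)/a, so G(F(t)) = t

    def phi(x, y):
        """Lazard-form group law Phi[x, y] = G(F(x) + F(y))."""
        return G(F(x) + F(y))

    x, y, z = 0.4, 1.1, 2.5
    print(np.isclose(G(F(x)), x))                            # G and F are mutual inverses
    print(np.isclose(phi(x, y), phi(y, x)))                  # symmetry
    print(np.isclose(phi(x, phi(y, z)), phi(phi(x, y), z)))  # associativity
    print(np.isclose(phi(x, 0.0), x))                        # null-composability (F(0) = 0)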

Trace-form entropies
Tempesta, Proceedings of the Royal Society A 471:20150165 (2015)

Assuming a trace-form structure for our entropy, i.e.,
$S[p] = \sum_{i=1}^{W} p_i\, s(p_i)$,
Tempesta proved the following strong statement.

Theorem. The only composable trace-form entropy is the Tsallis entropy
$S_q^T[p] := \sum_{i=1}^{W} p_i \frac{1 - p_i^{q-1}}{q - 1} \equiv \sum_{i=1}^{W} p_i \ln_q \frac{1}{p_i}$, $q > 0$.

The Boltzmann-Gibbs entropy is recovered as a particular case in the $q \to 1$ limit. The corresponding composition law and group logarithm are given by
$\Phi[x, y] = x + y + (1 - q)xy$, $G(t) = \frac{e^{(1-q)t} - 1}{1 - q} = \ln_q e^t$.
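The theorem's composition law can be checked numerically: for two statistically independent subsystems, the Tsallis entropy of the joint (product) distribution equals $\Phi[S(A), S(B)] = S(A) + S(B) + (1-q)S(A)S(B)$. A minimal Python sketch ($k = 1$, illustrative distributions):

    import numpy as np

    rng = np.random.default_rng(0)
    q = 0.7  # any q > 0, q != 1

    def tsallis(p):
        """Tsallis entropy S_q^T[p] = (1 - sum_i p_i^q)/(q - 1), k = 1."""
        return (1.0 - np.sum(p**q)) / (q - 1.0)

    pA = rng.random(5); pA /= pA.sum()
    pB = rng.random(7); pB /= pB.sum()
    pAB = np.outer(pA, pB).ravel()       # joint distribution of independent A, B

    SA, SB = tsallis(pA), tsallis(pB)
    print(np.isclose(tsallis(pAB), SA + SB + (1 - q) * SA * SB))  # True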

Trace-form entropies: what about the others?
Tempesta, Proceedings of the Royal Society A 471:20150165 (2015)

Many trace-form entropies have been proposed in the literature. The fact that they are not composable does not imply that they are irrelevant: many of them fall in the large class of weakly composable entropies.

Weakly composable entropies
Entropies of the form
$S[p] := \sum_{i=1}^{W} p_i\, G\!\left(\ln \frac{1}{p_i}\right)$
are weakly composable if the group properties hold for uniform distributions only. For this class of entropies, $\Phi[x, y]$ is constructed directly from $G$. For example, the Kaniadakis entropy
$S_\kappa^K[p] := \sum_{i=1}^{W} p_i \frac{p_i^{-\kappa} - p_i^{\kappa}}{2\kappa}$, $\kappa \in \mathbb{R}$,
is weakly composable but not composable.
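Weak composability can be seen numerically. For the Kaniadakis entropy the generating function read off from the definition above is $G(t) = \sinh(\kappa t)/\kappa$, and, assuming the law built from $G$ is $\Phi[x, y] = G(G^{-1}(x) + G^{-1}(y))$, the sketch below shows that the law holds for uniform subsystems but fails for generic independent ones:

    import numpy as np

    rng = np.random.default_rng(5)
    kappa = 0.4

    G = lambda t: np.sinh(kappa * t) / kappa       # G(ln(1/p)) = (p^-k - p^k)/(2k)
    Ginv = lambda s: np.arcsinh(kappa * s) / kappa
    phi = lambda x, y: G(Ginv(x) + Ginv(y))        # law built directly from G

    def kaniadakis(p):
        return np.sum(p * (p**(-kappa) - p**kappa)) / (2 * kappa)

    # uniform subsystems: the composition law holds exactly...
    uA, uB = np.full(4, 0.25), np.full(5, 0.2)
    uAB = np.outer(uA, uB).ravel()
    print(np.isclose(kaniadakis(uAB), phi(kaniadakis(uA), kaniadakis(uB))))  # True

    # ...but for generic independent subsystems it fails: weak composability only
    pA = rng.random(4); pA /= pA.sum()
    pB = rng.random(5); pB /= pB.sum()
    pAB = np.outer(pA, pB).ravel()
    print(np.isclose(kaniadakis(pAB), phi(kaniadakis(pA), kaniadakis(pB))))  # False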

Non-trace-form entropies
Tempesta, arXiv:1507.07436

Non-trace-form entropies are, however, not excluded by this picture. For example, the simplest group, the additive group $\Phi[x, y] = x + y$, allows only one trace-form entropy (the Boltzmann-Gibbs entropy), but it is easily seen that the Rényi entropy
$S_q^R[p] := \frac{1}{1 - q} \ln\left(\sum_{i=1}^{W} p_i^q\right)$
is composable as well. An entire family of composable non-trace-form entropies (the so-called Z-entropies) has been introduced by Tempesta; the Rényi entropy is the best-known member of this family.
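Since the Rényi composition law is simply $\Phi[x, y] = x + y$, composability here means plain additivity over independent subsystems, which is easy to verify numerically (minimal sketch, illustrative distributions):

    import numpy as np

    rng = np.random.default_rng(4)
    q = 2.5

    def renyi(p):
        """Renyi entropy S_q^R[p] = ln(sum_i p_i^q)/(1 - q)."""
        return np.log(np.sum(p**q)) / (1.0 - q)

    pA = rng.random(3); pA /= pA.sum()
    pB = rng.random(8); pB /= pB.sum()
    pAB = np.outer(pA, pB).ravel()
    print(np.isclose(renyi(pAB), renyi(pA) + renyi(pB)))  # additive: Phi[x, y] = x + y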

Information
Sicuro, Tempesta, Physical Review E 93, 040101(R) (2016)

An information measure can be associated to each composable entropy having a composition law determined by a group exponential $G \equiv F^{-1}$.

Generalized information measure
Let $S$ be a composable entropy with group law generated by $F$. For a system $A$, the information measure is given by
$I(A) := F(S(A))$.
The information measure $I$ satisfies the following properties:
continuity;
maximum principle, i.e., it is maximized by the uniform distribution;
expansibility;
for two statistically independent systems, $I(A \cup B) = I(A) + I(B)$.

Information: examples
Sicuro, Tempesta, Physical Review E 93, 040101(R) (2016)

Entropy $S$ | $F(t)$ | Information measure $I = F(S)$
Boltzmann-Gibbs entropy $S_{BG}$ | $F(t) = t$ | Shannon information $S_{BG}$
Tsallis entropy $S_q^T$ | $F(t) = \frac{\ln(1 + (1-q)t)}{1-q}$ | Rényi entropy $S_q^R$ [Haubold, Tsallis, EPL 110 (2015) 30005]
Rényi entropy $S_q^R$ | $F(t) = t$ | Rényi entropy $S_q^R$

In the $q \to 1$ limit, both the Tsallis and the Rényi rows reduce to the Boltzmann-Gibbs/Shannon case.
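The second row of the table can be checked directly: applying $F(t) = \ln(1 + (1-q)t)/(1-q)$ to the Tsallis entropy yields the Rényi entropy, since $1 + (1-q)S_q^T[p] = \sum_i p_i^q$. A minimal numerical sketch:

    import numpy as np

    rng = np.random.default_rng(2)
    q = 1.8

    p = rng.random(6); p /= p.sum()

    S_T = (1.0 - np.sum(p**q)) / (q - 1.0)       # Tsallis entropy
    S_R = np.log(np.sum(p**q)) / (1.0 - q)       # Renyi entropy
    I = np.log1p((1.0 - q) * S_T) / (1.0 - q)    # information measure I = F(S_T)
    print(np.isclose(I, S_R))                    # True: F maps Tsallis to Renyi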

Section 2 Einstein's principle and composable entropies

One step backward: Einstein's likelihood principle
Einstein, Annalen der Physik 33, pp. 1105-1115 (1910)

In order to be able to calculate W, one needs a complete theory of the system under consideration. Given this kind of approach, it therefore seems questionable whether Boltzmann's principle by itself has any meaning whatsoever. [...] That which appears to be completely law-governed in irreversible processes is to be attributed to the fact that the probabilities of the individual states Z are of different orders of magnitude, so that a given state Z will practically always be followed by one state [...] because of this one state's enormous probability [...]. It is this probability [...] which is related to the entropy in the way expressed by
$S = \frac{R}{N} \ln W + \text{constant}$.
This equation is valid to an order of magnitude if each state Z is assigned a small region of the order of magnitude of perceptible regions.
A. Einstein, 1910

One step backward: Einstein's likelihood principle
Einstein, Annalen der Physik 33, pp. 1105-1115 (1910)

On the basis of his analysis, Einstein introduced the likelihood function for a system $A$,
$w(A) \propto e^{S(A)}$,
requiring that, in the case of statistical independence,
$w(A \cup B) = w(A)\, w(B)$.
In the case of the Boltzmann-Gibbs entropy, the validity of the relation above is guaranteed. Can we reformulate Einstein's likelihood principle for general composable entropies?

Composability is compatible with Einstein's principle
Sicuro, Tempesta, Physical Review E 93, 040101(R) (2016)

The answer is yes, with the following recipe.

Generalized likelihood function
Let $S$ be a composable entropy with group law generated by $F$. For a system $A$, we introduce the generalized likelihood function
$w(A) := e^{I(A)} = e_F^{S(A)}$, where $e_F^x := e^{F(x)}$.
For two statistically independent systems,
$w(A \cup B) = w(A)\, w(B)$,
as prescribed by Einstein.
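For the Tsallis case, the recipe gives $w(A) = e^{S_q^R(A)}$, and the factorization follows from the additivity of $I$. A minimal numerical sketch (illustrative distributions, $k = 1$):

    import numpy as np

    rng = np.random.default_rng(3)
    q = 0.7
    F = lambda t: np.log1p((1 - q) * t) / (1 - q)   # F for the Tsallis case (see table)

    def tsallis(p):
        return (1.0 - np.sum(p**q)) / (q - 1.0)

    w = lambda p: np.exp(F(tsallis(p)))             # generalized likelihood e_F^{S}

    pA = rng.random(4); pA /= pA.sum()
    pB = rng.random(5); pB /= pB.sum()
    pAB = np.outer(pA, pB).ravel()
    print(np.isclose(w(pAB), w(pA) * w(pB)))        # True: w factorizes, as Einstein required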

Conclusions
The composability axiom, in addition to the first three Shannon-Khinchin axioms, allows a classification of entropies.
Infinitely many entropies are allowed if no additional requirements are imposed.
Boltzmann-Gibbs entropy and Tsallis entropy are the only trace-form composable entropies.
To each composable entropy we can always associate an additive information measure.
The composability axiom is compatible with Einstein's likelihood principle.

Thank you for your attention!