Distance between physical theories based on information theory


Jacques Calmet¹ and Xavier Calmet²

¹ Institute for Cryptography and Security (IKS), Karlsruhe Institute of Technology (KIT), 76131 Karlsruhe, Germany
² Physics and Astronomy, University of Sussex, Falmer, Brighton, BN1 9QH, UK

Abstract

We introduce a concept of distance between physical theories described by an action. The definition of the distance is based on the relative entropy.

¹ email: calmet@ira.uka.de
² email: x.calmet@sussex.ac.uk

1 Introduction

Our understanding of nature is based on models which provide an approximation of physical reality. In many areas of science, for example physics or some areas of chemistry, models are often defined in terms of an action from which the equations of motion of the model follow via the least action principle. A model is uniquely defined through its action, which is a functional of the fields introduced in the model; it also depends on possible coupling constants and mass parameters. Equivalently, the model can be given in terms of a Hamiltonian, which can be derived from the action using a Legendre transformation. Building a Hamiltonian, or equivalently an action, requires one to specify the symmetries (e.g. space-time or gauge symmetries) of the theory as well as the values of its parameters: the number of fields, the spin-statistics of these fields, and the values of the mass and coupling parameters at some energy scale. One also needs to impose boundary conditions on the fields in order to obtain the field equations of the theory and to fix the vacuum around which one develops the theory. Once this has been done, however, the theory is usually uniquely determined. Observables can then be computed within a certain approximation and compared to experiments. This allows one to compare a theory to experiments and thus to determine whether the theory under consideration gives an adequate description of nature, within the precision of the calculations done using the theory and of the experiments performed. One could, however, consider a different question and ask how different two theories are. In other words, we are interested in introducing the notion of a distance between two models described by an action. In the present work we also extend our considerations to quantum field theories.
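As an illustration of the Legendre transformation mentioned above, for a single non-relativistic particle the textbook passage from Lagrangian to Hamiltonian reads:

```latex
L(x,\dot{x}) = \tfrac{1}{2} m \dot{x}^2 - V(x), \qquad
p = \frac{\partial L}{\partial \dot{x}} = m\dot{x}, \qquad
H(x,p) = p\,\dot{x} - L = \frac{p^2}{2m} + V(x).
```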
One option would be to compute all possible observables and perform a chi-square fit, but this would be extremely cumbersome and, in most practical cases, an impossible task. Furthermore, one might try to compare theories that do not even have the same number of observables, in which case the method just described would fail. We propose instead to introduce a notion of distance between Hamiltonians, and thus physical theories, based on the Kullback-Leibler relative entropy, which is frequently used in information theory as a concept of distance. However, this is not a distance in the usual sense. A distance d(P_1, P_2) between two points P_1 and P_2 has to satisfy the following three axioms:

1. Positive definiteness: for all P_1, P_2: d(P_1, P_2) ≥ 0.
2. Symmetry: d(P_1, P_2) = d(P_2, P_1).
3. Triangle inequality: for all P_1, P_2, P_3: d(P_1, P_2) ≤ d(P_1, P_3) + d(P_2, P_3).
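For the Kullback-Leibler relative entropy defined below (eq. (2)), positive definiteness holds but symmetry fails, which is why it is not a metric. A minimal sketch with two hypothetical coin distributions:

```python
import math

def kl(g, p):
    """Kullback-Leibler relative entropy D(g || p) for discrete distributions."""
    return sum(gi * math.log(gi / pi) for gi, pi in zip(g, p) if gi > 0)

g = [0.5, 0.5]   # fair coin
p = [0.9, 0.1]   # biased coin

# Positive definiteness holds ...
assert kl(g, p) > 0 and kl(g, g) == 0
# ... but symmetry fails:
print(kl(g, p))   # ≈ 0.511
print(kl(p, g))   # ≈ 0.368
```

The asymmetry alone already rules out the usual metric structure; the triangle inequality fails as well.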

However, it is often useful to introduce a concept of distance between elements of more abstract sets in any field of knowledge, as very well illustrated by the recently published encyclopedia [1]. An even more recent attempt is to define a distance between dialects in linguistics. In a domain closer to our interests, one could ask, for example, what the distance is between two distributions, e.g. between the Gaussian and binomial distributions. It is useful to introduce the concept of entropy as a means to define such distances. In information theory, the Shannon entropy [2] represents the information content of a message or, from the receiver's point of view, the uncertainty about the message the sender produced prior to its reception. It is defined as

S = −Σ_i p(i) log p(i),   (1)

where p(i) is the probability of receiving the message i. The unit used is the bit. The relative entropy can be used to define a distance between two distributions p(i) and g(i). The Kullback-Leibler [3] distance, or relative entropy, is defined as

D(g‖p) = Σ_i g(i) log [g(i)/p(i)],   (2)

where p(i) is the real distribution and g(i) is an assumed distribution. Clearly the Kullback-Leibler relative entropy is not a distance in the usual sense: it satisfies the positive definiteness axiom, but not the symmetry or the triangle inequality axioms. It is nevertheless useful to think of the relative entropy as a distance between distributions. The Kullback-Leibler distance applies to discrete sets; it can be generalized to the case of continuous sets. For our purposes, a probability distribution over some field (or set) X is a distribution p : X → R such that

1. ∫_X d⁴x p(x) = 1,
2. for any finite subset S ⊂ X, ∫_S d⁴x p(x) > 0.

Let us now apply this definition to models described by an action or a Hamiltonian. For a given Hamiltonian there exists a density matrix ρ defined by

ρ = Σ_n w_n |φ_n⟩⟨φ_n|,   (3)

where φ_n, n ∈ {1...N}, are the states of the system and w_n is the probability of finding the system in the state φ_n.
The von Neumann entropy is then given by

S = −Tr(ρ log ρ).   (4)
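Eq. (4) is easily evaluated from the eigenvalues of ρ; a minimal numpy sketch (the two single-qubit states are standard illustrations, not from the text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log rho), computed from the eigenvalues of rho (eq. (4))."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]          # drop zero eigenvalues: 0 log 0 -> 0
    return float(-np.sum(w * np.log(w)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: zero entropy
mixed = np.eye(2) / 2                        # maximally mixed qubit

print(von_neumann_entropy(pure))    # ≈ 0
print(von_neumann_entropy(mixed))   # ≈ 0.693 = log 2
```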

Given two Hamiltonians H_1 and H_2 and the corresponding density matrices ρ_1 and ρ_2, we can introduce the relative entropy of ρ_1 with respect to ρ_2:

D(ρ_1‖ρ_2) = Tr(ρ_1 (log ρ_1 − log ρ_2)).   (5)

This is clearly not a distance as understood in metric spaces, but it is nevertheless useful to think of the relative entropy as a distance between density matrices, and thus between Hamiltonians. Let us illustrate how to use this new concept through a concrete example from statistical mechanics. The density matrix can also be defined in terms of the partition function Z. In statistical mechanics the partition function is defined as

Z = Tr exp(−H/kT),   (6)

where k = 8.6 × 10⁻⁵ eV K⁻¹ is the Boltzmann constant and T the temperature. The density matrix is then given by

ρ(T) = (1/Z) exp(−H/kT).   (7)

The Kullback-Leibler relative entropy can be expressed in terms of the partition functions. In that case eq. (5) reads

D(ρ_1‖ρ_2) = β_1 (d/dβ_1) ln Z_1 − ln Z_1 + ln Z_2 + β_2 Tr(ρ_1 H_2),   (8)

where β_i = (kT_i)⁻¹. We interpret this as a distance between the Hamiltonian H_1 corresponding to ρ_1 and the Hamiltonian H_2 corresponding to ρ_2. For models with no overlap, i.e. Tr(ρ_1 H_2) = Tr(H_2), the distance is just the entropy (up to a factor k) of the model determined by the Hamiltonian H_1. If H_1 = H_2 and the temperatures are equal, the distance vanishes. We shall calculate the relative distance between a one-dimensional harmonic oscillator and a one-dimensional free particle and compare it to the distance between a one-dimensional anharmonic oscillator and a one-dimensional free particle. All three theories are assumed to be in contact with a thermal bath of temperature T, and we shall assume that the overlap between the models is small and negligible. The partition function of a one-dimensional free particle in a system of length L is given by [4]

Z_free = L √(mkT/(2πℏ²)).   (9)
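Eqs. (5)-(8) can be checked numerically for small matrix Hamiltonians. In the sketch below the two 3-level Hamiltonians, the inverse temperatures and the finite-difference step are hypothetical choices (units with ℏ = k = 1), and the signs in eq. (8) are taken as they follow from eq. (5):

```python
import numpy as np

def thermal_state(H, beta):
    """rho = exp(-beta H) / Z with Z = Tr exp(-beta H), via the eigenbasis of H (eqs. (6)-(7))."""
    E, U = np.linalg.eigh(H)
    w = np.exp(-beta * E)
    w /= w.sum()                      # normalization by the partition function
    return (U * w) @ U.conj().T

def log_density(rho):
    """Matrix logarithm of a positive-definite density matrix."""
    w, V = np.linalg.eigh(rho)
    return (V * np.log(w)) @ V.conj().T

def relative_entropy(rho1, rho2):
    """D(rho1 || rho2) = Tr(rho1 (log rho1 - log rho2)), eq. (5)."""
    return float(np.trace(rho1 @ (log_density(rho1) - log_density(rho2))).real)

def ln_Z(H, beta):
    """ln Tr exp(-beta H), from the eigenvalues of H."""
    return float(np.log(np.sum(np.exp(-beta * np.linalg.eigvalsh(H)))))

# Two hypothetical 3-level Hamiltonians
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
H1 = (A + A.T) / 2
H2 = H1 + 0.5 * np.diag([0.0, 1.0, 2.0])
beta1 = beta2 = 1.0

rho1 = thermal_state(H1, beta1)
rho2 = thermal_state(H2, beta2)
d12 = relative_entropy(rho1, rho2)

# Eq. (8): D = beta1 d(ln Z1)/dbeta1 - ln Z1 + ln Z2 + beta2 Tr(rho1 H2),
# with the beta1-derivative approximated by a central finite difference.
eps = 1e-6
dlnZ1 = (ln_Z(H1, beta1 + eps) - ln_Z(H1, beta1 - eps)) / (2 * eps)
d12_from_Z = (beta1 * dlnZ1 - ln_Z(H1, beta1) + ln_Z(H2, beta2)
              + beta2 * float(np.trace(rho1 @ H2).real))

print(d12, d12_from_Z)   # the two expressions agree
```

Identical Hamiltonians at equal temperature give zero distance, and D(ρ_1‖ρ_2) ≠ D(ρ_2‖ρ_1) in general, as in the classical case.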

The magnitude of L is of no particular importance, as we shall use Z_free as a comparator for both the harmonic and the anharmonic oscillator. The partition function of a one-dimensional harmonic oscillator is given by [4]

Z_HO = 1/(2 sinh(ℏω/2kT)),   (10)

where ω is the frequency of the oscillator. Finally, for an anharmonic oscillator (H = H_0 + λx⁴), the partition function is given by [5]

Z_AHO ≈ Z_HO (1 − (3λℏ/(4m²ω³)) (ℏω/kT) coth²(ℏω/kT)).   (11)

The distance between the Hamiltonian of the one-dimensional harmonic oscillator and that of the one-dimensional free particle can now easily be computed. One finds:

d(H_HO‖H_free) = D(ρ_HO‖ρ_free)   (12)
               = (ℏωβ/2) coth(ℏωβ/2) − log((1/2) csch(ℏωβ/2)).

In order to get numbers, we set all parameters of this distance to unity in their respective units. We find d(H_HO‖H_free) = 1.123. The distance between the Hamiltonian of the one-dimensional anharmonic oscillator and that of the one-dimensional free particle is then given by

d(H_AHO‖H_free) = (ℏωβ/2) coth(ℏωβ/2) − log((1/2) csch(ℏωβ/2))   (13)
                + (3λℏ²/(2m²ω²)) [β coth²(ℏωβ) − ℏωβ² coth(ℏωβ) csch²(ℏωβ)] + O(λ²).

Setting again all parameters with the exception of λ to unity, we find d(H_AHO‖H_free) = 1.12446 for λ = 0.001. It is easy to see that one obtains d(H_HO‖H_free) = d(H_AHO‖H_free) in the limit λ → 0. Our result matches the intuition that the anharmonic oscillator is farther from the one-dimensional free-particle model than the harmonic oscillator is. The same definition of a distance can be applied to any physical theory which can be defined in terms of an action. In particular, it is interesting to apply this notion to quantum field theories. Once an action I[fields], which is a functional of all the fields introduced in the model as well as a function of the coupling constants, is known, one can introduce the partition function Z:

Z = ∫ D[fields] exp(−β I[fields, renormalized parameters]),   (14)

where β is the inverse temperature, which sets the period of the imaginary time. Generally speaking, it is not always possible to calculate the partition function exactly for a quantum field theory, but this can often be done in perturbation theory. To the best of our knowledge, the concept of a distance between physical theories described by an action has never been proposed. It fits well, however, into the present trend to quantify any sort of ontological knowledge [6]. Another potential application is the landscape scenario in string theory, where the notion of a distance between theories could be important [7] to probe the separation of a given vacuum from that of the standard model of particle physics. Finally, note that we have considered here the case of the Kullback-Leibler entropy. We can easily extend our considerations to continuous sets of models. In other words, we can introduce Fisher's metric over a continuous space of physical models. This allows us to define a local metric on theory spaces with a physical distance ds² = g_ij dx^i dx^j, where the x^i are local coordinates. Furthermore, it has been shown that Fisher's metric can be derived from an action [8]. It would be interesting to see if imposing symmetries at the level of the action can lead to interesting relations between physical models.

References

[1] M. M. Deza and E. Deza, Encyclopedia of Distances, Springer, Heidelberg, 2009.
[2] C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal 27:379-423, 623-656, July and October 1948.
[3] S. Kullback, Information Theory and Statistics, John Wiley, New York, 1959.
[4] R. P. Feynman, Statistical Mechanics, Advanced Book Classics, ISBN 0-201-36076-4.
[5] G. Parisi, Statistical Field Theory, Addison Wesley, 1988.
[6] J. Calmet, An Introduction to the Quantification of Culture, Proc. of Advances in Multiagent Systems, Robotics and Cybernetics: Theory and Practice, Volume III, G. Lasker and J. Pfalzgraf eds., InterSym-2009, Baden-Baden, forthcoming, 2010.
[7] M. R. Douglas, Spaces of Quantum Field Theories, arXiv:1005.2779, 2010.
[8] X. Calmet and J. Calmet, Phys. Rev. E 71, 056109, 2005.