On Gibbs-Shannon Entropy

On Gibbs-Shannon Entropy
Raphael Chetrite, CNRS, J. A. Dieudonné, Nice.
Graduate School of Science, Kyoto. LARGE DEVIATION THEORY IN STATISTICAL PHYSICS: RECENT ADVANCES AND FUTURE CHALLENGES.
M. Gavrilov, R.C., J. Bechhoefer: Direct measurement of weakly nonequilibrium system entropy is consistent with Gibbs-Shannon. To appear in PNAS 2017.

Apology: the only large deviation in this talk is the speaker's use of slides instead of a blackboard. Thanks to Arvind, Abhishek, Juan, Frank, Christopher, Manjunath, Tony, Sanjib and Hugo for the invitation.

Today: a work with Momcilo Gavrilov and John Bechhoefer.
M. Gavrilov, R.C., J. Bechhoefer: Direct measurement of weakly nonequilibrium system entropy is consistent with Gibbs-Shannon. To appear in PNAS 2017.

The second law: at the root of it all?
A. Einstein: "The second law of thermodynamics is the only physical theory... of which I am convinced that... it will never be overthrown."
Pope Pius XII (Pontifical Academy of Sciences, 1952): "The Second Law of thermodynamics... provides a demonstrative testimony to the existence of a higher being."
? versus ?
Parody in The Onion, 09/06/2000: "Conservative Christians protest the second law of thermodynamics on the steps of the Kansas Capitol."

Main question of this talk
MQ: what is the physical content of the Gibbs-Shannon entropy? This is not a tautological question.
Joel Lebowitz, private note 2017: "I have been worrying about this since I was a graduate student and still do not have a good answer."
J. Bricmont, Science of Chaos or Chaos in Science?, 1996: "But why should we use this Gibbs entropy out of equilibrium? In equilibrium, it agrees with Boltzmann and everything is fine."
Or again, during our article's refereeing process,
Referee 1: "The physical meaning of the Shannon entropy, away from equilibrium, has been well established (implicitly: for a long time)..."
? versus ?
Referee 3: "But I am concerned that the generality of the result is overstated..."

Main message and main result. Using an erasure experiment performed with a feedback trap:
1. The Gibbs-Shannon entropy is physically consistent, even outside equilibrium.
2. Direct experimental measurement of the functional form $S_{\rm bit}(p) = -p \ln p - (1-p) \ln(1-p)$.

Gibbs-Shannon entropy: the name of what? (The talk is restricted to the classical case.)
Definition: $S_{GS}[\rho] = -\langle \rho \ln \rho \rangle$.
It is proportional to the amount of information needed to describe the random variable. Example: a Bernoulli random variable, one bit of information, $S_{\rm bit}(p)$.
Derived quantities: Kolmogorov-Sinai entropy (the rate of the Gibbs-Shannon entropy of a path measure), conditional entropy, ...
From whom? J. W. Gibbs 1902: Elementary Principles in Statistical Mechanics, Yale University Press. J. von Neumann 1927: Thermodynamik quantenmechanischer Gesamtheiten. C. Shannon 1948: A Mathematical Theory of Communication.
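
A minimal numerical sketch (Python, not part of the slides) of the definition above for a discrete distribution; the Bernoulli case reproduces $S_{\rm bit}(p)$:

```python
import numpy as np

def gibbs_shannon_entropy(p):
    """S_GS = -sum_i p_i ln p_i for a discrete probability vector (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 ln 0 = 0
    return -np.sum(p * np.log(p))

def s_bit(p):
    """Entropy of a Bernoulli random variable: -p ln p - (1-p) ln(1-p)."""
    return gibbs_shannon_entropy([p, 1.0 - p])

print(s_bit(0.5))   # ln 2 ~ 0.693: one full bit
print(s_bit(0.1))   # ~ 0.325: a fraction of a bit
```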

Common agreement: the Gibbs-Shannon entropy is the thermodynamic entropy at equilibrium.
J. Bricmont, Science of Chaos or Chaos in Science?: "In equilibrium, it agrees with Boltzmann and everything is fine."
Equilibrium $\Rightarrow$ Gibbs distribution:
$g_H(x) = \exp\big[-\beta\big(H(x) - F^{eq}_H\big)\big]$, with $\exp\big[-\beta F^{eq}_H\big] = \int dx\, \exp[-\beta H(x)]$,
where $F^{eq}_H$ is called the equilibrium free energy (1). The associated Gibbs-Shannon entropy is then
$S_{GS}[g_H] = -\langle g_H \ln g_H \rangle = \beta\,\langle g_H\, H\rangle - \beta F^{eq}_H$,
which is the thermodynamic definition of the entropy. But beyond equilibrium?
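
A sketch with a hypothetical three-level toy system (not from the slides), checking the identity $S_{GS}[g_H] = \beta\langle H\rangle_{g_H} - \beta F^{eq}_H$ numerically:

```python
import numpy as np

beta = 2.0
H = np.array([0.0, 0.7, 1.5])                # toy energy levels

Z = np.sum(np.exp(-beta * H))                # partition function
F_eq = -np.log(Z) / beta                     # equilibrium free energy
g = np.exp(-beta * (H - F_eq))               # Gibbs distribution

S_GS = -np.sum(g * np.log(g))                # Gibbs-Shannon entropy
thermo = beta * np.sum(g * H) - beta * F_eq  # beta*<H> - beta*F_eq

print(S_GS, thermo)                          # the two numbers agree
```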

And the wolf enters: $S(\rho_t) = \text{const}$ for Hamiltonian evolution.
Prigogine, Un siècle d'espoir, 1983: "According to the mechanical view of the world, the entropy of the universe is today identical to what it was at the origin of time."
Time evolution of the Gibbs-Shannon entropy: if the density evolves as $\partial_t \rho_t = L_t^\dagger[\rho_t]$, then
$\frac{d}{dt} S_{GS}(\rho_t) = -\big\langle \partial_t\rho_t\,(\ln\rho_t + 1)\big\rangle = -\big\langle L_t^\dagger[\rho_t]\,\ln\rho_t\big\rangle = -\big\langle \rho_t\, L_t[\ln\rho_t]\big\rangle$.
Deterministic process $\dot x_t = F_t(x_t)$ $\Rightarrow$ Liouville operator $L_t = F_t\cdot\nabla$
$\Rightarrow \frac{d}{dt} S_{GS}(\rho_t) = -\langle \rho_t\, F_t\cdot\nabla\ln\rho_t\rangle = -\langle F_t\cdot\nabla\rho_t\rangle = \langle \rho_t\,\nabla\cdot F_t\rangle$, which is not positive in general.
Divergence-free drift $\nabla\cdot F_t = 0$ $\Rightarrow$ $\frac{d}{dt} S_{GS}(\rho_t) = 0$; Hamiltonian deterministic dynamics have $\nabla\cdot F_t = 0$.
And then comes the second law... Clausius' 2nd law: "The entropy of the universe tends to a maximum." ? versus ? The universe is Hamiltonian (?) $\Rightarrow$ $S_{GS}$ does not do the job for Hamiltonian closed systems.
S. Goldstein, J. Lebowitz (2003): "It is therefore inappropriate, we believe, to use Shannon-Gibbs [entropy] or quantities like it in derivations of the second law."
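
The constancy of $S_{GS}$ derived above for divergence-free flows can be illustrated on a toy case (a sketch assuming a harmonic oscillator and a Gaussian density, neither of which appears in the slides): a linear Hamiltonian flow maps a Gaussian to a Gaussian with the same phase-space volume, so its differential entropy $\tfrac{1}{2}\ln\big((2\pi e)^2\det\Sigma_t\big)$ stays constant.

```python
import numpy as np
from scipy.linalg import expm

omega = 1.3
A = np.array([[0.0, 1.0],                 # dx/dt = p
              [-omega**2, 0.0]])          # dp/dt = -omega^2 x  (div F = tr A = 0)

Sigma0 = np.array([[0.5, 0.2],            # covariance of the initial Gaussian density
                   [0.2, 1.0]])

def gaussian_entropy(Sigma):
    """Differential entropy of a 2D Gaussian density."""
    return 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(Sigma))

for t in [0.0, 0.5, 1.0, 2.0]:
    M = expm(A * t)                       # flow map of the linear dynamics
    Sigma_t = M @ Sigma0 @ M.T            # pushed-forward covariance
    print(t, gaussian_entropy(Sigma_t))   # the entropy does not change with t
```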

First escape route: a weak/typical second law for another entropy, the Boltzmann entropy.
J. Lebowitz 1993: Boltzmann's Entropy and Time's Arrow, Physics Today.
S. Goldstein, J. Lebowitz 2003: On the (Boltzmann) Entropy of Nonequilibrium Systems.

Second escape route: coarse-grained entropy. From whom? Gibbs 1902; Ehrenfest 1911.
What is it? Partition the state space, $\Gamma = \cup_{i=1}^N \Gamma_i$. Coarse-grained density:
$\rho^{cg}(x) = \sum_{i=1}^N \mathbf{1}_{\Gamma_i}(x)\, \frac{\int_{\Gamma_i} dy\, \rho(y)}{\int_{\Gamma_i} dy}$.
Theorem: $S_{GS}(\rho^{cg}) \ge S_{GS}(\rho)$, and hence, writing $\rho_{t,\rho_0}$ for the density at time $t$ with initial condition $\rho_0$,
$S_{GS}\big((\rho_{t,\rho_0})^{cg}\big) \ge S_{GS}(\rho_{t,\rho_0})$ and $S_{GS}\big((\rho_{t,\rho_0^{cg}})^{cg}\big) \ge S_{GS}(\rho_{t,\rho_0^{cg}})$.
Then, for Hamiltonian dynamics (which conserves $S_{GS}$), a second law holds for the coarse-grained entropy:
$S_{GS}\big((\rho_{t,\rho_0})^{cg}\big) \ge S_{GS}(\rho_0)$ and $S_{GS}\big((\rho_{t,\rho_0^{cg}})^{cg}\big) \ge S_{GS}(\rho_0^{cg})$.
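
A discrete sketch of the theorem, with hypothetical numbers: coarse-graining spreads the probability uniformly inside each cell of the partition, which can only increase the Gibbs-Shannon entropy.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = rng.random(12)
rho /= rho.sum()                          # fine-grained distribution on 12 microstates

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Partition into 3 cells of 4 microstates; coarse-graining replaces rho inside
# each cell by its cell average.
rho_cg = np.concatenate([np.full(4, rho[4*i:4*i+4].mean()) for i in range(3)])

print(entropy(rho), entropy(rho_cg))      # entropy(rho_cg) >= entropy(rho)
```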

Gibbs-Shannon on the road again: 2nd law for an open system. Bibliography: Prigogine, De Groot-Mazur, ...
Meta-formulation with 3 different entropies: $0 \le \Delta S^{EP} = \Delta S^{B} + \Delta S$, where
1. $\Delta S^{EP}$ is the total entropy production in the universe;
2. $\Delta S^{B}$ is the entropy exchanged with the bath;
3. $\Delta S$ is the variation of the system entropy between the final and initial times.
What are the mathematical expressions of these 3 entropies? We need 2 rules to fix the 3.
Hypothesis H1: $\Delta S = \Delta S_{GS}$, even though the initial/final densities are not Gibbsian.
Physical hypothesis or just a mathematical trick? If physical, can we use it for a direct measurement of $S_{\rm bit}(p)$?
J. Lebowitz, IHP talk 06/2017: "The question therefore naturally arises of why this should be true for small systems in contact with thermal reservoirs when, as argued above, this is not the case for isolated systems."

Hypothesis H2: $\Delta S^{EP}$ as a relative entropy in path space,
$\Delta S^{EP}_T = S\big(P_{[0,T]}[X] \,\big\|\, P^B_{[0,T]}[\tilde X]\big)$,
with $P_{[0,T]}[X]$ the measure of the path $[X]$ and $P^B_{[0,T]}[\tilde X]$ the measure of the time-reversed path $[\tilde X]$ for a backward process.
For a Markov process with generator $L$ ($\Rightarrow$ book Chetrite-Touchette, Transformations of Markov processes, time reversal, entropy production, and fluctuation relations, to appear in 2050), this relative entropy between path measures becomes a relative entropy between one-point measures:
For a time-homogeneous process, with the backward process given by the Doob transform $L^B = D_{\rho_{\rm inv}} L$:
$\Delta S^{EP}_T = -S(\rho_T \| \rho_{\rm inv}) + S(\rho_0 \| \rho_{\rm inv}) = -\int_0^T du\, \tfrac{d}{du} S(\rho_u \| \rho_{\rm inv})$.
For a general process, with $L^B_{T-t} = D_{\rho'_t} L_t$ (where $\rho'_u = \rho_{u,\rho'_0}$):
$\Delta S^{EP}_T = -S(\rho_T \| \rho'_T) + S(\rho_0 \| \rho'_0) = -\int_0^T du\, \tfrac{d}{du} S(\rho_u \| \rho'_u)$.
The right-hand sides and the positivity (by definition) of $\Delta S^{EP}_T$ give, without calculation, the classical versions of the second principle for Markov processes: Lebowitz-Bergmann 1957, Risken 6.1 for diffusion processes, Gardiner 3.7.3 for mixed processes.
For a general process, with $L^B_{T-t} = D_{\pi_t} L_t$ (with $L_t^\dagger \pi_t = 0$):
$\Delta S^{EP,\,\rm excess}_T = -\int_0^T du\, \big\langle \rho_u\, L_u\big[\ln\tfrac{\rho_u}{\pi_u}\big]\big\rangle$.
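
For concreteness, for a Markov jump process the positivity above is the familiar statement that the Schnakenberg entropy-production rate $\sigma(\rho) = \tfrac12\sum_{i\neq j}\big(\rho_i W_{ij} - \rho_j W_{ji}\big)\ln\tfrac{\rho_i W_{ij}}{\rho_j W_{ji}}$ is non-negative; this standard formula is not written on the slide, and the rates below are hypothetical. A sketch checking positivity and its vanishing under detailed balance:

```python
import numpy as np

def ep_rate(rho, W):
    """Schnakenberg entropy-production rate for a Markov jump process with
    rates W[i, j] (i -> j) and distribution rho (k_B = 1, nats per unit time)."""
    sigma = 0.0
    n = len(rho)
    for i in range(n):
        for j in range(n):
            if i != j and W[i, j] > 0 and W[j, i] > 0:
                J = rho[i] * W[i, j] - rho[j] * W[j, i]
                sigma += 0.5 * J * np.log((rho[i] * W[i, j]) / (rho[j] * W[j, i]))
    return sigma

# Hypothetical 3-state rates (rows: departure state, columns: arrival state).
W = np.array([[0.0, 1.0, 0.2],
              [0.3, 0.0, 1.5],
              [0.8, 0.4, 0.0]])
rho = np.array([0.5, 0.3, 0.2])
print(ep_rate(rho, W))                     # > 0 away from detailed balance

# Detailed-balance example: rates derived from energies, rho = Gibbs distribution.
E = np.array([0.0, 0.5, 1.2])
Wdb = np.exp(-(E[None, :] - E[:, None]) / 2.0); np.fill_diagonal(Wdb, 0.0)
g = np.exp(-E); g /= g.sum()
print(ep_rate(g, Wdb))                     # ~ 0: no entropy production at equilibrium
```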

2nd law for isothermal contact with an equilibrium bath.
Clausius form (1850): the heat $Q$ received by the bath satisfies $Q = \tfrac{1}{\beta}\,\Delta S^{B}$, so the 2nd law becomes $0 \le \beta Q + \Delta S$.
Work-heat relation, first law: the work done on the system is $W = \Delta E + Q$.
2nd law for the work done:
$W \ge -\tfrac{1}{\beta}\,\Delta S + \Delta E \equiv \Delta F^{ne}$, with $F^{ne}$ the nonequilibrium free energy (2).
These 2nd laws do not depend on H1 and H2, but those hypotheses fix the analytical form of $\Delta S^{B}$, $Q$ and hence of $W$. Since H1/H2 are mathematically oriented hypotheses, one may wonder whether they give a physical form for the work $W$.

Context: overdamped Langevin equation
$dX_t = -(\nabla H_t)(X_t)\,dt + \sqrt{2\beta^{-1}}\,dW_t$.
Instantaneous average energy: $E_t = \langle H_t \rangle_{\rho_t}$. With H1 and H2 (using the previous expression for $\Delta S^{EP,\,\rm excess}_T$), one proves that the work induced is the physical work
$W_T = \big\langle \int_0^T (\partial_t H_t)(X_t)\, dt \big\rangle$.
Reciprocally, if we fix this physical form of $W$, then H1 and H2 are no longer independent. As physicists, we now adopt the following logic: the physical work is fixed, H1 and H2 are then equivalent, and we need only one rule to fix the 3 entropies.
Half of the theoreticians who find H1 tautological will perhaps be more excited to consider possible experimental verifications of H2. Fluctuation relations come directly from the fluctuating version of H2; hence all previous experimental work on fluctuation relations partially checks H2. For example, E. Roldán et al., Nature Physics 2014.
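
A minimal simulation sketch of the dynamics above (the protocol, a harmonic trap whose stiffness is ramped up, and all numbers are hypothetical, not taken from the experiment), estimating the average physical work $W_T$ and comparing it with $\Delta F^{eq}$, which the second law bounds it by:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, T, dt = 1.0, 5.0, 1e-3
nsteps, ntraj = int(T / dt), 5000

def k(t):                                       # protocol: trap stiffness ramped from 1 to 2
    return 1.0 + t / T

# H_t(x) = 0.5 * k(t) * x^2 ;  dX = -k(t) X dt + sqrt(2/beta) dW
x = rng.normal(0.0, np.sqrt(1.0 / (beta * k(0.0))), size=ntraj)  # start in equilibrium
work = np.zeros(ntraj)
for n in range(nsteps):
    t = n * dt
    dk = k(t + dt) - k(t)
    work += 0.5 * dk * x**2                     # dW = (dH/dt) dt evaluated at fixed x
    x += -k(t) * x * dt + np.sqrt(2 * dt / beta) * rng.normal(size=ntraj)

dF = 0.5 / beta * np.log(k(T) / k(0.0))         # exact Delta F for a harmonic trap
print(work.mean(), dF)                          # <W> >= Delta F (equality only if quasistatic)
```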

Discrete 2nd law under H1
Coarse-graining w.r.t. a partition of the state space $\Gamma = \cup_{i=1}^N \Gamma_i$. Discrete probability associated with $\rho$: $p_i(\rho) = \int_{\Gamma_i} dx\, \rho(x)$. Conditional distributions $\rho_{/i}$ and the law of total probability: $\rho(x) = \sum_{i=1}^N p_i\, \rho_{/i}(x)$.
Local equilibrium state. Conditional Gibbs density and conditional free energy (3):
$g_{H/i}(x) = \exp\big[-\beta\big(H(x) - F^c_{i,H}\big)\big]\,\mathbf{1}_{\Gamma_i}(x)$, with $F^c_{i,H} = -\tfrac{1}{\beta}\ln \int_{x\in\Gamma_i} dx\, \exp[-\beta H(x)]$.
J. Parrondo, J. Horowitz, T. Sagawa, Thermodynamics of information, Nature Physics 2015: the informational states are modeled by the $\Gamma_i$ or the $g_{H/i}$.
Local equilibrium state associated with a discrete probability $\vec q$: $\rho^{leq,\vec q}(x) = \sum_{i=1}^N q_i\, g_{H/i}(x)$.
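
A quadrature sketch of the objects just defined (the quartic double-well potential is hypothetical and purely illustrative): conditional free energies $F^c_{i,H}$, conditional Gibbs densities $g_{H/i}$ and the local-equilibrium density $\rho^{leq,\vec q}$ for the two-cell partition $\Gamma = L \cup R$.

```python
import numpy as np

beta = 2.0
x = np.linspace(-3, 3, 4001)
dx = x[1] - x[0]
H = (x**2 - 1.0)**2                      # hypothetical symmetric double well
L, R = x < 0, x >= 0                     # two-cell partition Gamma = L u R

def cond_free_energy(mask):
    """F^c_i = -(1/beta) ln int_{Gamma_i} exp(-beta H)."""
    return -np.log(np.sum(np.exp(-beta * H[mask])) * dx) / beta

F_L, F_R = cond_free_energy(L), cond_free_energy(R)
g_L = np.where(L, np.exp(-beta * (H - F_L)), 0.0)   # conditional Gibbs density on L
g_R = np.where(R, np.exp(-beta * (H - F_R)), 0.0)   # conditional Gibbs density on R

q = 0.3                                  # chosen left-well weight
rho_leq = q * g_L + (1 - q) * g_R        # local-equilibrium state rho^{leq,q}
print(F_L, F_R, np.sum(rho_leq) * dx)    # F_L and F_R nearly equal by symmetry; density normalised to 1
```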

Symmetric case: constant conditional free energy. If there exists a map $T_{i,j}$ from $\Gamma_i$ to $\Gamma_j$, with unit Jacobian and with $H_t(T_{i,j}x) = H_t(x)$, then directly $F^c_{i,H} = F^c_{j,H}$.
Discrete expression of the Gibbs-Shannon entropy. A long but simple calculation gives
$S_{GS}(\rho) = S^{\vec p}_{GS} + \beta\big(E - \bar F^{\vec p}\big) - S\big(\rho \,\|\, \rho^{leq,\vec p}\big)$,
with the average free energy (4) $\bar F^{\vec p} = \sum_{i=1}^N p_i\, F^c_{i,H}$.
This formula can also be written for the nonequilibrium free energy (2), $F^{ne} = E - \tfrac{1}{\beta} S_{GS}(\rho)$, as
$F^{ne} = \bar F^{\vec p} - \tfrac{1}{\beta} S^{\vec p}_{GS} + \tfrac{1}{\beta} S\big(\rho \,\|\, \rho^{leq,\vec p}\big)$.
This generalizes a famous relation between the nonequilibrium free energy (2), the equilibrium free energy (1) $F^{eq}_H$ and the entropy deficiency $S(\rho \,\|\, g_H)$:
$F^{ne} = F^{eq}_H + \tfrac{1}{\beta}\, S(\rho \,\|\, g_H)$.
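
The decomposition above can be checked numerically; the sketch below (hypothetical tilted double well) takes $\rho = \rho^{leq,\vec p}$, for which the relative-entropy term vanishes, and verifies that $S_{GS}(\rho) = S^{\vec p}_{GS} + \beta(E - \bar F^{\vec p})$.

```python
import numpy as np

beta = 2.0
x = np.linspace(-3, 3, 4001); dx = x[1] - x[0]
H = (x**2 - 1.0)**2 + 0.3 * x            # hypothetical tilted double well
L, R = x < 0, x >= 0

def Fc(mask):                            # conditional free energy of one cell
    return -np.log(np.sum(np.exp(-beta * H[mask])) * dx) / beta

F_L, F_R = Fc(L), Fc(R)
g_L = np.where(L, np.exp(-beta * (H - F_L)), 0.0)
g_R = np.where(R, np.exp(-beta * (H - F_R)), 0.0)

p = 0.25                                  # left-cell weight of the local-equilibrium state
rho = p * g_L + (1 - p) * g_R             # here rho = rho^{leq,p}, so S(rho||rho^leq) = 0

S_GS = -np.sum(np.where(rho > 0, rho * np.log(rho), 0.0)) * dx
S_p  = -(p * np.log(p) + (1 - p) * np.log(1 - p))     # discrete (Bernoulli) entropy
E    = np.sum(rho * H) * dx                           # average energy
Fbar = p * F_L + (1 - p) * F_R                        # average conditional free energy

print(S_GS, S_p + beta * (E - Fbar))      # the two sides agree (relative-entropy term = 0)
```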

Discrete 2nd law for the work under H1:
$W \ge -\tfrac{1}{\beta}\,\Delta S^{\vec p}_{GS} + \Delta \bar F^{\vec p} + \tfrac{1}{\beta}\,\Delta S\big(\rho \,\|\, \rho^{leq,\vec p}\big)$.
With respect to a naive ad hoc discrete 2nd law (in blue on the slide), there are two corrective terms:
The asymmetry term $\Delta \bar F^{\vec p}$: idea introduced by K. Shizume 1995, and this form by T. Sagawa et al. 2009. In the symmetric case with a cyclic protocol this term vanishes: $\Delta_T \bar F^{\vec p} = F^c_{H_T} - F^c_{H_0} = 0$.
The distance-to-local-equilibrium term $\tfrac{1}{\beta}\,\Delta S\big(\rho \,\|\, \rho^{leq,\vec p}\big)$.
Discrete 2nd law for the work under H1 with initial/final states in local equilibrium (Sagawa-Ueda "generalized" or "breakdown" Landauer principle):
$W \ge -\tfrac{1}{\beta}\,\Delta S^{\vec p}_{GS} + \Delta \bar F^{\vec p}$.
The words "generalized" and "breakdown" refer to taking an ad hoc discrete 2nd law as the reference or convention. WARNING: the discrete 2nd law proven here assumes H1.

Partial conclusion before the double-well erasure experiment
1. In our erasure experiment we measure, in the long intermediate-time limit, the reversible work $W^{\rm exp}_{\rm rev}$ for all values of a parameter $p$ (the population of the left well); this is completely independent of hypothesis H1. In this situation the previous discrete 2nd law becomes an equality and gives theoretical access to $W^{H1}_{\rm rev}$. We then compare the two results for all values of $p$ and find perfect agreement.
2. This proves our message that the Gibbs-Shannon entropy is physically consistent, even outside equilibrium (initial/final states not Gibbsian).
Question: was this message not already implied by previous experiments on Landauer erasure, such as A. Bérut, A. Arakelyan, A. Petrosyan, S. Ciliberto, R. Dillenschneider, E. Lutz, Nature 2012; Y. Jun, M. Gavrilov, J. Bechhoefer, Phys. Rev. Lett. 2014; J. Hong, B. Lambson, S. Dhuey, J. Bokor, Sci. Adv. 2016? What do we gain with respect to these previous works?
Answer: the previous experiments were concerned with only one value of the parameter, $p = \tfrac{1}{2}$, so the agreement with H1 was at a single point, while it is a functional agreement in our experiment. The key experimental point here is the control of $p$.

Logic
1. But, logically speaking, we do not prove that H1 is true for general states. Strict logic cannot exclude some loopholes, for example: H1 could be right only for local-equilibrium states, and not for more general nonequilibrium states. Worse, H1 could be wrong for all states from the start while the discrete 2nd law remains right. Or, the other way around, H1 and the discrete 2nd law could both be wrong and become right only in the intermediate-time limit... etc.
2. That H1 is right in this peculiar case but not in general is the view sustained by Joel Lebowitz in the last part of his IHP talk of 06/2017, which concerned our article.
3. Finally, another possibility, implicitly adopted in many articles, is to take as a fundamental starting point the discrete second law $W \ge -\tfrac{1}{\beta}\Delta S$, more exactly with the two corrective terms, $W \ge -\tfrac{1}{\beta}\Delta S^{\vec p} + \Delta \bar F^{\vec p} + \tfrac{1}{\beta}\Delta S\big(\rho \,\|\, \rho^{leq,\vec p}\big)$. With this ad hoc discrete version in hand, our experiment checks that the discrete version of H1, $S(\vec p) = S_{GS}(\vec p)$, is right (still with the loophole of the long-time limit).
4. However, going beyond this foundational debate, we obtain an experimental measurement of the function $S_{\rm bit}(p)$.

Experimental set-up of the erasure experiment
1. System: a colloidal particle, a micron-scale silica bead of 1.5 µm diameter, in a fluid which plays the role of the bath.
2. Feedback trap (image the particle, estimate its position, calculate the force, apply a voltage) $\Rightarrow$ effective one-dimensional overdamped Langevin equation in a time-dependent double-well potential.
3. Initial/final state: local equilibrium in an even (symmetric) double-well potential, $H_T = H_0 = H$ $\Rightarrow$ $\Delta \bar F^{\vec p} = 0$.
4. Protocol: Lowering, Tilting, Raising, Untilting.
5. Notation: $\Gamma = L \sqcup R$; we write $p_T$ for the left-well population $p_{L,T}$, and $\rho^{leq,p}$ for the corresponding local-equilibrium state $\rho^{leq,\vec p}$.

Complete erasure of a fraction of a bit
Initial state: local equilibrium with population $p_0$ (the free parameter). Final state: local equilibrium with $p_T = 0$ (total erasure) $\Rightarrow$ $\Delta S^{\vec p}_{GS} = 0 - S_{\rm bit}(p_0)$.
$\Rightarrow$ the discrete 2nd law is $W \ge \tfrac{1}{\beta}\, S_{\rm bit}(p_0)$, and in the intermediate-time limit with reversibility $W^{\rm rev}_{\infty} = \tfrac{1}{\beta}\, S_{\rm bit}(p_0)$.
Experimental limitation: a small accessible range of $p_0 \to S_{\rm bit}(p_0)$ (filled markers in the figure). The limitation comes from the reversibility constraint, which requires a slow first step: stretching the initial potential $H_0 \to H'_0$ so that the local equilibrium $\rho^{leq,p_0}$ becomes a global equilibrium w.r.t. $H'_0$.
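
For orientation (generic values of $p_0$, not the measured ones), the reversible-erasure bound $W^{\rm rev}_{\infty} = \tfrac{1}{\beta} S_{\rm bit}(p_0)$ can be tabulated in units of $k_B T$; $p_0 = \tfrac12$ gives back Landauer's $k_B T \ln 2$.

```python
import numpy as np

def s_bit(p):
    """-p ln p - (1-p) ln(1-p), with the convention 0 ln 0 = 0."""
    terms = [q * np.log(q) for q in (p, 1 - p) if q > 0]
    return -sum(terms)

for p0 in [0.5, 0.3, 0.1, 0.01]:
    # Reversible work to erase a bit prepared with left-well population p0,
    # in units of k_B T (beta * W_rev = S_bit(p0)).
    print(p0, s_bit(p0))
```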

Partial erasure of a symmetric bit
Initial state: local equilibrium with population $p_0 = \tfrac{1}{2}$. Final state: local equilibrium with parameter $p_T$ (the free parameter) $\Rightarrow$ $\Delta S^{\vec p}_{GS} = S_{\rm bit}(p_T) - \ln 2$.
Then the discrete 2nd law is $W \ge -\tfrac{1}{\beta} S_{\rm bit}(p_T) + \tfrac{1}{\beta}\ln 2$, and in the intermediate-time limit with reversibility $W^{\rm rev}_{\infty} = -\tfrac{1}{\beta} S_{\rm bit}(p_\infty) + \tfrac{1}{\beta}\ln 2$.
Experimental limitation: it is difficult to control $p_\infty$, so this relation cannot be used to deduce $p_\infty \to \ln 2 - \beta W^{\rm rev}_{\infty} = S_{\rm bit}(p_\infty)$.
Solution: a refined version of the discrete second law in the empirical case of conditionally Gaussian work (see the PNAS Supplement for more):
$S_{\rm bit}(p_T) - \ln 2 = -\tfrac{\beta}{2}\big[\mathbb{E}_{g_H}(W_T) - \mathbb{E}^B_{leq,p_T}(W_T)\big]$.
We then access experimentally $p_T \to S_{\rm bit}(p_T)$ by plugging in, at finite $T$,
$p_T \to \ln 2 - \tfrac{\beta}{2}\big[\mathbb{E}_{g_H}(W_T) - \mathbb{E}^B_{leq,p_T}(W_T)\big]$.
Hollow markers in the figure.
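
A sketch of the plug-in step (synthetic Gaussian work samples generated according to the Supplement's formulas, with hypothetical parameters and equal forward/backward conditional variances; this is not the experimental data pipeline): $S_{\rm bit}(p_T)$ is recovered as $\ln 2 - \tfrac{\beta}{2}\big[\widehat{\mathbb{E}}_{g_H}(W_T) - \widehat{\mathbb{E}}^B_{leq,p_T}(W_T)\big]$.

```python
import numpy as np

rng = np.random.default_rng(3)
beta, p_T, sigma2 = 1.0, 0.2, 0.8         # hypothetical parameters
s_bit_true = -(p_T * np.log(p_T) + (1 - p_T) * np.log(1 - p_T))

# Mean works implied by the conditional-Gaussian refined second law
# (equal forward/backward conditional variances assumed here).
mean_F = 0.5 * beta * sigma2 + (np.log(2) - s_bit_true) / beta
mean_B = 0.5 * beta * sigma2 - (np.log(2) - s_bit_true) / beta

W_F = rng.normal(mean_F, np.sqrt(sigma2), size=20000)   # forward work samples
W_B = rng.normal(mean_B, np.sqrt(sigma2), size=20000)   # backward work samples

s_bit_est = np.log(2) - 0.5 * beta * (W_F.mean() - W_B.mean())
print(s_bit_true, s_bit_est)              # the plug-in estimate recovers S_bit(p_T)
```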

Thank you for your attention... and sorry if the end was too blurry. I hope to contradict the French proverb "Quand c'est flou, c'est qu'il y a un loup" ("when it's blurry, there is a wolf hiding").

Supplement: refined version of the discrete second law for conditionally Gaussian work
Start with the detailed fluctuation relation (R.C.-K. Gawedzki, CMP 2008). Then prove the conditioned Jarzynski/Crooks relation (E. Roldán et al., Nature Physics 2014). Then, in the conditionally Gaussian regime,
$P_{g_H}(w \mid X_T \in L) \sim \mathcal{N}\big(\bar W_L, \sigma_L^2\big)$, $\quad P_{g_H}(w \mid X_T \in R) \sim \mathcal{N}\big(\bar W_R, \sigma_R^2\big)$,
$P^B_{leq,p_T}(w \mid X_0 \in L) \sim \mathcal{N}\big(\bar W^B_L, (\sigma^B_L)^2\big)$, $\quad P^B_{leq,p_T}(w \mid X_0 \in R) \sim \mathcal{N}\big(\bar W^B_R, (\sigma^B_R)^2\big)$.
The conditioned Crooks relation gives the refined second law:
$\mathbb{E}_{g_H}(W_T) = \tfrac{\beta}{2}\big[p_T\, \sigma_L^2 + (1-p_T)\, \sigma_R^2\big] + \tfrac{1}{\beta}\big(\ln 2 - S_{\rm bit}(p_T)\big)$,
$\mathbb{E}^B_{leq,p_T}(W_T) = \tfrac{\beta}{2}\big[p_T\, (\sigma^B_L)^2 + (1-p_T)\, (\sigma^B_R)^2\big] - \tfrac{1}{\beta}\big(\ln 2 - S_{\rm bit}(p_T)\big)$.