Thermodynamics of feedback controlled systems. Francisco J. Cao


Open-loop and closed-loop control. Open-loop control: the controller actuates on the system independently of the system state. [Diagram: Controller → Actuation → Evolving system.] Closed-loop or feedback control: the controller actuates on the system using information about the system state. [Diagram: Controller → Actuation → Evolving system, with Information fed back from the system to the controller.] J. Bechhoefer, Rev. Mod. Phys. 77, 783 (2005).

Information and feedback control. The information about the state of the system allows the external agent to optimize its actuation on the system, in order to improve the system's performance. The thermodynamics of feedback control is incomplete: the role of information in feedback controlled systems is still not completely understood, in particular its implications for the entropy of the system. Understanding feedback systems and their limitations is very important from the technological point of view. [Diagram: Controller → Actuation → Evolving system, with Information fed back to the controller.]

Overview. 1. Entropy in Thermodynamics. 2. Entropy in Statistical Physics and Information. 3. Entropy and Thermodynamics of feedback controlled systems. 4. Conclusions.

1. Entropy in Thermodynamics. Second law and entropy are intimately linked.

1.1. Second principle. Kelvin-Planck statement: it is not possible to find any spontaneous process whose only result is to convert a given amount of heat into an equal amount of work through the exchange of heat with only one heat source. Clausius statement: it is not possible to find a spontaneous process whose only result is to pass heat from one system to another system at a greater temperature. [Diagrams: a source at temperature T delivering heat Q converted into work W; heat Q flowing from T_2 to T_1, with T_1 > T_2.]

1.2. Clausius theorem. For a system that follows a cyclic process we have, for each cycle,

∮ δQ/T_TB ≤ 0

with δQ the infinitesimal amount of heat exchanged with the thermal bath at temperature T_TB. The equality holds if the process is reversible (in which case also T_system = T_TB). [p-V diagram of the cycle.]

1.3. Thermodynamic definition of entropy. The application of the Clausius theorem to reversible cycles tells us that there exists a state function, named entropy, defined by

S_2 − S_1 = ∫_1^2 (δQ/T)_reversible

As a consequence, in any cycle the change in the entropy of the system is zero. [p-V diagram of a reversible path from state 1 to state 2.]
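As a quick sanity check of this definition, the entropy change of a reversible isothermal ideal-gas expansion can be computed directly from ∫ δQ_rev/T. A minimal sketch in Python (the function name and the mole/volume values are illustrative, not from the talk):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_s_isothermal(n_mol, v1, v2):
    """Entropy change S_2 - S_1 = ∫ δQ_rev/T for a reversible isothermal
    ideal-gas expansion: δQ_rev = p dV = nRT dV/V, so ΔS = nR ln(V2/V1)."""
    return n_mol * R * math.log(v2 / v1)

dS = delta_s_isothermal(1.0, 1.0, 2.0)   # one mole, volume doubled
assert math.isclose(dS, R * math.log(2))
# Reversing the path gives -ΔS, so the entropy change over any cycle is zero.
assert math.isclose(dS + delta_s_isothermal(1.0, 2.0, 1.0), 0.0)
```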

1.4. Second principle in terms of entropy. The entropy of an isolated system either increases or remains constant: ΔS_isolated ≥ 0. Thus, in an isolated system only processes that increase or keep constant the entropy will spontaneously occur. The increase of the entropy of an isolated system indicates its evolution towards the equilibrium state, which has the maximum entropy.

2. Entropy in Statistical Physics and Information. Microstate and macrostate + entropy expression in statistical physics + basic concepts in information theory = a fruitful and clear interpretation of entropy.

2.1. Microstate and macrostate. Microstate: complete description of the state of the system, where all the microscopic variables are specified. Macrostate: partial description of the state of the system, where only some macroscopic variables are specified.

2.1. Microstate and macrostate. Example: a gas with a great number of point particles. Microstate: position and velocity of each particle at a time t. Macrostate: E, V and N; or p, V and T. In general, for systems with a great number of constituents, experimentally it is only possible to determine the macrostate.

2.2. Entropy in the microcanonical ensemble. Isolated system in an equilibrium state defined by E, V and N. The macrostate (E, V, N) has Ω equiprobable compatible microstates. Entropy: S = k ln Ω, with k = 1.38 × 10⁻²³ J/K the Boltzmann constant.

2.3. Boltzmann entropy. Entropy of a macrostate:

S = −k Σ_{i=1}^{n} p_i ln p_i

with p_i the probability of microstate i and n the number of microstates compatible with the macrostate. Example with equal probabilities: isolated system in equilibrium (microcanonical ensemble), p_i = 1/Ω. Examples with different probabilities: a system in equilibrium with a thermal bath (particle gas), i.e. the canonical ensemble; proteins.
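The Boltzmann expression reduces to S = k ln Ω when all the p_i are equal. A small numerical check of both cases, assuming nothing beyond the formula on the slide (function names are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(probs, k=K_B):
    """S = -k * sum_i p_i ln p_i over the microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Equiprobable microstates (microcanonical ensemble): p_i = 1/Omega
omega = 1024
s_micro = boltzmann_entropy([1.0 / omega] * omega)
assert math.isclose(s_micro, K_B * math.log(omega))

# Non-uniform probabilities give a smaller entropy than the uniform case
s_biased = boltzmann_entropy([0.7, 0.1, 0.1, 0.1])
s_uniform = boltzmann_entropy([0.25] * 4)
assert s_biased < s_uniform
```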

2.5. Entropy and information. Shannon defined the quantity

H = −Σ_{i=1}^{n} p_i log₂ p_i ("Shannon entropy")

It is a measure of the average uncertainty of a random variable that takes n values, each with probability p_i. It is the number of bits needed on average to describe the random variable. C.E. Shannon, The Bell System Tech. J. 27, 379 (1948).

2.5. Entropy and information. If the values are equiprobable, the number of bits needed on average to describe the random variable is simply log₂ n. But when the values are not equiprobable, the average number of bits can be reduced by using a shorter description for the more probable cases. Example with four values:

value:     a    b    c    d
p_i:       1/2  1/4  1/8  1/8
codeword:  0    10   110  111
l_i:       1    2    3    3
p_i l_i:   1/2  1/2  3/8  3/8

With this codification the average number of bits needed is Σ p_i l_i = 7/4 = 1.75 bits, which coincides with the Shannon entropy H = −Σ p_i log₂ p_i = 7/4 = 1.75 bits. If the values were equiprobable it would be log₂ 4 = 2 bits.
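The four-value coding example above can be checked numerically; the sketch below (names are illustrative) reproduces the 1.75-bit average code length and its agreement with the Shannon entropy:

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum_i p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The slide's example: a dyadic distribution and a matching prefix code
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/8}
code  = {"a": "0", "b": "10", "c": "110", "d": "111"}

avg_len = sum(p * len(code[v]) for v, p in probs.items())
H = shannon_entropy_bits(probs.values())

assert math.isclose(avg_len, 1.75) and math.isclose(H, 1.75)
# Equiprobable values would instead need log2(4) = 2 bits each
assert math.isclose(shannon_entropy_bits([0.25] * 4), 2.0)
```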

2.5. Entropy and information. Recall that the Boltzmann entropy of a macrostate and the Shannon entropy are

S = −k Σ_{i=1}^{n} p_i ln p_i and H = −Σ_{i=1}^{n} p_i log₂ p_i

with p_i the probability of microstate i and n the number of microstates compatible with the macrostate. Boltzmann entropy of a macrostate: the average amount of information needed to specify the microstate,

S = k ln(2) H

[The ln(2) factor comes from the change of base.]

3. Entropy and thermodynamics of feedback systems. Feedback controlled system: a system coupled to an external agent that uses information of the system state to actuate on it. The thermodynamics of feedback control is incomplete: the role of information in feedback controlled systems is still not completely understood, in particular its implications for the entropy of the system. Much of the progress has come from the study of Maxwell's demon, and mainly from a computation-theory point of view. [Diagram: Controller → Actuation → Evolving system, with Information fed back to the controller.]

3.1. Maxwell demon: Szilard engine. The demon puts a wall in the middle and observes where the particle is (ΔS_s = −k ln 2). Once the demon knows on which side the particle is, it attaches a piston on the correct side of the wall to extract a work W = kT ln 2. Meanwhile the system is connected to a thermal bath at temperature T, extracting from it a heat Q = W (ΔS_s = k ln 2). Apparently the efficiency is η = W/Q = 1, with only one thermal bath (against the second principle!). J.C. Maxwell, Theory of Heat (1871); L. Szilard, Z. Phys. 53, 840 (1929).

3.1. Maxwell demon: Szilard engine. [Diagram of the cycle: measurement (ΔS_s = −k ln 2), then isothermal expansion extracting W = kT ln 2 with Q = W (ΔS_s = k ln 2).] J.C. Maxwell, Theory of Heat (1871); L. Szilard, Z. Phys. 53, 840 (1929).

3.2. Landauer principle. It can be obtained from the second law, therefore it is not a principle. The erasure of one bit of information produces a growth in the entropy of the environment of ΔS_e ≥ k ln 2; the dissipated work equals the heat delivered to the environment, W_d = Q_e ≥ kT ln 2. (Szilard engine: one bit is enough to store the information, for example 0 = left, 1 = right.) R. Landauer, IBM J. Res. Dev. 5, 183 (1961).
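In numbers, the Landauer bound kT ln 2 is tiny but nonzero at room temperature. A minimal sketch (the helper name and the 300 K figure are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat(temperature_k, bits=1):
    """Minimum heat Q_e >= bits * k T ln 2 dissipated to the environment
    when erasing `bits` bits of information."""
    return bits * K_B * temperature_k * math.log(2)

q_min = landauer_heat(300.0)  # one bit at room temperature
assert math.isclose(q_min, K_B * 300.0 * math.log(2))
assert q_min < 1e-20  # on the order of a few zeptojoules per bit
```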

3.3. Maxwell demon solution (system + demon perspective). Measurement: ΔS_s = −k ln 2 (system), ΔS_d = +k ln 2 (demon's memory). Work extraction: W = kT ln 2, Q = W, ΔS_s = +k ln 2, ΔS_e = −k ln 2 (the bath gives up the heat Q). Erasure of the demon's memory: ΔS_d = −k ln 2, with W_d = Q_e ≥ kT ln 2 and ΔS_e ≥ k ln 2. In total, ΔS_s + ΔS_d + ΔS_e ≥ 0. C.H. Bennett, Int. J. Theor. Phys. 21, 905 (1982).

3.4. Many measurements (demon + system perspective). Zurek shows how to minimize the erasure cost, using an algorithmic-complexity approach: for n bits, n W_d = n Q_e and ΔS_e ≥ n k ln 2. Clever demons compress the information (fewer bits = lower erasure cost): compressed to n_c ≤ n bits, the bound becomes ΔS_e ≥ n_c k ln 2. W.H. Zurek, Phys. Rev. A 40, 4731 (1989).

3.5. Open questions. There are still many open questions in the physics of feedback controlled systems. From the point of view of system + controller the understanding is advanced, but it uses concepts like algorithmic complexity (Zurek) which do not have a clear physical meaning, and which it is not clear how to compute in real cases. The understanding from the point of view of the system alone (without entering into the details of the controller) is still incomplete. The thermodynamics of feedback controlled systems is still incomplete.

3.6. Entropy reduction due to information. System perspective: for the controller we only need the (deterministic or not) correspondence between the states of the system (left, right) and the actions of the controller (off, on). The entropy of the system before being measured by the controller for the first time is

S_b = −k Σ_{x∈X} p_X(x) ln p_X(x) = k ln(2) H(X)

F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).

3.6. Entropy reduction due to information. If the first measurement implies that the first action of the controller C is c, the entropy decreases to

S(X|C=c) = −k Σ_{x∈X} p_{X|C}(x|c) ln p_{X|C}(x|c) = k ln(2) H(X|C=c)

Therefore the average entropy after the measurement is

S_a = k ln(2) Σ_{c∈C} p_C(c) H(X|C=c) = k ln(2) H(X|C)

F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).

3.7. Derivation of the Landauer principle. The average entropy change in a measurement is

ΔS = S_a − S_b = k ln(2) [H(X|C) − H(X)] = −k ln(2) I(X;C)

where the mutual information appears,

I(X;C) = H(X) − H(X|C) = Σ_{x∈X, c∈C} p_{X,C}(x,c) log₂ [p_{X,C}(x,c) / (p_X(x) p_C(c))]

which is a measure of the dependency between two random variables. Thus, we obtain the Landauer principle as a consequence. F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).
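The relation ΔS = −k ln(2) I(X;C) can be checked on a binary noisy measurement, as in the Szilard setup with a misreading probability ε (the function names and the ε value are illustrative assumptions):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def h_bits(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information_bits(p_joint):
    """I(X;C) in bits from a joint distribution {(x, c): p}."""
    px, pc = {}, {}
    for (x, c), p in p_joint.items():
        px[x] = px.get(x, 0.0) + p
        pc[c] = pc.get(c, 0.0) + p
    return sum(p * math.log2(p / (px[x] * pc[c]))
               for (x, c), p in p_joint.items() if p > 0)

# Particle on the left/right with probability 1/2 each; the controller
# reads the correct side with probability 1 - eps.
eps = 0.1
joint = {("left", "off"):  0.5 * (1 - eps), ("left", "on"):  0.5 * eps,
         ("right", "on"):  0.5 * (1 - eps), ("right", "off"): 0.5 * eps}

I = mutual_information_bits(joint)   # = 1 - H_b(eps) bits for this channel
dS = -K_B * math.log(2) * I          # average entropy reduction in the system
assert math.isclose(I, 1 - h_bits([eps, 1 - eps]))
assert dS < 0
```

For ε = 0 this gives I = 1 bit and ΔS = −k ln 2, the full Szilard-engine reduction; for ε = 1/2 the measurement carries no information and ΔS = 0.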

3.8. Many measurements (system perspective). For systems with deterministic control, after M measurements we obtain

ΔS_info = k Σ_{c_M,…,c_1} p_{C_M,…,C_1}(c_M,…,c_1) ln p_{C_M,…,C_1}(c_M,…,c_1) = −k ln(2) H(C_M,…,C_1)

H(C_M,…,C_1) is the average amount of information needed to specify the M actions of the controller on the system. This result indicates that only nonredundant information is useful to reduce the entropy of the system (in correspondence with Zurek's idea of compressing the information). F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).

3.8. Many measurements (system perspective). For systems with nondeterministic control, after M measurements we have

ΔS_info = −k ln(2) [H(C_M,…,C_1) − Σ_{k=1}^{M} H(C_k | C_{k−1},…,C_1, X_k)]

where the additional term is nonzero if the present state of the system and the previous history of the controller do not completely determine the action of the controller. Example: a noisy binary measurement in which the controller reads the correct state (left → off, right → on) with probability 1−ε and the wrong one with probability ε. The entropy reduction in the system due to information after M measurements is then

ΔS_info = −k ln(2) [H(C_M,…,C_1) − M H_b(ε)], with H_b(ε) = −ε log₂ ε − (1−ε) log₂(1−ε)

F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).
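For M uncorrelated noisy measurements of a symmetric two-state system, H(C_M,…,C_1) = M bits, and the formula above reduces to M times the per-measurement mutual information, −k ln 2 · M (1 − H_b(ε)). A small sketch under that assumption (function names illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def h_b(eps):
    """Binary entropy H_b(eps) in bits."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def entropy_reduction(M, eps, H_controller_bits):
    """ΔS_info = -k ln2 [H(C_M,...,C_1) - M * H_b(eps)] for nondeterministic
    control with per-measurement error probability eps."""
    return -K_B * math.log(2) * (H_controller_bits - M * h_b(eps))

# M independent measurements, each controller action carrying 1 bit:
M, eps = 10, 0.05
dS = entropy_reduction(M, eps, H_controller_bits=M)
# Only nonredundant information reduces the system entropy:
assert math.isclose(dS, -K_B * math.log(2) * M * (1 - h_b(eps)))
assert dS < 0
```

For ε → 1/2 the reduction vanishes (the measurements carry no information), while ε = 0 recovers the deterministic result ΔS_info = −k ln 2 · H(C_M,…,C_1).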

3.9. Application and example. Isothermal feedback systems: their efficiency can be defined as η = W / ΔF_cont. If the controller does not transfer heat to the system, the maximum efficiency is η = W / (ΔU_cont − T ΔS_info). Markovian particle pump: we have computed the rate of reduction of entropy, the work, and the efficiency, both in the quasistatic and in a nonquasistatic regime. [Diagram: a ring of states 1-7 of length L with transition rates α.] F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).

4. Conclusions. The entropy of a macrostate can be interpreted as the average amount of information needed to specify the microstate: S = k ln(2) H. This approach allows us to establish the thermodynamics of feedback controlled classical systems, even for nonquasistatic cases (where measurements are correlated) and also for nondeterministic controllers. Open questions: continuous-time limit, thermodynamics of feedback controlled quantum systems, applications to particular systems of relevance. F. J. Cao, M. Feito, Phys. Rev. E 79, 041118 (2009).