G25.2651: Statistical Mechanics
Notes for Lecture 3

I. MICROCANONICAL ENSEMBLE: CONDITIONS FOR THERMAL EQUILIBRIUM

Consider bringing two systems into thermal contact. By thermal contact, we mean that the systems can only exchange heat. Thus, they do not exchange particles, and there is no potential coupling between the systems. In this case, if system 1 has a phase space vector x_1 and system 2 has a phase space vector x_2, then the total Hamiltonian can be written as

    H(x) = H_1(x_1) + H_2(x_2)

Furthermore, let system 1 have N_1 particles in a volume V_1 and system 2 have N_2 particles in a volume V_2. The total particle number N and volume V are N = N_1 + N_2 and V = V_1 + V_2. The entropy of each system is given by

    S_1(N_1, V_1, E_1) = k \ln \Omega_1(N_1, V_1, E_1)
    S_2(N_2, V_2, E_2) = k \ln \Omega_2(N_2, V_2, E_2)

The partition functions are given by

    \Omega_1(N_1, V_1, E_1) = C_{N_1} \int dx_1 \, \delta(H_1(x_1) - E_1)
    \Omega_2(N_2, V_2, E_2) = C_{N_2} \int dx_2 \, \delta(H_2(x_2) - E_2)
    \Omega(N, V, E)         = C_N \int dx \, \delta(H_1(x_1) + H_2(x_2) - E) \neq \Omega_1(N_1, V_1, E_1) \, \Omega_2(N_2, V_2, E_2)

However, it can be shown (essentially by inserting 1 = \int_0^E dE_1 \, \delta(H_1(x_1) - E_1) into the expression for \Omega(N, V, E) and carrying out the phase space integrations) that the total partition function can be written as

    \Omega(E) = C' \int_0^E dE_1 \, \Omega_1(E_1) \, \Omega_2(E - E_1)

where C' is an overall constant independent of the energy. Note that the dependence of the partition functions on the volume and particle number has been suppressed for clarity. Now imagine expressing the integral over energies in the above expression as a Riemann sum:

    \Omega(E) = C' \Delta \sum_{i=1}^{P} \Omega_1(E_1^{(i)}) \, \Omega_2(E - E_1^{(i)})

where \Delta is the small energy interval (which we will allow to go to 0) and P = E/\Delta. The reason for writing the integral this way is to make use of a powerful theorem on sums with large numbers of terms. Consider a sum of the form

    \sigma = \sum_{i=1}^{P} a_i

where a_i > 0 for all i. Let a_{\max} be the largest of all the a_i. Clearly, then

    a_{\max} \le \sigma \le P a_{\max}

Thus, we have the inequality

    a_{\max} \le \sigma \le P a_{\max}

or

    \ln a_{\max} \le \ln \sigma \le \ln a_{\max} + \ln P

This gives upper and lower bounds on the value of \ln \sigma. Now suppose that \ln a_{\max} \gg \ln P. Then the above inequality implies that

    \ln \sigma \approx \ln a_{\max}

This would be the case, for example, if a_{\max} \sim e^P. In this case, the value of the sum is given to a very good approximation by the value of its maximum term.

Why should this theorem apply to the sum expression for \Omega(E)? Consider the case of a system of free particles, H = \sum_{i=1}^{N} p_i^2/2m_i, i.e., no potential. Then the expression for the partition function is

    \Omega(E) \propto \int_{D(V)} d^N r \int d^N p \, \delta\!\left( \sum_{i=1}^{N} \frac{p_i^2}{2m_i} - E \right) \propto V^N \int d^N p \, \delta\!\left( \sum_{i=1}^{N} \frac{p_i^2}{2m_i} - E \right)

since the coordinate integrations are restricted only by the volume of the container (the domain D(V)). Thus, the terms in the sum vary exponentially with N. But the number of terms in the sum, P, also varies like N, since P = E/\Delta and E \sim N (E is extensive). Thus, the terms in the sum under consideration obey the conditions for the application of the theorem.

Let the maximum term in the sum be characterized by energies \bar{E}_1 and \bar{E}_2 = E - \bar{E}_1. Then, according to the above analysis,

    S(E) = k \ln \Omega(E) \approx k \ln \Delta + k \ln\!\left[ \Omega_1(\bar{E}_1) \Omega_2(E - \bar{E}_1) \right] + k \ln P + k \ln C'

Since P = E/\Delta, we have \ln \Delta + \ln P = \ln \Delta + \ln E - \ln \Delta = \ln E. But E \sim N, while \ln \Omega_1 \sim N. Since N \gg \ln N, the above expression becomes, to a good approximation,

    S(E) \approx k \ln\!\left[ \Omega_1(\bar{E}_1) \Omega_2(E - \bar{E}_1) \right] + O(\ln N) + \text{const}

Thus, apart from constants, the entropy is approximately additive:

    S(E) \approx k \ln \Omega_1(\bar{E}_1) + k \ln \Omega_2(\bar{E}_2) = S_1(\bar{E}_1) + S_2(\bar{E}_2) + O(\ln N) + \text{const}

Finally, in order to compute the temperature of each system, we make a small variation d\bar{E}_1 in the energy \bar{E}_1. But since \bar{E}_1 + \bar{E}_2 = E, we have d\bar{E}_1 = -d\bar{E}_2. Also, this variation is made such that the total entropy S and energy E remain constant. Thus, we obtain

    0 = \frac{\partial S_1}{\partial \bar{E}_1} d\bar{E}_1 + \frac{\partial S_2}{\partial \bar{E}_2} d\bar{E}_2
    0 = \left( \frac{\partial S_1}{\partial \bar{E}_1} - \frac{\partial S_2}{\partial \bar{E}_2} \right) d\bar{E}_1
    0 = \frac{1}{T_1} - \frac{1}{T_2}

from which it is clear that T_1 = T_2, the expected condition for thermal equilibrium.
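As a rough numerical illustration of the maximum-term theorem (this sketch is not part of the original notes), suppose the densities of states are modeled by the ideal-gas-like form \Omega_i(E_i) \propto E_i^{3N_i/2}; this functional form, the energy grid, and all parameter values below are illustrative assumptions. The snippet compares \ln \sigma, evaluated with a numerically stable log-sum-exp, against \ln a_{\max}:

import numpy as np

def log_terms(E, N1, N2, P):
    """Log of the terms Omega_1(E1) * Omega_2(E - E1) on a grid of P energies,
    using the model Omega_i(E_i) ~ E_i**(3*N_i/2)."""
    E1 = np.linspace(E / P, E * (1.0 - 1.0 / P), P)   # interior grid, avoids E1 = 0 and E1 = E
    return 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E - E1)

E, P = 1.0, 10_000        # total energy (arbitrary units) and number of terms in the sum
for N in (10, 100, 1_000, 10_000):
    lt = log_terms(E, N1=N // 2, N2=N - N // 2, P=P)
    log_max = lt.max()
    # stable log-sum-exp: ln(sum_i a_i) = ln(a_max) + ln(sum_i exp(ln a_i - ln a_max))
    log_sum = log_max + np.log(np.exp(lt - log_max).sum())
    print(f"N = {N:6d}   ln(sum) - ln(max term) = {log_sum - log_max:7.3f}   ln(P) = {np.log(P):.3f}")

In every case the gap \ln \sigma - \ln a_{\max} lies between 0 and \ln P \approx 9.2 and shrinks as N grows, while \ln a_{\max} itself scales like N. Per particle, the error made by keeping only the maximum term therefore vanishes in the large-N limit, which is the content of the approximation used for S(E) above.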

It is important to point out that the entropy S(N, V, E) defined via the microcanonical partition function is not the only entropy that satisfies the properties of additivity and equality of temperatures at thermal equilibrium. Consider an ensemble defined by the condition that the Hamiltonian H(x) is less than a certain energy E. This is known as the uniform ensemble, and its partition function, denoted \Sigma(N, V, E), is defined by

    \Sigma(N, V, E) = C_N \int_{H(x) < E} dx = C_N \int dx \, \theta(E - H(x))

where \theta(x) is the Heaviside step function. Clearly, it is related to the microcanonical partition function by

    \Omega(N, V, E) = \frac{\partial \Sigma(N, V, E)}{\partial E}

Although we will not prove it, the entropy \tilde{S}(N, V, E) defined from the uniform ensemble partition function via

    \tilde{S}(N, V, E) = k \ln \Sigma(N, V, E)

is also approximately additive and will yield the condition T_1 = T_2 for two systems in thermal contact. In fact, it differs from S(N, V, E) by a constant of order \ln N, so that one can also define the thermodynamics in terms of \tilde{S}(N, V, E). (For an ideal gas, for example, \Sigma \propto V^N E^{3N/2}, so that \Omega = \partial\Sigma/\partial E = (3N/2E)\Sigma, and apart from constants the two entropies differ only by terms of order \ln N.) In particular, the temperature is given by

    \frac{1}{T} = \left( \frac{\partial \tilde{S}}{\partial E} \right)_{N,V} = k \left( \frac{\partial \ln \Sigma}{\partial E} \right)_{N,V}

II. REVERSIBLE LAWS OF MOTION AND THE ARROW OF TIME

Hamilton's equations of motion,

    \dot{q}_i = \frac{\partial H}{\partial p_i}, \qquad \dot{p}_i = -\frac{\partial H}{\partial q_i}

are invariant under a reversal of the direction of time, t \rightarrow -t. Under such a transformation, the positions and momenta transform according to

    q_i \rightarrow q_i, \qquad p_i = m_i \frac{dq_i}{dt} \rightarrow m_i \frac{dq_i}{d(-t)} = -p_i

Thus,

    \frac{dq_i}{d(-t)} = -\frac{\partial H}{\partial p_i} \quad \text{or} \quad \dot{q}_i = \frac{\partial H}{\partial p_i}
    \frac{d(-p_i)}{d(-t)} = -\frac{\partial H}{\partial q_i} \quad \text{or} \quad \dot{p}_i = -\frac{\partial H}{\partial q_i}

The form of the equations does not change!

One of the implications of time reversal symmetry is as follows: Suppose a system is evolved forward in time starting from some initial condition up to a maximum time t. If, at t, the evolution is stopped, the sign of the velocity of each particle in the system is reversed (i.e., a time reversal transformation is performed), and the system is allowed to evolve once again for another time interval of length t, then the system will return to its original starting point in phase space, i.e., the system will return to its initial condition (see the short numerical sketch below).

Now, from the point of view of mechanics and the microcanonical ensemble, the initial conditions (for the first segment of the evolution) and the conditions created by reversing the signs of the velocities for initiating the second segment are equally valid and equally probable, both being points selected from the constant energy hypersurface. Therefore, from the point of view of mechanics, without a priori knowledge of which segment is the forward evolving trajectory and which is the time reversed trajectory, it should not be possible to say which is which. That is, if a movie of each trajectory were to be made and shown to an ignorant observer, that observer should not be able to tell which is the forward-evolving trajectory and which the time-reversed trajectory. Therefore, from the point of view of mechanics, which obeys time-reversal symmetry, there is no preferred direction for the flow of time.

Yet our everyday experience tells us that there are plenty of situations in which a system seems to evolve in a certain direction and not in the reverse direction, suggesting that there actually is a preferred direction in time. Some common examples are a glass falling to the ground and smashing into tiny pieces, or the sudden expansion of a gas into a large box. These processes would always seem to occur in the same way and never in the reverse: the glass shards never reassemble themselves and jump back onto the table to form an unbroken glass, and the gas particles never suddenly all gather in one corner of the large box.
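Before turning to this apparent conflict, here is the minimal numerical sketch of the velocity-reversal property promised above (it is not part of the original notes): a one-dimensional particle with the illustrative potential U(q) = q^2/2 + q^4/4 is evolved with the velocity Verlet integrator, its momentum is flipped, and it is evolved again for the same time. The potential, integrator settings, and initial condition are all arbitrary choices made only for the demonstration.

import numpy as np

def force(q):
    # F = -dU/dq for the illustrative potential U(q) = q**2/2 + q**4/4
    return -(q + q**3)

def velocity_verlet(q, p, m, dt, nsteps):
    """Integrate Hamilton's equations with the velocity Verlet algorithm."""
    f = force(q)
    for _ in range(nsteps):
        p = p + 0.5 * dt * f          # half kick
        q = q + dt * p / m            # drift
        f = force(q)
        p = p + 0.5 * dt * f          # half kick
    return q, p

m, dt, nsteps = 1.0, 1e-3, 20_000
q0, p0 = 1.3, -0.7                    # arbitrary initial condition

# forward segment of length t = nsteps * dt
q1, p1 = velocity_verlet(q0, p0, m, dt, nsteps)
# time reversal: flip the momentum and evolve again for the same time
q2, p2 = velocity_verlet(q1, -p1, m, dt, nsteps)

print("initial  (q, p) =", (q0, p0))
print("returned (q, p) =", (q2, -p2))   # undo the final momentum flip for comparison

Because the velocity Verlet map is itself time-reversible, the returned point matches the initial one to floating-point accuracy, mirroring the exact reversibility of Hamilton's equations discussed above.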

This seeming inconsistency with the reversible laws of mechanics is known as Loschmidt's paradox. Indeed, the second law of thermodynamics itself would seem to be at odds with the reversibility of the laws of mechanics. That is, the observation that a system naturally evolves in such a way as to increase its entropy cannot obviously be rationalized starting from the microscopic reversible laws of motion.

Note that a system being driven by an external agent or field will not be in equilibrium with its surroundings and can exhibit irreversible behavior as a result of the work being done on it. The falling glass is an example of such a system. It is acted upon by gravity, which drives it uniformly toward a state of ever lower potential energy until the glass smashes to the ground. Even though it is possible to write down a Hamiltonian for this system, the treatment of the external field is only approximate in that the true microscopic origin of the external field is not taken into account. This also raises the important question of how one exactly defines non-equilibrium states in general and how they evolve, a question which, to date, has no fully agreed upon answer and is an active area of research.

However, the expanding gas example does not involve an external driving force and still seems to exhibit irreversible behavior. How can this observation be explained for such an isolated system? One possible explanation was put forth by Boltzmann, who introduced the notion of molecular chaos. Under this assumption, the momenta of two particles do not become correlated as the result of a collision. This is tantamount to the assumption that microscopic information leading to a correlation between the particles is lost. This is not inconsistent with microscopic reversibility from a probabilistic point of view, as the momenta of two particles before a collision are certainly uncorrelated with each other. The assumption of molecular chaos allows one to prove the so-called Boltzmann H-theorem, a theorem that predicts an increase in entropy until equilibrium is reached. For more details, see Chapters 3 and 4 of Huang. Boltzmann's assumption of molecular chaos remains unproven and may or may not be true.

Another explanation, due to Poincaré, is based on his recurrence theorem. The Poincaré recurrence theorem states that a system having a finite amount of energy and confined to a finite spatial volume will, after a sufficiently long time, return to an arbitrarily small neighborhood of its initial state.

Proof of the recurrence theorem (taken almost directly from Huang, pg. 91): Let a state of the system be represented by a point x in phase space. As the system evolves in time, a point in phase space traces out a trajectory that is uniquely determined by any given point on the trajectory (because of the deterministic nature of classical mechanics). Let g_0 be an arbitrary volume element in phase space of volume \omega_0. After a time t, all points in g_0 will be in another volume element g_t of volume \omega_t, which is uniquely determined by the choice of g_0. Assuming the system is Hamiltonian, then by Liouville's theorem:

    \omega_0 = \omega_t

Let \Gamma_0 denote the subspace of phase space that is the union of all g_t for 0 \le t < \infty, and let its volume be \Omega_0. Similarly, let \Gamma_\tau denote the subspace that is the union of all g_t for \tau \le t < \infty, and let its volume be \Omega_\tau.
The numbers \Omega_0 and \Omega_\tau are finite because, since the energy of the system is finite and the spatial volume occupied is finite, a representative point is confined to a finite region in phase space. The definitions immediately imply that \Gamma_0 contains \Gamma_\tau.

We may think of \Gamma_0 and \Gamma_\tau in a different way. Imagine the region \Gamma_0 to be filled uniformly with representative points. As time progresses, \Gamma_0 will evolve into some other region that is uniquely determined. It is clear, from the definitions, that after a time \tau, \Gamma_0 will become \Gamma_\tau. Also, by Liouville's theorem:

    \Omega_0 = \Omega_\tau

Recall that \Gamma_0 contains all the future destinations of the points in g_\tau, which in turn is evolved from g_0 after a time \tau. It has been shown that \Gamma_0 has the same volume as \Gamma_\tau, since \Omega_0 = \Omega_\tau by Liouville's theorem. Therefore, \Gamma_0 and \Gamma_\tau must contain the same set of points (except possibly for a set of zero measure). In particular, \Gamma_\tau contains all of g_0 (except possibly for a set of zero measure). But, by definition, all points in \Gamma_\tau are future destinations of the points in g_0. Therefore, all points in g_0 (except possibly for a set of zero measure) must return to g_0 after a sufficiently long time. Since g_0 can be made arbitrarily small, Poincaré's theorem follows.

Now consider an initial condition in which all the gas particles are initially in a corner of a large box. By Poincaré's theorem, the gas particles must eventually return to their initial state in the corner of the box. How long is this recurrence time? In order to answer this, consider dividing the box up into M small cells of volume v. The total number of microstates available to the gas varies with N like V^N, while the number of microstates corresponding to all the particles occupying a single cell of volume v varies like v^N. Thus, the probability of observing the system in such a microstate is approximately (v/V)^N. Even if v = V/2, for N \sim 10^{23} this probability is vanishingly small. The Poincaré recurrence time, on the other hand, is proportional to the inverse of this probability, i.e., to (V/v)^N. Again, since N \sim 10^{23}, if v = V/2, the required time is

    t \sim 2^{10^{23}}

which is many orders of magnitude longer than the current age of the universe. Thus, although the system will return arbitrarily close to its initial state, the time required for this is unphysically long and the recurrence will never be observed. Over times relevant for observation, given similar such initial conditions, the system will essentially always evolve in the same way, which is to expand and fill the box, and essentially never in the opposite direction.
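For concreteness, the following rough sketch (not part of the original notes) evaluates the probability (v/V)^N and the recurrence-time estimate (V/v)^N in base-10 logarithms for the case v = V/2 discussed above; the list of particle numbers is chosen purely for illustration.

import math

v_over_V = 0.5                       # v = V/2, the case considered in the text
for N in (10, 100, 1_000, 10**23):
    log10_prob = N * math.log10(v_over_V)     # log10 of the probability (v/V)**N
    log10_trec = -log10_prob                  # log10 of the recurrence-time estimate (V/v)**N
    print(f"N = {N:10.3e}   (v/V)^N = 10^({log10_prob:.3g})   (V/v)^N = 10^({log10_trec:.3g})")

Already for N = 1000 the estimated recurrence time has about 300 digits, and for N \sim 10^{23} it has roughly 3 \times 10^{22} digits, so whatever time unit is attached to it, it exceeds the age of the universe (roughly 4 \times 10^{17} s) by an absurd margin, which is the point of the estimate above.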