MASSACHUSETTS INSTITUTE OF TECHNOLOGY
Physics Department
8.044 Statistical Physics I, Spring Term 2013
Notes on the Microcanonical Ensemble


The object of this endeavor is to impose a simple probability density on the phase space, classical or quantum, of a many-particle system and to use it to obtain both microscopic and macroscopic information about the system. The microscopic information includes the probability densities for individual microscopic coordinates or states. The macroscopic information will consist of a statistical mechanical definition of the temperature, the second law of thermodynamics, and all the necessary equations of state.

1. The System

The system we will consider consists of N particles in a volume V, subject to enough mechanical and electromagnetic constraints to specify the thermodynamic state of the system when combined with the additional constraint that the total energy of the system is restricted to a very narrow range of width Δ above a reference energy E:

    E < energy ≤ E + Δ

The number of items in the list of fixed quantities, including E, is the number of independent macroscopic variables. For simplicity we will carry along only three in most of these notes: E, N, and V.

    Fixed:  E,  V,  N,  (M or H),  (P or E)

2. The Probability Density

Here we choose the simplest of all possible forms (at least conceptually) for the probability density: a constant for all of the accessible states of the system. The accessible states are those microscopic states of the system consistent with the constraints (E, V, N, ...). This choice is known as the postulate of equal a priori probabilities. It is in fact the fundamental basis of all of statistical mechanics.

Classical version:

    p({p, q}) = 1/Ω    if E < H({p, q}) ≤ E + Δ
              = 0      elsewhere

    Ω = ∫_accessible {dp, dq} = Ω(E, V, N)

Quantum version:

    p(k) = 1/Ω    if E < ⟨k|H|k⟩ ≤ E + Δ
         = 0      elsewhere

    Ω = Σ_{k accessible} (1) = Ω(E, V, N)

In the quantum case Ω is dimensionless: it is the total number of microscopic states accessible to the system. Classically, Ω is the accessible volume of phase space. It can be made dimensionless by dividing by h^m, where m is the number of canonically conjugate momentum-coordinate pairs (p, q) in the phase space. In most of what follows the classical version will be employed.

Microscopic information is obtained by integrating the unwanted variables out of the joint probability density p({p, q}). For example, if one wants the probability density for a single coordinate q_i,

    p(q_i) = ∫ p({p, q}) {dp, dq}|_{q_i fixed}
           = (1/Ω) ∫ {dp, dq}|_{q_i fixed}
           = Ω′(all but the q_i axis) / Ω
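The counting above can be made concrete with a small numerical sketch (the two-level toy model here is an assumed illustration, not taken from these notes). Using the quantum version, we enumerate the microstates in an energy shell, assign each the uniform probability 1/Ω, and recover a single-particle marginal as Ω′/Ω:

```python
from itertools import product

# Hypothetical toy model: N two-level particles, each with energy 0 or
# eps; the "energy shell" is the set of microstates whose total energy
# equals E exactly.
N, eps, E = 4, 1, 2

# Accessible microstates: all occupation patterns with total energy E.
accessible = [s for s in product((0, 1), repeat=N) if eps * sum(s) == E]
Omega = len(accessible)          # quantum version: Omega is a pure count

# Postulate of equal a priori probabilities: p(k) = 1/Omega per state.
p = 1.0 / Omega

# Marginal probability that particle 0 is excited: sum p over the
# accessible states consistent with that condition (Omega'/Omega).
p_excited = sum(p for s in accessible if s[0] == 1)

print(Omega)       # C(4,2) = 6 states in the shell
print(p_excited)   # Omega'/Omega = 3/6 = 0.5
```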

For a more complex state of the system, X, involving specification of a subset {p′, q′} of the microscopic variables,

    p(X) = ∫_{consistent with X} p({p, q}) {dp, dq}
         = (1/Ω) ∫_{consistent with X} {dp, dq}
         = Ω′(consistent with X) / Ω
         = (volume of accessible phase space consistent with X) / (total volume of accessible phase space)

We will see later that the thermodynamic information about the system is obtained from the dependence of Ω on the constraints, Ω(E, V, N).

3. Quantities Related to Ω

    Φ(E, V, N) ≡ ∫_{H({p,q}) < E} {dp, dq}   = cumulative volume in phase space

    ω(E, V, N) ≡ ∂Φ(E, V, N)/∂E              = density of states as a function of energy

    Ω(E, V, N) = ω(E, V, N) Δ
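The relation ω = ∂Φ/∂E can be checked numerically. As an illustration (the closed form below is a standard result for free particles, assumed here rather than derived in these notes), Φ for N free particles is V^N times the volume of a 3N-dimensional momentum sphere of radius √(2mE):

```python
import math

# Assumed standard form: Phi(E) = V**N * (2*pi*m*E)**(3N/2) / Gamma(3N/2 + 1),
# the cumulative phase-space volume for N free particles in volume V.
def Phi(E, V=1.0, N=2, m=1.0):
    d = 3 * N                              # dimension of momentum space
    return V**N * (2 * math.pi * m * E)**(d / 2) / math.gamma(d / 2 + 1)

E = 1.5
# Density of states omega = dPhi/dE, via a central finite difference.
h = 1e-6
omega_numeric = (Phi(E + h) - Phi(E - h)) / (2 * h)

# Analytic check: for this power law, omega = (3N / 2E) * Phi(E), N = 2.
omega_exact = (3 * 2 / (2 * E)) * Phi(E)

print(abs(omega_numeric - omega_exact) / omega_exact < 1e-6)  # True
```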

4. Entropy

In general Ω increases exponentially with the size of the system. It is convenient to work with an extensive measure of the phase space volume, one which is additive as two systems are brought together in mutual equilibrium. A logarithmic measure of Ω satisfies this criterion:

    S(E, V, N) ≡ k ln Ω(E, V, N) ≈ k ln Φ(E, V, N) ≈ k ln ω(E, V, N)

The two approximate expressions hold because for large N the error can be shown to be of order ln N, while the retained term is proportional to N. S(E, V, N) is called the entropy. It is a state function. It is extensive. It is a logarithmic measure of the microscopic degeneracy associated with a macroscopic (that is, thermodynamic) state of the system. k is Boltzmann's constant, with units of energy per kelvin.

5. Statistical Mechanical Definition of Temperature

Bring together two arbitrary systems 1 and 2, each represented by its own microcanonical ensemble and therefore having well-defined phase space volumes Ω₁ and Ω₂. While isolated from the rest of the world, they are allowed to interact thermally, đQ₁ = −đQ₂, but not mechanically, đW₁ = đW₂ = 0. The interaction is weak enough that the two systems maintain their identities; thus the individual phase space volumes (and the microscopic variables on which they are defined) still make sense. Since the sum of the two systems is isolated, it too can be represented by its own microcanonical ensemble with phase space volume Ω. Since Ω must take into account all the possible ways the total energy E can be divided between the two subsystems, Ω(E) can be expressed in terms of their individual phase space volumes as follows:

    Ω(E) = ∫₀^E Ω₁(E′) Ω₂(E − E′) dE′
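The reason for the logarithm is worth a one-line check: when two weakly interacting systems are combined, any state of one can pair with any state of the other, so Ω multiplies while S adds. A minimal sketch (the particular values of Ω₁ and Ω₂ are made-up toy numbers):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

# Toy counts, for illustration only: two weakly interacting systems
# with Omega1 and Omega2 accessible states respectively.
Omega1, Omega2 = 10**20, 10**30

# The joint system pairs any state of 1 with any state of 2, so the
# total count is the product Omega1 * Omega2.
S1 = k * math.log(Omega1)
S2 = k * math.log(Omega2)
S_total = k * math.log(Omega1 * Omega2)

# The logarithm turns the product into a sum: entropy is additive.
print(math.isclose(S_total, S1 + S2))  # True
```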

Consider the following reasonable question: what is the most probable value of E₁ when equilibrium has been reached? We can determine this from p(E₁):

    p(E₁) = Ω₁(E₁) Ω₂(E − E₁) / Ω(E)

Note that this expression is consistent with the normalization of p(E₁):

    ∫₀^E p(E₁) dE₁ = (1/Ω(E)) ∫₀^E Ω₁(E₁) Ω₂(E − E₁) dE₁ = Ω(E)/Ω(E) = 1

Now find where p(E₁) has its maximum by finding where its derivative vanishes:

    0 = d/dE₁ [Ω₁(E₁) Ω₂(E − E₁)]
      = Ω₂(E − E₁) dΩ₁(E₁)/dE₁ + Ω₁(E₁) dΩ₂(E − E₁)/dE₁

Writing E₂ ≡ E − E₁, so that d/dE₁ acting on Ω₂ becomes −d/dE₂,

    Ω₂(E₂) dΩ₁(E₁)/dE₁ = Ω₁(E₁) dΩ₂(E₂)/dE₂

    (1/Ω₁(E₁)) dΩ₁(E₁)/dE₁ = (1/Ω₂(E₂)) dΩ₂(E₂)/dE₂

    d ln Ω₁(E₁)/dE₁ = d ln Ω₂(E₂)/dE₂

Thus the maximum of p(E₁) occurs when

    (∂S₁/∂E₁)_{đW₁=0} = (∂S₂/∂E₂)_{đW₂=0}

Solving would give the most probable value of E₁. More important is the fact that this expression specifies the equilibrium condition. Therefore we define the temperature through

    (∂S/∂E)_{đW=0} = f(T) ≡ 1/T    (in equilibrium)
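The maximization above can be checked numerically. Assuming (purely for illustration; the notes only require Ω to grow rapidly with E) the power-law form Ωᵢ(E) = E^νᵢ, the condition d ln Ω₁/dE₁ = d ln Ω₂/dE₂ gives ν₁/E₁ = ν₂/E₂, i.e. E₁ = ν₁/(ν₁+ν₂) · E:

```python
import math

# Assumed power-law form Omega_i(E) = E**nu_i (nu = 3N/2 for an ideal gas).
nu1, nu2, E_total = 3.0, 6.0, 1.0

def log_p(E1):
    # log of Omega1(E1) * Omega2(E - E1), the unnormalized p(E1)
    return nu1 * math.log(E1) + nu2 * math.log(E_total - E1)

# Locate the maximum of p(E1) by a brute-force scan over a fine grid.
grid = [i / 10000 for i in range(1, 10000)]
E1_star = max(grid, key=log_p)

# Analytic result from equal d ln Omega / dE on both sides.
E1_exact = nu1 / (nu1 + nu2) * E_total

print(abs(E1_star - E1_exact) < 1e-3)  # True
```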

The specific choice of f(t ) agrees with the T defined empirically by the ideal gas law, agrees with the T defined thermodynamically by the efficiency of Carnot cycles. 6. Two Fundamental Inequalities Consider the two systems in section 5. when they are not necessarily at the same temperature before contact. Let E 1 be the energy of system 1 before the contact is made and E 1 be its energy after the combined system has reached mutual equilibrium. When contact is first made the following relationship holds between the probabilities of finding system 1 at those energies; the equality only holds if the two systems were in equilibrium before contact, which would require E 1 = E 1. p(e 1 ) p(e 1 ) Ω 1 (E 1 )Ω 2 (E E 1 ) Ω 1 (E 1 )Ω 2 (E E 1 ) Ω 1 (E 1 ) Ω 2 (E E 1 ) 1 Ω1 (E 1 ) Ω 2 (E E 1 ) 0 S } 1(E 1 ) S {{ 1 (E 1 } ) + S } 2(E E 1 ) S 2 (E E 1 ) {{} S 1 S 2 But entropy is additive, so the entropy change for the entire system is S = Thus we have found the important result S 1 + S 2. S 0 for spontaneous changes in an isolated system Now consider the special case where system 2 is so large compared to system 1 that the heat added to it, d/q 2, does not change its temperature. System 2 is then referred to as a temperature bath or thermal reservoir and T 2 T bath. This assumption allows us to find an explicit relation for small changes in the entropy of system 2: de 2 ds 2 = statistical definition of temperature T 2 d/q 2 = since no work is done T 2 d/q 1 = T bath 6

We can now rewrite the inequality derived above as it applies to this special case:

    0 ≤ dS₁ + dS₂ = dS₁ − đQ₁/T_bath

From this we conclude that

  • dS₁ ≥ đQ₁ / T_bath when exchanging heat with a reservoir.

The two inequalities indicated by bullets, taken together, form the second law of thermodynamics.

7. Entropy as a Thermodynamic Variable

The work done on a system is given by the expression

    đW = −P dV + S dA + H dM + E dP + F dL + ··· ≡ Σᵢ Xᵢ dxᵢ

Here, for convenience, we have introduced the notation of a generalized force Xᵢ conjugate to some generalized external parameter xᵢ. In the microcanonical ensemble the energy is fixed at E. In general the internal energy of a system is the average of the energy over all the accessible microstates, so in this case U is identical to E and we can write the first law as

    dE = đQ + đW

Now we examine the consequences of the statistical mechanical definition of entropy:

    S ≡ k ln Ω = S(E, V, M, ...)    [the constraints used when computing Ω form a complete set of independent thermodynamic variables]
               = S(E, {xᵢ})         [where we have chosen to use the xᵢ as the constraints]

The statistical mechanical expression for the entropy can be inverted to give E as a function of S:

    S(E, V, M, ...) → E(S, V, M, ...)
    S(E, {xᵢ})      → E(S, {xᵢ})

From the first law,

    dE|_{đW=0} = đQ

Therefore, utilizing the second law [the equality only holds for equilibrium (reversible) changes],

    dE|_{đW=0} ≤ T dS

    dE = đQ + đW ≤ T dS + đW = T dS − P dV + S dA + H dM + E dP + F dL + ···

The final line above is a fundamental result which expresses the combined first and second laws of thermodynamics. We continue our development by solving this expression for the differential of the entropy:

    dS ≥ (1/T) dE − (1/T) đW = (1/T) dE − (1/T) Σᵢ Xᵢ dxᵢ
       = (1/T) dE + (P/T) dV − (S/T) dA − (H/T) dM − (E/T) dP − (F/T) dL − ···

This expression leads to two fundamental results. The first is a restatement of the second law of thermodynamics: the only changes which can occur in an isolated system (dE = dxᵢ = 0) are those which increase the entropy or, at best, leave it unchanged.

The second result is that the connection between statistical mechanics and thermodynamics in the microcanonical ensemble is made through the entropy. In particular, if one knows the entropy as a function of the constraints, S(E, {xᵢ}), or equivalently the phase space volume Ω(E, {xᵢ}), then one can find expressions for the generalized forces in terms of those same variables by taking the appropriate derivative. The resulting expressions are the equations of state for the system. In particular,

    Xⱼ/T = −(∂S/∂xⱼ)_{E, xᵢ≠ⱼ}

    P/T  =  (∂S/∂V)_{E, M, P, ...}

    H/T  = −(∂S/∂M)_{E, V, P, ...}

    E/T  = −(∂S/∂P)_{E, V, M, ...}
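As a closing sketch, these derivatives can be evaluated numerically for a concrete S(E, {xᵢ}). Assuming the ideal-gas (Sackur-Tetrode) entropy with k set to 1 and additive constants dropped (an illustrative choice, not derived in these notes), the derivatives recover the familiar equations of state E = (3/2)NkT and PV = NkT:

```python
import math

N = 100.0

# Assumed ideal-gas entropy, S(E, V) = N*(log(V/N) + 1.5*log(E/N)),
# with k = 1 and additive constants dropped (Sackur-Tetrode form).
def S(E, V):
    return N * (math.log(V / N) + 1.5 * math.log(E / N))

E, V, h = 150.0, 2.0, 1e-6

# 1/T = (dS/dE)_V and P/T = (dS/dV)_E, by central finite differences.
inv_T = (S(E + h, V) - S(E - h, V)) / (2 * h)
P_over_T = (S(E, V + h) - S(E, V - h)) / (2 * h)

T = 1.0 / inv_T
P = P_over_T * T

print(math.isclose(E, 1.5 * N * T, rel_tol=1e-6))   # E = (3/2) N k T
print(math.isclose(P * V, N * T, rel_tol=1e-6))     # P V = N k T
```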

MIT OpenCourseWare http://ocw.mit.edu 8.044 Statistical Physics I Spring 2013 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.