Chapter 3. Entropy, temperature, and the microcanonical partition function: how to calculate results with statistical mechanics.


The goal of equilibrium statistical mechanics is to calculate the density matrix (probability) $\hat{\rho}_{eq}$ so we can evaluate average observables $\langle A \rangle = \mathrm{Tr}\{\hat{A}\hat{\rho}_{eq}\}$ that give us fundamental relations or equations of state. Just as thermodynamics has its potentials U, H, F (often also called A), and G, so statistical mechanics has its ensembles. So far we have considered only the microcanonical ensemble. The microcanonical ensemble is the one directly defined in postulate II of statistical mechanics. In the microcanonical ensemble U is fixed (Postulate I), and the other constraints that are fixed are the volume V and mole number n (for a simple system), or other extensive parameters (for more complicated systems).

1. Useful properties of U and S, and of a new variable: the temperature T

a. Monotonicity of S(U) and U(S)

We saw in chapter 2 that $S = k_B \ln W(U,V,n,\ldots)$, and that W increases monotonically with U: the higher the energy, the more microstates are accessible to the system. The logarithm is also a monotonically increasing function, so S(U,V,n) is a monotonically increasing function of U: as the energy goes up, so does the entropy (or disorder) of the system, because more microstates become available. More microstates means more disorder; a single microstate means complete order (S = 0), as we saw in chapter 2. We can thus invert the function uniquely, and U(S,V,n) is a monotonically increasing function of S: the more disorder a system has, the higher its energy must be:

$$\left(\frac{\partial U}{\partial S}\right)_{V,n,\ldots} \geq 0.$$

b. Derivatives of the energy and entropy

If the energy changes a little bit, we have the differential change

$$dU = \left(\frac{\partial U}{\partial S}\right)_{V,n} dS + \left(\frac{\partial U}{\partial V}\right)_{S,n} dV + \left(\frac{\partial U}{\partial n}\right)_{S,V} dn + \ldots = T\,dS + (-P)\,dV + \mu\,dn + \ldots$$

The derivatives are intensive variables (independent of system size). Solving for dS we also have

$$dS = \frac{1}{T}\,dU + \frac{P}{T}\,dV - \frac{\mu}{T}\,dn + \ldots$$

so

$$\left(\frac{\partial S}{\partial U}\right)_{V,n} = \frac{1}{T}, \qquad \left(\frac{\partial S}{\partial V}\right)_{U,n} = \frac{P}{T}, \quad \text{etc.}$$
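Though T, P, and µ will only be identified with temperature, pressure, and chemical potential below, the derivative identities can already be tried out numerically. The following sketch is not part of the notes: it assumes the monatomic ideal-gas entropy $S = N k_B[\ln(V/N) + \tfrac{3}{2}\ln(U/N)] + \text{const}$, quoted here without derivation and in units where $k_B = 1$, and checks by finite differences that $(\partial S/\partial U) = 1/T$ reproduces $U = \tfrac{3}{2}N k_B T$ and that $(\partial S/\partial V) = P/T$ reproduces $PV = N k_B T$.

```python
import math

# A model fundamental relation S(U, V, N) for a monatomic ideal gas,
# with k_B = 1 and additive constants dropped (an assumption for
# illustration only; the constants do not affect the derivatives).
def S(U, V, N):
    return N * (math.log(V / N) + 1.5 * math.log(U / N))

U, V, N = 2.0, 5.0, 1.0
h = 1e-6

# (dS/dU)_{V,N} = 1/T  ->  T = U / (1.5 N), i.e. U = (3/2) N k_B T
dS_dU = (S(U + h, V, N) - S(U - h, V, N)) / (2 * h)
T = 1.0 / dS_dU
print(T, U / (1.5 * N))   # the two agree

# (dS/dV)_{U,N} = P/T  ->  P = N k_B T / V (the ideal gas law)
dS_dV = (S(U, V + h, N) - S(U, V - h, N)) / (2 * h)
P = T * dS_dV
print(P, N * T / V)       # the two agree
```

The same two-sided finite difference works for any fundamental relation S(U,V,n) one can write down explicitly.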
Note that we are giving these derivatives names, like T and P and µ, but we usually reserve those names for temperature, pressure, and chemical potential. Are they actually the temperature, pressure, and chemical potential? Let's find out.

What is this quantity T? Consider a closed composite system {Sys} made of two subsystems {Sys 1} and {Sys 2} separated by a diathermal wall. A diathermal wall allows only heat flow, so dV = 0 and dn = 0. At equilibrium, the number of microstates of the composite system {Sys} is not changing, so

$$dS = 0 = \left(\frac{\partial S_1}{\partial U_1}\right) dU_1 + \left(\frac{\partial S_2}{\partial U_2}\right) dU_2 = \frac{1}{T_1}\,dU_1 + \frac{1}{T_2}\,dU_2.$$

But dU = 0 for a closed system because of energy conservation, from which it follows that $dU_2 = -dU_1$, or

$$0 = \left(\frac{1}{T_1} - \frac{1}{T_2}\right) dU_1 \quad \text{at equilibrium.}$$

This equation can only be true if

$$\frac{1}{T_1} - \frac{1}{T_2} = 0 \quad\Rightarrow\quad T_1 = T_2.$$

Thus, T is the intensive quantity that is equalized between two subsystems when heat is allowed to flow between them. This is the definition of temperature: the thing that becomes equal when heat stops flowing between two places. We can thus identify the intensive variable T as the temperature of the system. Temperature is always guaranteed to be positive because we saw in a. that U is a monotonically increasing function of S and $T = (\partial U/\partial S)$.

Exercise: Use the same logic to prove that P must be the pressure. Pressure is the quantity that is equalized when two systems connected by a movable wall reach equilibrium by changing their volumes. Note that the derivative $\partial U/\partial V$ is defined as $-P$ to agree with our intuition: when the volume decreases (dV is negative) because we squeeze on the system, its energy increases (dU is positive) because we are doing work on the system.

Now prove that µ, the chemical potential, is the quantity that is equalized between two subsystems when particles flow back and forth and their particle numbers come to equilibrium. You could go on proving such relationships for other intensive (derivative) variables.

c. An explicit formula for the energy

In b., we gave a formula for the energy change. Can we derive one for the energy U(S,V,n,...) explicitly? Since U, S, V, n,... are all extensive quantities, it must be true for any positive multiplier λ that

$$U(\lambda S, \lambda V, \lambda n, \ldots) = \lambda\,U(S,V,n,\ldots).$$

This simply states that if I increase the size of the system by λ, so S → λS and so forth, the new energy must equal λU. Taking the derivative with respect to λ on both sides,

$$\frac{\partial U}{\partial(\lambda S)}\frac{\partial(\lambda S)}{\partial \lambda} + \frac{\partial U}{\partial(\lambda V)}\frac{\partial(\lambda V)}{\partial \lambda} + \ldots = U(S,V,\ldots)$$

or

$$\frac{\partial U}{\partial(\lambda S)}\,S + \frac{\partial U}{\partial(\lambda V)}\,V + \ldots = U(S,V,\ldots).$$

This must hold for λ = 1 in particular, so

$$\left(\frac{\partial U}{\partial S}\right) S + \left(\frac{\partial U}{\partial V}\right) V + \ldots = U(S,V,\ldots) \quad\Rightarrow\quad TS - PV + \mu n + \ldots = U(S,V,\ldots).$$

The energy is simply equal to U = TS − PV + µn + ... for any system. This very simple functional form is known as bilinearity, or Euler's form of the energy in terms of intensive and extensive thermodynamic state functions.

d. Thermodynamic potentials equivalent to W, S and U

We started with postulate II about W(U,V,n,...), and showed that the functions S(U,V,n,...) and U(S,V,n,...) contain the same information. Yet other thermodynamic potentials can be devised that also contain the same information as U. Consider a function y(x). We could calculate its slope m(x) = ∂y/∂x and its intercept ϕ = y(x) − mx, and then express the intercept as a function of the slope, ϕ(m) = y(x(m)) − m·x(m), by solving m(x) for x(m) and inserting. The function ϕ(m) contains the same information as y(x) and is called the Legendre transform of y(x).

Let us apply this to y = U, x = V and m = ∂U/∂V = −P, and let's call the resulting intercept H:

$$H(S,P,n,\ldots) = U + PV.$$

We can also easily see the change of variables by using the chain rule on the differentials:

$$H = U + PV \;\Rightarrow\; dH = dU + P\,dV + V\,dP = T\,dS - P\,dV + \mu\,dn + \ldots + P\,dV + V\,dP = T\,dS + V\,dP + \mu\,dn + \ldots$$

Clearly H is now a function of S, P,... instead of S, V,.... It contains information equivalent to U because we can always transform back from H(P) to U(V). Similarly, we could pick S as our variable, and have

$$A(T,V,n,\ldots) = U - TS \quad\Rightarrow\quad dA = -S\,dT - P\,dV + \mu\,dn + \ldots$$

A is called the Helmholtz potential. It is also equivalent to U. We could even do it twice in a row, first for S and then for V, getting

$$G(T,P,n,\ldots) = U - TS + PV = H - TS = A + PV \quad\Rightarrow\quad dG = -S\,dT + V\,dP + \mu\,dn + \ldots$$

G is called the Gibbs free energy. It is a function of T and P, but it contains knowledge equivalent to U that can be traced back to S and ultimately to W.
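The slope/intercept construction is easy to test on a toy function. The sketch below is an illustration, not from the notes: it Legendre-transforms y(x) = x², for which the exact transform is ϕ(m) = −m²/4, and inverts the slope equation m(x) = y′(x) numerically by bisection, mimicking the step "solving m(x) for x(m)".

```python
# Legendre transform of y(x) = x^2: slope m(x) = 2x, so x(m) = m/2 and
# phi(m) = y(x(m)) - m*x(m) = m^2/4 - m^2/2 = -m^2/4.
def y(x):
    return x * x

def legendre(y, m, h=1e-6):
    # Invert m(x) = y'(x) by bisection; y' must be increasing (convex y).
    lo, hi = -100.0, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        slope = (y(mid + h) - y(mid - h)) / (2 * h)  # numerical y'(mid)
        if slope < m:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    return y(x) - m * x       # the intercept phi(m)

for m in (-2.0, 0.5, 3.0):
    print(legendre(y, m), -m * m / 4)   # the two columns agree
```

With y = U and x = S (slope T), the same construction yields the Helmholtz potential A = U − TS; with x = V (slope −P) it yields H.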
Using the explicit formula for the energy from c., applied to a system with several components of mole numbers $n_i$, we can see that G has a particularly simple form:

$$U = TS - PV + \sum_i \mu_i n_i \quad\text{and}\quad G = U - TS + PV \quad\Rightarrow\quad G = \sum_i \mu_i n_i.$$

The Gibbs free energy is just the sum over the chemical potentials $\mu_i = (\partial U/\partial n_i)_{S,V}$ weighted by the mole numbers. Based on the above equation, evidently we also have $\mu_i = (\partial G/\partial n_i)_{T,P}$. We will find the Gibbs free energy very useful in chapter 5 when we want to know the equilibrium concentrations of reactants and products in a chemical reaction at constant temperature and pressure.

e. Relation between entropy change, heat flow, and temperature

Here is one more useful formula involving the entropy and temperature. If our system is not allowed to exchange particles (dn = 0) and the volume is fixed (dV = 0), no work can be done on it. The only way to change its energy is by adding heat. Therefore

$$dq_{V,n} = dU = T\,dS - P\,dV + \mu\,dn = T\,dS \quad\text{or}\quad dS = \frac{dq_{V,n}}{T}.$$

Exercise: You already learned about the enthalpy H = U + PV: it is the heat flow when pressure is constant (dn = 0 and dP = 0 instead of dn = 0 and dV = 0). Using dH = dU + P dV + V dP, write down a formula like the one above relating S, $q_{P,n}$ and T when pressure is constant.

f. Temperature dependence of the energy, enthalpy and entropy

The Helmholtz and Gibbs free energies are already naturally functions of temperature. U, S and H are not, but frequently it is useful to know how they depend on temperature. For example,

$$\left(\frac{\partial U}{\partial T}\right)_{V} = C_V(T) \;\Rightarrow\; dU = C_V(T)\,dT, \qquad \left(\frac{\partial H}{\partial T}\right)_{P} = C_P(T) \;\Rightarrow\; dH = C_P(T)\,dT$$

$$dS = dq/T = dH/T \;\text{(at constant } P\text{)} \;\Rightarrow\; dS = C_P(T)\,dT/T$$
$$dS = dq/T = dU/T \;\text{(at constant } V\text{)} \;\Rightarrow\; dS = C_V(T)\,dT/T$$

Exercise: If $S = n s^{(0)}$ at concentration $c^{(0)} = 1$ and at the standard reference temperature $T^{(0)}$ (for example 298 K), show that at some other temperature T,

$$S = n s^{(0)} + \int_{T^{(0)}}^{T} \frac{C_P(T')}{T'}\,dT'.$$

Derive a similar formula for H.

So we need to know what the two functions $C_V$ and $C_P$ are. They tell us how much U or H increases when the temperature of our system increases. $C_V$ is related to a second derivative: $(\partial^2 U/\partial S^2)_{V,n} = (\partial T/\partial S)_{V,n}$, since $T = (\partial U/\partial S)_{V,n}$. Also $(\partial S/\partial T) = (\partial T/\partial S)^{-1}$. But we showed above that $(\partial S/\partial T) = C_V/T$, so $C_V = T(\partial S/\partial T) = T\,(\partial^2 U/\partial S^2)^{-1}$. By Legendre transforming from U to H so we have a function of pressure, we can play the same game with the second derivative $(\partial^2 H/\partial S^2)_{P,n}$ to derive the analogous formula for $C_P$. The important concept here is that heat capacities are (inverse) second derivatives of the energy or enthalpy, because the temperature in $\partial U/\partial T$ or $\partial H/\partial T$ is already a first derivative.
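The exercise above amounts to a one-dimensional integral that is easy to do numerically. The sketch below uses hypothetical numbers and assumes a constant $C_P = \tfrac{5}{2}nR$ (the monatomic ideal-gas value), so the integral has the closed form $\tfrac{5}{2}nR\ln(T/T^{(0)})$ to check against.

```python
import math

# Entropy change from the heat-capacity integral dS = C_P(T) dT / T,
# evaluated with a simple trapezoid rule. Assumption (for illustration):
# a monatomic ideal gas with constant C_P = (5/2) n R, so the exact
# answer is (5/2) n R ln(T1/T0).
n, R = 1.0, 8.314          # mol, J/(mol K)
T0, T1 = 298.0, 500.0      # K

def C_P(T):
    return 2.5 * n * R     # J/K, temperature-independent by assumption

steps = 10000
dT = (T1 - T0) / steps
dS = sum(0.5 * (C_P(T0 + i * dT) / (T0 + i * dT)
                + C_P(T0 + (i + 1) * dT) / (T0 + (i + 1) * dT)) * dT
         for i in range(steps))
exact = 2.5 * n * R * math.log(T1 / T0)
print(dS, exact)           # numerical and closed-form results agree
```

For a real substance one would replace C_P(T) with tabulated or fitted heat-capacity data; the integration step is unchanged.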
Note that the analysis of second derivatives is of interest if we want to prove whether a quantity is minimized or maximized, since a vanishing first derivative only tells us that one or the other is happening.

2. Calculation of thermodynamic quantities from W(U) and S(U)

Example 1: Fundamental relation for a lattice gas model: entropy-volume part.

Consider again the model system of a box with $M = V/V_0$ volume elements of size $V_0$, and N particles of volume $V_0$, so each particle can fill one volume element. The particles can randomly hop among unoccupied volume elements to randomly sample the full volume of the box. This is a simple model of an ideal gas. As shown in the last chapter,

$$W = \frac{M!}{(M-N)!\,N!}$$

for identical particles, and we can approximate this, if N << M, by

$$W \approx \frac{1}{N!}\left(\frac{V}{V_0}\right)^{N}$$

since $M!/(M-N)! \approx M^N$ in that case. Assuming the hopping samples all microstates so the system is at equilibrium, we compute the equilibrium entropy, as derived in chapter 2:

$$S = k_B \ln W \approx -k_B N \ln N + k_B N + k_B N \ln(V/V_0)$$
$$= k_B N\,(1 - \ln V_0) - k_B N \ln(N/V)$$
$$= nR\,(1 - \ln V_0) - nR \ln(nN_A/V)$$
$$= n s_0 - nR \ln(n/V) = n s_0 - nR \ln(c)$$

In the first line, we used the definition of S in terms of W and Stirling's approximation for the factorial, $\ln N! \approx N\ln N - N$. In the second line, we rearranged terms. In the third line, we converted from particle number N to mole number n: $N = nN_A$, where $N_A$ is Avogadro's number, and $R = k_B N_A \approx 8.31$ Joules/(mole·Kelvin). This constant is known as the gas constant, and you can see that it is just Boltzmann's constant times Avogadro's number. Both $k_B$ and R are just conversion factors between energy and temperature because 150 years ago, scientists did not yet understand that temperature is just the average energy per active degree of freedom, and should really have units of Joules! In the last line, we collected terms into ones that are constant (lumped into $s_0$) and ones that depend on the concentration c = n/V.

Thus the entropy of a randomly moving gas (and also of an ideal solution, where solutes do not interact with one another) depends on the logarithm of the concentration of the particles.

Exercise: In the above formula, $s_0$ in the last line lumps together all the constants. What is $s_0$? Could $s_0$ become negative? Should it be able to, if we worked with the exact equation?

Exercise: By taking the derivative $(\partial S/\partial V)$ of the above formula, and comparing to the derivative $(\partial S/\partial V)$ defined in b., derive a familiar law that is obeyed by particles randomly moving around in a box.
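How much error do the Stirling and dilute-limit approximations introduce? A quick numerical check (a sketch with illustrative numbers, entropy in units of $k_B$) compares the exact $\ln W$, evaluated with log-Gamma functions to avoid overflow, to the approximate form $\ln W \approx N(1 + \ln(M/N))$ that follows from the derivation above:

```python
import math

# Exact ln W for the lattice gas, W = M!/((M-N)! N!), via log-Gamma
# functions (lgamma(k+1) = ln k!), versus the dilute-limit Stirling
# result ln W ~ N (1 + ln(M/N)) valid for N << M.
def lnW_exact(M, N):
    return math.lgamma(M + 1) - math.lgamma(M - N + 1) - math.lgamma(N + 1)

def lnW_approx(M, N):
    return N * (1.0 + math.log(M / N))

M, N = 10**6, 10**3            # dilute occupation: N/M = 0.1%
rel_err = abs(lnW_exact(M, N) - lnW_approx(M, N)) / lnW_exact(M, N)
print(lnW_exact(M, N), lnW_approx(M, N))
print(rel_err)                 # small relative error at this dilution
```

Increasing the filling fraction N/M makes the dilute-limit formula visibly worse, which is a useful way to see where the approximation holds.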
Now you see the power of statistical mechanics: we assume energy is conserved (Postulate I) and that all microstates of the same energy have the same probability (Postulate II), and by assuming a system consists of particles randomly hopping around in a box and taking some derivatives, we derive the ideal gas law!

Example 2: A system of uncoupled spins $s_z = \pm 1/2$

The Hamiltonian for a system of N spins in a magnetic field is given by

$$\hat{H} = B\sum_{j=1}^{N} s_{zj} + \frac{NB}{2},$$

where the extra term at the end is added so the energy equals zero when all the spins are pointing down (the ground state). At energy U = 0, no spin is excited. For each excited

spin, the energy increases by B, so at energy U, U/B spins are excited. These U/B excitations are indistinguishable and can be distributed among N sites:

$$W(U) = \frac{N!}{\left(\frac{U}{B}\right)!\left(N - \frac{U}{B}\right)!} = \frac{\Gamma(N+1)}{\Gamma\!\left(\frac{U}{B}+1\right)\Gamma\!\left(N - \frac{U}{B}+1\right)}.$$

This is our usual formula for permutations; the right side is in terms of Gamma functions, which are defined even when U/B is not an integer. Gamma functions basically interpolate the factorial function for noninteger values.

Exercise: Plot this formula from U = 0 to U = NB. Since S ~ ln(W), do you see a problem?

This formula has a potential problem built in: clearly, when U starts out at 0 and then increases, W initially increases. But for U = NB (the maximum energy), W = 1 again. In fact, W reaches its maximum at U = NB/2. But if W(U) is not monotonic in U, then S isn't either, violating the second law of thermodynamics! Let's see how this works out.

For large N, and temperature neither so low that $U/B \sim O(1)$ nor so high that $U/B \sim O(N)$, we can use the Stirling expansion $\ln N! \approx N\ln N - N$, yielding

$$\frac{S}{k_B} = \ln W \approx -N\left[\frac{U}{NB}\ln\frac{U}{NB} + \left(1 - \frac{U}{NB}\right)\ln\left(1 - \frac{U}{NB}\right)\right]$$

after canceling terms as much as possible. We can now calculate the temperature and obtain the equation of state U(T):

$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right) = \frac{k_B}{B}\ln\left(\frac{NB}{U} - 1\right) \quad\Rightarrow\quad U = \frac{NB}{1 + e^{B/k_BT}}.$$

In this equation, as T → 0, U → 0. But as T → ∞, U → NB/2. So even at infinite temperature the energy can only go up to half the maximum value, where W(U) is still monotonic. At most half the spins can be made to point up by heating! The population cannot be inverted to have more spins point up than down. This should come as no surprise: if the number of microstates is maximized by having only half the spins point up when energy is added, then that's the state you will get. So the second law of thermodynamics that we derived in chapter 2 is not violated after all. This observation does not mean that it is impossible to get all the spins to point up.
It is just not an equilibrium state at any temperature between T = 0 and T = ∞. Such nonequilibrium states with more spins up (or atoms excited) than down are called inverted populations. In lasers, such states are created by putting the system (like a laser crystal)

far out of equilibrium. Such a state will then relax back to an equilibrium state, releasing a pulse of energy as the spins (or atoms) drop from the excited to the ground state.

The heat capacity of the above example system is

$$C_V = \left(\frac{\partial U}{\partial T}\right) = \frac{NB^2}{k_B T^2}\,\frac{e^{B/k_BT}}{\left(1 + e^{B/k_BT}\right)^2},$$

which is peaked near $k_B T \approx B/2.4$. So we can calculate thermodynamic quantities like the heat capacity by using our statistical mechanics knowledge. In fact, we can calculate anything from statistical mechanics!
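The formulas of Example 2 are easy to check numerically. The sketch below (illustrative values, in units where $k_B = 1$ and $B = 1$; not part of the notes) verifies the saturation $U \to NB/2$ at high temperature, checks the heat capacity against a finite-difference derivative of U(T), and locates the peak of the heat capacity:

```python
import math

# Two-level (spin) system from Example 2, in units k_B = 1 and B = 1:
# equation of state U(T) = N B / (1 + exp(B/T)) and its heat capacity.
N, B = 1000, 1.0

def U(T):
    return N * B / (1.0 + math.exp(B / T))

def C_closed(T):
    x = B / T
    return N * x * x * math.exp(x) / (1.0 + math.exp(x))**2  # units of k_B

# Saturation: even as T -> infinity, U approaches N*B/2, never exceeds it.
print(U(1e6) / (N * B))            # approaches 0.5

# Finite-difference dU/dT matches the closed-form heat capacity.
T, h = 0.7, 1e-5
C_fd = (U(T + h) - U(T - h)) / (2 * h)
print(C_fd, C_closed(T))           # the two agree

# The heat capacity is peaked near k_B T ~ 0.42 B (a Schottky anomaly).
Ts = [0.01 * i for i in range(10, 200)]
T_peak = max(Ts, key=C_closed)
print(T_peak)
```

Replacing the closed-form U(T) with a numerical derivative of S(U) from the Gamma-function W(U) would test the Stirling step as well.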