Statistical mechanics


CHAPTER 15
Statistical Thermodynamics 1: The Concepts

I. Introduction.

A. Statistical mechanics is the bridge between the microscopic and macroscopic descriptions of nature: it connects the microscopic world of quantum and classical mechanics to the macroscopic world of thermodynamics.

   Thermodynamics -- example macroscopic variables:
   P = pressure, V = volume, T = temperature, E = internal energy, S = entropy,
   G = Gibbs free energy, A = Helmholtz free energy, ρ = density, η = viscosity,
   ε = dielectric constant

   Quantum & classical mechanics -- example microscopic variables:
   x_i = position of particle i, p_i = momentum of particle i,
   n, l, m_l, m_s = H-atom quantum numbers, v = vibrational quantum number,
   J = rotational quantum number

B. Microstates.

1. Statistical mechanics predicts how a system will be distributed over all of its available microstates.
2. Microstate = a particular spatial, spin, or energy configuration of a system. Many are possible.
3. Simple example: system = a collection of 3 fixed, non-interacting particles, each of which can have either a spin-up (1/2) or spin-down (-1/2) configuration of equal energy.
4. Specifying the microstate of the system involves (in this simple example) assigning a spin value to each particle.

5. Here are the 8 possible microstates of the system:
   (1/2,1/2,1/2)  (1/2,1/2,-1/2)  (1/2,-1/2,1/2)  (-1/2,1/2,1/2)
   (1/2,-1/2,-1/2)  (-1/2,1/2,-1/2)  (-1/2,-1/2,1/2)  (-1/2,-1/2,-1/2)
   or, in a simpler notation:
   (+++) (++-) (+-+) (-++) (+--) (-+-) (--+) (---)
6. Assigning the quantum numbers to the particles is the microscopic way of specifying the state of the system, that is, the microstate.
7. Now suppose we are interested in a macroscopic property of the system, such as the total spin:
   S = S_1 + S_2 + S_3
   Over the 8 microstates, S takes the values: 3/2, 1/2, 1/2, 1/2, -1/2, -1/2, -1/2, -3/2
8. The statistical weight function W(S) is the total number of ways (microstates) in which the system can have total spin S:
   W(3/2) = 1 way    W(1/2) = 3 ways    W(-1/2) = 3 ways    W(-3/2) = 1 way
9. Notice that W(S) peaks in the middle: there are more ways of having the intermediate total spin values 1/2 and -1/2 than the extreme values 3/2 or -3/2. This is a typical property of distribution functions, and it becomes far more pronounced when the system has a large number of particles rather than just 3.
10. A similar statistical function is the probability P(S) of finding the system with total spin S:
    P(S) = (ways of having total spin S) / (total microstates of system)
    e.g., P(3/2) = W(3/2)/8 = 1/8;  P(1/2) = W(1/2)/8 = 3/8
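The counting in steps 5-10 is easy to reproduce by brute force. The minimal sketch below (variable names are ours, not from the notes) enumerates the 2^3 = 8 microstates and tallies W(S) and P(S):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Each of the 3 particles is spin +1/2 or -1/2; enumerate all 2**3 microstates.
spins = [Fraction(1, 2), Fraction(-1, 2)]
microstates = list(product(spins, repeat=3))
print(len(microstates))                 # 8 microstates

# W(S) = number of microstates with total spin S.
W = Counter(sum(state) for state in microstates)
print(W[Fraction(3, 2)], W[Fraction(1, 2)])   # 1 way, 3 ways

# P(S) = W(S) / (total number of microstates).
P = {S: Fraction(count, len(microstates)) for S, count in W.items()}
print(P[Fraction(1, 2)])                # 3/8
```

The same enumeration scales to any number of two-state particles, which is how the binomial counting of the next section can be spot-checked.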

[Figure: plot of P(S) versus S, peaked at the center between S = -3/2 and S = 3/2.]

C. The Binomial Theorem.

1. If instead of only 3 particles that can each be in two states we have a total of n particles, the function W can be determined using the binomial theorem.
   W(n+, n-) = number of ways of arranging a system of n particles in which n+ particles are in the (+) state and n- particles are in the (-) state:
   W(n+, n-) = n! / (n+! n-!)
   where n! = n factorial = 1 × 2 × 3 × ... × n
2. Let's try the binomial theorem out on our simple 3-particle system first:
   W(S=3/2)  = W(3,0) = 3!/(3! 0!) = 1
   W(S=1/2)  = W(2,1) = 3!/(2! 1!) = 3
   W(S=-1/2) = W(1,2) = 3!/(1! 2!) = 3
   W(S=-3/2) = W(0,3) = 3!/(0! 3!) = 1

3. Now let's demonstrate the binomial theorem on a much larger system. Consider a system of n = 100 coins, each of which can be in either a heads or a tails state. Let n+ be the number of coins that are heads and n- the number that are tails.
   Question: how many different ways can 100 coins be arranged so that 50 are heads and 50 are tails?
   n = 100; n+ = 50; n- = 50
   W(50,50) = 100! / (50! 50!)
   Use Stirling's approximation to evaluate the factorial of a large number:
   ln x! ≈ x ln x - x
   Then:
   ln W = ln n! - ln n+! - ln n-!
        ≈ (n ln n - n) - (n+ ln n+ - n+) - (n- ln n- - n-)
   For n = 100, n+ = n- = 50: ln W ≈ 100 ln 2 ≈ 69.3, so W ≈ e^69.3 ≈ 10^30 ways.
4. So we see an enormous number of microstates that are evenly distributed between heads and tails: about 10^30 arrangements of the coins give a 50/50 distribution.
5. Now let's compare this very random distribution with a more orderly distribution in which one coin is heads and the other 99 are tails:
   W(1,99) = 100! / (1! 99!) = 100 ways
   So there is a relatively small number of microstates with this orderly distribution, even though each coin is just as happy being heads as tails.
6. This simple system of 100 coins demonstrates the statistical basis of entropy. Even though a coin can be heads or tails with equal preference, a large system of coins is far more likely to be found with an even distribution near 50/50 than with a 1/99 distribution. A box of coins, shaken and allowed to sample its microstates at random, will almost always be found near a 50/50 distribution.
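The Stirling estimate is easy to check numerically. The sketch below (function names are ours) compares the crude two-term Stirling form used in the notes with the exact binomial count; the crude form overshoots ln W slightly (the more accurate Stirling formula adds a ½ ln(2πx) term), but both give W on the order of 10^29-10^30:

```python
import math

def ln_W_exact(n, n_plus):
    # ln of the exact binomial count n! / (n+! n-!)
    return math.log(math.comb(n, n_plus))

def ln_W_stirling(n, n_plus):
    # Same quantity using the crude Stirling form  ln x! ~ x ln x - x
    n_minus = n - n_plus
    s = lambda x: x * math.log(x) - x
    return s(n) - s(n_plus) - s(n_minus)

print(ln_W_stirling(100, 50))       # ~69.3, matching the notes (= 100 ln 2)
print(ln_W_exact(100, 50))          # ~66.8: the crude form overshoots a little
print(math.log(math.comb(100, 1)))  # ln W(1,99) = ln 100 ~ 4.6
```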

7. Entropy maximization is the consequence of a very important theorem of statistical mechanics. The statistical law of equal a priori probabilities states that: a system at equilibrium is evenly distributed over all microstates accessible to it at fixed total energy.

D. Statistical Definition of Entropy.

1. The entropy S of a system in a given configuration is defined as follows:
   S = k_B ln W
   where k_B = Boltzmann's constant = 1.38 × 10^-23 J/K,
   ln is the natural logarithm, and
   W = number of microstates when the system is in the given configuration.
2. Example: for a spin system (or coin system), S is a function of n+ and n-, so that:
   S(n+, n-) = k_B ln W(n+, n-)
   e.g. S(0,100) = k_B ln W(0,100) = k_B ln 1 = 0 (the zero point of entropy)
        S(1,99)  = k_B ln W(1,99)  = k_B ln 100 ≈ 4.6 k_B
        S(50,50) ≈ k_B ln e^69.3   = 69.3 k_B
3. Shaking the box will drive the system of coins toward a 50/50 distribution: the system entropy S spontaneously increases toward its maximum value, S_max ≈ 69.3 k_B.
4. This is a manifestation of the Second Law of Thermodynamics: a closed, isolated system will tend to increase its entropy in any spontaneous process.
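The entropy values in step 2 can be tabulated directly from the definition S = k_B ln W. A minimal sketch (function name ours), working in units of k_B and using the exact binomial count rather than Stirling's approximation:

```python
import math

def entropy_over_kB(n_plus, n_minus):
    # S/k_B = ln W(n+, n-), with W the exact binomial count
    return math.log(math.comb(n_plus + n_minus, n_plus))

print(entropy_over_kB(0, 100))   # 0.0 -- a single microstate, the zero point of entropy
print(entropy_over_kB(1, 99))    # ln 100 ~ 4.6
print(entropy_over_kB(50, 50))   # ~66.8 exactly (~69.3 with the crude Stirling form)
```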

E. Energy Considerations.

1. Let's look at a more complicated example system, one complicated by energy considerations.
2. Consider a system of 3 independent harmonic oscillators. For example, this could be three atoms in a crystal.
3. The allowed quantum energy states of a harmonic oscillator are evenly spaced by increments of hν, such that the energy of one oscillator is given by:
   E = (v + 1/2) hν
   where h = Planck's constant = 6.626 × 10^-34 J·s,
   ν = the classical oscillation frequency of the oscillator, and
   v = the vibrational quantum number of the oscillator = 0, 1, 2, 3, ...
4. [Figure: energy-level diagram of the three-oscillator system.]
5. In the above example, the total system has 3 increments of energy above the absolute minimum of energy. We would say the system has 3 quanta of energy above the ground state, or total energy = 3hν above the zero-point energy.
6. Let's figure out how many microstates are available to a system of three oscillators given that it has a fixed amount of energy 3hν above the zero-point energy. There are 10 possible microstates of the system, all having total energy 3hν: the quanta (v_1, v_2, v_3) can be distributed as
   (3,0,0) (0,3,0) (0,0,3) (2,1,0) (2,0,1) (1,2,0) (0,2,1) (1,0,2) (0,1,2) (1,1,1)

The statistical law of equal a priori probabilities states that at equilibrium a system is to be found with equal probability in any one of its microstates having the appropriate energy.
7. In our system, then, over a period of time the system will randomly range over the 10 accessible microstates depicted above, found with equal likelihood in any one of them at any given moment of time.
8. Let N_v = average number of atoms found occupying vibrational level v (after taking many observations). Averaging over the 10 equally likely microstates:
   N_0 = 12/10 = 1.2    N_1 = 9/10 = 0.9    N_2 = 6/10 = 0.6    N_3 = 3/10 = 0.3
   Total N = Σ N_v = 1.2 + 0.9 + 0.6 + 0.3 = 3.0 = total number of particles
9. Then the fraction of particles in level v is N_v/N:
   N_0/N = 0.4    N_1/N = 0.3    N_2/N = 0.2    N_3/N = 0.1
   which, when plotted, becomes our distribution function of vibrational energy-state occupations.
10. Note the very important and typical property of distributions over energy states: the lower energy states have a higher occupation probability. As the state energy increases, the population decreases. Why?
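The enumeration and averaging in steps 6-9 can be verified with a short script (a sketch with our own naming): list every way three oscillators can share 3 quanta, then count how often each level v appears across the equally weighted microstates.

```python
from itertools import product

# Enumerate all ways 3 oscillators (v1, v2, v3) can share exactly 3 quanta.
microstates = [s for s in product(range(4), repeat=3) if sum(s) == 3]
print(len(microstates))   # 10 microstates, as in the notes

# How many times level v is occupied, summed over the 10 microstates.
counts = [sum(state.count(v) for state in microstates) for v in range(4)]
print(counts)             # [12, 9, 6, 3]
# -> N_v = 1.2, 0.9, 0.6, 0.3  and  N_v/N = 0.4, 0.3, 0.2, 0.1
```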

Simply because there are many more ways to spread the energy equitably among all the molecules than to give a few molecules all the energy. This is just like the coin example, where we found that there were many more ways to arrange the coins with equal numbers of heads and tails than in any other arrangement.

II. The Boltzmann Distribution Law for Large Numbers of Molecules.

A. Macroscopic-Sized Systems.

1. For realistic large systems having N molecules, where N may be as large as a mole of particles (~10^23 particles), the quantum states i of a molecule are occupied according to the Boltzmann Distribution Law.
2. The fraction of molecules in state i is defined as N_i/N and is given by the following equation:
   N_i/N = C e^(-ε_i / k_B T)
   where C is a normalization constant, ε_i is the energy of quantum state i, k_B is the Boltzmann constant = 1.38 × 10^-23 J/K, and T is the system temperature in K. Thus the population of each quantum state i is dictated by the ratio of the energy of state i to the thermal energy factor k_B T; this ratio appears in the exponent. For convenience it is customary to define:
   β ≡ 1 / (k_B T)
   and to refer to the exponential term e^(-βε_i) as the Boltzmann factor for state i.

3. [Figure: sketch of the Boltzmann distribution, N_i/N versus the energy of state i, decaying from ~1 at the lowest energy toward 0 by energy ≈ 5 (in units of k_B T).]
4. Note that this shows the same trend as our simpler three-atom system. The lower levels are more highly occupied because there are many more ways of spreading the energy among all the molecules than of giving more to one than another.
5. Now we need to determine C, the normalization constant. We do this by enforcing the conservation condition Σ N_i = N; in other words, the number of molecules occupying each of the states, summed over all molecular states, must equal the total number of molecules N:
   Σ_i (N_i/N) = 1
   Σ_i (N_i/N) = C Σ_i e^(-βε_i)    (sums over molecular states i)
   so that
   C = 1 / Σ_i e^(-βε_i)
   Then the Boltzmann Law may be written as:
   N_i/N = e^(-βε_i) / Σ_i e^(-βε_i)

6. Finally, the denominator sum is defined as q, the molecular partition function:
   N_i/N = e^(-βε_i) / q    where    q ≡ Σ_i e^(-βε_i)    (sum over molecular states i)
   Interpretation of q: roughly the effective number of states that are thermally accessible to a molecule at the temperature T.
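These two formulas are all that is needed to compute a Boltzmann distribution. A minimal sketch (function name and the evenly spaced example energies are ours, chosen to match the sketch above), working in units where the energies are expressed as multiples of k_B T:

```python
import math

def boltzmann_fractions(energies, kT):
    """Return q and the fractions N_i/N = exp(-e_i/kT) / q for the given state energies."""
    factors = [math.exp(-e / kT) for e in energies]
    q = sum(factors)                  # molecular partition function
    return q, [f / q for f in factors]

# Evenly spaced states e_i = 0, 1, ..., 5 (in units of k_B T).
q, fracs = boltzmann_fractions([0, 1, 2, 3, 4, 5], kT=1.0)
print(q)        # ~1.58 thermally accessible states
print(fracs)    # fractions decrease with energy and sum to 1
```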

7. Now, for systems with degenerate energy levels, the Boltzmann Law can be rewritten in terms of levels rather than states. For example, here are the first few energy levels for the rotational energy of a diatomic molecule:
   j=2  _ _ _ _ _
   j=1  _ _ _
   j=0  _
8. Each j value is a level; each line is a quantum state. Now the Boltzmann Law is:
   N_j/N = Ω_j e^(-ε_j/k_B T) / q    where    q = Σ_j Ω_j e^(-ε_j/k_B T)
   (Boltzmann Law in terms of levels)
   N_j/N = fraction of molecules in level j
   ε_j = energy of level j
   q = partition function, rewritten as a summation over levels rather than states
   Ω_j = degeneracy of level j = number of quantum states of identical energy in level j
       = 2j+1 for a diatomic molecule, for example
9. Now, for example, the Boltzmann distribution over the rotational energy levels of a diatomic molecule can be written as:
   N_j/N = (2j+1) e^(-j(j+1)ħ²/2Ik_B T) / q    where    q = Σ_j (2j+1) e^(-j(j+1)ħ²/2Ik_B T)
   since ε_j = j(j+1)ħ²/2I, with I = rotational moment of inertia, and Ω_j = 2j+1 = degeneracy factor of level j.

B. Sample calculation using the Boltzmann distribution function for rotation of a diatomic molecule.

1. Here is a sample FORTRAN program implementing the previous equation. Something similar could be done in EXCEL or other software.

c     rotation.for
c     Predicts rotational level population fractions Nj/N for a diatomic
c     molecule without making the classical rotor approximation.
      write(6,*)
     &' Input the mass in amu of each of the two atoms in the diatomic'
      read(5,*)am1,am2
      write(6,*)' Input the bond distance of the molecule in Angstroms'
      read(5,*)r
      theta=228.0/(r**2)/(am1*am2/(am1+am2))
      write(6,*)' Rotational temperature of the molecule = ',theta
      write(6,*)' This is in units of K'
      write(6,*)' input highest level for which you want information'
      read(5,*)maxlevel
      write(6,*)' input the temperature in K'
      read(5,*)T
c     First let's evaluate the partition function Q by summing over the
c     first 100 terms in the series:
      Q=0.0
      do j=0,100
        arg=j*(j+1)*(theta/T)
        if(arg.gt.20)expo=0.0
        if(arg.le.20)expo=exp(-arg)
        term=(2*j+1)*expo
        Q=Q+term
      end do
      write(6,*)' Partition function = ',Q
c     Now let's loop through the levels, one at a time, and calculate
c     the fraction in each:
      write(6,*)' level    population fraction'
      do j=0,maxlevel
        arg=j*(j+1)*(theta/T)
        if(arg.gt.20)expo=0.0
        if(arg.le.20)expo=exp(-arg)
        term=(2*j+1)*expo
        fraction=term/Q
        write(6,*)j,fraction
      end do
      stop
      end

2. In the previous program the quantity theta, called the rotational temperature, is set equal to a collection of constants:
   θ_R ≡ h²/(8π² I k_B) = ħ²/(2 I k_B)    (units: Kelvin)
   I = µr²;  µ = m_1 m_2/(m_1 + m_2);  r = bond length

I = µr 2 ; µ = m 1 m 2 m 1 + m 2 ; r = bond length 3. Here is a sample execution of the FORTRAN program, showing the input of parameters by the user, followed by the output of results: $ exe rotation.for $ Fortran/Check/NoOptimize/NoShow ROTATION.FOR $ Link ROTATION $ Run ROTATION Input the mass in amu of each of the two atoms in the diatomic 14,14 Input the bond distance of the molecule in Angstroms 1.0 Rotational temperature of the molecule = 32.57143 This is in units of K input highest level for which you want information 10 input the temperature in K 298.15 Partition function = 9.494500 level population fraction 0 0.1053241 1 0.2539569 2 0.2734202 3 0.1987428 4 0.1066300 5 4.3710325E-02 6 1.3925157E-02 7 3.4811632E-03 8 6.8701972E-04 9 1.0746627E-04 10 1.3361231E-05 FORTRAN STOP 0.5 0.4 Rotational level distribution for a diatomic molecule with theta = 9.49 K at T=298 K 0.3 Nj/N 0.2 0.1 0.0 0 2 4 6 8 10 12 j ltzmann formula for the distribution of a dia CHAPTER 15 13

This Boltzmann formula for the distribution of a diatomic over its levels j explains the pattern of peak heights in a gas-phase rotational (microwave) spectrum of a diatomic: the absorption first increases with frequency because of the growing degeneracy, then decreases because of the exponential factor.

C. Detailed Derivation of the Boltzmann Distribution Law.

1. For systems with a large number of particles N, in which each particle can be in any of an arbitrary (countable) number of molecular states s, the number of ways of achieving a system configuration
   {n_1, n_2, n_3, ..., n_s} = (number of molecules in molecular state 1, 2, 3, ..., s)
   is given by:
   W(n_1, n_2, n_3, ..., n_s) = N! / (n_1! n_2! n_3! ... n_s!)
2. There are constraints, however, on which configurations are allowed, or accessible. The system may have a fixed total energy, such that:
   E = Σ_{i=1}^{s} n_i ε_i    is a constraint
   where E = total energy, n_i = number of molecules in molecular state i, and ε_i = energy of a molecule in state i.
3. Further, there may be a fixed number of particles. For example:
   Σ_{i=1}^{s} n_i = N    is another constraint

Constraints #2 and #3 will serve as restrictions on the allowed configurations.
4. The number of accessible microstates is generally strongly peaked. Remember the heads/tails analogy: the peak represents the statistically most typical configuration of the system, so much so that most other configurations are inconsequential.
5. Let's find, for a general system, where W reaches its peak: i.e., find the configuration {n_1, n_2, n_3, ..., n_s}* for which W reaches a maximum, by taking the differential d(ln W) and setting it equal to 0.
6. We get the following variation equation:
   d(ln W) = (∂ln W/∂n_1) dn_1 + (∂ln W/∂n_2) dn_2 + ... = Σ_{i=1}^{s} (∂ln W/∂n_i) dn_i = 0
7. We also need to subject this equation to constraints #2 and #3. We do this by Lagrange's method of undetermined multipliers: each constraint is multiplied by some constant and added to the main variation equation.
   0 = Σ_{i=1}^{s} (∂ln W/∂n_i) dn_i + α Σ_{i=1}^{s} dn_i − β Σ_{i=1}^{s} ε_i dn_i
   (the α term carries the particle-number constraint, the β term the energy constraint)

We now have a series of equations, one for each i:
   0 = (∂ln W/∂n_i) + α − βε_i
   where
   ln W = ln [N! / (n_1! n_2! ... n_s!)] = ln N! − Σ_{j=1}^{s} ln n_j!
   Use Stirling's formula for the factorials.
8. Final result:
   n_i = (constant) × e^(−βε_i)    at equilibrium, where we have defined β ≡ 1/(k_B T)
   Verbal meaning: the occupancy of state i at equilibrium diminishes exponentially with the energy ε_i of state i.
9. The constant C is determined by normalization, that is, by applying the constraint on the number of particles:
   Σ_{i=1}^{s} n_i = N = C Σ_{i=1}^{s} e^(−βε_i)
   so
   C = N / Σ_{i=1}^{s} e^(−βε_i)
10. And so we get the final form of Boltzmann's Law for the fraction of molecules in molecular state i:
   n_i/N = e^(−βε_i) / Σ_{i=1}^{s} e^(−βε_i)

11. Molecular Partition Function q.
   q = Σ_{i=1}^{s} e^(−βε_i)
   Verbal meaning: an indication of the number of states thermally accessible to each molecule in the system.

D. Example: a simple two-level system. Consider a collection of molecules that have only two levels available to them. The lower one is nondegenerate and has energy 0, by definition; the upper one is doubly degenerate and has energy E.
   Partition function:
   q = Σ_{j=1}^{2} Ω_j e^(−βε_j) = 1·e^0 + 2·e^(−βE) = 1 + 2e^(−βE)
   Fraction of molecules in level j:
   N_j/N = Ω_j e^(−βε_j) / q = Ω_j e^(−βε_j) / (1 + 2e^(−βE))
   N_1/N = 1 / (1 + 2e^(−βE))
   N_2/N = 2e^(−βE) / (1 + 2e^(−βE))
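The two-level formulas make the limiting behavior easy to see in code. A minimal sketch (function name ours), working with the dimensionless ratio E/k_B T: at low temperature everything collapses into the ground level, while at high temperature the populations approach the 1:2 degeneracy ratio.

```python
import math

def two_level_fractions(E_over_kT):
    """q, N1/N, N2/N for a nondegenerate ground level and a doubly degenerate level at E."""
    boltz = 2.0 * math.exp(-E_over_kT)   # degeneracy 2 times the Boltzmann factor
    q = 1.0 + boltz
    return q, 1.0 / q, boltz / q

# Low T (E >> kT): essentially all molecules in the ground level.
print(two_level_fractions(50.0))
# High T (E << kT): populations approach the degeneracy ratio, 1/3 and 2/3.
print(two_level_fractions(1e-9))
```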

[Figure: fractions of molecules in level 1 and level 2 as a function of temperature.]

E. Example: a large collection of pure harmonic oscillators.
   q = Σ_i e^(−βε_i) = e^0 + e^(−βε) + e^(−2βε) + ...
     = 1 + x + x² + x³ + ...    where x ≡ e^(−βε)

This summation has a simple closed form:
   q = 1 + x + x² + x³ + ... = 1/(1 − x)
so:
   q = 1 / (1 − e^(−βε))
[Figure: q plotted as a function of T.]
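The geometric-series identity behind this closed form can be checked numerically. A small sketch (function names ours) compares direct term-by-term summation of q with the closed form 1/(1 − e^(−βε)) for several values of βε:

```python
import math

def q_direct(beta_eps, nterms=200):
    # Direct summation of q = 1 + x + x^2 + ... with x = exp(-beta*eps)
    x = math.exp(-beta_eps)
    return sum(x**n for n in range(nterms))

def q_closed(beta_eps):
    # Closed form q = 1 / (1 - exp(-beta*eps))
    return 1.0 / (1.0 - math.exp(-beta_eps))

for be in (0.1, 1.0, 5.0):
    print(q_direct(be), q_closed(be))   # the two columns agree
```

Note that as βε shrinks (high T), q grows: more vibrational states become thermally accessible.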

III. Simple Illustration: Application to Chemical Equilibrium.

Allowed energies in chemical species A and B:
   E_A(v) = v ε_a          v = 0, 1, 2, 3, 4, ...
   E_B(v) = ΔE + v ε_b     v = 0, 1, 2, 3, 4, ...
Partition functions for species A and B and for the whole system of states:
   q_A = Σ_v e^(−βE_A(v)) = Σ_v e^(−βvε_a) = (1 − e^(−βε_a))^(−1)    (sum over A states)
   q_B = Σ_v e^(−βE_B(v)) = Σ_v e^(−β(ΔE+vε_b)) = e^(−βΔE) (1 − e^(−βε_b))^(−1)    (sum over B states)
   q = q_A + q_B
Population in any given state v of species A or B:
   N_A(v)/N = e^(−βE_A(v)) / (q_A + q_B)
   N_B(v)/N = e^(−βE_B(v)) / (q_A + q_B)

Now, what is the equilibrium constant for a chemical reaction A ⇌ B?
   K_eq = N_B / N_A
        = [Σ_v N_v (B states)] / [Σ_v N_v (A states)]
        = [N Σ_v e^(−βE_B(v)) / (q_A + q_B)] / [N Σ_v e^(−βE_A(v)) / (q_A + q_B)]
and finally, the important equation:
   K_eq = q_B / q_A
The equilibrium constant for a reaction is simply the ratio of the partition functions! Substituting in the expressions for the species A and B partition functions:
   K_eq = e^(−βΔE) (1 − e^(−βε_a)) / (1 − e^(−βε_b))

Let's look at a specific case. Suppose the zero-point energy of B is k_B T above that of A, so ΔE = k_B T, and the energy spacing of species A is ε_a = k_B T. Observe the equilibrium constant and how it is affected by the spacing ε_b of the levels in species B.

Notice that, even though the species B manifold of states lies ABOVE that of A, K_eq can be > 1, favoring the product side, IF the spacings in B are small enough. This is an ENTROPY effect. The states on the B side are denser, allowing more ways of distributing the system, even though the B side starts at a higher energy.

[Figure: populations of the A states and B states when the B spacing is 0.1 k_B T; the many closely spaced B states collectively outweigh the A states.]
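The entropy effect described above is a one-liner to verify with K_eq = q_B/q_A. A sketch (function name ours; energies in units of k_B T) for the case ΔE = ε_a = k_B T, varying the B-side spacing ε_b:

```python
import math

def K_eq(dE, eps_a, eps_b, kT=1.0):
    """K_eq = q_B/q_A for two harmonic manifolds offset by dE (energies in units of kT)."""
    beta = 1.0 / kT
    q_A = 1.0 / (1.0 - math.exp(-beta * eps_a))
    q_B = math.exp(-beta * dE) / (1.0 - math.exp(-beta * eps_b))
    return q_B / q_A

# Case from the notes: Delta E = kT and eps_a = kT; vary the B-side spacing.
print(K_eq(dE=1.0, eps_a=1.0, eps_b=1.0))   # < 1: B manifold simply lies higher
print(K_eq(dE=1.0, eps_a=1.0, eps_b=0.1))   # > 1: the dense B states win on entropy
```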

IV. Thermodynamic Energy and Entropy.

The partition function contains all the information needed to calculate the thermodynamic properties of a system. The partition function is to statistical mechanics what the wavefunction is to quantum mechanics. Let's now see the relationship of the molecular partition function q to the thermodynamic internal energy U and entropy S.

A. The Internal Energy U.

1. This is the energy found in the first law of thermodynamics (ΔU = q − w). We first obtain an expression for E, the total energy of the system above its zero-point energy (lowest energy state):
   E = Σ_i n_i ε_i
   where n_i = number of molecules in state i and ε_i = energy of a molecule in state i.
   Substituting in the Boltzmann Law for the number of molecules in state i, and performing some mathematical operations:
   E = Σ_i [N e^(−βε_i)/q] ε_i = (N/q) Σ_i ε_i e^(−βε_i)
   But now we use a math trick to rewrite this:
   ε_i e^(−βε_i) = −∂/∂β [e^(−βε_i)]
   so
   E = −(N/q) ∂/∂β [Σ_i e^(−βε_i)] = −(N/q) (∂q/∂β)
   and finally
   E = −N ∂(ln q)/∂β

And so now, adding the zero-point energy of the system, we get our final result for the thermodynamic internal energy U:
   U = U(0) − N (∂ln q/∂β)_{N,V}
where the subscript indicates that the partial derivative is taken with volume V and number of molecules N fixed. So to find the thermodynamic internal energy we must take a derivative of the partition function q with respect to β.
2. It is convenient to rewrite this derivative in terms of temperature T rather than β. Remembering the definition β ≡ 1/(k_B T), we can rewrite the partial derivative:
   ∂/∂β = (∂T/∂β) ∂/∂T;    T = 1/(βk_B)  so  ∂T/∂β = −1/(β²k_B) = −k_B T²
And so our final internal energy result is written in the more convenient form:
   U = U(0) + N k_B T² (∂ln q/∂T)_{N,V}

B. The Entropy S.

1. The Boltzmann formula for the entropy was given earlier as:
   S = k_B ln W
   where W is the number of ways of distributing the total energy among a system of molecules in the most probable configuration.
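The relation U − U(0) = −N ∂(ln q)/∂β can be checked numerically for the harmonic-oscillator q derived earlier. In this sketch (function names ours, N = 1, ε = 1 in reduced units), a central finite difference of ln q reproduces the analytic mean energy ε/(e^(βε) − 1):

```python
import math

def ln_q(beta, eps=1.0):
    # Harmonic-oscillator molecular partition function (energies above the zero point)
    return -math.log(1.0 - math.exp(-beta * eps))

def U_minus_U0(beta, N=1.0, eps=1.0, h=1e-6):
    # U - U(0) = -N d(ln q)/d(beta), evaluated by a central finite difference
    return -N * (ln_q(beta + h, eps) - ln_q(beta - h, eps)) / (2.0 * h)

beta, eps = 2.0, 1.0
numeric = U_minus_U0(beta, eps=eps)
analytic = eps / (math.exp(beta * eps) - 1.0)   # mean energy per oscillator
print(numeric, analytic)                        # the two agree
```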

2. The W term can be rewritten in terms of the molecular partition function q (see Atkins/de Paula, 10th ed., p. 639), giving:
   S = [U − U(0)]/T + N k_B ln q

C. The Canonical Partition Function Q.

1. Up to now we have assumed that our individual molecules are statistically independent, and all our equations have been based on the molecular partition function q.
2. There is a more general partition function Q that is more appropriate for interacting systems of molecules (such as in condensed phases):
   Q = Σ_i e^(−βE_i)    (sum over system states i)
3. This looks deceptively the same as q, but Q differs from q in that the summation is not over the quantum states possessed by an individual molecule. It is over configurations of the collective system of interacting molecules, and E_i is the energy of the entire system in one of those configurations i, not the energy of one molecule in quantum state i.
4. Therefore Q can be thought of as a TOTAL partition function of N molecules, not the partition function of an individual molecule.
5. The thermodynamic internal energy and entropy in terms of Q are:
   U = U(0) − (∂ln Q/∂β)_V    or    U = U(0) + k_B T² (∂ln Q/∂T)_V
   and
   S = [U − U(0)]/T + k_B ln Q

6. But wait a minute: there is a relationship between Q and q. When the N molecules are independent and indistinguishable:
   Q = q^N / N!
   Independent means that the total energy can be obtained as a sum of independent energy terms of the individual molecules, giving rise to the power of N. Indistinguishable means that the molecules are identical and free to move and exchange positions; this would cause a statistical overcount of the unique states of the collection of molecules, so we divide by N!.
7. When the molecules are independent but distinguishable (for example, each molecule is different, OR the molecules are fixed on a lattice):
   Q = q^N
8. Note that when this last equation is inserted for Q in the equations for U and S, those equations revert to what we had originally.