Statistical thermodynamics Lectures 7, 8

Statistical thermodynamics Lectures 7, 8
Quantum and classical pictures; energy levels and bulk properties; various forms of energy. Everything turns out to be controlled by temperature.
CY1001, T. Pradeep. References: Atkins, 9th edition; Alberty, Silbey, Bawendi, 4th edition.

Need for statistical thermodynamics: it connects the microscopic and the macroscopic world. Key ideas: energy states, the distribution of energy among molecules (the population), and the principle of equal a priori probabilities. Energies are measured relative to a reference zero, ε_0. An instantaneous configuration {n_0, n_1, ...} specifies that n_0 molecules are in the state of energy ε_0, n_1 in the state of energy ε_1, and so on, with N the total number of molecules. {N,0,0,...} and {N-2,2,0,...} are both configurations; the second is more likely than the first because it can be realized in more ways. Fluctuations occur among configurations such as {2,0}, {1,1}, {0,2}. The weight of a configuration is the number of ways the configuration can be reached.

Generalized picture of weight. Consider the configuration {N-2,2,0,0,...}. The first molecule promoted to the higher state can be chosen in N ways, because there are N molecules; the second can then be chosen in N-1 ways. But choosing A then B gives the same configuration as choosing B then A, so the total number of distinguishable choices is ½ N(N-1). As a concrete example, the configuration {3,2,0,0,...} of five particles can be reached in 10 different ways; 10 is the weight of that configuration. In the general case of N particles in a configuration {n_0, n_1, ...}, the number of distinct ways is

W = N! / (n_0! n_1! n_2! ...)

There are N! ways of selecting the particles in order (N for the first, N-1 for the second, and so on); dividing by n_0! removes the indistinguishable orderings within the first level, by n_1! within the second, and so on. W is the weight of the configuration: how many ways the configuration can be achieved.
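The counting rule above is easy to check numerically. A minimal sketch (the `weight` helper is illustrative, not from the lecture):

```python
from math import factorial

def weight(config):
    """Number of distinct ways W = N!/(n0! n1! n2! ...) to realize a configuration."""
    N = sum(config)
    W = factorial(N)
    for n in config:
        W //= factorial(n)  # integer division is exact: each factorial divides N!
    return W

print(weight([3, 2]))        # the {3,2,0,0,...} configuration of 5 particles -> 10
print(weight([5, 0]))        # {N,0,0,...}: only one way
print(weight([3, 2, 0, 0]))  # trailing zeros do not change the weight (0! = 1)
```

This reproduces the slide's example: {3,2,0,0,...} has weight 10, while {N,0,0,...} has weight 1, which is why the second configuration in the earlier slide is more likely than the first.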

It is better to work with the natural logarithm of the weight:

ln W = ln [N! / (n_0! n_1! n_2! ...)]
     = ln N! - ln(n_0! n_1! n_2! ...)
     = ln N! - (ln n_0! + ln n_1! + ln n_2! + ...)
     = ln N! - Σ_i ln n_i!

Stirling's approximation, ln x! ≈ x ln x - x, then gives

ln W = (N ln N - N) - Σ_i (n_i ln n_i - n_i) = N ln N - Σ_i n_i ln n_i

where we have used Σ_i n_i = N.
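Stirling's approximation can be tested directly; its relative error shrinks as N grows, which is why it is safe for the astronomically large N of thermodynamics. A quick numerical sketch:

```python
import math

def ln_factorial(n):
    """Exact ln n! via the log-gamma function: ln n! = lgamma(n+1)."""
    return math.lgamma(n + 1)

def stirling(n):
    """Stirling's approximation ln n! ~ n ln n - n."""
    return n * math.log(n) - n

for n in (10, 100, 10000):
    exact, approx = ln_factorial(n), stirling(n)
    print(n, exact, approx, (exact - approx) / exact)  # relative error decreases with n
```

For n = 10 the relative error is already only about 14%, and by n = 10000 it is below 0.01%; for molecular values of N it is utterly negligible.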

Which is the dominating configuration, the one of maximum weight? The generalized approach is to set dW = 0. But not every configuration is allowed; there are two constraints:

1. The total energy is constant: Σ_i n_i ε_i = E
2. The number of molecules is constant: Σ_i n_i = N

Maximizing W subject to these constraints shows that the populations in the configuration of greatest weight depend on the energies of the states:

n_i / N = e^(-βε_i) / Σ_i e^(-βε_i),  with β = 1/kT

This is the Boltzmann distribution; i runs over the available states. Temperature determines the most probable populations.
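The Boltzmann distribution is a one-liner to evaluate for any set of levels. A minimal sketch (the three level energies and the temperature are illustrative values, not from the text):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def populations(energies, T):
    """Relative populations n_i/N = exp(-beta*eps_i) / sum_j exp(-beta*eps_j)."""
    beta = 1.0 / (k * T)
    boltz = [math.exp(-beta * e) for e in energies]
    q = sum(boltz)  # the normalizing sum over states
    return [b / q for b in boltz]

# three equally spaced levels, spacing comparable to kT at 300 K (illustrative)
eps = [0.0, 2.0e-21, 4.0e-21]
p = populations(eps, 300.0)
print(p, sum(p))  # populations decrease with energy and sum to 1
```

Lower states are always more populated than higher ones, and the populations are properly normalized, as the formula requires.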

Boltzmann distribution: the population of state i is p_i = e^(-βε_i) / q, where

q = Σ_i e^(-βε_i)

is the molecular partition function. Another form of q sums over levels instead of states, with g_l the degeneracy of level l:

q = Σ_levels g_l e^(-βε_l)

Look at the limiting cases:
lim (T→0) q = g_0. Because ε_0 = 0, its term e^(-βε_0) = 1, while for every higher level ε is finite and e^(-βε) → 0 (e^(-x) → 0 as x → ∞).
lim (T→∞) q = ∞. As T → ∞, β → 0 and all terms reduce to 1 (e^(-x) = 1 when x = 0); with infinitely many states the sum diverges. When the number of states is finite, we will not get ∞; q tends to the (finite) total number of states instead.

There are several ways of looking at the partition function:
1. It is the number of available states.
2. More precisely, it is the number of thermally accessible states.
3. It describes how the molecules are partitioned into the available states.

Evaluation of the molecular partition function. For a uniform ladder of levels 0, ε, 2ε, 3ε, ..., use the geometric series 1 + x + x^2 + x^3 + ... = 1/(1-x):

q = 1 + e^(-βε) + e^(-2βε) + e^(-3βε) + ...
  = 1 + e^(-βε) + (e^(-βε))^2 + (e^(-βε))^3 + ...
  = 1/(1 - e^(-βε))

The fraction of molecules in state i is p_i = e^(-βε_i)/q = (1 - e^(-βε)) e^(-βε_i). (Discussion of the figure on the next slide.)

For a two-level system (levels 0 and ε):

p_0 = 1/(1 + e^(-βε))
p_1 = e^(-βε)/q = e^(-βε)/(1 + e^(-βε))

As T → ∞, both p_0 and p_1 go to ½. Consequences?
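The two-level populations can be followed as a function of temperature to see the approach to the ½–½ limit. A sketch (the level spacing ε and the temperatures are illustrative):

```python
import math

def two_level_populations(eps, T, k=1.380649e-23):
    """p0 = 1/(1+exp(-beta*eps)), p1 = exp(-beta*eps)/(1+exp(-beta*eps))."""
    beta = 1.0 / (k * T)
    x = math.exp(-beta * eps)  # Boltzmann factor of the upper level
    return 1.0 / (1.0 + x), x / (1.0 + x)

eps = 2.0e-21  # J, illustrative spacing
for T in (10.0, 300.0, 1e6):
    p0, p1 = two_level_populations(eps, T)
    print(T, p0, p1)  # as T grows, both populations approach 1/2
```

At low temperature essentially everything sits in the ground state; at very high temperature the two levels are equally populated, and p_1 never exceeds p_0, so q for a two-level system saturates at 2 rather than diverging.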

[Figure: population distributions of the ladder of states, one panel at low temperature and one at high temperature.]

Approximations: partition functions are often complex, and analytical expressions cannot be obtained in several cases, so approximations become very useful. As an example, consider the partition function for one-dimensional translational motion in a box of length X. The energy levels are

E_n = n^2 h^2 / (8mX^2),  n = 1, 2, ...

The lowest energy level has an energy of h^2/(8mX^2); writing ε = h^2/(8mX^2) and measuring energies with respect to this level, ε_n = (n^2 - 1)ε, so

q_x = Σ_(n=1)^∞ e^(-(n^2-1)βε)

Because the energy levels are very close together, the sum becomes an integral:

q_x = ∫_1^∞ e^(-(n^2-1)βε) dn

It makes no big difference to take the lower limit to 0 and replace n^2 - 1 by n^2. Substituting x^2 = n^2 βε, meaning dn = dx/(βε)^(1/2),

q_x = (1/(βε))^(1/2) ∫_0^∞ e^(-x^2) dx = (1/(βε))^(1/2) · π^(1/2)/2

Substituting for ε,

q_x = (2πm/(h^2 β))^(1/2) X
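The replacement of the sum by an integral can be checked numerically: for closely spaced levels (small βε) the direct sum over states and the closed form agree to well under a percent. A sketch in the dimensionless variable βε (the value 10^-4 is illustrative):

```python
import math

def q_exact_sum(beta_eps, nmax=100000):
    """Direct sum q = sum_n exp(-(n^2 - 1)*beta*eps) over translational levels."""
    q = 0.0
    for n in range(1, nmax + 1):
        term = math.exp(-(n * n - 1) * beta_eps)
        q += term
        if term < 1e-16:  # remaining terms are negligible
            break
    return q

def q_integral_approx(beta_eps):
    """Closed form from the integral: q = (1/(beta*eps))^(1/2) * sqrt(pi)/2."""
    return 0.5 * math.sqrt(math.pi / beta_eps)

be = 1e-4  # beta*eps, dimensionless; small value = closely spaced levels
print(q_exact_sum(be), q_integral_approx(be))  # the two agree closely
```

The residual difference (about half a state out of nearly ninety here) comes from the end corrections of replacing a sum by an integral; it becomes relatively smaller still as βε decreases toward realistic translational values.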

Independent motion in three dimensions. The energy is a sum of independent terms,

ε_(n1,n2,n3) = ε_(n1)(X) + ε_(n2)(Y) + ε_(n3)(Z)

so the partition function factorizes:

q = Σ_(all n) e^(-βε_(n1)(X)) e^(-βε_(n2)(Y)) e^(-βε_(n3)(Z))
  = [Σ_(n1) e^(-βε_(n1)(X))] [Σ_(n2) e^(-βε_(n2)(Y))] [Σ_(n3) e^(-βε_(n3)(Z))]
  = q_x q_y q_z

Hence

q = (2πm/(h^2 β))^(3/2) XYZ = V/Λ^3,  where Λ = h(β/(2πm))^(1/2) = h/(2πmkT)^(1/2)

Λ has the dimensions of length and is called the thermal wavelength. (Check the units: 1 J = 1 kg m^2 s^-2.)

Question: How many more quantum states will be accessible for ¹⁸O₂ compared to ¹⁶O₂, if it were to be confined in a box of 1 cm^3?
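The isotope question can be answered from the slide's own result: since q = V/Λ^3 and Λ ∝ m^(-1/2), q ∝ m^(3/2), and the box volume cancels in the ratio. A sketch (integer mass numbers are used for illustration in place of exact isotope masses):

```python
# Ratio of translational partition functions for 18-O2 vs 16-O2 in the same box.
# q = V / Lambda^3 with Lambda = h / sqrt(2*pi*m*k*T), so q is proportional to m**1.5;
# V, T, h, k all cancel in the ratio, leaving only the mass ratio.
m16 = 2 * 16.0  # mass of 16-O2 in u (illustrative integer mass numbers)
m18 = 2 * 18.0  # mass of 18-O2 in u

ratio = (m18 / m16) ** 1.5
print(ratio)  # about 1.19: roughly 19% more accessible states for the heavier isotope
```

The heavier molecule has the shorter thermal wavelength, hence more closely spaced translational levels and more thermally accessible states.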

How to get thermodynamics? All information about the thermodynamic properties of the system is contained in the partition function.

Total energy. We know E = Σ_i n_i ε_i. Since the most probable configuration dominates, we use the Boltzmann distribution:

E = (N/q) Σ_i ε_i e^(-βε_i)

Noting that ε_i e^(-βε_i) = -(d/dβ) e^(-βε_i),

E = -(N/q) Σ_i (d/dβ) e^(-βε_i) = -(N/q) (dq/dβ)

Writing U = U(0) + E,

U = U(0) - (N/q) (∂q/∂β)_V = U(0) - N (∂ ln q/∂β)_V

Notes:
1. All energies are relative; E is the value of U relative to its value at T = 0.
2. The derivative with respect to β is partial because there are other parameters (such as V) on which the energy depends.

The partition function gives the internal energy of the system.
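The relation E = -N (∂ ln q/∂β) can be verified for the two-level system, where q = 1 + e^(-βε) and the analytic result is E = Nε e^(-βε)/(1 + e^(-βε)). A sketch comparing that against a numerical β-derivative of ln q (reduced units and parameter values are illustrative):

```python
import math

def lnq(beta, eps):
    """ln q for a two-level system: q = 1 + exp(-beta*eps)."""
    return math.log(1.0 + math.exp(-beta * eps))

def energy_numeric(beta, eps, N, h=1e-6):
    """E = -N d(ln q)/d(beta), estimated by a central finite difference."""
    return -N * (lnq(beta + h, eps) - lnq(beta - h, eps)) / (2 * h)

def energy_analytic(beta, eps, N):
    """E = N*eps*exp(-beta*eps)/(1 + exp(-beta*eps)): population of the upper level times eps."""
    x = math.exp(-beta * eps)
    return N * eps * x / (1.0 + x)

beta, eps, N = 2.0, 1.0, 1000.0  # illustrative reduced units
print(energy_numeric(beta, eps, N), energy_analytic(beta, eps, N))  # the two agree
```

The agreement also makes the physical content plain: the internal energy above U(0) is just the number of molecules in the upper level times ε.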

How to get entropy? For a process, the internal energy is U = U(0) + E = U(0) + Σ_i n_i ε_i. Internal energy changes occur through changes in the populations (n_i → n_i + dn_i) or in the energy states (ε_i → ε_i + dε_i). In the general case,

dU = dU(0) + Σ_i n_i dε_i + Σ_i ε_i dn_i

For constant-volume changes the energy levels do not move, so

dU = Σ_i ε_i dn_i = dq_rev = T dS

and therefore

dS = dU/T = kβ Σ_i ε_i dn_i

From the derivation of the Boltzmann distribution, βε_i = (∂ ln W/∂n_i) + α, where α and β are constants. Hence

dS = k Σ_i (∂ ln W/∂n_i) dn_i + kα Σ_i dn_i

The number of molecules does not change, so Σ_i dn_i = 0 and the second term vanishes:

dS = k Σ_i (∂ ln W/∂n_i) dn_i = k d(ln W)

giving S = k ln W. This can be rewritten (after some manipulations) as

S(T) = [U(T) - U(0)]/T + Nk ln q
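For a two-level system the two entropy expressions can be compared directly: S = k ln W evaluated with Stirling's formula for the most probable configuration reduces (per molecule, with k = 1 in reduced units) to -(p_0 ln p_0 + p_1 ln p_1), while the partition-function form gives β⟨ε⟩ + ln q. A sketch showing they coincide (reduced units and values are illustrative):

```python
import math

def entropy_from_q(beta, eps):
    """S/(N k) = beta*<eps> + ln q for a two-level system (reduced units, k = 1)."""
    x = math.exp(-beta * eps)
    q = 1.0 + x
    mean_e = eps * x / q  # average energy per molecule, (U - U(0))/N
    return beta * mean_e + math.log(q)

def entropy_from_W(beta, eps):
    """S/(N k) = -(p0 ln p0 + p1 ln p1): k ln W per molecule via Stirling."""
    x = math.exp(-beta * eps)
    p0, p1 = 1.0 / (1.0 + x), x / (1.0 + x)
    return -(p0 * math.log(p0) + p1 * math.log(p1))

beta, eps = 1.5, 1.0  # illustrative reduced values
print(entropy_from_q(beta, eps), entropy_from_W(beta, eps))  # the two agree
```

This is exactly the "after some manipulations" step of the last slide made concrete: substituting the Boltzmann populations into S = k ln W reproduces S(T) = [U(T) - U(0)]/T + Nk ln q.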