Statistical thermodynamics Lectures 7, 8


Statistical thermodynamics, Lectures 7, 8. Quantum and classical descriptions; energy levels and bulk properties; various forms of energies. Everything turns out to be controlled by temperature. CY1001, T. Pradeep. Ref.: Atkins, 7th or 8th edition; Alberty, Silbey, Bawendi, 4th edition.

Need for statistical thermodynamics: it connects the microscopic and the macroscopic world. Energy states; distribution of energy over the states - the populations. Principle of equal a priori probabilities. ε_0 is the reference, taken as the zero of energy. A configuration is an instantaneous specification {n_0, n_1, ...}: n_0, n_1, ... molecules exist in the states with energies ε_0, ε_1, ..., and N is the total number of molecules. {N, 0, 0, ...} and {N-2, 2, 0, ...} are configurations; the second is more likely than the first. Fluctuations occur: {1,1}, {2,0}, {0,2}. Weight of a configuration = the number of ways in which the configuration can be reached.

Generalized picture of weight. Consider the configuration {N-2, 2, 0, 0, ...}. The first ball promoted to the higher state can be chosen in N ways, because there are N balls; the second ball can be chosen in N-1 ways, as N-1 balls remain. But choosing A then B must not be counted separately from choosing B then A, so the total number of distinguishable ways is ½[N(N-1)]. Likewise, a configuration {3, 2, 0, 0, ...} of N = 5 balls can be reached in 10 different ways - that is the weight of the configuration. For the general case of N particles in a configuration {n_0, n_1, ...}: there are N! ways of selecting the balls in order (N choices for the first, N-1 for the second, etc.), but the n_0! orderings within the first level are indistinguishable, the n_1! orderings within the second level likewise, and so on. Hence

W = N! / (n_0! n_1! n_2! ...)

W is the weight of the configuration: the number of ways in which the configuration can be achieved.
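As a quick numerical check of the weight formula, a minimal Python sketch (the populations are the examples above):

```python
from math import factorial

def weight(populations):
    """Weight W = N!/(n0! n1! n2! ...) of a configuration {n0, n1, ...}."""
    N = sum(populations)
    W = factorial(N)
    for n in populations:
        W //= factorial(n)   # exact integer division; W stays an integer throughout
    return W

print(weight([3, 2]))    # 10, the {3,2,0,...} example with N = 5
print(weight([18, 2]))   # 190 = (1/2)*N*(N-1) for {N-2,2,0,...} with N = 20
```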

It is better to work with the natural logarithm:

ln W = ln [N! / (n_0! n_1! n_2! ...)]
     = ln N! - ln(n_0! n_1! n_2! ...)
     = ln N! - (ln n_0! + ln n_1! + ln n_2! + ...)
     = ln N! - Σ_i ln n_i!

Stirling's approximation: ln x! ≈ x ln x - x. Therefore

ln W ≈ (N ln N - N) - Σ_i (n_i ln n_i - n_i) = N ln N - Σ_i n_i ln n_i

(the -N and the +Σ_i n_i cancel, because Σ_i n_i = N).
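A short numerical comparison of the exact ln W with the Stirling form (a sketch; the configuration is invented for illustration):

```python
from math import lgamma, log

def ln_weight_exact(populations):
    """ln W from exact factorials, using lgamma(n + 1) = ln(n!)."""
    N = sum(populations)
    return lgamma(N + 1) - sum(lgamma(n + 1) for n in populations)

def ln_weight_stirling(populations):
    """ln W ~ N ln N - sum_i n_i ln n_i (Stirling's approximation)."""
    N = sum(populations)
    return N * log(N) - sum(n * log(n) for n in populations if n > 0)

pops = [600, 300, 100]            # a hypothetical configuration of N = 1000 molecules
print(ln_weight_exact(pops))      # ~891
print(ln_weight_stirling(pops))   # ~898; the agreement improves as N grows
```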

Which is the dominating configuration, the one having maximum weight? The generalized approach is to set dW = 0. There is a complication: not every configuration is allowed. Two constraints must be satisfied:
1. The energy is constant: Σ_i n_i ε_i = E
2. The number of molecules is constant: Σ_i n_i = N
The populations in the configuration of greatest weight depend on the energy of the state:

n_i / N = e^{-βε_i} / Σ_i e^{-βε_i},   with β = 1/(kT)

This is the Boltzmann distribution; i runs over the available states. Temperature is what determines the most probable populations.
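A minimal sketch of the Boltzmann distribution for an arbitrary set of levels (the energies and the temperature are illustrative values):

```python
import math

k_B = 1.380649e-23   # J K^-1

def boltzmann_populations(energies, T):
    """Fractional populations n_i/N = exp(-beta*eps_i) / sum_j exp(-beta*eps_j)."""
    beta = 1.0 / (k_B * T)
    weights = [math.exp(-beta * e) for e in energies]
    q = sum(weights)             # the molecular partition function
    return [w / q for w in weights]

# Three equally spaced levels (in J), spacing comparable to kT at 300 K
eps = [0.0, 2.0e-21, 4.0e-21]
print(boltzmann_populations(eps, 300.0))   # populations fall off with increasing energy
```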

Boltzmann distribution: population p_i = e^{-βε_i} / q, where

q = Σ_i e^{-βε_i}

is the molecular partition function (the sum is over states). Another form of q, as a sum over levels with degeneracies g_i: q = Σ_levels g_i e^{-βε_i}.

Look at the limiting cases. lim_{T→0} q = g_0: because ε_0 = 0 while ε is finite for all higher levels, e^{-βε_0} = 1 and, since e^{-x} → 0 as x → ∞, every other term vanishes as T → 0. lim_{T→∞} q = ∞: since e^{-x} → 1 as x → 0, all terms reduce to 1 and the sum grows without bound; when the number of states is finite, we will not get ∞ but the total number of states.

There are several ways of looking at q: 1. The partition function is a measure of the number of available states. 2. The partition function is the number of thermally accessible states. 3. It tells how the molecules are partitioned into the available states. How to get thermodynamic properties from it?
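A small numerical illustration of the two limits for a finite set of levels (a sketch; the level energies are invented, and the ground level is made doubly degenerate to show q → g_0):

```python
import math

k_B = 1.380649e-23   # J K^-1

def partition_function(levels, T):
    """q = sum over levels of g_i * exp(-eps_i / kT); energies in J, degeneracies g_i."""
    beta = 1.0 / (k_B * T)
    return sum(g * math.exp(-beta * e) for e, g in levels)

# (energy, degeneracy): a doubly degenerate ground level plus two excited levels
levels = [(0.0, 2), (3.0e-21, 1), (6.0e-21, 1)]

print(partition_function(levels, 1.0))     # -> 2 = g_0 (low-temperature limit)
print(partition_function(levels, 1.0e6))   # -> 4 = total number of states (finite set)
```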

Evaluation of the molecular partition function. For a uniform ladder of levels 0, ε, 2ε, 3ε, ..., use the geometric series 1 + x + x² + x³ + ... = 1/(1 - x):

q = 1 + e^{-βε} + e^{-2βε} + e^{-3βε} + ... = 1 + e^{-βε} + (e^{-βε})² + (e^{-βε})³ + ... = 1/(1 - e^{-βε})

The fraction of molecules in each energy level is p_i = e^{-βε_i}/q = (1 - e^{-βε}) e^{-βε_i}. Discussion of the figure, next slide.

For a two-level system (levels 0 and ε):
p_0 = 1/(1 + e^{-βε})
p_1 = e^{-βε}/q = e^{-βε}/(1 + e^{-βε})
As T → ∞, both p_0 and p_1 go to ½. Consequences? (A numerical check follows the figure below.)

[Figure: populations of the levels at low temperature and at high temperature.]
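A quick check of the two-level populations as the temperature is raised (a sketch; the level spacing ε is an arbitrary illustrative value):

```python
import math

k_B = 1.380649e-23   # J K^-1
eps = 4.0e-21        # J, an illustrative two-level spacing

def two_level_populations(T):
    """p0 = 1/(1 + exp(-beta*eps)), p1 = exp(-beta*eps)/(1 + exp(-beta*eps))."""
    x = math.exp(-eps / (k_B * T))
    return 1.0 / (1.0 + x), x / (1.0 + x)

for T in (10.0, 300.0, 3000.0, 3.0e6):
    p0, p1 = two_level_populations(T)
    print(f"T = {T:>9.0f} K:  p0 = {p0:.3f}, p1 = {p1:.3f}")
# As T increases, both populations approach 1/2, as stated above.
```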

Approximations: partition functions are often complex, and analytical expressions cannot be obtained in several cases; approximations then become very useful. For example, the partition function for one-dimensional motion in a box of length X:

E_n = n²h²/(8mX²),  n = 1, 2, ...

Measuring energies from the ground level, ε_n = (n² - 1)ε with ε = h²/(8mX²), so

q_x = Σ_{n=1}^∞ e^{-(n²-1)βε} ≈ ∫_1^∞ e^{-(n²-1)βε} dn

because the energy levels are so close together that the sum becomes an integral. It makes no big difference if we take the lower limit to 0 and replace n² - 1 by n². With the substitution x² = n²βε, dn = dx/(βε)^{1/2}:

q_x = (1/βε)^{1/2} ∫_0^∞ e^{-x²} dx = (1/βε)^{1/2} (π^{1/2}/2) = (2πm/(h²β))^{1/2} X

where ε has been substituted in the last step.
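A numerical check that the closed form reproduces the direct sum (a sketch; an electron in a 1 μm box is used only because its level spacing keeps the direct sum short enough to evaluate term by term):

```python
import math

h   = 6.62607015e-34   # J s
k_B = 1.380649e-23     # J K^-1

m = 9.109e-31          # kg (an electron, chosen so the direct sum stays short)
X = 1.0e-6             # m
T = 300.0              # K
beta = 1.0 / (k_B * T)
eps = h**2 / (8 * m * X**2)

# Direct sum q_x = sum_n exp(-(n^2 - 1)*beta*eps), truncated once terms are negligible
q_sum, n = 0.0, 1
while True:
    term = math.exp(-(n * n - 1) * beta * eps)
    q_sum += term
    if term < 1e-12:
        break
    n += 1

# Closed form q_x = (2*pi*m/(h^2*beta))**(1/2) * X
q_closed = math.sqrt(2 * math.pi * m / (h * h * beta)) * X

print(q_sum, q_closed)   # the two agree closely when many levels are thermally accessible
```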

Independent motion in three dimensions:

ε_{n1 n2 n3} = ε_{n1}(X) + ε_{n2}(Y) + ε_{n3}(Z)

The energy is a sum of independent terms, so the sum over all states factorizes:

q = Σ_{all n} e^{-βε_{n1}(X) - βε_{n2}(Y) - βε_{n3}(Z)} = (Σ_{n1} e^{-βε_{n1}(X)}) (Σ_{n2} e^{-βε_{n2}(Y)}) (Σ_{n3} e^{-βε_{n3}(Z)}) = q_x q_y q_z

q = (2πm/(h²β))^{3/2} XYZ = V/Λ³,   Λ = h(β/(2πm))^{1/2} = h/(2πmkT)^{1/2}

Λ has the dimensions of length: the thermal wavelength. (Check the units with J = kg m² s⁻².)

Question: How many more quantum states will be accessible for ¹⁸O₂ compared to ¹⁶O₂, if it were to be confined in a box of 1 cm³?
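A sketch of how the question can be worked out numerically (the masses are taken as 32 u and 36 u; note that the ratio q(¹⁸O₂)/q(¹⁶O₂) = (m₁₈/m₁₆)^{3/2} is independent of T and V, so the 300 K below is only an assumed value):

```python
import math

h   = 6.62607015e-34   # J s
k_B = 1.380649e-23     # J K^-1
u   = 1.66053907e-27   # kg, atomic mass unit

def q_translational(m_kg, V_m3, T):
    """q = V / Lambda^3 with thermal wavelength Lambda = h / sqrt(2*pi*m*k_B*T)."""
    Lam = h / math.sqrt(2 * math.pi * m_kg * k_B * T)
    return V_m3 / Lam**3

V, T = 1.0e-6, 300.0                  # 1 cm^3 box, assumed 300 K
q16 = q_translational(32 * u, V, T)   # 16O2
q18 = q_translational(36 * u, V, T)   # 18O2
print(q16, q18, q18 / q16)            # ratio = (36/32)**1.5, roughly 1.19
```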

How to get thermodynamics? All the information about the thermodynamic properties of the system is contained in the partition function; it is a kind of thermal wavefunction.

Total energy: E = Σ_i n_i ε_i. The most probable configuration dominates, so we use the Boltzmann distribution:

E = (N/q) Σ_i ε_i e^{-βε_i}

Since ε_i e^{-βε_i} = -(d/dβ) e^{-βε_i},

E = -(N/q) Σ_i (d/dβ) e^{-βε_i} = -(N/q) (dq/dβ)

With U = U(0) + E,

U = U(0) - (N/q)(∂q/∂β)_V = U(0) - N(∂ ln q/∂β)_V

1. All energies are relative: E is the value of U relative to its value at T = 0.
2. The derivative with respect to β is a partial derivative because there are other parameters (such as V) on which the energy depends.

The partition function gives the internal energy of the system.
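A minimal numerical sketch of this route to U for a hypothetical two-level molecule: the derivative of ln q with respect to β is taken by finite difference and compared with the direct average N Σ_i p_i ε_i (all numbers are illustrative):

```python
import math

k_B = 1.380649e-23    # J K^-1
N_A = 6.02214076e23   # mol^-1
eps = [0.0, 4.0e-21]  # J, a hypothetical two-level molecule

def ln_q(beta):
    return math.log(sum(math.exp(-beta * e) for e in eps))

T = 300.0
beta = 1.0 / (k_B * T)

# U - U(0) = -N (d ln q / d beta), estimated with a central finite difference
db = beta * 1e-6
U_minus_U0 = -N_A * (ln_q(beta + db) - ln_q(beta - db)) / (2 * db)

# Direct check: N * sum_i eps_i * p_i with Boltzmann populations
q = sum(math.exp(-beta * e) for e in eps)
U_direct = N_A * sum(e * math.exp(-beta * e) / q for e in eps)

print(U_minus_U0, U_direct)   # the two routes agree (J per mole of molecules)
```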

How to get entropy? The internal energy is U = U(0) + E, i.e. U = U(0) + Σ_i n_i ε_i. Internal energy changes occur through changes in the populations (n_i → n_i + dn_i) or in the energy states (ε_i → ε_i + dε_i). Consider the general case:

dU = dU(0) + Σ_i n_i dε_i + Σ_i ε_i dn_i

For constant-volume changes the ε_i do not change, so dU = Σ_i ε_i dn_i. Also dU = dq_rev = T dS, so

dS = dU/T = kβ Σ_i ε_i dn_i

From the maximization of W subject to the constraints, βε_i = (∂ ln W/∂n_i) + α, hence

dS = k Σ_i (∂ ln W/∂n_i) dn_i + kα Σ_i dn_i

The number of molecules does not change, so the second term is zero:

dS = k Σ_i (∂ ln W/∂n_i) dn_i = k d(ln W),   and therefore S = k ln W
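A small numerical illustration of S = k ln W (a sketch: with the Stirling form of ln W and n_i = N p_i, S ≈ -N k Σ_i p_i ln p_i, which is compared against the exact k ln W for a modest, invented configuration):

```python
from math import lgamma, log

k_B = 1.380649e-23   # J K^-1

def S_exact(populations):
    """S = k ln W with W = N!/(prod_i n_i!); factorials via lgamma."""
    N = sum(populations)
    lnW = lgamma(N + 1) - sum(lgamma(n + 1) for n in populations)
    return k_B * lnW

def S_stirling(populations):
    """S = -N k sum_i p_i ln p_i, from the Stirling form of ln W."""
    N = sum(populations)
    return -N * k_B * sum((n / N) * log(n / N) for n in populations if n > 0)

pops = [724, 276]   # e.g. a two-level system with p0 ~ 0.72 for N = 1000 molecules
print(S_exact(pops), S_stirling(pops))   # close; they converge as N grows
```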