Lecture 13

Objectives:

1. Ensembles: be able to list the characteristics of the following: (a) Microcanonical, (b) Canonical, (c) Grand Canonical.
2. Be able to use Lagrange's method of undetermined multipliers.
3. Be able to derive the canonical partition function.

Ensembles

1. Microcanonical Ensemble: In this ensemble the number of molecules, the volume, and the total energy are held fixed (N, V, E). Consider a collection of systems that is completely isolated from the universe. No heat, no work, so E = constant. This is the microcanonical ensemble. It is used extensively in molecular dynamics calculations, as it is the most natural way to solve the equations of motion. The partition function is denoted by Ω(N, V, E) and represents the total number of states available to the system consistent with the constraints. The pressure and temperature fluctuate and may differ from one ensemble member to another. The energy does not fluctuate because it is an eigenvalue of the system.

   Ω(N, V, E) = ω(E)

where ω(E) is the degeneracy of the system with energy E. The total number of states available to the system is actually a measure of the entropy of the system. You can imagine that the larger the number of states, the larger the number of ways of arranging the molecules in the system. This argument leads to the famous formula for the entropy as expressed by Boltzmann and inscribed on his tombstone,

   S = k ln Ω

although Boltzmann used W instead of Ω for the number of available states.

2. Canonical Ensemble: The number of molecules, the volume, and the temperature are held constant. (Strictly, the temperature of an individual ensemble member may fluctuate about the mean value T.) The canonical partition function is denoted Q(N, V, T). Think of a collection of systems in a heat bath so that they are all thermostated to the same temperature. There are no work interactions allowed, but heat may be transferred, so E is not constant. The energy fluctuates about some mean value for each member of the ensemble; other properties also fluctuate. (A short numerical sketch of these ensemble sums follows item 4 below.)

   Q(N, V, T) = Σ_j exp(-βE_j)              (sum over states j)
              = Σ_E ω(E) exp(-βE)           (sum over energy levels E)
              = Σ_E Ω(N, V, E) exp(-βE)

[Figure 1: A portion of an ensemble of macroscopic systems, all with the same number of molecules N, the same volume V, and at the same temperature T.]

3. Grand Canonical Ensemble: The volume, temperature, and chemical potential are held constant. The number of particles fluctuates, so this is a materially open system, i.e., molecules are allowed to diffuse into and out of the system. The grand canonical partition function is denoted by Ξ(µ, V, T). The total energy, system pressure, etc., fluctuate around their mean values.

   Ξ(µ, V, T) = Σ_N Σ_j exp(-βE_j) exp(βµN)
              = Σ_N Q(N, V, T) exp(βµN)
              = Σ_N Σ_E Ω(N, V, E) exp(-βE) exp(βµN)

   P = (kT/V) ln Ξ

4. Isothermal-Isobaric Ensemble: The number of molecules, the pressure, and the temperature are held constant. Can you guess what thermodynamic function corresponds to the isothermal-isobaric ensemble?

   Δ(N, P, T) = Σ_V Σ_j exp(-βE_j) exp(-βPV)
              = Σ_V Q(N, V, T) exp(-βPV)
              = Σ_V Σ_E Ω(N, V, E) exp(-βE) exp(-βPV)
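To make the ensemble sums above concrete, here is a minimal numerical sketch. The two toy models, the level spacing eps, and all numerical values are my own assumptions for illustration, not part of the lecture: N independent two-level molecules for Ω and Q, and a lattice gas of M identical zero-energy adsorption sites for Ξ.

import math

# Toy model A: N independent two-level "molecules" with level spacing eps.
#   Omega(N, E = n*eps) = C(N, n)                  -> microcanonical, S = k ln(Omega)
#   Q(N, V, T) = sum_n C(N, n) exp(-beta*n*eps)    -> canonical sum over energy levels
# Toy model B: a lattice gas of M identical adsorption sites with zero adsorption energy.
#   Q(N, V, T) = C(M, N), Xi(mu, V, T) = sum_N Q(N, V, T) exp(beta*mu*N)

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_microcanonical(N, n):
    """S = k ln(Omega), with Omega = C(N, n) states at energy E = n*eps."""
    return k_B * math.log(math.comb(N, n))

def Q_canonical(N, eps, T):
    """Q(N,V,T) = sum over energy levels of Omega(E) * exp(-beta*E)."""
    beta = 1.0 / (k_B * T)
    return sum(math.comb(N, n) * math.exp(-beta * n * eps) for n in range(N + 1))

def Xi_grand(M, mu, T):
    """Xi(mu,V,T) = sum_N Q(N,V,T) * exp(beta*mu*N), with Q(N) = C(M, N)."""
    beta = 1.0 / (k_B * T)
    return sum(math.comb(M, N) * math.exp(beta * mu * N) for N in range(M + 1))

if __name__ == "__main__":
    N, eps, T = 100, 1.0e-21, 300.0
    beta = 1.0 / (k_B * T)
    print("S/k = ln Omega        :", entropy_microcanonical(N, N // 2) / k_B)
    # Cross-check: for independent two-level molecules Q = (1 + exp(-beta*eps))**N.
    print("Q (sum vs closed form):", Q_canonical(N, eps, T), (1.0 + math.exp(-beta * eps)) ** N)
    M, mu = 50, -1.0e-21
    # Cross-check: for independent sites Xi = (1 + exp(beta*mu))**M.
    print("Xi (sum vs closed form):", Xi_grand(M, mu, T), (1.0 + math.exp(beta * mu)) ** M)

The cross-checks against the closed-form results for independent subsystems are a convenient way to confirm that the explicit sums over levels and particle numbers have been set up correctly.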

Ensemble Averages

Consider an ensemble made up of A members (replicas of the macroscopic system), each with N molecules. Let a_i be the number of member systems in quantum state i with energy E_i, so that

   A = Σ_i a_i
   E_tot = Σ_i a_i E_i

We want to find the average energy (and other properties) ⟨E⟩. This means that we need to know the probability of occurrence of each quantum state i. We know that for this particular ensemble (collection of systems) the probability of observing a system in state i is

   P_i = a_i / A

But this is just for one ensemble. There are many, many possible ensembles. In other words, think of making copies of the ensemble, or think of letting the ensemble evolve in time and then making new observations. We want to calculate the average number of systems in state i over many, many ensembles. Let this be ⟨a_i⟩. The number ⟨a_i⟩ is actually the expectation value of a_i. If we know ⟨a_i⟩, then we can find the average energy (or any other property). Since the principle of equal a priori probabilities applies, each state with the same energy is equally likely. Hence, we need the number of ways that a particular distribution of systems can occur, that is, the number of ways the A ensemble members can be arranged with a_i of them in each state i. The way to find ⟨a_i⟩ is to use the multinomial distribution,

   f(a_1, a_2, ...) = A! / (Π_j a_j!)

where

   A = Σ_j a_j

Recall that the binomial theorem is

   (a + b)^M = Σ_{N=0}^{M} C(M, N) a^N b^(M-N)

where C(M, N) means the number of combinations of M things taken N at a time. For example,

   (a + b)^5 = b^5 + 5ab^4 + 10a^2 b^3 + 10a^3 b^2 + 5a^4 b + a^5

The multinomial distribution is a generalization of the binomial distribution to higher dimensions. For three dimensions, the multinomial (trinomial) theorem is

   (a + b + c)^M = Σ_{n1} Σ_{n2} Σ_{n3} [M! / (n1! n2! n3!)] a^{n1} b^{n2} c^{n3}

where the n_i are subject to the constraint

   Σ_i n_i = M

Example: Find (a + b + c)^3 using the multinomial theorem. Write down the combinations:

   n1  n2  n3
   0   0   3
   0   1   2
   0   2   1
   0   3   0
   1   0   2
   1   1   1
   1   2   0
   2   0   1
   2   1   0
   3   0   0

   (a + b + c)^3 = c^3 + 3bc^2 + 3b^2 c + b^3 + 3ac^2 + 6abc + 3ab^2 + 3a^2 c + 3a^2 b + a^3

We want to find

   P_i = ⟨a_i⟩ / A

where

   ⟨a_i⟩ = Σ_{a} a_i f(a_1, a_2, ...) / Σ_{a} f(a_1, a_2, ...)

and the sums run over all allowed distributions {a_j}. We cannot possibly write down all the terms in this equation, so we simplify the expression by replacing the sum over all distributions with the most probable distribution, whose weight is f*:

   ⟨a_i⟩ = Σ_a a_i f / Σ_a f ≈ a_i* f* / f* = a_i*

where a_i* is the value of a_i in the distribution that maximizes f. Hence, we want to find a_i*. We want to find P_j = a_j*/A subject to constraints. What are the constraints? They are the (N, V, T) ensemble constraints,

   A = Σ_i a_i
   E_tot = Σ_i a_i E_i
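As a quick check of the worked example, the short script below (an illustration, not part of the lecture) enumerates all (n1, n2, n3) with n1 + n2 + n3 = 3, prints the trinomial coefficients 3!/(n1! n2! n3!), and confirms that they sum to 3^3 = 27, as they must since setting a = b = c = 1 in (a + b + c)^3 gives 27.

from itertools import product
from math import factorial

# Enumerate every exponent triple (n1, n2, n3) with n1 + n2 + n3 = 3 and compute
# the trinomial coefficient 3!/(n1! n2! n3!).  The ten terms reproduce the
# expansion of (a+b+c)^3 given above, and the coefficients sum to 3**3 = 27.
M = 3
terms = []
for n1, n2, n3 in product(range(M + 1), repeat=3):
    if n1 + n2 + n3 == M:
        coeff = factorial(M) // (factorial(n1) * factorial(n2) * factorial(n3))
        terms.append(((n1, n2, n3), coeff))

for (n1, n2, n3), coeff in sorted(terms):
    print(f"{coeff} * a^{n1} b^{n2} c^{n3}")
print("sum of coefficients:", sum(c for _, c in terms))  # prints 27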

For this we need a mathematical tool called Lagrange's method of undetermined multipliers. Given a multidimensional function f(x_1, ..., x_n), we want to maximize (or minimize) the function subject to some constraints g_k(x_1, ..., x_n) = 0 for k = 1, ..., l. For unconstrained minimization,

   df = Σ_{i=1}^{n} (∂f/∂x_i) dx_i = 0

or, because each x_i is independent,

   ∂f/∂x_i = 0

Lagrange's method is to multiply each constraint equation by some unknown number and construct a new objective function

   F = f(x_1, ..., x_n) + Σ_{k=1}^{l} λ_k g_k(x_1, ..., x_n)

Then take

   ∂F/∂x_i = 0   and   ∂F/∂λ_k = 0

and solve the resulting equations (if possible).

Back to our problem: we have the constraints

   A = Σ_j a_j   and   E_tot = Σ_j a_j E_j

We want to replace f(a_1, a_2, ...) with its maximum term, but subject to these constraints. Our goal is to find f* and the a_j*, subject to the constraints on the total number of ensemble members and on the total energy. For convenience we maximize ln f rather than f. So we use multipliers α and β to form a new function

   F = ln f + α (A - Σ_j a_j) + β (E_tot - Σ_j a_j E_j)
     = ln A! - Σ_j ln a_j! + α (A - Σ_j a_j) + β (E_tot - Σ_j a_j E_j)
     ≈ A ln A - Σ_j a_j ln a_j + α (A - Σ_j a_j) + β (E_tot - Σ_j a_j E_j)

where the last line uses Stirling's approximation, ln x! ≈ x ln x - x.
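Before taking the derivative of F in the next step, here is a minimal sketch of Lagrange's method on a made-up example (the function f(x, y) = x*y and the constraint x + y = 1 are my own choices, not from the lecture). Following the recipe above, it builds F = f + λ*g with sympy and sets every partial derivative to zero.

import sympy as sp

# Maximize f(x, y) = x*y subject to g(x, y) = x + y - 1 = 0 using an
# undetermined multiplier: form F = f + lam*g and require that all partial
# derivatives of F (with respect to x, y, and lam) vanish.
x, y, lam = sp.symbols("x y lam", real=True)
f = x * y
g = x + y - 1
F = f + lam * g

solutions = sp.solve([sp.diff(F, v) for v in (x, y, lam)], [x, y, lam], dict=True)
print(solutions)  # [{x: 1/2, y: 1/2, lam: -1/2}] -> constrained maximum at x = y = 1/2

Requiring ∂F/∂λ = 0 simply recovers the constraint itself, which is why the multiplier method turns a constrained problem into an unconstrained one in the enlarged set of variables.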

Now take the derivative of F with respect to each a_i and set it equal to zero (to maximize). Note that A is a constant:

   ∂F/∂a_i = -ln a_i - α - βE_i = 0

(the constant -1 from differentiating a_i ln a_i has been absorbed into α). Hence

   a_i* = exp(-α) exp(-βE_i)

Now we can find the undetermined multiplier α:

   A = Σ_j a_j* = exp(-α) Σ_j exp(-βE_j),   or   exp(-α) = A / Σ_j exp(-βE_j)

Put this into the definition of P_j:

   P_j = a_j*/A = (1/A) exp(-α) exp(-βE_j) = exp(-βE_j) / Σ_k exp(-βE_k)

Compare with our definition of the partition function and we identify

   Q(N, V, T) = Σ_j exp(-βE_j),   so that   P_j = exp(-βE_j) / Q(N, V, T)

What is the second parameter β? After quite a lot of work you can prove that β = 1/kT, where k is Boltzmann's constant.
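As a numerical sanity check of this result (the energy levels and temperature below are illustrative values I made up, not data from the lecture), the sketch computes P_j = exp(-βE_j)/Q for a handful of levels with β = 1/kT, verifies that the probabilities sum to one, and evaluates the ensemble average ⟨E⟩ = Σ_j P_j E_j.

import math

# Boltzmann probabilities for a made-up set of energy levels at T = 300 K:
# with beta = 1/(k*T), P_j = exp(-beta*E_j)/Q should sum to one, and the
# average energy is <E> = sum_j P_j * E_j.
k_B = 1.380649e-23                      # Boltzmann's constant, J/K
T = 300.0                               # temperature, K
beta = 1.0 / (k_B * T)
E = [0.0, 1.0e-21, 2.0e-21, 5.0e-21]    # hypothetical energy levels, J

Q = sum(math.exp(-beta * Ej) for Ej in E)       # canonical partition function
P = [math.exp(-beta * Ej) / Q for Ej in E]      # Boltzmann probabilities

print("Q          =", Q)
print("sum of P_j =", sum(P))                   # should print 1.0
print("<E> (J)    =", sum(p * Ej for p, Ej in zip(P, E)))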