Chapter 4: Going from microcanonical to canonical ensemble, from energy to temperature.

All calculations in statistical mechanics can be done in the microcanonical ensemble, where all copies of the system are in microstates of the same energy, with well-defined volume and particle number (a closed system). We saw that with just two postulates and some calculus, relations like PV = nRT or dS = dq/T (at constant volume or pressure) can be derived. In particular, we were able to define a new variable, temperature, as the quantity that is equal when two systems between which heat is allowed to flow come to equilibrium. Similarly, we defined what pressure and chemical potential are.

In reality, a system is never closed; there is always a larger environment around the system that can exchange heat, work, volume, or particles in some combination. We could always treat the system + environment as one larger closed system and apply the microcanonical ensemble and postulate 2 directly. This works just fine, but it can be very cumbersome if you have to treat the environment explicitly all the time. Often it is more practical to treat just the system explicitly and leave out the environment as much as possible.

Consider heat flow as an example. Many reactions are done in open beakers, where heat can flow in and out. At equilibrium it is not the energy in the beaker that is constant but the temperature: the same temperature as the much larger lab environment, from and into which heat can flow until equilibrium is reached. It would therefore be nice to have formulas for systems at constant temperature. Such systems are called canonical in statistical mechanics, while systems at constant energy are called microcanonical. The ensemble of microstates of a system at constant temperature that belong to the same macrostate is called the canonical ensemble, in analogy to the microcanonical ensemble that we saw in chapter 2. To discuss a system at constant T, we embed the system inside a heat bath (the environment) and connect it to the bath by a diathermal wall (one that allows heat to flow).

Important note: only the bath needs to be large, so that $W_{tot}$ and $W_{bath}$ satisfy the large-number approximations necessary for thermodynamics; $W_{sys}$ can be small. The system could be a single molecule on a big surface: if heat can flow from the molecule to the surface and vice versa, the molecule will still have the same temperature as the surface.

1) Deriving the canonical partition function from the microcanonical partition function.

Remember what a partition function is: it is the number of microstates consistent with a given macrostate. For example, in the microcanonical ensemble, W is the microcanonical partition function; it is the number of microstates at constant energy U. What about the number of microstates in a macrostate at constant temperature, instead of constant energy?

That would be called the canonical partition function, since the ensemble at constant temperature is the canonical ensemble.

Let the system and heat bath (environment) together have the constant energy $U_{tot}$ (the composite is a closed system). Then the number of ways the system can be at energy $E_j$ is $W(E_j)\,W_{bath}(U_{tot}-E_j)$. The number of ways the system can be at any energy is simply $W_{tot}(U_{tot})$. From this follows the probability of the system being at energy $E_j$:

$$\rho_j^{(eq)} = \frac{W(E_j)\,W_{bath}(U_{tot}-E_j)}{W_{tot}(U_{tot})};$$

using $S = k_B \ln W$,

$$\rho_j^{(eq)} = W(E_j)\,e^{\frac{1}{k_B}\left\{S_{bath}(U_{tot}-E_j)-S_{tot}(U_{tot})\right\}}.$$

We can simplify the exponent as follows. Let $U = \langle E \rangle$ be the (as yet unknown) average energy of the canonical ensemble of systems (= the ensemble at constant temperature). Then

(a) $S_{tot}(U_{tot}) = S(U) + S_{bath}(U_{tot}-U)$, and, Taylor-expanding $S_{bath}$ about the average energy U,

(b) $S_{bath}(U_{tot}-E_j) = S_{bath}(U_{tot}-U) + \left(\frac{\partial S_{bath}}{\partial E_j}\right)_{E_j=U}(E_j-U)+\ldots$
$\qquad = S_{bath}(U_{tot}-U) + \left(\frac{\partial S_{bath}}{\partial U_{bath}}\right)\left(\frac{\partial U_{bath}}{\partial E_j}\right)(E_j-U)+\ldots$
$\qquad = S_{bath}(U_{tot}-U) - \frac{1}{T}(E_j-U)+\ldots$

In the last step, we used $1/T = \partial S/\partial U$ and $dU_{bath} = -dU_{sys}$ to simplify the second term. Why are the higher-order terms negligible? Usually in a Taylor series it is because $x = E_j - U$ is small. This is true for a large system (central limit theorem), but not necessarily for a small system. Here the reason is that the bath is assumed to be huge. This makes the zeroth-order term $S_{bath}$ large compared to the first-order term, but it also makes the second derivative $\partial(1/T)/\partial U_{bath}$ negligible compared to the first derivative 1/T: the temperature of an infinite heat bath is just not going to change if you add some energy to the bath.

Inserting both (a) and (b) into the exponential above,

$$\rho_j^{(eq)} = W(E_j)\,e^{-\frac{1}{k_B}\left\{\frac{1}{T}E_j-\frac{1}{T}U+S\right\}} = W(E_j)\,e^{-\frac{1}{k_B T}\left\{E_j-U+TS\right\}} = \frac{W_j\,e^{-E_j/k_B T}}{Z};$$

in the last step we used $U - TS = A$ and defined $Z \equiv e^{-A/k_B T}$, with $W_j = W(E_j)$ as a shorter notation.
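To make the counting argument above concrete, here is a small numerical sketch: a three-level toy system exchanges energy quanta with a large Einstein-solid bath, the composite microstates are counted directly, and the resulting level probabilities are compared with the Boltzmann form $W_j\,e^{-E_j/k_B T}/Z$ at the bath temperature obtained from $1/T = \partial S_{bath}/\partial U_{bath}$. The level scheme, bath size, and total energy are illustrative assumptions, not part of the derivation.

```python
# Numerical sketch of the derivation above (illustrative, not from the notes):
# a 3-level "system" exchanging energy quanta with a large Einstein-solid "bath".
# Units: energies in quanta, k_B = 1, so T comes out in units of one quantum.
from math import comb, exp, lgamma

N_bath = 5000      # number of bath oscillators (large, so T_bath barely changes)
U_tot  = 20000     # total quanta shared by system + bath (closed composite)

def W_bath(q):
    """Multiplicity of an Einstein solid: ways to place q quanta in N_bath oscillators."""
    return comb(q + N_bath - 1, q)

def ln_W_bath(q):
    """ln W_bath via log-gamma, used for the entropy derivative."""
    return lgamma(q + N_bath) - lgamma(q + 1) - lgamma(N_bath)

# Toy system: energy levels E_j (in quanta) with degeneracies W_j (assumed values)
levels = [(0, 1), (5, 3), (12, 2)]

# Postulate: all microstates of the closed composite are equally likely, so
# P(E_j) is proportional to W_j * W_bath(U_tot - E_j), the counting argument above.
weights = [W_j * W_bath(U_tot - E_j) for E_j, W_j in levels]
norm = sum(weights)
counted = [w / norm for w in weights]

# Bath temperature from 1/T = dS_bath/dU_bath with S = ln W (finite difference)
T = 1.0 / (ln_W_bath(U_tot + 1) - ln_W_bath(U_tot))

# Canonical prediction: rho_j = W_j * exp(-E_j/T) / Z
boltz = [W_j * exp(-E_j / T) for E_j, W_j in levels]
Z = sum(boltz)
for (E_j, W_j), p_count, b in zip(levels, counted, boltz):
    print(f"E_j = {E_j:2d}: counted {p_count:.6f}   Boltzmann {b / Z:.6f}")
# The two columns agree closely because the huge bath makes the higher-order
# Taylor terms negligible.
```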

The expression $\rho_j = W_j\,e^{-E_j/k_B T}/Z$ is the probability of being in a state with energy $E_j$ when the temperature is T. We can derive an alternate formula for the normalization factor Z by realizing that the total probability has to add up to 1:

$$\sum_j \rho_j = 1 \;\Rightarrow\; \frac{1}{Z}\sum_j W_j\,e^{-E_j/k_B T} = 1 \quad\text{or}\quad Z = \sum_j W_j\,e^{-E_j/k_B T}.$$

2) The meaning of the canonical partition function

Z is called the canonical partition function, in analogy to the microcanonical partition function W. It is the sum over all microcanonical partition functions, weighted by a Boltzmann factor that takes into account that at a given temperature T, higher-energy microstates are less likely to be populated. Sometimes the notation Q is used for Z.

Just as W counts the number of microstates accessible to a closed system at constant energy, Z counts the number of microstates accessible to an open system at constant temperature. For example, if T = 0, then $Z = W_0$, the microcanonical partition function or degeneracy of the lowest energy level. This degeneracy is usually equal to 1: usually there is only one microstate without extractable energy (third law of thermodynamics, chapter 3). Thus Z = 1 at very low temperature, because only the one lowest-energy microstate can be populated. As the temperature increases, Z increases because more microstates can be populated. When $k_B T \gg E_j$, a state at energy $E_j$ makes a full contribution to Z, since $e^{-x} \to 1$ as $x = E_j/k_B T \to 0$. If a state has $E_j \gg k_B T$, it makes only a small contribution to Z. Thus Z counts the effective number of accessible microstates, including the ones that are partially accessible.

For practical use, the important consequence of the derivation given above is that

$$Z = \sum_j W_j\,e^{-E_j/k_B T} = e^{-A/k_B T} = e^{-\beta A}, \qquad \text{where we define } \beta \equiv 1/k_B T.$$

This formula connects the microscopic world of individual energy levels and their degeneracies (degeneracy = microcanonical partition function) with the macroscopic world of thermodynamic potentials, in this case A, the potential useful at constant temperature, called the Helmholtz potential. Once the microcanonical partition functions $W(E_j) = W_j$ are known for each energy $E_j$, Z can be calculated. Once Z is known, A can be calculated, and from A all other thermodynamic quantities can be calculated, since A is equivalent to W, S, and U, as we saw in chapter 3.

Note 1: If E is continuous, we replace the sum by an integration, $Z = \int_0^{\infty} dE\,W(E)\,e^{-E/k_B T}$, where W(E) is the continuous density of states.
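Before moving on, here is a quick numerical illustration of the statement above that Z counts the effective number of accessible microstates; the toy level scheme is chosen only for this sketch.

```python
# Sketch: Z as an effective count of thermally accessible microstates
# (toy level scheme, k_B = 1, energies in units of a reference temperature).
from math import exp

levels = [(0.0, 1), (1.0, 3), (2.5, 6)]   # (E_j, W_j): 10 microstates in total

def Z(T):
    return sum(W_j * exp(-E_j / T) for E_j, W_j in levels)

for T in (0.01, 0.5, 1.0, 5.0, 100.0):
    print(f"T = {T:6.2f}   Z = {Z(T):6.3f}")
# T -> 0:      Z -> W_0 = 1  (only the ground state is accessible)
# T -> large:  Z -> 1 + 3 + 6 = 10  (every microstate contributes fully)
```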

Z = e βe i ˆρ eq (T,V ) = 1 Z e βĥ i=1 For example, if energy level =5 is 3-fold degenerate, this sum would have to contain the term e βe 5 three times, rather than explicitly writing W 5 e βe 5 = 3e βe 5 3) Computation of thermodynamic quantities from Z We saw in the last chapter how to prove useful thermodynamic formulas like ds=dq/t, or PV=nRT. We can use the same method to calculate any average quantity we want from Z. Exercise 1: What is the heat capacity of and RNA hairpin? Consider the following simple model for the reaction F! U that interconverts a folded RNA hairpin to an unfolded RNA hairpin. The figure below shows a simple 2-D lattice model for an RNA with four bases, two of which can base pair in the stem. In this model, we can count all the states that are non-superimposable by 2-D rotation: F U Fig. 4.1 Folding/unfolding reaction of an RNA hairpin in a simple 2-D lattice model that allows only 90 hinge motions of the bases, and counts only one interaction energy (base pairing energy), between the two black bases. Let the base pairing energy between the two black bases be ε kj/mole. Renormalizing the minimum energy to 0, we have E F = 0 W F =1 and E U =+ε and W U =5. The partition function for this system is Z = W F +W U e ε/rt. The probability of being in states F and U is = W F Z, ρ U = W Ue ε/rt, Z and the ratio of concentrations for the above reaction is therefore K = [U] [F] = ρ U. We will prove in chapter 5 that this ratio, known as the equilibrium constant K, can be calculated directly from the Gibbs free energy introduced in chapter 3.

The average energy of the system is given by

$$U = E_F\,\rho_F + E_U\,\rho_U = \frac{W_F}{Z}\cdot 0 + \frac{W_U\,e^{-\varepsilon/RT}}{Z}\cdot\varepsilon = \frac{W_U\,\varepsilon\,e^{-\varepsilon/RT}}{W_F + W_U\,e^{-\varepsilon/RT}}.$$

We now have the energy as a function of temperature and the parameters $W_F$, $W_U$ and ε. Now take the derivative of the energy with respect to temperature to obtain

$$C_V = \left(\frac{\partial U}{\partial T}\right)_V = \frac{1}{R}\left(\frac{\varepsilon}{T}\right)^2 \rho_F\,\rho_U.$$

This heat capacity has a maximum somewhat below the temperature where $\rho_U = \rho_F = 1/2$, in the middle of the folding/unfolding transition. Thus the melting temperature of an RNA piece can be determined by calorimetry by looking for the heat capacity maximum. Let's calculate this melting temperature, where $\rho_U = \rho_F = 1/2$: according to the definition of $\rho_F$ above, at the melting temperature $1/2 = 1/Z$ (since $W_F = 1$), or $2 = 1 + W_U\,e^{-\varepsilon/RT_m}$, or $T_m = \varepsilon/(R\ln W_U)$. This should make sense: the higher the interaction energy ε, the harder it will be to melt the RNA; the more unfolded microstates are available, the easier it will be to melt the RNA.

Note that a simple model like the above will not be perfectly accurate. For example, by making it 2-D, we are grossly underestimating the number of unfolded configurations compared to the folded ones. We could easily remedy this by writing $\rho_U$ as

$$\rho_U = \frac{(W_U/W_F)\,e^{-\varepsilon/RT}}{1 + (W_U/W_F)\,e^{-\varepsilon/RT}}$$

and making the ratio $W_U/W_F$ an adjustable parameter in our fitting model. Or we could do a better calculation by, say, running molecular dynamics to count configurations and evaluate energies. The important thing is that statistical mechanics lets you derive realistic model functions for your experiment, instead of fitting the data by a polynomial or some other function that has no physical meaning and provides no physical insight. In the above equation for $C_V$, the physical insight is that the heat capacity is directly related to the base-pairing energy ε, depends on the ratio $(\varepsilon/T)^2$, has a peak, depends on the ratio $W_U/W_F$ (which is probably much larger than 5 in reality), and that the RNA melts at a temperature $T_m = \varepsilon/(R\ln W_U)$. That's a lot of useful things you have already learned about RNA folding just by thinking about a simple model, before even touching the calorimeter to do the experiment.

Exercise 2: Derive a simple formula for the average energy of a system at constant temperature.

$$Z = e^{-\beta A} = e^{-\beta(U-TS)} = e^{-\beta U + \ln W(U)} \;\Rightarrow\; \frac{\partial Z}{\partial\beta} = -U\,Z \;\Rightarrow\; U = -\frac{1}{Z}\frac{\partial Z}{\partial\beta} = -\frac{\partial\ln Z}{\partial\beta}.$$

(The terms involving $\partial U/\partial\beta$ cancel in the differentiation, because $\partial\ln W/\partial U = 1/k_B T = \beta$.)

Exercise 3: Derive a formula for the average pressure of a system at constant temperature.

Hint: remember from chapter 3 that $P = -(\partial U/\partial V)$. The same is still true for the Helmholtz potential A:

$$P = -\left(\frac{\partial A}{\partial V}\right)_{T,N,\ldots} = k_B T\left(\frac{\partial\ln Z}{\partial V}\right)_{T,N,\ldots}.$$
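The sketch below returns to the RNA-hairpin model and checks these results numerically for the same illustrative parameters as above ($W_F = 1$, $W_U = 5$, assumed $\varepsilon = 20$ kJ/mol): it compares the analytic $C_V$ with a finite-difference $\partial U/\partial T$, locates the heat-capacity peak relative to $T_m = \varepsilon/(R\ln W_U)$, and also verifies the Exercise 2 formula $U = -\partial\ln Z/\partial\beta$ for this model (here $\beta = 1/RT$, since ε is a molar energy).

```python
# Sketch: check C_V = (1/R)(eps/T)^2 rho_F rho_U against a numerical dU/dT,
# locate the heat-capacity peak, and test U = -d(ln Z)/d(beta)
# (same illustrative parameters as above).
import numpy as np

R = 8.314e-3                      # gas constant, kJ/(mol K)
W_F, W_U, eps = 1.0, 5.0, 20.0

def Z(T):
    return W_F + W_U * np.exp(-eps / (R * T))

def U(T):                         # average energy, kJ/mol
    return eps * W_U * np.exp(-eps / (R * T)) / Z(T)

def C_V(T):                       # analytic result derived in the text
    rho_F = W_F / Z(T)
    rho_U = W_U * np.exp(-eps / (R * T)) / Z(T)
    return (eps / T) ** 2 * rho_F * rho_U / R

T = np.linspace(400.0, 3000.0, 26001)
C_V_numeric = np.gradient(U(T), T)               # finite-difference dU/dT
print("max |analytic - numeric C_V| =", np.max(np.abs(C_V(T) - C_V_numeric)))

T_peak = T[np.argmax(C_V(T))]
T_m = eps / (R * np.log(W_U))                    # temperature where rho_U = 1/2
print(f"C_V peak at {T_peak:.0f} K; T_m = {T_m:.0f} K")
# The (eps/T)^2 prefactor pushes the C_V peak below T_m; the shift shrinks
# as the ratio W_U/W_F becomes large.

beta = 1.0 / (R * T)                             # Exercise 2: U = -d(ln Z)/d(beta)
U_from_lnZ = -np.gradient(np.log(Z(T)), beta)
print("max |U - (-dlnZ/dbeta)| =", np.max(np.abs(U(T) - U_from_lnZ)))
```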

Exercise 4: Derive a formula for the entropy of a system at constant temperature.

Once we have U and A, it is easy to obtain S:

$$A = U - TS \;\Rightarrow\; S = \frac{1}{T}(U - A).$$

Remember that A is a function of T and S is a function of U, so to get a proper fundamental relation you have to substitute $T = \partial U/\partial S$ as a function of U and/or S everywhere. Thermodynamics provides us with the manipulations to get any variable we want once we have computed the partition function. Z (or W) contains all knowledge about the equilibrium properties of a system, just like U(S) and A(T) do.

Exercise 5: A diatomic molecule with spring constant k is heated to a temperature T. What is the average (root-mean-square) length L by which the molecule is compressed or expanded because of heating?

Answer: The applied force that stretches the spring is F = kL, where L is the displacement from the equilibrium bond distance. Instead of U = TS - PV for a 3-D system, we have U = TS - FL for a 1-D spring. Note that $FL = kL^2$, not $\tfrac12 kL^2$: U is not the potential energy but the total energy, and when the spring is heated, half the energy is stored as potential energy and the other half as kinetic energy (the classical equipartition principle). The energy levels of the spring are $E_n = \hbar\omega n$ if we shift the zero point $E_{n=0} = 0$ to be our reference energy. Now we can calculate Z:

$$Z = \sum_{n=0}^{\infty} e^{-\hbar\omega n/k_B T} = \sum_{n=0}^{\infty}\left(e^{-\hbar\omega/k_B T}\right)^n = \frac{1}{1 - e^{-\hbar\omega/k_B T}}$$

$$A = -k_B T\ln Z = +k_B T\ln\left(1 - e^{-\hbar\omega/k_B T}\right)$$

$$U = TS - FL \text{ for a spring} \;\Rightarrow\; A = U - TS = -FL; \quad F = kL \;\Rightarrow\; A = -kL^2,$$

so

$$\langle L^2\rangle = -\frac{A}{k} = \frac{k_B T}{k}\ln Z, \qquad\text{and therefore}\qquad L_{rms} = \sqrt{\frac{k_B T}{k}\ln Z}.$$

This remarkable result tells us how the root-mean-square thermal fluctuation of a diatomic molecule depends on temperature T, spring constant k, and the number of states populated, Z. For a macroscopic spring, $L_{rms}$ would be rather small compared to the length of the spring, but for a molecule-sized spring it can be quite large relative to the length of the spring. This is the reason we cannot build a Maxwell demon by making a little door between two chambers, held shut by a spring on one side so that particles can only go through one way: in reality, the thermal motion of the particles would make the spring vibrate so much that particles could get through either way (or, if k is too large, neither way). There's just no escaping the second law!
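As a rough order-of-magnitude illustration of this result, the sketch below evaluates $L_{rms} = \sqrt{k_B T\ln Z/k}$ at a few temperatures. The molecular parameters are approximate textbook values for I2 (vibrational wavenumber near 214.5 cm⁻¹, reduced mass near 63.45 amu), used only as an example of a molecule-sized spring, with the estimate being the simple one derived in this exercise.

```python
# Rough numerical illustration of Exercise 5 (approximate I2 parameters,
# used only as an example of a "molecule-sized spring").
import numpy as np

kB   = 1.380649e-23               # J/K
hbar = 1.054571817e-34            # J*s
c    = 2.99792458e10              # speed of light, cm/s

nu_tilde = 214.5                  # vibrational wavenumber, cm^-1 (approximate)
mu  = 63.45 * 1.66054e-27         # reduced mass, kg
omega = 2.0 * np.pi * c * nu_tilde    # angular frequency, rad/s
k_spring = mu * omega**2              # effective spring constant, N/m (~170 N/m)

def Z(T):
    """Harmonic-oscillator partition function with the zero point as energy reference."""
    return 1.0 / (1.0 - np.exp(-hbar * omega / (kB * T)))

def L_rms(T):
    """Root-mean-square thermal stretch from L_rms = sqrt(kB*T*ln(Z)/k)."""
    return np.sqrt(kB * T * np.log(Z(T)) / k_spring)

for T in (100.0, 300.0, 1000.0):
    print(f"T = {T:6.0f} K   Z = {Z(T):5.2f}   L_rms = {L_rms(T) * 1e12:5.2f} pm")
# At 300 K the thermal stretch is a few picometers, roughly a percent of the
# ~270 pm I2 bond length: negligible for a macroscopic spring, but not for a
# molecule-sized one.
```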