Statistical Mechanics


Will Penny

Wellcome Trust Centre for Neuroimaging, University College London, WC1N 3BG, UK.

December 30, 2015

1 Introduction

These notes are taken from Leonard Susskind's first five lectures on Statistical Mechanics from Stanford University, available on YouTube.

2 Entropy

Given a probability distribution over the discrete variable x, which may take on values i = 1..K, the entropy is

S = -\sum_{i=1}^{K} p_i \log p_i    (1)

where p_i \equiv p(x = i). If only M of the K states have non-zero probability and that probability is 1/M, then S = \log M. In this special case the entropy is the log of the number of occupied states.

An analog of this special case in the continuous domain occurs in Liouville's Theorem. This states that if the volume of phase space (i.e. of the x variable, but now multivariate and continuous) is some fixed value, e.g. \log M, then after the system evolves the distribution over x will change but the volume will be preserved. The density within the volume is assumed uniform.

Entropy can depend both on the state of a system itself and your knowledge of it (depending, presumably, on what distribution one is considering).
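
As a quick numerical illustration of equation 1 (added here, not part of the original lectures), the short Python sketch below computes the entropy of a discrete distribution and confirms that a distribution that is uniform over M of its K states has S = log M. The function name `entropy` and the example numbers are arbitrary choices.

```python
import numpy as np

def entropy(p):
    """Entropy S = -sum_i p_i log p_i, ignoring zero-probability states."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

K, M = 10, 4
p_uniform = np.zeros(K)
p_uniform[:M] = 1.0 / M                      # uniform over M of the K states

print(entropy(p_uniform), np.log(M))         # both equal log M ~= 1.386
print(entropy([0.7, 0.2, 0.1]))              # a non-uniform example has lower entropy than log 3
```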

3 Temperature

The most general definition of temperature, T, is the amount of energy required to increase the entropy by one unit of information. That is

T = \frac{dE}{dS}    (2)

where E is energy. To show that this is a sensible definition, consider two volumes of material A and B that are connected to each other and so may exchange energy. Before this exchange occurs, assume that each system is at steady state with temperatures T_A and T_B, and that T_B > T_A. From the first law of thermodynamics (L1) we know that the total energy is conserved

dE_A + dE_B = 0    (3)

From the second law of thermodynamics (L2) we know that entropy increases

dS_A + dS_B \geq 0    (4)

Substituting our definition of temperature into equation 3 gives

dS_B = -\frac{T_A}{T_B} dS_A    (5)

Plugging this into equation 4, and assuming temperatures are positive, gives

(T_B - T_A) dS_A \geq 0    (6)

Because we have defined B to be at higher temperature than A, we have

dS_A \geq 0    (7)

Multiplying by T_A gives

dE_A \geq 0    (8)

Thus the energy of system A increases. From L1, the energy of system B must therefore decrease. Energy flows from the higher temperature system to the lower until the temperatures are equal.
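
The argument above can be made concrete with a toy numerical model (my own illustration, not from the lectures). Two ideal-gas-like systems with entropy S(E) = C log E, so that T = dE/dS = E/C, exchange small parcels of energy; a transfer is kept only if it does not decrease the total entropy, and the temperatures converge. All constants below are arbitrary.

```python
import numpy as np

# Toy systems with entropy S(E) = C * log(E), hence temperature T = E / C.
C_A, C_B = 3.0, 5.0
E_A, E_B = 1.0, 10.0            # B starts much hotter than A

dE = 0.01                       # size of each energy parcel
for _ in range(5000):
    # propose moving a parcel from B to A; keep it only if total entropy does not decrease
    dS = C_A * (np.log(E_A + dE) - np.log(E_A)) + C_B * (np.log(E_B - dE) - np.log(E_B))
    if dS >= 0:
        E_A += dE
        E_B -= dE

print(E_A / C_A, E_B / C_B)     # both close to (E_A + E_B)/(C_A + C_B): temperatures have equalised
```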

4 Equilibrium Distribution

Here we consider our system of interest as being in contact with a heat bath which is so large that we can consider our system as being just one of N replicas of our system, where N is a large number. We denote the number of replicas in state k as n_k. We have

\sum_{k=1}^{K} n_k = N    (9)

For example, if n_1 = N all replicas are in state 1. Given that N is large we have p_k = n_k / N. What is the distribution of states at thermal equilibrium, i.e. what is p_k? Well, the p_k are defined by the corresponding set of occupation numbers n_k. The more ways there are of supporting {n_1, ..., n_k, ..., n_K}, the more probable is that set. This number of combinations is given by

C = \frac{N!}{n_1! n_2! \dots n_K!}    (10)

We now approximate

\log N! = \sum_{a=1}^{N} \log a    (11)

by the continuous integral

\log N! \approx \int_1^N \log a \, da    (12)
        \approx N \log N - N

N! \approx N^N e^{-N}

which is Stirling's approximation. Plugging this into equation 10 gives

\log C = N \log N - \sum_i n_i \log n_i    (13)
       = -N \sum_{i=1}^{K} p_i \log p_i
       = N S

Thus the most probable distribution of states is the one with the highest entropy. So we can now finesse our question to be: what is the distribution of states with maximal entropy? Given that the constraints on this distribution are

\sum_{k=1}^{K} p_k = 1    (14)

\sum_{k=1}^{K} p_k e_k = E

where E is the energy, the solution can be found by minimising the Lagrangian

L(p) = \sum_k p_k \log p_k + \alpha \left( \sum_k p_k - 1 \right) + \beta \left( \sum_k p_k e_k - E \right)    (15)

where \alpha and \beta are Lagrange multipliers. Differentiating with respect to p_k and setting to zero gives

\log p_k = -(1 + \alpha) - \beta e_k    (16)

p_k = \frac{1}{Z} \exp(-\beta e_k)    (17)

where Z = e^{1+\alpha}. This is the Boltzmann distribution.
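
The maximum-entropy property of equation 17 is easy to check numerically. The sketch below (an added illustration with arbitrary energy levels and inverse temperature) builds the Boltzmann distribution for three levels, then perturbs it in a direction that preserves both normalisation and mean energy, and shows that the entropy is largest at the unperturbed Boltzmann point.

```python
import numpy as np

e = np.array([0.0, 1.0, 2.0])        # three energy levels (illustrative values)
beta = 1.2                            # inverse temperature (illustrative value)

Z = np.sum(np.exp(-beta * e))         # normalisation constant Z
p = np.exp(-beta * e) / Z             # Boltzmann distribution, equation 17

def entropy(q):
    return -np.sum(q * np.log(q))

# v sums to zero and satisfies v . e = 0, so p + s*v keeps normalisation and mean energy fixed
v = np.array([1.0, -2.0, 1.0])
for s in [0.0, 0.02, 0.05, 0.1]:
    q = p + s * v
    print(s, q @ e, entropy(q))       # mean energy unchanged; entropy is largest at s = 0
```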

5 The Partition Function

The quantity Z is known as the partition function, and the fundamental quantities entropy, energy and temperature can all be related to its logarithm, \log Z.

5.1 Energy

Using the definition of energy we have

E = \sum_k e_k p_k    (18)
  = \frac{1}{Z} \sum_k e_k \exp(-\beta e_k)

and given that Z is the normalisation constant, we have

Z = \sum_k \exp(-\beta e_k)    (19)

\frac{dZ}{d\beta} = -\sum_k e_k \exp(-\beta e_k)

E = -\frac{1}{Z} \frac{dZ}{d\beta} = -\frac{d \log Z}{d\beta}    (20)

5.2 Entropy

Using the definition of entropy we have

S = -\sum_k p_k \log p_k    (21)

and from the Boltzmann distribution we know that

p_k = \frac{1}{Z} \exp(-\beta e_k)    (22)

\log p_k = -\beta e_k - \log Z

so that

S = \beta \sum_k p_k e_k + \log Z \sum_k p_k    (23)
  = \beta E + \log Z

5.3 Temperature

From the above relation between S and \log Z we can write

dS = \beta \, dE + E \, d\beta + \frac{d \log Z}{d\beta} d\beta    (24)
   = \beta \, dE + E \, d\beta - E \, d\beta
   = \beta \, dE

\beta = \frac{dS}{dE}    (25)

Therefore \beta = 1/T is the inverse temperature.
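
Equations 20 and 23 can be sanity-checked numerically. The sketch below (an added illustration with made-up energy levels) compares the ensemble average \sum_k e_k p_k with a finite-difference derivative of \log Z, and checks that \beta E + \log Z matches the direct entropy sum.

```python
import numpy as np

e = np.array([0.0, 0.5, 1.3, 2.0])      # arbitrary energy levels for illustration
beta = 0.8                               # arbitrary inverse temperature

def log_Z(b):
    return np.log(np.sum(np.exp(-b * e)))

p = np.exp(-beta * e - log_Z(beta))      # Boltzmann distribution, equation 17

E_direct = np.sum(e * p)                 # E = sum_k e_k p_k, equation 18
h = 1e-6
E_from_Z = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)   # E = -d log Z / d beta, equation 20

S_direct = -np.sum(p * np.log(p))        # S = -sum_k p_k log p_k, equation 21
S_from_Z = beta * E_direct + log_Z(beta) # S = beta E + log Z, equation 23

print(E_direct, E_from_Z)                # agree to finite-difference accuracy
print(S_direct, S_from_Z)                # agree to machine precision
```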

6 Helmholtz Free Energy

We can rewrite equation 23, by multiplying through by temperature, T = 1/\beta, and re-arranging, as

E - TS = -T \log Z    (26)

The Helmholtz Free Energy, A, is then defined as the above quantity

A = E - TS    (27)
  = -T \log Z

So A is a function of temperature.

7 Adiabatic Change

Independent variables that one may vary experimentally are referred to as control parameters. For example, one may alter the volume, V, and measure the effect on the dependent variable energy. We can define the pressure on, e.g., a piston connected to a volume of material as

P = -\frac{\partial E}{\partial V}    (28)

Increasing the volume results in less pressure on the piston. An adiabatic change is defined as one which does not change the entropy. Thus if the volume changes adiabatically we have

P = -\left( \frac{\partial E}{\partial V} \right)_S    (29)

From basic calculus (see the Appendix) we can write

P = -\left( \frac{\partial E}{\partial V} \right)_T + \left( \frac{\partial E}{\partial S} \right)_V \left( \frac{\partial S}{\partial V} \right)_T    (30)
  = -\left( \frac{\partial E}{\partial V} \right)_T + T \left( \frac{\partial S}{\partial V} \right)_T
  = -\left( \frac{\partial (E - TS)}{\partial V} \right)_T
  = -\left( \frac{\partial A}{\partial V} \right)_T

where the second line follows from the definition of temperature, and the last line follows from the definition of the Helmholtz Free Energy. We therefore also have

P = T \frac{\partial \log Z}{\partial V}

Pressure is sometimes referred to as the thermodynamic conjugate of volume (cf. dependent and independent variables).

7.0.1 Example: Ideal gas

For an ideal gas

Z = \frac{V^N f(\beta)}{N!}    (31)

\log Z = N \log V + \log f(\beta) + \dots

\frac{\partial \log Z}{\partial V} = \frac{N}{V}

so P = NT/V, the ideal gas law, or, if you measure temperature in Kelvin, P = N k_B T / V.
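
As a check on equation 31 (another added numerical illustration, with arbitrary values of N, V and T in energy units), the pressure obtained from a finite-difference derivative of \log Z with respect to volume reproduces NT/V.

```python
import numpy as np

N, V, T = 100.0, 2.0, 1.5          # arbitrary particle number, volume, temperature (energy units)

def log_Z_volume_part(vol):
    # only the V-dependent part of log Z = N log V + log f(beta) - log N! matters for pressure
    return N * np.log(vol)

h = 1e-6
dlogZ_dV = (log_Z_volume_part(V + h) - log_Z_volume_part(V - h)) / (2 * h)
P = T * dlogZ_dV                   # P = T d(log Z)/dV, from section 7

print(P, N * T / V)                # both ~75: the ideal gas law P = NT/V
```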

Appendix: Calculus for Adiabatic Change

The change in entropy given by a change in volume and temperature is

dS = \left( \frac{\partial S}{\partial V} \right)_T dV + \left( \frac{\partial S}{\partial T} \right)_V dT    (32)

Therefore, if there is no change in entropy, dS = 0, we must have

\left( \frac{\partial S}{\partial T} \right)_V dT = - \left( \frac{\partial S}{\partial V} \right)_T dV    (33)

This is our adiabatic constraint. Similarly, the change in energy given by a change in volume and temperature is

dE = \left( \frac{\partial E}{\partial V} \right)_T dV + \left( \frac{\partial E}{\partial T} \right)_V dT    (34)

Therefore, the change in energy per change in volume is

\frac{dE}{dV} = \left( \frac{\partial E}{\partial V} \right)_T + \left( \frac{\partial E}{\partial T} \right)_V \frac{dT}{dV}    (35)
             = \left( \frac{\partial E}{\partial V} \right)_T + \left( \frac{\partial E}{\partial S} \right)_V \left( \frac{\partial S}{\partial T} \right)_V \frac{dT}{dV}

Finally, plugging equation 33 (our adiabatic constraint) into this last equation gives

\left( \frac{\partial E}{\partial V} \right)_S = \left( \frac{\partial E}{\partial V} \right)_T - \left( \frac{\partial E}{\partial S} \right)_V \left( \frac{\partial S}{\partial V} \right)_T    (36)
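
Equation 36 can also be cross-checked numerically. The sketch below (an added illustration) uses the textbook monatomic ideal gas expressions E = (3/2)NT and S = N log V + (3/2) N log T with k_B = 1; these specific forms are not derived in the notes and are assumed here only for the check. Finite differences along the adiabatic constraint reproduce the right-hand side of equation 36, and both equal minus the pressure NT/V.

```python
import numpy as np

N = 50.0    # arbitrary particle number

def E(V, T):
    return 1.5 * N * T                               # monatomic ideal gas energy (k_B = 1)

def S(V, T):
    return N * np.log(V) + 1.5 * N * np.log(T)       # entropy up to an additive constant

V0, T0, h = 2.0, 1.3, 1e-6

dEdV_T = (E(V0 + h, T0) - E(V0 - h, T0)) / (2 * h)   # (dE/dV) at constant T
dEdT_V = (E(V0, T0 + h) - E(V0, T0 - h)) / (2 * h)   # (dE/dT) at constant V
dSdV_T = (S(V0 + h, T0) - S(V0 - h, T0)) / (2 * h)   # (dS/dV) at constant T
dSdT_V = (S(V0, T0 + h) - S(V0, T0 - h)) / (2 * h)   # (dS/dT) at constant V

dTdV_S = -dSdV_T / dSdT_V                            # adiabatic constraint, equation 33
dEdV_S = dEdV_T + dEdT_V * dTdV_S                    # equation 35

rhs = dEdV_T - T0 * dSdV_T                           # equation 36 with dE/dS = T
print(dEdV_S, rhs, -N * T0 / V0)                     # all three equal minus the pressure
```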