
STATISTISK MEKANIK FYSA21, Autumn (Hösten) 2009. Lars Gislén, Department of Theoretical Physics, University of Lund

STATISTICAL MECHANICS
Lars Gislén, Theoretical Physics, University of Lund

1. Introduction
Classical thermodynamics is based on a few principles derived from the first and second laws. The first law is essentially a statement of the conservation of energy: the change of the internal energy of a body is the result of mechanical work and/or a change of its "heat". The second law can, in one of its formulations, be expressed as: "heat" spontaneously flows from warm to cold. In classical thermodynamics you also introduce the concept of entropy in connection with the study of heat engines and the so-called Carnot process. This way of presenting thermodynamics is essentially historical-technical and was the result of the need to understand and improve the heat engines that were used in early mining technology, like Newcomen's steam engine. In classical thermodynamics we use macroscopic quantities like temperature, mass, pressure, volume and density.

The method we will use was developed by the Austrian physicist Ludwig Boltzmann at the end of the 19th century. It derives the thermodynamic laws and the macroscopic properties of a system starting from a microscopic description. The advantage of this approach is that we can, starting from a few very simple axioms, derive the entire classical thermodynamics. Another advantage is that the concept of entropy, which is rather abstract and hard to understand in classical thermodynamics, gets a very simple interpretation in statistical mechanics. We will avoid using the word "heat" in this text, as this word in everyday language is connected with several different concepts in thermodynamics: temperature, internal energy, transfer of energy by means of a temperature difference, and also entropy.

1.1 Fundamental definitions
[Figure: a thermodynamic system enclosed by isolating walls.]

If we wait long enough the system will reach thermodynamic equilibrium (TE). Macroscopic parameters then have well-defined and constant values. The system has no "memory" of past states.

Example: Diathermal means that energy (heat) can pass through the wall. After having waited long enough we have TE.

The 0th law of thermodynamics: If system A is in thermodynamic equilibrium with B, and C is in thermodynamic equilibrium with B, this implies that C is in thermodynamic equilibrium with A.
[Figure: A and B in TE, C and B in TE, hence A and C in TE.]

Example: This can be used to (preliminarily) define temperature by letting B above be a "thermometer". If the temperature of A is the same as the temperature of B, and the temperature of C is the same as the temperature of B, then A and C have the same temperature. We can now determine whether two bodies have the same temperature. What we need is a temperature scale. Imagine a number of reference systems B_i, i = 1, 2, ... We can check which of them is in TE with A. The number i is then a measure of the temperature of A.

Example: In a standard thermometer, i is the length of the liquid capillary of the thermometer. We can calibrate a thermometer by using special media: ice, boiling water. We can assign the length of the thermometer capillary the value 100 for boiling water and 0 for ice, and then divide the interval between them into 100 parts. This gives us the so-called Celsius scale, but historically we also have other calibrations: the Fahrenheit scale, the Réaumur scale and the Rankine scale. Calibrated thermometers of course agree at the calibration points but can deviate at other points. We would like to define temperature in a more

consistent way. One possibility is, as is done in classical thermodynamics, to define temperature as proportional to the pressure of an "ideal" gas with fixed volume. We will return to this later. We will show that in statistical mechanics we can define temperature theoretically, and that this definition agrees with the ideal gas temperature definition.

1.2 Different kinds of equilibrium
We illustrate this using three pictures. Thus we can have thermal equilibrium, mechanical equilibrium and chemical equilibrium. There are, in fact, several other possibilities; imagine that we have electrically charged particles and/or particles with magnetic moments and external electric and magnetic fields. We must then wait for electric and magnetic equilibrium and so on. We will for the moment ignore such effects on our system. To have thermodynamic equilibrium, we demand that we have (if possible) all these kinds of equilibrium at the same time.

1.3 Functions of state
At thermodynamic equilibrium it turns out that the properties of the system are simple; they only depend on a few macroscopic parameters; we have what is called a well-defined macroscopic state. We then have certain relations between some of the macroscopic parameters. Let T be temperature, p pressure and V volume. Example:
T = T(p, V)    p = p(T, V)    V = V(p, T)

Example: For an ideal gas at constant temperature we have experimentally pV = const, Boyle's law. If the temperature is not constant we have (for an ideal gas) pV = nRT, the general law of state. Observe that this implies that in classical thermodynamics we can introduce the ideal gas scale by
T = pV/(nR)
We can measure temperature by measuring the pressure of an ideal gas in a container with constant volume: the gas thermometer.

Example: We can model a non-ideal gas using the van der Waals equation:
(p + n²a/V²)(V − nb) = nRT
where a and b are suitable constants.

Consider an arbitrary (nice) function

G = G(x, y, ...), where G is such that its value is uniquely determined by the arguments x, y, ... We make a small change of G by changing the arguments. We can compute the change in G using a Taylor expansion:
dG = (∂G/∂x) dx + (∂G/∂y) dy = A(x, y) dx + B(x, y) dy
Notice that if G is nice enough, in practice always, we have
∂²G/∂y∂x = ∂²G/∂x∂y   that is   (∂A/∂y)_x = (∂B/∂x)_y
which is a very stringent condition on the functions A and B.

1.4 Internal energy E
Change the temperature of a system from T_1 to T_2. To do this we have to perform work, either thermal work (cooling or heating the system), or mechanical work (for instance by friction), or by adding chemical, electric or magnetic energy; we neglect the last ones. The sum of all these works changes the energy of the system, and this change only depends on the start and end states of the system. We can retain the principle of energy conservation if we assume that the added energy is stored as internal energy, E, in the system:
ΔE = Q + W = thermal + mechanical work
This is the first law of thermodynamics. The internal energy is a function of state of the system. This means that it is, as a function, uniquely determined by a handful of macroscopic parameters. Notice that what is often called "heat", that is, transfer of energy by means of a temperature difference, here is given a special name: thermal work.

Example: For a gas we have E = E(p, V, T, N). (Besides, for an ideal gas it turns out that the internal energy only depends on the temperature.)

BUT: W and Q, that is mechanical and thermal work, are not functions of state! Explain why.

We rewrite the first law in terms of infinitesimal changes and also put in the specific expression for the mechanical work in case we have a gas:
dE = đQ + đW = đQ − p dV
The bars over the d:s mark that they are not "proper" differentials, that is, they do not correspond to changes in functions of state. Can you explain the minus sign in the formula? The first law is often written:
đQ = dE + p dV

1.5 Heat capacities
We define the heat capacity C by C = đQ/dT, that is, the ratio of thermal work to temperature change. In most cases the heat capacity will be different if we measure it at constant pressure or at constant volume. We denote the heat capacity at constant volume or constant pressure by an index V or p respectively on C. It turns out that a physically more interesting quantity is the molar heat capacity, the heat capacity per mole. We denote molar heat capacities by a lower case c. Study the following table that shows molar heat capacities (in J/(mol K)) for a number of gases:

Gas     c_p    c_V    c_p − c_V   γ = c_p/c_V
He      20.9   12.6   8.3         1.66
Ar      20.9   12.5   8.4         1.67
Hg      20.9   12.5   8.4         1.67
O2      29.3   20.9   8.4         1.40
CO      29.3   21.0   8.3         1.40
Cl2     34.1   25.1   9.0         1.36
SO2     40.6   31.4   9.2         1.29
C2H6    51.9   43.1   8.8         1.20

You can note several interesting facts in the table. The difference between the heat capacities at constant pressure and at constant volume is more or less constant. Further, the ratios between these heat capacities in the last column are very close to rational numbers: 5/3 ≈ 1.67, 7/5 = 1.4, 9/7 ≈ 1.29. We will understand and explain these observations later on.
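If you want to check these two observations numerically, here is a minimal Python sketch; the numbers are just the table entries above retyped, and the tolerance 0.02 for "close to a rational number" is an arbitrary choice:

    # Molar heat capacities from the table above, in J/(mol K)
    gases = [
        ("He", 20.9, 12.6), ("Ar", 20.9, 12.5), ("Hg", 20.9, 12.5),
        ("O2", 29.3, 20.9), ("CO", 29.3, 21.0), ("Cl2", 34.1, 25.1),
        ("SO2", 40.6, 31.4), ("C2H6", 51.9, 43.1),
    ]
    rationals = {"5/3": 5 / 3, "7/5": 7 / 5, "9/7": 9 / 7}

    for gas, cp, cv in gases:
        gamma = cp / cv
        # tag gamma with a simple rational number if it lies within 0.02 of one
        close = next((r for r, v in rationals.items() if abs(v - gamma) < 0.02), "-")
        print(f"{gas:5s} c_p - c_V = {cp - cv:3.1f} J/(mol K)   gamma = {gamma:4.2f}  ({close})")

The differences cluster around 8.3-9 J/(mol K) and the monoatomic, diatomic and triatomic gases line up with 5/3, 7/5 and 9/7, just as claimed above.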

C p = de dt p + nr c p = de n dt p For an deal gas the nternal energy only depends on the temperature whch means de dt = de dt p V or c p = c V + R where R s the gas constant wth value 8.343 J/(mol K). Ths agrees very well wth the thrd column n the table above. Exercse: Explan why cp s larger than cv?.6 Adabatc process, deal gas An adabatc process s defned as a process n whch the thermal work s zero (no "heat" s added or subtracted from the system) and f we consder mol of gas we have de = pdv = c V dt For mol of an deal gas we also have pv = RT If we dfferentate ths equaton we get dpv + pdv = RdT = R c V pdv + R or or 0 = dpv + pdv + R c V = dpv + pdv c + R V = dpv + pdvγ c V 0 = dp p + γ dv V We ntegrate or ( ) const = ln p + γ lnv = ln pv γ pv γ = const.7 Some termnology Certan macroscopc parameters lke volume, mass, nternal energy have the property that f we make a new system by untng two systems, these parameters wll smply be the sum of the parameters of the orgnal systems. Such parameters are addtve and are called extensve. Other parameters lke pressure, temperature and densty behave dfferently. Such parameters are called ntensve. 6

Exercse problems. Chapter. The gas law can be wrtten n several dfferent ways: pv = Nk B T pv = nrt ρ = Mp RT n = N N A eher N s the number of partcles, n number of mols, M mol mass, ρ densty, and R the gas constant. N A s Avogadro s constant. Show that these formulatons are equvalent and express gas constant n more fundamental constants of nature.. You compress ar n a bcycle pump rapdly to /0 of the orgnal volyme. What knd of process s ths? What s the fnal temperature and pressure of the ar? γ =.4. If you nstead compress the ar very slowly, what s the fnal pressure? Explan! Practcal use? 3. Newcomen s steam engne worked lke ths: a) Steam of atmospherc pressure was let nto the cylnder of the engne. b) A small amount of cold water was njected n the cylnder causng the steam to condensate. c) Steam takes up a volume that s about 700 tmes the volume of lqud water. Ths means that a vacuum was essentally created n the cylnder. The pston was pressed down by the atmospherc pressure. Ths was the work phase of the engne. d) The cycle was repeated from a). Problem: Compute the theoretcal effcency for ths process. Hnt: How much energy was needed to transform water of 00 C to steam of 00 C? How large s the work done by the atmospherc pressure? 7

2. About probabilities

2.1 Introduction
Before we enter statistical mechanics, we will review some concepts from probability theory. For instance, throwing a die is called an experiment. The result is called an event. We can enumerate the events with an index i. To each event we attach a (real) number P_i ∈ [0, 1] that we call the probability of the event. We can plot the possible events as points in an abstract space.

Example: Throwing a die.
[Figure: the six events 1, 2, 3, 4, 5, 6 as points.]

Example: Throwing two dice.
[Figure: the 36 events as a 6 × 6 grid of points.]

We can also study a complex event, where we choose a group of points in the diagram above and say that the complex event occurs if any event occurs that belongs to the group. An example, in the case of throwing two dice, would be that the sum of the two dice is 5. We choose the probabilities such that Σ_i P_i = 1 (normalization).

2.2 Classical probability
Choose P_i such that P_i = 1/Ω, where Ω is the total number of possible events.
Example: Heads or tails. Ω = 2, P_heads = P_tails = 1/2.
Example: Throwing one die. Ω = 6, P_i = 1/6, i = 1..6.

Example: Throwing two dice. Ω = 36, P_i = 1/36, i = 1..36.
In the real world it is not always true that the different probabilities are equal; a die can be loaded. But if we do not have much information about a system, the assumption of equal probabilities is reasonable if we want to explore the system.

2.3 Statistical probability
Make N experiments. If an event i occurs n_i times, we define the statistical probability as the limit
P_i = lim_{N→∞} n_i/N
In practice we can of course not make an infinite number of experiments but have to be satisfied with "many". Another problem is that we cannot be sure that the limit exists. Example: the value of a share on the stock market.

2.4 Probability postulates
1) P_i ∈ [0, 1],  Σ_i P_i = 1
2) P(i or j or k or ...) = P_i + P_j + P_k + ...  if i, j, k, ... are mutually exclusive events.
3) P(i and j and k and ...) = P_i · P_j · P_k ···  if i, j, k, ... are independent events.
Mutually exclusive means that if one of the events occurs, none of the others can occur. If you throw a die you can only get one of the events 1, 2, 3, 4, 5 or 6. Independent events means that they cannot influence each other. If you throw two dice, the result of one die does not influence the result of the other die.

Example: What is the probability of getting at least one six in three throws with a die? The throws are independent events. The probability of not getting any six is (5/6)·(5/6)·(5/6) (Postulate 3). The probability of not getting this result, that is, of getting at least one six, is then 1 − (5/6)³ ≈ 0.42 (Postulate 2).
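A quick sanity check of the example above, as a minimal Monte Carlo sketch in Python (the seed and the number of trials are arbitrary choices):

    import random

    random.seed(1)
    trials = 200_000

    # Postulate 3: the three throws are independent, so
    #   P(no six in three throws) = (5/6)**3
    # Postulate 2 (complementary events):
    #   P(at least one six)       = 1 - (5/6)**3
    exact = 1 - (5 / 6) ** 3

    hits = sum(
        any(random.randint(1, 6) == 6 for _ in range(3)) for _ in range(trials)
    )
    print(f"exact     : {exact:.4f}")     # about 0.42
    print(f"simulated : {hits / trials:.4f}")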

2.5 Permutations
A permutation of N different objects = the number of ways you can order N different objects in a row. Some reflection tells us that the number is
N·(N − 1)·(N − 2)·(N − 3)···1 = N!
If n of the objects are identical we get N!/n! permutations. We have to compensate for the n! permutations that are alike.

If we have several kinds of identical objects, with n_1 of one kind, n_2 of another kind and so on, the number of permutations is
N!/(n_1! n_2! ···) = N!/Π_i n_i!   where N = Σ_i n_i
Note that 0! = 1 by definition.

2.6 Distributions
Example: Count the number of pulses in a Geiger counter during a certain time, say 10 seconds. Repeat many times. Denote the number of measured pulses in experiment i by x_i. Plot the result in a diagram that may look like this:
[Figure: histogram of the frequency n_i versus the number of pulses x_i.]
n_i is the frequency, that is, the number of times we measured x_i pulses. We now define the average (mean) number of pulses, or the expectation value of the number of pulses, as
⟨x⟩ = (1/N) Σ_i n_i x_i = Σ_i (n_i/N) x_i = Σ_i P_i x_i
where the last equality follows if the number of measurements, N, is large. Note that the expectation value in general is NOT the same as the most probable value, the number of pulses of the highest bar in the diagram.

We will often have a continuous distribution, where ρ(x) dx is the probability of finding x in the interval [x, x + dx]. For this case it is natural to define
⟨x⟩ = ∫ x ρ(x) dx
ρ(x) is called a distribution or probability density. Evidently we have a normalization condition ∫ ρ(x) dx = 1, corresponding to Σ_i P_i = 1 in the discrete case.

Example: Quantum mechanics, where ρ(x) = |Ψ(x)|², with Ψ(x) the wave function.

An important statistical quantity is the standard deviation or spread, σ, defined by the variance
σ² = Σ_i P_i (x_i − ⟨x⟩)²
Note! x can represent any physically interesting variable like position, speed, energy...

Exercise problems. Chapter 2.

1. In how many ways can you permute 4 girls and 5 boys?
2. What is the probability of getting either 7 or 6 by throwing two dice?
3. Show that we can write σ²(x) = ⟨x²⟩ − ⟨x⟩². This is very useful, as it reduces the number of steps in the computation of the standard deviation.
4. A neutron that moves in a piece of uranium-235 can hit a uranium nucleus and start a chain reaction. Assume that the probability for a neutron to hit a nucleus when it moves a distance dx is p dx.
a) What is the probability that the neutron does not hit a nucleus when it moves a distance dx?
b) What is the probability that the neutron moves N steps dx without hitting a nucleus and then hits a nucleus in the next step?
c) Assume that N steps correspond to a total distance x. Use the relation lim_{N→∞} (1 + z/N)^N = e^z to rewrite the expression you got in b) in a simpler way.
d) Compute the average distance a neutron travels before it hits a nucleus.

3. Statistical mechanics
We will now construct statistical mechanics, which derives classical thermodynamics from a few simple postulates. We assume that we study isolated systems with a large number, N, of identical, weakly interacting particles in a volume V. The particles have a total energy E. The assumption of weak interactions means that the total energy is the sum of one-particle energies.

3.1 Macrostates and microstates
A given macrostate can be realised by an enormous number of microstates; even worse, if we consider one mole of gas it changes microstate more than 10²³ times each second! We can look at the air in this lecture hall: it looks the same in spite of an enormous number of collisions at each moment between the molecules, collisions that change their velocities and thereby the microstate. In principle we could describe the microstates if we knew the three-dimensional positions and velocities of every molecule. This is of course impossible in practice, and we will see that we can manage quite well by using statistical methods to describe the microstates. We denote the number of accessible microstates by Ω. This number is of course very large.

Postulate: Every microstate is equally probable at thermodynamic equilibrium.
How can we motivate this postulate? Simply by saying that it is the simplest assumption. We don't know anything about these probabilities, thus we assume they are the same. If we assumed they were different we would immediately have to face a more complicated problem: what, then, are they? Besides, it turns out that the thermodynamic laws that we get from this postulate agree very well with experiment.

Our statistical method has the following steps:
1) Solve the one-particle problem. This means solving a quantum mechanical problem that gives us the energy levels ε_i, i = 1, 2, 3, ... and the states of the particle. We assume that we have solved this problem.
2) Which distributions {n_i} of particles can we have in the energy levels, given the constraints N = Σ_i n_i and E = Σ_i n_i ε_i?
3) How many microstates t({n_i}) are there in each distribution?
4) Determine the average distribution.

3.2 Entropy, S
We define the entropy of a system by
S = k_B ln Ω
where k_B = 1.38·10⁻²³ J/K, Boltzmann's constant, determines the scale and dimension of the entropy. As you can see, the entropy is simply a measure of the number of accessible microstates of the system. We have already used the word accessible above. Not all microstates are in general accessible to the system. For instance, we demand that a gas should be confined in a certain volume, which means that microstates where a molecule is outside the volume are forbidden. Further, we want the system to have a certain internal energy, which puts a constraint on the possible energies of the particles; the sum of their energies must have a fixed value.

Why not let the entropy be just Ω? There are several evident advantages in using a logarithm.
1) Ω is an ENORMOUS number! By using the logarithm of large numbers we get numbers that are more manageable.
2) Consider two systems with Ω_1 and Ω_2 microstates respectively. The combined system evidently has Ω = Ω_1·Ω_2 microstates. But with our definition the entropy of the combined system is

S = k_B ln Ω = k_B ln(Ω_1 Ω_2) = k_B ln Ω_1 + k_B ln Ω_2 = S_1 + S_2
This is a very nice property; entropy is an additive or extensive quantity. As we will soon see, the entropy has several other nice and useful properties.

Example: Place N particles in a volume V. Divide V into a number of small compartments, each with the fixed volume ΔV. Place the particles randomly in the compartments. There are V/ΔV compartments, thus the number of possible ways of placing the particles = the number of accessible microstates is
Ω = (V/ΔV)^N
The entropy is
S = k_B ln Ω = k_B ln (V/ΔV)^N = N k_B ln(V/ΔV) = N k_B (ln V − ln ΔV)
It looks like we have a problem here. The value of the entropy depends on the volume of the compartments. For now we can avoid this problem by saying that we are normally only interested in changes of the entropy, and in such cases the offending term disappears. Later on we will see that using quantum mechanics we actually get a definite size of the compartments. But this was a problem for Boltzmann, who lived at a time when quantum mechanics hadn't been invented.

Example: Detailed computation for a "Mickey Mouse system" with 4 particles. We use our statistical method outlined above. Suppose that we have equidistant energy levels (again quantum mechanics!) 0, ε, 2ε, 3ε, 4ε, ... Suppose that the total energy of the system is 4ε. We also suppose that the particles are distinguishable, that is, they can be thought of as having labels A, B, C, ... such that we can tell them apart. We can then place the particles in the levels in five different ways (distributions) that all give the same total energy:
[Figure: the five distributions of the 4 particles over the levels 0, ε, 2ε, 3ε, 4ε.]
We now count the number of permutations for each distribution. Particles within one level are not permuted. We get the following result:
4!/3! = 4    4!/(2!·1!·1!) = 12    4!/(1!·2!·1!) = 12    4!/(2!·2!) = 6    4!/4! = 1

(In general a distribution has N!/Π_i n_i! permutations.) In total we have Ω = 4 + 12 + 12 + 6 + 1 = 35 microstates. In this case the entropy is S = k_B ln 35 ≈ 3.56 k_B. The probabilities of the respective distributions are 4/35, 12/35, 12/35, 6/35, 1/35. Observe that certain, rather few, distributions dominate the scene. The average number of particles in level 0 is
⟨n_0⟩ = (3·4 + 2·12 + 1·12 + 2·6 + 0·1)/35 ≈ 1.71
For the other levels we easily calculate the corresponding averages:
⟨n_1⟩ ≈ 1.14    ⟨n_2⟩ ≈ 0.69    ⟨n_3⟩ ≈ 0.34    ⟨n_4⟩ ≈ 0.11
It is interesting to plot the result.
[Figure: ⟨n⟩ versus level number 0–4.]
The curve has some similarity with an exponentially decreasing function. We will return to this fact and see that our supposition is true.

We divide the system into two subsystems A and B, each with 2 particles and with energies E_A = 4ε and E_B = 0. You easily show that Ω_A = 5 and Ω_B = 1, that is, the total number of microstates is Ω = Ω_A·Ω_B = 5. If we bring the systems together and allow them to reach thermodynamic equilibrium we will have the situation we studied before: as we approach thermodynamic equilibrium the entropy increases (Ω goes from 5 to 35).
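The counting in the Mickey Mouse example can be verified by brute force. A minimal Python sketch that enumerates every microstate of 4 distinguishable particles on the levels 0, ε, 2ε, 3ε, 4ε with total energy 4ε:

    from itertools import product
    from collections import Counter

    levels = range(5)          # energies 0, 1, 2, 3, 4 in units of epsilon
    n_particles = 4
    target_energy = 4

    # every assignment of a level to each (distinguishable) particle is one microstate
    microstates = [
        state for state in product(levels, repeat=n_particles)
        if sum(state) == target_energy
    ]
    print("Omega =", len(microstates))                     # 35

    # average occupation <n_i> of each level over all microstates
    counts = Counter(level for state in microstates for level in state)
    for level in levels:
        print(f"<n_{level}> = {counts[level] / len(microstates):.2f}")
    # expected: 1.71, 1.14, 0.69, 0.34, 0.11 as above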

3.3 The second law. Use of the entropy
Postulate (the second law): At thermodynamic equilibrium the entropy of an isolated system takes its maximum value (given the constraints on the system, like internal energy, volume, number of particles and so on).
The postulate means that at thermodynamic equilibrium (TE) the system exploits all accessible microstates with the same probability.

This implies that at TE, S has a fixed value determined by the parameters E, V, N. This implies that S is a function of state. Finally, this implies that for small changes in the parameters we have
dS = (∂S/∂E) dE + (∂S/∂V) dV + (∂S/∂N) dN

3.3.1 Definition of temperature
Consider a system divided into two subsystems, 1 and 2, separated by a fixed, diathermal wall.
[Figure: two subsystems 1 and 2 separated by a fixed, diathermal wall.]
Only the internal energies E_1 and E_2, and of course the entropy, can vary; all other parameters are fixed. At TE the entropy has a maximum, which means that dS = 0 when we vary the internal energy:
0 = dS = (∂S_1/∂E_1) dE_1 + (∂S_2/∂E_2) dE_2
We use that the energy is conserved, dE_2 = −dE_1, and get
0 = (∂S_1/∂E_1 − ∂S_2/∂E_2) dE_1   ⟹   ∂S_1/∂E_1 = ∂S_2/∂E_2
In this case we evidently have thermal equilibrium and the temperature must be the same in the two subsystems. The partial derivative has the dimension of inverse temperature. This leads us to define temperature by
∂S/∂E = 1/T
We have seen earlier that we can (at least in principle, just count the number of accessible microstates) compute the entropy of a system. Given the entropy S we can then compute the temperature T. It turns out that the temperature that we get in this way is identical with the one in classical thermodynamics that you get from the ideal gas thermometer.

Now assume that the two subsystems are NOT in thermodynamic equilibrium, but that T_1 > T_2. When we let the two systems exchange energy, the entropy will increase towards its maximum, in other words dS > 0. Then we have
dS = (∂S_1/∂E_1) dE_1 + (∂S_2/∂E_2) dE_2 = (∂S_1/∂E_1 − ∂S_2/∂E_2) dE_1 = (1/T_1 − 1/T_2) dE_1 > 0
The factor in front of dE_1 is less than zero, thus dE_1 < 0, which we interpret as system 1 losing energy and system 2 gaining energy: energy flows

spontaneously from a warmer system to a colder one. This is one of the alternative formulations of the second law and agrees with physical common sense.

3.3.2 Definition of pressure
We start again, but now with a movable, diathermal wall between the subsystems. Here E_1, E_2, V_1 and V_2 can vary. At TE the entropy has a maximum:
0 = dS = (∂S_1/∂V_1) dV_1 + (∂S_2/∂V_2) dV_2 + (∂S_1/∂E_1) dE_1 + (∂S_2/∂E_2) dE_2
We know that for thermal equilibrium the sum of the last two terms is zero, thus the sum of the first two terms must be zero. The total volume is constant, so dV_2 = −dV_1, which implies
(∂S_1/∂V_1 − ∂S_2/∂V_2) dV_1 = 0   ⟹   ∂S_1/∂V_1 = ∂S_2/∂V_2
Now we also have mechanical equilibrium, and again by dimensional reasoning it seems to be a good idea to define pressure, p, by
∂S/∂V = p/T
because if we use that we have thermal equilibrium, and thus the same temperature in the two subsystems, we get p_1 = p_2. This is intuitively correct: in this kind of equilibrium both temperature and pressure are equal in the subsystems. In the same way as before we can now show that if we have thermal equilibrium but not mechanical equilibrium, the subsystem with the higher pressure will expand at the expense of the volume of the other subsystem.

*3.3.3 Chemical potential µ
Finally we study a permeable wall that allows particles to pass, and that is diathermal and movable.

As a concrete example you can think of having a gas in subsystem 1 and a liquid in subsystem 2, the wall being the interface between liquid and gas. At TE the entropy is maximal:
0 = dS = (∂S_1/∂N_1) dN_1 + (∂S_2/∂N_2) dN_2 + (∂S_1/∂V_1) dV_1 + (∂S_2/∂V_2) dV_2 + (∂S_1/∂E_1) dE_1 + (∂S_2/∂E_2) dE_2
As we have thermal and mechanical equilibrium, the sum of the last four terms is zero. In the same way as before, using that the total number of particles is conserved, we have
∂S_1/∂N_1 = ∂S_2/∂N_2
We then have chemical equilibrium and define the chemical potential, µ, by
∂S/∂N = −µ/T
The sign is chosen such that we later get consistent results. The chemical potential has the dimension of energy. Our procedure above is actually very general: the partial derivative of the entropy with respect to an extensive variable gives us an intensive parameter divided by temperature.

In summary, we give an alternative formulation of the second law: In an isolated system the change of the entropy is always larger than or equal to zero. (Either the system is at TE and the entropy has attained its maximum value, or it is on its way to equilibrium and the entropy is increasing.)

3.3.4 Rewards
Earlier we had
dS = (∂S/∂E) dE + (∂S/∂V) dV + (∂S/∂N) dN = (1/T) dE + (p/T) dV − (µ/T) dN
We rewrite this as
dE = T dS − p dV + µ dN
This looks familiar! We rediscover the first law; actually not very exciting, as we have used that the internal energy is conserved. The interesting thing is that the first term looks different. The first term evidently corresponds to thermal work, the second is mechanical work, the third is chemical work: if we add particles to the system they carry some kind of chemical energy µ into the system. (Just now we are not interested in such processes, but this term will be important when we study chemical processes or equilibrium problems for a liquid-gas interface.) Identifying the first term with thermal work we have đQ = T dS, or dS = đQ/T. This is the original, historical definition of entropy. If we integrate between two (macro)states A and B we have

ΔS_AB = ∫_A^B đQ/T
Further, đQ = C_m dT gives
ΔS_AB = ∫_{T_1}^{T_2} C_m dT/T
We have found a simple macroscopic way of computing entropy changes when we heat a body!

Example: What is ΔS when 1 kg of ice melts? The temperature is constant, 273 K. Thus ΔS = Q_melt/273 = 334/273 kJ/K ≈ 1.2 kJ/K. This shows how to compute the entropy change of a phase change. Note that the entropy increases: the molecules in the water can access many more microstates than they have in ice.

A further reward! We return to our volume that we divided into small compartments with fixed volume:
S = N k_B (ln V − ln ΔV)
We can now compute the pressure:
p/T = ∂S/∂V = N k_B/V   ⟹   pV = N k_B T
We have in a very simple way derived the gas law for an ideal gas! This also shows that our temperature definition and the ideal gas thermometer temperature are equivalent. Note that we have only used our two simple postulates and our definitions of pressure and temperature. Statistical mechanics is an extremely powerful tool!
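This "further reward" can be checked symbolically. A minimal sketch, assuming SymPy is available, starting from the compartment entropy above and the definition p/T = ∂S/∂V:

    import sympy as sp

    N, kB, T, V, dV = sp.symbols("N k_B T V DeltaV", positive=True)

    S = N * kB * (sp.log(V) - sp.log(dV))   # entropy of N particles in V/DeltaV compartments
    p = T * sp.diff(S, V)                   # definition of pressure: p/T = dS/dV

    print(sp.simplify(p * V))               # N*T*k_B, i.e. p V = N k_B T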

3.4 We dig deeper and find the Boltzmann factor
We will now make a more advanced calculation on a system that is more realistic than our earlier Mickey Mouse system. We assume that we have N particles where N is really large, of order 10²³. We assume that we have a number of energy levels ε_i, i = 1, 2, ..., not necessarily equidistant. Assume that in one of the possible distributions we have n_i, i = 1, 2, ... particles in the respective levels. Here we exploit an important fact. At TE the entropy is maximal. It then turns out (see below) that if the number of particles is large, only ONE distribution will dominate in probability over all the others. We saw this tendency already in the Mickey Mouse system. The number of microstates in this distribution will, if the number of particles is large, be almost the same as the total number of microstates. This means that we only have to study one distribution, and in this distribution arrange the particles such that the entropy is maximised, given the constraints that the total energy and the number of particles are constant. The number of microstates in this distribution is
Ω* = N!/Π_i n_i!
and the entropy
S = k_B ln( N!/Π_i n_i! )
When n is large we can use Stirling's approximation: ln n! ≈ n ln n − n. This approximation is very good even for rather modest n.

Example: Study a large number (N) of tosses of a penny. For each toss we can have either of two events, thus we have 2^N "microstates" in total, each with the same probability. In a distribution with n heads and m = N − n tails we have
t = N!/( n! (N − n)! )
microstates. This expression evidently has a maximum when n = N/2:
t_max = N!/( (N/2)! (N/2)! )
If we use Stirling's approximation we have
ln t_max = N ln N − N − 2[ (N/2) ln(N/2) − N/2 ] = N ln N − N ln(N/2) = N ln 2 = ln 2^N
When N is large, the number of microstates in the most probable distribution approaches the total number of microstates. The table below shows how fast this happens:

N        ln t_max   ln 2^N    Relative error (%)
2        0.69       1.39      50.00
4        1.79       2.77      35.38
6        3.00       4.16      27.97
8        4.25       5.55      23.38
10       5.53       6.93      20.23
20       12.13      13.86     12.51
30       18.86      20.79     9.30
40       25.65      27.73     7.49
50       32.47      34.66     6.31
100      66.78      69.31     3.65
200      135.75     138.63    2.07
500      343.24     346.57    0.96
1000     689.47     693.15    0.53
10000    6926.64    6931.47   0.07

We return to our main problem. For the entropy we then have
S = k_B ( ln N! − Σ_i ln n_i! ) = k_B ( N ln N − N − Σ_i (n_i ln n_i − n_i) )
We now want to maximise S by varying the different n_i:s. This is a bit problematic, as the n_i:s are not independent. We have two constraints on the n_i: N = Σ_i n_i and E = Σ_i n_i ε_i; the number of particles and the internal energy are given and constant. Mathematically we can handle such a situation by inserting the constraints via Lagrange multipliers and instead maximising the function
f = S/k_B + α( N − Σ_i n_i ) + β( E − Σ_i n_i ε_i )
Through this trick we can treat the n_i:s as if they were independent and get

0 = ∂f/∂n_i = −ln n_i − α − βε_i
or
n_i = e^{−α} e^{−βε_i}
We determine the first factor, which contains α, from the condition Σ_i n_i = N:
N = Σ_i e^{−α} e^{−βε_i} = e^{−α} Σ_i e^{−βε_i}   ⟹   e^{−α} = N/Z
where we have defined
Z = Σ_i e^{−βε_i}
the partition function, which will soon prove very useful. The occupation number of each level is then given by
n_i = (N/Z) e^{−βε_i}
e^{−βε_i} is the so-called Boltzmann factor. We now see what we guessed in the case of the Mickey Mouse system: the number of particles in the levels decreases exponentially as the energy increases.

3.5 What is β?
We plug our result into the entropy:
S = k_B ( N ln N − N − Σ_i (n_i ln n_i − n_i) )
  = k_B N ln N − k_B Σ_i n_i ln n_i
  = k_B N ln N − k_B Σ_i (N/Z) e^{−βε_i} ( ln N − ln Z − βε_i )
  = k_B N ln N − k_B (N/Z)( Z ln N − Z ln Z − β Σ_i ε_i e^{−βε_i} )
  = k_B N ln Z + k_B β Σ_i ε_i n_i
  = k_B N ln Z + k_B β E
To sum up we have
S = k_B N ln Z + k_B β E
Finally, use the temperature definition ∂S/∂E = 1/T = k_B β, or
β = 1/(k_B T)
We collect our results, expressed in more familiar quantities: the positions of the energy levels, the temperature and the internal energy, all of them measurable or computable quantities:
n_i = (N/Z) e^{−βε_i}   where   Z = Σ_i e^{−βε_i}   and   β = 1/(k_B T)
S = k_B N ln Z + E/T
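A minimal numerical illustration of these results (the particle number, the level spacing and the truncation of the spectrum are arbitrary choices, not values from the text): for equidistant levels the occupation numbers fall off exponentially, each level down by the same Boltzmann factor e^{−βε}.

    import math

    N = 1000            # number of particles (arbitrary)
    beta_eps = 1.0      # beta * epsilon, i.e. level spacing in units of k_B T (arbitrary)
    n_levels = 6        # truncate the spectrum for the illustration

    # partition function and occupation numbers n_i = (N/Z) exp(-beta * eps_i)
    boltzmann = [math.exp(-beta_eps * i) for i in range(n_levels)]
    Z = sum(boltzmann)
    occupations = [N * b / Z for b in boltzmann]

    for i, n_i in enumerate(occupations):
        print(f"level {i}: n_i = {n_i:7.1f}")
    # the ratio between neighbouring levels is always exp(-beta*eps), here about 0.37
    print("ratio n_(i+1)/n_i =", round(math.exp(-beta_eps), 2))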

Remember for the future that the probability to find a particle (or a system) in level i is
n_i/N = e^{−βε_i}/Z
the Boltzmann probability.

3.6 Energy reservoir and subsystem
Consider a subsystem in contact and in thermal equilibrium with an energy reservoir with temperature T. We assume that the subsystem is small and that the energy reservoir is large. The energy reservoir and the subsystem are isolated from the environment and have the total (and constant) energy E. We want the probability p_i that the subsystem is in the state i with energy E_i. The energy reservoir then has the energy E − E_i. The entropy of the energy reservoir is a function of its energy, S(E − E_i), and the number of microstates of the energy reservoir is e^{S(E − E_i)/k_B}. For the subsystem (in state i) together with the energy reservoir we then have in total e^{S(E − E_i)/k_B} microstates. The probability that the subsystem is in a state with energy E_i is proportional to the number of microstates of the combined system, thus we have
p_i = A e^{S(E − E_i)/k_B}
As the energy reservoir is large we have E_i << E and we can Taylor expand:
S(E − E_i) ≈ S(E) − (∂S/∂E) E_i = S(E) − E_i/T
which implies
p_i = A e^{S(E)/k_B} e^{−E_i/(k_B T)} = C e^{−E_i/(k_B T)}
As the sum of the probabilities of the subsystem has to be 1 we have
1 = Σ_k p_k = C Σ_k e^{−E_k/(k_B T)} = C·Z
giving us p_i = e^{−βE_i}/Z, a result that should look familiar.

3.7 The useful Z, the partition function
We have
E = Σ_i ε_i n_i = (N/Z) Σ_i ε_i e^{−βε_i} = −(N/Z) ∂/∂β Σ_i e^{−βε_i} = −(N/Z) ∂Z/∂β = −N ∂(ln Z)/∂β
This is important. Once we know the energy levels of a system, a problem we solve in quantum mechanics, we know the partition function. We can then compute the internal energy in a simple way, without having to use the perhaps unfamiliar entropy. (If we want the entropy, it can simply be computed from S = k_B N ln Z + E/T.) Once we know the internal energy we can compute other measurable quantities like heat capacities.
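To see how this is used in practice, here is a small numerical sketch (the energy levels and the temperature are made-up values, in units where k_B = 1): it compares the direct thermal average Σ_i ε_i n_i with −N ∂(ln Z)/∂β computed by a finite difference.

    import math

    levels = [0.0, 1.0, 1.5, 3.0]   # made-up one-particle energies (k_B = 1 units)
    N = 1.0                         # per particle
    T = 2.0
    beta = 1.0 / T

    def lnZ(b):
        return math.log(sum(math.exp(-b * e) for e in levels))

    # direct thermal average of the energy
    Z = math.exp(lnZ(beta))
    E_direct = N * sum(e * math.exp(-beta * e) for e in levels) / Z

    # E = -N d(ln Z)/d(beta), via a central finite difference
    h = 1e-6
    E_from_Z = -N * (lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

    print(f"E from direct average : {E_direct:.6f}")
    print(f"E from -N dlnZ/dbeta  : {E_from_Z:.6f}")

The two numbers agree to the accuracy of the finite difference, as they must.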

In many systems the energy levels are degenerate, that is, there are several different states having the same energy. In the partition function we must then count the term that corresponds to a degenerate level as many times as the degeneracy (multiplicity). If we assume that the energy levels ε_i, i = 1, 2, 3, ... have multiplicities g_i, we get the partition function
Z = Σ_i g_i e^{−βε_i}
More about this later.

3.8 Energy fluctuations
We now study a large number (N) of identical systems in contact with a heat bath with temperature T. As an example you can think of the atoms or molecules in a gas. Such a set of systems is called the canonical ensemble. Each system can be in one of its energy levels E_r. The probability for this is, according to what we have seen above,
p_r = e^{−βE_r} / Σ_r e^{−βE_r} = e^{−βE_r}/Z,    Z = Σ_r e^{−βE_r}
The average energy of a system then is
⟨E⟩ = Σ_r p_r E_r = (1/Z) Σ_r E_r e^{−βE_r}
The average of the square of the energy is
⟨E²⟩ = Σ_r p_r E_r² = (1/Z) Σ_r E_r² e^{−βE_r}
We are interested in the fluctuation of the energy, that is, the variance of the energy:
ΔE² = ⟨E²⟩ − ⟨E⟩² = (1/Z) Σ_r E_r² e^{−βE_r} − [ (1/Z) Σ_r E_r e^{−βE_r} ]²
Noting that ⟨E⟩ = −(1/Z) ∂Z/∂β and ⟨E²⟩ = (1/Z) ∂²Z/∂β², we have
ΔE² = (1/Z) ∂²Z/∂β² − [ (1/Z) ∂Z/∂β ]² = ∂/∂β [ (1/Z) ∂Z/∂β ] = ∂²(ln Z)/∂β² = −∂⟨E⟩/∂β
(see also the end of the chapter). Now β = 1/(k_B T) gives T = 1/(k_B β) and dT/dβ = −1/(k_B β²) = −k_B T², so
ΔE² = −(∂⟨E⟩/∂T)(dT/dβ) = k_B T² ∂⟨E⟩/∂T
This gives the total variance
ΔE²_total = N ΔE² = N k_B T² ∂⟨E⟩/∂T = k_B T² ∂E_total/∂T

Further, ∂E_total/∂T = n_mol c_V = (N/N_A) c_V, which implies
ΔE²_total = N k_B T² c_V / N_A   ⟹   ΔE_total ∝ √N
The relative fluctuation ΔE_total/E_total ∝ 1/√N can obviously be neglected if N is of order 10²³.

3.9 What is entropy, intuitively?
I think that you sometimes have heard entropy described as a measure of disorder. This is only half of the truth, though. If we add that entropy also is freedom we get a rather good intuitive description of the entropy concept. When the entropy is maximised it means that the system tries to gain access to all accessible microstates. This is the freedom. Each microstate is then occupied with the same probability. This is the disorder. Also be careful with the condition in the second law: the entropy increases (or is maximal) in a closed (isolated) system, but it can decrease locally. Living creatures are an example of regions with very low local entropy, and when we arrange bricks in very ordered patterns to build a house we create a very low entropy locally. This is possible because we have an easily accessible source of low entropy nearby, the Sun. If we include the Sun in our system we will have a good approximation of a closed system, and the total entropy in this larger system is increasing. Living creatures with low entropy do not violate the laws of physics or need some supernatural interaction! Finally, in our theory here we have focused on systems in thermodynamic equilibrium. Living creatures are very far from being in thermodynamic equilibrium, which is precisely one of the properties that make them living. In modern advanced thermodynamics you study systems that are not in thermodynamic equilibrium.

3.10 The Boltzmann factor, Mount Everest, and the use of fridges
Consider a flat earth with an atmosphere above.
[Figure: a column of atmosphere above flat ground; x is the height above the ground.]
Assume that the atmosphere is isothermal, that is, the temperature is the same everywhere. The probability of finding an air molecule at height x then is
P(x) ∝ e^{−βε(x)}
where ε(x) is the energy of the molecule. This energy is the sum of the kinetic and the potential energy. The kinetic energy is on average the same everywhere, as the temperature is the same, independent of the height. The potential energy is Mgx, where M is the mass of the molecule. This gives

P(x) ∝ e^{−βε_kin} e^{−βε_pot(x)} ∝ e^{−βMgx} = e^{−Mgx/(k_B T)}
Now, the density of the air is evidently proportional to the probability of finding a molecule, and the pressure in turn is proportional to the density. This implies
p(x) = p(0) e^{−Mgx/(k_B T)}
where we have normalised the pressure with p(0), the ground pressure. This is the well-known barometric formula, used by for instance aviators. Putting in numerical values we get
p(x) = p(0) e^{−x/8000 m}
where 8000 m is the so-called scale height.

Another application of the Boltzmann factor is connected with why we have fridges and freezers. Assume that we have some kind of foodstuff that has a probability P_room of becoming stale during, let us say, a day at room temperature. To become stale means some kind of chemical change, in most cases caused by bacteria. Chemical changes typically involve energy changes of order E ≈ 1 eV. The probability of a change at room temperature is then proportional to the Boltzmann probability with this energy in the exponent:
P_room = C e^{−E/(k_B T_room)}
where C is some constant. The probability in a fridge with temperature T_fridge then is
P_fridge = C e^{−E/(k_B T_fridge)}
and in a freezer
P_freezer = C e^{−E/(k_B T_freezer)}
We then have
P_fridge/P_room = e^{−(E/k_B)(1/T_fridge − 1/T_room)}
If we use numerical values, say T_room = 295 K and T_fridge = 280 K, we get P_fridge/P_room ≈ 0.1, which means that the foodstuff will remain fresh about 10 times longer than at room temperature. If we use T_freezer = 255 K, freezer temperature, we instead get P_freezer/P_room ≈ 0.0014, which means that the foodstuff will remain fresh about 700 times longer than at room temperature, i.e. for months! The very rapid change is the result of the Boltzmann factor being exponential.
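Both applications are easy to check numerically. A minimal sketch; the molar mass of air, the height of Mount Everest and the choice T = 273 K are standard values not given in the text, so treat them as assumptions:

    import math

    k_B = 1.380649e-23        # J/K
    u = 1.660539e-27          # kg (atomic mass unit)
    M = 29 * u                # average mass of an "air molecule", about 29 u (assumption)
    g, T = 9.81, 273.0        # m/s^2, K (isothermal atmosphere)

    H = k_B * T / (M * g)     # scale height k_B T / (M g)
    print(f"scale height = {H:.0f} m")                   # about 8000 m

    x = 8850.0                # m, roughly the height of Mount Everest (assumption)
    print(f"p(Everest)/p(0) = {math.exp(-x / H):.2f}")   # about a third of the ground pressure

    # fridge example: chemical change with E ~ 1 eV, T_room = 295 K, T_fridge = 280 K
    E = 1.602e-19             # J (1 eV)
    ratio = math.exp(-(E / k_B) * (1 / 280.0 - 1 / 295.0))
    print(f"P_fridge/P_room = {ratio:.2f}")              # about 0.1, as quoted above

So on the top of Mount Everest the Boltzmann factor alone already takes the pressure down to roughly a third of its sea-level value.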

*3.10.2 The equipartition theorem. Derivation
Suppose we have a system described by its (generalised) coordinates q_i, i = 1, ..., N and (generalised) momenta p_i, i = 1, ..., N. The energy of the system is a function of these variables:
E = E(q_i, p_i)
The probability that the coordinates of the system are in the interval [q_i, q_i + dq_i] and the momenta in [p_i, p_i + dp_i] then is
C dq_1 dq_2 ··· dq_N dp_1 dp_2 ··· dp_N e^{−βE(q_i,p_i)}
We have the normalisation condition
1 = C ∫ dq_1 dq_2 ··· dq_N dp_1 dp_2 ··· dp_N e^{−βE(q_i,p_i)}
that determines the constant
C = 1 / ∫ dq_1 dq_2 ··· dq_N dp_1 dp_2 ··· dp_N e^{−βE(q_i,p_i)}
The average value (the expectation value) of the energy then is
⟨E⟩ = ∫ dq_1 ··· dq_N dp_1 ··· dp_N E(q_i,p_i) e^{−βE(q_i,p_i)} / ∫ dq_1 ··· dq_N dp_1 ··· dp_N e^{−βE(q_i,p_i)}
This looks quite nasty, but we will use the same trick as we used for the internal energy and the partition function. We can rewrite the monster integral as something more palatable:
⟨E⟩ = −∂/∂β ln ∫ dq_1 ··· dq_N dp_1 ··· dp_N e^{−βE(q_i,p_i)}
We now assume that the energy is a quadratic function of the coordinates and the momenta. This is very often true:
E(q_i,p_i) = a_1 q_1² + a_2 q_2² + ··· + b_1 p_1² + b_2 p_2² + ···
which means
e^{−βE} = e^{−βa_1 q_1²} e^{−βa_2 q_2²} ··· e^{−βb_1 p_1²} e^{−βb_2 p_2²} ···
We can now rewrite the integral more simply as
⟨E⟩ = −∂/∂β ln [ ∫ dq_1 e^{−βa_1 q_1²} · ∫ dq_2 e^{−βa_2 q_2²} ··· ∫ dp_1 e^{−βb_1 p_1²} ··· ]
or
⟨E⟩ = −∂/∂β [ ln ∫ dq_1 e^{−βa_1 q_1²} + ln ∫ dq_2 e^{−βa_2 q_2²} + ··· + ln ∫ dp_1 e^{−βb_1 p_1²} + ··· ]
All the terms in the sum have the same structure and we only consider one of them, say the first one:
∫ dq_1 e^{−βa_1 q_1²} = (βa_1)^{−1/2} ∫ dt e^{−t²}   (with t = q_1 √(βa_1))   = (βa_1)^{−1/2} D_1
The integration is over all allowable values of this coordinate, and the remaining integral is just some number that we call D_1. All the other terms in the sum give similar contributions. Thus we have
⟨E⟩ = −∂/∂β [ ln( D_1 a_1^{−1/2} β^{−1/2} ) + ln( D_2 a_2^{−1/2} β^{−1/2} ) + ··· ]
    = −∂/∂β [ ( ln D_1 − ½ ln a_1 − ½ ln β ) + ( ln D_2 − ½ ln a_2 − ½ ln β ) + ··· ]
We get the extremely simple result:
⟨E⟩ = 1/(2β) + 1/(2β) + 1/(2β) + ··· = k_B T/2 + k_B T/2 + k_B T/2 + ···
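A quick numerical check of a single quadratic term (a minimal sketch; the constant a and the temperature are arbitrary choices, in units where k_B = 1): the thermal average of a q² over the Boltzmann weight should come out as 1/(2β) = k_B T/2.

    import math

    a, kB, T = 2.5, 1.0, 1.7          # arbitrary constants (k_B = 1 units)
    beta = 1.0 / (kB * T)

    # <a q^2> = ( ∫ a q^2 e^{-beta a q^2} dq ) / ( ∫ e^{-beta a q^2} dq ), midpoint rule
    L, n = 50.0, 200_000              # integration range [-L, L] and number of points
    dq = 2 * L / n
    num = den = 0.0
    for k in range(n):
        q = -L + (k + 0.5) * dq
        w = math.exp(-beta * a * q * q)
        num += a * q * q * w * dq
        den += w * dq

    print(f"<a q^2>  = {num / den:.4f}")
    print(f"k_B T/2  = {kB * T / 2:.4f}")

The value of a drops out, exactly as the derivation above says it must.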

Each quadratic term in the expression for the total energy of a particle system implies a contribution of k_B T/2 to the internal energy. This is the equipartition theorem.

3.10.3 The equipartition theorem. Applications
Consider an ideal, monoatomic gas of N particles. Monoatomic means that the gas particles can only have translational motion; ideal means that there is no interaction between the particles. This is a fairly good model of a noble gas at normal pressure and temperature. The total energy of a particle then is
½Mv_x² + ½Mv_y² + ½Mv_z²
We have 3 quadratic terms for each atom. Thus the internal energy is
E = N·3·k_B T/2 = (3/2) N k_B T = (3/2) nRT
The molar heat capacity at constant volume then is
c_V = (1/n) dE/dT = (3/2) R ≈ 12.5 J/(mol K)
We have explained the heat capacities of the noble gases that we studied in the table in section 1.5!

Now consider a solid. We can model its atoms as a system where the atoms are connected by springs with spring constants k in a three-dimensional grid:
[Figure: atoms in a cubic lattice connected by springs.]
The energy of an atom is
½Mv_x² + ½Mv_y² + ½Mv_z² + ½kx² + ½ky² + ½kz²
We have 6 quadratic terms per atom and the internal energy is
E = N·6·k_B T/2 = 3N k_B T = 3nRT
and the molar heat capacity is
c_V = (1/n) dE/dT = 3R ≈ 25 J/(mol K)

This is called Dulong-Petit's law and it works very well for most solids, see the diagram below. There are two evident exceptions: graphite, which has a heat capacity that is precisely 1/3 of what it should have, and diamond, which also has a value that is too low. We will be able to explain these exceptions later on.

[Figure: measured C/R for a number of solids (Al, Sb, Be, Pb, Au, Cd, Ca, Co, Cr, Hg, Mg, Mo, Na, Ni, Pt, Ag, Ta, Sn, U, Bi, Zn, ...). Most values lie close to C/R = 3; graphite and diamond lie clearly below.]

Encouraged by these results we try to apply our theory to diatomic gases like oxygen, nitrogen and hydrogen. A diatomic molecule has more possible ways of moving than just translation in three dimensions. It can vibrate along the line connecting the atoms, and it can rotate around two axes perpendicular to this line. We can easily write down the total energy for such a molecule:
½Mv_x² + ½Mv_y² + ½Mv_z² + ½µv² + ½kx² + ½I_1ω_1² + ½I_2ω_2²
The first three terms correspond to translational energy, the two following correspond to the vibrational energy, and the two last ones correspond to the rotational energies. In total we have 7 quadratic terms, which gives an internal energy
E = N·7·k_B T/2 = (7/2) N k_B T = (7/2) nRT
which implies a molar heat capacity at constant volume
c_V = (1/n) dE/dT = (7/2) R ≈ 29 J/(mol K)
Unfortunately this does not agree at all with the values you get from experiment. These give a value close to (5/2) R. The physics of the 19th century could not explain this evident failure of classical thermodynamics. As we will see, we will need quantum mechanics to solve the problem.
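To summarise the successes and the failure, here is a small sketch comparing the equipartition predictions with measured molar heat capacities; the measured gas values are taken from the table in section 1.5, and "about 25" for a solid is a typical Dulong-Petit value:

    R = 8.314  # J/(mol K)

    predictions = {
        "monoatomic gas (3 translational terms)":         1.5 * R,
        "diatomic gas (3 transl + 2 vibr + 2 rot terms)":  3.5 * R,
        "solid (3 kinetic + 3 potential terms)":           3.0 * R,
    }
    measured = {
        "monoatomic gas (3 translational terms)":         12.6,  # He, table in 1.5
        "diatomic gas (3 transl + 2 vibr + 2 rot terms)":  20.9,  # O2, table in 1.5
        "solid (3 kinetic + 3 potential terms)":           25.0,  # typical Dulong-Petit value
    }

    for system, c_pred in predictions.items():
        print(f"{system:48s} predicted {c_pred:5.1f}, measured ~{measured[system]:5.1f} J/(mol K)")
    # The diatomic prediction 7R/2 ~ 29 clearly disagrees with the measured ~5R/2 ~ 21:
    # this is the failure of classical physics discussed above.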

Also later, when the electron was discovered, there were problems. A very good model of a metal is that you have a gas of free electrons that can move in the lattice of the solid. For the solid itself we have, as before, c_V,lattice = 3R. For the electron gas we expect the result for a monoatomic gas, c_V,el = (3/2)R. The total heat capacity would then be c_V = (9/2)R; a metal should have a heat capacity that is 50% larger than that of a non-metal. But experimentally you find that the molar heat capacities of metals and non-metals are essentially the same. Why?

Finally we will point out a problem that is also connected with heat capacities and entropy. Earlier we saw that we had
ΔS_AB = ∫_{T_A}^{T_B} đQ/T = ∫_{T_A}^{T_B} C_m dT/T = C_m ln(T_B/T_A)
(the last step if C_m is constant). We can see that we have a problem when T → 0: the entropy change becomes singular! One way of solving this would be if the heat capacity C goes to zero suitably fast when the temperature goes to zero. Why would this happen? It turns out that this, too, can be explained by quantum mechanics.

We want to make this very clear: simple and uncontroversial experimental measurements of heat capacities show that classical (non-quantum mechanical) thermodynamics is WRONG! This was a great problem at the beginning of the 20th century. We will see that we need to use quantum mechanics to get results that agree with experiment. Besides, quantum mechanics will turn out to describe the microcosmos in a new and exciting way.

Exercise problems. Chapter 3.

1. We return to the Mickey Mouse system. Compute Ω for the total internal energies E = 0, ε, 2ε, 3ε, 4ε, 5ε, 6ε for the 4 particles. Then compute s = S/k_B (renormalised entropy) and plot s(E). Sketch a curve through the points and estimate, relatively, the temperature for different internal energies. Qualitatively sketch the relation E(T). Hint: There are respectively 1, 1, 2, 3, 5, 6 and 9 different distributions.
2. Compute ΔS when 1 kg of water at 100 °C is transformed from liquid to steam. Then compute the change in entropy when you heat 1 kg of water from 0 °C to 100 °C. Comments?
3. 1 kg of water at 0 °C is put in contact with a large heat source that can be assumed to have a constant temperature of 100 °C. What is the entropy change of the system water + heat source when the water has reached its final temperature?

4. We want to determine the extremum of the function f(x, y) = x + y given the constraint x² + y² = 1. Do this in two ways:
a) By eliminating, for instance, y from the first function using the constraint. The result is a function of only one variable that can easily be handled.
b) By adding the constraint using a Lagrange multiplier and then putting the derivatives to zero.
Show that both methods give the same result.