Statistical Physics: The Second Law and Time's Arrow

Most macroscopic processes in everyday life are irreversible. Glass breaks but does not reform. Coffee cools to room temperature but does not spontaneously heat up.
Probability

Thermodynamic processes happen in one direction rather than the other because of probabilities: systems move toward the most probable macrostate.

Two-State Systems: Counting States
Coin Flips

All 8 possible microstates of a set of three coins (Coin 1, Coin 2, Coin 3; each coin shows either heads or tails).

Coin Flips: Macrostates

3 heads (1 microstate)
2 heads (3 microstates)
1 head (3 microstates)
0 heads (1 microstate)

The number of microstates corresponding to a given macrostate is called the multiplicity, Ω.
Coin Flips

Assuming the coins are fair, all 8 microstates are equally probable. Thus the probability of any given macrostate is

$$P(\text{macrostate}) = \frac{\Omega}{2^3} = \frac{\Omega}{8}$$

100 Coins

The number of microstates for 100 coins is $2^{100}$. The number of macrostates is 101: 0 heads, 1 head, 2 heads, ..., 100 heads. Multiplicities: Ω(0) = 1, Ω(1) = 100.
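The three-coin counting above is small enough to check by brute-force enumeration; a minimal Python sketch:

```python
from itertools import product
from collections import Counter

# Enumerate all 2**3 microstates of three fair coins ('H' or 'T' per coin).
microstates = list(product("HT", repeat=3))
assert len(microstates) == 8

# Group microstates into macrostates by number of heads; the count for each
# macrostate is its multiplicity Omega: {3: 1, 2: 3, 1: 3, 0: 1}.
multiplicity = Counter(state.count("H") for state in microstates)

# Each microstate is equally probable, so P(macrostate) = Omega / 2**N.
prob_two_heads = multiplicity[2] / len(microstates)  # 3/8
```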
100 Coins

To find Ω(2): there are 100 choices for the 1st coin and 99 choices for the 2nd coin, but any pair can be chosen in either order, so divide by two:

$$\Omega(2) = \frac{100 \cdot 99}{2} = 4950$$

To find Ω(3): 100 choices for the 1st coin, 99 for the 2nd, 98 for the 3rd; but any triplet can be chosen in $3 \cdot 2 = 6$ different orders:

$$\Omega(3) = \frac{100 \cdot 99 \cdot 98}{3 \cdot 2} = 161700$$

Combinatorics

You can probably see the pattern: for N coins the multiplicity of the macrostate with n heads is

$$\Omega(N, n) = \frac{N!}{n!\,(N - n)!} = \binom{N}{n}$$
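The binomial pattern above is easy to verify with Python's standard-library `math.comb`:

```python
from math import comb

def multiplicity(N, n):
    """Multiplicity of the n-heads macrostate for N coins: C(N, n)."""
    return comb(N, n)

assert multiplicity(100, 0) == 1
assert multiplicity(100, 1) == 100
assert multiplicity(100, 2) == 100 * 99 // 2        # 4950
assert multiplicity(100, 3) == 100 * 99 * 98 // 6   # 161700

# The multiplicities of all 101 macrostates account for every one of the
# 2**100 microstates.
assert sum(multiplicity(100, n) for n in range(101)) == 2**100
```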
Combinatorics

In general, the number of ways of arranging N indistinguishable particles among n states (at most one particle per state) is

$$\Omega = \binom{n}{N} = \frac{n!}{N!\,(n - N)!}$$

The number of ways of arranging N distinguishable particles among n states is

$$\Omega = \frac{n!}{(n - N)!}$$

Statistics to Physics: Microstates, Macrostates and Entropy
The Boltzmann Postulate

An isolated system in equilibrium is equally likely to be in any of its accessible microstates. "Accessible" means physically allowed and reachable via some process.

Gas in a Box

A box contains N = 4 identical particles. Let us describe the macrostate in terms of the number of particles on the right side of the box, $N_R$. The microstates for each macrostate can be counted using the formulas or by inspection.
Gas in a Box

$$\Omega(N_R) = \binom{N}{N_R} = \frac{N!}{N_R!\,(N - N_R)!}$$

Ω(0) = 1, Ω(1) = 4, Ω(2) = 6, Ω(3) = 4, Ω(4) = 1, for a total of $2^4 = 16$ microstates.

Since each microstate has the same probability (according to the Boltzmann postulate), the probability of a given macrostate is proportional to its multiplicity.
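The same counting can be tabulated in a few lines of Python; the macrostate probabilities follow directly from the multiplicities:

```python
from math import comb

N = 4  # particles in the box

# Multiplicity of the macrostate with N_R particles on the right side.
omegas = {n_r: comb(N, n_r) for n_r in range(N + 1)}
total = sum(omegas.values())  # 2**4 = 16 microstates in all

# Equal a priori probability of microstates makes P(macrostate)
# proportional to its multiplicity.
probs = {n_r: omega / total for n_r, omega in omegas.items()}
```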
The 2nd Law, Again

The equilibrium state is the most probable macrostate. The most probable macrostate is the one with the most microstates, i.e. the largest multiplicity. Boltzmann realized that the entropy is a maximum at equilibrium, so there must be a connection between multiplicity and entropy. Boltzmann defined entropy as

$$S \equiv k_B \ln \Omega$$

With this definition, the result of our analysis of multiplicity ("any large system in equilibrium will be found in the macrostate with the greatest multiplicity", i.e. multiplicity tends to increase) becomes the Second Law of Thermodynamics: entropy tends to increase.
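Boltzmann's definition can be applied directly to the coin-flip multiplicities: the even 50/50 macrostate of 100 coins has the largest multiplicity and hence the largest entropy, while a zero-multiplicity spread gives S = 0. A short sketch:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega):
    """Boltzmann entropy S = k_B * ln(Omega)."""
    return k_B * log(omega)

# For 100 coins the 50-heads macrostate has the largest multiplicity,
# hence the largest entropy; it is the equilibrium macrostate.
S_even = entropy(comb(100, 50))
S_skewed = entropy(comb(100, 60))
S_all_heads = entropy(comb(100, 100))  # Omega = 1, so S = 0

assert S_even > S_skewed > S_all_heads
assert S_all_heads == 0.0
```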
Temperature and Entropy: Objects in Thermal Contact

Consider two systems A and B that can exchange thermal energy (heat) back and forth. Suppose that initially heat is flowing slowly from A to B. As the two systems exchange energy they also change their entropy.

[Figure: Solid A transferring heat to Solid B]
Entropy Change

The total entropy change while an infinitesimal amount of heat $dE$ flows is

$$dS_{total} = dS_A + dS_B$$

The energy exchange causes the entropy change, so we can write

$$dS_{total} = \left(\frac{\partial S}{\partial E}\right)_A dE_A + \left(\frac{\partial S}{\partial E}\right)_B dE_B$$

But the systems exchange energy, so $dE_A = -dE_B$:

$$dS_{total} = \left[\left(\frac{\partial S}{\partial E}\right)_A - \left(\frac{\partial S}{\partial E}\right)_B\right] dE_A$$

Temperature and Entropy

Now, if the two systems are far from equilibrium, the entropy change of the whole system must be positive as it moves toward maximum entropy. If the system is at equilibrium, the entropy change must be zero. Thus the two systems are in equilibrium when

$$\left(\frac{\partial S}{\partial E}\right)_A = \left(\frac{\partial S}{\partial E}\right)_B$$
Temperature and Entropy

But temperature is precisely the quantity we define as being the same when two objects are in equilibrium. Therefore we can define

$$\frac{\partial S}{\partial E} \equiv \frac{1}{T}$$

where taking the inverse ensures that heat flows from the higher-temperature object to the lower-temperature object.

More formally, we define the temperature for a system with a fixed number of particles and a fixed volume:

$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{N,V}$$

The letter U is used for the energy to emphasize that it is the internal energy of the system.
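The definition $1/T = \partial S/\partial U$ can be made concrete with a toy model. The model below (an Einstein solid of N oscillators sharing q energy quanta, with Ω = C(q + N − 1, q)) is an outside illustration, not something derived in the text, and the quantum size `eps` is an assumed value:

```python
from math import comb, log

k_B = 1.380649e-23   # J/K
eps = 1.0e-21        # assumed size of one energy quantum, J

def S(N, q):
    """Entropy of an Einstein solid: S = k_B * ln C(q + N - 1, q)."""
    return k_B * log(comb(q + N - 1, q))

def temperature(N, q):
    """T = dU/dS, estimated by adding one quantum (dU = eps)."""
    dS = S(N, q + 1) - S(N, q)
    return eps / dS

N = 50
# Adding energy raises the temperature, as the definition requires.
assert temperature(N, 200) > temperature(N, 100) > 0
```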
Statistical Physics: Partition Function and the Boltzmann Factor

Isolated vs. Constant-Temperature Systems

Our previous work has been with isolated systems, but what about systems at constant temperature? What is the probability of finding a system in a particular energy state at a given temperature? Consider a system A that exchanges energy with a temperature reservoir B.
One Big System and One Small System

If we can cast the problem in terms of an isolated system, we can use the machinery we have already developed. We can treat the combined system as an isolated system; if system A is much smaller than system B, the temperature of B is unaffected by the exchange of energy with A. Let's choose system A to be a single atom.

Probability Ratios

Let's look at the ratio of probabilities for two of the atom's microstates, $s_1$ and $s_2$, with energies $E(s_1)$ and $E(s_2)$ and probabilities $P(s_1)$ and $P(s_2)$.
Probability Ratios

For an isolated system each accessible state is equally likely, but the atom alone is not an isolated system. The atom plus reservoir is an isolated system, and the reservoir has a much larger number of microstates than the atom, so

$$\frac{P(s_2)}{P(s_1)} = \frac{\Omega_R(s_2)}{\Omega_R(s_1)}$$

where $\Omega_R(s)$ is the multiplicity of the reservoir when the atom is in state s. Using the relationship $S = k_B \ln \Omega$, i.e. $\Omega = e^{S/k_B}$, we can write

$$\frac{P(s_2)}{P(s_1)} = e^{[S_R(s_2) - S_R(s_1)]/k_B}$$
Entropy and Energy

We previously proved

$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{N,V}$$

so for $dN = 0$ and $dV = 0$,

$$dS = \frac{dU}{T}$$

The difference in the reservoir entropies corresponding to the two atomic states is therefore

$$S_R(s_2) - S_R(s_1) = \frac{E_R(s_2) - E_R(s_1)}{T}$$

and the difference in energy of the reservoir is just minus the difference between the two atomic state energies:

$$E_R(s_2) - E_R(s_1) = -[E(s_2) - E(s_1)]$$
Probabilities

Returning to our ratio of probabilities:

$$\frac{P(s_2)}{P(s_1)} = e^{-[E(s_2) - E(s_1)]/k_B T} = \frac{e^{-E(s_2)/k_B T}}{e^{-E(s_1)/k_B T}}$$

The Boltzmann Factor

This means that the probability of finding a system in energy state s at temperature T is proportional to the Boltzmann factor:

$$P(s) \propto e^{-E(s)/k_B T}$$

This gives only a proportionality; to get the actual probability we have to normalize.
The Partition Function

The normalized probability is

$$P(s) = \frac{e^{-E(s)/k_B T}}{Z}$$

where Z is called the partition function and is given by

$$Z = \sum_s e^{-E(s)/k_B T}$$

The Most Important Equation

The most important equation in classical statistical mechanics is

$$P(s) = \frac{e^{-E(s)/k_B T}}{Z}$$

Note: the sum in Z is over all states of the system, not all energies.
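The normalization can be sketched directly in code. The three-level system below is a hypothetical example with arbitrary energy units, not a system from the text:

```python
from math import exp

def boltzmann_probs(energies, T, k_B=1.0):
    """Normalized Boltzmann probabilities P(s) = exp(-E_s / kT) / Z.

    The list `energies` runs over states (not distinct energies)."""
    factors = [exp(-E / (k_B * T)) for E in energies]
    Z = sum(factors)  # the partition function
    return [f / Z for f in factors]

# Hypothetical three-state system with energies 0, 1, 2 (arbitrary units).
probs = boltzmann_probs([0.0, 1.0, 2.0], T=1.0)
assert abs(sum(probs) - 1.0) < 1e-12   # properly normalized
assert probs[0] > probs[1] > probs[2]  # lower energy, higher probability
```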
Using the Boltzmann Factor: Averages

If we have a system of N particles at temperature T, then the number of particles in state s is

$$N(s) = N\,P(s) = \frac{N\,e^{-E(s)/k_B T}}{Z}$$
Averages

We can use this to find the average properties of a system:

$$\bar{X} = \sum_s X(s)\,P(s)$$

The average energy is

$$\bar{E} = \sum_s E(s)\,P(s) = \frac{1}{Z}\sum_s E(s)\,e^{-E(s)/k_B T}$$

and the internal energy of the system is

$$U = N\bar{E}$$
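The averaging formula can be checked against its two limits for a hypothetical two-level system (level spacing 1, units where k_B = 1): at low T everything sits in the ground state, and at high T the levels are equally occupied, so the mean energy approaches half the spacing:

```python
from math import exp

def average_energy(energies, T, k_B=1.0):
    """<E> = sum_s E(s) P(s) with Boltzmann weights."""
    factors = [exp(-E / (k_B * T)) for E in energies]
    Z = sum(factors)
    return sum(E * f for E, f in zip(energies, factors)) / Z

E_levels = [0.0, 1.0]  # hypothetical two-level system

# At very low T almost all particles sit in the ground state...
assert average_energy(E_levels, T=0.05) < 1e-6
# ...while at very high T the two levels are equally occupied, <E> -> 1/2.
assert abs(average_energy(E_levels, T=1e4) - 0.5) < 1e-3

# Internal energy of N independent particles: U = N * <E>.
U = 1000 * average_energy(E_levels, T=1.0)
```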
Example: Height and Density

The ratio of the number of particles at different heights in a gas at uniform temperature is

$$\frac{N(z_2)}{N(z_1)} = e^{-[E(z_2) - E(z_1)]/k_B T}$$

but the difference in energy at different heights is just $mg\,\Delta z$, so

$$\frac{N(z_2)}{N(z_1)} = e^{-mg\,\Delta z/k_B T}$$

Energy Distributions
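Plugging in numbers for nitrogen at room temperature makes the height formula concrete; the density drops by roughly 10% per kilometer in an isothermal atmosphere:

```python
from math import exp

k_B = 1.380649e-23       # Boltzmann constant, J/K
g = 9.81                 # gravitational acceleration, m/s^2
m_N2 = 28 * 1.66054e-27  # mass of an N2 molecule, kg

def density_ratio(dz, T):
    """n(z2)/n(z1) = exp(-m g (z2 - z1) / (k_B T)) for an isothermal gas."""
    return exp(-m_N2 * g * dz / (k_B * T))

# At T = 300 K the density falls by about 10% over the first kilometer.
ratio_1km = density_ratio(1000.0, 300.0)
assert 0.85 < ratio_1km < 0.95
```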
Energy vs. States

It is common to express the probability in terms of energies rather than states: $P(s) \rightarrow P(E)$. If the energies are not degenerate, no change in the expression is required; but if there is more than one state with the same energy, the expression changes.

In terms of states the normalized probability is

$$P(s) = \frac{e^{-E(s)/k_B T}}{Z}$$

In terms of energies it is

$$P(E) = \frac{g(E)\,e^{-E/k_B T}}{\sum_E g(E)\,e^{-E/k_B T}}$$
Degeneracy

g(E) is the degeneracy of energy level E: the number of states with energy E. The partition function can also be rewritten in terms of the energy:

$$Z = \sum_E g(E)\,e^{-E/k_B T}$$

Sum to Integral

In many systems the spacing of the energy levels is much smaller than the typical particle energy (certainly true in classical systems). In this case the sums over energy can be replaced by integrals:

$$\sum_E \;\longrightarrow\; \int dE$$
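The sum-to-integral replacement can be sanity-checked numerically. As an illustration (this fact is assumed, not derived in the text), the density of states of an ideal monatomic gas grows as $\sqrt{E}$, and the integral form of the average then gives $\bar{E} = \tfrac{3}{2}k_B T$:

```python
from math import sqrt, exp

kT = 1.0  # work in units where k_B * T = 1

def integrate(f, a, b, steps=100_000):
    """Midpoint-rule quadrature, adequate for this smooth integrand."""
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) for i in range(steps))

weight = lambda E: sqrt(E) * exp(-E / kT)  # D(E) * Boltzmann factor
E_max = 40.0 * kT                          # integrand negligible beyond this

mean_E = (integrate(lambda E: E * weight(E), 0.0, E_max)
          / integrate(weight, 0.0, E_max))
assert abs(mean_E - 1.5 * kT) < 1e-3       # <E> = (3/2) kT
```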
Sum to Integral

The degeneracy g(E) must be replaced by a continuous quantity called the density of states, D(E), where D(E) dE is the number of states with energies between E and E + dE. Then

$$P(E)\,dE = \frac{D(E)\,e^{-E/k_B T}\,dE}{\int_0^\infty D(E)\,e^{-E/k_B T}\,dE}$$

where P(E) dE is the probability that the system energy is between E and E + dE.

Quantum Statistics

For classical systems, and for fermion systems where $k_B T \gg E_F$, the Boltzmann distribution is valid; but at very low temperatures we need to replace the Boltzmann distribution with the appropriate quantum distribution:

$$P(E)\,dE = \frac{D(E)\,f(E)\,dE}{\int_0^\infty D(E)\,f(E)\,dE}$$

where

$$f(E) = \begin{cases} \dfrac{1}{e^{(E-E_F)/k_B T} + 1} & \text{fermions} \\[2ex] \dfrac{1}{e^{(E-\mu)/k_B T} - 1} & \text{bosons} \end{cases}$$
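The two quantum occupation functions are one-liners, and their key properties are easy to verify: the Fermi-Dirac function equals 1/2 exactly at $E = E_F$ and never exceeds 1, and both reduce to the Boltzmann factor when $(E - \mu) \gg k_B T$:

```python
from math import exp

def fermi_dirac(E, E_F, kT):
    """Mean occupation of a fermion state: 1 / (exp((E - E_F)/kT) + 1)."""
    return 1.0 / (exp((E - E_F) / kT) + 1.0)

def bose_einstein(E, mu, kT):
    """Mean occupation of a boson state: 1 / (exp((E - mu)/kT) - 1); needs E > mu."""
    return 1.0 / (exp((E - mu) / kT) - 1.0)

# Fermi-Dirac occupation is exactly 1/2 at E = E_F and never exceeds 1.
assert abs(fermi_dirac(1.0, 1.0, 0.1) - 0.5) < 1e-12
assert fermi_dirac(0.0, 1.0, 0.1) < 1.0

# Both reduce to the Boltzmann factor for (E - mu) >> kT.
E, mu, kT = 10.0, 0.0, 1.0
boltz = exp(-(E - mu) / kT)
assert abs(fermi_dirac(E, mu, kT) - boltz) / boltz < 1e-3
assert abs(bose_einstein(E, mu, kT) - boltz) / boltz < 1e-3
```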
Heat Capacity

The heat capacity of a system is defined as

$$C = \frac{dU}{dT}$$

Experimentally, the heat capacity is the amount of heat that must be added to the system to raise its temperature by one unit.
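The definition C = dU/dT can be evaluated numerically for the hypothetical two-level system used earlier (level spacing 1, k_B = 1 units); its heat capacity vanishes at both temperature extremes and peaks in between:

```python
from math import exp

def U(T, N=1.0, eps=1.0):
    """Internal energy of N two-level particles with spacing eps (k_B = 1)."""
    return N * eps * exp(-eps / T) / (1.0 + exp(-eps / T))

def heat_capacity(T, dT=1e-5):
    """C = dU/dT via a central finite difference."""
    return (U(T + dT) - U(T - dT)) / (2.0 * dT)

# C is small at very low and very high T, with a peak in between.
assert heat_capacity(0.05) < heat_capacity(0.4)
assert heat_capacity(10.0) < heat_capacity(0.4)
```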
Degrees of Freedom

The temperature dependence of the internal energy of a system can be determined using the equipartition theorem:

Monatomic gas: $U = \frac{3}{2}Nk_B T$, so $C = \frac{3}{2}Nk_B$

Diatomic gas: $U = \frac{7}{2}Nk_B T$, so $C = \frac{7}{2}Nk_B$

Elemental solid: $U = 3Nk_B T$, so $C = 3Nk_B$

Quantum Effects: Diatomic Molecules
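Equipartition assigns each quadratic degree of freedom $\tfrac{1}{2}k_B T$ of internal energy, so molar heat capacities follow directly from the degree-of-freedom count (a quick numerical check using standard constants):

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def molar_heat_capacity(dof):
    """C = (dof / 2) * N_A * k_B for one mole, in J/(mol K)."""
    return 0.5 * dof * N_A * k_B

C_monatomic = molar_heat_capacity(3)  # 3 translational dof: (3/2) R
C_diatomic = molar_heat_capacity(7)   # + 2 rotational + 2 vibrational
C_solid = molar_heat_capacity(6)      # 3 kinetic + 3 potential: 3 R

assert abs(C_monatomic - 12.47) < 0.01  # ~ (3/2) R = 12.47 J/(mol K)
assert abs(C_solid - 24.94) < 0.01      # ~ 3 R (Dulong-Petit value)
```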
Quantum Effects: Solids