Statistical Physics. The Second Law. Most macroscopic processes are irreversible in everyday life.


Statistical Physics. The Second Law. Time's Arrow. Most macroscopic processes in everyday life are irreversible. Glass breaks but does not reform. Coffee cools to room temperature but does not spontaneously heat up.

Probability. Thermodynamic processes happen in one direction rather than the other because of probabilities: systems move toward the most probable macrostate. Two-State Systems. Counting States.

Coin Flips. All 8 possible microstates of a set of three coins (each coin shows heads or tails).

Coin Flips: Macrostates.
3 Heads (1 microstate)
2 Heads (3 microstates)
1 Head (3 microstates)
0 Heads (1 microstate)

The number of microstates corresponding to a given macrostate is called the multiplicity, Ω.

Coin Flips. Assuming the coins are fair, all 8 microstates are equally probable. Thus the probability of any given macrostate is its multiplicity divided by 8: P = Ω/8.

100 Coins. The number of microstates for 100 coins is 2^100. The number of macrostates is 101: 0 heads, 1 head, 2 heads, ..., 100 heads. Multiplicities: Ω(0) = 1, Ω(1) = 100.

100 Coins. To find Ω(2): 100 choices for the 1st coin, 99 choices for the 2nd coin; any pair occurs in either order, so divide by two: Ω(2) = 100·99/2. To find Ω(3): 100 choices for the 1st coin, 99 for the 2nd, 98 for the 3rd; but any triplet has 3 choices for the 1st flip, then 2 choices for the 2nd flip, so divide by 3·2·1: Ω(3) = 100·99·98/(3·2·1).

Combinatorics. You can probably see the pattern: for N coins, the multiplicity of the macrostate with n heads is

Ω(n) = N! / (n!(N − n)!)
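The counting argument above is easy to check directly; here is a quick sketch in Python using only the standard library:

```python
from math import factorial

N = 100  # number of coins

# Multiplicity of the macrostate with n heads: Omega(n) = N! / (n! (N - n)!)
def multiplicity(N, n):
    return factorial(N) // (factorial(n) * factorial(N - n))

# The step-by-step counting from the slides:
omega2 = (100 * 99) // 2                 # any pair occurs in either order
omega3 = (100 * 99 * 98) // (3 * 2 * 1)  # any triplet occurs in 3! orders

print(multiplicity(N, 0))            # 1
print(multiplicity(N, 1))            # 100
print(multiplicity(N, 2) == omega2)  # True
print(multiplicity(N, 3) == omega3)  # True

# Sanity check: the multiplicities of all 101 macrostates sum to 2^100 microstates
print(sum(multiplicity(N, n) for n in range(N + 1)) == 2**N)  # True
```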

Combinatorics. In general, the number of ways of arranging N indistinguishable particles among n states follows from the same kind of counting. The number of ways of arranging N distinguishable particles with n of them in a given state is

Ω(n) = N! / (n!(N − n)!)

Statistics to Physics. Microstates, Macrostates and Entropy.

The Boltzmann Postulate. An isolated system in equilibrium is equally likely to be in any of its accessible microstates. Accessible: physically allowed and reachable via some process.

Gas in a Box. A box contains N = 4 identical particles. Let us describe the macrostate in terms of the number of particles on the right side of the box, N_R. The multiplicity of each macrostate can be computed using the formulas above or by inspection.

Gas in a Box # "(N R ) = N & % ( = $ N R ' N! N R!(N ) N R )! Ω(0) = 1 Ω(1) = 4 Ω(2) = 6 Ω(All) = 16 As each microstate has the same probability (according to the Boltzmann Postulate) the probability of a given macrostate is proportional to the multiplicity. Gas in a Box 10

The 2nd Law, Again. The equilibrium state is the most probable macrostate. The most probable macrostate is the state with the most microstates, that is, the largest multiplicity. Boltzmann realized that entropy is maximum at equilibrium, so there must be a connection between multiplicity and entropy. Boltzmann defined entropy as

S ≡ k_B ln Ω

The 2nd Law, Again. The result of our analysis of multiplicity, "Any large system in equilibrium will be found in the macrostate with the greatest multiplicity," or "Multiplicity tends to increase," becomes the Second Law of Thermodynamics: Entropy tends to increase.
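The definition S = k_B ln Ω can be applied directly to the coin macrostates above; this short sketch confirms that entropy, like multiplicity, peaks at the 50/50 macrostate:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Entropy of a coin-flip macrostate: S = k_B ln Omega, with Omega = C(N, n)
def entropy(N, n):
    return k_B * math.log(math.comb(N, n))

N = 100
S = [entropy(N, n) for n in range(N + 1)]

# The 0-heads macrostate has Omega = 1, hence zero entropy
print(entropy(N, 0))     # 0.0

# Entropy is largest for the most probable (50 heads) macrostate
print(S.index(max(S)))   # 50
```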

Temperature and Entropy. Objects in Thermal Contact. Consider two systems, A and B, that can exchange thermal energy (heat) back and forth. Suppose that initially heat is flowing slowly from A to B. As the two systems exchange energy, they also change their entropy. (Figure: Solid A passing heat to Solid B.)

Entropy Change. The total entropy change while an infinitesimal amount of energy dE flows is

dS_total = dS_A + dS_B

The energy exchange causes the entropy change, so we can write

dS_total = (∂S/∂E)_A dE_A + (∂S/∂E)_B dE_B

But the systems exchange energy, so dE_A = −dE_B:

dS_total = [ (∂S/∂E)_A − (∂S/∂E)_B ] dE_A

Temperature and Entropy. Now if the two systems are far from equilibrium, the entropy change of the whole system must be positive as it moves toward maximum entropy. If the system is at equilibrium, the entropy change must be zero. Thus the two systems are in equilibrium when

(∂S/∂E)_A = (∂S/∂E)_B

Temperature and Entropy. But temperature is the quantity we define as being the same when two objects are in equilibrium. Therefore we can define

(∂S/∂E) ≡ 1/T

where the inverse ensures that heat flows from the higher-temperature object to the lower-temperature object.

Temperature and Entropy. More formally, we define the temperature of a system with a fixed number of particles and a fixed volume by

1/T = (∂S/∂U)_{N,V}

The letter U is used for energy to emphasize that it is the internal energy of the system.
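The definition 1/T = (∂S/∂U)_{N,V} can be evaluated numerically for any model whose multiplicity is known. As an illustration only, this sketch uses the Einstein-solid multiplicity Ω(N, q) = C(q + N − 1, q), a model not introduced in these slides, with internal energy U = qε:

```python
import math

# Einstein solid: N oscillators sharing q energy quanta of size eps.
# The multiplicity formula Omega(N, q) = C(q + N - 1, q) is an assumption
# beyond these slides, used only to illustrate 1/T = dS/dU numerically.
def S(N, q, k_B=1.0):
    return k_B * math.log(math.comb(q + N - 1, q))

def temperature(N, q, eps=1.0):
    # Central finite difference for 1/T = dS/dU, with U = q * eps
    dS_dU = (S(N, q + 1) - S(N, q - 1)) / (2 * eps)
    return 1.0 / dS_dU

N = 50
print(temperature(N, 100) > 0)                       # True: temperature is positive
print(temperature(N, 50) < temperature(N, 200))      # True: more energy -> hotter
```

Here k_B and ε are set to 1 so the temperature comes out in natural units; the point is only that ∂S/∂U decreases as energy is added, so T rises.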

Statistical Physics. The Partition Function and the Boltzmann Factor. Isolated vs. Constant-Temperature Systems. Our previous work has been with isolated systems, but what about systems at constant temperature? What is the probability of finding a system in a particular energy state at a given temperature? System A exchanges energy with a temperature reservoir B.

One Big System and One Small System. If we can cast the problem in terms of an isolated system, we can use the work we have already developed. We can treat the combined system as isolated if system A is much smaller than system B, so that the temperature of B is unaffected by the exchange of energy with A. Let's choose system A to be a single atom.

Probability Ratios. Let's look at the ratio of probabilities for two of the atom's microstates, s₁ and s₂. The energies and probabilities of these states are written as E(s₁), P(s₁) and E(s₂), P(s₂).

Probability Ratios. For an isolated system each accessible state is equally likely, but the atom is not an isolated system. The atom + reservoir is an isolated system, and the reservoir has a much larger number of microstates than the atom, so

P(s₂)/P(s₁) = Ω_R(s₂)/Ω_R(s₁)

Probability Ratios. Using the relationship S = k_B ln Ω, so that Ω = e^{S/k_B}, we can write

P(s₂)/P(s₁) = e^{[S_R(s₂) − S_R(s₁)]/k_B}

Entropy and Energy. We previously proved the thermodynamic identity, so for dN = 0 and dV = 0,

dS = dU/T

Energy and Entropy. The difference in entropies of the reservoir corresponding to the two states is

S_R(s₂) − S_R(s₁) = [E_R(s₂) − E_R(s₁)]/T

The difference in energy of the reservoir is just minus the difference between the two atomic states:

E_R(s₂) − E_R(s₁) = −[E(s₂) − E(s₁)]

Probabilities. Returning to our ratio, the ratio of probabilities becomes

P(s₂)/P(s₁) = e^{−[E(s₂) − E(s₁)]/k_B T}

The Boltzmann Factor. This means that the probability of finding a system in energy state s at temperature T is proportional to the Boltzmann factor:

P(s) ∝ e^{−E(s)/k_B T}

This only gives a proportionality; to get the total probability we have to normalize.

The Partition Function. The normalized probability is

P(s) = e^{−E(s)/k_B T} / Z

where Z is called the partition function and is given by

Z = Σ_s e^{−E(s)/k_B T}

The Most Important Equation. The most important equation in classical statistical mechanics is P(s) = e^{−E(s)/k_B T} / Z. Note: the sum in Z is over all states of the system, not all energies.
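A minimal numeric sketch of P(s) = e^{−E(s)/k_B T} / Z, for a hypothetical system with three energy levels (the level energies below are made-up illustration values, not from the slides):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probabilities(energies, T):
    """Normalized probabilities P(s) = exp(-E_s / k_B T) / Z."""
    factors = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(factors)  # partition function: sum over STATES, not energies
    return [f / Z for f in factors]

# Hypothetical three-state system; energies in joules, chosen for illustration
energies = [0.0, 1e-21, 2e-21]
probs = boltzmann_probabilities(energies, T=300.0)

print(abs(sum(probs) - 1.0) < 1e-12)    # True: probabilities are normalized
print(probs[0] > probs[1] > probs[2])   # True: lower-energy states are more probable
```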

Using the Boltzmann Factor: Averages. If we have a system of N particles at temperature T, then the number of particles in state s, N(s), is given by

N(s) = N P(s) = N e^{−E(s)/k_B T} / Z

Averages. We can use this to find the average properties of a system:

⟨X⟩ = Σ_s X(s) P(s)

Averages. The average energy is

⟨E⟩ = Σ_s E(s) P(s) = (1/Z) Σ_s E(s) e^{−E(s)/k_B T}

The internal energy of the system is U = N⟨E⟩.
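The average ⟨E⟩ = Σ_s E(s) P(s) can be sketched the same way, again with a made-up level spacing for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def average_energy(energies, T):
    """<E> = sum_s E(s) P(s), with P(s) = exp(-E_s / k_B T) / Z."""
    factors = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(factors)
    return sum(E * f for E, f in zip(energies, factors)) / Z

# Hypothetical two-level system with energy gap eps (illustration value)
eps = 1e-21
avg = average_energy([0.0, eps], T=300.0)

# <E> lies between the two level energies, and below eps/2 at finite T
# (it approaches eps/2 only as T -> infinity)
print(0.0 < avg < eps / 2)  # True

# Internal energy of N such independent particles: U = N * avg
```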

Example: Height and Density. The ratio of the number of particles at different heights in a gas at uniform temperature is

n(z₂)/n(z₁) = e^{−[E(z₂) − E(z₁)]/k_B T}

but the difference in energy at different heights is just mgΔz, so

n(z₂)/n(z₁) = e^{−mgΔz/k_B T}

Energy Distributions.
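Plugging numbers into the density-ratio formula, taking nitrogen at room temperature over a 1 km height difference as an example:

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
m_N2 = 28 * 1.66054e-27   # mass of an N2 molecule, kg
g = 9.81                  # gravitational acceleration, m/s^2
T = 300.0                 # uniform temperature, K
dz = 1000.0               # height difference, m

# n(z2)/n(z1) = exp(-m g dz / k_B T)
ratio = math.exp(-m_N2 * g * dz / (k_B * T))
print(ratio)  # ~0.90: density falls by roughly 10% per kilometer
```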

Energy vs. States. It is common to express the probability in terms of energies rather than states. If the energies are not degenerate, no change in the expression is required, but if there is more than one state with the same energy the expression changes: P(s) → P(E).

Energy vs. States. In terms of states the normalized probability is

P(s) = e^{−E(s)/k_B T} / Z

In terms of energies it is

P(E) = g(E) e^{−E/k_B T} / Σ_E g(E) e^{−E/k_B T}

Degeneracy g(e) is the degeneracy of energy level E. It is the number of states with the energy E. he partition function can also be rewritten in terms of the energy. Z = # E "E / k g(e)e Sum to Integral In many systems the spacing of energy levels is much smaller than the typical particle energy. (Certainly true in classical systems.) In this case the sums over energy can be replaced by integrals. " # $ de E 25

Sum to Integral. The degeneracy g(E) must be replaced by a continuous quantity called the density of states, D(E): D(E) dE is the number of states with energies between E and E + dE.

P(E) dE = D(E) e^{−E/k_B T} dE / ∫₀^∞ D(E) e^{−E/k_B T} dE

where P(E) dE is the probability that the system energy is between E and E + dE.

Quantum Statistics. For classical systems, and for fermion systems where k_B T ≫ E_F, the Boltzmann distribution is valid, but at very low temperatures we need to replace the Boltzmann distribution with the appropriate quantum distribution:

P(E) dE = D(E) f(E) dE / ∫₀^∞ D(E) f(E) dE

where

f(E) = 1 / (e^{(E − E_F)/k_B T} + 1)   for fermions
f(E) = 1 / (e^{(E − µ)/k_B T} − 1)     for bosons
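The two quantum occupation functions are one-liners; this sketch checks the characteristic Fermi-Dirac behavior (the Fermi energy used is an assumed illustration value, roughly that of a typical metal):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def f_fermi(E, E_F, T):
    """Fermi-Dirac occupation: 1 / (exp((E - E_F)/k_B T) + 1)."""
    return 1.0 / (math.exp((E - E_F) / (k_B * T)) + 1.0)

def f_bose(E, mu, T):
    """Bose-Einstein occupation: 1 / (exp((E - mu)/k_B T) - 1); valid for E > mu."""
    return 1.0 / (math.exp((E - mu) / (k_B * T)) - 1.0)

E_F = 8e-19   # assumed Fermi energy (~5 eV), illustration only
T = 300.0

print(f_fermi(E_F, E_F, T))  # 0.5: occupation is exactly 1/2 at E = E_F
# Occupation drops from ~1 below E_F to ~0 above it
print(f_fermi(E_F - 1e-19, E_F, T) > 0.5 > f_fermi(E_F + 1e-19, E_F, T))  # True
```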

Heat Capacity. Definition of Heat Capacity. The heat capacity of a system is defined as

C = dU/dT

Experimentally, the heat capacity is the amount of heat that must be added to the system to raise its temperature by one unit.

Degrees of Freedom. The temperature dependence of the internal energy of a system can be determined using the equipartition theorem: each quadratic degree of freedom contributes (1/2)k_B T per particle.

Monatomic gas: U = (3/2) N k_B T, C = (3/2) N k_B
Diatomic gas (translation + rotation + vibration): U = (7/2) N k_B T, C = (7/2) N k_B
Elemental solid: U = (6/2) N k_B T = 3 N k_B T, C = 3 N k_B

Quantum Effects: Diatomic Molecules.
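The equipartition results translate directly into molar heat capacities; a quick check against the classical Dulong-Petit value for solids:

```python
# Each quadratic degree of freedom contributes (1/2) k_B per particle to C,
# i.e. (1/2) R per mole.
R = 8.314  # gas constant, J/(mol K)

# Degrees of freedom per particle, as counted by equipartition
dof = {"monatomic gas": 3, "diatomic gas": 7, "elemental solid": 6}
C_molar = {name: n * R / 2 for name, n in dof.items()}

for name, C in C_molar.items():
    print(f"{name}: C = {C:.1f} J/(mol K)")

# The solid value is the Dulong-Petit law: C = 3R, about 24.9 J/(mol K)
```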

Quantum Effects: Solids.