Physics 172H Modern Mechanics


Physics 172H Modern Mechanics. Instructor: Dr. Mark Haugan, Office: PHYS 282, haugan@purdue.edu. TAs: Alex Kryzwda (akryzwda@purdue.edu), John Lorenz (jdlorenz@purdue.edu). Lecture 22: Matter & Interactions, Ch. 12.3-12.5

Last Time. Einstein model of solids: assume atoms oscillate independently of each other; each atom comprises 3 independent quantum harmonic oscillators, so N oscillators correspond to N/3 atoms. Each oscillator has energy quantum ΔE = ħ√(k/m) = ħω. Number of microstates for N oscillators sharing q quanta: Ω = (q + N - 1)! / (q! (N - 1)!). Fundamental assumption of statistical mechanics: over time, an isolated system in a given macrostate (total energy) is equally likely to be found in any of its microstates (microscopic distribution of energy).
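As an aside (not part of the original slides), here is a minimal Python sketch of this counting formula; the function name multiplicity is our own choice:

```python
# Sketch (not from the lecture): multiplicity of an Einstein solid,
# Omega = (q + N - 1)! / (q! (N - 1)!), computed with exact integers.
from math import comb

def multiplicity(N, q):
    """Number of ways to distribute q quanta among N oscillators."""
    return comb(q + N - 1, q)

# Example from the lecture: 4 quanta among 3 oscillators (1 atom)
print(multiplicity(3, 4))  # 15
```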

Last time we enumerated the 15 different microstates for 4 energy quanta distributed among 3 oscillators (1 atom). The Fundamental Assumption of Statistical Mechanics: a fundamental assumption of statistical mechanics is that in our state of macroscopic ignorance, each microstate (microscopic distribution of energy) corresponding to a given macrostate (total energy) is equally probable.
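A small illustrative sketch (not from the slides) that lists those microstates explicitly by brute force:

```python
# Enumerate the microstates for q = 4 quanta among N = 3 oscillators.
from itertools import product

N, q = 3, 4
microstates = [s for s in product(range(q + 1), repeat=N) if sum(s) == q]
print(len(microstates))   # 15
print(microstates[:3])    # (0, 0, 4), (0, 1, 3), (0, 2, 2), ...
```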

Q1. Two objects share a total energy E = E1 + E2. There are 3 ways to arrange an amount of energy E1 in the first object and 6 ways to arrange an amount of energy E2 in the second object. How many different ways are there to arrange the total energy E = E1 + E2 so that there is E1 in the first object and E2 in the other? A) 3 B) 6 C) 9 D) 18 E) 3^6 = 729. [Diagram: each of the 3 ways to arrange E1 in object 1 can be combined with any of the 6 ways to arrange E2 in object 2, giving 3 x 6 = 18 total ways.]

Fundamental Assumption of Statistical Mechanics. The fundamental assumption of statistical mechanics is that, over time, an isolated system in a given macrostate is equally likely to be found in any of its microstates. [Chart: probability of each way of splitting the quanta between atom 1 and atom 2; the even distribution of energy is the most likely case at 29%, the neighboring splits occur 24% of the time each, and the extreme splits 12% each.]

The chart for the 4-quantum macrostate of the 2-atom system. Each line is a different energy distribution (value of q1):

q1   q2   Ω1   Ω2   Ω1·Ω2
 0    4    1   15     15
 1    3    3   10     30
 2    2    6    6     36
 3    1   10    3     30
 4    0   15    1     15
              Total = 126

The entry Ω1·Ω2 = 36 is the number of microstates available for the q1 = 2 energy distribution; the total, 126, is the total number of microstates accessible to the entire 2-atom system.
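A minimal sketch (not from the slides) that rebuilds this table with the multiplicity function defined above; each atom contributes 3 oscillators and q1 + q2 = 4:

```python
from math import comb

def multiplicity(N, q):
    return comb(q + N - 1, q)

total = 0
for q1 in range(5):
    q2 = 4 - q1
    w1, w2 = multiplicity(3, q1), multiplicity(3, q2)
    total += w1 * w2
    print(q1, q2, w1, w2, w1 * w2)
print("Total =", total)   # 126
```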

Two Big Blocks Interacting. Add 100 quanta of energy; most microstates distribute the energy evenly. Using Ω = (q + N - 1)! / (q! (N - 1)!) with q1 + q2 = 100 (block 1 has 300 oscillators, block 2 has 200):

q1   q2       Ω1          Ω2         Ω1·Ω2
 0   100         1     2.77 E+81    2.77 E+81
 1    99       300     9.27 E+80    2.78 E+83
 2    98   4.52 E+4    3.08 E+80    1.39 E+85
 3    97   4.55 E+6    1.02 E+80    4.62 E+86
 4    96   3.44 E+8    3.33 E+79    1.15 E+88
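A rough sketch (not from the slides) of the same table, computed exactly with Python's arbitrary-precision integers for the 300- and 200-oscillator blocks:

```python
from math import comb

def multiplicity(N, q):
    return comb(q + N - 1, q)

N1, N2, Q = 300, 200, 100
for q1 in range(5):
    q2 = Q - q1
    w1, w2 = multiplicity(N1, q1), multiplicity(N2, q2)
    print(q1, q2, f"{float(w1):.3g}", f"{float(w2):.3g}", f"{float(w1 * w2):.3g}")
```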

Two Important Observations. 1. The most likely circumstance is that the energy is divided in proportion to the number of oscillators in each object: the most likely state has 60% of the quanta in the block with 60% of the oscillators. 2. As the number of oscillators gets larger, the curve gets narrower relative to its height: relatively smaller fluctuations are likely. [Plot: number of microstates vs. q1, peaked at 60%, shown for blocks of 300 and 200 oscillators and again for 3x10^23 and 2x10^23 oscillators; adding more oscillators narrows the peak.]
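A sketch (not from the slides) that locates the peak of Ω1·Ω2 numerically; it works with ln(Ω) via lgamma so that large oscillator counts pose no overflow problem:

```python
from math import lgamma

def ln_multiplicity(N, q):
    # ln[(q + N - 1)! / (q! (N - 1)!)]
    return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

N1, N2, Q = 300, 200, 100
ln_w = [ln_multiplicity(N1, q1) + ln_multiplicity(N2, Q - q1) for q1 in range(Q + 1)]
peak = max(range(Q + 1), key=lambda q1: ln_w[q1])
print("peak at q1 =", peak, "->", 100 * peak / Q, "% of the quanta")  # ~60%
```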

Definition of Entropy. Entropy: S ≡ k ln Ω, where Boltzmann's constant is k = 1.38 x 10^-23 J/K. It is convenient to deal with ln(Ω) instead of Ω for several reasons: 1. Ω is typically huge, and ln(Ω) is a smaller number to deal with. 2. Ω12 = Ω1·Ω2 (multiplies), whereas ln Ω12 = ln Ω1 + ln Ω2 (adds). Double the size, double the entropy ("extensive" property).

Entropy: S ≡ k ln Ω, where k = 1.4 x 10^-23 J/K. Note the much smaller numbers. We will see that in equilibrium, the most probable distribution of energy quanta is the one that maximizes the total entropy (Second Law of Thermodynamics).

Entropy: S ≡ k ln Ω, where k = 1.4 x 10^-23 J/K. Why entropy? k ln(Ω1·Ω2) = k ln(Ω1) + k ln(Ω2) (property of logarithms). When there is more than one object in a system, we can add entropies just like we can add energies: S_total = S1 + S2, E_total = E1 + E2. Remember: Ω = (q + N - 1)! / (q! (N - 1)!).
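A minimal sketch (not from the slides) of the entropy itself, again using lgamma so huge Ω values never need to be formed explicitly; the 60/40 split of 100 quanta is used as an example:

```python
from math import lgamma

k = 1.38e-23  # J/K, Boltzmann's constant

def entropy(N, q):
    # S = k * ln[(q + N - 1)! / (q! (N - 1)!)]
    return k * (lgamma(q + N) - lgamma(q + 1) - lgamma(N))

S1 = entropy(300, 60)   # block 1: 300 oscillators, 60 quanta
S2 = entropy(200, 40)   # block 2: 200 oscillators, 40 quanta
print(S1 + S2)          # total entropy of the 60/40 distribution, in J/K
```

Because ln(Ω1·Ω2) = ln Ω1 + ln Ω2, the total entropy is simply S1 + S2, mirroring the additivity noted above.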

Approaching Equilibrium. [Plot: number of microstates vs. q1, with energy distribution A and energy distribution B marked.] Assume the system starts in an energy distribution A far away from equilibrium (far from the most likely state). What happens as the system evolves? Crucial point: consider our system to contain both blocks. The macrostate of the system is specified by the total number of quanta: 100. All microstates compatible with q_total = 100 are equally likely, but not all energy distributions (values of q1) are equally likely!

Approaching Equilibrium. [Figure: the microstates of the system, each square representing 10^112 microstates; most squares lie near the peak, very few far away from the peak.]

Entropy Increased! [Plot with energy distribution A and energy distribution B marked.] There are many more microstates compatible with 60/40 than with 90/10. Hence, the entropy S = k ln Ω has increased. This behavior is very general...

Second Law of Thermodynamics. A mathematically sophisticated treatment of these ideas was developed by Boltzmann in the late 1800s. The result of such considerations, confirmed by all experiments, is THE SECOND LAW OF THERMODYNAMICS: if a closed system is not in equilibrium, the most probable consequence is that the entropy of the system will increase. By "most probable" we don't mean just slightly more probable, but rather so ridiculously probable that essentially no other possibility is ever observed. Hence, even though it is a statistical statement, we call it the 2nd Law.

Irreversibility. An isolated system goes easily from low entropy to high entropy, but never from high to low. [Figure: each square represents 10^112 microstates; near the peak the entropy is high, far away from the peak it is low.]

Order, Disorder, and the 2nd Law. There are many more ways to distribute the perfume molecules throughout the room than there are ways to distribute them in the bottle. The natural tendency is for S to increase, meaning the perfume diffuses. Intuitively, this final, higher-entropy state is more disordered than the initial state.

Leading question Q2: This logarithmic plot shows the number of ways 100 quanta can be distributed between box 1 (300 oscillators) and box 2 (200 oscillators). Look at the end where q1 = 0. If you had to guess, which box is hotter? A. Box 1 (300 oscillators, q1 = 0). B. Box 2 (200 oscillators, q2 = 100).

Thermal transfer of energy

S = S1 + S2. Note: the highest-entropy S is the most likely macrostate (equilibrium). Remember the 2nd Law! Since q2 = q_total - q1,

dS/dq1 = dS1/dq1 + dS2/dq1 = dS1/dq1 - dS2/dq2 = 0 (at equilibrium),

so

dS1/dq1 = dS2/dq2 (at equilibrium).

At equilibrium, these two slopes are equal. What else is equal here?
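A sketch (not from the slides) checking numerically that the two entropy slopes match at the q1 that maximizes S = S1 + S2, for the same 300/200-oscillator blocks sharing 100 quanta:

```python
from math import lgamma

k = 1.38e-23  # J/K

def S(N, q):
    return k * (lgamma(q + N) - lgamma(q + 1) - lgamma(N))

N1, N2, Q = 300, 200, 100
total_S = [S(N1, q1) + S(N2, Q - q1) for q1 in range(Q + 1)]
q1_star = max(range(Q + 1), key=lambda q1: total_S[q1])

slope1 = (S(N1, q1_star + 1) - S(N1, q1_star - 1)) / 2          # dS1/dq1
slope2 = (S(N2, Q - q1_star + 1) - S(N2, Q - q1_star - 1)) / 2  # dS2/dq2
print(q1_star, slope1, slope2)   # slopes nearly equal at the peak (q1 near 60)
```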

At equilibrium, these two slopes are equal. What else is equal here? The temperature of the two objects that are in contact and have come to equilibrium. This leads to the definition of temperature in statistical mechanics. Temperature: 1/T ≡ ∂S/∂E_int. Remember: the box is hotter when the slope is smaller. We use E_int instead of q1 to be more general.
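A sketch (not from the slides) of this definition in use: estimate T from 1/T = dS/dE_int for one Einstein solid, taking E_int = q·ΔE with an assumed, purely illustrative quantum ΔE = 4e-21 J:

```python
from math import lgamma

k = 1.38e-23   # J/K
dE = 4e-21     # J per quantum (illustrative value, not from the lecture)

def S(N, q):
    return k * (lgamma(q + N) - lgamma(q + 1) - lgamma(N))

N, q = 300, 60
dS_dE = (S(N, q + 1) - S(N, q - 1)) / (2 * dE)   # finite-difference slope dS/dE_int
print("T ~", 1 / dS_dE, "K")
```

A steeper slope dS/dE_int gives a lower temperature, consistent with "the box is hotter when the slope is smaller."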

SUMMARY: 1. We learned to count the number of microstates in a certain macrostate, using the Einstein model of a solid. 2. We defined ENTROPY. 3. We observed that the 2nd Law of Thermodynamics requires that interacting systems move toward their most likely macrostate (equilibrium reflects statistics!); this explains irreversibility. 4. We defined TEMPERATURE to correspond to the observed behavior of systems moving into equilibrium, using entropy as a guide (plus the 2nd Law and statistics). Entropy: S ≡ k ln Ω. Temperature: 1/T ≡ ∂S/∂E_int.