Thermodynamics: Second Law, Entropy
Lana Sheridan, De Anza College
May 9, 2018

Last time: entropy (macroscopic perspective)

Overview: entropy (microscopic perspective)

Reminder of Example from Last Lecture

What is the entropy change during an adiabatic free expansion of an isolated gas of n moles going from volume V_i to volume V_f? (Note: such a process is not reversible.)

In an adiabatic free expansion Q = 0, and since the gas is isolated, W = 0, and therefore T_f = T_i. Then

$$\Delta S = n C_V \ln\left(\frac{T_f}{T_i}\right) + n R \ln\left(\frac{V_f}{V_i}\right)$$

which, using ln 1 = 0, becomes

$$\Delta S = n R \ln\left(\frac{V_f}{V_i}\right)$$
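As a quick numerical check, here is a minimal Python sketch of this result (my addition, not part of the lecture; the mole number and volumes are made-up example values):

```python
import math

R = 8.314  # gas constant, J / (mol K)

def free_expansion_entropy_change(n, V_i, V_f):
    """Entropy change for an adiabatic free expansion of an ideal gas.

    Since T_f = T_i, only the volume term survives:
    dS = n R ln(V_f / V_i).
    """
    return n * R * math.log(V_f / V_i)

# Example: 1 mol of gas doubling its volume gains about 5.76 J/K.
print(free_expansion_entropy_change(n=1.0, V_i=1.0, V_f=2.0))
```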

Clausius Equality

Clausius found that the entropy change around any reversible cycle (closed path) is zero. This is called the Clausius Equality:

$$\Delta S = \oint \frac{dQ_r}{T} = 0$$

Moreover, since real engines can never quite conform to this ideal, for all engines the Clausius Inequality holds:

$$\Delta S_{\text{sys+env}} = \oint \frac{dQ}{T} \geq 0$$

(If the system's entropy decreases, the environment's entropy must increase.)

Entropy in an isolated system

This gives us another way to state the second law:

2nd Law: In an isolated system, entropy does not decrease.

In a non-isolated system (either closed or open), entropy can decrease, but only by increasing the entropy of the environment at least as much.

Entropy Microscopically

Entropy is a measure of disorder in a system. It can also be used as a measure of information content.

Intriguingly, entropy was introduced separately in physics and then later in information theory. The fact that these two measures are the same was observed by John von Neumann.

Entropy

According to Claude Shannon, who developed Shannon entropy, or information entropy:

"I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. [...] Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"

So what is entropy?

Consider the Yo app (valued at $5-10 million in 2014). You can only use it to send the message "yo". If you get a message on the app, you can guess what it will say.

The message has no information content; it is perfectly ordered, and there is no uncertainty. The message is a physical system that can only be in one state.

So what is entropy?

But what if the message is only sent with 50% probability? If you get the message, it means "let's meet for drinks"; if not, it means "I'm still in a meeting and can't join you."

Now you learn something when you get the message. The information content is 1 bit.

So what is entropy?

The (Shannon) entropy of a message m:

$$H(m) = -\sum_i p_i \log_2 p_i$$

where p_i is the probability of receiving message i.

For the fully-determined message "yo":

$$H(m) = -1 \log_2 1 - 0 \log_2 0 = 0 \text{ bits}$$

For the "yo"-or-no message:

$$H(m) = -\tfrac{1}{2} \log_2 \tfrac{1}{2} - \tfrac{1}{2} \log_2 \tfrac{1}{2} = \log_2 2 = 1 \text{ bit}$$
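The formula is easy to verify by direct computation. Below is a short Python sketch (my addition, not part of the lecture) that reproduces both results, with the usual convention that terms with p_i = 0 contribute nothing:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum_i p_i log2(p_i), in bits.

    Zero-probability outcomes are skipped, since 0 log 0 = 0
    by convention.
    """
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# The fully determined "yo" message: one outcome, probability 1.
print(shannon_entropy([1.0]))        # 0.0 bits

# The "yo"-or-no message: two outcomes, each with probability 1/2.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
```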

Entropy in Thermodynamics

In physics, we express entropy a little differently:

$$S = -k_B \sum_i p_i \ln p_i$$

where p_i is the probability of being in the ith microstate, given that you are in a known macrostate, and k_B is the Boltzmann constant:

$$k_B = 1.38 \times 10^{-23} \text{ J K}^{-1}$$

Notice that this changes the units of entropy to J/K.
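The same computation in thermodynamic units just swaps the base-2 log for the natural log and multiplies by k_B. A minimal sketch (again my addition):

```python
import math

k_B = 1.38e-23  # Boltzmann constant, J/K

def gibbs_entropy(probabilities):
    """S = -k_B sum_i p_i ln(p_i), in J/K."""
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Two equally likely microstates: S = k_B ln 2, about 9.6e-24 J/K.
print(gibbs_entropy([0.5, 0.5]))
```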

Entropy in Thermodynamics

Consider the atmosphere: it is mostly oxygen and nitrogen. Have you ever walked into a room and been unable to breathe because all of the oxygen is on the other side of the room?

[Figure: panels (a), (b), (c) showing one, two, and three molecules distributed between the two halves of a container.]

As more oxygen molecules are added, the probability that there is oxygen on both sides increases.

Macrostates and Microstates

A macrostate is something we can observe on a large scale. A microstate is a state too small or complex to easily observe; each microstate represents one way a macrostate can be achieved. We want to consider the number of microstates for each macrostate.

Consider 100 molecules in a container, half oxygen and half nitrogen. At any given moment, the probability of one particular molecule being in the left half of the container as a result of random motion is 1/2. With two molecules, the probability of both being in the left half is (1/2)^2, or 1 in 4. With three molecules, the probability of all of them being in the left half at the same moment is (1/2)^3, or 1 in 8. For 100 independently moving molecules, the probability that the 50 oxygen molecules will all be found in the left half at any moment is (1/2)^50. For systems with on the order of Avogadro's number of molecules, the ratios of probabilities between macrostates become astronomical.

For the three-molecule picture, the macrostates could be:

all oxygen on the left (1 microstate)
all oxygen on the right (1 microstate)
oxygen mixed throughout the room (6 microstates)

Suppose all of the microstates are equally likely. If so, even with only 3 molecules, we would expect to find the oxygen distributed throughout the room (75% probability).

$$S = -k_B \sum_i p_i \ln p_i$$

Entropy of the "all on the left" macrostate:

$$S_L = k_B \ln 1 = 0$$

Entropy of the mixed macrostate:

$$S_M = k_B \ln 6 \approx 1.8 \, k_B$$

The entropy of the mixed macrostate is higher!

[Figure 22.15, Serway & Jewett: possible distributions of identical molecules in a container. One molecule has a 1-in-2 chance of being on the left side; two molecules have a 1-in-4 chance of both being on the left at the same time; three molecules have a 1-in-8 chance of all being on the left at the same time.]

Boltzmann's formula

The entropy of a macrostate can be written:

$$S = k_B \ln W$$

where W is the number of microstates for that macrostate, assuming all microstates are equally likely. W is the number of ways the macrostate can occur.
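To make the counting concrete, here is a brute-force Python sketch (my illustration, not from the slides) of the three-molecule example: enumerate all 2^3 left/right arrangements, group them into macrostates by how many molecules sit on the left, and apply S = k_B ln W:

```python
import math
from collections import Counter
from itertools import product

k_B = 1.38e-23  # Boltzmann constant, J/K

# Each microstate assigns each of 3 molecules to the Left or Right half.
microstates = list(product("LR", repeat=3))      # 8 microstates in all

# Macrostate = number of molecules on the left (the observable).
W = Counter(state.count("L") for state in microstates)

for n_left, count in sorted(W.items()):
    S = k_B * math.log(count)                    # Boltzmann's formula
    print(f"{n_left} on left: W = {count}, S = {S:.2e} J/K")

# The "mixed" macrostates (1 or 2 on the left) cover 6 of the 8
# microstates: 75% probability, and S = k_B ln 6, as on the slide.
```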

Entropy and disorder

A macrostate that has very many microstates can be thought of as a disordered state.

If all the oxygen is on the left of the room and all the nitrogen on the right, the room is organized, or ordered. But this is very unlikely! Even if a room starts out ordered, you would expect it to become disordered right away, because a disordered room is more probable.

Entropy example

Imagine throwing two dice. The score of the dice will be the sum of the two numbers on the dice.

What score should you bet on seeing? What is the number of microstates associated with that score? What is the number of microstates associated with a score of 2?

The macroscopic world is like a game in which there are 10^23 dice, but you only ever see the (approximate) score, not the result on each die.

1 Hewitt, Conceptual Physics, Problem 8, page 331.
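A quick enumeration settles the bet. This Python sketch (my addition) lists all 36 equally likely (die 1, die 2) microstates and counts how many produce each score:

```python
from collections import Counter
from itertools import product

# All 36 equally likely microstates for two dice.
scores = Counter(a + b for a, b in product(range(1, 7), repeat=2))

for score in sorted(scores):
    print(f"score {score:2d}: {scores[score]} microstate(s)")

# Score 7 has the most microstates (6 of 36), so it is the best bet.
# Score 2 can happen only one way: (1, 1).
```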

Second Law of Thermodynamics

This gives us another perspective on entropy and the second law. Heat must flow from hotter to colder because there are more ways to distribute energy evenly than to keep it in one place only.

$$\frac{\Delta S}{\Delta t} \geq 0$$

As time goes by, things tend to disorder.

Second Law of Thermodynamics

[Figure: free expansion of an ideal gas in an insulated container. Initial state i: stopcock closed, gas confined to the left half, vacuum on the right (ordered, less probable, low entropy). When the stopcock is opened, the gas expands irreversibly to fill the container, reaching final state f (disordered, more probable, high entropy). Because the entropy of an isolated system always increases, this one-way behavior is sometimes called the arrow of time.]

Example (Microscopic Entropy Analysis)

What is the entropy change during an adiabatic free expansion of an isolated gas of n moles going from volume V_i to volume V_f? (Note: such a process is not reversible.)

Two approaches: microscopic and macroscopic (seen earlier). Now for the microscopic one.

We need some way to count microstates of the gas. Let's break up our volume, V, into a grid and say there are m different locations a gas particle could be in. Let N be the number of particles, with m much larger than N.

1 Serway & Jewett, pg 671.

Example

Let V_m be the volume of one grid unit, so m = V/V_m.

How many ways can we place the particles? The first can be placed in any of w = V/V_m places. For each of those positions, the second can be placed in w locations, giving w^2 ways in total. For N particles, there are

$$w^N = \left(\frac{V}{V_m}\right)^N$$

ways the particles can be arranged in the volume V. This gives

$$S = N k_B \ln\left(\frac{V}{V_m}\right)$$

Example

There are reasons to quibble over how we counted the microstates, but let us think about what this gives us for the change in entropy. We have a final volume V_f and an initial volume V_i:

$$\Delta S = N k_B \ln\left(\frac{V_f}{V_m}\right) - N k_B \ln\left(\frac{V_i}{V_m}\right)$$

So,

$$\Delta S = N k_B \ln\left(\frac{V_f}{V_i}\right)$$

The same as for the macroscopic analysis, since N k_B = n R!
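The agreement rests on the identity N k_B = n R for N = n N_A particles. A short numerical sanity check (my addition; n and the volumes are example values):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = k_B * N_A        # gas constant, J/(mol K), ~8.314

n = 1.0              # moles (example value)
N = n * N_A          # number of particles
V_i, V_f = 1.0, 2.0  # initial and final volumes (any consistent units)

dS_micro = N * k_B * math.log(V_f / V_i)  # microscopic counting result
dS_macro = n * R * math.log(V_f / V_i)    # macroscopic result
print(dS_micro, dS_macro)                 # both ~5.76 J/K
```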

Summary

entropy (microscopic perspective)

Test Monday, May 14.

Homework (Serway & Jewett):
new: Ch 22, onward from page 679. Probs: 39, 49
will set tomorrow: Ch 22, OQs: 1, 3, 7; CQs: 1; Probs: 1, 3, 9, 15, 20, 23, 29, 37, 67, 73, 81