Thermodynamics
Second Law: Entropy

Lana Sheridan, De Anza College, May 9, 2018
Last time: entropy (macroscopic perspective)

Overview: entropy (microscopic perspective)
Reminder of Example from Last Lecture

What is the entropy change during an adiabatic free expansion of an isolated gas of n moles going from volume V_i to volume V_f? (Note: such a process is not reversible.)

In an adiabatic free expansion Q = 0, and since the gas is isolated, W = 0, and therefore T_f = T_i.

ΔS = n C_V ln(T_f / T_i) + n R ln(V_f / V_i)

Using ln 1 = 0, this becomes

ΔS = n R ln(V_f / V_i)
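As a quick numerical check, the result can be evaluated in a few lines of Python (a minimal sketch; the function name and sample numbers are illustrative, not from the lecture):

```python
import math

R = 8.314  # ideal gas constant, J/(mol K)

def free_expansion_dS(n, V_i, V_f):
    """Entropy change dS = nR ln(V_f/V_i) for an adiabatic free expansion.

    Since T_f = T_i, the nC_V ln(T_f/T_i) term vanishes and only the
    volume term survives.
    """
    return n * R * math.log(V_f / V_i)

# For example, 1 mol of gas doubling its volume:
print(free_expansion_dS(1.0, 1.0, 2.0))  # R ln 2, about 5.76 J/K
```

Note that ΔS > 0 even though Q = 0: for an irreversible process the entropy change is not given by integrating dQ/T along the actual path.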
Clausius Equality

Clausius found that the entropy change around any reversible cycle (closed path) is zero. This is called the Clausius Equality:

ΔS = ∮ dQ_r / T = 0

Moreover, since real engines can never quite conform to this ideal, for all engines the Clausius Inequality holds:

∮ dQ / T ≤ 0,  so  ΔS_sys+env ≥ 0

(If the system entropy decreases, the environment's entropy must increase.)
Entropy in an isolated system

This gives us another way to state the second law:

2nd Law: In an isolated system, entropy does not decrease.

In a non-isolated system (either closed or open) entropy can decrease, but only by increasing the entropy of the environment at least as much.
Entropy Microscopically

Entropy is a measure of disorder in a system. It can also be used as a measure of information content.

Intriguingly, entropy was introduced separately in physics and then later in information theory. The fact that these two measures were the same was observed by John von Neumann.
Entropy

According to Claude Shannon, who developed Shannon entropy, or information entropy:

"I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. [...] Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.'"
So what is entropy?

Consider the Yo app (valued at $5-10 million in 2014). You can only use it to send the message "yo". If you get a message on the app, you can guess what it will say.

The message has no information content; it is perfectly ordered, and there is no uncertainty. The message is a physical system that can only be in one state.
So what is entropy?

But what if the message is only sent with 50% probability? If you get the message, let's meet for drinks; if not, I'm still in a meeting and can't join you.

Now you learn something when you get the message. The information content is 1 bit.
So what is entropy?

(Shannon) Entropy of a message m:

H(m) = −∑_i p_i log p_i

where p_i is the probability of receiving message i (and the log is base 2).

For the fully-determined message "yo":

H(m) = −1 log 1 − 0 log 0 = 0 bits

For the "yo"-or-no message:

H(m) = −(1/2) log(1/2) − (1/2) log(1/2) = log 2 = 1 bit
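These two results can be checked with a short Python sketch (the function name is mine, purely illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i), in bits; the 0 log 0 term is taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # 'yo' every time: 0.0 bits
print(shannon_entropy([0.5, 0.5]))   # the 'yo'-or-no message: 1.0 bit
print(shannon_entropy([0.25] * 4))   # four equally likely messages: 2.0 bits
```

Each doubling of the number of equally likely messages adds one bit of entropy.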
Entropy in Thermodynamics

In physics, we express entropy a little differently:

S = −k_B ∑_i p_i ln p_i

where p_i is the probability of being in the i-th microstate, given you are in a known macrostate. k_B is called the Boltzmann constant:

k_B = 1.38 × 10⁻²³ J K⁻¹

Notice that this changes the units of entropy to J/K.
Entropy in Thermodynamics

Consider the atmosphere: it is mostly oxygen and nitrogen. Have you ever walked into a room and been unable to breathe because all of the oxygen is on the other side of the room?

[Figure: containers with (a) one, (b) two, and (c) three molecules.] As more oxygen molecules are added, the probability that there is oxygen on both sides increases.
Macrostates and Microstates

A macrostate is something we can observe on a large scale. The macrostates here could be:
- all oxygen on the left
- all oxygen on the right
- oxygen mixed throughout the room

[Background, Serway & Jewett Fig. 22.15: possible distributions of identical molecules in a container. (a) One molecule has a 1-in-2 chance of being on the left side. (b) Two molecules have a 1-in-4 chance of both being on the left side at the same time. (c) Three molecules have a 1-in-8 chance of all being on the left at the same time.]
Macrostates and Microstates

A microstate is a state too small / complex to easily observe, but represents one way a macrostate can be achieved. We want to consider the number of microstates for each macrostate.

[Background, Serway & Jewett: consider 100 molecules in a container, half oxygen and half nitrogen. At any given moment, the probability of one molecule being in the left half is 1/2; of two molecules both being on the left, (1/2)², or 1 in 4; of three, (1/2)³, or 1 in 8. For 100 independently moving molecules, the probability that the 50 oxygen molecules are all in the left half at any moment is (1/2)⁵⁰, and likewise (1/2)⁵⁰ for the 50 nitrogen molecules all being in the right half.]

The macrostates here could be:
- all oxygen on the left — 1 microstate
- all oxygen on the right — 1 microstate
- oxygen mixed throughout the room — 6 microstates
Suppose all of the microstates are equally likely. If so, even with only 3 molecules, we would expect to find the oxygen distributed throughout the room (6 of the 8 microstates, a 75% probability).

S = −k_B ∑_i p_i ln p_i

Entropy of the "all on the left" macrostate: S_L = k_B ln 1 = 0

Entropy of the mixed macrostate: S_M = k_B ln 6 ≈ 1.8 k_B

The entropy of the mixed macrostate is higher!
Boltzmann's formula

The entropy of a macrostate can be written:

S = k_B ln W

where W is the number of microstates for that macrostate, assuming all microstates are equally likely. W is the number of ways the macrostate can occur.
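The 3-molecule counting can be verified by brute force (an illustrative Python sketch; the variable names are mine):

```python
import math
from itertools import product

k_B = 1.38e-23  # Boltzmann constant, J/K

# Enumerate all microstates of 3 molecules, each on the left ('L')
# or right ('R') half of the room: 2^3 = 8 microstates in total.
microstates = list(product("LR", repeat=3))

W_left  = sum(all(s == "L" for s in m) for m in microstates)  # all on the left
W_right = sum(all(s == "R" for s in m) for m in microstates)  # all on the right
W_mixed = len(microstates) - W_left - W_right                 # everything else

def S(W):
    """Boltzmann entropy S = k_B ln W of a macrostate with W microstates."""
    return k_B * math.log(W)

print(W_left, W_right, W_mixed)  # 1 1 6
print(S(W_left) / k_B)           # 0.0
print(S(W_mixed) / k_B)          # ln 6, about 1.79
```

With all 8 microstates equally likely, this reproduces S_L = 0 and S_M = k_B ln 6 from the previous slide.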
Entropy and disorder

A macrostate that has very many microstates can be thought of as a disordered state.

If all the oxygen is on the left of the room and all the nitrogen on the right, the room is organized, or ordered. But this is very unlikely! Even if a room starts out ordered, you would expect it to become disordered right away, because a disordered room is more probable.
Entropy example¹

Imagine throwing two dice. The score of the dice will be the sum of the two numbers on the dice.

What score should you bet on seeing? What is the number of microstates associated with that score? What is the number of microstates associated with a score of 2?

The macroscopic world is like a game in which there are 10²³ dice, but you only ever see the (approximate) score, not the result on each die.

¹ Hewitt, Conceptual Physics, Problem 8, page 331.
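The dice microstates are easy to enumerate, which answers both questions above (a short illustrative Python sketch):

```python
from collections import Counter
from itertools import product

# Each microstate is an ordered pair of die faces (a, b);
# the macrostate is the score a + b.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

best = max(counts, key=counts.get)
print(best, counts[best])  # 7 6: bet on 7, it has six microstates
print(counts[2])           # 1: a score of 2 has only one microstate
```

A score of 7 is six times as probable as a score of 2, purely because it can occur in more ways.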
Second Law of Thermodynamics

This gives us another perspective on entropy and the second law. Heat must flow from hotter to colder because there are more ways to distribute energy evenly than to keep it in one place only.

ΔS/Δt ≥ 0

As time goes by, things tend to disorder.
Second Law of Thermodynamics

[Figure, Halliday & Resnick (Ch. 20-3, Change in Entropy): the free expansion of an ideal gas. (a) Initial state i: the gas is confined to the left half of an insulated container by a closed stopcock, with vacuum on the right. (b) Opening the stopcock is an irreversible process leading to the final state f. Entropy does not obey a conservation law: for an irreversible process in a closed system it always increases, from an ordered, less probable, low-entropy state to a disordered, more probable, high-entropy one. This one-way character is why entropy is sometimes called the arrow of time.]
Example (Microscopic Entropy Analysis)¹

What is the entropy change during an adiabatic free expansion of an isolated gas of n moles going from volume V_i to volume V_f? (Note: such a process is not reversible.)

Two approaches: microscopic and macroscopic (seen earlier). Now for the microscopic one.

We need some way to count microstates of the gas. Let's break up our volume, V, into a grid and say there are m different locations a gas particle could be in. Let N be the number of particles, with m ≫ N.

¹ Serway & Jewett, pg 671.
Example

Let V_m be the volume of one grid unit, so m = V / V_m. How many ways can we place the particles? The first can be placed in any of w = V / V_m places. For each of those positions, the second can also be placed in w locations, giving w² ways in total. For N particles, there are

w^N = (V / V_m)^N

ways the particles can be arranged in the volume V. This gives

S = k_B ln(w^N) = N k_B ln(V / V_m)
Example

There are reasons to quibble over how we counted the microstates, but let us think about what this gives us for the change in entropy. We have a final volume V_f and an initial volume V_i:

ΔS = N k_B ln(V_f / V_m) − N k_B ln(V_i / V_m)

So,

ΔS = N k_B ln(V_f / V_i)

Since N k_B = n N_A k_B = nR, this is the same as for the macroscopic analysis!
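The agreement rests on N k_B = nR, which a minimal Python check makes concrete (the sample values are arbitrary):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R   = k_B * N_A      # gas constant, J/(mol K), about 8.314

# Hypothetical free expansion: n mol of gas doubling its volume.
n = 2.0
V_i, V_f = 1.0, 2.0  # only the ratio V_f/V_i matters

dS_macro = n * R * math.log(V_f / V_i)          # nR ln(V_f/V_i)
dS_micro = n * N_A * k_B * math.log(V_f / V_i)  # N k_B ln(V_f/V_i), N = n N_A

print(dS_macro, dS_micro)  # the two agree, about 11.5 J/K
```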
Summary

entropy (microscopic perspective)

Test Monday, May 14.

Homework
Serway & Jewett:
- new: Ch 22, onward from page 679. Probs: 39, 49
- will set tomorrow: Ch 22, OQs: 1, 3, 7; CQs: 1; Probs: 1, 3, 9, 15, 20, 23, 29, 37, 67, 73, 81