General Physics I, Lecture 27: Entropy and Information. Prof. WAN, Xin (xinwan@zju.edu.cn), http://zimp.zju.edu.cn/~xinwan/
1st & 2nd Laws of Thermodynamics
The 1st law specifies that we cannot get more energy out of a cyclic process by work than the amount of energy we put in: ΔU = Q − W. The 2nd law states that we cannot even break even, because we must put more energy in, at the higher temperature, than the net amount of energy we get out by work:
e_Carnot = 1 − Q_c/Q_h = 1 − T_c/T_h < 1
Carnot's Engine
Efficiency of a Carnot Engine All Carnot engines operating between the same two temperatures have the same efficiency.
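As a quick numerical sketch of the efficiency formula e = 1 − T_c/T_h (this helper is illustrative, not part of the lecture):

```python
def carnot_efficiency(t_cold, t_hot):
    """Efficiency of a Carnot engine between reservoirs at t_cold < t_hot (in K)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("require 0 < t_cold < t_hot")
    return 1.0 - t_cold / t_hot

# All Carnot engines between the same two temperatures share this efficiency,
# regardless of the working substance:
print(carnot_efficiency(300.0, 600.0))  # 0.5
```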
An Equality
Now putting in the proper signs (heat absorbed by the gas positive, heat expelled negative):
Q_h/T_h + Q_c/T_c = 0
Equivalently, around the Carnot cycle, ∮ dQ/T = 0.
A Sum of Carnot Cycles
(P–V diagram: the cycle region tiled by adiabats and isotherms at T_h,i and T_c,i.) Any reversible process can be approximated by a sum of Carnot cycles, hence
Σ_i (Q_h,i/T_h,i + Q_c,i/T_c,i) = 0,
and in the continuum limit, for any reversible cycle C, ∮_C dQ/T = 0.
Clausius' Definition of Entropy
Define dS = dQ_reversible/T. Entropy is a state function: the change in entropy during a process depends only on the end points and is independent of the actual path followed. For a reversible cycle C made of path C1 (1 → 2) followed by path C2 (2 → 1),
∮_C dS = ∫_{C1, 1→2} dS + ∫_{C2, 2→1} dS = 0,
so ∫_{C1, 1→2} dS = ∫_{C2, 1→2} dS = S_2 − S_1.
Return to Inexact Differentials
Assume dg = dx + (x/y) dy. Along the path (1,1) → (2,1) → (2,2): ∫ dg = 1 + 2 ln 2. Along the path (1,1) → (1,2) → (2,2): ∫ dg = 1 + ln 2. The two paths disagree, so dg is not an exact differential. Note, however, that
df = dg/x = dx/x + dy/y
is an exact differential, with f(x, y) = ln x + ln y; here 1/x is an integrating factor.
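The two path integrals can be checked numerically with a midpoint-rule line integration (all names here are illustrative):

```python
import math

def path_integral(M, N, segments, n=20000):
    """Midpoint-rule line integral of M(x,y) dx + N(x,y) dy along straight segments."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(segments, segments[1:]):
        dx, dy = (x1 - x0) / n, (y1 - y0) / n
        for k in range(n):
            t = (k + 0.5) / n
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            total += M(x, y) * dx + N(x, y) * dy
    return total

path1 = [(1, 1), (2, 1), (2, 2)]   # right, then up
path2 = [(1, 1), (1, 2), (2, 2)]   # up, then right

# dg = dx + (x/y) dy is inexact: the two paths disagree.
g1 = path_integral(lambda x, y: 1.0, lambda x, y: x / y, path1)  # 1 + 2 ln 2
g2 = path_integral(lambda x, y: 1.0, lambda x, y: x / y, path2)  # 1 + ln 2

# df = dg/x = dx/x + dy/y is exact: both paths give ln 2 + ln 2 = ln 4.
f1 = path_integral(lambda x, y: 1.0 / x, lambda x, y: 1.0 / y, path1)
f2 = path_integral(lambda x, y: 1.0 / x, lambda x, y: 1.0 / y, path2)
print(g1, g2, f1, f2)
```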
Back to the First Law
Heat is path dependent: dQ = dU + P dV. But 1/T is precisely the integrating factor for this differential form of heat, so we can recast the 1st law of thermodynamics as
dU = T dS − P dV.
Entropy is a state function, as are the internal energy and the volume.
Entropy of an Ideal Gas (1 mole)
p(T, V) = RT/V,  U(T) = C_V^mol T,  C_V^mol = (f/2) R
dS = (dU + p dV)/T = C_V^mol dT/T + R dV/V
Integrating from (T_0, V_0) to (T, V):
S(T, V) = S_0 + C_V^mol ln(T/T_0) + R ln(V/V_0)
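A minimal sketch of this formula in Python (function name and the choice f = 3 for a monatomic gas are illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_s_ideal_gas(t0, v0, t, v, f=3):
    """Entropy change of 1 mol of ideal gas with f degrees of freedom:
    S - S0 = Cv ln(T/T0) + R ln(V/V0), with Cv = (f/2) R."""
    cv = (f / 2) * R
    return cv * math.log(t / t0) + R * math.log(v / v0)

# Isothermal doubling of the volume: only the R ln 2 term survives.
print(delta_s_ideal_gas(300, 1.0, 300, 2.0))  # R ln 2 ≈ 5.76 J/K
```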
Carnot's Theorem
No real heat engine operating between two energy reservoirs can be more efficient than Carnot's engine operating between the same two reservoirs. With the proper signs (Q'_h > 0 absorbed, Q'_c < 0 expelled):
e' = 1 + Q'_c/Q'_h ≤ 1 − T_c/T_h,  hence  Q'_h/T_h + Q'_c/T_c ≤ 0.
What does this mean? Still, for any engine in a cycle (S is a state function!), ∮ dS = 0.
Counting the Heat Baths In
Hot bath: ΔS_h = −Q'_h/T_h (Q'_h > 0, heat leaves the bath). Gas: ΔS_gas = ∮ dS = 0 after a cycle. Cold bath: ΔS_c = −Q'_c/T_c (Q'_c < 0, heat enters the bath). Therefore
ΔS = ΔS_h + ΔS_gas + ΔS_c = −Q'_h/T_h − Q'_c/T_c ≥ 0.
Counting the Heat Baths In
With ΔS_h = −Q'_h/T_h (Q'_h > 0), ΔS_gas = ∮ dS = 0 after a cycle, and ΔS_c = −Q'_c/T_c (Q'_c < 0), we conclude: the total entropy of an isolated system that undergoes a change can never decrease.
Example 1: Clausius Statement
Heat Q flows from the hot bath at T_h to the cold bath at T_c < T_h:
ΔS_h = −Q/T_h,  ΔS_c = +Q/T_c
ΔS = ΔS_h + ΔS_c = Q/T_c − Q/T_h > 0
Irreversible! The reverse flow, from cold to hot, would give ΔS < 0 and is therefore forbidden, which is the Clausius statement.
Example 2: Kelvin Statement
Work converted entirely into heat Q dumped into a single bath at temperature T:
ΔS = Q/T > 0
Irreversible! The reverse, converting heat from a single bath entirely into work, would give ΔS < 0, which is what the Kelvin statement forbids.
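The entropy bookkeeping for heat flowing between two baths (Example 1) reduces to two signed Q/T terms; a minimal sketch with an illustrative function name:

```python
def heat_flow_entropy(q, t_hot, t_cold):
    """Total entropy change when heat q flows from a reservoir at t_hot
    to a reservoir at t_cold: dS = q/t_cold - q/t_hot (temperatures in K)."""
    return q / t_cold - q / t_hot

# Hot -> cold: dS > 0, irreversible; the reverse would give dS < 0.
print(heat_flow_entropy(1000.0, 400.0, 300.0))
```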
Specific Heat Note: Last time we defined molar specific heat. In physics, we also use specific heat per particle.
Example 3: Mixing Water
Mix mass m_A of water at T_A with mass m_B at T_B, where T_A < T_B. Heat flows from B to A until a common temperature T is reached:
Q = c m_A (T − T_A) = c m_B (T_B − T)
⇒ T = (m_A T_A + m_B T_B) / (m_A + m_B)
Example 3: Mixing Water
ΔS_A = ∫_{T_A}^{T} c m_A dT'/T' = c m_A ln(T/T_A) > 0
ΔS_B = ∫_{T_B}^{T} c m_B dT'/T' = c m_B ln(T/T_B) < 0
For simplicity, assume m_A = m_B = m, so T = (T_A + T_B)/2. Then
ΔS = ΔS_A + ΔS_B = c m ln[T² / (T_A T_B)] > 0,
since the arithmetic mean exceeds the geometric mean. Irreversible!
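A short numerical check of the mixing example (function name is illustrative; c is the specific heat of water):

```python
import math

def mix_entropy(m_a, t_a, m_b, t_b, c=4186.0):
    """Final temperature and total entropy change when mass m_a of water at t_a
    mixes with mass m_b at t_b; c in J/(kg K), temperatures in K."""
    t = (m_a * t_a + m_b * t_b) / (m_a + m_b)
    ds = c * m_a * math.log(t / t_a) + c * m_b * math.log(t / t_b)
    return t, ds

# Equal masses: T is the arithmetic mean, and dS > 0 because the
# arithmetic mean exceeds the geometric mean of T_A and T_B.
t, ds = mix_entropy(1.0, 280.0, 1.0, 320.0)
print(t, ds)
```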
The Second Law in Terms of Entropy
The total entropy of an isolated system that undergoes a change can never decrease. If the process is irreversible, then the total entropy of an isolated system always increases. In a reversible process, the total entropy of an isolated system remains constant. The change in entropy of the Universe must be greater than zero for an irreversible process and equal to zero for a reversible process:
ΔS_Universe ≥ 0
Order versus Disorder
Isolated systems tend toward disorder; entropy is a measure of this disorder.
Ordered: all molecules on the left side. Disordered: molecules on both the left and the right.
Example 4: Free Expansion
ΔU = Q = W = 0, yet ΔS ≠ 0? We can only calculate ΔS along a reversible process! Here we replace the free expansion by an isothermal process with the same initial and final states:
ΔS = ∫ dQ/T = ∫_{V_i}^{V_f} P dV / T = ∫_{V_i}^{V_f} nR dV / V = nR ln(V_f/V_i) > 0
Irreversible!
Entropy: A Measure of Disorder
We assume that each molecule occupies some microscopic volume V_m. The number of microstates is then
W_i = (V_i/V_m)^N,  W_f = (V_f/V_m)^N,  W_f/W_i = (V_f/V_i)^N.
For V_f = 2 V_i,
ΔS = N k_B ln(V_f/V_i) = N k_B ln 2,
suggesting S = k_B ln W (Boltzmann).
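The statistical result ΔS = N k_B ln(V_f/V_i) can be compared directly with the thermodynamic result nR ln(V_f/V_i) from the free-expansion calculation, since N_A k_B = R (names below are illustrative):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def boltzmann_delta_s(n_particles, v_i, v_f):
    """dS = k_B ln(W_f/W_i) with W proportional to V^N,
    hence dS = N k_B ln(V_f/V_i)."""
    return n_particles * K_B * math.log(v_f / v_i)

# Doubling the volume of 1 mol of gas:
ds_statistical = boltzmann_delta_s(N_A, 1.0, 2.0)
ds_thermodynamic = 8.314 * math.log(2)  # nR ln 2 from the free-expansion slide
print(ds_statistical, ds_thermodynamic)  # agree, since N_A * k_B = R
```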
Information and Entropy
(1928) Bell Labs, Ralph Hartley: a measure for the information in a message. Logarithm: 8 bits = 2^8 = 256 different numbers.
(1948) Bell Labs, Claude Shannon: "A Mathematical Theory of Communication" — what matters is the probability of a particular message. A nearly certain message carries no information: you are not winning the lottery.
Information and Entropy (continued)
An improbable message, by contrast, is real news. Now that's something: okay, you are going to win the lottery.
Information and Entropy (continued)
Information ~ −log(probability) ~ negative entropy:
S_information = −Σ_i P_i log P_i
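A minimal sketch of the Shannon entropy formula in bits (the base-2 logarithm recovers Hartley's counting measure):

```python
import math

def shannon_entropy(probs):
    """S = -sum_i p_i log2(p_i), in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair choice among 8 equally likely messages carries log2(8) = 3 bits,
# matching Hartley's logarithmic measure:
print(shannon_entropy([1 / 8] * 8))  # 3.0
# A certain outcome carries no information at all:
print(shannon_entropy([1.0]))        # 0 bits: no surprise
```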
"It is already in use under that name, and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway."
—— John von Neumann (advising Shannon to call his measure "entropy")
Maxwell's Demon
To determine whether to let a molecule through, the demon must acquire information about the state of the molecule. However well prepared, the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases the entropy of a system.
Landauer's Principle & Verification
Computation needs to involve heat dissipation only when you do something irreversible with the information. Lutz group (2012):
Q ≥ k_B T ln 2 ≈ 0.693 k_B T per bit erased.
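Plugging in room temperature shows how tiny the Landauer limit is (function name is illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature):
    """Minimum heat dissipated when erasing one bit: Q >= k_B T ln 2 (T in K)."""
    return K_B * temperature * math.log(2)

print(landauer_limit(300.0))  # ≈ 2.87e-21 J per bit at room temperature
```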
For Those Who Are Interested
Reading (downloadable from my website):
Charles Bennett and Rolf Landauer, The fundamental physical limits of computation.
Antoine Bérut et al., Experimental verification of Landauer's principle linking information and thermodynamics, Nature (2012).
Seth Lloyd, Ultimate physical limits to computation, Nature (2000).
Dare to adventure where you have not been!