General Physics I, Lecture 27: Carnot Cycle, the 2nd Law, Entropy and Information. Prof. WAN, Xin. xinwan@zju.edu.cn, http://zimp.zju.edu.cn/~xinwan/
Carnot's Engine
Efficiency of a Carnot Engine
The cycle consists of isothermal processes and adiabatic processes. All Carnot engines operating between the same two temperatures have the same efficiency.
Carnot Cycle
The efficiency of the Carnot cycle depends only on the temperatures T_c and T_h: e_Carnot = 1 - T_c/T_h. The efficiency approaches 1 when T_c/T_h -> 0, i.e., T_c -> 0. T is the absolute temperature scale, in units of K. The Carnot cycle is a reversible cycle: quasistatic, with no dissipation.
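The Carnot efficiency e = 1 - T_c/T_h can be evaluated numerically. A minimal sketch; the function name and the reservoir temperatures are my own illustration:

```python
# Carnot efficiency e = 1 - T_c/T_h (absolute temperatures in kelvin).

def carnot_efficiency(t_cold, t_hot):
    """Efficiency of a reversible engine between two reservoirs."""
    if not (0 < t_cold < t_hot):
        raise ValueError("require 0 < T_c < T_h (absolute temperatures)")
    return 1.0 - t_cold / t_hot

# Illustrative numbers: 300 K cold reservoir, 600 K hot reservoir.
print(carnot_efficiency(300.0, 600.0))  # 0.5

# The efficiency approaches 1 only as T_c/T_h -> 0.
print(carnot_efficiency(3.0, 600.0))    # 0.995
```

Note that the efficiency depends only on the temperature ratio, as the slide states, not on the working substance.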
Reverse Carnot Cycle
Carnot's Theorem
"No real heat engine operating between two energy reservoirs can be more efficient than my engine operating between the same two reservoirs." -- Sadi Carnot
What if not? Suppose an engine with efficiency e > e_Carnot. Couple it to a Carnot engine run in reverse between the same two reservoirs.
Match Work First
If we suppose W + W' = 0, i.e., the engine's output work drives the reversed Carnot cycle, then e > e_Carnot implies Q_h = W/e < W/e_Carnot = Q'_h: the combined device pumps net heat from the cold reservoir to the hot one with zero net work input.
Clausius Statement
It is impossible to construct a cyclical machine whose sole effect is the continuous transfer of energy from one object to another object at a higher temperature without the input of energy by work. No perfect refrigerator!
Match Heat at the Cold End
If we suppose Q_c + Q'_c = 0, i.e., the reversed Carnot cycle removes from the cold reservoir exactly the heat the engine dumps into it, then e > e_Carnot implies W = e Q_c/(1-e) > e_Carnot Q_c/(1-e_Carnot) = W': the combined device absorbs heat from the hot reservoir alone and converts it entirely into work.
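The bookkeeping in this contradiction argument can be checked with numbers. A sketch assuming a hypothetical engine with e = 0.6 between reservoirs where e_Carnot = 0.5; all quantities and variable names are my own illustration:

```python
# "Match heat at the cold end": a supposed engine with e > e_Carnot, plus a
# reversed Carnot cycle sized so the cold reservoir gains nothing.
T_h, T_c = 600.0, 300.0
e_carnot = 1 - T_c / T_h            # 0.5
e = 0.6                             # supposed "better" engine (for contradiction)

Q_h = 100.0                         # heat the engine takes from the hot reservoir
W = e * Q_h                         # work it delivers
Q_c = Q_h - W                       # heat it dumps into the cold reservoir

Qp_c = Q_c                          # reversed Carnot removes this from the cold end
Qp_h = Qp_c * T_h / T_c             # heat it delivers to the hot end (Carnot ratio)
Wp = Qp_h - Qp_c                    # work the refrigerator consumes

net_heat_to_cold = Q_c - Qp_c       # zero by construction
net_work_out = W - Wp               # leftover work
net_heat_from_hot = Q_h - Qp_h      # heat drawn from the hot reservoir only

print(net_heat_to_cold)   # 0.0
print(net_work_out)       # 20.0
print(net_heat_from_hot)  # 20.0: all converted to work, violating Kelvin
```

The combined device produces 20 units of work from 20 units of heat taken solely from the hot reservoir, which is exactly the "perfect heat engine" the Kelvin statement forbids.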
Kelvin Statement
It is impossible to construct a heat engine that, operating in a cycle, produces no effect other than the absorption of energy from a reservoir and the performance of an equal amount of work. No perfect heat engine!
Laws in Plain Language
The 1st and 2nd laws of thermodynamics can be summarized as follows: The first law specifies that we cannot get more energy out of a cyclic process by work than the amount of energy we put in. The second law states that we cannot break even, because we must put more energy in, at the higher temperature, than the net amount of energy we get out by work.
An Equality
Now putting in the proper signs for the Carnot cycle (Q_h positive, Q_c negative): Q_h/T_h + Q_c/T_c = 0, which suggests ∮ dQ/T = 0.
A Sum of Carnot Cycles
In the P-V plane, any reversible cycle can be approximated by a sum of Carnot cycles bounded by adiabats, hence Σ_i (Q_h,i/T_h,i + Q_c,i/T_c,i) = 0, and in the limit of infinitesimal cycles, ∮_C dQ/T = 0.
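The equality Q_h/T_h + Q_c/T_c = 0 can be verified for a single ideal-gas Carnot cycle, using the fact that the adiabats satisfy T V^(γ-1) = const. A sketch with illustrative values (n R set to 1):

```python
import math

# Check Q_h/T_h + Q_c/T_c = 0 for an ideal-gas Carnot cycle.
nR = 1.0
gamma = 5.0 / 3.0                  # monatomic ideal gas
T_h, T_c = 500.0, 250.0
V1, V2 = 1.0, 3.0                  # isothermal expansion at T_h

# Adiabats T V^(gamma-1) = const fix the cold-isotherm volumes:
r = (T_h / T_c) ** (1.0 / (gamma - 1.0))
V3, V4 = V2 * r, V1 * r

Q_h = nR * T_h * math.log(V2 / V1)   # heat absorbed on the hot isotherm
Q_c = nR * T_c * math.log(V4 / V3)   # heat on the cold isotherm (negative)

total = Q_h / T_h + Q_c / T_c
print(abs(total) < 1e-9)  # True: the two contributions cancel exactly
```

The adiabatic relation forces V4/V3 = V1/V2, so the logarithms cancel; this is the mechanism behind ∮ dQ/T = 0.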
Clausius Definition of Entropy
Entropy is a state function: the change in entropy during a process depends only on the end points and is independent of the actual path followed. Split a reversible cycle C into paths C1 (1 -> 2) and C2 (2 -> 1); then ∮_C dQ/T = ∫_{C1, 1->2} dQ/T + ∫_{C2, 2->1} dQ/T = 0, so ∫_{C1, 1->2} dQ/T = ∫_{C2, 1->2} dQ/T. Define dS = dQ/T along a reversible path; then S_2 - S_1 = ∫_1^2 dQ/T.
Return to Inexact Differentials
Assume dg = dx + (x/y) dy. Along the path (1,1) -> (2,1) -> (2,2), ∫ dg = 1 + 2 ln 2; along (1,1) -> (1,2) -> (2,2), ∫ dg = ln 2 + 1. The two results differ, so dg is an inexact differential. Note: df = dg/x = dx/x + dy/y is an exact differential, with f(x, y) = ln x + ln y and ∮ df = 0; here 1/x is an integrating factor, just as 1/T is the integrating factor that turns the inexact dQ into the exact dS.
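The path dependence of dg and the path independence of df = dg/x can be checked by numerical line integration. A sketch; the `integrate` helper is my own:

```python
import math

def integrate(path, M, N, steps=20000):
    """Midpoint-rule line integral of M dx + N dy along straight segments."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dx, dy = (x1 - x0) / steps, (y1 - y0) / steps
        for k in range(steps):
            t = (k + 0.5) / steps
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            total += M(x, y) * dx + N(x, y) * dy
    return total

Mg, Ng = (lambda x, y: 1.0), (lambda x, y: x / y)        # dg = dx + (x/y) dy
Mf, Nf = (lambda x, y: 1.0 / x), (lambda x, y: 1.0 / y)  # df = dg / x

path_a = [(1, 1), (2, 1), (2, 2)]   # right, then up
path_b = [(1, 1), (1, 2), (2, 2)]   # up, then right

print(integrate(path_a, Mg, Ng))  # ~2.386 = 1 + 2 ln 2
print(integrate(path_b, Mg, Ng))  # ~1.693 = 1 + ln 2  (path-dependent!)
print(integrate(path_a, Mf, Nf))  # ~1.386 = 2 ln 2
print(integrate(path_b, Mf, Nf))  # ~1.386 = 2 ln 2    (path-independent)
```

Dividing by x makes the two path integrals agree, which is exactly what dividing dQ by T does for entropy.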
The Second Law in Terms of Entropy
The total entropy of an isolated system that undergoes a change can never decrease. If the process is irreversible, then the total entropy of an isolated system always increases. In a reversible process, the total entropy of an isolated system remains constant. The change in entropy of the Universe must be greater than zero for an irreversible process and equal to zero for a reversible process: ΔS_Universe ≥ 0.
Example 1: Clausius Statement
Heat Q flows from the hot reservoir to the cold one: ΔS_h = -Q/T_h, ΔS_c = +Q/T_c, so ΔS = ΔS_h + ΔS_c = Q/T_c - Q/T_h > 0. Irreversible! The reverse process, a perfect refrigerator, would decrease the total entropy and is forbidden.
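A quick numeric check of the sign; the function name and the numbers are illustrative:

```python
# Total entropy change when heat Q leaves a reservoir at T_h and
# enters a reservoir at T_c.
def delta_S(Q, T_h, T_c):
    return -Q / T_h + Q / T_c

print(delta_S(100.0, 400.0, 300.0))   # +0.0833... J/K > 0: irreversible, allowed
print(delta_S(-100.0, 400.0, 300.0))  # negative: reverse flow is forbidden
```

Reversing the sign of Q models the perfect refrigerator, and the total entropy change flips negative, which the second law rules out.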
Example 2: Kelvin Statement
Work W is dissipated entirely into heat absorbed by a reservoir at temperature T: ΔS = W/T > 0. Irreversible! The reverse process, a perfect heat engine, would decrease the total entropy and is forbidden.
Example 3: Free Expansion
ΔU = 0, Q = 0, W = 0, yet ΔS > 0. We can only calculate ΔS with a reversible process! In this case, we replace the free expansion by the isothermal process with the same initial and final states: ΔS = ∫_i^f dQ/T = ∫_i^f P dV/T = ∫_i^f nR dV/V = nR ln(V_f/V_i) > 0. Irreversible!
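The replacement isothermal integral can be evaluated numerically and compared with nR ln(V_f/V_i); note that T cancels, since dS = P dV/T = nR dV/V. A sketch with illustrative values:

```python
import math

# Entropy change of a free expansion, computed along the reversible
# isothermal replacement: dS = nR dV / V (the temperature cancels).
n, R = 1.0, 8.314
Vi, Vf = 1.0, 2.0

steps = 100000
dV = (Vf - Vi) / steps
S = sum(n * R / (Vi + (k + 0.5) * dV) * dV for k in range(steps))

exact = n * R * math.log(Vf / Vi)
print(S, exact)               # both ~5.763 J/K
print(abs(S - exact) < 1e-6)  # True
```

The numerical sum over the reversible path reproduces nR ln 2, confirming that entropy is fixed by the end points alone.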
Entropy: A Measure of Disorder
For the free expansion, ΔS = nR ln(V_f/V_i) = N k_B ln(V_f/V_i) = N k_B ln 2 for V_f = 2 V_i. We assume that each molecule occupies some microscopic volume V_m, so the number of ways to arrange N molecules is W_i = (V_i/V_m)^N initially and W_f = (V_f/V_m)^N finally. Then ΔS = k_B ln(W_f/W_i), suggesting S = k_B ln W (Boltzmann).
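The statistical count and the thermodynamic result agree. A sketch for one mole, working with logarithms since W itself is astronomically large; the constants are standard CODATA values:

```python
import math

# Free expansion of N molecules into double the volume: W = (V/V_m)^N,
# so ln(W_f/W_i) = N ln 2 and Delta S = k_B N ln 2.
k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number (N = 1 mole of molecules)

dS_stat = k_B * N_A * math.log(2.0)   # k_B ln(W_f/W_i)
dS_thermo = 8.314 * math.log(2.0)     # nR ln 2 for n = 1 mol

print(dS_stat)    # ~5.76 J/K
print(dS_thermo)  # ~5.76 J/K: the two pictures agree
```

This works because N k_B = nR for N = n N_A; the microscopic volume V_m drops out of the ratio W_f/W_i.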
Order versus Disorder
Isolated systems tend toward disorder, and entropy is a measure of this disorder. Ordered: all molecules on the left side. Disordered: molecules on both the left and the right.
Landauer's Principle & Verification
Computation needs to involve heat dissipation only when you do something irreversible with the information. Verified by the Lutz group (2012): the heat dissipated in erasing one bit approaches k_B T ln 2, with ln 2 ≈ 0.693.
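The Landauer bound k_B T ln 2 is easy to evaluate; a sketch (the function name is my own):

```python
import math

# Landauer bound: erasing one bit dissipates at least k_B * T * ln 2 of heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K

def landauer_limit(T):
    """Minimum heat (in joules) dissipated per erased bit at temperature T."""
    return k_B * T * math.log(2.0)

print(landauer_limit(300.0))  # ~2.9e-21 J per bit at room temperature
```

The bound is tiny compared with what present-day electronics dissipate per logic operation, but it is the floor that irreversible computation cannot go below.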
Information and Entropy
(1927) Bell Labs, Ralph Hartley: a measure for information in a message. Logarithm: 8 bits = 2^8 = 256 different numbers.
(1948) Bell Labs, Claude Shannon: A Mathematical Theory of Communication. Information depends on the probability of a particular message.
"You are not winning the lottery." A near-certain message, so there is no information in it.
"You are going to win the lottery." An improbable message; now that's something.
Information ~ -log(probability) ~ negative entropy: S_information = -Σ_i P_i log P_i.
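The formula can be tried on a few distributions; a sketch (the function name is my own; log base 2 gives the answer in bits):

```python
import math

# Shannon information entropy S = -sum_i P_i log2(P_i), in bits.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain message ("you are not winning") carries no information:
print(shannon_entropy([1.0]))       # 0.0 bits

# A fair coin flip carries exactly one bit:
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit

# A rare message is highly informative: -log2(p) is large for small p.
print(-math.log2(1e-8))             # ~26.6 bits for a 1-in-10^8 event
```

This is the quantitative version of the lottery examples above: the less probable the message, the more information it carries.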
For Those Who Are Interested
Reading (downloadable from my website):
Charles Bennett and Rolf Landauer, The fundamental physical limits of computation.
Antoine Bérut et al., Experimental verification of Landauer's principle linking information and thermodynamics, Nature (2012).
Seth Lloyd, Ultimate physical limits to computation, Nature (2000).
Dare to adventure where you have not been!