Lecture 27: Entropy and Information Prof. WAN, Xin


General Physics I Lecture 27: Entropy and Information. Prof. WAN, Xin. xinwan@zju.edu.cn http://zimp.zju.edu.cn/~xinwan/

1st & 2nd Laws of Thermodynamics. The 1st law specifies that we cannot get more energy out of a cyclic process by work than the amount of energy we put in: $\Delta U = Q - W$, and $\Delta U = 0$ over a cycle. The 2nd law states that we cannot break even, because we must put more energy in, at the higher temperature, than the net amount of energy we get out by work: $e = W/Q_h \le e_{\text{Carnot}} < 1$.

Carnot's Engine

Efficiency of a Carnot Engine. All Carnot engines operating between the same two temperatures have the same efficiency: $e_{\text{Carnot}} = 1 - T_c/T_h$.
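As a quick check on the numbers, the efficiency formula can be evaluated directly (a minimal sketch; the function name and the reservoir temperatures are my own choices):

```python
def carnot_efficiency(t_hot, t_cold):
    """Efficiency limit for any engine between reservoirs at t_hot and t_cold (kelvin)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("require 0 < t_cold < t_hot")
    return 1.0 - t_cold / t_hot

# Any engine running between 500 K and 300 K converts at most 40% of Q_h into work.
print(carnot_efficiency(500.0, 300.0))  # 0.4
```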

An Equality. Now putting in the proper signs ($Q_h$ positive, $Q_c$ negative), a Carnot cycle satisfies $\frac{Q_h}{T_h} + \frac{Q_c}{T_c} = 0$, i.e. $\oint \frac{dQ}{T} = 0$.

A Sum of Carnot Cycles. Any reversible process can be approximated by a sum of Carnot cycles (slicing the $P$–$V$ diagram with adiabats), hence $\sum_i \left( \frac{Q_{h,i}}{T_{h,i}} + \frac{Q_{c,i}}{T_{c,i}} \right) = 0$, which in the continuum limit becomes $\oint_C \frac{dQ}{T} = 0$.

Clausius Definition of Entropy. Entropy is a state function: the change in entropy during a process depends only on the end points and is independent of the actual path followed. For two reversible paths $C_1$ and $C_2$ between states 1 and 2, $\oint \frac{dQ}{T} = \int_{C_1,\,1\to2} \frac{dQ}{T} + \int_{C_2,\,2\to1} \frac{dQ}{T} = 0$, so $\int_{C_1,\,1\to2} \frac{dQ}{T} = \int_{C_2,\,1\to2} \frac{dQ}{T}$. This defines $dS = \frac{dQ_{\text{rev}}}{T}$, with $S_2 - S_1 = \int_1^2 dS = \int_1^2 \frac{dQ_{\text{rev}}}{T}$.

Return to the Inexact Differential. Assume $dG = dx + \frac{x}{y}\,dy$. Along the path $(1,1)\to(2,1)\to(2,2)$: $\int dG = \int_1^2 dx + \int_1^2 \frac{2}{y}\,dy = 1 + 2\ln 2$. Along the path $(1,1)\to(1,2)\to(2,2)$: $\int dG = \int_1^2 \frac{1}{y}\,dy + \int_1^2 dx = \ln 2 + 1$. The two results differ, so $dG$ is inexact. Note: $df = \frac{dG}{x} = \frac{dx}{x} + \frac{dy}{y}$ is an exact differential, with integrating factor $1/x$ and $f(x,y) = \ln x + \ln y + f_0$.
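The path dependence can be checked numerically. Reading the slide's expressions as $dG = dx + (x/y)\,dy$ (an assumption of mine, consistent with the values $1 + 2\ln 2$ and $\ln 2 + 1$ in the slide), a short sketch:

```python
# Midpoint-rule quadrature; the step count is an arbitrary choice.
def integrate(f, a, b, n=200_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

# dG = dx + (x/y) dy along the two rectangular paths from (1,1) to (2,2).
path1 = integrate(lambda x: 1.0, 1, 2) + integrate(lambda y: 2.0 / y, 1, 2)  # via (2,1)
path2 = integrate(lambda y: 1.0 / y, 1, 2) + integrate(lambda x: 1.0, 1, 2)  # via (1,2)
print(path1)  # ≈ 1 + 2 ln 2: path dependent, so dG is inexact
print(path2)  # ≈ 1 + ln 2

# Dividing by the integrating factor x gives df = dx/x + dy/y, which is exact:
f1 = integrate(lambda x: 1.0 / x, 1, 2) + integrate(lambda y: 1.0 / y, 1, 2)  # via (2,1)
f2 = integrate(lambda y: 1.0 / y, 1, 2) + integrate(lambda x: 1.0 / x, 1, 2)  # via (1,2)
print(abs(f1 - f2) < 1e-9)  # True: the same value along both paths
```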

Back to the First Law. Heat is path dependent: $dQ = dU + P\,dV$. Therefore $1/T$ is really the integrating factor for the differential form of heat. Now we can recast the 1st law of thermodynamics as $dU = T\,dS - P\,dV$. Entropy is also a state function, as is the internal energy or the volume.

Entropy of an Ideal Gas (1 mole). With $p(V,T) = RT/V$ and $U_{\text{mol}}(T) = C_V^{\text{mol}} T = \frac{f}{2}RT$, we have $dS = \frac{1}{T}\left(dU + p\,dV\right) = C_V^{\text{mol}}\,\frac{dT}{T} + R\,\frac{dV}{V}$. Integrating from $(T_0, V_0)$ to $(T, V)$: $S(T,V) = S_0 + C_V^{\text{mol}} \ln\frac{T}{T_0} + R \ln\frac{V}{V_0}$.
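The final formula is straightforward to evaluate (a minimal sketch; the function name and the monatomic default $f = 3$ are my own choices):

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def delta_s_ideal_gas(t0, v0, t, v, f=3):
    """Entropy change of 1 mol of ideal gas with f quadratic degrees of freedom."""
    c_v = (f / 2) * R
    return c_v * log(t / t0) + R * log(v / v0)

# Isothermal doubling of the volume: ΔS = R ln 2 ≈ 5.76 J/K
print(delta_s_ideal_gas(300.0, 1.0, 300.0, 2.0))
```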

Carnot's Theorem. No real heat engine operating between two energy reservoirs can be more efficient than Carnot's engine operating between the same two reservoirs: $e' = 1 - \frac{Q_c'}{Q_h'} \le 1 - \frac{T_c}{T_h}$. What does this mean? Still, for any engine in a cycle ($S$ is a state function!), $\oint dS = 0$.

Counting the Heat Baths. Over a cycle the working gas returns to its initial state, so $\Delta S_{\text{gas}} = \oint dS = 0$. The hot bath loses heat: $\Delta S_h = -\frac{Q_h'}{T_h} < 0$. The cold bath gains heat: $\Delta S_c = \frac{Q_c'}{T_c} > 0$. By Carnot's theorem, $\Delta S = \Delta S_h + \Delta S_{\text{gas}} + \Delta S_c = \frac{Q_c'}{T_c} - \frac{Q_h'}{T_h} \ge 0$. The total entropy of an isolated system that undergoes a change can never decrease.

Example 1: Clausius Statement. Heat $Q$ flows spontaneously from the hot bath to the cold bath: $\Delta S = \Delta S_h + \Delta S_c = -\frac{Q}{T_h} + \frac{Q}{T_c} > 0$. Irreversible! (Flow in the reverse direction would give $\Delta S < 0$, which is forbidden.)

Example 2: Kelvin Statement. Work converted entirely into heat dumped in a single bath gives $\Delta S = \frac{Q}{T} > 0$. Irreversible! (The reverse, heat converted entirely into work, would give $\Delta S < 0$, which is forbidden.)

Specific Heat. Note: Last time we defined molar specific heat. In physics, we also use specific heat per particle.

Example 3: Mixing Water. Two bodies of water, of masses $m_A$ and $m_B$ and temperatures $T_A < T_B$, are mixed. The final temperature is $T = \frac{m_A T_A + m_B T_B}{m_A + m_B}$.

Example 3: Mixing Water (continued). $\Delta S_A = m_A c \int_{T_A}^{T} \frac{dT'}{T'} = m_A c \ln\frac{T}{T_A} > 0$ and $\Delta S_B = m_B c \int_{T_B}^{T} \frac{dT'}{T'} = m_B c \ln\frac{T}{T_B} < 0$. For simplicity, assume $m_A = m_B = m/2$, so $T = (T_A + T_B)/2$: $\Delta S = \Delta S_A + \Delta S_B = \frac{mc}{2} \ln\frac{T^2}{T_A T_B} > 0$, since the arithmetic mean exceeds the geometric mean. Irreversible!
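A numerical sketch of this example (the function name, the masses and temperatures, and the value used for water's specific heat are assumptions of mine):

```python
from math import log

def mixing_entropy(m_a, t_a, m_b, t_b, c=4186.0):
    """Total ΔS (J/K) when two bodies of water are mixed.
    c is the specific heat per unit mass, J/(kg K); 4186 assumed for water."""
    t = (m_a * t_a + m_b * t_b) / (m_a + m_b)  # final temperature
    return m_a * c * log(t / t_a) + m_b * c * log(t / t_b)

# Equal masses at 280 K and 320 K: ΔS > 0, since (T_A + T_B)/2 > sqrt(T_A T_B).
print(mixing_entropy(0.5, 280.0, 0.5, 320.0))  # ≈ +9.3 J/K
```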

The Second Law in terms of Entropy. The total entropy of an isolated system that undergoes a change can never decrease. If the process is irreversible, then the total entropy of an isolated system always increases. In a reversible process, the total entropy of an isolated system remains constant. The change in entropy of the Universe must be greater than zero for an irreversible process and equal to zero for a reversible process: $\Delta S_{\text{Universe}} \ge 0$.

Order versus Disorder. Isolated systems tend toward disorder, and entropy is a measure of this disorder. Ordered: all molecules on the left side. Disordered: molecules on both the left and the right.

Example 4: Free Expansion. Here $Q = W = 0$, so $\Delta U = 0$ and $T$ is unchanged, yet $\Delta S \ne 0$. We can only calculate $\Delta S$ with a reversible process! In this case, we replace the free expansion by an isothermal process with the same initial and final states: $\Delta S = \int_i^f \frac{dQ}{T} = \frac{1}{T}\int_i^f P\,dV = \int_i^f \frac{nR\,dV}{V} = nR\ln\frac{V_f}{V_i} > 0$. Irreversible!

Entropy: A Measure of Disorder. Boltzmann suggested $S = k_B \ln W$, where $W$ counts microstates. We assume that each molecule occupies some microscopic volume $V_m$; then $W_i = (V_i/V_m)^N$ and $W_f = (V_f/V_m)^N$, so $\Delta S = k_B \ln\frac{W_f}{W_i} = N k_B \ln\frac{V_f}{V_i}$, which reproduces the free-expansion result ($N k_B \ln 2$ for a doubled volume).
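The Boltzmann and Clausius routes to $\Delta S$ can be compared numerically (a minimal sketch using standard CODATA constant values):

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Boltzmann counting for one mole whose volume doubles:
# W_f / W_i = (V_f / V_i)^N = 2^N, so ΔS = k_B ln(W_f / W_i) = N k_B ln 2.
ds_boltzmann = N_A * k_B * log(2)

# Clausius route (reversible isotherm with the same end states): ΔS = n R ln 2.
R = k_B * N_A
ds_clausius = 1.0 * R * log(2)

print(ds_boltzmann)                 # ≈ 5.76 J/K
print(ds_boltzmann == ds_clausius)  # True: the two definitions agree
```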

Information and Entropy. (1927) Bell Labs, Ralph Hartley: a measure for the information in a message. Logarithm: 8 bits = $2^8$ = 256 different numbers. (1940) Bell Labs, Claude Shannon: a mathematical theory of communication, based on the probability of a particular message. A likely message carries no information: "You are not winning the lottery." An unlikely one carries a lot: "Okay, you are going to win the lottery." Hence Information $\sim -\log(\text{probability}) \sim$ negative entropy: $S_{\text{information}} = -\sum_i P_i \log P_i$.
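Shannon's formula is a one-liner (the function name is my own; logarithms base 2 give the answer in bits):

```python
from math import log2

def shannon_entropy(probs):
    """S = -sum_i p_i log2(p_i), in bits; terms with p = 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))      # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.999, 0.001]))  # ≈ 0.011 bits: a near-certain message tells little
```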

It is already in use under that name... and besides, it will give you a great edge in debates, because nobody really knows what entropy is anyway. ---- John von Neumann

Maxwell's Demon. To determine whether to let a molecule through, the demon must acquire information about the state of the molecule. However well prepared, the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases the entropy of a system.

Information Reduces Entropy. For irreversible processes, $\Delta S > 0$. Erasure is irreversible, so $\Delta S_{\text{erasure}} > 0$. Conversely, learning and remembering, which costs $n$ bits of information storage, reduces entropy. Example: 1 bit. Knowing nothing: $p_1 = 1/2$, $p_2 = 1/2$, so $S = \log 2$. Knowing which way: $p_1 = 1$, $p_2 = 0$, so $S = 0$. Hence $\Delta S = -\log 2$ due to the 1 bit of information.

Landauer's Principle & Verification. Computation needs to involve heat dissipation only when you do something irreversible with the information. The Lutz group (2012) verified the bound experimentally: erasing one bit dissipates at least $k_B T \ln 2$ of heat ($\ln 2 \approx 0.693$).
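The bound itself is simple to evaluate (a minimal sketch; the function name and the 300 K room temperature are assumed):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature):
    """Minimum heat (J) dissipated when one bit is erased at the given temperature."""
    return k_B * temperature * log(2)

# At room temperature (300 K assumed), erasing one bit costs at least ~2.9e-21 J.
print(landauer_limit(300.0))
```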

For Those Who Are Interested. Reading (downloadable from my website): Charles Bennett and Rolf Landauer, "The fundamental physical limits of computation." Antoine Bérut et al., "Experimental verification of Landauer's principle linking information and thermodynamics," Nature (2012). Seth Lloyd, "Ultimate physical limits to computation," Nature (2000). Dare to adventure where you have not been!