Information and Physics: Landauer Principle and Beyond
Ryoichi Kawai, Department of Physics, University of Alabama at Birmingham
Maxwell Demon (figure: Lerner, 1975)
Landauer Principle
Rolf Landauer (1927-1999)
"Computational steps which inevitably require a minimal energy dissipation are those which discard information." (Landauer, 1961)
"Any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information processing apparatus or its environment." (Bennett, 1982; Wikipedia)
The erasure of one bit of information is necessarily accompanied by a dissipation of at least kT ln 2 of heat.
Key words: information, erasure, dissipation
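As a quick sanity check on the scale of this bound, the snippet below evaluates kT ln 2 numerically; the choice of 300 K is mine, for illustration.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0           # room temperature, K (illustrative choice)

# Landauer bound: minimum heat dissipated per erased bit
q_min = k_B * T * math.log(2)
print(f"kT ln 2 at {T} K = {q_min:.3e} J")  # ~2.87e-21 J
```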
Confusion
"If one were to imagine starting with those degrees of freedom in a thermalised state, there would be a real reduction in thermodynamic entropy if they were then re-set to a known state. This can only be achieved under information-preserving microscopically deterministic dynamics if the uncertainty is somehow dumped somewhere else, i.e. if the entropy of the environment (or the non-information-bearing degrees of freedom) is increased by at least an equivalent amount, as required by the Second Law, by gaining an appropriate quantity of heat: specifically kT ln 2 of heat for every bit of randomness erased." (Wikipedia, Information Theory and Physics)
"Landauer's principle for information erasure is valid for a symmetric double-well potential, but not for an asymmetric one. The proof of W_erase ≥ kT ln 2 using statistical mechanics is valid only for the symmetric case." (Sagawa and Ueda, 2009)
Key word: Dissipation
Which one is correct?
W ≥ kT ln 2
Q ≥ kT ln 2
ΔS ≥ k ln 2
ΔS = k ΔH (H = information entropy)
ΔS_total ≥ k ln 2 (erasure of one bit)
ΔS_total ≥ k I (erasure of mutual information I)
Dissipation: Entropy production, ΔS_total = ΔS_sys + ΔS_env; the increase of total entropy in the universe.
Information: ?
Erasure: ?
What is Information Erasure? (figure: a "Top Secret" document)
Bit Eraser
Which one erased the input information? How much energy is dissipated?
Information is probabilistic (figure: bits as transparent containers)
Information entropy of one bit: H = -p ln p - (1-p) ln(1-p)
H = 0 (p = 0), H = 0 (p = 1), H = ln 2 (p = 1/2)
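The binary entropy formula on this slide is easy to verify numerically; the sketch below (the function name binary_entropy is mine) reproduces the three values H = 0, 0, ln 2.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H = -p ln p - (1-p) ln(1-p), in nats."""
    if p in (0.0, 1.0):
        return 0.0  # limit of x ln x as x -> 0 is 0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

for p in (0.0, 1.0, 0.5):
    print(f"p = {p}: H = {binary_entropy(p):.4f} nats")
# p = 0 or 1 gives H = 0; p = 1/2 gives H = ln 2 ~ 0.6931
```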
Dissipation: Entropy production; the increase of total entropy in the universe.
Information: Uncertainty of an ensemble of bits.
Erasure: ?
Information Erasure
A page of information (E = mc², iħ ∂ψ/∂t = Hψ, ρ̇ = Lρ) can be erased in two ways: reset to null, or randomize.
Bit Eraser (Randomize)
Input 0 → 0/1: ΔH = ln 2; input 1 → 0/1: ΔH = ln 2; random input → 0/1: ΔH = 0
Bit Eraser (Reset to Zero)
Input 0 → 0: ΔH = 0; input 1 → 0: ΔH = 0; random input → 0: ΔH = -ln 2
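The ΔH values on the last two slides can be checked directly: a randomizer always outputs p = 1/2, a resetter always outputs p = 0. A minimal sketch:

```python
import math

def H(p):
    """Binary Shannon entropy in nats."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log(p) - (1-p)*math.log(1-p)

inputs = [0.0, 1.0, 0.5]  # P(bit = 1) for the three input ensembles on the slides

for p_in in inputs:
    dH_rand  = H(0.5) - H(p_in)  # randomize: output is always p = 1/2
    dH_reset = H(0.0) - H(p_in)  # reset to zero: output is always p = 0
    print(f"p_in = {p_in}: ΔH(randomize) = {dH_rand:+.4f}, ΔH(reset) = {dH_reset:+.4f}")
# randomize: +ln2, +ln2, 0 ; reset: 0, 0, -ln2  (in nats)
```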
Dissipation: Entropy production; the increase of total entropy in the universe.
Information: Uncertainty of an ensemble of bits.
Erasure: For all possible input information, the outcome is the same. Any ensemble of bits is mapped to a standard ensemble. (There is a constraint on the mapping procedure.)
Implementation of Information in Physics: A Model
A microscopic state z is coarse-grained into regions 0 and 1; an ensemble of such states defines the bit probabilities.
Examples: p_0 = 1, p_1 = 0 (definite bit); p_0 = p_1 = 1/2 (random bit)
A Physical Model: Randomizing
Remove the barrier and let the system relax. ΔS_total = ?
Definite input (0 or 1): ΔS_total = k ln 2.
Completely random input: ΔS_total = 0.
A Physical Model: Resetting to Zero
Remove the barrier, then compress into the zero region: W > 0, Q < 0.
Definite input (0 or 1): ΔS_total = k ln 2.
Completely random input: ΔS_total = 0.
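A sketch of the arithmetic behind these two model slides, assuming the fixed protocol just described (remove the barrier, let the system relax freely, then compress quasi-statically into the target region): the free relaxation produces ΔS_total = k[ln 2 - H(p_in)], while the quasi-static compression is reversible. This reproduces k ln 2 for definite inputs and 0 for a fully random input, consistent with the summary slide later on.

```python
import math

def H(p):
    """Binary Shannon entropy in nats."""
    return 0.0 if p in (0.0, 1.0) else -p*math.log(p) - (1-p)*math.log(1-p)

def reset_dissipation(p_in):
    """Total entropy production (in units of k) for reset-to-zero via the
    fixed protocol: remove barrier -> free relaxation -> quasi-static
    compression into the 'zero' half.  The free relaxation produces
    ln 2 - H(p_in); the compression step is reversible."""
    return math.log(2) - H(p_in)

for p in (0.0, 1.0, 0.5):
    print(f"p_in = {p}: ΔS_total = {reset_dissipation(p):.4f} k")
# definite inputs dissipate k ln 2; a fully random input dissipates 0
```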
Dissipation: Entropy production; the increase of total entropy in the universe.
Information: Uncertainty of an ensemble of bits.
Erasure: For all possible input information, the outcome is the same. Any ensemble of bits is mapped to a standard ensemble. The erasing procedure must be autonomous (no feedback control).
Bit Eraser (Randomize)
Uncorrelated inputs: ΔI = 0, ΔI = 0; correlated input pair: ΔI = -ln 2
Bit Eraser (Reset to Zero)
Uncorrelated inputs: ΔI = 0, ΔI = 0; correlated input pair: ΔI = -ln 2
Erasure of Mutual Information
The bit eraser deleted the correlation between two bits of information.
Input: (0, R) and (1, B), with p_0R = 1/2, p_1B = 1/2, p_0B = p_1R = 0.
Output: (0,R), (0,B), (1,R), and (1,B), with p_0R = p_0B = p_1R = p_1B = 1/4.
Mutual information between X and Y:
I(X,Y) = H(X) + H(Y) - H(X,Y) = D(p_XY || p_X p_Y) = Σ_{X,Y} p_XY ln [ p_XY / (p_X p_Y) ],
where p_X = Σ_Y p_XY and p_Y = Σ_X p_XY.
Here I_in = ln 2 and I_out = 0.
When mutual information I(X,Y) is erased, entropy is generated: ΔS_total ≥ k I(X,Y).
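The values I_in = ln 2 and I_out = 0 follow directly from the definition above; a small sketch (the helper mutual_information is mine):

```python
import math

def mutual_information(p_xy):
    """I(X;Y) = sum over (x,y) of p(x,y) ln [ p(x,y) / (p(x) p(y)) ], in nats."""
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return sum(p * math.log(p / (p_x[x] * p_y[y]))
               for (x, y), p in p_xy.items() if p > 0)

p_in  = {(0, 'R'): 0.5, (1, 'B'): 0.5}                      # perfectly correlated
p_out = {(x, y): 0.25 for x in (0, 1) for y in ('R', 'B')}  # independent
print(f"I_in  = {mutual_information(p_in):.4f}  (= ln 2)")
print(f"I_out = {mutual_information(p_out):.4f}  (= 0)")
```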
More Detailed Analysis
ΔS_total = k D(ρ_Γ || ρ̃_Γ) = k ∫ ρ_Γ(q,p,t) ln [ ρ_Γ(q,p,t) / ρ̃_Γ(q,p,t) ] dq dp  (Kawai et al., 2007)
Information-bearing coordinates: X; other coordinates: Z.
ρ(X,t) = ∫ dZ ρ_Γ(q,p,t)
Relation with information: p_i = ∫ dX ρ(X,t) χ_i(X), where χ_i(X) = 1 if X ∈ R_i and 0 otherwise.
ΔS_total ≥ k D(ρ || ρ̃) ≥ k D(p || p̃), with D(p || p̃) = Σ_i p_i ln (p_i / p̃_i).
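The coarse-graining step D(ρ||ρ̃) ≥ D(p||p̃) is an instance of the data-processing inequality for relative entropy; the toy check below uses random discrete distributions as stand-ins for the continuous phase-space densities.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl(p, q):
    """Relative entropy D(p||q) in nats, for strictly positive distributions."""
    return float(np.sum(p * np.log(p / q)))

# fine-grained densities on 100 phase-space cells (stand-ins for rho, rho-tilde)
rho  = rng.random(100); rho  /= rho.sum()
rhot = rng.random(100); rhot /= rhot.sum()

# coarse-grain into two information-bearing regions R_0, R_1 (first/second half)
p  = np.array([rho[:50].sum(),  rho[50:].sum()])
pt = np.array([rhot[:50].sum(), rhot[50:].sum()])

print(f"D(rho||rho~) = {kl(rho, rhot):.4f} >= D(p||p~) = {kl(p, pt):.4f}")
```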
Landauer Principle via Free Expansion
Forward process: ρ(X) = 1/Ω for X ∈ R, 0 otherwise → free expansion.
Backward process: ρ̃(X) = 1/(2Ω) on the expanded region (both halves).
ΔS_total ≥ k D(ρ || ρ̃) = k ∫_R (1/Ω) ln [ (1/Ω) / (1/(2Ω)) ] dX = k ln 2.
Thermodynamics textbooks say: entropy increases during free expansion because information is lost.
Landauer principle: when information is lost, entropy increases.
Proof of ΔS_total ≥ k I(X,Y)
Hamiltonian: H(X,Y) → H_x(X) + H_y(Y)
Forward: ρ(X,Y) → ρ_x(X) ρ_y(Y), with marginals ρ_x(X) = ∫ dY ρ_eq(X,Y) and ρ_y(Y) = ∫ dX ρ_eq(X,Y).
Backward: ρ̃(X,Y) = ρ̃_x(X) ρ̃_y(Y)
ΔS_total = k D(ρ || ρ̃) = k ∫ dX dY ρ(X,Y) ln [ ρ(X,Y) / ρ̃(X,Y) ]
= k ∫ dX dY ρ(X,Y) ln [ ρ(X,Y) / (ρ_x(X) ρ_y(Y)) ] + k ∫ dX dY ρ(X,Y) ln [ ρ_x(X) ρ_y(Y) / (ρ̃_x(X) ρ̃_y(Y)) ]
= k [ I(X,Y) + D(ρ_x || ρ̃_x) + D(ρ_y || ρ̃_y) ] ≥ k I(X,Y)
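The key identity in this proof, D(ρ || ρ̃_x ρ̃_y) = I(X,Y) + D(ρ_x || ρ̃_x) + D(ρ_y || ρ̃_y), can be verified numerically on a discrete joint distribution (a toy stand-in for the continuous densities):

```python
import numpy as np

rng = np.random.default_rng(1)

def kl(p, q):
    """Relative entropy D(p||q) in nats."""
    return float(np.sum(p * np.log(p / q)))

# joint forward distribution rho(X,Y) on a 4x4 grid (correlated in general)
rho = rng.random((4, 4)); rho /= rho.sum()
rx, ry = rho.sum(axis=1), rho.sum(axis=0)   # marginals rho_x, rho_y

# backward distribution assumed to factorize: rho~(X,Y) = rho~x(X) rho~y(Y)
rtx = rng.random(4); rtx /= rtx.sum()
rty = rng.random(4); rty /= rty.sum()
rt = np.outer(rtx, rty)

I = kl(rho, np.outer(rx, ry))               # mutual information I(X;Y)
lhs = kl(rho, rt)
rhs = I + kl(rx, rtx) + kl(ry, rty)
print(f"D(rho||rho~) = {lhs:.4f} = I + D_x + D_y = {rhs:.4f} >= I = {I:.4f}")
```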
The Issue of Asymmetric Domain Size (Sagawa and Ueda, 2009)
Asymmetric wells of volume 2V/3 and V/3, naive protocol (remove the barrier, then compress):
Starting in the large well: ΔS_total = k ln(3/2); starting in the small well: ΔS_total = k ln 3.
Average: ln 3 - (ln 2)/2 > ln 2.
Symmetrize first by moving the barrier quasi-statically to V/2 | V/2, then erase:
From the large well: W = kT ln(4/3); from the small well: W = kT ln(2/3). Either way, ΔS_total = k ln 2.
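The arithmetic behind this slide, under my reading that the naive protocol removes the barrier first while the improved one symmetrizes the wells quasi-statically before erasing:

```python
import math

# asymmetric double well: 'zero' well volume 2V/3, 'one' well volume V/3
# naive protocol: remove barrier (free expansion), then compress quasi-statically
dS_left  = math.log(3/2)   # started in the large well: expansion ratio V/(2V/3)
dS_right = math.log(3)     # started in the small well: expansion ratio V/(V/3)
avg = 0.5 * dS_left + 0.5 * dS_right
print(f"average ΔS_total/k = {avg:.4f} = ln3 - (ln2)/2 > ln2 = {math.log(2):.4f}")

# symmetrize first: move the barrier quasi-statically to V/2 | V/2, then erase
W_left  = math.log(4/3)    # kT ln[(2V/3)/(V/2)]
W_right = math.log(2/3)    # kT ln[(V/3)/(V/2)]  (work is extracted)
print(f"W/kT = {W_left:+.4f} or {W_right:+.4f}; ΔS_total = k ln2 either way")
```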
About ΔS = k ΔH
When the information entropy changes by ΔH, the system entropy changes by ΔS_sys = k ΔH. Since ΔS_total = ΔS_sys + ΔS_env ≥ 0, we have ΔS_env ≥ -k ΔH.
This information entropy is different from the one used to measure the uncertainty in information.
Information flow diagram: input states 0 and 1 with P_0 = 1/2 and P_1 = 1/2; H = -P_0 ln P_0 - P_1 ln P_1.
The input has two possible, perfectly certain states: H_in = ln 2. The output is unique: H_out = 0.
ΔH = -ln 2, ΔS = -k ln 2, ΔS_env ≥ k ln 2.
Entropy is transferred from the information world to the physical world. In this sense, the statement in Wikipedia is correct!
This was Landauer's original idea, but no physical basis has been offered.
Summary on Information Erasure
Information is a profile of a probability distribution: {p_i}.
Relation between information and physics: p_i = ∫ dX ρ(X) χ_i(X).
Information erasure: {p_i} → {p_i*} using an autonomous procedure (no feedback); a single Hamiltonian for all initial conditions.
Remark: ρ(X, t) → ρ*(X, τ) is not necessary.
The amount of required entropy production depends on the initial condition (input information): the less uncertain the input, the larger the dissipation.
If the input has no uncertainty, ΔS_total ≥ k ln 2. If the input is completely uncertain, ΔS_total ≥ 0.
Erasure of mutual information requires dissipation: ΔS_total ≥ k I.
Simple Logical Gates
Inputs i1, i2 → OR → output o. Input: two bits; output: one bit, i.e., erasure of one bit.
Dissipative gate: ΔS_i ≥ k ln 2? Needs work.
Non-dissipative gate: ΔS_i ≥ 0. No work.
Dissipation in a Dissipative OR Gate (Information Theory)
A common argument in information theory using information flow:
P_00 = P_01 = P_10 = P_11 = 1/4, so H_in = 2 ln 2.
P_0 = 1/4, P_1 = 3/4, so H_out = 2 ln 2 - (3/4) ln 3.
ΔH = -(3/4) ln 3, hence ΔS_total = -k ΔH = k (3/4) ln 3.
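The entropies quoted here follow from the four equiprobable inputs and the OR truth table; a short check:

```python
import math

# four equally likely inputs (i1, i2); OR output o = i1 | i2
p_in = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
H_in = -sum(p * math.log(p) for p in p_in.values())           # 2 ln 2

p_out = {0: 0.0, 1: 0.0}
for (i1, i2), p in p_in.items():
    p_out[i1 | i2] += p                                        # p0 = 1/4, p1 = 3/4
H_out = -sum(p * math.log(p) for p in p_out.values() if p > 0)

dH = H_out - H_in
print(f"H_in = {H_in:.4f}, H_out = {H_out:.4f}, ΔH = {dH:.4f}")
print(f"ΔS_total = -k ΔH = {-dH:.4f} k  (= (3/4) ln 3 = {0.75*math.log(3):.4f})")
```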
Dissipation in a Dissipative OR Gate (Physics)
Assume that the input information has no uncertainty.
Input 00 (the only input mapped to 0): ΔS_total ≥ 0. Inputs 01, 10, 11 (merged into output 1): ΔS_total ≥ k ln 3.
Average: ΔS_i ≥ (1/4)·0 + (3/4)·k ln 3 = k (3/4) ln 3.
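The same number emerges from averaging the per-input physical bounds; a one-screen check (the per-input values are those on this slide):

```python
import math

# per-input dissipation bound: input 00 maps one-to-one to output 0 (ΔS >= 0);
# inputs 01, 10, 11 merge into output 1, so three phase-space regions are
# compressed into one (ΔS >= k ln 3 each)
dS = {(0, 0): 0.0, (0, 1): math.log(3), (1, 0): math.log(3), (1, 1): math.log(3)}

avg = sum(0.25 * s for s in dS.values())   # equal weight 1/4 per input
print(f"<ΔS_i>/k = (1/4)·0 + (3/4)·ln3 = {avg:.4f} = (3/4) ln 3")
```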