Information and Physics: Landauer Principle and Beyond

Transcription:

Information and Physics: Landauer Principle and Beyond. Ryoichi Kawai, Department of Physics, University of Alabama at Birmingham

Maxwell's Demon (Lerner, 1975)

Landauer principle. Rolf Landauer (1927-1999). "Computational steps which inevitably require a minimal energy dissipation are those which discard information." (Landauer, 1961) "Any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment." (Bennett, 1982; Wikipedia) The erasure of one bit of information is necessarily accompanied by a dissipation of at least kT ln 2 of heat. Key words: information, erasure, dissipation

Confusion. "If one were to imagine starting with those degrees of freedom in a thermalised state, there would be a real reduction in thermodynamic entropy if they were then re-set to a known state. This can only be achieved under information-preserving, microscopically deterministic dynamics if the uncertainty is somehow dumped somewhere else, i.e. if the entropy of the environment (or the non-information-bearing degrees of freedom) is increased by at least an equivalent amount, as required by the Second Law, by gaining an appropriate quantity of heat: specifically kT ln 2 of heat for every bit of randomness erased." (Wikipedia, "Information Theory and Physics") "Landauer's principle for information erasure is valid for a symmetric double-well potential, but not for an asymmetric one. The proof of W_erase >= kT ln 2 using statistical mechanics is valid only for the symmetric case." (Sagawa and Ueda, 2009)

Key word: Dissipation. Which one is correct?
W >= kT ln 2
Q >= kT ln 2
ΔS >= k ln 2
ΔS = k ΔH (H = information entropy)
ΔS_total >= k ln 2 (erasure of one bit)
ΔS_total >= k I (erasure of mutual information I)

Dissipation: entropy production, ΔS_total = ΔS_sys + ΔS_env, the increase of total entropy in the universe. Information: ? Erasure: ?

What is Information Erasure? Top Secret

Bit Eraser. Which one erased the input information? How much energy dissipates?

Information is probabilistic; the bit is a transparent container. Information entropy of one bit: H = -p0 ln p0 - p1 ln p1. A bit that is certainly 0: H = 0. A bit that is certainly 1: H = 0. A bit equally likely to be 0 or 1: H = ln 2.
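As a sanity check on these values, here is a minimal Python sketch (my own illustration, not part of the talk) that evaluates the one-bit entropy H = -p0 ln p0 - p1 ln p1 in nats:

```python
import math

def bit_entropy(p1):
    """Shannon entropy H = -p0 ln p0 - p1 ln p1 of one bit, in nats."""
    p0 = 1.0 - p1
    return sum(-p * math.log(p) for p in (p0, p1) if p > 0.0)

print(bit_entropy(0.0))  # bit is certainly 0: H = 0
print(bit_entropy(1.0))  # bit is certainly 1: H = 0
print(bit_entropy(0.5))  # bit equally likely 0 or 1: H = ln 2 ≈ 0.693
```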

Dissipation: entropy production, the increase of total entropy in the universe. Information: uncertainty of bits; an ensemble of bits. Erasure: ?

Information Erasure. Stored content such as E = mc², iħ ∂ψ/∂t = Hψ, or ρ̇ = Lρ can be erased in two ways: reset to null, or randomize. Either way the original content is destroyed.

Bit Eraser (Randomize). Every input ensemble is mapped to the uniform ensemble. Input all 0s: ΔH = ln 2. Input all 1s: ΔH = ln 2. Input already random: ΔH = 0.

Bit Eraser (Reset to Zero). Every input ensemble is mapped to the all-zero ensemble. Input all 0s: ΔH = 0. Input all 1s: ΔH = 0. Input random: ΔH = -ln 2.
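A short Python sketch (my own illustration, not from the slides) reproduces the ΔH values for both erasers, using H = -Σ p ln p:

```python
import math

def entropy(p):
    """Shannon entropy of a distribution, in nats."""
    return sum(-x * math.log(x) for x in p if x > 0)

# Randomize: every input ensemble is mapped to the uniform ensemble {1/2, 1/2}
for p_in in ([1.0, 0.0], [0.0, 1.0], [0.5, 0.5]):
    print(entropy([0.5, 0.5]) - entropy(p_in))  # ΔH = ln 2, ln 2, 0

# Reset to zero: every input ensemble is mapped to the certain ensemble {1, 0}
for p_in in ([1.0, 0.0], [0.0, 1.0], [0.5, 0.5]):
    print(entropy([1.0, 0.0]) - entropy(p_in))  # ΔH = 0, 0, -ln 2
```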

Dissipation: entropy production, the increase of total entropy in the universe. Information: uncertainty of bits; an ensemble of bits. Erasure: for all possible input information, the outcome is the same; any ensemble of bits is mapped to a standard ensemble. (There is a constraint on the mapping procedure.)

Implementation of Information in Physics: A Model. A microscopic state X (together with the remaining coordinates Z); the bit is an ensemble of microscopic states. A definite bit: p0 = 1, p1 = 0. An uncertain bit: p0 = 1/2, p1 = 1/2.

A Physical Model: Randomizing. ΔS_total = ?? Randomizing a definite bit (0 or 1) is a free expansion: ΔS_total = k ln 2 for either input. Randomizing an already random bit changes nothing: ΔS_total = 0.

A Physical Model: Resetting to Zero. The reset protocol compresses and shifts the particle: W > 0, Q < 0. ΔS_total = ?? For a definite input (0 or 1), ΔS_total = k ln 2; for a random input, ΔS_total = 0.

Dissipation: entropy production, the increase of total entropy in the universe. Information: uncertainty of bits; an ensemble of bits. Erasure: for all possible input information, the outcome is the same; any ensemble of bits is mapped to a standard ensemble. The erasing procedure must be autonomous (no feedback control).

Bit Eraser (Randomize), effect on correlations. For uncorrelated inputs: ΔI = 0, ΔI = 0. For a perfectly correlated input, the randomizer destroys the correlation: ΔI = -ln 2.

Bit Eraser (Reset to Zero), effect on correlations. For uncorrelated inputs: ΔI = 0, ΔI = 0. For a perfectly correlated input: ΔI = -ln 2.

Erasure of Mutual Information. The bit eraser deletes the correlation between two bits of information.
Input: (0, R) and (1, B) with p(0,R) = 1/2, p(1,B) = 1/2, p(1,R) = p(0,B) = 0.
Output: (0,R), (0,B), (1,R), and (1,B) with p(0,R) = p(1,R) = p(0,B) = p(1,B) = 1/4.
Mutual information between X and Y:
I(X,Y) = H(X) + H(Y) - H(X,Y) = D(p_XY || p_X p_Y) = Σ_{X,Y} p_XY ln [p_XY / (p_X p_Y)]
where p_X = Σ_Y p_XY and p_Y = Σ_X p_XY. Here I_in = ln 2 and I_out = 0.
When mutual information I(X,Y) is erased, at least k I(X,Y) of entropy is generated:
ΔS_total >= k I(X,Y)
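The values I_in = ln 2 and I_out = 0 can be checked numerically. The following Python sketch (an illustration, with the pair (X, Y) encoded as a 2×2 joint probability table) evaluates I(X,Y) = Σ p(x,y) ln[p(x,y)/(p(x)p(y))]:

```python
import math

def mutual_information(joint):
    """I(X;Y) in nats for a joint pmf given as a 2D list joint[x][y]."""
    px = [sum(row) for row in joint]            # marginal over X
    py = [sum(col) for col in zip(*joint)]      # marginal over Y
    return sum(p * math.log(p / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, p in enumerate(row) if p > 0)

# Input: (0,R) and (1,B) each with probability 1/2 -> perfectly correlated
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # I_in = ln 2

# Output: all four pairs with probability 1/4 -> uncorrelated
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # I_out = 0
```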

More detailed analysis (Kawai et al., 2007).
ΔS_total = k D(ρ_Γ(t) || ρ̃_Γ(t)) = k ∫ dq dp ρ_Γ(q,p,t) ln [ρ_Γ(q,p,t) / ρ̃_Γ(q,p,t)]
Information-bearing coordinates: X. Other coordinates: Z.
ρ(X,t) = ∫ dZ ρ_Γ(q,p,t)
Relation with information:
p_i = ∫ dX ρ(X,t) χ_i(X), where χ_i(X) = 1 if X ∈ R_i and 0 otherwise
ΔS_total >= k D(ρ || ρ̃) >= k D(p || p̃), with D(p || p̃) = Σ_i p_i ln (p_i / p̃_i)
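The inequality D(ρ || ρ̃) >= D(p || p̃) is the monotonicity of relative entropy under coarse-graining. A discrete Python sketch (illustrative numbers of my own, not from the paper) shows it for four microstates lumped into two bit regions:

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) = sum p ln(p/q), in nats."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

# Fine-grained distributions over four microstates (stand-ins for rho and rho~)
rho = [0.4, 0.1, 0.3, 0.2]
rho_tilde = [0.25, 0.25, 0.25, 0.25]

# Coarse-grain into two information-bearing regions: cells {0,1} -> bit 0, {2,3} -> bit 1
p = [rho[0] + rho[1], rho[2] + rho[3]]
p_tilde = [rho_tilde[0] + rho_tilde[1], rho_tilde[2] + rho_tilde[3]]

print(kl(rho, rho_tilde), kl(p, p_tilde))  # coarse-graining can only lower D
```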

Landauer principle via free expansion.
Forward process: ρ(X) = 1/Ω for X ∈ R0, 0 for X ∈ R1; then free expansion into R0 ∪ R1.
Backward process: ρ̃(X) = 1/(2Ω) for X ∈ R0, 1/(2Ω) for X ∈ R1.
ΔS_total >= k D(ρ || ρ̃) = k ln 2
A thermodynamics textbook says: entropy increases during free expansion because information is lost. The Landauer principle says: when information is lost, entropy increases.
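A minimal discretized check of the free-expansion result (my own sketch; the cell count N is an arbitrary choice):

```python
import math

# Discretize the box into 2*N equal cells: region R0 = first N cells, R1 = last N cells
N = 100

# Forward initial state: confined to R0, uniform (density 1/Omega, here 1/N per cell)
rho = [1.0 / N] * N + [0.0] * N

# Backward reference state: equilibrium in the full box, 1/(2 Omega) everywhere
rho_tilde = [1.0 / (2 * N)] * (2 * N)

D = sum(a * math.log(a / b) for a, b in zip(rho, rho_tilde) if a > 0)
print(D)  # D(rho || rho~) = ln 2 ≈ 0.693, independent of N
```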

Proof of ΔS_i >= I(X,Y).
Initially X and Y are correlated: ρ(X,Y). Hamiltonian: H(X,Y) → H_x(X) + H_y(Y).
Forward process: ρ(X,Y) → ρ_x(X) ρ_y(Y), where ρ_x(X) = ∫ dY ρ_eq(X,Y) and ρ_y(Y) = ∫ dX ρ_eq(X,Y).
Backward process: ρ̃(X,Y) = ρ̃_x(X) ρ̃_y(Y).
ΔS_total = D(ρ || ρ̃)
= ∫ dX dY ρ(X,Y) ln [ρ(X,Y) / (ρ̃_x(X) ρ̃_y(Y))]
= ∫ dX dY ρ(X,Y) ln [ρ(X,Y) / (ρ_x(X) ρ_y(Y))] + ∫ dX dY ρ(X,Y) ln [(ρ_x(X) ρ_y(Y)) / (ρ̃_x(X) ρ̃_y(Y))]
= I(X,Y) + D(ρ_x || ρ̃_x) + D(ρ_y || ρ̃_y)
>= I(X,Y)
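The decomposition in the last step can be verified numerically for discrete distributions. The Python sketch below uses illustrative numbers of my own (not from the slides) for ρ(X,Y) and the reference marginals ρ̃_x, ρ̃_y:

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) = sum p ln(p/q), in nats."""
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

# Correlated joint distribution rho(X, Y) over two bits (illustrative numbers)
rho = [[0.4, 0.1],
       [0.2, 0.3]]
rho_x = [sum(row) for row in rho]        # marginal over X
rho_y = [sum(col) for col in zip(*rho)]  # marginal over Y

# Product reference distribution rho~_x(X) rho~_y(Y) from the backward process
ref_x = [0.7, 0.3]
ref_y = [0.5, 0.5]

I = sum(p * math.log(p / (rho_x[x] * rho_y[y]))
        for x, row in enumerate(rho) for y, p in enumerate(row) if p > 0)
lhs = sum(p * math.log(p / (ref_x[x] * ref_y[y]))
          for x, row in enumerate(rho) for y, p in enumerate(row) if p > 0)
rhs = I + kl(rho_x, ref_x) + kl(rho_y, ref_y)

print(lhs, rhs)  # equal: D(rho || ref) = I(X,Y) + D(rho_x||ref_x) + D(rho_y||ref_y)
```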

The issue of asymmetric domain size (Sagawa and Ueda, 2009).
Asymmetric double well with volumes (2/3)V and (1/3)V: ΔS_total = k ln(3/2) for one input and ΔS_total = k ln 3 for the other; average: ln 3 - (1/2) ln 2 > ln 2.
With volumes (1/2)V and (1/2)V: W = kT ln(4/3), ΔS_total = k ln 2; W = kT ln(2/3), ΔS_total = k ln 2.

About ΔS = k ΔH. When the information entropy changes by ΔH, the system entropy changes by ΔS_sys = k ΔH. Since ΔS_total = ΔS_sys + ΔS_env > 0, it follows that ΔS_env > -k ΔH. This information entropy is different from the one used to measure the uncertainty in information. Information flow diagram: P0 = 1/2, P1 = 1/2; H = -P0 ln P0 - P1 ln P1.

The input has two possible, perfectly certain states (P0 = 1/2, P1 = 1/2): H_in = ln 2. The output is unique: H_out = 0. Hence ΔH = -ln 2, ΔS = -k ln 2, and ΔS_env >= k ln 2. Entropy is transferred from the information world to the physical world. In this sense the statement in Wikipedia is correct! This was Landauer's original idea, but no physical basis has been offered.

Summary on Information Erasure.
Information is a profile of a probability distribution {p_i}.
Relation between information and physics: p_i = ∫ dX ρ(X) χ_i(X).
Information erasure: {p_i} → {p_i*} using an autonomous procedure (no feedback; a single Hamiltonian for all initial conditions).
Remark: ρ(X, t) → ρ*(X, τ) is not necessary.
The amount of required entropy production depends on the initial condition (the input information): the less uncertain the input, the larger the dissipation.
If the input has no uncertainty, ΔS_total >= k ln 2.
If the input is completely uncertain, ΔS_total >= 0.
Erasure of mutual information requires dissipation: ΔS_total >= k I.

Simple Logical Gates. Inputs i1, i2 → OR → output o. Input: two bits; output: one bit, i.e. erasure of one bit. Is ΔS_i >= k ln 2? Dissipative: ΔS_i > 0, needs work. Non-dissipative: ΔS_i = 0, no work.

Dissipation in a dissipative OR gate (information theory). Common argument in information theory using information flow:
P00 = P01 = P10 = P11 = 1/4, H_in = 2 ln 2
P(o=0) = 1/4, P(o=1) = 3/4, H_out = 2 ln 2 - (3/4) ln 3
ΔH = -(3/4) ln 3
ΔS_total = -k ΔH = (3/4) k ln 3
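The OR-gate arithmetic can be checked directly (a small Python sketch of my own, not part of the talk):

```python
import math

def entropy(p):
    """Shannon entropy of a distribution, in nats."""
    return sum(-x * math.log(x) for x in p if x > 0)

H_in = entropy([0.25] * 4)     # four equally likely input pairs: 2 ln 2
H_out = entropy([0.25, 0.75])  # OR output: P(o=0) = 1/4, P(o=1) = 3/4
dH = H_out - H_in              # = -(3/4) ln 3
print(H_in, H_out, -dH)        # ΔS_total/k = -ΔH = (3/4) ln 3 ≈ 0.824
```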

Dissipation in a dissipative OR gate (physics). Assume that the input information has no uncertainty. For the input (0,0), which is the unique preimage of output 0, ΔS_total >= 0; for the three inputs merged into output 1, ΔS_total >= k ln 3. Averaging over inputs:
ΔS_i >= (1/4)·0 + (3/4)·k ln 3 = (3/4) k ln 3