Thermodynamics of computation


C Thermodynamics of computation

This lecture is based on Charles H. Bennett's "The Thermodynamics of Computation--a Review," Int. J. Theo. Phys., 1982 [B82]. Unattributed quotations are from this paper.

"Computers may be thought of as engines for transforming free energy into waste heat and mathematical work." [B82]

C.1 Brownian computers

1. Rather than trying to avoid randomization of kinetic energy (transfer from mechanical modes to thermal modes), perhaps it can be exploited. This is an example of respecting the medium in embodied computation.

2. Brownian computer: makes logical transitions as a result of thermal agitation. It is about as likely to go backward as forward, but it may be biased in the forward direction by a very weak external driving force.

3. DNA: DNA transcription is an example. It runs at about 30 nucleotides per second, dissipates about 20 kT per nucleotide, and makes less than one mistake per 10^5 nucleotides (see the sketch after this list).

4. Chemical Turing machine: the tape is a large macromolecule analogous to RNA. An added group encodes the state and head location. For each transition rule there is a hypothetical enzyme that catalyzes the state transition.

5. Drift velocity is linear in dissipation per step.

6. We will look at molecular computation in much more detail later in the class.

7. Clockwork Brownian TM: Bennett also considers a clockwork Brownian TM, comparable to the billiard-ball computers. It is a little more realistic, since it does not have to be machined perfectly and tolerates environmental noise.

8. Again, drift velocity is linear in dissipation per step.
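To get a feel for the magnitudes in item 3, here is a minimal sketch converting the quoted 20 kT per nucleotide into SI units and comparing it with the Landauer bound kT ln 2. The choice of T = 310 K (physiological temperature) is an assumption, not from the source:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 310.0            # assumed physiological temperature, K (not from [B82])

    rate = 30                         # nucleotides transcribed per second [B82]
    diss_per_nt = 20 * k_B * T        # ~20 kT dissipated per nucleotide [B82]
    landauer = k_B * T * math.log(2)  # minimum cost of erasing one bit

    print(f"dissipation per nucleotide: {diss_per_nt:.2e} J")
    print(f"power at 30 nt/s:           {rate * diss_per_nt:.2e} W")
    print(f"kT ln 2 at this T:          {landauer:.2e} J")
    print(f"20 kT in units of kT ln 2:  {20 / math.log(2):.1f}")

So transcription dissipates only about 29 times kT ln 2 per nucleotide.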

Figure II.23: Different degrees of logical reversibility. [from B82]

C.2 Reversible computing

C.2.a Reversible TM

1. Bennett (1973) defined a reversible TM.

C.2.b Reversible logic

1. As in ballistic computing, Brownian computing needs logical reversibility.

2. A small degree of irreversibility can be tolerated (see Fig. II.23):

(A) Strictly reversible computation. Any driving force will ensure forward movement.

(B) Modestly irreversible computation. There are more backward detours, but the computation can still be biased forward.

(C) Exponentially backward-branching trees. The computation may spend much more of its time going backward than forward, especially since one backward step is likely to lead to more backward steps.

3. For forward computation on such a tree, the dissipation per step must exceed kT times the log of the mean number of immediate predecessors to each state.

4. These undesired predecessor states may outnumber desired states by factors of 2^100, requiring driving forces on the order of 100 kT (the sketch after this list checks the arithmetic). Why 2^100? Think of the number of possible predecessors to a state that does something like x := 0, or the number of ways of getting to the next statement after a loop.
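Items 3 and 4 are easy to check numerically. A minimal sketch (the function name is mine):

    import math

    def min_dissipation_per_step_kT(mean_predecessors: float) -> float:
        """Lower bound on dissipation per forward step, in units of kT.

        If states have `mean_predecessors` immediate predecessors on
        average, biasing the walk forward requires dissipating more than
        kT * ln(mean_predecessors) per step.
        """
        return math.log(mean_predecessors)

    # Strictly reversible computation (one predecessor per state): an
    # arbitrarily small driving force suffices.
    print(min_dissipation_per_step_kT(1))       # 0.0

    # 2^100 predecessors per state: driving force on the order of 100 kT.
    print(min_dissipation_per_step_kT(2**100))  # ~69.3, i.e. 100 ln 2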

C.3 Erasure

1. As it turns out, it is not measurement or copying that necessarily dissipates energy; it is the erasure of previous information to restore it to a standard state so that it can be used again. Thus, in a TM, writing a symbol on previously used tape requires two steps: erasing the previous contents and then writing the new contents. It is the former process that increases entropy and dissipates energy. Cf. the mechanical TM we saw in CS312; it erased the old symbol before it wrote the new one. Cf. also old computers on which the store instruction was called "clear and add."

2. A bit can be copied reversibly, with arbitrarily small dissipation, if it is initially in a standard state (Fig. II.24). The reference bit (fixed at 0, the same as the movable bit's initial state) allows the initial state to be restored, thus ensuring reversibility.

3. This copying is susceptible to degradation by thermal fluctuation and tunneling. However, the error probability and the dissipation ε/kT can be kept arbitrarily small, much less than unity; ε here is the driving force.

4. Whether erasure must dissipate energy depends on the prior state (Fig. II.25).

(B) The initial state is genuinely unknown. This is reversible (steps 6 to 1): kT ln 2 of work was done to compress it into the left potential well (decreasing the entropy), and reversing the operation increases the entropy by kT ln 2. That is, the copying is reversible.

Figure II.24: [from B82]

Figure II.25: [from B82]

(C) The initial state is known (perhaps as a result of a prior computation or measurement). This initial state is lost (forgotten), converting kT ln 2 of work into heat with no compensating decrease of entropy. The irreversible entropy increase happens because of the expansion at step 2; this is where we are forgetting the initial state (vs. case B, where there was nothing known to be forgotten).

5. In research reported in March 2012 [EVLP], a setup very much like this was used to confirm the Landauer Principle experimentally, and to confirm that it is the erasure that dissipates energy. Incomplete erasure leads to less dissipated heat: for a success rate r, the Landauer bound can be generalized to

    ⟨Q⟩_r^Landauer = kT [ln 2 + r ln r + (1 - r) ln(1 - r)].

Thus, no heat needs to be produced for r = 0.5. [EVLP] (See the sketch after this list.)

6. We have seen that erasing a random register increases entropy in the environment in order to decrease it in the register: NkT ln 2 for an N-bit register.

7. Such an initialized register can be used as fuel: as it thermally randomizes itself, it can do NkT ln 2 of useful work. Keep in mind that these 0s (or whatever) are physical states (charges, compressed gas, magnetic spins, etc.).

8. Any computable sequence of N bits (e.g., the first N bits of π) can be used as fuel by a slightly more complicated apparatus. Start with a reversible computation from N 0s to the bit string. Reverse this computation, which will transform the bit string into all 0s, which can then be used as fuel. This "uncomputer" puts it in usable form.

9. Suppose we have a random bit string, initialized, say, by coin tossing. Because it has a specific value, we can in principle get NkT ln 2 of useful work out of it, but we don't know the mechanism (the "uncomputer") to do it. The apparatus would be specific to each particular string.
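The generalized bound in item 5 is easy to tabulate. A minimal sketch (the function name and the choice T = 300 K are mine, not from [EVLP]):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def landauer_bound(r: float, T: float = 300.0) -> float:
        """Generalized Landauer bound <Q>_r, in joules, for erasure with
        success rate r: kT [ln 2 + r ln r + (1 - r) ln(1 - r)]."""
        h = lambda p: p * math.log(p) if p > 0 else 0.0  # p ln p, with 0 ln 0 = 0
        return k_B * T * (math.log(2) + h(r) + h(1 - r))

    for r in (1.0, 0.9, 0.75, 0.5):
        print(f"r = {r:4.2f}: <Q>_r >= {landauer_bound(r):.2e} J")

    # r = 1.0 recovers the classic kT ln 2; r = 0.5 gives zero, since an
    # erasure that succeeds only half the time erases nothing.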

C.4 Algorithmic entropy

1. Algorithmic information theory was developed by Ray Solomonoff c. 1960, Andrey Kolmogorov in 1965, and Gregory Chaitin around 1966.

2. Algorithmic entropy: the algorithmic entropy H(x) of a microstate x is the number of bits required to describe x as the output of a universal computer; roughly, the size of the program required to compute it. Specifically, it is the size of the smallest self-delimiting program (i.e., one whose end can be determined from the bit string itself).

3. Differences in machine models lead to differences of O(1), and so H(x) is well-defined up to an additive constant (like thermodynamic entropy).

4. Note that H is not computable.

5. Algorithmically random: a string is algorithmically random if it cannot be described by a program very much shorter than itself.

6. For any N, most N-bit strings are algorithmically random. (For example, there are only enough (N - 10)-bit programs to describe at most 1/1024 of all the N-bit strings.)

7. Deterministic transformations cannot increase algorithmic entropy very much: roughly, H[f(x)] ≤ H(x) + |f|, where |f| is the size of the program f. Reversible transformations also leave H unchanged.

8. A transformation must be probabilistic to be able to increase H significantly.

9. Statistical entropy: statistical entropy in units of bits is defined (see the sketch after this list for a computable illustration):

    S_2(p) = -Σ_x p(x) lg p(x).

10. Statistical entropy (a function of the macrostate) is related to algorithmic entropy (an average over the algorithmic entropies of microstates) as follows:

    S_2(p) < Σ_x p(x) H(x) ≤ S_2(p) + H(p) + O(1).

11. A macrostate p is concisely describable if, for example, it is determined by equations of motion and boundary conditions describable in a small number of bits. In this case S_2 and H are closely related as given above.

12. For macroscopic systems, typically S_2(p) = O(10^23) while H(p) = O(10^3).

13. If the physical system increases its H by N bits, which it can do only by acting probabilistically, it can convert about NkT ln 2 of waste heat into useful work.

14. "[T]he conversion of about NkT ln 2 of work into heat in the surroundings is necessary to decrease a system's algorithmic entropy by N bits."
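Since H is uncomputable (item 4), a concrete illustration has to substitute something computable. The sketch below evaluates S_2(p) exactly and uses zlib compressed length as a crude, purely illustrative upper-bound proxy for H(x); the proxy is my choice, not from the source:

    import math, os, zlib

    def S2(p: dict) -> float:
        """Statistical entropy in bits: S_2(p) = -sum_x p(x) lg p(x)."""
        return -sum(q * math.log2(q) for q in p.values() if q > 0)

    def H_proxy(x: bytes) -> int:
        """Crude upper-bound stand-in for algorithmic entropy H(x): bits
        in the zlib-compressed description. True H is uncomputable."""
        return 8 * len(zlib.compress(x, 9))

    print(S2({"0": 0.5, "1": 0.5}))  # 1.0 bit per symbol: fair coin
    print(S2({"0": 0.9, "1": 0.1}))  # ~0.47 bits per symbol: biased coin

    regular = b"01" * 4096     # concisely describable, so H is small
    noise = os.urandom(8192)   # algorithmically random with high probability
    print(H_proxy(regular), H_proxy(noise), 8 * 8192)
    # The regular string compresses to a tiny fraction of its length; the
    # random one does not compress below ~8 * 8192 bits (cf. item 6).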

D Sources

B82 Bennett, C. H. "The Thermodynamics of Computation--a Review." Int. J. Theo. Phys., 21, 12 (1982), 905-940.

EVLP Bérut, Antoine, Arakelyan, Artak, Petrosyan, Artyom, Ciliberto, Sergio, Dillenschneider, Raoul, and Lutz, Eric. "Experimental verification of Landauer's principle linking information and thermodynamics." Nature 483, 187-189 (08 March 2012). doi:10.1038/nature10872

F Frank, Michael P. "Introduction to Reversible Computing: Motivation, Progress, and Challenges." CF '05, May 4-6, 2005, Ischia, Italy.

FT82 Fredkin, E. F., Toffoli, T. "Conservative logic." Int. J. Theo. Phys., 21, 3/4 (1982), 219-253.

E Exercises

Exercise II.1 Show that the Fredkin gate is reversible. (A brute-force sketch appears after Exercise II.7.)

Exercise II.2 Show that the Fredkin gate implementations of the NOT, OR, and FAN-OUT gates (Fig. II.12) are correct.

Figure II.26: Implementation of J-K̄ flip-flop. [from FT82]

Exercise II.3 Use the Fredkin gate to implement XOR.

Exercise II.4 Show for the eight possible inputs that Fig. II.13 is a correct implementation of a 1-line to 4-line demultiplexer. That is, show that in each of the four cases A1 A0 = 00, 01, 10, 11, the bit X = 0 or 1 gets routed to Y0, Y1, Y2, Y3, respectively. You can use an algebraic proof, if you prefer.

Exercise II.5 Show that the implementation of a J-K̄ flip-flop with Fredkin gates in Fig. II.26 is correct. A J-K̄ flip-flop has the following behavior:

J K̄ behavior
0 0  reset, Q → 0
0 1  hold, Q doesn't change
1 0  toggle, Q → Q̄
1 1  set, Q → 1

Exercise II.6 Show that the inverse of the interaction gate works correctly.

Exercise II.7 Show that the realization of the Fredkin gate in terms of interaction gates (Fig. II.22) is correct, by labeling the inputs and outputs of the interaction gates with Boolean expressions of a, b, and c.
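For experimenting with Exercises II.1 and II.3, here is a minimal brute-force sketch of the Fredkin gate as a controlled swap. The tuple encoding, the swap-on-1 polarity (the algebraic form in [FT82] swaps when the control is 0), and the particular XOR arrangement are my choices; the exercises ask for proofs, which this only spot-checks:

    from itertools import product

    def fredkin(c: int, a: int, b: int) -> tuple:
        """Fredkin gate as a controlled swap: if c == 1, swap a and b;
        otherwise pass them through. The control c is unchanged."""
        return (c, b, a) if c == 1 else (c, a, b)

    inputs = list(product([0, 1], repeat=3))

    # Exercise II.1: the gate is a bijection on {0,1}^3; in fact it is
    # its own inverse, which makes reversibility immediate.
    assert sorted(fredkin(*x) for x in inputs) == sorted(inputs)
    assert all(fredkin(*fredkin(*x)) == x for x in inputs)

    # Exercise II.3, one arrangement: control on a, swapping b and NOT b.
    # The first swapped output line then carries a XOR b.
    def xor(a: int, b: int) -> int:
        _, out, _ = fredkin(a, b, 1 - b)  # 1 - b plays the role of NOT b
        return out

    assert all(xor(a, b) == a ^ b for a, b in product([0, 1], repeat=2))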