Time, Space, and Energy in Reversible Computing. Paul Vitanyi, CWI, University of Amsterdam, National ICT Australia

Thermodynamics of Computing [Diagram: Input -> Compute -> Output, with heat dissipation.] Physical Law: kT ln 2 joule is needed to irreversibly process 1 bit (Von Neumann, Landauer). Reversible computation is free. [Figure: a billiard ball computer. Balls A and B collide; the outgoing paths carry A AND B, B AND NOT A, A AND NOT B, and A AND B, so the reversible collision acts as a logic gate.]

Trend Graph (follows Moore's Law): energy dissipated per logic operation falls from about 10^-9 J in 1940 toward the thermal limit kT ln 2 = ca 3 x 10^-21 J, extrapolated to be reached around 2015-21. Even at kT per operation, at 20 C, 100 Gigahertz and 10 Tera gates/cm^3 means about 3000 Watt/cm^3.

Relevance:
- Mobile use: battery improvement is only about 20% in 10 years.
- Miniaturization of computing devices.
- Improved speed + higher density (linear + square) means more heat dissipation per cm^2 (cubic).
- Matter migration in circuits.
- Flaky performance, requiring reduced clock speed.

Reversible Universal Computer. We use Turing Machines for convenience. A machine is reversible if for every operation there is an inverse operation that undoes its effect. We can effectively enumerate a family of reversible TMs, and there is a Universal Reversible TM that simulates all others (Lecerf 63, Bennett 73).

Energy. If we compute x -> y reversibly, then actually we compute (p, x) -> (y, q): program in, garbage out. What is a program at one end is garbage at the other end. We need to supply the program bits and erase the garbage bits. The best way to do this is to compact them as much as possible (like the garbage collection truck does).

Andrey Nikolaevich Kolmogorov (1903-1987, born in Tambov, Russia): measure theory, probability, analysis, intuitionistic logic, cohomology, dynamical systems, hydrodynamics, Kolmogorov complexity.

Kolmogorov Complexity defines the ultimate data compression limit: gzip, bzip2, PPMZ, ... >= KC. Therefore it gives us the thermodynamically cheapest way to get rid of data garbage. Erasing 11...1 (n ``1''s) directly costs n kT ln 2 joule; by first compressing it reversibly to about log n bits, erasing costs only (log n) kT ln 2 joule.
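The true Kolmogorov complexity is uncomputable, but any real compressor upper-bounds it, so the compress-then-erase saving is easy to illustrate. A minimal sketch using zlib (zlib's output is far longer than the true K(x), but the effect is the same; the helper name is ours):

```python
import math
import zlib

K_BOLTZMANN = 1.380649e-23  # J/K

def erasure_cost_joules(num_bits: int, temp_kelvin: float = 293.0) -> float:
    # Landauer bound: kT ln 2 joule per irreversibly erased bit
    return num_bits * K_BOLTZMANN * temp_kelvin * math.log(2)

n = 100_000
x = b"1" * n                   # the string 11...1 (n ones)
xstar = zlib.compress(x, 9)    # reversible compression: a stand-in for x*

direct = erasure_cost_joules(8 * len(x))          # erase x as-is
compressed = erasure_cost_joules(8 * len(xstar))  # erase only the compressed form
assert compressed < direct / 100  # compressing first cuts the erasure bill
```

The compressed form of a highly regular string is a tiny fraction of the original, so almost all of the Landauer cost is avoided.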

Kolmogorov Complexity. Solomonoff (1960), Kolmogorov (1965), Chaitin (1969): the amount of information in a string is the size of the smallest program generating that string:

K(x) = min { |p| : U(p) = x }

Invariance Theorem: it does not matter which universal Turing machine U we choose. I.e., all encoding methods are ok (up to an additive constant).

Kolmogorov complexity: K(x) = length of the shortest description of x; K(x|y) = length of the shortest description of x given y. A string is (effectively) random if it is already compressed to the limit, that is, if K(x) >= |x|. To erase such a string costs the maximal entropy/energy |x| kT ln 2.

GENERALLY: the ultimate thermodynamic cost of erasing x is reached by: reversibly compress x to its shortest program x*, then erase x*. Cost: ~K(x) bits. The longer you compute, the less heat you dissipate. Thermodynamic cost of converting x <-> y:

E(x,y) = min { |p| : U(x,p) = y and U(y,p) = x }.

Theorem. E(x,y) = max{ K(x|y), K(y|x) } (Bennett, Gacs, Li, Vitanyi, Zurek, STOC 93)
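E(x, y) is uncomputable, but it underlies the practical "information distance" heuristics: replacing K by the output length C of a real compressor gives a computable estimate. A rough sketch with zlib (the helper names and the particular approximation max{C(yx)-C(y), C(xy)-C(x)} are our illustration, not the theorem's exact quantity):

```python
import zlib

def C(s: bytes) -> int:
    # compressed length: a computable upper bound standing in for K
    return len(zlib.compress(s, 9))

def info_distance(x: bytes, y: bytes) -> int:
    # estimate of max{K(x|y), K(y|x)}: how much of x is left to describe
    # once y is known, and vice versa
    return max(C(y + x) - C(y), C(x + y) - C(x))

a = b"abcdefgh" * 500
b = b"abcdefgh" * 500                                 # identical content
c = bytes((i * 37 + 11) % 256 for i in range(4000))   # unrelated-looking

# identical strings are close; unrelated strings are far
assert info_distance(a, b) < info_distance(a, c)
```

This compression-based approximation is the idea behind the "normalized compression distance" used in clustering applications.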

Cost of Erasure. Trivially, E(x, ε) = K(x) (i.e., erasing x is converting it to the empty string). Time-bounded version: let Eᵗ(x, ε) be the cost of erasing x using t time, and Kᵗ(x) the length of the shortest program computing x in t time. Theorem. K(x) <= Eᵗ(x, ε) <= Kᵗ(x) (up to lower-order terms). Question: Is there a hierarchy for growing t?

Energy-Time Tradeoff Hierarchy. For each large n and m = n/2, there is an x of length n and time bounds t_1(n) < t_2(n) < ... < t_m(n) such that

E^{t_1}(x, ε) > E^{t_2}(x, ε) > ... > E^{t_m}(x, ε)

Note: t_i(n) = 2^{i n}.

General Reversible Simulation. To do all (a priori unknown) computing tasks reversibly, we need to reversibly compile irreversible programs into reversible ones, and then run the reversible computation. This requires a general reversible simulation of irreversible computations.

Space Hungry (Bennett 1973). The irreversible computation runs from the input at time 0 to the output at time T through instantaneous descriptions (ID = complete computer configuration) α_1, α_2, ..., α_T. The reversible simulation keeps an auxiliary record α_1, α_2, ..., α_t of the IDs passed so far; with this history, every step has an inverse α^{-1}.

Algorithm, on input x:
1. Reversibly compute x -> f(x), saving the history α_1, α_2, ..., α_t.
2. Reversibly copy f(x).
3. Reversibly ``uncompute'' x -> f(x), erasing the history.
Output: <x, f(x)>.
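A toy rendering of Bennett's compute-copy-uncompute trick. This is a sketch in an abstract model of our own: the computation is given as a list of steps applied in order, and the Python list `history` plays the role of the auxiliary record α_1, ..., α_t:

```python
def bennett_simulate(steps, x):
    """Reversibly map x -> <x, f(x)>, for f given as a list of steps."""
    # 1. Compute forward, saving each passed ID in the history tape.
    state, history = x, []
    for step in steps:
        history.append(state)   # record the ID so this step can be undone
        state = step(state)
    # 2. Reversibly copy the output into a blank register (a CNOT fan-out).
    output = state
    # 3. Uncompute: run backward, consuming (and thereby erasing) the history.
    for _ in reversed(steps):
        state = history.pop()   # restore the previous ID
    assert history == []        # history reversibly erased
    return state, output        # state == x again: the result is <x, f(x)>

# Example: f(x) = 2*(x+1)
pair = bennett_simulate([lambda s: s + 1, lambda s: s * 2], 3)  # -> (3, 8)
```

The history grows by one ID per step, which is exactly why this simulation is space hungry: S' grows with T, as the next slide quantifies.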

Complexity. Each reversible computation is 1:1. The reversible simulation of x -> f(x) using T time and S space is x -> <x, f(x)> with T' = Θ(T) and S' = Θ(T + S).

Space Parsimonious (Bennett 89). Checkpointing: 1. compute from checkpoint ID_1 to ID_2, recording the history; 2. copy ID_2; 3. uncompute back to ID_1, erasing the history; 4. continue from checkpoint ID_2. [Figure: the four phases.]

Checkpoint levels can be modelled by the ``Reversible Pebble Game''. [Figure: checkpoints at levels 1-4 along the time axis.]

Reversible Pebble Game (pebble = ID).
1. Initially: positions 1, 2, ..., T, and n free pebbles.
2. Each step: put a free pebble on position i+1, OR remove the pebble from position i+1 — in either case only if position i carries a pebble (position 1 can always be pebbled or unpebbled).
3. Win if 1) position T gets pebbled, and 2) all n pebbles are eventually free again.

Theorem: There is a winning strategy using n pebbles for T = 2^n - 1, taking about 3^n = T^{log 3} steps. [Figure: the successive pebble configurations of the strategy for n = 3, T = 7.]
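The recursive winning strategy can be checked mechanically. A sketch in an encoding of our own (a move toggles the pebble at a position; it is legal only when the preceding position is pebbled, with position 1 always legal): pebble the first half, place the midpoint, retract the first half, recurse on the second half.

```python
def strategy(n, base=0):
    """Moves that, starting after position `base`, pebble position
    base + 2**n - 1 (leaving some intermediate pebbles in place)."""
    if n == 1:
        return [base + 1]
    half = 2 ** (n - 1)
    sub = strategy(n - 1, base)
    # pebble the midpoint, retract the first half, recurse on the second half
    return sub + [base + half] + sub[::-1] + strategy(n - 1, base + half)

def play(moves, T, n):
    """Replay `moves`, checking legality, peak pebble use, and the win."""
    pebbled, peak, hit_T = set(), 0, False
    for pos in moves:
        # toggling pos is legal iff its predecessor carries a pebble
        assert pos == 1 or (pos - 1) in pebbled, f"illegal move at {pos}"
        pebbled ^= {pos}
        peak = max(peak, len(pebbled))
        hit_T |= (T in pebbled)
    assert peak <= n and hit_T and not pebbled  # won with n pebbles
    return peak

n = 5
T = 2 ** n - 1                      # T = 31
forward = strategy(n)
game = forward + forward[::-1]      # reach T, then free all pebbles again
assert len(game) == 3 ** n - 1      # about T^{log 3} moves in total
play(game, T, n)
```

Replaying the forward moves in reverse order is itself legal because each move leaves its own predecessor untouched, which is exactly the self-inverse property the game is built on.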

Theorem (Li, Tromp, Vitanyi 98): No reversible pebble game can do better. (The proof is subtle.) So we use too much space: the simulation needs S' = S log T, which is of order (log T)^2 in the worst case — pretty bad for large T. What if we allow erasures?

Theorem (Space-Erasure, LVT 98): There is a winning strategy with n+2 pebbles and m-1 erasures for pebble games with T = m 2^n, for all m >= 1, with simulation time T' = 2 m^{1 - log 3} T^{log 3}. Corollary: We can use n - log(E-1) pebbles and E erasures instead of n pebbles.

Using less space can only be done by simulations that are not reversible pebble games (LVT98). Lange, McKenzie, Tapp 00: no extra space, but exponential time, by reversibly cycling through the configuration tree of the TM computation (the connected component of fixed-size IDs containing the start ID), using a method due to Sipser 80.

Lemma (LMT00): Reversible simulation time is 2^{O(S)} and reversible simulation space is S + log T: reversibly cycle through all IDs of the irreversible TM of size <= S (space), from input to output.

Time-Space Tradeoff (Buhrman, Tromp, Vitanyi 00/01; Williams 00): reversible pebbling in which the long intervals between pebble moves are bridged by LMT simulations.

Algorithm: reversible pebbling where each pebble move bridges an interval of m irreversible steps with the space-parsimonious LMT simulation; each such move takes time O(2^{O(m)} S).

Using k pebbles and m = T/2^k. Lemma (BTV01): Simulation time is T' = 3^k 2^{O(T/2^k)} S, and simulation space is S' = S(1 + O(k)).

Corollary 1: T' = S 3^{S'/S} 2^{O(T/2^{S'/S})}.

Some special values:
k = log log T: T' = S (log T)^{log 3} 2^{O(T/log T)}, S' = S log log T (about S log S).
k = log T: T' = S 3^{log T} 2^{O(T/2^{log T})} = O(S T^{log 3}), S' = S log T.
That is: both subexponential time and subquadratic space!

Unknown Computation Time T: simulate the first t steps of the computation consecutively for t = 2, 2^2, 2^3, ...; if the computation does not halt within t steps, reversibly undo the simulation and restart with t := 2t. The total time used is

T'' <= Σ_{t=1}^{log 2T} S 3^{S'/S} 2^{O(2^t / 2^{S'/S})} <= 2T'.
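The doubling trick is easy to state as code. A sketch in an abstract model of our own: `step` and `halted` are placeholders for one simulated step and the halting test, and "reversibly undo" is modelled by charging the same number of steps again.

```python
def simulate_unknown_T(step, halted, x0):
    """Run to completion without knowing the halting time T, by trying
    budgets t = 2, 4, 8, ... and undoing each unsuccessful attempt."""
    total, t = 0, 2
    while True:
        x, done = x0, 0
        while done < t and not halted(x):
            x, done = step(x), done + 1
            total += 1              # forward simulation steps
        if halted(x):
            return x, total
        total += done               # reversibly undo the t steps
        t *= 2                      # retry with a doubled budget

# Example: a counter that halts after exactly T = 100 steps.
state, work = simulate_unknown_T(lambda x: x + 1, lambda x: x >= 100, 0)
assert state == 100 and work <= 4 * 100   # total work stays O(T)
```

The failed attempts form a geometric series dominated by the final successful one, which is why the total stays within a constant factor of T', as the bound on the slide states.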

Lower Bound (BTV01). Lemma: If T, S, T', S' are taken average-case, then every reversible simulation of an irreversible computation has T' >= T and S' >= n + log T, where n is the input length (e.g., n = S). This is sharp, since the (space-parsimonious) LMT simulation reaches the bound on S' from above.

Big Question: Is there a reversible simulation with polynomial time and space overhead simultaneously? What we actually want is about linear time and linear space.