Complexity, Parallel Computation and Statistical Physics

Complexity, Parallel Computation and Statistical Physics. Jon Machta, Measures of Complexity workshop, Santa Fe Institute, January 13, 2011

Outline. Overview and motivation: what aspect of natural complexity are we trying to formalize? Background: statistical physics and parallel computational complexity. Depth: a useful proxy for complexity? Examples: the simple and the complex in statistical physics. Conclusions.

Collaborators: Ray Greenlaw, Cris Moore, Dan Tillberg, Ben Machta, Ken Moriarty, Xuenan Li

What's the difference? [Images of the Sun and the Earth, from NASA.] In mass, temperature, entropy, and entropy production the Earth is vastly smaller than the Sun; in complexity it is vastly greater.

Ising Model. A more tractable example from statistical physics. System states are described by spins on a lattice, $s_i = \pm 1$. The probability of system states is given by the Gibbs distribution.

Ising Model. $H = -\sum_{(i,j)} s_i s_j$, with the Gibbs distribution $P[s] = e^{-H[s]/kT}/Z$. [Snapshots of typical configurations at $T = 0$, at the critical point $T = T_c$, and at $T = \infty$.]

The Ising Critical Point. Long-range correlations. Fractal clusters of like spins. Structure on all length scales. Difficult to simulate numerically and to analyze theoretically.

Ising Model. Temperature and entropy both increase monotonically from the completely ordered state at $T = 0$ to the completely disordered state at $T = \infty$, but complexity is greatest at the critical point, far exceeding either extreme.

Question: What makes the Earth more complex than the Sun and the Ising critical point more complex than its high or low temperature phases? One answer: A long history.

History and Complexity. Charles Bennett, in SFI Studies in the Sciences of Complexity, Vol. 7 (1990): The emergence of a complex system from simple initial conditions requires a long history.

The Earth and the Sun are both 4.5 billion years old... but the present state of the Sun does not remember the full 4.5-billion-year history (except via conserved quantities), while the present state of the Earth (biosphere) is contingent on a very long evolutionary process. The Earth does remember its past.

High- and low-temperature states of the Ising model can be sampled using a small number of sweeps of a Monte Carlo algorithm. The critical state of the Ising model requires a number of sweeps that scales as a power of the system size (critical slowing down).

History and Complexity. Charles Bennett, in SFI Studies in the Sciences of Complexity, Vol. 7 (1990): The emergence of a complex system from simple initial conditions requires a long history. History can be quantified in terms of the computational complexity (running time) of simulating states of the system.

What computational complexity measure best captures physical complexity? Complexity emerges from interactions, not from signal propagation, so discount communication. Size alone should not contribute to physical complexity, so discount hardware. These considerations suggest parallel time as the appropriate computational complexity measure.

History and Complexity. The emergence of a complex system from simple initial conditions requires a long history. History can be quantified in terms of the computational complexity of simulating states of the system. The appropriate computational measure of history is parallel time.

Models of Computation

Parallel Random Access Machine (PRAM). [Diagram: a controller, processors 1, 2, 3, ..., m, and a global memory.] Each processor runs the same program but has a distinct label. Each processor communicates with any memory cell in a single time step. Primary resources: parallel time and number of processors.

Boolean Circuit Family. [Diagram: a layered circuit of NOR gates, inputs at the bottom and outputs at the top; depth runs from input to output, width across a level.] Gates are evaluated one level at a time from input to output with no feedback. There is one hardwired circuit for each problem size. Primary resources: depth (number of levels) corresponds to parallel time; width (maximum number of gates in a level) corresponds to number of processors; work is the total number of gates.

Parallel Computing. Adding n numbers can be carried out in O(log n) steps using O(n) processors. [Diagram: a binary tree of pairwise additions over $X_1, \ldots, X_8$ computing $\sum X_i$ in log n levels.] Connected components of a graph can be found in O(log^2 n) time using polynomially many processors.
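A minimal Python sketch of the tree-reduction idea behind the O(log n) parallel sum. The levels are simulated serially here; on a PRAM each level would be one parallel step, with one processor per pair. The function name and the padding convention are illustrative choices, not part of the talk.

```python
def pram_style_sum(xs):
    """Tree reduction: O(log n) levels of pairwise additions.

    On a PRAM each level is a single parallel step, with one processor
    per pair; here the levels are simulated serially for clarity."""
    vals = list(xs)
    levels = 0
    while len(vals) > 1:
        # All additions within a level are independent of one another,
        # so ~n/2 processors finish the whole level in one time step.
        if len(vals) % 2 == 1:
            vals.append(0)                      # pad to an even length
        vals = [vals[i] + vals[i + 1] for i in range(0, len(vals), 2)]
        levels += 1
    return (vals[0] if vals else 0), levels

total, depth = pram_style_sum(range(1, 9))      # eight numbers -> depth 3
print(total, depth)                              # 36 3
```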

Complexity Classes and P-completeness. P is the class of feasible problems: solvable with polynomial work. NC is the class of problems efficiently solved in parallel (polylog depth and polynomial work, NC ⊆ P). Are there feasible problems that cannot be solved efficiently in parallel (P ≠ NC)? P-complete problems are the hardest problems in P to solve in parallel; it is believed they are inherently sequential, i.e., not solvable in polylog depth. The Circuit Value Problem is P-complete. [Slide illustration: a small circuit of NOR gates with inputs TRUE, FALSE, TRUE.]
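To make the Circuit Value Problem concrete, here is a small Python sketch that evaluates a levelized NOR circuit one level at a time. The example circuit and gate names are made up for illustration. Evaluating level by level costs one parallel step per level, so the parallel time equals the circuit depth; the point of P-completeness is that deep circuits are believed not to be compressible to polylog depth.

```python
def nor(a, b):
    return not (a or b)

def evaluate(levels, inputs):
    """levels: a list of levels; each level maps a gate name to its two
    inputs, which are names defined on strictly earlier levels (or circuit
    inputs). Gates within one level are mutually independent, so on
    parallel hardware each level costs one time step."""
    values = dict(inputs)
    for level in levels:
        values.update({g: nor(values[a], values[b]) for g, (a, b) in level.items()})
    return values

circuit = [
    {"g1": ("x1", "x2"), "g2": ("x2", "x3")},   # level 1
    {"out": ("g1", "g2")},                       # level 2
]
print(evaluate(circuit, {"x1": True, "x2": False, "x3": True})["out"])  # True
```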

Sampling Complexity. [Diagram: a circuit of NOR gates whose inputs are random bits and whose output describes a typical system state.] Monte Carlo simulations convert random bits into descriptions of typical system states. What is the depth of the shallowest feasible circuit (equivalently, the running time of the fastest PRAM program) that generates typical states? Depth is a property of systems in statistical physics.

Depth of Natural Systems The depth of a natural system is the time complexity of the fastest parallel Monte Carlo algorithm (PRAM or Boolean circuit family with random inputs) that generates typical system states (or histories) with polynomial hardware.

Comments on "fastest". A natural system should not be called complex because it emerges slowly via an inefficient process; many systems that appear to have a long history do not, in fact, have much depth. Depth is uncomputable: upper bounds can be found by demonstrating specific parallel sampling algorithms, but lower bounds are difficult to establish. This is a necessary feature, not a bug!

Maximal Property of Depth. For a system AB composed of independent subsystems A and B, the depth of the whole is the maximum over the subsystems, $D(AB) = \max\big(D(A), D(B)\big)$. This follows immediately from parallelism. Depth is therefore intensive (nearly independent of size) for homogeneous systems with short-range correlations.

Examples from statistical physics Random walks Preferential attachment networks The Ising model Diffusion limited aggregation

Random Walks. [Figure: a random-walk trajectory, from Wikipedia.] There is apparent history in the random walk, since its position at time t+1 is obtained from its position at time t by adding a random step. But since addition can be carried out in logarithmic parallel time, a random walk of length t has depth of order log t.

Examples from statistical physics Random walks Preferential attachment networks The Ising model Diffusion limited aggregation

Preferential Attachment Networks. Barabasi, Albert, Science 286, 509 (1999); Krapivsky, Redner, Leyvraz, PRL 85, 4629 (2000). Add nodes one at a time, connecting each new node to an old node according to a "rich-get-richer" preferential attachment rule in which the probability of attaching to node n is proportional to $k_n(t)^{\alpha}$, where $k_n(t)$ is the degree of node n at time t. [Slide illustration: a small network with attachment probabilities 1/3, 1/9, 1/3, 1/9, 1/9.]
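A minimal sequential generator for this growth rule, assuming (as the phase-diagram slide suggests) an attachment probability proportional to degree raised to the power alpha. The function name, starting configuration, and parameters are illustrative.

```python
import random

def grow_pa_network(n_nodes, alpha, seed=0):
    """Grow a network one node at a time; each new node attaches to an
    existing node chosen with probability proportional to degree**alpha."""
    rng = random.Random(seed)
    degree = [1, 1]                       # start from a single edge 0 -- 1
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        weights = [k ** alpha for k in degree]
        target = rng.choices(range(len(degree)), weights=weights, k=1)[0]
        edges.append((new, target))
        degree[target] += 1
        degree.append(1)
    return edges, degree

# alpha = 1 is the scale-free (rich-get-richer) case; alpha = 0 is uniformly
# random attachment. This naive generator is inherently sequential and does
# O(N) work per node; the redirection construction on the next slides gives
# a much faster, parallelizable route to the alpha = 1 case.
edges, degree = grow_pa_network(5_000, alpha=1.0, seed=42)
print(max(degree))
```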

Behavior of Growing Networks. P(k) is the degree distribution. Depending on $\alpha$, the network is random, with $P(k) \sim \exp(-k)$; has a stretched-exponential degree distribution, $P(k) \sim \exp(-k^{\beta})$; is scale free, with $P(k) \sim k^{-\nu}$; or develops a gel node. There is a discontinuous structural transition at $\alpha = 1$.

Redirection. Krapivsky, Redner, Leyvraz, PRE 63, 066123 (2001). I. Generate a random sequential network. II. With probability r, color each edge R (redirect); with probability 1 - r, color it T (terminal). III. New links are obtained by tracing R edges and stopping after traversing a T edge.
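A plain sequential rendering of steps I-III, written as a sketch: the interpretation of "tracing R edges" (follow redirect edges upward until the first terminal edge is traversed, attaching at the root if the trace reaches it) is an assumption drawn from the terse slide text. The parallel gain comes from resolving all the traces at once by pointer jumping, as the next slide explains.

```python
import random

def redirection_network(n_nodes, r, seed=0):
    """Redirection construction: (I) each node i > 0 picks a uniformly random
    earlier node as its provisional parent (a random sequential network);
    (II) each such edge is colored R (redirect) with probability r, else T
    (terminal); (III) node i's final target is found by tracing R edges
    upward, stopping after the first T edge is traversed."""
    rng = random.Random(seed)
    parent = [None] + [rng.randrange(i) for i in range(1, n_nodes)]
    redirect = [False] + [rng.random() < r for _ in range(1, n_nodes)]

    def resolve(i):
        node = i
        while node != 0 and redirect[node]:
            node = parent[node]        # follow an R edge upward and look again
        return parent[node] if node != 0 else 0   # trace hit the root: attach there

    return [(i, resolve(i)) for i in range(1, n_nodes)]

# With r = 1/2 the redirection rule reproduces linear preferential attachment,
# i.e. the scale-free case. The chains of R edges are the only dependencies:
# their longest length is ~log N, and resolving all of them simultaneously by
# pointer jumping takes ~log log N parallel steps, which is the idea behind
# the parallel algorithm on the next slide.
edges = redirection_network(10_000, r=0.5, seed=1)
```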

Parallel Algorithm for Scale-Free Networks. Redirection provides a fast parallel algorithm for the scale-free case. The longest redirected path is ~log N; tracing such a path in parallel takes ~log log N; so the depth of scale-free networks is ~log log N.

Depth of PA Networks. Mean-field approximation: $T = \log(N)/\log(2/\alpha)$. [Plot: $T/\log N$ versus $\alpha$ from 0 to 1, comparing the mean-field curve with simulation results; annotations mark log N, log log N, and constant scaling regimes.]

Examples from statistical physics Random walks Preferential attachment networks The Ising model Diffusion limited aggregation

Ising model. The best known parallel algorithm for the (3D) Ising model, the Swendsen-Wang algorithm, equilibrates at the critical point in a time that scales as a small power of the system size: the dynamic exponent $z \approx 0.5$ at $T = T_c$, while $z = 0$ (logarithmic equilibration) for $T \neq T_c$. More generally, depth tends to be a maximum at transitions between ordered and disordered states.
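The quoted exponents refer to the 3D model; the following is a minimal 2D sketch of one Swendsen-Wang sweep, included only to make the algorithm concrete. It uses the standard bond-activation probability 1 - exp(-2/T); the lattice size, temperature, and sweep count are illustrative, and this is not the code behind the quoted results. The cluster-identification step is a connected-components computation, which is exactly the kind of step parallel hardware can perform in polylog depth.

```python
import numpy as np

def swendsen_wang_sweep(spins, T, rng):
    """One Swendsen-Wang sweep for the 2D ferromagnetic Ising model with
    periodic boundaries: activate bonds between aligned neighbours with
    probability p = 1 - exp(-2/T), find clusters of activated bonds with
    union-find, and give every cluster an independent random orientation."""
    L = spins.shape[0]
    p = 1.0 - np.exp(-2.0 / T)
    parent = np.arange(L * L)

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]      # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for x in range(L):
        for y in range(L):
            i = x * L + y
            for nx, ny in (((x + 1) % L, y), (x, (y + 1) % L)):
                if spins[x, y] == spins[nx, ny] and rng.random() < p:
                    union(i, nx * L + ny)

    new_spin = {}                              # one random spin per cluster
    for x in range(L):
        for y in range(L):
            root = find(x * L + y)
            if root not in new_spin:
                new_spin[root] = rng.choice((-1, 1))
            spins[x, y] = new_spin[root]
    return spins

rng = np.random.default_rng(0)
L, T_c = 32, 2.269                             # 2D critical temperature
spins = rng.choice((-1, 1), size=(L, L))
for _ in range(100):
    swendsen_wang_sweep(spins, T_c, rng)
print(spins.mean())
```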

Examples from statistical physics Random walks Preferential attachment networks The Ising model Diffusion limited aggregation

Diffusion-Limited Aggregation. Witten and Sander, PRL 47, 1400 (1981). Particles are added one at a time, with sticking probabilities given by the solution of Laplace's equation. The result is a self-organized fractal object with $d_f = 1.715$ (2D). Physical systems: fluid flow in porous media, electrodeposition, bacterial colonies.

Random Walk Dynamics for DLA. [Animation sequence: particles released far from the cluster perform random walks and stick when they first touch the aggregate, building it up one particle at a time.]

Depth of DLA. Theorem: determining the shape of an aggregate from the random walks of the constituent particles is a P-complete problem. Proof sketch: reduce the Circuit Value Problem to DLA dynamics. [Figure: a DLA realization of a circuit gadget, with input 1, input 2, internal elements a, b, c, d, a power line, and an output.] Caveats: 1. P ≠ NC is not proven. 2. The average case may be easier than the worst case. 3. Alternative dynamics may be faster than random walk dynamics.

Parallel Algorithm for DLA. D. Tillberg and JM, PRE 69, 051403 (2004). 1. Start with a seed particle at the origin and N walk trajectories. 2. In parallel, move all particles along their trajectories to tentative sticking points on the tentative cluster, which initially is just the seed particle at the origin. 3. The new tentative cluster is obtained by removing all particles that interfere with earlier particles. 4. Continue until all particles are correctly placed. [Figure: the tentative cluster at step N and at step N+1, with particles numbered.]
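A toy, simplified variant of this iterative scheme, written as a sketch rather than the algorithm of the paper: trajectories are fixed in advance, tentative sticking points are computed against the confirmed cluster only, and in each round the longest prefix of particles whose walks are not intercepted by earlier tentative placements is confirmed. The lattice, box size, trajectory length, and the choice to simply drop walkers that never reach the cluster are arbitrary toy assumptions.

```python
import math
import random

NBRS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def sticking_step(traj, cluster):
    """Index of the first position along traj adjacent to the cluster, or None."""
    for k, (x, y) in enumerate(traj):
        if (x, y) in cluster:
            return None
        if any((x + dx, y + dy) in cluster for dx, dy in NBRS):
            return k
    return None

def make_trajectory(rng, launch_radius=10, box=16, length=30000):
    """A fixed lattice random walk, clamped to a box so it cannot wander off."""
    ang = rng.uniform(0.0, 2.0 * math.pi)
    x = int(round(launch_radius * math.cos(ang)))
    y = int(round(launch_radius * math.sin(ang)))
    traj = []
    for _ in range(length):
        traj.append((x, y))
        dx, dy = rng.choice(NBRS)
        x = max(-box, min(box, x + dx))
        y = max(-box, min(box, y + dy))
    return traj

def parallel_dla(n_particles=30, seed=1):
    rng = random.Random(seed)
    trajs = [make_trajectory(rng) for _ in range(n_particles)]  # all randomness fixed up front
    cluster = {(0, 0)}                                          # confirmed cluster: the seed
    placed, rounds = 0, 0
    while placed < n_particles:
        rounds += 1
        # "Parallel" phase: every unplaced particle finds its tentative sticking
        # point against the confirmed cluster only (independent computations).
        tentative = {i: sticking_step(trajs[i], cluster) for i in range(placed, n_particles)}
        # Consistency phase: confirm the longest prefix of particles whose walks
        # are not intercepted by the tentative placements of earlier particles.
        newly = set()
        i = placed
        while i < n_particles:
            k = tentative[i]
            if k is None:
                break
            site = trajs[i][k]
            intercepted = site in newly or any(
                p in newly or any((p[0] + dx, p[1] + dy) in newly for dx, dy in NBRS)
                for p in trajs[i][:k])
            if intercepted:
                break
            cluster.add(site)
            newly.add(site)
            i += 1
        if i == placed:     # walker never reached the cluster: drop it (toy choice)
            placed += 1
        else:
            placed = i
    return cluster, rounds

cluster, rounds = parallel_dla()
print(len(cluster), "sites placed in", rounds, "rounds")
```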

Efficiency of the Algorithm. DLA is a tree whose structural depth $D_s$ scales as the radius of the cluster. The running time T of the algorithm is asymptotically proportional to the structural depth. [Plot: $D_s/T$ versus $T$.]

Summary Depth (parallel time complexity of sampling distributions) is a property of any natural system described in the framework of statistical physics. Depth captures some salient features of the intuitive notion of complexity.