CS 580: Algorithm Design and Analysis. Jeremiah Blocki Purdue University Spring 2018


Chapter 9 PSPACE: A Class of Problems Beyond NP Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved. 2

Geography Game
Geography. Alice names the capital city c of the country she is in. Bob names a capital city c' that starts with the letter on which c ends. Alice and Bob repeat this game until one player is unable to continue. Does Alice have a forced win?
Ex. Budapest → Tokyo → Ottawa → Ankara → Amsterdam → Moscow → Washington → Nairobi → …
Geography on graphs. Given a directed graph G = (V, E) and a start node s, two players alternate turns by following, if possible, an edge out of the current node to an unvisited node. Can the first player guarantee to make the last legal move?
Remark. Some problems (especially involving 2-player games and AI) defy classification according to P, EXPTIME, NP, and NP-complete.
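
A minimal sketch (not from the slides) of deciding the graph version of the game by brute force: the player to move wins iff some move leaves the opponent in a losing position. The exhaustive search over visited sets takes exponential time in general, consistent with the remark above; the graph encoding and function name are illustrative assumptions.

def first_player_wins(graph, current, visited=None):
    """graph: dict mapping each node to its list of successors."""
    if visited is None:
        visited = {current}
    moves = [v for v in graph.get(current, []) if v not in visited]
    if not moves:
        return False  # the player to move has no legal move and loses
    # The player to move wins iff some move puts the opponent in a losing position.
    return any(not first_player_wins(graph, v, visited | {v}) for v in moves)

if __name__ == "__main__":
    g = {"s": ["a", "b"], "a": ["c"], "b": [], "c": []}
    print(first_player_wins(g, "s"))  # True: moving s -> b leaves the opponent stuck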

9.1 PSPACE

PSPACE
P. Decision problems solvable in polynomial time.
PSPACE. Decision problems solvable in polynomial space.
Observation. P ⊆ PSPACE, since a poly-time algorithm can consume only polynomial space.

PSPACE
Binary counter. Count from 0 to 2^n − 1 in binary.
Algorithm. Use an n-bit odometer.
Claim. 3-SAT is in PSPACE.
Pf. Enumerate all 2^n possible truth assignments using the counter. Check each assignment to see if it satisfies all clauses.
Theorem. NP ⊆ PSPACE.
Pf. Consider an arbitrary problem Y in NP. Since Y ≤_P 3-SAT, there exists an algorithm that solves Y in poly-time plus a polynomial number of calls to a 3-SAT black box. Can implement the black box in poly-space.
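
A small runnable sketch of the enumeration argument, under an assumed clause encoding (a clause is a tuple of nonzero integers, where k stands for x_k and −k for its negation). The loop variable plays the role of the n-bit odometer: time is exponential, but the extra space is just the counter and one assignment.

def sat_by_enumeration(n, clauses):
    """Decide satisfiability by checking all 2^n assignments with a counter."""
    for counter in range(2 ** n):                       # the n-bit odometer
        assign = [(counter >> (i - 1)) & 1 for i in range(1, n + 1)]
        if all(any((lit > 0) == bool(assign[abs(lit) - 1]) for lit in clause)
               for clause in clauses):
            return True
    return False

if __name__ == "__main__":
    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
    print(sat_by_enumeration(3, [(1, 2), (-1, 3), (-2, -3)]))  # True, e.g. x1=1, x2=0, x3=1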

9.3 Quantified Satisfiability

Quantified Satisfiability
QSAT. Let Φ(x1, …, xn) be a Boolean CNF formula. Is the following propositional formula true?
∃x1 ∀x2 ∃x3 ∀x4 … ∀xn−1 ∃xn Φ(x1, …, xn)   [assume n is odd]
Intuition. Amy picks a truth value for x1, then Bob for x2, then Amy for x3, and so on. Can Amy satisfy Φ no matter what Bob does?
Ex. (x1 ∨ x2) ∧ (¬x2 ∨ x3) ∧ (¬x1 ∨ ¬x2 ∨ x3)
Yes. Amy sets x1 true; Bob sets x2; Amy sets x3 to be the same as x2.
Ex. (x1 ∨ x2) ∧ (¬x2 ∨ x3) ∧ (¬x1 ∨ ¬x2 ∨ ¬x3)
No. If Amy sets x1 false, Bob sets x2 false and Amy loses; if Amy sets x1 true, Bob sets x2 true and Amy loses.

QSAT is in PSPACE
Theorem. QSAT ∈ PSPACE.
Pf. Recursively try all possibilities. Only need one bit of information from each subproblem. The amount of space is proportional to the depth of the function call stack.
[Figure: recursion tree branching on x1 = 0/1, x2 = 0/1, x3 = 0/1 down to the eight leaves (0,0,0), (0,0,1), …, (1,1,1); an existential node returns true iff either subproblem is true, a universal node returns true iff both subproblems are true.]
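
A runnable version of this recursion is sketched below, using the same assumed clause encoding as the 3-SAT example above, with x1 existentially quantified (Amy) and x2 universally (Bob), alternating from there. The only state kept per level is the partial assignment, so the space used is polynomial.

def eval_clauses(clauses, assign):
    return all(any((lit > 0) == bool(assign[abs(lit) - 1]) for lit in clause)
               for clause in clauses)

def qsat(n, clauses, assign=()):
    i = len(assign)                     # number of variables already fixed
    if i == n:
        return eval_clauses(clauses, assign)
    sub = [qsat(n, clauses, assign + (b,)) for b in (0, 1)]
    if i % 2 == 0:                      # x_{i+1} is existential (Amy's move)
        return sub[0] or sub[1]         # true iff either subproblem is true
    else:                               # x_{i+1} is universal (Bob's move)
        return sub[0] and sub[1]        # true iff both subproblems are true

if __name__ == "__main__":
    # (x1 or x2) and (not x2 or x3) and (not x1 or not x2 or x3): Amy has a winning strategy
    print(qsat(3, [(1, 2), (-2, 3), (-1, -2, 3)]))  # True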

9.4 Planning Problem

15-Puzzle
8-puzzle, 15-puzzle. [Sam Loyd 1870s] Board: 3-by-3 grid of tiles labeled 1–8 (4-by-4 with tiles 1–15 for the 15-puzzle). Legal move: slide a neighboring tile into the blank (white) square. Find a sequence of legal moves to transform the initial configuration into the goal configuration.
[Figure: an initial configuration, an intermediate move, and the goal configuration of the 8-puzzle.]

Planning Problem
Conditions. Set C = {C1, …, Cn}.
Initial configuration. Subset c0 ⊆ C of conditions initially satisfied.
Goal configuration. Subset c* ⊆ C of conditions we seek to satisfy.
Operators. Set O = {O1, …, Ok}. To invoke operator Oi, must satisfy certain prereq conditions. After invoking Oi, certain conditions become true, and certain conditions become false.
PLANNING. Is it possible to apply a sequence of operators to get from the initial configuration to the goal configuration?
Examples. 15-puzzle. Rubik's cube. Logistical operations to move people, equipment, and materials.

Planning Problem: 8-Puzzle
Planning example. Can we solve the 8-puzzle?
Conditions. Cij, 1 ≤ i, j ≤ 9. [Cij means tile i is in square j; tile 9 denotes the blank.]
Initial state. c0 = {C11, C22, …, C66, C78, C87, C99}. [tiles 7 and 8 swapped, blank in square 9]
Goal state. c* = {C11, C22, …, C66, C77, C88, C99}.
Operators. Example operator Oi (slide tile 7 from square 8 into the blank square 9): precondition to apply Oi = {C11, C22, …, C66, C78, C87, C99}; after invoking Oi, conditions C79 and C98 become true; after invoking Oi, conditions C78 and C99 become false.
Solution. No solution to the 8-puzzle or 15-puzzle!

Diversion: Why is the 8-Puzzle Unsolvable?
8-puzzle invariant. Any legal move preserves the parity of the number of pairs of pieces in reverse order (inversions).
[Figure: example boards with their inversion counts, e.g. a board with 3 inversions (1-3, 2-3, 7-8) and one with 5 inversions (1-3, 2-3, 7-8, 5-8, 5-6); the goal configuration 1 2 3 4 5 6 7 8 has 0 inversions, while the initial configuration with tiles 7 and 8 swapped has 1 inversion, so it cannot reach the goal.]
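
The invariant is easy to check in code. A minimal sketch, using my own convention of encoding the board in row-major order with 0 for the blank; it tests the necessary condition implied by the invariant.

def inversions(tiles):
    """tiles: the 9 cells in row-major order, 0 denoting the blank."""
    seq = [t for t in tiles if t != 0]
    return sum(1 for a in range(len(seq)) for b in range(a + 1, len(seq))
               if seq[a] > seq[b])

def same_parity_as_goal(tiles):
    return inversions(tiles) % 2 == 0       # the goal 1..8 has 0 inversions

if __name__ == "__main__":
    goal    = [1, 2, 3, 4, 5, 6, 7, 8, 0]
    swapped = [1, 2, 3, 4, 5, 6, 8, 7, 0]   # tiles 7 and 8 swapped: 1 inversion
    print(same_parity_as_goal(goal), same_parity_as_goal(swapped))  # True False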

Planning Problem: Binary Counter
Planning example. Can we increment an n-bit counter from the all-zeroes state to the all-ones state? [Ci corresponds to bit i = 1.]
Conditions. C1, …, Cn.
Initial state. c0 = ∅. [all 0s]
Goal state. c* = {C1, …, Cn}. [all 1s]
Operators. O1, …, On. To invoke operator Oi, must satisfy C1, …, Ci−1 [the i−1 least significant bits are 1]. After invoking Oi, condition Ci becomes true [set bit i to 1]. After invoking Oi, conditions C1, …, Ci−1 become false [set the i−1 least significant bits to 0].
Solution. { } → {C1} → {C2} → {C1, C2} → {C3} → {C3, C1} → …
Observation. Any solution requires at least 2^n − 1 steps.

Planning Problem: In Exponential Space
Configuration graph G. Include a node for each of the 2^n possible configurations. Include an edge from configuration c' to configuration c'' if one of the operators can convert c' to c''.
PLANNING. Is there a path from c0 to c* in the configuration graph?
Claim. PLANNING is in EXPTIME.
Pf. Run BFS to find a path from c0 to c* in the configuration graph.
Note. The configuration graph can have 2^n nodes, and the shortest path can have length 2^n − 1 (e.g., the binary counter).
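
A compact sketch of this BFS, under an assumed representation: a configuration is a frozenset of satisfied conditions and an operator is a (preconditions, becomes-true, becomes-false) triple. The demo instance is the n-bit counter from the previous slide, whose shortest plan indeed has 2^n − 1 steps.

from collections import deque

def planning_bfs(c0, goal, operators):
    """Return the length of a shortest plan from c0 to goal, or None if none exists."""
    start, target = frozenset(c0), frozenset(goal)
    dist = {start: 0}
    queue = deque([start])
    while queue:
        c = queue.popleft()
        if c == target:
            return dist[c]
        for pre, add, delete in operators:
            if pre <= c:                      # preconditions satisfied
                nxt = (c - delete) | add
                if nxt not in dist:
                    dist[nxt] = dist[c] + 1
                    queue.append(nxt)
    return None

if __name__ == "__main__":
    n = 4   # O_i needs C_1..C_{i-1}, makes C_i true, makes C_1..C_{i-1} false
    ops = [(frozenset(range(1, i)), frozenset({i}), frozenset(range(1, i)))
           for i in range(1, n + 1)]
    print(planning_bfs(set(), set(range(1, n + 1)), ops))   # 15 == 2**n - 1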

Planning Problem: In Polynomial Space
Theorem. PLANNING is in PSPACE.
Pf. Suppose there is a path from c1 to c2 of length L. The path from c1 to the midpoint and from the midpoint to c2 each have length ≤ L/2. Enumerate all possible midpoints (using a binary counter) and apply recursively. Depth of recursion = log2 L.

boolean hasPath(c1, c2, L) {
   if (L ≤ 1) return correct answer
   foreach configuration c' {
      boolean x = hasPath(c1, c', L/2)
      boolean y = hasPath(c', c2, L/2)
      if (x and y) return true
   }
   return false
}
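
For contrast with the pseudocode above, here is a runnable sketch of the same divide-and-conquer recursion (same assumed operator representation as the BFS sketch). Each level of recursion stores only one midpoint configuration and the depth is about log2 L, so the space stays polynomial even though the running time does not.

from itertools import combinations

def has_path(c1, c2, L, conditions, operators):
    """Is there a path of length <= L from configuration c1 to c2?"""
    if L <= 1:
        return c1 == c2 or any(pre <= c1 and (c1 - delete) | add == c2
                               for pre, add, delete in operators)
    left, right = L // 2, L - L // 2         # split a length-<=L path into two halves
    conds = list(conditions)
    for r in range(len(conds) + 1):          # enumerate all 2^n candidate midpoints
        for mid in combinations(conds, r):
            m = frozenset(mid)
            if has_path(c1, m, left, conditions, operators) and \
               has_path(m, c2, right, conditions, operators):
                return True
    return False

if __name__ == "__main__":
    n = 3
    conds = set(range(1, n + 1))
    ops = [(frozenset(range(1, i)), frozenset({i}), frozenset(range(1, i)))
           for i in range(1, n + 1)]
    print(has_path(frozenset(), frozenset(conds), 2 ** n, conds, ops))  # True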

9.5 PSPACE-Complete

PSPACE-Complete
PSPACE. Decision problems solvable in polynomial space.
PSPACE-complete. Problem Y is PSPACE-complete if (i) Y is in PSPACE and (ii) for every problem X in PSPACE, X ≤_P Y.
Theorem. [Stockmeyer-Meyer 1973] QSAT is PSPACE-complete.
Theorem. PSPACE ⊆ EXPTIME.
Pf. The previous algorithm solves QSAT in exponential time, and QSAT is PSPACE-complete.
Summary. P ⊆ NP ⊆ PSPACE ⊆ EXPTIME. It is known that P ≠ EXPTIME, but it is unknown which inclusion is strict; conjectured that all are.

PSPACE-Complete Problems
More PSPACE-complete problems.
Competitive facility location.
Natural generalizations of games: Othello, Hex, Geography, Rush Hour, Instant Insanity, Shanghai, go-moku, Sokoban.
Given a memory-restricted Turing machine, does it terminate in at most k steps?
Do two regular expressions describe different languages?
Is it possible to move and rotate a complicated object with attachments through an irregularly shaped corridor?
Is a deadlock state possible within a system of communicating processors?

Competitive Facility Location
Input. Graph with positive node weights, and target B.
Game. Two competing players alternate in selecting nodes. Not allowed to select a node if any of its neighbors has been selected.
Competitive facility location. Can the second player guarantee at least B units of profit?
[Figure: a graph on 10 nodes with weights 10, 1, 5, 15, 5, 1, 5, 1, 15, 10.]
Yes if B = 20; no if B = 25.

Competitive Facility Location
Claim. COMPETITIVE-FACILITY is PSPACE-complete.
Pf. To solve it in poly-space, use recursion like QSAT, but at each step there are up to n choices instead of 2.
To show that it is complete, we show that QSAT polynomial-time reduces to it. Given an instance of QSAT, we construct an instance of COMPETITIVE-FACILITY such that player 2 can force a win iff the QSAT formula is false.
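
Below is a minimal sketch of the poly-space recursion for the game itself (not the reduction). The game-ending convention, that play stops as soon as the player to move has no available node, and the input encoding are my assumptions; player 1 moves first and tries to keep player 2 below B.

def p2_can_reach(graph, weight, B, chosen=frozenset(), turn=1, p2_profit=0):
    """graph: dict node -> list of neighbors; turn 1 = player 1, turn 2 = player 2."""
    available = [v for v in graph
                 if v not in chosen and all(u not in chosen for u in graph[v])]
    if not available:
        return p2_profit >= B           # game over: did player 2 reach the target?
    if turn == 2:                       # player 2 needs one good move
        return any(p2_can_reach(graph, weight, B, chosen | {v}, 1,
                                p2_profit + weight[v]) for v in available)
    else:                               # player 1 is adversarial: player 2 must win against every choice
        return all(p2_can_reach(graph, weight, B, chosen | {v}, 2, p2_profit)
                   for v in available)

if __name__ == "__main__":
    path = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
    w = {v: 1 for v in path}
    print(p2_can_reach(path, w, 1), p2_can_reach(path, w, 2))  # True False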

Competitive Facility Location
Construction. (Assume n is odd.) Given an instance Φ(x1, …, xn) = C1 ∧ C2 ∧ … ∧ Ck of QSAT.
Include a node for each literal and its negation, and connect them. [At most one of xi and its negation can be chosen.]
Choose c ≥ k + 2, and put weight c^i on literal xi and on its negation; set B = c^{n−1} + c^{n−3} + … + c^4 + c^2 + 1. [Ensures variables are selected in the order xn, xn−1, …, x1.]
As is, player 2 will lose by 1 unit: she can collect only c^{n−1} + c^{n−3} + … + c^4 + c^2.
[Figure: for each variable xi, a pair of adjacent nodes xi and ¬xi, both of weight c^i (10^i in the picture).]

Competitive Facility Location
Construction. Given an instance Φ(x1, …, xn) = C1 ∧ C2 ∧ … ∧ Ck of QSAT.
Give player 2 one last move on which she can try to win. For each clause Cj, add a node with value 1 and an edge to each of its literals.
Player 2 can make this last move iff the truth assignment defined alternately by the players failed to satisfy some clause.
[Figure: the variable gadgets as before, plus a weight-1 clause node joined by edges to the literals of its clause.]

Chapter 11 Approximation Algorithms Slides by Kevin Wayne. Copyright @ 2005 Pearson-Addison Wesley. All rights reserved. 25

Approximation Algorithms
Q. Suppose I need to solve an NP-hard problem. What should I do?
A. Theory says you're unlikely to find a poly-time algorithm. Must sacrifice one of three desired features: solve the problem to optimality, solve the problem in poly-time, solve arbitrary instances of the problem.
ρ-approximation algorithm. Guaranteed to run in poly-time. Guaranteed to solve an arbitrary instance of the problem. Guaranteed to find a solution within ratio ρ of the true optimum.
Challenge. Need to prove a solution's value is close to the optimum, without even knowing what the optimum value is!

11.1 Load Balancing

Load Balancing
Input. m identical machines; n jobs, job j has processing time tj. Job j must run contiguously on one machine. A machine can process at most one job at a time.
Def. Let J(i) be the subset of jobs assigned to machine i. The load of machine i is Li = Σ_{j ∈ J(i)} tj.
Def. The makespan is the maximum load on any machine: L = max_i Li.
Load balancing. Assign each job to a machine to minimize the makespan.

Load Balancing: List Scheduling
List-scheduling algorithm. Consider the n jobs in some fixed order. Assign job j to the machine whose load is smallest so far.

List-Scheduling(m, n, t1, t2, …, tn) {
   for i = 1 to m {
      Li ← 0                     // load on machine i
      J(i) ← ∅                   // jobs assigned to machine i
   }
   for j = 1 to n {
      i = argmin_k Lk            // machine i has smallest load
      J(i) ← J(i) ∪ {j}          // assign job j to machine i
      Li ← Li + tj               // update load of machine i
   }
   return J(1), …, J(m)
}

Implementation. O(n log m) using a priority queue.
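
A short sketch of the O(n log m) implementation mentioned on the slide, keeping machine loads in a min-heap; the function name and job encoding are illustrative.

import heapq

def list_schedule(m, times):
    """Greedy list scheduling; returns (makespan, machine index chosen for each job)."""
    loads = [(0, i) for i in range(m)]          # (load, machine) pairs in a min-heap
    heapq.heapify(loads)
    assignment = []
    for t in times:
        load, i = heapq.heappop(loads)          # machine with the smallest load so far
        assignment.append(i)
        heapq.heappush(loads, (load + t, i))    # update its load
    return max(load for load, _ in loads), assignment

if __name__ == "__main__":
    print(list_schedule(3, [2, 3, 4, 6, 2, 2])[0])   # 8 (an optimal schedule achieves 7)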

Load Balancing: List Scheduling Analysis Theorem. [Graham, 1966] Greedy algorithm is a 2-approximation. First worst-case analysis of an approximation algorithm. Need to compare resulting solution with optimal makespan L*. 30

Load Balancing: List Scheduling Analysis
Theorem. Greedy algorithm is a 2-approximation.
Pf. Consider the load Li of the bottleneck machine i. Let j be the last job scheduled on machine i. When job j was assigned to machine i, machine i had the smallest load. Its load before the assignment was Li − tj, so Li − tj ≤ Lk for all 1 ≤ k ≤ m.
[Figure: machine schedules; the blue jobs scheduled before j fill machine i up to time Li − tj, and job j finishes at L = Li.]

Load Balancing: List Scheduling Analysis
Lemma 1. The optimal makespan L* ≥ max_j tj. [Some machine must process the most time-consuming job.]
Lemma 2. The optimal makespan L* ≥ (1/m) Σ_j tj. [One of the m machines must do at least a 1/m fraction of the total work.]
Pf of theorem, continued. Sum the inequalities Li − tj ≤ Lk over all k and divide by m:
   Li − tj ≤ (1/m) Σ_k Lk = (1/m) Σ_k tk ≤ L*.   (Lemma 2)
Now
   Li = (Li − tj) + tj ≤ L* + L* = 2 L*.   (using the bound above and Lemma 1 for tj ≤ L*)

Load Balancing: List Scheduling Analysis
Q. Is our analysis tight?
A. Essentially yes. Ex: m machines, m(m − 1) jobs of length 1 followed by one job of length m.
[Figure, for m = 10: list scheduling assigns the 90 unit jobs first, giving every machine load 9; the final length-10 job then runs on one machine while machines 2 through 10 sit idle, so the list scheduling makespan = 19.]

Load Balancing: List Scheduling Analysis
Q. Is our analysis tight?
A. Essentially yes. Ex: m machines, m(m − 1) jobs of length 1 and one job of length m.
[Figure, for m = 10: in the optimal schedule the length-10 job gets its own machine and the 90 unit jobs are spread evenly over the other 9 machines, so the optimal makespan = 10.]
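
A quick check of this worst-case family for m = 10, reusing the same heap-based greedy as in the earlier list-scheduling sketch: the m(m − 1) unit jobs followed by the single length-m job give makespan 2m − 1 = 19, versus the optimal 10.

import heapq

def list_schedule_makespan(m, times):
    loads = [(0, i) for i in range(m)]
    heapq.heapify(loads)
    for t in times:
        load, i = heapq.heappop(loads)
        heapq.heappush(loads, (load + t, i))
    return max(load for load, _ in loads)

if __name__ == "__main__":
    m = 10
    jobs = [1] * (m * (m - 1)) + [m]             # 90 unit jobs, then one job of length 10
    print(list_schedule_makespan(m, jobs))       # 19 = 2m - 1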

Load Balancing: LPT Rule
Longest processing time (LPT). Sort the n jobs in descending order of processing time, and then run the list scheduling algorithm.

LPT-List-Scheduling(m, n, t1, t2, …, tn) {
   Sort jobs so that t1 ≥ t2 ≥ … ≥ tn
   for i = 1 to m {
      Li ← 0                     // load on machine i
      J(i) ← ∅                   // jobs assigned to machine i
   }
   for j = 1 to n {
      i = argmin_k Lk            // machine i has smallest load
      J(i) ← J(i) ∪ {j}          // assign job j to machine i
      Li ← Li + tj               // update load of machine i
   }
   return J(1), …, J(m)
}
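
A matching sketch of the LPT rule: identical to the greedy above except that the jobs are sorted in descending order first (again an illustrative implementation, not taken from the slides).

import heapq

def lpt_makespan(m, times):
    """Longest-processing-time rule: sort jobs descending, then greedy list scheduling."""
    loads = [(0, i) for i in range(m)]
    heapq.heapify(loads)
    for t in sorted(times, reverse=True):
        load, i = heapq.heappop(loads)
        heapq.heappush(loads, (load + t, i))
    return max(load for load, _ in loads)

if __name__ == "__main__":
    print(lpt_makespan(3, [2, 3, 4, 6, 2, 2]))   # 7, which is optimal for this instance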

Load Balancing: LPT Rule
Observation. If there are at most m jobs, then list scheduling is optimal.
Pf. Each job is put on its own machine.
Lemma 3. If there are more than m jobs, L* ≥ 2 t_{m+1}.
Pf. Consider the first m+1 jobs t1, …, t_{m+1}. Since the ti's are in descending order, each takes at least t_{m+1} time. There are m+1 jobs and m machines, so by the pigeonhole principle, at least one machine gets two of them.
Theorem. The LPT rule is a 3/2-approximation algorithm.
Pf. Same basic approach as for list scheduling:
   Li = (Li − tj) + tj ≤ L* + ½ L* = (3/2) L*,
using tj ≤ ½ L*, which follows from Lemma 3 (by the observation, we can assume the number of jobs is more than m).

Load Balancing: LPT Rule
Q. Is our 3/2 analysis tight?
A. No.
Theorem. [Graham, 1969] The LPT rule is a 4/3-approximation.
Pf. More sophisticated analysis of the same algorithm.
Q. Is Graham's 4/3 analysis tight?
A. Essentially yes. Ex: m machines, n = 2m + 1 jobs: two jobs each of length m+1, m+2, …, 2m−1, and three jobs of length m.