
Introduction to Algorithms 6.046J/18.401J/SMA5503 Lecture 18 Prof. Erik Demaine

Negative-weight cycles
Recall: If a graph G = (V, E) contains a negative-weight cycle, then some shortest paths may not exist.
Example: [Figure: a cycle through vertices u and v with total weight < 0.]
Bellman-Ford algorithm: finds all shortest-path lengths from a source s ∈ V to all v ∈ V, or determines that a negative-weight cycle exists.

Bellman-Ford algorithm
d[s] ← 0
for each v ∈ V − {s}
    do d[v] ← ∞                          (initialization)
for i ← 1 to |V| − 1
    do for each edge (u, v) ∈ E
        do if d[v] > d[u] + w(u, v)
            then d[v] ← d[u] + w(u, v)   (relaxation step)
for each edge (u, v) ∈ E
    do if d[v] > d[u] + w(u, v)
        then report that a negative-weight cycle exists
At the end, d[v] = δ(s, v). Time = O(VE).
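
The pseudocode above translates almost line for line into code. Below is a minimal Python sketch, assuming the graph is given as a collection of vertices plus a list of (u, v, w) edge triples; the function name bellman_ford and this edge representation are illustrative choices, not from the lecture.

    import math

    def bellman_ford(vertices, edges, s):
        # Initialization: d[s] = 0, every other estimate starts at infinity.
        d = {v: math.inf for v in vertices}
        d[s] = 0
        # |V| - 1 passes of relaxation over every edge.
        for _ in range(len(d) - 1):
            for u, v, w in edges:
                if d[u] + w < d[v]:              # relaxation step
                    d[v] = d[u] + w
        # One extra pass: any further improvement means a negative-weight cycle.
        for u, v, w in edges:
            if d[u] + w < d[v]:
                raise ValueError("negative-weight cycle reachable from s")
        return d                                 # d[v] == delta(s, v)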

Example of Bellman-Ford
[Series of figures: successive relaxation passes on a five-vertex graph with vertices A, B, C, D, E and source A, with a table under each figure recording the current d-values of A, B, C, D, E.]
Note: Values decrease monotonically.

Correctness
Theorem. If G = (V, E) contains no negative-weight cycles, then after the Bellman-Ford algorithm executes, d[v] = δ(s, v) for all v ∈ V.
Proof. Let v ∈ V be any vertex, and consider a shortest path p from s to v with the minimum number of edges:
p: s = v_0 → v_1 → v_2 → ⋯ → v_k = v.
Since p is a shortest path, we have δ(s, v_i) = δ(s, v_{i−1}) + w(v_{i−1}, v_i).

Correctness (continued)
p: s = v_0 → v_1 → v_2 → ⋯ → v_k = v
Initially, d[v_0] = 0 = δ(s, v_0), and d[s] is unchanged by subsequent relaxations (because of the lemma from Lecture 17 that d[v] ≥ δ(s, v)).
After 1 pass through E, we have d[v_1] = δ(s, v_1).
After 2 passes through E, we have d[v_2] = δ(s, v_2).
⋮
After k passes through E, we have d[v_k] = δ(s, v_k).
Since G contains no negative-weight cycles, p is simple, and the longest simple path has at most |V| − 1 edges.

Detection of negative-weight cycles
Corollary. If a value d[v] fails to converge after |V| − 1 passes, there exists a negative-weight cycle in G reachable from s.

DAG shortest paths
If the graph is a directed acyclic graph (DAG), we first topologically sort the vertices: determine f : V → {1, 2, …, |V|} such that (u, v) ∈ E ⇒ f(u) < f(v). This takes O(V + E) time using depth-first search.
[Figure: a DAG whose vertices are numbered in topological order starting from the source s.]
Then walk through the vertices u ∈ V in this order, relaxing the edges in Adj[u], thereby obtaining the shortest paths from s in a total of O(V + E) time.
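
A corresponding Python sketch for the DAG case, assuming the graph is given as an adjacency dict adj mapping each vertex to a list of (v, w) pairs; the names are illustrative and this is a sketch of the technique rather than code from the lecture.

    import math

    def dag_shortest_paths(adj, s):
        # Topological sort by DFS finishing times, O(V + E).
        order, seen = [], set()
        def visit(u):
            seen.add(u)
            for v, _ in adj.get(u, []):
                if v not in seen:
                    visit(v)
            order.append(u)
        for u in adj:
            if u not in seen:
                visit(u)
        order.reverse()                          # reversed finish order is a topological order
        # Walk the vertices in topological order, relaxing outgoing edges.
        d = {u: math.inf for u in order}
        d[s] = 0
        for u in order:
            if d[u] < math.inf:
                for v, w in adj.get(u, []):
                    if d[u] + w < d[v]:
                        d[v] = d[u] + w
        return d                                 # d[v] == delta(s, v) for reachable v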

Linear programming
Let A be an m × n matrix, b an m-vector, and c an n-vector. Find an n-vector x that maximizes cᵀx subject to Ax ≤ b, or determine that no such solution exists.
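
As an aside (not from the lecture), this maximization form can be handed directly to an off-the-shelf LP solver. A minimal Python sketch using SciPy, where the particular A, b, c are made-up illustrative data and the variables are left unrestricted in sign to match the formulation above:

    import numpy as np
    from scipy.optimize import linprog

    # maximize c^T x subject to A x <= b  ==  minimize (-c)^T x subject to A x <= b
    A = np.array([[1.0, 0.0], [0.0, 1.0]])
    b = np.array([2.0, 3.0])
    c = np.array([1.0, 1.0])

    res = linprog(-c, A_ub=A, b_ub=b, bounds=[(None, None)] * len(c))
    if res.success:
        print("optimal x =", res.x, "objective =", -res.fun)   # here x = [2, 3], objective 5
    else:
        print("no optimum:", res.message)                      # infeasible or unbounded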

Linear-programming algorithms
Algorithms for the general problem:
• Simplex methods: practical, but worst-case exponential time.
• Ellipsoid algorithm: polynomial time, but slow in practice.
• Interior-point methods: polynomial time, and compete with simplex.
Feasibility problem: no optimization criterion; just find x such that Ax ≤ b. In general, this is just as hard as ordinary LP.

Solving a system of difference constraints
Linear programming where each row of A contains exactly one 1, one −1, and the rest 0's.
Example:                 Solution:
x_1 − x_2 ≤ 3            x_1 = 3
x_2 − x_3 ≤ −2           x_2 = 0
x_1 − x_3 ≤ 2            x_3 = 2
Constraint graph: each constraint x_j − x_i ≤ w_ij becomes an edge from v_i to v_j with weight w_ij.
(The A matrix has dimensions |E| × |V|.)

Unsatisfiable constraints
Theorem. If the constraint graph contains a negative-weight cycle, then the system of differences is unsatisfiable.
Proof. Suppose that the negative-weight cycle is v_1 → v_2 → ⋯ → v_k → v_1. Then we have
x_2 − x_1 ≤ w_12
x_3 − x_2 ≤ w_23
⋮
x_k − x_{k−1} ≤ w_{k−1,k}
x_1 − x_k ≤ w_{k1}
Summing, the left-hand sides telescope to 0, while the right-hand sides sum to the weight of the cycle, which is < 0. Therefore, no values for the x_i can satisfy the constraints.

Satisfying the constraints
Theorem. Suppose no negative-weight cycle exists in the constraint graph. Then the constraints are satisfiable.
Proof. Add a new vertex s to V with a 0-weight edge to each vertex v_i ∈ V.
[Figure: s with weight-0 edges to the constraint-graph vertices v_1, v_3, v_4, v_7, v_9.]
Note: No negative-weight cycles are introduced, so shortest paths exist.

Proof (continued)
Claim: The assignment x_i = δ(s, v_i) solves the constraints.
Consider any constraint x_j − x_i ≤ w_ij, and consider the shortest paths from s to v_j and v_i.
[Figure: paths s ⇝ v_i of length δ(s, v_i) and s ⇝ v_j of length δ(s, v_j), with the edge (v_i, v_j) of weight w_ij.]
The triangle inequality gives us δ(s, v_j) ≤ δ(s, v_i) + w_ij. Since x_i = δ(s, v_i) and x_j = δ(s, v_j), the constraint x_j − x_i ≤ w_ij is satisfied.

Bellman-Ford and linear programming
Corollary. The Bellman-Ford algorithm can solve a system of m difference constraints on n variables in O(mn) time.
Single-source shortest paths is a simple LP problem.
In fact, Bellman-Ford maximizes x_1 + x_2 + ⋯ + x_n subject to the constraints x_j − x_i ≤ w_ij and x_i ≤ 0 (exercise).
Bellman-Ford also minimizes max_i {x_i} − min_i {x_i} (exercise).
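
Putting the construction together: build the constraint graph, add a source s with 0-weight edges, and run Bellman-Ford. A minimal Python sketch that reuses the bellman_ford function sketched earlier; the function name and the (i, j, w) constraint encoding are illustrative assumptions.

    def solve_difference_constraints(constraints):
        # constraints: list of (i, j, w) triples, each meaning x_j - x_i <= w.
        vertices = {i for i, _, _ in constraints} | {j for _, j, _ in constraints}
        edges = [(i, j, w) for i, j, w in constraints]       # edge v_i -> v_j with weight w_ij
        s = object()                                         # fresh source vertex
        edges += [(s, v, 0) for v in vertices]               # 0-weight edge s -> v_i
        d = bellman_ford(vertices | {s}, edges, s)           # raises if a negative cycle exists
        return {v: d[v] for v in vertices}                   # x_i = delta(s, v_i)

    # Slide example: x1 - x2 <= 3, x2 - x3 <= -2, x1 - x3 <= 2.
    print(solve_difference_constraints([("x2", "x1", 3), ("x3", "x2", -2), ("x3", "x1", 2)]))
    # Prints one satisfying assignment; it may differ from the slide's x1 = 3, x2 = 0, x3 = 2.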

Application to VLSI layout compaction
Integrated-circuit features: [Figure: two features separated by the minimum separation λ.]
Problem: Compact (in one dimension) the space between the features of a VLSI layout without bringing any features too close together.

VLSI layout compaction
[Figure: feature 1 of width d_1 at position x_1 and feature 2 at position x_2.]
Constraint: x_2 − x_1 ≥ d_1 + λ.
Bellman-Ford minimizes max_i {x_i} − min_i {x_i}, which compacts the layout in the x-dimension.
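
Such a spacing constraint is a difference constraint once the inequality is flipped: x_2 − x_1 ≥ d_1 + λ is the same as x_1 − x_2 ≤ −(d_1 + λ). A tiny illustration using the solve_difference_constraints sketch above, with made-up numbers for d_1 and λ:

    # x2 - x1 >= d1 + lam, rewritten as x1 - x2 <= -(d1 + lam)
    d1, lam = 5.0, 1.0
    print(solve_difference_constraints([("x2", "x1", -(d1 + lam))]))
    # One valid placement, e.g. x1 = -6.0, x2 = 0; the separation is 6 >= d1 + lam.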