ORIE 633 Network Flows                                        September 27, 2007
Lecturer: David P. Williamson        Lecture 8        Scribe: Gema Plaza-Martínez

1 Global min-cuts in undirected graphs

1.1 Random contraction

Recall that last time we introduced the random contraction algorithm.

Random Contraction (Karger '93)
    Until |V| = 2:
        Pick an edge (i, j) at random with probability proportional to its capacity.
        Contract (i, j).

The intuition for the algorithm is that the global min-cut in a graph has capacity that is small relative to the capacity of the graph, and so we are unlikely to choose an edge that is in a particular global min-cut. Last time, we showed the following.

Theorem 1  Let S* be a given global min-cut. Then Pr[Random Contraction contracts no edge in δ(S*)] ≥ 1/\binom{n}{2}.

We start off the lecture by observing that we can derive from this an upper bound on the number of global min-cuts.

Theorem 2  There are at most \binom{n}{2} global min-cuts.

Proof: Notice that the events in which we don't contract edges in two different global min-cuts are disjoint. Let S*_1, S*_2, ..., S*_p be all the global min-cuts, and let C_i be the event that we don't contract any edge in δ(S*_i). We showed in the last lecture that Pr[C_i] ≥ 1/\binom{n}{2}. Since these p events are all disjoint, ∑_{i=1}^{p} Pr[C_i] ≤ 1, so we must have p ≤ \binom{n}{2}.  □

An example of a graph with \binom{n}{2} global min-cuts is an n-node cycle in which every edge has the same capacity, since then any pair of edges forms a global min-cut.

We now turn to the question of how to take the Random Contraction algorithm and turn it into an algorithm that returns a global min-cut with high probability.

Theorem 3  We can implement Random Contraction in O(n²) time (O(n) per contraction).
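To make the contraction step concrete, here is a minimal Python sketch of one run of Random Contraction. The array-of-labels representation and the function name are my own choices for illustration, and the graph is assumed connected; this naive version recomputes the crossing edges at every step, so it does not achieve the O(n)-per-contraction bound of Theorem 3.

import random

def random_contraction(edges, n):
    """One run of Random Contraction on a connected, capacitated graph.

    edges: list of (i, j, capacity) triples over vertices 0, ..., n-1.
    Returns the capacity of the cut between the two final super-vertices.
    """
    label = list(range(n))          # label[v] = super-vertex containing v
    remaining = n
    while remaining > 2:
        # Edges whose endpoints lie in different super-vertices (self-loops are ignored).
        candidates = [(i, j, c) for (i, j, c) in edges if label[i] != label[j]]
        # Sample one of them with probability proportional to its capacity.
        r = random.uniform(0, sum(c for _, _, c in candidates))
        for (i, j, c) in candidates:
            r -= c
            if r <= 0:
                break
        # Contract (i, j): merge j's super-vertex into i's.
        a, b = label[i], label[j]
        label = [a if x == b else x for x in label]
        remaining -= 1
    # The cut is the set of edges left between the two surviving super-vertices.
    return sum(c for (i, j, c) in edges if label[i] != label[j])

Relabeling an array is the simplest way to contract; getting O(n) per contraction as in Theorem 3 needs a smarter representation, e.g., maintaining the total capacity between every pair of super-vertices.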

Theorem 4  We can find a global min-cut with probability 1 - 1/n in time O(n⁴ log n).

Proof: Suppose we repeat Random Contraction \binom{n}{2} ln n times. Then, using the fact that 1 - x ≤ e^{-x}, we have

    Pr[we don't find the min-cut] ≤ (1 - 1/\binom{n}{2})^{\binom{n}{2} ln n} ≤ (e^{-1/\binom{n}{2}})^{\binom{n}{2} ln n} = e^{-ln n} = 1/n.

So Pr[we do find the min-cut] ≥ 1 - 1/n.  □

1.2 Recursive random contraction

The running time of this algorithm is not very competitive, but Random Contraction (RC) has some features that we like, so we'll try to exploit those to find a better algorithm. One of the drawbacks of RC is that the probability of contracting an edge in the cut is low for large graphs and higher as the graph gets smaller. One idea is to make the graph smaller by using RC, and then use some other algorithm. In this case, we'll use as our other algorithm two runs of RC. Our algorithm is thus called Recursive Random Contraction (RRC). We'll do two iterations of contracting the graph down to roughly a factor of 1/√2 of its previous size, then call RRC on the remaining graph. We return the smaller of the two cuts found in the two iterations.

Recursive Random Contraction (Karger, Stein '96)
    If |V| ≤ 6, find a min-cut by exhaustive enumeration
    Else
        For i = 1 to 2
            Run RC to get H_i on ⌈1 + |V|/√2⌉ vertices
            C_i ← RRC(H_i)
        Return the smaller of C_1, C_2
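Here is a short Python sketch of RRC built on the same sampling-and-relabel loop as above. It is a sketch only: the helper names, the renumbering step, and the brute-force base case are my own scaffolding, not part of the lecture, and the graph is again assumed connected.

import math
import random
from itertools import combinations

def contract_to(edges, n, target):
    """Randomly contract (capacity-weighted) until `target` super-vertices remain,
    and return the contracted multigraph as (edges, target)."""
    label = list(range(n))
    remaining = n
    while remaining > target:
        candidates = [(i, j, c) for (i, j, c) in edges if label[i] != label[j]]
        r = random.uniform(0, sum(c for _, _, c in candidates))
        for (i, j, c) in candidates:
            r -= c
            if r <= 0:
                break
        a, b = label[i], label[j]
        label = [a if x == b else x for x in label]
        remaining -= 1
    # Renumber the surviving super-vertices 0, ..., target-1.
    new_id = {v: k for k, v in enumerate(sorted(set(label)))}
    return ([(new_id[label[i]], new_id[label[j]], c)
             for (i, j, c) in edges if label[i] != label[j]], target)

def min_cut_by_enumeration(edges, n):
    """Exhaustively check every cut; fine for n <= 6."""
    best = math.inf
    for size in range(1, n):
        for side in combinations(range(1, n), size):   # the side not containing vertex 0
            s = set(side)
            best = min(best, sum(c for (i, j, c) in edges if (i in s) != (j in s)))
    return best

def rrc(edges, n):
    """Recursive Random Contraction: two independent contractions down to about
    n/sqrt(2) vertices, recurse on each, and return the smaller cut value found."""
    if n <= 6:
        return min_cut_by_enumeration(edges, n)
    target = math.ceil(1 + n / math.sqrt(2))
    return min(rrc(*contract_to(edges, n, target)) for _ in range(2))

Repeating rrc and keeping the smallest value found boosts the success probability, as the analysis that follows quantifies.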

Now we can start to analyze the algorithm.

Lemma 5  RRC runs in O(n² log n) time.

Proof: The running time on n vertices is given by the recurrence

    T(n) = 2 T(⌈1 + n/√2⌉) + O(n²).

Solving this recurrence, we get T(n) = O(n² log n).  □

Lemma 6  RRC finds a global min-cut with probability Ω(1/log n).

Corollary 7  Running RRC O(log² n) times, we find a min-cut with probability at least 1 - 1/n.

To prove Lemma 6, we'll need the following.

Lemma 8  For a given global min-cut S*, Pr[no edge in δ(S*) is contracted when the graph is contracted from n to t vertices] ≥ \binom{t}{2}/\binom{n}{2}.

Proof: Recall from last time that the probability that no edge in δ(S*) is contracted in the i-th iteration, given that no edge was contracted in previous iterations, is at least 1 - 2/(n - i + 1). Thus the desired probability is at least

    ∏_{i=1}^{n-t} (1 - 2/(n - i + 1)) = ∏_{i=1}^{n-t} (n - i - 1)/(n - i + 1) = ∏_{j=t+1}^{n} (j - 2)/j = \binom{t}{2}/\binom{n}{2}.  □

Now we can prove Lemma 6.

Proof of Lemma 6: Given a graph of size t, by Lemma 8 the probability that no edge in δ(S*) is contracted when the graph is reduced to ⌈1 + t/√2⌉ vertices is at least

    ⌈1 + t/√2⌉ (⌈1 + t/√2⌉ - 1) / (t(t - 1)) ≥ 1/2.

Let P(t) be the probability that RRC doesn't contract any edge in δ(S*) in a graph of t vertices. Then

    P(t) = 1 - Pr[we lose δ(S*) in both iteration 1 and iteration 2]
         ≥ 1 - (1 - Pr[no edge of δ(S*) is contracted when going to ⌈1 + t/√2⌉ vertices] · P(⌈1 + t/√2⌉))²
         ≥ 1 - (1 - ½ P(⌈1 + t/√2⌉))².

We know that P(t) = 1 for t ≤ 6. Skipping the algebra needed to work out this recursion, we get

    P(t) = Θ(1/log t).  □
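The skipped algebra can be carried out as follows; this is one standard way to do it, sketched in LaTeX, and not necessarily the calculation intended in lecture.

% Let p_k denote P(t) at recursion depth k, so that t is roughly 6*(sqrt 2)^k
% and p_0 = 1.  The recursion above reads
\[
  p_{k+1} \;\ge\; 1 - \bigl(1 - \tfrac12 p_k\bigr)^2 \;=\; p_k - \tfrac14 p_k^2 .
\]
% Substituting z_k = 4/p_k - 1 (so that p_k = 4/(z_k + 1)) turns this into
\[
  z_{k+1} \;\le\; z_k + 1 + \frac{1}{z_k},
\]
% with z_0 = 3.  Since p_i \le 1 gives z_i \ge 3 for every i, the 1/z_i terms
% contribute at most k/3, so z_k \le 3 + \tfrac43 k = O(k) and
% p_k = 4/(z_k + 1) = \Omega(1/k).  The depth needed to shrink t vertices down
% to 6 is k = \log_{\sqrt 2}(t/6) = \Theta(\log t), so
\[
  P(t) \;=\; \Omega\!\left(\frac{1}{\log t}\right),
\]
% which is exactly the bound Lemma 6 requires.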

So far we've seen the following global min-cut algorithms:

    MA orderings: O(n(m + n log n))
    RRC: O(n² log³ n)
    Hao-Orlin: O(mn log(n²/m))

There is still another global min-cut algorithm that we haven't covered:

    Karger: O(m log^c n) = Õ(m)

Contractions are not easy to implement, so though they look like the simplest algorithms, they are not the clear winner. There is a comparative study of all four of these algorithms by Chekuri, Goldberg, Karger, Levine, and Stein (1996). The two clear winners in their study are MA orderings and Hao-Orlin, and they say that Hao-Orlin is the most robust of their codes; this is probably in part because there are a lot of good implementations of push/relabel around. This shows that a better theoretical running time doesn't necessarily imply that an algorithm is better in practice.

2 Efficient algorithms for max flow

2.1 Blocking flow

We'll work now toward one more algorithm for computing a maximum flow. It is based on the concept of a blocking flow.

Definition 1  A flow f is blocking if in G, every s-t path has some arc saturated.

Every maximum flow is obviously also a blocking flow. Is every blocking flow a maximum flow? No. For example, put unit capacities on the arcs (s, u), (s, v), (u, v), (u, t), (v, t): pushing one unit of flow along s → u → v → t saturates an arc on every s-t path, so this flow is blocking, yet the maximum flow has value 2. However, blocking flows are still useful in order to compute maximum flows. We give an algorithm below. Let d_i denote the distance from node i to the sink t.

Definition 2  An arc (i, j) is admissible w.r.t. flow f if (i, j) ∈ A_f and d_i = d_j + 1.

Note that an arc being admissible means it is on a shortest path to t. Next time we'll discuss the following algorithm.

Dinic's algorithm (1970)
    f ← 0
    While there is an s-t path in G_f:
        Compute the distances d_i from each node i to t in G_f.
        Find a blocking flow f' in the subgraph of G_f containing only admissible arcs, with capacities u^f_{ij}.
        f ← f + f'
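To make the outline concrete, below is a compact, self-contained Python sketch of Dinic's algorithm as stated: recompute the distance labels d_i to the sink in G_f, find a blocking flow using only admissible arcs, and repeat. The class name, data layout, and helper names are mine; the notes give only the pseudocode above.

from collections import deque

class Dinic:
    """Sketch of Dinic's algorithm: repeated blocking flows in the admissible subgraph."""

    def __init__(self, n):
        self.n = n
        self.adj = [[] for _ in range(n)]   # adj[u] = list of [v, residual_cap, rev_index]

    def add_edge(self, u, v, cap):
        # A forward arc plus a 0-capacity reverse arc, so pushing flow on one
        # automatically creates residual capacity on the other.
        self.adj[u].append([v, cap, len(self.adj[v])])
        self.adj[v].append([u, 0, len(self.adj[u]) - 1])

    def _distances_to_sink(self, t):
        # Reverse BFS from t in the residual graph G_f: dist[i] = d_i,
        # the residual distance from node i to the sink.
        dist = [-1] * self.n
        dist[t] = 0
        queue = deque([t])
        while queue:
            v = queue.popleft()
            for (u, _, rev) in self.adj[v]:
                # The paired arc adj[u][rev] goes u -> v; it is a residual arc
                # iff its remaining capacity is positive.
                if dist[u] == -1 and self.adj[u][rev][1] > 0:
                    dist[u] = dist[v] + 1
                    queue.append(u)
        return dist

    def _push(self, u, t, limit, dist, it):
        # DFS that only follows admissible arcs (d_u = d_v + 1); repeated calls
        # from s until nothing can be pushed yield a blocking flow.
        if u == t:
            return limit
        while it[u] < len(self.adj[u]):
            v, cap, rev = self.adj[u][it[u]]
            if cap > 0 and dist[u] == dist[v] + 1:
                pushed = self._push(v, t, min(limit, cap), dist, it)
                if pushed > 0:
                    self.adj[u][it[u]][1] -= pushed
                    self.adj[v][rev][1] += pushed
                    return pushed
            it[u] += 1
        return 0

    def max_flow(self, s, t):
        flow = 0
        while True:
            dist = self._distances_to_sink(t)
            if dist[s] == -1:            # no s-t path left in G_f: f is maximum
                return flow
            it = [0] * self.n            # per-node pointer into its arc list
            pushed = self._push(s, t, float('inf'), dist, it)
            while pushed > 0:
                flow += pushed
                pushed = self._push(s, t, float('inf'), dist, it)

For instance, on the five-arc example from Section 2.1 (unit capacities on (s, u), (s, v), (u, v), (u, t), (v, t)), building a Dinic(4) instance, adding those arcs, and calling max_flow(s, t) returns 2.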