Randomized Algorithms III: Min Cut


Chapter 11
Randomized Algorithms III: Min Cut
CS 573: Algorithms, Fall 2013, by Sariel Har-Peled. October 1, 2013.

11.1 Min Cut
11.1.1 Problem Definition

11.2 Min cut
11.2.0.1 Min cut

G = (V, E): undirected graph, n vertices, m edges. We are interested in cuts in G.

Definition 11.2.1. A cut in G is a partition of V into S and V \ S. The edges of the cut (S, V \ S) are
    (S, V \ S) = { uv | u ∈ S, v ∈ V \ S, and uv ∈ E },
and the size of the cut is |(S, V \ S)|. A minimum cut (mincut) is a cut in the graph with minimum size.

11.2.0.2 Some definitions

(A) The conditional probability of X given Y is
    Pr[X = x | Y = y] = Pr[(X = x) ∩ (Y = y)] / Pr[Y = y].

(B) Two events X, Y are independent if Pr[X = x ∩ Y = y] = Pr[X = x] · Pr[Y = y]. Equivalently, Pr[X = x | Y = y] = Pr[X = x].

11.2.0.3 Some more probability

Lemma 11.2.2. Let E1, ..., En be n events (not necessarily independent). Then
    Pr[E1 ∩ ... ∩ En] = Pr[E1] · Pr[E2 | E1] · Pr[E3 | E1 ∩ E2] ··· Pr[En | E1 ∩ ... ∩ En−1].

11.3 The Algorithm
11.3.0.4 Edge contraction

(A) Edge contraction: for an edge e = xy in G...
(B) ...merge x and y into a single vertex,
(C) ...remove self loops.
(D) Parallel edges are kept, so the result is a multi-graph...
(E) ...or, equivalently, keep weights/multiplicities on the edges.

11.3.0.5 Min cut in weighted graph

G/xy: the graph with xy contracted into the single vertex {x, y}. Edge contraction can be implemented in O(n) time:
(A) The graph is represented using adjacency lists.
(B) Merge the adjacency lists of the two vertices being contracted.
(C) Use hashing to do the fix-ups (i.e., fix the adjacency lists of the vertices connected to x and y).
(D) Include the edge weights when computing the cut weight.

11.3.0.6 Cuts under contractions

Observation 11.3.1.
(A) A cut in G/xy is a valid cut in G.
(B) There are cuts in G that do not exist in G/xy.
(C) For example, the cut S = {x} does not exist in G/xy.
(D) ⇒ size of mincut in G/xy ≥ size of mincut in G.

(A) Idea: repeatedly perform edge contractions (benefit: this shrinks the graph)...
(B) (Every vertex in the contracted graph corresponds to a connected component of the original graph.)
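The adjacency-list merge described above can be sketched in a few lines, here for a weighted graph stored as nested dictionaries; the encoding and the function name are illustrative choices of this sketch, not from the notes:

```python
def contract_weighted(adj, x, y):
    """Contract edge xy: merge y's adjacency dict into x's, turn parallel
    edges into summed weights, and drop the self-loop {x, y}.
    `adj` maps each vertex to a dict {neighbor: weight}; it is not mutated."""
    merged = dict(adj)                 # new outer dict, same inner dicts
    nx = dict(merged[x])               # copy x's list before editing it
    for u, w in merged[y].items():     # merge y's adjacency list into x's
        nx[u] = nx.get(u, 0) + w
    nx.pop(x, None)
    nx.pop(y, None)                    # remove the self-loop
    merged[x] = nx
    del merged[y]
    for u in list(merged):             # fix-up: neighbors of y now point to x
        if u != x and y in merged[u]:
            nu = dict(merged[u])
            nu[x] = nu.get(x, 0) + nu.pop(y)
            merged[u] = nu
    return merged
```

For a unit-weight triangle on {a, b, c}, contracting ab yields two vertices joined by an edge of weight 2, as the two parallel edges to c merge.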

11.3.0.7 Contraction

[Figure: a step-by-step sequence of contractions, merging x and y into {x, y} and accumulating edge multiplicities.]

11.3.0.8 Contraction - all together now

[Figure: a complete run of the contraction process, steps (a)-(j), ending with two vertices.]

11.3.0.9 But...

(A) The cut we end up with is not necessarily the min cut!

(B) We contracted a wrong edge somewhere...
(C) If we never contract an edge of the min cut...
(D) ...we get the min cut in the end!
(E) We might still get a min cut even if we contract an edge of a min cut. Why??? (There can be several minimum cuts.)

11.3.1 The resulting algorithm
11.3.1.1 The algorithm...

Algorithm MinCut(G)
    G0 ← G
    i = 0
    while Gi has more than two vertices do
        ei ← random edge from E(Gi)
        Gi+1 ← Gi/ei
        i ← i + 1
    Let (S, V \ S) be the cut in the original graph
        corresponding to the single edge in Gi
    return (S, V \ S)

11.3.1.2 How to pick a random edge?

Lemma 11.3.2. Let X = {x1, ..., xn} be a set of elements, where ω(xi) is a positive integer weight. One can pick, in O(n) time, a random element of X, with the probability of picking xi being ω(xi)/W, where W = Σ_{i=1}^n ω(xi).

Proof: Choose r uniformly at random in [0, W]. Precompute the prefix sums βi = Σ_{k=1}^i ω(xk) = βi−1 + ω(xi). Find the first index i such that βi−1 < r ≤ βi, and return xi.

(A) Edges have weights...
(B) ...compute the total weight of each vertex (the weight of its adjacent edges).
(C) Pick a random vertex with probability proportional to this weight.
(D) Then pick a random edge adjacent to this vertex, with probability proportional to the edge weight.

11.3.2 Analysis
11.3.2.1 The probability of success
11.3.2.2 Lemma...

Lemma 11.3.3. If G has a mincut of size k and n vertices, then |E(G)| ≥ kn/2.

Proof: Every vertex has degree at least k, since otherwise the vertex itself would form a cut of size smaller than k. As such, there are at least (Σ_{v∈V} degree(v))/2 ≥ nk/2 edges in the graph.

11.3.2.3 Lemma...

Lemma 11.3.4. If we pick a random edge e from a graph G, then with probability at most 2/n it belongs to the minimum cut.

Proof: There are at least nk/2 edges in the graph and exactly k edges in the minimum cut. Thus, the probability of picking an edge from the minimum cut is at most k/(nk/2) = 2/n.
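The algorithm above is short to implement when the multi-graph is stored as a plain edge list: choosing a uniformly random list entry is automatically a multiplicity-weighted edge choice, so for unweighted inputs the weighted-sampling lemma is not even needed. A minimal sketch (the edge-list encoding and function names are mine, not the notes'):

```python
import random

def contract(edges, u, v):
    """Contract edge uv: relabel v as u everywhere and drop self-loops.
    Parallel edges stay in the list, acting as multiplicities."""
    out = []
    for a, b in edges:
        a = u if a == v else a
        b = u if b == v else b
        if a != b:                   # drop self-loops created by the merge
            out.append((a, b))
    return out

def min_cut_once(edges):
    """One run of the contraction algorithm; returns the size of some cut."""
    edges = list(edges)
    vertices = {v for e in edges for v in e}
    while len(vertices) > 2:
        u, v = random.choice(edges)  # uniform over the list = multiplicity-weighted
        edges = contract(edges, u, v)
        vertices.discard(v)
    return len(edges)                # multi-edges between the two super-vertices

def min_cut(edges, repetitions):
    """Amplification: repeat the experiment and keep the smallest cut seen."""
    return min(min_cut_once(edges) for _ in range(repetitions))
```

On two triangles joined by a single bridge edge, `min_cut` recovers the cut of size 1 after a modest number of repetitions; a single run succeeds noticeably more often than the worst-case bound proved below suggests.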

11.3.2.4 Lemma

Lemma 11.3.5. MinCut outputs the mincut with probability ≥ 2/(n(n−1)).

Proof:
(A) Let Ei be the event that ei is not in the minimum cut of Gi.
(B) MinCut outputs the mincut if all the events E0, ..., En−3 happen.
(C) Pr[Ei | E0 ∩ E1 ∩ ... ∩ Ei−1] ≥ 1 − 2/|V(Gi)| = 1 − 2/(n − i).
(D) ⇒ Pr[E0 ∩ ... ∩ En−3] = Pr[E0] · Pr[E1 | E0] · Pr[E2 | E0 ∩ E1] ··· Pr[En−3 | E0 ∩ ... ∩ En−4].

11.3.2.5 Proof continued...

As such, we have
    Pr[E0 ∩ ... ∩ En−3] ≥ ∏_{i=0}^{n−3} (1 − 2/(n − i)) = ∏_{i=0}^{n−3} (n − i − 2)/(n − i)
        = (n−2)/n · (n−3)/(n−1) · (n−4)/(n−2) ··· 2/4 · 1/3
        = 2/(n(n−1)),
since the product telescopes.

11.3.2.6 Running time analysis.
11.3.2.7 Running time analysis...

Observation 11.3.6. MinCut runs in O(n^2) time.

Observation 11.3.7. The algorithm always outputs a cut, and the cut is never smaller than the minimum cut.

Definition 11.3.8. Amplification: running an experiment again and again until the things we want to happen, with good probability, do happen.

11.3.2.8 Getting a good probability

MinCutRep: the algorithm that runs MinCut n(n−1) times and returns the minimum cut computed.

Lemma 11.3.9. The probability that MinCutRep fails to return the minimum cut is < 0.14.

Proof: The probability that MinCut fails to output the mincut in a single execution is at most 1 − 2/(n(n−1)). MinCutRep fails only if all n(n−1) executions of MinCut fail, which happens with probability at most
    (1 − 2/(n(n−1)))^{n(n−1)} ≤ exp(−(2/(n(n−1))) · n(n−1)) = exp(−2) < 0.14,
since 1 − x ≤ e^{−x} for 0 ≤ x ≤ 1.

11.3.2.9 Result

Theorem 11.3.10. One can compute the mincut in O(n^4) time with constant probability of getting a correct result. In O(n^4 log n) time the minimum cut is returned with high probability.
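The telescoping product in the proof is easy to sanity-check with exact rational arithmetic. This snippet (mine, not part of the notes) confirms that the product of the per-step bounds collapses to exactly 2/(n(n−1)):

```python
from fractions import Fraction

def success_lower_bound(n):
    """prod_{i=0}^{n-3} (1 - 2/(n - i)), the bound from the proof above."""
    p = Fraction(1)
    for i in range(n - 2):       # i = 0, 1, ..., n-3
        p *= 1 - Fraction(2, n - i)
    return p

# The telescoping product collapses to 2/(n(n-1)).
for n in range(3, 40):
    assert success_lower_bound(n) == Fraction(2, n * (n - 1))
```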

11.4 A faster algorithm
11.4.0.10 Faster algorithm

Why does MinCutRep need so many executions? The probability of not failing in the first ν iterations is
    Pr[E0 ∩ ... ∩ Eν−1] ≥ ∏_{i=0}^{ν−1} (1 − 2/(n − i)) = ∏_{i=0}^{ν−1} (n − i − 2)/(n − i)
        = (n−2)/n · (n−3)/(n−1) ··· = (n − ν)(n − ν − 1) / (n(n−1)).
⇒ ν = n/2: probability of success ≈ 1/4.
⇒ ν = n − √n: probability of success ≈ 1/n.

11.4.0.11 Faster algorithm...

Insight:
(A) As the graph gets smaller, the probability of a bad choice increases.
(B) Currently we do the amplification from the outside of the algorithm.
(C) Instead, put the amplification directly into the algorithm.

11.4.1 Contract...
11.4.1.1

Contract(G, t) shrinks G till it has only t vertices. FastCut computes the minimum cut using Contract.

Contract(G, t)
    while |V(G)| > t do
        pick a random edge e in G
        G ← G/e
    return G

FastCut(G = (V, E))    // G: multi-graph
begin
    n ← |V(G)|
    if n ≤ 6 then
        compute the minimum cut of G by brute force and return it
    t ← ⌈1 + n/√2⌉
    H1 ← Contract(G, t)
    H2 ← Contract(G, t)    /* Contract is randomized!!! */
    X1 ← FastCut(H1), X2 ← FastCut(H2)
    return the smaller of the cuts X1 and X2
end

11.4.1.2 Lemma...

Lemma 11.4.1. The running time of FastCut(G) is O(n^2 log n), where n = |V(G)|.

Proof: Well, we perform two calls to Contract(G, t), which takes O(n^2) time. And then we perform two recursive calls on the resulting graphs. We have
    T(n) = O(n^2) + 2T(n/√2).
The solution to this recurrence is O(n^2 log n), as one can easily (and should) verify.
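Here is a compact, unoptimized sketch of Contract and FastCut, with the multi-graph stored as a plain edge list (an illustrative encoding, not the O(n)-per-contraction adjacency-list scheme of the notes); the brute-force base case simply tries every bipartition:

```python
import random
from itertools import combinations
from math import ceil, sqrt

def contract_to(edges, t):
    """Contract(G, t): contract random edges until only t vertices remain."""
    edges = list(edges)
    vertices = {v for e in edges for v in e}
    while len(vertices) > t and edges:
        u, v = random.choice(edges)   # uniform over multi-edges
        edges = [(u if a == v else a, u if b == v else b) for a, b in edges]
        edges = [(a, b) for a, b in edges if a != b]   # drop self-loops
        vertices.discard(v)
    return edges

def brute_force_cut(edges):
    """Exact minimum cut by trying every bipartition (fine for <= 6 vertices)."""
    vertices = sorted({v for e in edges for v in e})
    best = len(edges)
    for r in range(1, len(vertices)):
        for side in combinations(vertices, r):
            s = set(side)
            best = min(best, sum(1 for a, b in edges if (a in s) != (b in s)))
    return best

def fast_cut(edges):
    """Two independent partial contractions, recurse, keep the smaller cut."""
    n = len({v for e in edges for v in e})
    if n <= 6:
        return brute_force_cut(edges)
    t = ceil(1 + n / sqrt(2))
    return min(fast_cut(contract_to(edges, t)), fast_cut(contract_to(edges, t)))
```

A handful of independent runs of `fast_cut` already recovers the bridge cut of size 1 in the two-triangles example, matching the Ω(1/log n) per-run success probability proved below.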

11.4.1.3 Success at each step

Lemma 11.4.2. The probability that the mincut in the contracted graph is still the original mincut is at least 1/2.

Proof: Plug ν = n − t = n − ⌈1 + n/√2⌉ into the success probability:
    Pr[E0 ∩ ... ∩ En−t−1] ≥ t(t − 1)/(n(n−1)) ≥ (1 + n/√2)(n/√2)/(n(n−1)) ≥ 1/2.

11.4.1.4 Probability of success...

Lemma 11.4.3. FastCut finds the minimum cut with probability Ω(1/log n).

See the class notes for a formal proof. We provide a more elegant direct argument shortly.

11.4.1.5 Amplification

Lemma 11.4.4. Running FastCut repeatedly c log^2 n times guarantees that the algorithm outputs the mincut with probability ≥ 1 − 1/n^2, where c is a sufficiently large constant.

Proof:
(A) FastCut succeeds with probability ≥ c′/log n, where c′ is a constant.
(B) ...so it fails with probability ≤ 1 − c′/log n.
(C) ...so it fails in all of m repetitions with probability ≤ (1 − c′/log n)^m. But then
    (1 − c′/log n)^m ≤ (e^{−c′/log n})^m = e^{−m·c′/log n} ≤ 1/n^2,
for m = (2 log^2 n)/c′.

11.4.1.6 Theorem

Theorem 11.4.5. One can compute the minimum cut in a graph G with n vertices in O(n^2 log^3 n) time. The algorithm succeeds with probability ≥ 1 − 1/n^2.

Proof: We do amplification on FastCut by running it O(log^2 n) times. The running time bound follows from Lemma 11.4.1.

11.5 On coloring trees and min-cut
11.5.0.7 Trees and coloring edges...

(A) Let Th be a complete binary tree of height h.
(B) Randomly color its edges black and white (each color with probability 1/2, independently).
(C) Eh: the event that there exists a black path from the root of Th to one of its leaves.
(D) ρh = Pr[Eh].
(E) ρ0 = 1 and ρ1 = 3/4 (see below).
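These probabilities are easy to tabulate exactly with rational arithmetic, by iterating the recurrence ρh = ρh−1 − (ρh−1)^2/4 that is derived next. This quick check (my code, not part of the notes) confirms ρ1 = 3/4 and the 1/(h + 1) lower bound proved below:

```python
from fractions import Fraction

def rho(h):
    """Iterate rho_h = rho_{h-1} - rho_{h-1}^2 / 4, starting from rho_0 = 1."""
    r = Fraction(1)
    for _ in range(h):
        r = r - r * r / 4
    return r

assert rho(0) == 1 and rho(1) == Fraction(3, 4)
# The lemma below: rho_h >= 1/(h + 1).
for h in range(21):
    assert rho(h) >= Fraction(1, h + 1)
```

(The numerators and denominators roughly double in size each step, so exact iteration is only practical for small h; floats suffice beyond that.)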

11.5.0.8 Bounding ρh

(A) Let u be the root of Th, with children ul and ur.
(B) ρh−1: the probability of a black path from ul down to a leaf of its subtree.
(C) The probability of a black path from u through ul is
    Pr[uul is black] · ρh−1 = ρh−1/2.
(D) The probability of no black path through ul is 1 − ρh−1/2.
(E) The probability of no black path at all is (1 − ρh−1/2)^2.
(F) We have
    ρh = 1 − (1 − ρh−1/2)^2 = ρh−1 − (ρh−1)^2/4.

11.5.0.9 Lemma...

Lemma 11.5.1. We have that ρh ≥ 1/(h + 1).

Proof:
(A) By induction. For h = 1: ρ1 = 3/4 ≥ 1/(1 + 1).
(B) ρh = ρh−1 − (ρh−1)^2/4 = f(ρh−1), for f(x) = x − x^2/4.
(C) f′(x) = 1 − x/2 ⇒ f′(x) > 0 for x ∈ [0, 1].
(D) ⇒ f(x) is increasing in the range [0, 1].
(E) By induction: ρh = f(ρh−1) ≥ f(1/((h − 1) + 1)) = 1/h − 1/(4h^2).
(F) Finally, 1/h − 1/(4h^2) ≥ 1/(h + 1) ⟺ (4h − 1)(h + 1) ≥ 4h^2 ⟺ 4h^2 + 3h − 1 ≥ 4h^2 ⟺ 3h ≥ 1, which holds for all h ≥ 1.

11.5.0.10 Back to FastCut...

(A) The recursion tree of FastCut corresponds to such a coloring.
(B) Every call performs two recursive calls.
(C) The contraction before each recursive call succeeds (i.e., preserves the mincut) with probability ≥ 1/2. Draw the recursion edge in black if it succeeds.
(D) The algorithm succeeds if there is a black path from the root of the recursion tree to a leaf.
(E) Since the graph shrinks by a factor of √2 in each call, the depth of the tree is h ≤ 2 + 2 log n.
(F) By the above, the probability of success is ≥ 1/(h + 1) ≥ 1/(3 + 2 log n).

11.5.0.11 Galton-Watson processes

(A) Start with a single node.
(B) Each node has two children.
(C) Each child survives with probability half (independently).
(D) If a child survives, then it is going to have two children of its own, and so on.
(E) A single node thus gives rise to a random tree.
(F) Q: What is the probability that the original node has descendants h generations in the future?
(G) The argument above proves this probability is at least 1/(h + 1).
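The survival question can also be checked empirically with a tiny Monte Carlo simulation (the function name and parameters are mine; with branching factor 2 and survival probability 1/2 this is exactly the process just described):

```python
import random

def survives(h, branching=2, p=0.5):
    """Simulate one Galton-Watson tree: does the root still have living
    descendants h generations later? Each node has `branching` potential
    children, each surviving independently with probability p."""
    alive = 1
    for _ in range(h):
        # count surviving children across all currently alive nodes
        alive = sum(1 for _ in range(branching * alive) if random.random() < p)
        if alive == 0:
            return False
    return True
```

For h = 3 the exact survival probability is ρ3 ≈ 0.5165, comfortably above the 1/(h + 1) = 1/4 guarantee, and a simulation estimate lands close to it.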

11.5.0.12 Galton-Watson process

(A) The Victorians worried that aristocratic surnames were disappearing.
(B) Family names were passed on only through the male children.
(C) A family with no male children had its family name disappear.
(D) The number of male children of a person is an independent random variable X ∈ {0, 1, 2, ...}.
(E) Starting with a single person, his family (as far as male children are concerned) is a random tree, with the degree of a node distributed according to X.
(F) A family disappears if E[X] ≤ 1, and it has a constant probability of surviving if E[X] > 1.

11.5.0.13 Galton-Watson process

(A) Infant mortality is dramatically down, so this is no longer a problem.
(B) Countries where family names were introduced a long time ago...
(C) ...have very few surnames (Koreans have about 250 surnames, and three surnames account for about 45% of the population).
(D) Countries that introduced surnames recently have many more (the Dutch have had surnames only for the last 200 years, and there are about 68,000 different family names).