CSE 548: Analysis of Algorithms. Lectures 18, 19, 20 & 21 ( Randomized Algorithms & High Probability Bounds )
1 CSE 548: Analysis of Algorithms Lectures 18, 19, 20 & 21 ( Randomized Algorithms & High Probability Bounds ) Rezaul A. Chowdhury Department of Computer Science SUNY Stony Brook Fall 2012
2 Markov's Inequality
Theorem 1: Let $X$ be a random variable that assumes only nonnegative values. Then for all $a > 0$, $\Pr[X \ge a] \le \frac{E[X]}{a}$.
Proof: For $a > 0$, let $I = 1$ if $X \ge a$; $0$ otherwise. Since $X \ge 0$, $I \le \frac{X}{a}$. We also have $E[I] = \Pr[I = 1] = \Pr[X \ge a]$. Then $\Pr[X \ge a] = E[I] \le E\left[\frac{X}{a}\right] = \frac{E[X]}{a}$.
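Markov's inequality is easy to check numerically. The sketch below is not from the slides (the function name and parameters are mine); it estimates a tail probability for the number of heads in fair coin flips and compares it against the bound $E[X]/a$:

```python
import random

def markov_check(trials=100_000, n=20, a=15):
    """Estimate Pr[X >= a] for X = #heads in n fair flips, vs. Markov's bound E[X]/a."""
    rng = random.Random(42)
    hits = sum(1 for _ in range(trials)
               if sum(rng.randint(0, 1) for _ in range(n)) >= a)
    empirical = hits / trials
    markov_bound = (n / 2) / a  # E[X] = n/2 for fair coin flips
    return empirical, markov_bound

emp, bound = markov_check()
```

The bound always holds, but it is usually very loose: for these parameters the true tail probability is far below $E[X]/a = 2/3$.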
3 Example: Coin Flipping
Let us bound the probability of obtaining more than $\frac{3n}{4}$ heads in a sequence of $n$ fair coin flips.
Let $X_i = 1$ if the $i$-th coin flip is heads; $0$ otherwise. Then the number of heads in $n$ flips is $X = \sum_{i=1}^{n} X_i$.
We know $\Pr[X_i = 1] = \frac{1}{2}$. Hence, $E[X] = \sum_{i=1}^{n} E[X_i] = \frac{n}{2}$.
Then applying Markov's inequality, $\Pr\left[X \ge \frac{3n}{4}\right] \le \frac{E[X]}{3n/4} = \frac{n/2}{3n/4} = \frac{2}{3}$.
4 Chebyshev's Inequality
Theorem 2: For any $a > 0$, $\Pr[|X - E[X]| \ge a] \le \frac{Var[X]}{a^2}$.
Proof: Observe that $\Pr[|X - E[X]| \ge a] = \Pr[(X - E[X])^2 \ge a^2]$. Since $(X - E[X])^2$ is a nonnegative random variable, we can use Markov's inequality:
$\Pr[(X - E[X])^2 \ge a^2] \le \frac{E[(X - E[X])^2]}{a^2} = \frac{Var[X]}{a^2}$.
5 Example: Fair Coin Flips
Let $X_i = 1$ if the $i$-th coin flip is heads; $0$ otherwise. Then the number of heads in $n$ flips is $X = \sum_{i=1}^{n} X_i$.
We know $\Pr[X_i = 1] = \frac{1}{2}$ and $E[X] = \frac{n}{2}$. Then $E[X_i^2] = \frac{1}{2}$.
Hence, $Var[X_i] = E[X_i^2] - (E[X_i])^2 = \frac{1}{4}$, and $Var[X] = \sum_{i=1}^{n} Var[X_i] = \frac{n}{4}$.
Then applying Chebyshev's inequality, $\Pr\left[X \ge \frac{3n}{4}\right] \le \Pr\left[\left|X - \frac{n}{2}\right| \ge \frac{n}{4}\right] \le \frac{n/4}{(n/4)^2} = \frac{4}{n}$.
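As a quick illustration (my own sketch, not part of the slides), the two bounds derived so far for $\Pr[X \ge 3n/4]$ can be computed side by side; Markov's bound stays at $2/3$ for every $n$, while Chebyshev's $4/n$ shrinks as $n$ grows:

```python
def coin_tail_bounds(n):
    """Markov and Chebyshev bounds on Pr[X >= 3n/4] for X = #heads in n fair flips."""
    mean, var = n / 2, n / 4
    markov = mean / (3 * n / 4)     # = 2/3, independent of n
    chebyshev = var / (n / 4) ** 2  # = 4/n, shrinks with n
    return markov, chebyshev
```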
6 Coin Flipping and Randomized Algorithms
Suppose we have an algorithm that is correct ("heads") only with probability $p \in (0, 1)$, and incorrect ("tails") with probability $1 - p$.
Question: How many times should we run the algorithm to be reasonably confident that it returns at least one correct solution?
Las Vegas Algorithm: You keep running the algorithm until you get a correct solution. What is the bound on the running time?
Monte Carlo Algorithm: You stop after a certain number of iterations no matter whether you found a correct solution or not. What is the probability that your solution is correct (or that you found a solution)?
7 Coin Flipping and Randomized Algorithms
Suppose we have an algorithm that is correct ("heads") only with probability $p \in (0, 1)$, and incorrect ("tails") with probability $1 - p$.
Suppose we run the algorithm $k$ times. Then the probability that no run produces a correct solution is $(1 - p)^k$. So the probability of getting at least one correct solution is $1 - (1 - p)^k$.
Set $k = \frac{c \ln n}{p}$, where $n > 1$ and $c > 0$ is a constant. Then the probability that at least one run produces a correct solution is
$1 - (1 - p)^{\frac{c \ln n}{p}} \ge 1 - e^{-c \ln n} = 1 - \frac{1}{n^c}$.
An event $\Pi$ is said to occur with high probability (w.h.p.) if $\Pr[\Pi] \ge 1 - \frac{1}{n^c}$ for some constant $c > 0$.
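The amplification argument above can be sketched in a few lines of Python. This is my own illustration: the helper name `amplify` and the toy success probability are assumptions, not part of the slides.

```python
import math
import random

def amplify(run_once, p, n, c=2):
    """Repeat a Monte Carlo procedure k = ceil(c*ln(n)/p) times; succeed if any
    single run succeeds. The failure probability is (1-p)^k <= n^(-c)."""
    k = math.ceil(c * math.log(n) / p)
    return any(run_once() for _ in range(k)), k

rng = random.Random(1)
# Toy procedure that is 'correct' with probability p = 0.3.
ok, k = amplify(lambda: rng.random() < 0.3, p=0.3, n=1000, c=2)
```

With $p = 0.3$, $n = 1000$, and $c = 2$, this makes $k = 47$ attempts, driving the failure probability below $1/n^2 = 10^{-6}$.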
8 Example: A Coloring Problem
Let $S$ be a set of $n$ items. For $1 \le i \le m$, let $S_i \subseteq S$ such that for every pair of indices $i, j$ with $i \ne j$, $S_i \ne S_j$, but not necessarily $S_i \cap S_j = \emptyset$. For each $1 \le i \le m$, $|S_i| = r$, where $m \le 2^{r-2}$.
Problem: Color each item of $S$ with one of two colors, red and blue, such that each $S_i$ contains at least one red and one blue item.
Algorithm: Take each item of $S$ and color it either red or blue independently at random (with probability $\frac{1}{2}$ each).
Clearly, the algorithm does not always lead to a valid coloring (i.e., satisfy the constraints given in our problem statement). What is the probability that it produces a valid coloring?
9 Example: A Coloring Problem
For $1 \le i \le m$, let $A_i$ and $B_i$ be the events that all items of $S_i$ are colored red and blue, respectively. Then $\Pr[A_i] = \Pr[B_i] = 2^{-r}$ for every $1 \le i \le m$.
Thus $\Pr[A_i \cup B_i] \le \Pr[A_i] + \Pr[B_i] = 2^{1-r}$.
Hence, $\Pr\left[\bigcup_{i=1}^{m} (A_i \cup B_i)\right] \le \sum_{i=1}^{m} \Pr[A_i \cup B_i] \le m \cdot 2^{1-r} \le 2^{r-2} \cdot 2^{1-r} = \frac{1}{2}$,
and so $\Pr\left[\overline{\bigcup_{i=1}^{m} (A_i \cup B_i)}\right] \ge 1 - \frac{1}{2} = \frac{1}{2}$.
Hence, the algorithm is correct with probability at least $\frac{1}{2}$.
To check if the algorithm has produced a correct result we simply check the items in each $S_i$ to verify that neither $A_i$ nor $B_i$ holds. Hence, we can use this simple algorithm to design a Las Vegas algorithm for solving the coloring problem!
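A minimal sketch of the resulting Las Vegas algorithm (my own illustration; the example set system is hypothetical but satisfies the size constraint with $r = 4$ and $m = 4$):

```python
import random

def las_vegas_coloring(items, sets, seed=None):
    """Color items red/blue at random until every set contains both colors.
    When m <= 2^(r-2), each attempt succeeds with probability >= 1/2,
    so the expected number of attempts is at most 2."""
    rng = random.Random(seed)
    attempts = 0
    while True:
        attempts += 1
        color = {x: rng.choice(("red", "blue")) for x in items}
        if all(len({color[x] for x in s}) == 2 for s in sets):
            return color, attempts

items = list(range(8))
# m = 4 sets of size r = 4, so m <= 2^(r-2) holds.
sets = [{0, 1, 2, 3}, {2, 3, 4, 5}, {4, 5, 6, 7}, {0, 3, 5, 6}]
coloring, attempts = las_vegas_coloring(items, sets, seed=0)
```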
10 Example: The Min-Cut Problem
Let $G = (V, E)$ be a connected, undirected multigraph with $|V| = n$. A cut in $G$ is a set $C \subseteq E$ such that $(V, E \setminus C)$ is not connected. A min-cut is a cut of minimum cardinality. (The multigraph shown on the slide has a min-cut of size 2.)
Most deterministic algorithms for finding min-cuts are based on network flows and hence are quite complicated. Instead, in this lecture we will look at a very simple probabilistic algorithm that finds min-cuts with some probability $p > 0$.
11 Example: The Min-Cut Problem
We apply the following contraction step $n - 2$ times on $G = (V, E)$:
- Select an edge (say, $(u, v)$) from $E$ uniformly at random.
- Merge $u$ and $v$ into a single super vertex.
- Remove all edges between $u$ and $v$ from $E$.
- If as a result of the contraction there is more than one edge between some pairs of super vertices, retain them all.
Let the initial graph be $G_0 = (V_0, E_0)$, where $V_0 = V$ and $E_0 = E$. Let $G_i = (V_i, E_i)$ be the multigraph after step $i$, $1 \le i \le n - 2$. Then clearly $|V_i| = n - i$, and thus $|V_{n-2}| = 2$. We return $E_{n-2}$ as our solution.
12 Example: The Min-Cut Problem
Let us fix our attention on a particular min-cut $C$ of $G_0$. What is the probability that $E_{n-2} = C$?
Suppose $|C| = k$. Then each vertex of $G_0$ must have degree at least $k$, as otherwise $G_0$ could be disconnected by removing fewer than $k$ edges. Hence, $|E_0| \ge \frac{nk}{2}$, and in general $|E_{i-1}| \ge \frac{(n-i+1)k}{2}$.
Let $\Pi_i$ be the event of not picking an edge of $C$ for contraction in step $i$, $1 \le i \le n - 2$. Then clearly,
$\Pr[\Pi_1] \ge 1 - \frac{k}{nk/2} = 1 - \frac{2}{n}$.
Also $\Pr[\Pi_2 \mid \Pi_1] \ge 1 - \frac{k}{(n-1)k/2} = 1 - \frac{2}{n-1}$.
In general, $\Pr\left[\Pi_i \mid \bigcap_{j=1}^{i-1} \Pi_j\right] \ge 1 - \frac{2}{n-i+1}$.
13 Example: The Min-Cut Problem
The probability that no edge of $C$ was ever picked by the algorithm is:
$\Pr\left[\bigcap_{i=1}^{n-2} \Pi_i\right] \ge \prod_{i=1}^{n-2} \left(1 - \frac{2}{n-i+1}\right) = \prod_{i=1}^{n-2} \frac{n-i-1}{n-i+1} = \frac{2}{n(n-1)}$.
So $\Pr[E_{n-2} = C] \ge \frac{2}{n(n-1)} > \frac{2}{n^2}$, and $\Pr[E_{n-2} \ne C] < 1 - \frac{2}{n^2}$.
Suppose we run the algorithm $\frac{n^2}{2}$ times, and return the smallest cut, say $C'$, obtained from those $\frac{n^2}{2}$ attempts. Then $\Pr[C' \ne C] < \left(1 - \frac{2}{n^2}\right)^{n^2/2} < \frac{1}{e}$, and so $\Pr[C' = C] > 1 - \frac{1}{e}$.
Hence, the algorithm will return a min-cut with probability greater than $1 - \frac{1}{e}$.
But we do not know how to detect if the cut returned by the algorithm is, indeed, a min-cut. Still we can design a Monte Carlo algorithm based on this simple idea to produce a min-cut with high probability!
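The contraction algorithm on these slides is Karger's algorithm. A compact illustrative implementation (my own sketch, using a union-find structure to track super vertices; the example graph is hypothetical) might look like:

```python
import random

def karger_min_cut(edges, n, seed=None):
    """One run of the contraction algorithm on a multigraph with vertices 0..n-1.
    Returns the size of the cut found; it equals a fixed min-cut with
    probability at least 2/(n(n-1))."""
    rng = random.Random(seed)
    edges = list(edges)
    parent = list(range(n))

    def find(x):  # union-find with path halving to track super vertices
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    vertices = n
    while vertices > 2:
        i = rng.randrange(len(edges))
        u, v = edges[i]
        ru, rv = find(u), find(v)
        if ru == rv:
            edges.pop(i)     # edge already inside a super vertex; discard
            continue
        parent[ru] = rv      # contract: merge the two super vertices
        vertices -= 1
    # edges whose endpoints lie in different super vertices form the cut
    return sum(1 for u, v in edges if find(u) != find(v))

# Two triangles joined by two edges: the min-cut has size 2.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (0, 3), (1, 4)]
best = min(karger_min_cut(edges, 6, seed=s) for s in range(20))
```

Taking the minimum over repeated runs is exactly the boosting step described on the slide.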
14 When Only One Success is Not Enough
In both examples we have looked at so far, we were happy with only one success, and the analysis was easy.
But sometimes we need the algorithm to succeed at least (or at most) a certain number of times (we will see a very familiar such example shortly). The number of successful runs required often depends on the size of the input.
How do we analyze such algorithms?
15 Binomial Distribution
The binomial distribution is the discrete probability distribution of the number of successes in a sequence of $n$ independent yes/no experiments (i.e., Bernoulli trials), each of which succeeds with probability $p$.
Probability mass function: $f(k; n, p) = \Pr[X = k] = \binom{n}{k} p^k (1-p)^{n-k}$, for $k = 0, 1, \ldots, n$.
Cumulative distribution function: $F(k; n, p) = \Pr[X \le k] = \sum_{i=0}^{\lfloor k \rfloor} \binom{n}{i} p^i (1-p)^{n-i}$.
(Plot: binomial distribution with $n = 20$ and $p = \frac{1}{2}$. Source: Prof. Nick Harvey, UBC.)
16 Approximating with Normal Distribution
The normal distribution with mean $\mu$ and variance $\sigma^2$ is given by:
$f(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$, $x \in \mathbb{R}$.
For fixed $p$, as $n$ increases the binomial distribution with parameters $n$ and $p$ is well approximated by a normal distribution with $\mu = np$ and $\sigma = \sqrt{np(1-p)}$.
(Plot: binomial distribution with $n = 20$ and $p = \frac{1}{2}$ vs. normal distribution with $\mu = 10$ and $\sigma^2 = 5$. Source: Prof. Nick Harvey, UBC.)
17 Approximating with Normal Distribution
The probability that a normally distributed random variable $X$ lies in the interval $[a, b]$ is given by:
$\Pr[a \le X \le b] = \int_a^b f(x; \mu, \sigma)\, dx = \frac{1}{2}\left[\operatorname{erf}\left(\frac{b-\mu}{\sigma\sqrt{2}}\right) - \operatorname{erf}\left(\frac{a-\mu}{\sigma\sqrt{2}}\right)\right]$,
where $\operatorname{erf}(z) = \frac{2}{\sqrt{\pi}} \int_0^z e^{-t^2}\, dt$.
But $\operatorname{erf}$ cannot be expressed in closed form in terms of elementary functions, and hence is difficult to evaluate.
(Source: Prof. Nick Harvey, UBC.)
18 Approximating with Poisson Distribution
The Poisson distribution with mean $\lambda > 0$ is given by:
$f(k; \lambda) = \Pr[X = k] = \frac{\lambda^k e^{-\lambda}}{k!}$, $k = 0, 1, 2, \ldots$
If $np$ is fixed and $n$ increases, the binomial distribution with parameters $n$ and $p$ is well approximated by a Poisson distribution with $\lambda = np$.
(Plot: a skewed binomial distribution compared with its Poisson approximation. Observe that the asymmetry in the plot cannot be well approximated by a symmetric normal distribution. Source: Prof. Nick Harvey, UBC.)
19 Preparing for Chernoff Bounds
Lemma 1: Let $X_1, \ldots, X_n$ be independent Poisson trials, that is, each $X_i$ is a 0-1 random variable with $\Pr[X_i = 1] = p_i$ for some $p_i$. Let $X = \sum_{i=1}^{n} X_i$ and $\mu = E[X] = \sum_{i=1}^{n} p_i$. Then for any $t > 0$, $E[e^{tX}] \le e^{(e^t - 1)\mu}$.
Proof: $E[e^{tX_i}] = p_i e^t + (1 - p_i) = 1 + p_i(e^t - 1)$. But for any $x$, $1 + x \le e^x$. Hence, $E[e^{tX_i}] \le e^{p_i(e^t - 1)}$.
Now, since the $X_i$ are independent,
$E[e^{tX}] = E\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} E[e^{tX_i}] \le \prod_{i=1}^{n} e^{p_i(e^t - 1)} = e^{(e^t - 1)\mu}$.
20 Chernoff Bound 1
Theorem 3: Let $X_1, \ldots, X_n$ be independent Poisson trials, that is, each $X_i$ is a 0-1 random variable with $\Pr[X_i = 1] = p_i$ for some $p_i$. Let $X = \sum_{i=1}^{n} X_i$ and $\mu = E[X]$. Then for any $\delta > 0$,
$\Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$.
Proof: Applying Markov's inequality, for any $t > 0$,
$\Pr[X \ge (1+\delta)\mu] = \Pr[e^{tX} \ge e^{t(1+\delta)\mu}] \le \frac{E[e^{tX}]}{e^{t(1+\delta)\mu}} \le \frac{e^{(e^t - 1)\mu}}{e^{t(1+\delta)\mu}}$. [Lemma 1]
Setting $t = \ln(1+\delta) > 0$ (i.e., $e^t = 1+\delta$), we get $\Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$.
21 Chernoff Bound 2
Theorem 4: For $0 < \delta < 1$, $\Pr[X \ge (1+\delta)\mu] \le e^{-\frac{\mu\delta^2}{3}}$.
Proof: From Theorem 3, for $\delta > 0$, $\Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$.
We will show that for $0 < \delta < 1$, $\frac{e^{\delta}}{(1+\delta)^{1+\delta}} \le e^{-\frac{\delta^2}{3}}$. That is, $f(\delta) = \delta - (1+\delta)\ln(1+\delta) + \frac{\delta^2}{3} \le 0$.
We have $f'(\delta) = -\ln(1+\delta) + \frac{2\delta}{3}$, and $f''(\delta) = -\frac{1}{1+\delta} + \frac{2}{3}$.
Observe that $f''(\delta) < 0$ for $0 \le \delta < \frac{1}{2}$, and $f''(\delta) > 0$ for $\delta > \frac{1}{2}$.
Hence, $f'(\delta)$ first decreases and then increases over $[0, 1)$. Since $f'(0) = 0$ and $f'(1) < 0$, we have $f'(\delta) \le 0$ over $[0, 1)$.
Since $f(0) = 0$, it follows that $f(\delta) \le 0$ in that interval.
28 Chernoff Bound 3
Corollary 1: For $R \ge 6\mu$, $\Pr[X \ge R] \le 2^{-R}$.
Proof: From Theorem 3, for $\delta > 0$, $\Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu} \le \left(\frac{e}{1+\delta}\right)^{(1+\delta)\mu}$.
Setting $R = (1+\delta)\mu$, so that $1+\delta = \frac{R}{\mu} \ge 6$, we get $\Pr[X \ge R] \le \left(\frac{e}{6}\right)^{R} \le 2^{-R}$ for $R \ge 6\mu$.
29 Example: Fair Coin Flips
Let $X_i = 1$ if the $i$-th coin flip is heads; $0$ otherwise. Then the number of heads in $n$ flips is $X = \sum_{i=1}^{n} X_i$.
We know $\Pr[X_i = 1] = \frac{1}{2}$. Hence, $\mu = E[X] = \frac{n}{2}$.
Now putting $\delta = \frac{1}{2}$ in Chernoff bound 2, we have
$\Pr\left[X \ge \frac{3n}{4}\right] = \Pr\left[X \ge \left(1 + \frac{1}{2}\right)\mu\right] \le e^{-\frac{\mu\delta^2}{3}} = e^{-\frac{n}{24}}$.
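Side by side (an illustrative sketch of mine, not from the slides), the three bounds derived for the same tail event $\Pr[X \ge 3n/4]$ show how much sharper the Chernoff bound is:

```python
import math

def three_bounds(n):
    """Markov, Chebyshev, and Chernoff bounds on Pr[X >= 3n/4], n fair flips."""
    markov = 2 / 3
    chebyshev = 4 / n
    chernoff = math.exp(-n / 24)  # Chernoff bound 2 with mu = n/2, delta = 1/2
    return markov, chebyshev, chernoff
```

Already at $n = 1000$ the Chernoff bound is below $10^{-18}$, while Chebyshev gives only $0.004$ and Markov a constant $2/3$.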
30 Chernoff Bounds 4, 5 and 6
Theorem 5: For $0 < \delta < 1$, $\Pr[X \le (1-\delta)\mu] \le \left(\frac{e^{-\delta}}{(1-\delta)^{1-\delta}}\right)^{\mu}$.
Theorem 6: For $0 < \delta < 1$, $\Pr[X \le (1-\delta)\mu] \le e^{-\frac{\mu\delta^2}{2}}$.
Corollary 2: For $0 < \delta < 1$, $\Pr[|X - \mu| \ge \delta\mu] \le 2 e^{-\frac{\mu\delta^2}{3}}$.
31 Chernoff Bounds
Upper Tail:
- $\Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu}$ for $\delta > 0$
- $\Pr[X \ge (1+\delta)\mu] \le e^{-\frac{\mu\delta^2}{3}}$ for $0 < \delta < 1$
- $\Pr[X \ge R] \le 2^{-R}$ for $R \ge 6\mu$
Lower Tail:
- $\Pr[X \le (1-\delta)\mu] \le \left(\frac{e^{-\delta}}{(1-\delta)^{1-\delta}}\right)^{\mu}$ for $0 < \delta < 1$
- $\Pr[X \le (1-\delta)\mu] \le e^{-\frac{\mu\delta^2}{2}}$ for $0 < \delta < 1$
- $\Pr[|X - \mu| \ge \delta\mu] \le 2 e^{-\frac{\mu\delta^2}{3}}$ for $0 < \delta < 1$
Source: greedyalgs.info
32 Randomized Quicksort ( RANDQS )
33 Randomized Quicksort ( RANDQS )
Input: A set $S$ of $n$ distinct numbers.
Output: The numbers of $S$ sorted in increasing order.
Steps:
1. Pivot Selection: Select a number $x$ from $S$ uniformly at random.
2. Partition: Compare each number of $S$ with $x$, and determine the sets $S_1 = \{y \in S : y < x\}$ and $S_2 = \{y \in S : y > x\}$.
3. Recursion: Recursively sort $S_1$ and $S_2$.
4. Output: Output the sorted version of $S_1$, followed by $x$, followed by the sorted version of $S_2$.
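A direct transcription of these four steps into Python (an illustrative sketch; the slides themselves give no code):

```python
import random

def randqs(S, rng=random):
    """Randomized quicksort following the four steps above (elements distinct)."""
    if len(S) <= 1:
        return list(S)
    x = rng.choice(S)             # 1. pivot selected uniformly at random
    S1 = [y for y in S if y < x]  # 2. partition around the pivot
    S2 = [y for y in S if y > x]
    return randqs(S1, rng) + [x] + randqs(S2, rng)  # 3-4. recurse, concatenate

data = random.Random(7).sample(range(1000), 200)
out = randqs(data, random.Random(7))
```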
34 Randomized Quicksort ( RANDQS )
Assumption: RANDQS is called only on nonempty sets.
Observation: If $|S| = n$, fewer than $n$ recursive calls to RANDQS will be made during the sorting of $S$. (why?)
Observation: If $|S| = n$, and $X$ is the total number of comparisons made in step 2 (Partition) across all (original and recursive) calls to RANDQS, then RANDQS sorts $S$ in $O(n + X)$ time.
35 Expected Running Time of RANDQS
Observation: If $|S| = n$, and $X$ is the total number of comparisons made in step 2 (Partition) across all (original and recursive) calls to RANDQS, then RANDQS sorts $S$ in $O(n + X)$ time.
Then all we need to do is determine $E[X]$.
Let $s_1 < s_2 < \cdots < s_n$ be the elements of $S$ in sorted order. Let $S_{ij} = \{s_i, s_{i+1}, \ldots, s_j\}$ for all $1 \le i < j \le n$.
Observe that each pair of elements of $S$ is compared at most once during the entire execution of the algorithm. (why?)
For $1 \le i < j \le n$, let $X_{ij} = 1$ if $s_i$ is compared to $s_j$; $0$ otherwise. Then $X = \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} X_{ij}$.
36 Expected Running Time of RANDQS
For $1 \le i < j \le n$, let $X_{ij} = 1$ if $s_i$ is compared to $s_j$; $0$ otherwise. Then what is $\Pr[X_{ij} = 1]$?
Observations:
- Once a pivot $x$ with $s_i < x < s_j$ is chosen, $s_i$ and $s_j$ will never be compared at any subsequent time. (why?)
- If either $s_i$ or $s_j$ is chosen as a pivot before any other item in $S_{ij}$, then $s_i$ will be compared with $s_j$. (why?)
37 Expected Running Time of RANDQS
Since each element of $S_{ij}$ is equally likely to be the first one of $S_{ij}$ chosen as a pivot:
$\Pr[X_{ij} = 1] = \frac{2}{j - i + 1}$.
Hence, $E[X] = \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \Pr[X_{ij} = 1] = \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \frac{2}{j - i + 1} \le \sum_{i=1}^{n} \sum_{k=1}^{n} \frac{2}{k} = 2n H_n = O(n \log n)$,
where $H_n = \sum_{k=1}^{n} \frac{1}{k} = O(\log n)$ is the $n$-th harmonic number.
Thus the expected running time of RANDQS is $O(n \log n)$.
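The harmonic-sum bound can be checked exactly for small $n$. This is my own sketch: it groups the pairs $(i, j)$ by the gap $k = j - i + 1$, which occurs $n - k + 1$ times for each $k = 2, \ldots, n$.

```python
def expected_comparisons(n):
    """E[X] = sum over i<j of 2/(j-i+1); the gap k = j-i+1 occurs n-k+1 times."""
    return sum((n - k + 1) * 2 / k for k in range(2, n + 1))

def harmonic_bound(n):
    """The slide's upper bound 2*n*H_n."""
    return 2 * n * sum(1 / k for k in range(1, n + 1))
```

For $n = 100$ the exact expectation is about $648$ comparisons, comfortably below the bound $2n H_n \approx 1037$.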
38 High Probability Bound for RANDQS
We will prove that w.h.p. the running time of RANDQS does not exceed its expected running time by more than a constant factor.
In other words, we show that w.h.p. RANDQS runs in $O(n \log n)$ time.
39 High Probability Bound for RANDQS
Let us fix an element $s$ in the original input set $S$ of size $n$. We will trace the partition containing $s$ for $c \ln n$ levels of recursion, where $c$ is a constant to be determined later.
If a partitioning step divides a set $S'$ such that $|S'_1| \le \frac{3|S'|}{4}$ and $|S'_2| \le \frac{3|S'|}{4}$, we call that partition a balanced partition.
40 High Probability Bound for RANDQS
We will prove that among the $c \ln n$ partitioning steps $s$ undergoes, w.h.p. at least $\frac{c}{4} \ln n$ result in balanced partitions.
If at any point $s$ is in a partition of size $m$, after a balanced partitioning step it ends up in a partition of size at most $\frac{3m}{4}$.
Since the input size is $n$, after $\frac{c}{4} \ln n$ balanced partitions, $s$ will end up in a partition of size at most $\left(\frac{3}{4}\right)^{\frac{c}{4} \ln n} n$, which is $< 1$ for $c \ge 14$.
That means if $c \ge 14$, then $s$ will end up in its final sorted position in the output after undergoing $\frac{c}{4} \ln n$ balanced partitions.
41 High Probability Bound for RANDQS
For $1 \le i \le c \ln n$, let $X_i = 1$ if the partition at recursion level $i$ is balanced; $0$ otherwise.
But a balanced partition is obtained by choosing a pivot with rank between $\frac{m}{4}$ and $\frac{3m}{4}$, where $m$ is the size of the set being partitioned. Since each element of the set is chosen as a pivot uniformly at random, a balancing pivot will be chosen with probability at least $\frac{1}{2}$.
Hence, $\Pr[X_i = 1] \ge \frac{1}{2}$. Thus $E[X_i] \ge \frac{1}{2}$.
42 High Probability Bound for RANDQS
Total number of balanced partitions, $X = \sum_{i=1}^{c \ln n} X_i$. Then $\mu = E[X] \ge \frac{c}{2} \ln n$.
Now applying Chernoff bound 5 (see Theorem 6) with $\delta = \frac{1}{2}$,
$\Pr\left[X \le \frac{c}{4} \ln n\right] \le \Pr\left[X \le \left(1 - \frac{1}{2}\right)\mu\right] \le e^{-\frac{\mu\delta^2}{2}} \le e^{-\frac{c}{16} \ln n} = n^{-\frac{c}{16}}$.
For $c = 32$, we have $\Pr[X \le 8 \ln n] \le \frac{1}{n^2}$.
This means that the probability that $s$ fails to reach its final sorted position even after $32 \ln n$ levels of recursion is at most $\frac{1}{n^2}$.
43 High Probability Bound for RANDQS
The probability that at least one of the $n$ input elements fails to reach its final sorted position after $32 \ln n$ levels of recursion is at most $n \cdot \frac{1}{n^2} = \frac{1}{n}$.
So the probability that all $n$ input elements reach their final sorted positions after $32 \ln n$ levels of recursion is at least $1 - \frac{1}{n}$.
But observe that the total amount of work done in each level of recursion is $O(n)$. So the total work done in $32 \ln n$ levels of recursion is $O(n \log n)$.
Hence, w.h.p. RANDQS terminates in $O(n \log n)$ time.
44 Random Skip Lists
45 Searching in a Sorted Linked List
Traditional Linked List ($n$ items): SEARCH takes $\le n$ time.
2-level Linked List: SEARCH takes $\le 2\sqrt{n}$ time.
46 Searching in a Sorted Linked List
3-level Linked List: SEARCH takes $\le 3\, n^{1/3}$ time.
$L$-level Linked List: SEARCH takes $\le L\, n^{1/L}$ time.
For $L = \log_2 n$: SEARCH takes $\le (\log_2 n) \cdot n^{1/\log_2 n} = 2 \log_2 n$ time!
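The trade-off $L \cdot n^{1/L}$ is easy to tabulate (a small illustrative helper of mine, not from the slides):

```python
def search_cost(n, L):
    """Worst-case SEARCH cost L * n^(1/L) in an L-level sorted linked list."""
    return L * n ** (1 / L)
```

For $n = 2^{20}$: one level costs $n$, two levels cost $2\sqrt{n} = 2048$, and $L = \log_2 n = 20$ levels cost only $2 \log_2 n = 40$.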
47 Searching in a Sorted Linked List
$L$-level Linked List ($n$ items): SEARCH takes $\le 2 \log_2 n$ time!
Observations:
1. Let $n_i$ = #items in level $i$. Then $n_i = \frac{n}{2^i}$.
2. Let $d_i$ = #items in level $i$ that have not reached level $i+1$. Then $d_i = \frac{n_i}{2}$.
48 Searching in a Sorted Linked List
$L$-level Linked List ($n$ items): SEARCH takes $\le 2 \log_2 n$ time!
How do we maintain this regular structure under insertion and deletion of items? A deterministic solution does not seem straightforward. But randomization can make life really easy!
49 Random Skip Lists
Construction:
1. Start with all $n$ items along with a sentinel ($-\infty$) in level 0.
2. Promote each non-sentinel item of level $i$ to level $i+1$ with probability $\frac{1}{2}$, for $i = 0, 1, 2, \ldots$, until some level is empty. If level $i+1$ is nonempty, promote the sentinel, too.
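A minimal sketch of the construction (my own illustration; it builds the levels only, without the forward pointers a full skip list would use for searching):

```python
import random

def build_skip_list(items, rng=random):
    """Level 0 holds all items; each level-i item is promoted to level i+1
    with probability 1/2, until some level comes out empty."""
    levels = [sorted(items)]
    while levels[-1]:
        levels.append([x for x in levels[-1] if rng.random() < 0.5])
    return levels[:-1]  # drop the trailing empty level

levels = build_skip_list(range(1024), random.Random(3))
height = len(levels) - 1  # levels above level 0; w.h.p. O(log n)
```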
50 Random Skip Lists
Let $S$ be a skip list, $L_i$ be the set of all items in level $i \ge 1$, $h(x) = \max\{i : x \in L_i\}$ be the height of item $x$, and $H = \max_{x} h(x)$ be the height of the skip list.
51 Random Skip Lists
Clearly, for each item $x$ and each $i \ge 1$, $\Pr[x \in L_i] = 2^{-i}$.
Then $\Pr[h(x) \ge i] = \Pr[x \in L_i] = 2^{-i}$.
$\Pr[H \ge i] \le \sum_{x} \Pr[h(x) \ge i] = n \cdot 2^{-i}$, and $\Pr[H < i] \ge 1 - n \cdot 2^{-i}$.
For constant $c \ge 2$, $\Pr[H < c \log_2 n] \ge 1 - n \cdot 2^{-c \log_2 n} = 1 - \frac{1}{n^{c-1}}$.
Hence, w.h.p. the height of a skip list is $O(\log n)$.
52 Random Skip Lists
Let us flip $4c \ln n$ fair coins, and let $X$ be the number of heads we get. Then $\mu = E[X] = \frac{4c \ln n}{2} = 2c \ln n$.
We know for $0 < \delta < 1$ (Chernoff bound), $\Pr[X \le (1-\delta)\mu] \le e^{-\frac{\mu\delta^2}{2}}$.
Putting $\delta = \frac{1}{2}$ and $\mu = 2c \ln n$, we get $\Pr[X \le c \ln n] \le e^{-\frac{c}{4} \ln n} = n^{-\frac{c}{4}}$.
For $c = 16$, $\Pr[X \le 16 \ln n] \le \frac{1}{n^4}$.
Hence, w.h.p. we will get more than $c \ln n$ heads.
More information1. Discrete Distributions
Virtual Laboratories > 2. Distributions > 1 2 3 4 5 6 7 8 1. Discrete Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an underlying sample space Ω.
More informationModel Counting for Logical Theories
Model Counting for Logical Theories Wednesday Dmitry Chistikov Rayna Dimitrova Department of Computer Science University of Oxford, UK Max Planck Institute for Software Systems (MPI-SWS) Kaiserslautern
More informationApplication: Bucket Sort
5.2.2. Application: Bucket Sort Bucket sort breaks the log) lower bound for standard comparison-based sorting, under certain assumptions on the input We want to sort a set of =2 integers chosen I+U@R from
More informationCopyright 2000, Kevin Wayne 1
Global Minimum Cut Contraction Algorithm CS 580: Algorithm Design and Analysis Jeremiah Blocki Purdue University Spring 2018 Announcements: Homework 6 deadline extended to April 24 th at 11:59 PM Course
More informationSUFFICIENT STATISTICS
SUFFICIENT STATISTICS. Introduction Let X (X,..., X n ) be a random sample from f θ, where θ Θ is unknown. We are interested using X to estimate θ. In the simple case where X i Bern(p), we found that the
More informationRandomized Algorithms III Min Cut
Chapter 11 Randomized Algorithms III Min Cut CS 57: Algorithms, Fall 01 October 1, 01 11.1 Min Cut 11.1.1 Problem Definition 11. Min cut 11..0.1 Min cut G = V, E): undirected graph, n vertices, m edges.
More informationErrata for the ASM Study Manual for Exam P, Fourth Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA
Errata for the ASM Study Manual for Exam P, Fourth Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Effective July 5, 3, only the latest edition of this manual will have its
More informationNotes 6 : First and second moment methods
Notes 6 : First and second moment methods Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Roc, Sections 2.1-2.3]. Recall: THM 6.1 (Markov s inequality) Let X be a non-negative
More informationLecture 5: Probabilistic tools and Applications II
T-79.7003: Graphs and Networks Fall 2013 Lecture 5: Probabilistic tools and Applications II Lecturer: Charalampos E. Tsourakakis Oct. 11, 2013 5.1 Overview In the first part of today s lecture we will
More informationAssignment 4: Solutions
Math 340: Discrete Structures II Assignment 4: Solutions. Random Walks. Consider a random walk on an connected, non-bipartite, undirected graph G. Show that, in the long run, the walk will traverse each
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationRandom Number Generators
1/18 Random Number Generators Professor Karl Sigman Columbia University Department of IEOR New York City USA 2/18 Introduction Your computer generates" numbers U 1, U 2, U 3,... that are considered independent
More informationF 99 Final Quiz Solutions
Massachusetts Institute of Technology Handout 53 6.04J/18.06J: Mathematics for Computer Science May 11, 000 Professors David Karger and Nancy Lynch F 99 Final Quiz Solutions Problem 1 [18 points] Injections
More informationCSC 446 Notes: Lecture 13
CSC 446 Notes: Lecture 3 The Problem We have already studied how to calculate the probability of a variable or variables using the message passing method. However, there are some times when the structure
More informationBernoulli Trials, Binomial and Cumulative Distributions
Bernoulli Trials, Binomial and Cumulative Distributions Sec 4.4-4.6 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 9-3339 Cathy Poliak,
More informationLecture 4: Probability and Discrete Random Variables
Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 4: Probability and Discrete Random Variables Wednesday, January 21, 2009 Lecturer: Atri Rudra Scribe: Anonymous 1
More informationPage Max. Possible Points Total 100
Math 3215 Exam 2 Summer 2014 Instructor: Sal Barone Name: GT username: 1. No books or notes are allowed. 2. You may use ONLY NON-GRAPHING and NON-PROGRAMABLE scientific calculators. All other electronic
More informationExpectation, inequalities and laws of large numbers
Chapter 3 Expectation, inequalities and laws of large numbers 3. Expectation and Variance Indicator random variable Let us suppose that the event A partitions the sample space S, i.e. A A S. The indicator
More informationOutline. CS38 Introduction to Algorithms. Max-3-SAT approximation algorithm 6/3/2014. randomness in algorithms. Lecture 19 June 3, 2014
Outline CS38 Introduction to Algorithms Lecture 19 June 3, 2014 randomness in algorithms max-3-sat approximation algorithm universal hashing load balancing Course summary and review June 3, 2014 CS38 Lecture
More informationContinuous-Valued Probability Review
CS 6323 Continuous-Valued Probability Review Prof. Gregory Provan Department of Computer Science University College Cork 2 Overview Review of discrete distributions Continuous distributions 3 Discrete
More informationCS 125 Section #12 (More) Probability and Randomized Algorithms 11/24/14. For random numbers X which only take on nonnegative integer values, E(X) =
CS 125 Section #12 (More) Probability and Randomized Algorithms 11/24/14 1 Probability First, recall a couple useful facts from last time about probability: Linearity of expectation: E(aX + by ) = ae(x)
More informationQuicksort (CLRS 7) We previously saw how the divide-and-conquer technique can be used to design sorting algorithm Merge-sort
Quicksort (CLRS 7) We previously saw how the divide-and-conquer technique can be used to design sorting algorithm Merge-sort Partition n elements array A into two subarrays of n/2 elements each Sort the
More informationThe Probabilistic Method
Lecture 3: Tail bounds, Probabilistic Method Today we will see what is nown as the probabilistic method for showing the existence of combinatorial objects. We also review basic concentration inequalities.
More informationCSCE 750 Final Exam Answer Key Wednesday December 7, 2005
CSCE 750 Final Exam Answer Key Wednesday December 7, 2005 Do all problems. Put your answers on blank paper or in a test booklet. There are 00 points total in the exam. You have 80 minutes. Please note
More informationCS161: Algorithm Design and Analysis Recitation Section 3 Stanford University Week of 29 January, Problem 3-1.
CS161: Algorithm Design and Analysis Recitation Section 3 Stanford University Week of 29 January, 2018 Problem 3-1. (Quicksort Median-of-3 Partition) One way to improve the randomized quicksort procedure
More informationLecture 1: Contraction Algorithm
CSE 5: Design and Analysis of Algorithms I Spring 06 Lecture : Contraction Algorithm Lecturer: Shayan Oveis Gharan March 8th Scribe: Mohammad Javad Hosseini Disclaimer: These notes have not been subjected
More informationCSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions)
CSE 312: Foundations of Computing II Quiz Section #10: Review Questions for Final Exam (solutions) 1. (Confidence Intervals, CLT) Let X 1,..., X n be iid with unknown mean θ and known variance σ 2. Assume
More informationMa/CS 6b Class 15: The Probabilistic Method 2
Ma/CS 6b Class 15: The Probabilistic Method 2 By Adam Sheffer Reminder: Random Variables A random variable is a function from the set of possible events to R. Example. Say that we flip five coins. We can
More informationChapter 11. Min Cut Min Cut Problem Definition Some Definitions. By Sariel Har-Peled, December 10, Version: 1.
Chapter 11 Min Cut By Sariel Har-Peled, December 10, 013 1 Version: 1.0 I built on the sand And it tumbled down, I built on a rock And it tumbled down. Now when I build, I shall begin With the smoke from
More informationThe Monte Carlo Method
The Monte Carlo Method Example: estimate the value of π. Choose X and Y independently and uniformly at random in [0, 1]. Let Pr(Z = 1) = π 4. 4E[Z] = π. { 1 if X Z = 2 + Y 2 1, 0 otherwise, Let Z 1,...,
More informationLecture 4: Two-point Sampling, Coupon Collector s problem
Randomized Algorithms Lecture 4: Two-point Sampling, Coupon Collector s problem Sotiris Nikoletseas Associate Professor CEID - ETY Course 2013-2014 Sotiris Nikoletseas, Associate Professor Randomized Algorithms
More informationLecture Notes CS:5360 Randomized Algorithms Lecture 20 and 21: Nov 6th and 8th, 2018 Scribe: Qianhang Sun
1 Probabilistic Method Lecture Notes CS:5360 Randomized Algorithms Lecture 20 and 21: Nov 6th and 8th, 2018 Scribe: Qianhang Sun Turning the MaxCut proof into an algorithm. { Las Vegas Algorithm Algorithm
More informationACO Comprehensive Exam October 14 and 15, 2013
1. Computability, Complexity and Algorithms (a) Let G be the complete graph on n vertices, and let c : V (G) V (G) [0, ) be a symmetric cost function. Consider the following closest point heuristic for
More information