Randomized Algorithms


Tutorial 4 Joyce 2009-11-24

Outline

Solution to Midterm
  Question 1
  Question 2
Question 1
Question 2
Question 3
Question 4
Question 5

Solution to Midterm

Solution to Midterm Question 1 [Question 1]: Alice and Bob decide to have children until either they have their first girl or they have $k \ge 1$ children. Assume that each child is a boy with probability 1/3.
1. What is the expected number of girls that they have?
2. What is the expected number of boys that they have?
3. Suppose Alice and Bob simply decide to keep having children until they have their first girl. What is the expected number of boys that they have?

Solution to Midterm Question 1 [Solution]: Let $E[X]$ be the expected number of girls, $E[Y]$ the expected number of boys, and $E[Z]$ the expected number of all children. We will use the finite geometric series
$$\sum_{k=0}^{n} a\,r^k = \frac{a(1 - r^{n+1})}{1 - r}.$$
They have a girl unless all of their children are boys, so
$$E[X] = \frac{1}{3^k} \cdot 0 + \left(1 - \frac{1}{3^k}\right) \cdot 1 = 1 - \frac{1}{3^k}.$$
For $E[Z]$, the process ends with the $j$-th child if that child is the first girl ($j \le k$), or after $k$ children if the first $k-1$ are all boys. Writing $\Pr(Z = k) = (1/3)^{k-1} = (1/3)^{k-1}(2/3) + (1/3)^k$, we get
$$E[Z] = 1 \cdot \frac{2}{3} + 2 \cdot \frac{1}{3} \cdot \frac{2}{3} + \cdots + k \cdot \frac{1}{3^{k-1}} \cdot \frac{2}{3} + k \cdot \frac{1}{3^k}$$
$$= \left(\frac{2}{3} + \frac{2}{3^2} + \cdots + \frac{2}{3^k}\right) + \left(\frac{2}{3^2} + \frac{2}{3^3} + \cdots + \frac{2}{3^k}\right) + \cdots + \left(\frac{2}{3^{k-1}} + \frac{2}{3^k}\right) + \frac{2}{3^k} + \frac{k}{3^k}.$$
Each bracket is a geometric series; the $j$-th one sums to $\frac{1}{3^{j-1}} - \frac{1}{3^k}$, so
$$E[Z] = \left(1 + \frac{1}{3} + \frac{1}{3^2} + \cdots + \frac{1}{3^{k-1}}\right) - \frac{k}{3^k} + \frac{k}{3^k} = \frac{3}{2}\left(1 - \frac{1}{3^k}\right) = \frac{3}{2} - \frac{1}{2} \cdot \frac{1}{3^{k-1}}.$$
Therefore
$$E[Y] = E[Z] - E[X] = \frac{1}{2} - \frac{1}{2} \cdot \frac{1}{3^k}.$$
For part 3, let $k \to \infty$:
$$\lim_{k \to \infty}\left(\frac{1}{2} - \frac{1}{2} \cdot \frac{1}{3^k}\right) = \frac{1}{2}.$$
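Not part of the original solution: a short Monte Carlo sketch in Python to sanity-check the closed forms for $E[X]$ and $E[Y]$ above (the choices $k = 5$ and 200,000 trials are arbitrary).

```python
import random

def simulate(k: int, trials: int = 200_000):
    """Estimate E[X] (girls) and E[Y] (boys): each child is a boy with
    probability 1/3; stop at the first girl or after k children."""
    girls = boys = 0
    for _ in range(trials):
        for _ in range(k):
            if random.random() < 1 / 3:   # boy: keep having children
                boys += 1
            else:                         # girl: this family stops
                girls += 1
                break
    return girls / trials, boys / trials

k = 5
ex, ey = simulate(k)
print(f"E[X] ~ {ex:.4f}  (formula: {1 - 3 ** -k:.4f})")
print(f"E[Y] ~ {ey:.4f}  (formula: {1 / 2 - 3 ** -k / 2:.4f})")
```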

Solution to Midterm Question 2 [Question 2]: A permutation $\pi : [1, n] \to [1, n]$ can be represented as a set of cycles as follows. Let there be one vertex for each number $i$, $i = 1, \ldots, n$. If the permutation maps the number $i$ to the number $\pi(i)$, then a directed arc is drawn from vertex $i$ to vertex $\pi(i)$. This leads to a graph that is a set of disjoint cycles. Notice that some of the cycles could be self-loops. What is the expected number of cycles in a random permutation of $n$ numbers?

Solution to Midterm Question 2 [Solution]: The number of permutations in which a given vertex $v$ lies in a cycle of length exactly $k$ is
$$\binom{n-1}{k-1}(k-1)!\,(n-k)! = (n-1)!,$$
which is independent of $k$. Thus the probability that $v$ is in a cycle of length $k$ is $(n-1)!/n! = 1/n$. Let $Y$ denote the number of cycles in the graph, and for each vertex $i$ let
$$Y_i = \frac{1}{\text{length of the cycle containing vertex } i}.$$

Since $Y = \sum_{i=1}^{n} Y_i$, we have
$$E[Y] = \sum_{i=1}^{n} E[Y_i] = \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{1}{j} \Pr\!\left(Y_i = \frac{1}{j}\right) = \sum_{i=1}^{n} \frac{1}{n} \sum_{j=1}^{n} \frac{1}{j} \approx \sum_{i=1}^{n} \frac{\ln n}{n} = \ln n.$$
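As a check on this calculation (my own addition, not in the slides), the following Python sketch samples random permutations and compares the average cycle count with $H_n = \sum_{j=1}^{n} 1/j \approx \ln n$:

```python
import math
import random

def count_cycles(perm):
    """Count the cycles of a permutation given as a list: i -> perm[i]."""
    seen = [False] * len(perm)
    cycles = 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return cycles

n, trials = 100, 20_000
total = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)   # uniform random permutation
    total += count_cycles(perm)

print(f"average #cycles ~ {total / trials:.3f}")
print(f"H_n = {sum(1 / j for j in range(1, n + 1)):.3f}, ln n = {math.log(n):.3f}")
```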

Question 1 [Question 1]: Let $X$ be a Poisson random variable with mean $\lambda$.
1. What is the most likely value of $X$
   1.1 when $\lambda$ is an integer?
   1.2 when $\lambda$ is not an integer?
   Hint: Compare $\Pr(X = k + 1)$ with $\Pr(X = k)$.
2. We define the median of $X$ to be the least number $m$ such that $\Pr(X \le m) \ge 1/2$. What is the median of $X$ when $\lambda = 3.9$? (A numerical sanity check follows below.)
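If you want to verify your answer to part 2 numerically, here is a small sketch (my own addition; it evaluates the pmf directly and proves nothing):

```python
from math import exp, factorial

lam = 3.9  # the value from part 2

def pmf(k: int) -> float:
    """Poisson(lam) probability mass at k."""
    return exp(-lam) * lam ** k / factorial(k)

# Most likely value: scan a comfortably large range.
mode = max(range(100), key=pmf)

# Median: the least m with Pr(X <= m) >= 1/2.
cum, m = 0.0, -1
while cum < 0.5:
    m += 1
    cum += pmf(m)

print(f"lambda = {lam}: mode = {mode}, median = {m}")
```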

Question 2 [Question 2]: Let $X$ be a Poisson random variable with mean $\mu$, representing the number of criminals in a city. There are two types of criminals: the first type are not too bad and are reformable; the second type are flagrant. Suppose each criminal is independently reformable with probability $p$ (and flagrant with probability $1 - p$). Let $Y$ and $Z$ be random variables denoting the number of reformable and flagrant criminals (respectively) in the city. Show that $Y$ and $Z$ are independent Poisson random variables.

Question 2 [Hint]: (Diagram: $X$, the number of criminals, splits into $Y$, the number of reformable criminals, with probability $p$, and $Z$, the number of flagrant criminals, with probability $1 - p$.) Use the definition of a Poisson random variable together with conditioning: try to show what $\Pr(Y = k)$ and $\Pr(Z = k)$ are. A simulation sketch follows.
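Here is a simulation of the thinning in this question (my own illustration; the parameters $\mu = 10$ and $p = 0.3$ are arbitrary). Uncorrelatedness is of course only a necessary condition for the independence you are asked to prove:

```python
import math
import random

def poisson(lam: float) -> int:
    """Sample Poisson(lam) via Knuth's product-of-uniforms method."""
    threshold, k, prod = math.exp(-lam), 0, random.random()
    while prod >= threshold:
        k += 1
        prod *= random.random()
    return k

mu, p, trials = 10.0, 0.3, 100_000
ys, zs = [], []
for _ in range(trials):
    x = poisson(mu)                                 # total criminals
    y = sum(random.random() < p for _ in range(x))  # reformable ones
    ys.append(y)
    zs.append(x - y)                                # flagrant ones

def mean(v): return sum(v) / len(v)
my, mz = mean(ys), mean(zs)
cov = mean([(a - my) * (b - mz) for a, b in zip(ys, zs)])
print(f"E[Y] ~ {my:.3f} (mu*p = {mu * p}), E[Z] ~ {mz:.3f} (mu*(1-p) = {mu * (1 - p)})")
print(f"Cov(Y, Z) ~ {cov:.3f} (near 0, consistent with independence)")
```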

Question 3 [Question 3]: Consider assigning some balls to n bins as follows: In the first round, each ball chooses a bin independently and uniformly at random.

Question 3 After that, if a ball lands in a bin by itself, it is served immediately and removed from consideration; the number of bins remains unchanged. (Figure: a ball alone in its bin is served; colliding balls are thrown again.)

Question 3 In the subsequent rounds, we repeat the process to assign the remaining balls to the bins. We finish when every ball is served.

Question 3
1. Suppose at the start of some round $b$ balls are still remaining. Let $f(b)$ denote the expected number of balls that will remain after this round. Give an explicit formula for $f(b)$.
2. Show that $f(b) \le b^2/n$. Hint: You may use Bernoulli's inequality: for $r \in \mathbb{N}$ and $x \ge -1$, $(1 + x)^r \ge 1 + rx$.
3. Suppose we have $n/k$ balls initially, for some fixed constant $k > 1$, and that in every round the number of balls served is exactly the expected number of balls to be served. Show that all the balls would be served in $O(\log \log n)$ rounds.

Question 3 [Hint]:
1. $E[\text{number of bins with exactly one ball}] = {?}$
2. Bernoulli's inequality.
3. The number of balls in each round decreases exactly according to expectation: $m, f(m), f(f(m)), f(f(f(m))), \ldots$ A simulation of the process follows.
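A simulation of the process (my own sketch; $n$ and the initial load $n/2$, i.e. $k = 2$, are arbitrary choices). Note that part 2 bounds the *expected* number of survivors, so an individual round can slightly exceed $b^2/n$:

```python
import random

def one_round(b: int, n: int) -> int:
    """Throw b balls into n bins; return the number of surviving balls,
    i.e. those that landed in a bin with at least one other ball."""
    counts = [0] * n
    for _ in range(b):
        counts[random.randrange(n)] += 1
    served = sum(1 for c in counts if c == 1)
    return b - served

n = 100_000
b, r = n // 2, 0   # start with n/k balls, here k = 2
while b > 0:
    nxt = one_round(b, n)
    r += 1
    print(f"round {r}: {b} -> {nxt} balls left (bound b^2/n = {b * b / n:.1f})")
    b = nxt
```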

Question 4 [Question 4]: Suppose that we vary the balls-and-bins process as follows. For convenience let the bins be numbered from $0$ to $n - 1$. There are $\log_2 n$ players. (Figure: $\log_2 n$ players above a row of bins numbered $0, 1, 2, \ldots, n - 1$.)

Question 4 Each player chooses a starting location l uniformly at random from [0, n 1] and then places one ball in each of the bins numbered l mod n, l + 1 mod n,..., l + n/ log 2 n 1 mod n.( Assume that n is a multiple of log 2 n.) 2+ 0 1 2 n/lgn n-1-1

Question 4 Show that the maximum load in this case is only $O(\log \log n / \log \log \log n)$ with probability that approaches $1$ as $n \to \infty$.

Question 4 [Hint]: The total number of balls is $n$. What is the probability that bin 1 receives at least $M$ balls? A simulation sketch follows.
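A quick simulation of this variant (my own sketch; $n = 2^{16}$ is chosen as a power of two so that $\log_2 n$ divides $n$):

```python
import math
import random

n = 1 << 16                   # n = 65536, a power of two
players = int(math.log2(n))   # log2(n) = 16 players
block = n // players          # consecutive bins each player fills

counts = [0] * n
for _ in range(players):
    start = random.randrange(n)          # starting location l
    for off in range(block):
        counts[(start + off) % n] += 1   # one ball per consecutive bin

print(f"n = {n}: max load = {max(counts)}")
```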

Question 5 [Question 5]: We consider another way to obtain a Chernoff-like bound in the balls-and-bins setting without using the theorem on Page 13 of Lecture 14. Consider $n$ balls thrown randomly into $n$ bins. Let $X_i = 1$ if the $i$-th bin is empty and $0$ otherwise, and let $X = \sum_{i=1}^{n} X_i$. Let the $Y_i$ be independent Bernoulli random variables such that $Y_i = 1$ with probability $p = (1 - 1/n)^n$, and let $Y = \sum_{i=1}^{n} Y_i$.
1. Show that $E[X_1 X_2 \cdots X_k] \le E[Y_1 Y_2 \cdots Y_k]$ for any $k \ge 1$.
2. Show that $X_1^{j_1} X_2^{j_2} \cdots X_k^{j_k} = X_1 X_2 \cdots X_k$ for any $j_1, j_2, \ldots, j_k \in \mathbb{N}$.
3. Show that $E[e^{tX}] \le E[e^{tY}]$ for all $t \ge 0$. Hint: Use the expansion of $e^x$ and compare $E[e^{tX}]$ to $E[e^{tY}]$.
4. Derive a Chernoff bound for $\Pr(X \ge (1 + \delta)E[X])$.

Question 5 [Hint]: Add oil.
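While working on the problem, it may help to look at the two random variables empirically (my own sketch, not required by the question). Since the empty-bin indicators $X_i$ are negatively dependent, the empirical variance of $X$ typically comes out below that of the independent sum $Y$:

```python
import random

n, trials = 1000, 2000
p = (1 - 1 / n) ** n   # Pr(a fixed bin is empty)

def mean(v): return sum(v) / len(v)
def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

xs, ys = [], []
for _ in range(trials):
    hit = set(random.randrange(n) for _ in range(n))       # bins that got a ball
    xs.append(n - len(hit))                                # X: empty bins
    ys.append(sum(random.random() < p for _ in range(n)))  # Y: independent sum

print(f"E[X] ~ {mean(xs):.1f}, E[Y] ~ {mean(ys):.1f}, n*p = {n * p:.1f}")
print(f"Var(X) ~ {var(xs):.1f} vs Var(Y) ~ {var(ys):.1f}")
```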

Thank you