Randomness and Computation


Randomness and Computation, or, Randomized Algorithms. Mary Cryan, School of Informatics, University of Edinburgh. (RC 2018/19) Lecture 10, slide 1

Balls in Bins

m balls, n bins, and balls thrown uniformly at random into bins (usually one at a time). Magic bins with no upper limit on capacity. A common model of random allocations and their effect on overall load and load balance, and of the typical distribution in the system. "Classic" question: what does the distribution look like for m = n? Max load? (with high probability results are what we want). We have already shown that when m = n (same number of balls as bins) and n is sufficiently large, the maximum load is at most 3 ln(n)/ln ln(n) with probability at least 1 − 1/n. We will show an Ω(ln(n)/ln ln(n)) bound today. (RC 2018/19) Lecture 10, slide 2
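The max-load claim is easy to check empirically. A quick Python sketch (function name my own) throws m = n balls into n bins uniformly at random and compares the observed maximum load against the 3 ln(n)/ln ln(n) bound:

```python
import math
import random

def max_load(n, m, seed=0):
    """Throw m balls uniformly at random into n bins; return the maximum bin load."""
    rng = random.Random(seed)
    loads = [0] * n
    for _ in range(m):
        loads[rng.randrange(n)] += 1
    return max(loads)

n = 10_000
bound = 3 * math.log(n) / math.log(math.log(n))  # upper bound from the previous lecture
print("max load:", max_load(n, n), " bound:", round(bound, 2))
```

For n = 10 000 the bound is about 12.4, while a typical run gives a maximum load of around 6 or 7, comfortably below it.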

Some preliminary observations, definitions

The probability of a specific bin (bin 1, say) being empty: (1 − 1/n)^m ≈ e^{−m/n}. Expected number of empty bins: ≈ n e^{−m/n}.

Probability p_r of a specific bin having r balls:

p_r = (m choose r) (1/n)^r (1 − 1/n)^{m−r}.

Note p_r ≈ e^{−m/n} (m/n)^r / r!.

Definition (5.1): A discrete Poisson random variable X with parameter µ is given by the following probability distribution on j = 0, 1, 2, ...:

Pr[X = j] = e^{−µ} µ^j / j!.

(RC 2018/19) Lecture 10, slide 3
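A quick numerical check of these approximations (a sketch of my own, for m = n = 1000): the exact probability (1 − 1/n)^m that bin 1 is empty is already very close to e^{−m/n}, and p_r is close to the corresponding Poisson probabilities:

```python
import math

n = m = 1000
p_empty_exact = (1 - 1 / n) ** m        # exact probability that bin 1 is empty
p_empty_approx = math.exp(-m / n)       # the e^{-m/n} approximation
print(p_empty_exact, "vs", p_empty_approx)

# p_r versus the Poisson-style approximation e^{-m/n} (m/n)^r / r!
for r in range(4):
    p_r = math.comb(m, r) * (1 / n) ** r * (1 - 1 / n) ** (m - r)
    approx = math.exp(-m / n) * (m / n) ** r / math.factorial(r)
    print(r, round(p_r, 5), round(approx, 5))
```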

Poisson as the limit of the Binomial Distribution

Theorem (5.5): If X_n is a binomial random variable with parameters n and p = p(n) such that lim_{n→∞} np = λ (a constant independent of n), then for any fixed k ∈ N_0,

lim_{n→∞} Pr[X_n = k] = e^{−λ} λ^k / k!.

(RC 2018/19) Lecture 10, slide 4
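Theorem 5.5 can be illustrated numerically (a sketch of my own): fix λ = 2 and k = 3, set p = λ/n, and let n grow; the binomial probabilities approach the Poisson value:

```python
import math

lam, k = 2.0, 3
for n in (10, 100, 10_000):
    p = lam / n                                       # so np = lam for every n
    binom = math.comb(n, k) * p**k * (1 - p) ** (n - k)
    print(n, round(binom, 5))
print("Poisson limit:", round(math.exp(-lam) * lam**k / math.factorial(k), 5))
```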

Poisson modelling of balls-in-bins

Our balls-in-bins model has n bins, m (for variable m) balls, and the balls are thrown into bins independently and uniformly at random. Each bin load X_i^{(m)} behaves like a binomial r.v. B(m, 1/n). Write (X_1^{(m)}, ..., X_n^{(m)}) for the joint distribution (note the various X_i^{(m)} are not independent). For the Poisson approximation we take λ = m/n, and write Y_i^{(m)} to denote a Poisson r.v. with parameter λ = m/n. We write (Y_1^{(m)}, ..., Y_n^{(m)}) to denote the joint distribution of n such Poisson r.v.s, which are all independent. (RC 2018/19) Lecture 10, slide 5

Some preliminaries

Theorem (5.7): Let f(x_1, ..., x_n) be a non-negative function. Then

E[f(X_1^{(m)}, ..., X_n^{(m)})] ≤ e√m · E[f(Y_1^{(m)}, ..., Y_n^{(m)})].

Corollary (5.9): Any event that takes place with probability p in the Poisson case takes place with probability at most p·e√m in the exact balls-in-bins case.

(RC 2018/19) Lecture 10, slide 6

Lower bound for n balls in n bins

Lemma: Let n balls be thrown independently and uniformly at random into n bins. Then (for n sufficiently large) the maximum load is at least ln(n)/ln ln(n) with probability at least 1 − 1/n.

Proof. For the Poisson variables, we have λ = n/n = 1. Let M = ⌈ln(n)/ln ln(n)⌉. For any bin (bin 1 say),

Pr_Poiss[bin 1 has load ≥ M] ≥ Pr_Poiss[bin 1 has load = M] = e^{−1} 1^M / M! = 1/(e·M!).

In our Poisson model, the bins are independent, so the probability no bin has load ≥ M (our bad event) is at most (1 − 1/(e·M!))^n ≤ e^{−n/(e·M!)}.

(RC 2018/19) Lecture 10, slide 7

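Before the formal argument, the lemma can be tested empirically (a sketch under my own naming, 20 seeded trials): the observed maximum load essentially never falls below M = ⌈ln(n)/ln ln(n)⌉.

```python
import math
import random

def max_load(n, seed):
    """Throw n balls uniformly at random into n bins; return the maximum load."""
    rng = random.Random(seed)
    loads = [0] * n
    for _ in range(n):
        loads[rng.randrange(n)] += 1
    return max(loads)

n = 10_000
M = math.ceil(math.log(n) / math.log(math.log(n)))   # the lemma's threshold
trials = [max_load(n, seed) for seed in range(20)]
print("M =", M, " observed max loads:", sorted(set(trials)))
```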
Lower bound for n balls in n bins

Proof of Lemma 5.1, cont'd. We now relate Pr_Poiss[bin 1 has load ≥ M] to the probability of the same event in the balls-in-bins model. Corollary 5.9 tells us that when we consider the exact balls-in-bins distribution (X_1^{(n)}, ..., X_n^{(n)}), the probability of the event "no bin has ≥ M balls" is at most e√n · e^{−n/(e·M!)}. We want this less than 1/n, i.e. we want e·e^{−n/(e·M!)} ≤ n^{−3/2}. Taking ln(·) of both sides, this happens if 1 + (3/2) ln(n) ≤ n/(e·M!). Now M! ≤ e√M (M/e)^M ≤ M (M/e)^M (Lemma 5.8), hence n/(e·M!) ≥ n·e^M/(e·M^{M+1}).

(RC 2018/19) Lecture 10, slide 8

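As a numeric sanity check of this step (a sketch of my own, for n = 10^6): with M = ⌈ln(n)/ln ln(n)⌉, the transferred bound e√n · e^{−n/(e·M!)} is indeed far below the 1/n target:

```python
import math

n = 10**6
M = math.ceil(math.log(n) / math.log(math.log(n)))
poisson_bad = math.exp(-n / (math.e * math.factorial(M)))  # Poisson model: Pr[no bin has load >= M]
exact_bound = math.e * math.sqrt(n) * poisson_bad          # transferred via Corollary 5.9
print("M =", M, " bound:", exact_bound, " target 1/n:", 1 / n)
```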
Lower bound for n balls in n bins

Proof of Lemma 5.1, cont'd. Therefore it will suffice to show that 1 + (3/2) ln(n) ≤ n·e^M/(e·M^{M+1}), or, for sufficiently large n, that 2 ln(n) ≤ n·e^M/(e·M^{M+1}). Taking the ln of both sides, this happens (using M ≈ ln(n)/ln ln(n), so that ln(M) ≈ ln ln(n) − ln ln ln(n)) when

ln(2) + ln ln(n) ≤ ln(n) + M − 1 − (M+1)(ln ln(n) − ln ln ln(n)),

i.e., exactly when

1 + ln(2) + ln ln(n) + (M+1) ln ln(n) ≤ ln(n) + M + (M+1) ln ln ln(n),

i.e. (using M ln ln(n) ≈ ln(n), and dropping one positive term on the right) when

1 + ln(2) + 2 ln ln(n) ≤ M + M ln ln ln(n).

(RC 2018/19) Lecture 10, slide 9

Lower bound for n balls in n bins

Proof of Lemma 5.1, cont'd. To show that 1 + ln(2) + 2 ln ln(n) ≤ M + M ln ln ln(n), we multiply across by ln ln(n) and use M ln ln(n) ≥ ln(n), so it is enough to verify the inequality

(1 + ln(2)) ln ln(n) + 2 (ln ln(n))² ≤ ln(n) + ln(n) ln ln ln(n).

At this point we notice that we have two terms on the right, ln(n) and ln(n) ln ln ln(n), which are exponentially larger than the two terms on the lhs: both lhs terms grow only with respect to ln ln(n). We do not need to check the numbers; as n grows the rhs will certainly be greater than the lhs. Hence our claim holds.

(RC 2018/19) Lecture 10, slide 10

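The asymptotic claim on this slide can also be checked numerically (a sketch of my own): evaluating both sides of the multiplied-out inequality for a few values of n shows the rhs pulling ahead.

```python
import math

def lhs(n):
    lnln = math.log(math.log(n))
    return (1 + math.log(2)) * lnln + 2 * lnln**2

def rhs(n):
    ln = math.log(n)
    return ln + ln * math.log(math.log(math.log(n)))

for n in (10**3, 10**6, 10**9):
    print(n, round(lhs(n), 1), "<=", round(rhs(n), 1))
```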
References and Exercises

Sections 5.1, 5.2 of "Probability and Computing". Sections 5.3 and 5.4 have all the precise details of our Ω(ln(n)/ln ln(n)) result. Section 5.5 on Hashing is worth a read, and has none of the Poisson stuff (I'm skipping it because of time limitations).

Exercises: I will release a tutorial sheet.

(RC 2018/19) Lecture 10, slide 11