Lecture 2: April 3, 2013


TTIC/CMSC 31150 Mathematical Toolkit Spring 2013
Madhur Tulsiani Lecture 2: April 3, 2013 Scribe: Shubhendu Trivedi

1 Coin tosses continued

We return to the coin tossing example from the last lecture:

Example 1.1 Given that P[heads] = p, what is E[Z], where Z = # of tosses till the first heads?

We saw that E[Z] = 1/p. This of course holds when all the coin tosses are independent. Such a Z is called a Geometric Random Variable. (In general: suppose we have probability p of success in one try, and we can make independent attempts many times over; then the geometric random variable counts the number of attempts needed to obtain the first success.)

A side remark: in the last lecture we used that for x satisfying |x| < 1,

Sum_{i>=1} i * x^(i-1) = 1/(1-x)^2.

This can be derived by differentiating both sides of the equality Sum_{i>=0} x^i = 1/(1-x), which in turn can be derived by defining the partial sums S_n of the series on the left, as follows:

S_n = 1 + x + x^2 + ... + x^n
x * S_n = x + x^2 + ... + x^(n+1)
(1-x) * S_n = 1 - x^(n+1)
S_n = (1 - x^(n+1)) / (1-x)

Hence, we have lim_{n -> infinity} S_n = 1/(1-x) if |x| < 1.

Another thing that we swept under the proverbial rug in the last lecture: what is the basic event in the case of this example? How can we define a probability if the set of outcomes is of potentially infinite size? We need to make sure that the problem is well posed by defining a valid probability space. As mentioned earlier, if we try to assign a probability to each possible outcome (which is an infinite sequence of coin tosses), it will simply be 0. However, we can consider a collection of events which is closed under union, intersection and complementation, to which we will assign probabilities. For all i in N, and all sequences of i bits denoted by b in {0,1}^i, we can define the event E_{i,b} = "the first i bits are according to b". Thus,

P[E_{i,b}] = p^(# of 1s in b) * (1-p)^(# of 0s in b).

We will use the collection E generated by unions, complements and intersections of all such events, to which we can easily assign probabilities.

We now return to the case with a finite number of coin tosses.
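As a quick numerical sanity check (a sketch, not part of the original notes; the truncation depth N is an arbitrary choice), one can truncate the two series above and compare against their closed forms:

```python
# Verify the identities used above by truncating the infinite series
# at a large (arbitrary) cutoff N.

def truncated_series(x, N=10_000):
    # sum_{i=1}^{N} i * x^(i-1), which should approach 1/(1-x)^2
    return sum(i * x ** (i - 1) for i in range(1, N + 1))

def geometric_expectation(p, N=10_000):
    # E[Z] = sum_{k>=1} k * p * (1-p)^(k-1), which should approach 1/p
    return sum(k * p * (1 - p) ** (k - 1) for k in range(1, N + 1))

x, p = 0.3, 0.25
assert abs(truncated_series(x) - 1 / (1 - x) ** 2) < 1e-9
assert abs(geometric_expectation(p) - 1 / p) < 1e-9
```

Both sums converge geometrically, so the truncation error at N = 10,000 is far below the tolerance.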

Example.2 Cosider idepedet tosses of a coi which comes up heads with probability p. Defie the followig radom variable: { if # of heads is odd Y = 0 if # of heads is eve We wat to compute E [Y. Note that ulike i the previous lecture, here we have o liearity to exploit ad add up. So what do we do i such a case? Also, ote that E [Y = 2 if P [heads = 2 (by symmetry). However, the case p 2 is more iterestig. Sice we have tosses, let us call this E [Y. We have, E [Y = p E [Y H + ( p) E [Y T. I this settig, oe trick that we utilize is that istead of a variable that takes values 0 ad, let s make the variable take values ad. We take { if # of heads is odd Ỹ = + if # of heads is eve The followig claim Ỹ = 2Y is easy to check. Now let us defie aother radom variable. { if ith coi is heads X i = + if ith coi is tails We have: Ỹ = X X 2 X 3... X. Therefore E [Ỹ = E [X X 2... X. Now, if we have idepedece, the we d have: E [Ỹ = E [X E [X 2 E [X Also, E [X i = 2p E [Ỹ = ( 2p). Sice we eed E [Y, we have: E [Y = ( 2p) 2 Basically, the trick of usig + ad as values for a radom variable i place of 0 ad, was able to capture the parity of the evet. Also, ote that here we did eed to uses that all coi tosses are idepedet. 2 Toy Problem: Coupo Collectio The model is the followig: There are kids of items/coupos ad at each time step T, we get oe radom coupo (idepedetly of the others) out of the total. We defie a radom variable, T which is the time whe we first have all the types of coupos. We wat to fid E [T. 2

We can make the following claim:

T = Sum_{i=1}^{n} X_i

where X_i is the time taken to go from having i-1 distinct types of coupons to having i distinct types. Thus we have

E[T] = Sum_i E[X_i].

Clearly, E[X_1] = 1, E[X_2] = n/(n-1), E[X_3] = n/(n-2), and so on. This is because X_i represents a Geometric Random Variable with success probability (n-i+1)/n of getting a new coupon. Thus, at the ith step:

E[X_i] = n/(n-i+1)

E[T] = n/n + n/(n-1) + ... + n/1 = n * H(n)

where H(n) = 1 + 1/2 + 1/3 + ... + 1/n is the nth harmonic number. It is known (see Wikipedia for example) that H(n) = ln n + Theta(1). Thus, we have that E[T] = n ln n + Theta(n).

3 A Simple Randomized Algorithm for Max-Cut

Using the ideas discussed so far, in this section we design an algorithm for Max-Cut. The problem is as follows: given a graph G = (V, E), we want to divide the vertex set V into two sets S and S-bar such that the number of edges between S and S-bar is as large as possible.

For each i in V, assign i to S with probability 1/2 and to S-bar with probability 1/2. Let us define a random variable Z such that Z = # of edges cut. Clearly, Z = Sum_e X_e, where:

X_e = 1 if e is cut, 0 otherwise

Since we are placing each vertex with probability 1/2 in either S or S-bar, for any edge in E the probability that it is cut is 1/2. Thus, if the number of edges is |E| = m, then by linearity of expectation the expected number of edges cut is E[Z] = m/2.

Now suppose we also want to find the sets S and S-bar deterministically, while ensuring that the number of edges between S and S-bar is at least m/2. We will proceed as follows: we line up the vertices v_1, v_2, ..., v_n and then deterministically decide whether each v_i should be in S or S-bar, while maintaining E[Z | decisions for v_1, ..., v_i] >= m/2. That is, supposing v_1, v_2, ..., v_i have been placed into S or S-bar deterministically, we have to analyze the expectation of the cut-size from now on. We have

E[Z | decisions for v_1, ..., v_i] = (1/2) * E[Z | decisions for v_1, ..., v_i and v_{i+1} in S] + (1/2) * E[Z | decisions for v_1, ..., v_i and v_{i+1} in S-bar].

It is easy to see that with at least one of the two choices of placing v_{i+1} in S or S-bar, the expected cut size does not decrease. Therefore at least one of the following is true:

E[Z | decisions for v_1, ..., v_i and v_{i+1} in S] >= E[Z | decisions for v_1, ..., v_i], or
E[Z | decisions for v_1, ..., v_i and v_{i+1} in S-bar] >= E[Z | decisions for v_1, ..., v_i].

Thus, our strategy is to look at the expected size of the cut in both cases and put v_{i+1} in the set that leads to the greater expected cut-size. Note that this decision is completely deterministic. The only question is how to compute the quantity E[Z | decisions for v_1, ..., v_i] efficiently. This is easy: given the decisions for v_1, ..., v_i, the edges for which both endpoints are already on the same side contribute 0 to the expectation, the edges for which both endpoints are on different sides contribute 1, and the remaining edges contribute 1/2 to the expectation. Thus, given the decisions for v_1, ..., v_i, we can easily compute the contribution of each edge. This leads to an algorithm which de-randomizes the randomized algorithm by making the decisions sequentially, in a way that ensures that the conditional expectation given our choices so far stays large. This is sometimes also referred to as the Method of Conditional Expectations.

4 The Probabilistic Method: Independent Sets

Now we do one more application of expectations, which is often called the Probabilistic Method. It is often used to show the existence of objects with certain properties without necessarily constructing them. In the previous section we used probabilistic reasoning to show that a large cut exists, but we later also showed how to find such a cut.

Again, consider a graph G = (V, E). We want to find an independent set S, a subset of V such that no edge lies completely within S: for every edge e = (v_i, v_j), either v_i is not in S or v_j is not in S. We are interested in finding a large independent set. Let deg(v_i) denote the number of edges containing v_i. The following result is due to Caro and Wei.

Theorem 4.1 Let G = (V, E) be a graph with n vertices. Then there exists an independent set S such that

|S| >= Sum_{i=1}^{n} 1/(deg(v_i) + 1) >= n / (max_i deg(v_i) + 1).

The main trick in this kind of problem is to set up the right kind of probabilistic experiment; the analysis is usually quite easy. In this question, we can't do everything independently, unlike in some previous questions. Suppose that we do, and hence pursue the following idea: put each v_i in S with probability p. Then we can't guarantee that we won't pick up both endpoints of some edge to keep in S. However, this idea can also be made to work, and we will come back to it in a bit. We first prove the theorem using a different idea.
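As a small numeric instantiation of the Caro-Wei bound (a sketch, not part of the original notes; the 5-cycle is an arbitrary example graph): on the cycle C_5 every degree is 2, so the sum is 5/3, and the theorem promises an independent set of size at least 2.

```python
from itertools import combinations
from math import ceil

def is_independent(vertex_set, edges):
    """True iff no edge has both endpoints inside vertex_set."""
    return all(not (u in vertex_set and v in vertex_set) for u, v in edges)

def max_independent_set_size(n, edges):
    # Brute force over all subsets -- fine for a tiny example graph.
    for size in range(n, 0, -1):
        for cand in combinations(range(n), size):
            if is_independent(set(cand), edges):
                return size
    return 0

# The 5-cycle: every vertex has degree 2.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
caro_wei_bound = sum(1 / (2 + 1) for _ in range(n))  # = 5/3
# Since |S| is an integer, the guarantee is a set of size >= ceil(5/3) = 2.
assert max_independent_set_size(n, edges) >= ceil(caro_wei_bound)
```

Here the bound is tight up to rounding: the largest independent set in C_5 has exactly 2 vertices.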

Proof: Pick a random ordering of the vertices v_1, v_2, ..., v_n, and pick v_i if it appears before all of its neighbors in the ordering. This clearly gives an independent set, since for any edge (v_i, v_j), the vertex which appears later in the ordering will certainly not be picked. The next question is to analyze the size of this independent set, i.e., we want to look at E[|S|]. We have |S| = Sum_i X_i, where

X_i = 1 if v_i in S, 0 otherwise

We want to calculate E[X_i]. To do this, note that for a random ordering we can shrink our probability space to the neighborhood of v_i alone: among v_i and its deg(v_i) neighbors, each of the deg(v_i) + 1 vertices is equally likely to appear first. Hence

E[X_i] = 1/(deg(v_i) + 1).

This immediately gives that E[|S|] = Sum_i 1/(deg(v_i) + 1), and hence there must exist an independent set S of at least this size.

We can now salvage our wrong idea discussed earlier. The problem there was: no matter what p is, we might end up picking both endpoints of some edge. What one can do is to throw away such edges: say we get a set S' from S by deleting both endpoints of every edge contained in S. Now we want to know the expected size of S'. This method is called the Method of Alterations (since we alter an object to make it satisfy the desired properties). We have

E[|S'|] >= E[|S|] - 2 * E[# of edges deleted].

We can use linearity to compute the above expectation. We have E[|S|] = np. Also, each edge is deleted if and only if both its endpoints are in S, which happens with probability p^2. Thus,

E[|S'|] >= np - 2p^2 * |E|.

Suppose the maximum degree is d; then the number of edges satisfies |E| <= nd/2, and thus:

E[|S'|] >= np - p^2 * nd

Now suppose we choose p = 1/(2d). Then

E[|S'|] >= n/(2d) - n/(4d) = n/(4d)

and thus we still get that there exists an independent set of size at least n/(4d).
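The random-ordering experiment from the proof is simple to run (a sketch, not part of the original notes; the path graph, trial count, and seed are arbitrary choices):

```python
import random

def random_order_independent_set(adj, rng):
    """Pick v iff it appears before all of its neighbors in a random ordering."""
    order = list(adj)
    rng.shuffle(order)
    rank = {v: i for i, v in enumerate(order)}
    return {v for v in adj if all(rank[v] < rank[u] for u in adj[v])}

# A path on 6 vertices: degrees are 1, 2, 2, 2, 2, 1.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
rng = random.Random(1)  # fixed seed for reproducibility
sizes = []
for _ in range(5000):
    s = random_order_independent_set(adj, rng)
    assert all(w not in s for v in s for w in adj[v])  # s is independent
    sizes.append(len(s))

# E[|S|] = sum_i 1/(deg(v_i)+1) = 1/2 + 4*(1/3) + 1/2 = 7/3 here;
# the empirical mean over 5000 trials should land close to it.
caro_wei = sum(1 / (len(nbrs) + 1) for nbrs in adj.values())
assert abs(sum(sizes) / len(sizes) - caro_wei) < 0.1
```

Every sampled set is verified to be independent, and the empirical average size concentrates around the Caro-Wei sum, matching E[X_i] = 1/(deg(v_i)+1) from the proof.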