Martingales
Piotr Wojciechowski
Lane Department of Computer Science and Electrical Engineering, West Virginia University
8 April, 01
Outline
1 Tail Inequalities
  General
Tail Inequalities

We can apply a Chernoff-like bound to martingales even when the underlying random variables are not independent. This bound is referred to as the Azuma-Hoeffding Inequality.

Azuma-Hoeffding Inequality
Theorem. If X_0, ..., X_n is a martingale such that |X_k - X_{k-1}| ≤ c_k, then for all t > 0 and λ > 0,

    P(|X_t - X_0| ≥ λ) ≤ 2 e^{-λ² / (2 ∑_{k=1}^t c_k²)}.
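The bound can be sanity-checked numerically. The sketch below is not from the lecture and the function names are illustrative: a sum of independent ±c_k steps is a martingale with |X_k - X_{k-1}| = c_k, so its empirical tail must sit below the Azuma-Hoeffding bound.

```python
import math
import random

def azuma_bound(lam, cs):
    """Azuma-Hoeffding upper bound on P(|X_t - X_0| >= lam)."""
    return 2 * math.exp(-lam ** 2 / (2 * sum(c * c for c in cs)))

def empirical_tail(lam, cs, trials=20000, seed=1):
    """Estimate P(|X_t - X_0| >= lam) for X_t = sum of independent
    +/- c_k steps, a martingale with |X_k - X_{k-1}| = c_k."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = sum(rng.choice((-c, c)) for c in cs)
        if abs(x) >= lam:
            hits += 1
    return hits / trials

cs = [1.0] * 100  # c_k = 1 for 100 steps
print(empirical_tail(25.0, cs), "<=", azuma_bound(25.0, cs))
```

For these parameters the bound is 2e^{-3.125} ≈ 0.088, while the observed tail is far smaller; Azuma-Hoeffding trades tightness for generality.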
Tail Inequalities

Proof. Let Y_i = X_i - X_{i-1} for i = 1, ..., t. Thus |Y_i| ≤ c_i.

Since X_0, ..., X_n is a martingale, we have that

    E[Y_i | X_0, ..., X_{i-1}] = E[X_i - X_{i-1} | X_0, ..., X_{i-1}] = E[X_i | X_0, ..., X_{i-1}] - X_{i-1} = 0.

We can write

    Y_i = -c_i · (1 - Y_i/c_i)/2 + c_i · (1 + Y_i/c_i)/2.

Because the function e^{αx} is convex, e^{αY_i} lies below the chord through e^{-αc_i} and e^{αc_i}, so

    e^{αY_i} ≤ ((1 - Y_i/c_i)/2) e^{-αc_i} + ((1 + Y_i/c_i)/2) e^{αc_i}
             = (e^{αc_i} + e^{-αc_i})/2 + (Y_i/(2c_i)) (e^{αc_i} - e^{-αc_i}).
Tail Inequalities

Proof (continued). As shown previously, E[Y_i | X_0, ..., X_{i-1}] = 0, thus

    E[e^{αY_i} | X_0, ..., X_{i-1}] ≤ E[ (e^{αc_i} + e^{-αc_i})/2 + (Y_i/(2c_i))(e^{αc_i} - e^{-αc_i}) | X_0, ..., X_{i-1} ]
                                    = (e^{αc_i} + e^{-αc_i})/2
                                    ≤ e^{(αc_i)²/2},

where the last step uses the Taylor-series bound cosh(x) ≤ e^{x²/2}. We then have

    E[e^{α(X_t - X_0)}] = E[ ∏_{i=1}^t e^{αY_i} ]
                        = E[ (∏_{i=1}^{t-1} e^{αY_i}) · E[e^{αY_t} | X_0, ..., X_{t-1}] ]
                        ≤ E[ ∏_{i=1}^{t-1} e^{αY_i} ] · e^{(αc_t)²/2}
                        ≤ ∏_{i=1}^t e^{(αc_i)²/2} = e^{(α²/2) ∑_{k=1}^t c_k²}.
Tail Inequalities

Proof (continued). Therefore, applying Markov's inequality to e^{α(X_t - X_0)} for any α > 0, we have that

    P(X_t - X_0 ≥ λ) = P(e^{α(X_t - X_0)} ≥ e^{αλ}) ≤ E[e^{α(X_t - X_0)}] / e^{αλ} ≤ e^{(α²/2) ∑_{k=1}^t c_k² - αλ}.

By letting α = λ / ∑_{k=1}^t c_k², we get that

    P(X_t - X_0 ≥ λ) ≤ e^{-λ² / (2 ∑_{k=1}^t c_k²)}.

We can similarly construct the same bound on P(X_t - X_0 ≤ -λ); a union bound then gives the two-sided inequality.
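The choice of α in the last step is exactly the minimizer of the exponent; the following short calculation, added here for completeness, verifies it:

```latex
f(\alpha) = \frac{\alpha^2}{2}\sum_{k=1}^{t} c_k^2 - \alpha\lambda,
\qquad
f'(\alpha) = \alpha\sum_{k=1}^{t} c_k^2 - \lambda = 0
\iff \alpha = \frac{\lambda}{\sum_{k=1}^{t} c_k^2},
```

and substituting this α back gives f(α) = λ²/(2 ∑_{k=1}^t c_k²) - λ²/∑_{k=1}^t c_k² = -λ²/(2 ∑_{k=1}^t c_k²), matching the exponent in the bound.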
Tail Inequalities

Corollary. If X_0, ..., X_n is a martingale such that |X_k - X_{k-1}| ≤ c, then for all t ≥ 1 and λ > 0,

    P(|X_t - X_0| ≥ λ c √t) ≤ 2 e^{-λ²/2}.

Azuma-Hoeffding Inequality (general form)
Theorem. If X_0, ..., X_n is a martingale such that B_k ≤ X_k - X_{k-1} ≤ B_k + d_k for some constants d_k and random variables B_k (each B_k may be a function of X_0, ..., X_{k-1}), then for all t ≥ 0 and λ > 0,

    P(|X_t - X_0| ≥ λ) ≤ 2 e^{-2λ² / ∑_{k=1}^t d_k²}.
Tail Inequalities: General
Tail Inequalities: General

Lipschitz condition
A function f(X̄) = f(X_1, X_2, ..., X_n) satisfies the Lipschitz condition with bound c if for any i, any x_1, ..., x_n, and any y_i,

    |f(x_1, ..., x_{i-1}, x_i, x_{i+1}, ..., x_n) - f(x_1, ..., x_{i-1}, y_i, x_{i+1}, ..., x_n)| ≤ c.

Theorem. Let f be a function satisfying the Lipschitz condition with bound c, and let Z_0, Z_1, ... be the Doob martingale defined by Z_0 = E[f(X_1, ..., X_n)] and Z_k = E[f(X_1, ..., X_n) | X_1, ..., X_k]. Then for each k there exists a random variable B_k, depending only on Z_0, ..., Z_{k-1}, such that B_k ≤ Z_k - Z_{k-1} ≤ B_k + c.
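As a concrete instance (an illustration added here, not from the lecture), the function counting the number of distinct values among its coordinates satisfies the Lipschitz condition with bound 1, and this is easy to check empirically:

```python
import random

def distinct_values(xs):
    """f(x_1, ..., x_n) = number of distinct values among the coordinates.
    Changing a single coordinate changes this count by at most 1."""
    return len(set(xs))

# Sample pairs of points that differ in exactly one coordinate and check
# the Lipschitz condition |f(x) - f(y)| <= 1.
rng = random.Random(0)
for _ in range(1000):
    x = [rng.randrange(5) for _ in range(12)]
    y = x.copy()
    y[rng.randrange(12)] = rng.randrange(5)
    assert abs(distinct_values(x) - distinct_values(y)) <= 1
print("Lipschitz bound 1 holds on all sampled coordinate changes")
```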
Tail Inequalities: General

Example. Suppose that we throw m balls independently and uniformly at random into n bins. Let X_i denote the bin into which the i-th ball falls, and let F denote the number of empty bins after all m balls are thrown.

We will use the Azuma-Hoeffding Inequality to determine how tightly the distribution of F concentrates.

The sequence Z_i = E[F | X_1, ..., X_i] is a Doob martingale.

Since F is determined by X_1, ..., X_m, there is a function f such that F = f(X_1, ..., X_m).

Because changing the bin into which a single ball lands changes F by at most 1, f satisfies the Lipschitz condition with bound 1.

Thus, applying the general form of the Azuma-Hoeffding Inequality with d_k = 1, we get that

    P(|F - E[F]| ≥ ε) = P(|Z_m - Z_0| ≥ ε) ≤ 2 e^{-2ε² / ∑_{k=1}^m 1} = 2 e^{-2ε²/m}.
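This example is easy to simulate. The sketch below (parameter values m = 50, n = 20, ε = 6 are assumptions for illustration) uses E[F] = n(1 - 1/n)^m, which follows from linearity of expectation, and compares the empirical deviation probability against the 2e^{-2ε²/m} bound:

```python
import math
import random

def simulate_empty_bins(m, n, trials=20000, seed=7):
    """Throw m balls into n bins uniformly at random, `trials` times;
    return the observed counts of empty bins."""
    rng = random.Random(seed)
    return [n - len({rng.randrange(n) for _ in range(m)})
            for _ in range(trials)]

m, n, eps = 50, 20, 6.0
mean_F = n * (1 - 1 / n) ** m            # E[F] by linearity of expectation
samples = simulate_empty_bins(m, n)
empirical = sum(abs(f - mean_F) >= eps for f in samples) / len(samples)
bound = 2 * math.exp(-2 * eps ** 2 / m)  # Azuma-Hoeffding: 2e^{-2 eps^2 / m}
print(empirical, "<=", bound)
```

The martingale bound here is quite loose; its value is that it holds with no knowledge of F beyond the Lipschitz property.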
Tail Inequalities: General

Example. Let G be a random graph in G_{n,p}. The chromatic number, χ(G), is the minimum number of colors needed to color all the vertices of a graph so that no two adjacent vertices receive the same color.

We will use the Azuma-Hoeffding Inequality to determine how tightly the distribution of χ(G) concentrates.

Let Z be the vertex exposure martingale for G. That is, if G_i is the subgraph of G induced by the vertices 1, ..., i, then Z_0 = E[χ(G)] and Z_i = E[χ(G) | G_1, ..., G_i].

Introducing a vertex into the graph increases the chromatic number by at most 1, so for each i, 0 ≤ Z_i - Z_{i-1} ≤ 1.

Thus, we can apply the general form of the Azuma-Hoeffding Inequality with B_k = 0 and d_k = 1 to get that

    P(|χ(G) - E[χ(G)]| ≥ λ √n) ≤ 2 e^{-2λ²}.
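For small n, χ(G) can be computed exactly by backtracking, so the concentration can be observed directly. The simulation sketch below (with assumed parameters n = 8, p = 1/2; not from the lecture) estimates P(|χ(G) - mean| ≥ λ√n) and compares it with 2e^{-2λ²}:

```python
import math
import random

def is_colorable(adj, k):
    """Backtracking test: can the graph given by adjacency lists `adj`
    be properly colored with k colors?"""
    colors = [-1] * len(adj)

    def bt(v):
        if v == len(adj):
            return True
        for c in range(k):
            if all(colors[u] != c for u in adj[v]):
                colors[v] = c
                if bt(v + 1):
                    return True
        colors[v] = -1
        return False

    return bt(0)

def chromatic_number(adj):
    """Smallest k for which the graph is k-colorable (fine for small n)."""
    return next(k for k in range(1, len(adj) + 1) if is_colorable(adj, k))

def random_gnp(n, p, rng):
    """Adjacency lists of a G_{n,p} random graph."""
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    return adj

rng = random.Random(3)
n, p, lam, trials = 8, 0.5, 1.0, 200
chis = [chromatic_number(random_gnp(n, p, rng)) for _ in range(trials)]
mean_chi = sum(chis) / trials
empirical = sum(abs(c - mean_chi) >= lam * math.sqrt(n) for c in chis) / trials
bound = 2 * math.exp(-2 * lam ** 2)  # Azuma-Hoeffding: 2e^{-2 lam^2}
print(empirical, "<=", bound)
```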
More informationOn the Spectra of General Random Graphs
On the Spectra of General Random Graphs Fan Chung Mary Radcliffe University of California, San Diego La Jolla, CA 92093 Abstract We consider random graphs such that each edge is determined by an independent
More informationLearning Theory. Sridhar Mahadevan. University of Massachusetts. p. 1/38
Learning Theory Sridhar Mahadevan mahadeva@cs.umass.edu University of Massachusetts p. 1/38 Topics Probability theory meet machine learning Concentration inequalities: Chebyshev, Chernoff, Hoeffding, and
More informationEntropy and Ergodic Theory Lecture 15: A first look at concentration
Entropy and Ergodic Theory Lecture 15: A first look at concentration 1 Introduction to concentration Let X 1, X 2,... be i.i.d. R-valued RVs with common distribution µ, and suppose for simplicity that
More informationDecomposition of random graphs into complete bipartite graphs
Decomposition of random graphs into complete bipartite graphs Fan Chung Xing Peng Abstract We consider the problem of partitioning the edge set of a graph G into the minimum number τg) of edge-disjoint
More informationCPSC 536N: Randomized Algorithms Term 2. Lecture 2
CPSC 536N: Randomized Algorithms 2014-15 Term 2 Prof. Nick Harvey Lecture 2 University of British Columbia In this lecture we continue our introduction to randomized algorithms by discussing the Max Cut
More informationHINT: Some of the solution methods involve highschool math as well as new methods from this class.
SOLUTIONS DISCRETE MATH 1 W3203 FINAL EXAM open book Your Name (2 pts for legibly PRINTING your name) Problem Points Score your name 2 1 23 2 20 3 25 4 25 5 30 6 30 7 20 8 25 Total 200 SUGGESTION: Do the
More informationModelling self-organizing networks
Paweł Department of Mathematics, Ryerson University, Toronto, ON Cargese Fall School on Random Graphs (September 2015) Outline 1 Introduction 2 Spatial Preferred Attachment (SPA) Model 3 Future work Multidisciplinary
More information1. The graph of a quadratic function is shown. Each square is one unit.
1. The graph of a quadratic function is shown. Each square is one unit. a. What is the vertex of the function? b. If the lead coefficient (the value of a) is 1, write the formula for the function in vertex
More information= (, ) V λ (1) λ λ ( + + ) P = [ ( ), (1)] ( ) ( ) = ( ) ( ) ( 0 ) ( 0 ) = ( 0 ) ( 0 ) 0 ( 0 ) ( ( 0 )) ( ( 0 )) = ( ( 0 )) ( ( 0 )) ( + ( 0 )) ( + ( 0 )) = ( + ( 0 )) ( ( 0 )) P V V V V V P V P V V V
More informationOn Brooks Theorem For Sparse Graphs
On Brooks Theorem For Sparse Graphs Jeong Han Kim Abstract Let G be a graph with maximum degree (G). In this paper we prove that if the girth g(g) of G is greater than 4 then its chromatic number, χ(g),
More information12.1 The Achromatic Number of a Graph
Chapter 1 Complete Colorings The proper vertex colorings of a graph G in which we are most interested are those that use the smallest number of colors These are, of course, the χ(g)-colorings of G If χ(g)
More informationSelf Similar (Scale Free, Power Law) Networks (I)
Self Similar (Scale Free, Power Law) Networks (I) E6083: lecture 4 Prof. Predrag R. Jelenković Dept. of Electrical Engineering Columbia University, NY 10027, USA {predrag}@ee.columbia.edu February 7, 2007
More informationSemidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 5
Semidefinite and Second Order Cone Programming Seminar Fall 2001 Lecture 5 Instructor: Farid Alizadeh Scribe: Anton Riabov 10/08/2001 1 Overview We continue studying the maximum eigenvalue SDP, and generalize
More information