Characterization of cutoff for reversible Markov chains


1 Characterization of cutoff for reversible Markov chains. Yuval Peres, joint work with Riddhi Basu and Jonathan Hermon. 3 December 2014.

2 Notation. Transition matrix $P$ (reversible); stationary distribution $\pi$. Reversibility: $\pi(x)P(x,y) = \pi(y)P(y,x)$ for all $x, y \in \Omega$. Laziness: $P(x,x) \ge 1/2$ for all $x \in \Omega$.

3 TV distance. For any two distributions $\mu, \nu$ on $\Omega$, their total-variation distance is $\|\mu - \nu\|_{\mathrm{TV}} := \max_{A \subseteq \Omega} |\mu(A) - \nu(A)|$. Define $d(t,x) := \|P_x^t - \pi\|_{\mathrm{TV}}$ and $d(t) := \max_{x \in \Omega} d(t,x)$.

4 TV distance (cont.). The $\epsilon$-mixing-time ($0 < \epsilon < 1$) is $t_{\mathrm{mix}}(\epsilon) := \min\{t : d(t) \le \epsilon\}$, and $t_{\mathrm{mix}} := t_{\mathrm{mix}}(1/4)$.
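As a concrete illustration (not from the talk), the two definitions above can be computed directly for a small chain; here a lazy simple random walk on the 8-cycle, with all names (`lazy_cycle`, `tv`, `mixing_time`) our own:

```python
import numpy as np

def lazy_cycle(n):
    """Transition matrix of the lazy simple random walk on the n-cycle."""
    P = np.zeros((n, n))
    for x in range(n):
        P[x, x] = 0.5
        P[x, (x + 1) % n] += 0.25
        P[x, (x - 1) % n] += 0.25
    return P

def tv(mu, nu):
    """Total-variation distance: max over events = half the L1 distance."""
    return 0.5 * np.abs(mu - nu).sum()

def mixing_time(P, eps=0.25, t_max=10**5):
    """Smallest t with max_x ||P^t(x, .) - pi||_TV <= eps."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)   # uniform is stationary: P is doubly stochastic
    Pt = np.eye(n)
    for t in range(1, t_max + 1):
        Pt = Pt @ P
        if max(tv(Pt[x], pi) for x in range(n)) <= eps:
            return t
    raise RuntimeError("did not mix within t_max steps")

P = lazy_cycle(8)
t = mixing_time(P)
```

For a cycle of length $n$ this $t$ grows like $n^2$, while (next slides) $t_{\mathrm{rel}}$ is also of order $n^2$, so the cycle has no cutoff.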

5 Cutoff: definition. A sequence of Markov chains $(X_t^{(n)})$ exhibits cutoff if $t_{\mathrm{mix}}^{(n)}(\epsilon) - t_{\mathrm{mix}}^{(n)}(1-\epsilon) = o(t_{\mathrm{mix}}^{(n)})$ for all $0 < \epsilon < 1/4$. (1)

6 Cutoff: definition (cont.). $(w_n)$ is called a cutoff window for $(X_t^{(n)})$ if $w_n = o(t_{\mathrm{mix}}^{(n)})$ and $t_{\mathrm{mix}}^{(n)}(\epsilon) - t_{\mathrm{mix}}^{(n)}(1-\epsilon) \le c_\epsilon w_n$ for all $n \ge 1$ and $\epsilon \in (0, 1/4)$.

7 Cutoff. Figure: cutoff (a sharp drop of $d(t)$ near $t_{\mathrm{mix}}$).

8 Background. Cutoff was first identified for random transpositions (Diaconis & Shahshahani '81) and for the random walk on the hypercube (Aldous '83).

9 Background (cont.). Many chains are believed to exhibit cutoff, but verifying its occurrence rigorously is usually hard.

10 Background (cont.). The name "cutoff" was coined by Aldous and Diaconis in their seminal '86 paper. Aldous & Diaconis '86, "the most interesting open problem": find verifiable conditions for cutoff.

12 Spectral gap & relaxation-time. Let $\lambda_2$ be the largest non-trivial eigenvalue of $P$. Definition: $\mathrm{gap} := 1 - \lambda_2$, the spectral gap. Definition: $t_{\mathrm{rel}} := \mathrm{gap}^{-1}$, the relaxation-time.
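A sketch (our own code, not from the talk) of reading $\mathrm{gap}$ and $t_{\mathrm{rel}}$ off the spectrum. For a general reversible chain we use the standard symmetrization $D^{1/2} P D^{-1/2}$ with $D = \mathrm{diag}(\pi)$, which is symmetric with the same spectrum as $P$, so `eigvalsh` applies:

```python
import numpy as np

def relaxation_time(P, pi):
    """t_rel = 1/gap for a reversible chain with stationary distribution pi."""
    d = np.sqrt(pi)
    A = d[:, None] * P / d[None, :]      # D^{1/2} P D^{-1/2}: symmetric by reversibility
    evals = np.sort(np.linalg.eigvalsh(A))
    lam2 = evals[-2]                     # largest non-trivial eigenvalue
    return 1.0 / (1.0 - lam2)

# Lazy simple random walk on the 8-cycle; its eigenvalues are the known
# values (1 + cos(2*pi*k/8))/2, so t_rel can be checked against the formula.
n = 8
P = np.zeros((n, n))
for x in range(n):
    P[x, x] = 0.5
    P[x, (x + 1) % n] += 0.25
    P[x, (x - 1) % n] += 0.25
pi = np.full(n, 1.0 / n)
t_rel = relaxation_time(P, pi)
```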

13 The product condition (Prod. Cond.). At a 2004 AIM workshop I proposed that the product condition, $\mathrm{gap}^{(n)} \, t_{\mathrm{mix}}^{(n)} \to \infty$ (equivalently, $t_{\mathrm{rel}}^{(n)} = o(t_{\mathrm{mix}}^{(n)})$), should imply cutoff for "nice" reversible chains. (It is a necessary condition for cutoff.)

14 The product condition (cont.). It is not always sufficient: examples are due to Aldous and to Pak. Problem: find families of Markov chains for which the Prod. Cond. implies cutoff.

15 Aldous's example. Figure: two parallel paths, of lengths $n$ and $10n$, from $x$ through $z$ to $y$, with a fixed bias to the right conditioned on a non-lazy step but different laziness probabilities along the two paths. The stationary mass is concentrated in a small neighborhood of $y$, and the last step away from $z$ before $T_y$ determines $T_y$. Here $t_{\mathrm{rel}}^{(n)} = O(1)$ and $d_n(t) \approx P_x[T_y > t]$, yet $\epsilon \le d_n(130n) \le d_n(128n) \le 1 - \epsilon$ for some $\epsilon$, so there is no cutoff.

16 Aldous's example (cont.). Figure: the graph of $d_n(t)$ for $t$ between $126n$ and $132n$.

17 Hitting and mixing. Definition: the hitting time of a set $A \subseteq \Omega$ is $T_A := \min\{t : X_t \in A\}$.

18 Hitting and mixing (cont.). Hitting times of worst sets are related to mixing (Aldous, mid-'80s). This was refined independently by Oliveira (2011) and Peres–Sousi (2011) (the case $\alpha = 1/2$ is due to Griffiths–Kang–Oliveira–Patel 2012): for any irreducible reversible lazy Markov chain and $0 < \alpha \le 1/2$, $t_H(\alpha) = \Theta_\alpha(t_{\mathrm{mix}})$, where $t_H(\alpha) := \max_{x,\,A:\, \pi(A) \ge \alpha} E_x[T_A]$. (2)

19 Hitting and mixing (cont.). We relate $d(t)$ to $\max_{x,\,A:\, \pi(A) \ge \alpha} P_x[T_A > t]$ and refine (2) by also allowing $1/2 < \alpha \le 1 - \exp[-C\, t_{\mathrm{mix}}/t_{\mathrm{rel}}]$ and improving $\Theta_\alpha$ to $\Theta$. Remark: (2) may fail for $\alpha > 1/2$.
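On a tiny chain, $t_H(\alpha)$ can be computed by brute force over all qualifying sets, which makes the definition in (2) concrete. This is our own illustrative code (not from the talk); expected hitting times solve the linear system $h = 1 + Qh$ off $A$, $h = 0$ on $A$, where $Q$ is $P$ restricted to $A^c$:

```python
import numpy as np
from itertools import combinations

def lazy_cycle(n):
    """Lazy simple random walk on the n-cycle."""
    P = np.zeros((n, n))
    for x in range(n):
        P[x, x] = 0.5
        P[x, (x + 1) % n] += 0.25
        P[x, (x - 1) % n] += 0.25
    return P

def expected_hitting_times(P, A):
    """E_x[T_A] for every state x (zero on A): solve h = 1 + Qh off A."""
    n = P.shape[0]
    out = np.zeros(n)
    rest = [x for x in range(n) if x not in A]
    if rest:
        Q = P[np.ix_(rest, rest)]
        out[rest] = np.linalg.solve(np.eye(len(rest)) - Q, np.ones(len(rest)))
    return out

def t_H(P, pi, alpha):
    """max over x and sets A with pi(A) >= alpha of E_x[T_A] (brute force)."""
    n = P.shape[0]
    best = 0.0
    for k in range(1, n + 1):
        for A in combinations(range(n), k):
            if pi[list(A)].sum() >= alpha - 1e-12:   # float tolerance
                best = max(best, expected_hitting_times(P, list(A)).max())
    return best

P = lazy_cycle(6)
pi = np.full(6, 1.0 / 6)
worst_half = t_H(P, pi, 0.5)
```

Note that $t_H(\alpha)$ is non-increasing in $\alpha$: a larger $\alpha$ leaves fewer (and larger, hence easier-to-hit) qualifying sets.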

20 Counter-example. Figure: two copies of the complete graph $K_n$ ($n$ is the index of the chain).

21 Hitting and cutoff. Concentration of hitting times of worst sets is related to cutoff in birth-and-death (BD) chains.

22 Hitting and cutoff (cont.). Diaconis & Saloff-Coste ('06) (separation cutoff) and Ding–Lubetzky–Peres ('10) (TV cutoff): a sequence of BD chains exhibits cutoff iff the Prod. Cond. holds.

23 Hitting and cutoff (cont.). We extend their results to weighted nearest-neighbor random walks on trees.

24 Cutoff for trees. Theorem. Let $(V, P, \pi)$ be a lazy Markov chain on a tree $T = (V, E)$ with $|V| \ge 3$. Then $t_{\mathrm{mix}}(\epsilon) - t_{\mathrm{mix}}(1-\epsilon) \le C |\log \epsilon| \sqrt{t_{\mathrm{rel}}\, t_{\mathrm{mix}}}$ for any $0 < \epsilon \le 1/4$. In particular, the Prod. Cond. implies cutoff with cutoff window $w_n = \sqrt{t_{\mathrm{rel}}^{(n)} t_{\mathrm{mix}}^{(n)}}$ and $c_\epsilon = C |\log \epsilon|$.

25 Cutoff for trees (cont.). Ding–Lubetzky–Peres ('10): for BD chains $t_{\mathrm{mix}}(\epsilon) - t_{\mathrm{mix}}(1-\epsilon) \le O(\epsilon^{-1} \sqrt{t_{\mathrm{rel}}\, t_{\mathrm{mix}}})$, and in some cases $w_n = \Omega(\sqrt{t_{\mathrm{rel}}^{(n)} t_{\mathrm{mix}}^{(n)}})$.

26 To mix: escape and then relax. Definition: $\mathrm{hit}_\alpha := \mathrm{hit}_\alpha(1/4)$, where $\mathrm{hit}_{\alpha,x}(\epsilon) := \min\{t : P_x[T_A > t] \le \epsilon \text{ for all } A \subseteq \Omega \text{ s.t. } \pi(A) \ge \alpha\}$ and $\mathrm{hit}_\alpha(\epsilon) := \max_{x \in \Omega} \mathrm{hit}_{\alpha,x}(\epsilon)$.

27 To mix: escape and then relax (cont.). Easy direction: to mix, the chain must first escape from small sets; this is the first stage of mixing.

28 To mix: escape and then relax (cont.). Loosely speaking, we show that in the second stage of mixing the chain mixes at the fastest possible rate, governed by its relaxation-time.
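The quantity $\mathrm{hit}_\alpha(\epsilon)$ can likewise be computed exactly on a tiny chain: making $A$ absorbing, the tail $P_x[T_A > t]$ is a row sum of $Q^t$ where $Q$ is $P$ restricted to $A^c$. A brute-force sketch with our own names (`uniform_tail_time`, `hit_alpha`), not code from the talk:

```python
import numpy as np
from itertools import combinations

def lazy_cycle(n):
    P = np.zeros((n, n))
    for x in range(n):
        P[x, x] = 0.5
        P[x, (x + 1) % n] += 0.25
        P[x, (x - 1) % n] += 0.25
    return P

def uniform_tail_time(P, A, eps, t_max=10**4):
    """min t with P_x[T_A > t] <= eps for every start x, for one set A."""
    n = P.shape[0]
    rest = [s for s in range(n) if s not in A]
    if not rest:                  # A is the whole space: T_A = 0
        return 0
    Q = P[np.ix_(rest, rest)]     # chain killed on hitting A
    u = np.ones(len(rest))
    for t in range(1, t_max + 1):
        u = Q @ u                 # u[i] = P_{rest[i]}[T_A > t]
        if u.max() <= eps:
            return t
    raise RuntimeError("t_max too small")

def hit_alpha(P, pi, alpha, eps):
    """hit_alpha(eps): worst tail time over all A with pi(A) >= alpha."""
    n = P.shape[0]
    return max(
        uniform_tail_time(P, list(A), eps)
        for k in range(1, n + 1)
        for A in combinations(range(n), k)
        if pi[list(A)].sum() >= alpha - 1e-12   # float tolerance
    )

P = lazy_cycle(6)
pi = np.full(6, 1.0 / 6)
h = hit_alpha(P, pi, 0.5, 0.25)   # hit_{1/2} = hit_{1/2}(1/4) as on the slide
```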

29 Hitting times when $X_0 \sim \pi$. Fact: let $A \subseteq \Omega$ be such that $\pi(A) \ge 1/2$. Then (under reversibility) $P_\pi[T_A > 2s\, t_{\mathrm{rel}}] \le \tfrac{1}{2} e^{-s}$ for all $s \ge 0$.

30 Hitting times when $X_0 \sim \pi$ (cont.). By a coupling argument, $P_x[T_A > t + 2s\, t_{\mathrm{rel}}] \le d(t) + P_\pi[T_A > 2s\, t_{\mathrm{rel}}]$.

31 Hitting of worst sets. For any reversible irreducible finite lazy chain and any $0 < \epsilon \le 1/4$, $\mathrm{hit}_{1/2}(3\epsilon) - t_{\mathrm{rel}} \log(2/\epsilon) \le t_{\mathrm{mix}}(2\epsilon) \le \mathrm{hit}_{1/2}(\epsilon) + t_{\mathrm{rel}} \log(4/\epsilon)$. The terms involving $t_{\mathrm{rel}}$ are negligible under the Prod. Cond.

33 Hitting of worst sets (cont.). A similar two-sided inequality holds for $t_{\mathrm{mix}}(1 - 2\epsilon)$.

34 Main abstract result. Definition: a sequence has $\mathrm{hit}_\alpha$-cutoff if $\mathrm{hit}_\alpha^{(n)}(\epsilon) - \mathrm{hit}_\alpha^{(n)}(1-\epsilon) = o(\mathrm{hit}_\alpha^{(n)})$ for all $0 < \epsilon < 1/4$.

35 Main abstract result (cont.). Theorem. Let $(\Omega_n, P_n, \pi_n)$ be a sequence of finite reversible lazy Markov chains. Then the following are equivalent: (i) the sequence exhibits cutoff; (ii) the sequence exhibits a $\mathrm{hit}_\alpha$-cutoff for some $\alpha \in (0, 1/2)$; (iii) the sequence exhibits a $\mathrm{hit}_\alpha$-cutoff for some $\alpha \in [1/2, 1)$ and the Prod. Cond. holds.

36 Main abstract result (cont.). The equivalence of cutoff to $\mathrm{hit}_{1/2}$-cutoff under the Prod. Cond. follows from the inequalities on the previous slide together with the fact that $\mathrm{hit}_{1/2}^{(n)} = \Theta(t_{\mathrm{mix}}^{(n)})$.

37 Main abstract result (cont.). For general $\alpha$ we show, under the Prod. Cond. (using the tail decay of $T_A/t_{\mathrm{rel}}$ when $X_0 \sim \pi$): a $\mathrm{hit}_\alpha$-cutoff for some $\alpha \in (0,1)$ implies a $\mathrm{hit}_\beta$-cutoff for all $\beta \in (0,1)$.

38 Tools. Definition: for $f \in \mathbb{R}^\Omega$ and $t \ge 0$, define $P^t f \in \mathbb{R}^\Omega$ by $P^t f(x) := E_x[f(X_t)] = \sum_y P^t(x,y) f(y)$.

39 Tools (cont.). For $f \in \mathbb{R}^\Omega$ define $E_\pi[f] := \sum_{x \in \Omega} \pi(x) f(x)$ and $\|f\|_2^2 := E_\pi[f^2]$.

40 Tools (cont.). For $g \in \mathbb{R}^\Omega$ denote $\mathrm{Var}_\pi g := \|g - E_\pi[g]\|_2^2$.

41 Tools (cont.). The following is well known and follows from elementary linear algebra. Lemma (Contraction Lemma). Let $(\Omega, P, \pi)$ be a finite reversible irreducible lazy Markov chain, let $A \subseteq \Omega$ and $t \ge 0$. Then $\mathrm{Var}_\pi P^t 1_A \le e^{-2t/t_{\mathrm{rel}}}$. (3)
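A numerical sanity check of the Contraction Lemma (our own code, on the lazy walk on the 8-cycle with $A = \{0,1,2\}$): the variance of $P^t 1_A$ under $\pi$ should stay below $e^{-2t/t_{\mathrm{rel}}}$, and should be non-increasing in $t$ since $P$ is a contraction on $L^2(\pi)$.

```python
import numpy as np

# Lazy simple random walk on the 8-cycle.
n = 8
P = np.zeros((n, n))
for x in range(n):
    P[x, x] = 0.5
    P[x, (x + 1) % n] += 0.25
    P[x, (x - 1) % n] += 0.25
pi = np.full(n, 1.0 / n)

# P is symmetric here, so its eigenvalues are directly available.
gap = 1.0 - np.sort(np.linalg.eigvalsh(P))[-2]
t_rel = 1.0 / gap

f = np.zeros(n)
f[:3] = 1.0                                    # indicator of A = {0, 1, 2}
variances, bounds = [], []
ft = f.copy()
for t in range(31):
    mean = float(pi @ ft)                      # stays pi(A) = 3/8 for all t
    variances.append(float(pi @ (ft - mean) ** 2))
    bounds.append(float(np.exp(-2 * t / t_rel)))
    ft = P @ ft                                # advance: ft becomes P^{t+1} 1_A
```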

42 Maximal inequality. The main ingredient in our approach is Starr's maximal inequality ('66), which refines Stein's maximal inequality ('61). Theorem (Maximal inequality). Let $(\Omega, P, \pi)$ be a lazy irreducible reversible Markov chain and let $f \in \mathbb{R}^\Omega$. Define the corresponding maximal function $f^* \in \mathbb{R}^\Omega$ by $f^*(x) := \sup_{0 \le k < \infty} P^k f(x) = \sup_{0 \le k < \infty} E_x[f(X_k)]$. Then for $1 < p < \infty$, $\|f^*\|_p \le q \|f\|_p$, where $1/p + 1/q = 1$. (4)

43 Combining the maximal inequality with the Contraction Lemma. Goal: for every $A \subseteq \Omega$ we want a set $G = G_s(A) \subseteq \Omega$ such that $T_G \le t$ serves as a certificate of being $\epsilon$-mixed w.r.t. $A$, and we want to control its $\pi$-measure from below.

44 Combining the maximal inequality with the Contraction Lemma (cont.). Let $\sigma_s := e^{-s/t_{\mathrm{rel}}} \ge \sqrt{\mathrm{Var}_\pi P^s 1_A}$ (Contraction Lemma). Consider $G = G_s(A) := \{g : |P^{s'}(g, A) - \pi(A)| \le 4\sigma_s \text{ for all } s' \ge s\}$.

45 Combining the maximal inequality with the Contraction Lemma (cont.). We want precision $4\sigma_s = \epsilon$, which gives $s := t_{\mathrm{rel}} \log(4/\epsilon)$.

46 Combining the maximal inequality with the Contraction Lemma (cont.). Claim: $\pi(G) \ge 1/2$. (5)

47 Combining the maximal inequality with the Contraction Lemma (cont.). Proof: set $f_s := P^s(1_A - \pi(A))$. Then $G^c \subseteq \{f_s^* > 4\|f_s\|_2\}$. Apply Starr's inequality.

48 Main idea. Claim: $t_{\mathrm{mix}}(2\epsilon) \le \mathrm{hit}_{1/2}(\epsilon) + t_{\mathrm{rel}} \log(4/\epsilon)$. Proof: recall $G := G_s(A) = \{g : |P^{s'}(g, A) - \pi(A)| \le \epsilon \text{ for all } s' \ge s\}$ with $s := t_{\mathrm{rel}} \log(4/\epsilon)$. Set $t := \mathrm{hit}_{1/2}(\epsilon)$. By the previous claim $\pi(G) \ge 1/2$, so $P_x[T_G > t] \le \epsilon$ (by the definition of $t$).

49 Main idea (cont.). For any $x$ and $A$: $|P_x^{t+s}(A) - \pi(A)| \le P_x[T_G > t] + \max_{g \in G,\, s' \ge s} |P^{s'}(g, A) - \pi(A)| \le 2\epsilon$.

50 Trees. Let $T = (V, E)$ be a finite tree and let $(V, P, \pi)$ be a lazy Markov chain corresponding to some (lazy) weighted nearest-neighbor walk on $T$ (i.e. $P(x,y) > 0$ iff $\{x,y\} \in E$ or $y = x$). Fact (Kolmogorov's cycle condition): every Markov chain on a tree is reversible.
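A small sketch (our own construction, not from the talk) of such a walk: build a lazy weighted nearest-neighbor walk on a 5-vertex tree from edge conductances $c(x,y)$, with $P(x,y) = \tfrac{1}{2} c(x,y)/c(x)$ off the diagonal, holding probability $1/2$, and $\pi(x) \propto c(x) := \sum_y c(x,y)$. Detailed balance then holds identically, illustrating the reversibility fact:

```python
import numpy as np

edges = {(0, 1): 2.0, (1, 2): 1.0, (1, 3): 3.0, (3, 4): 1.5}  # a 5-vertex tree
n = 5
c = np.zeros((n, n))
for (u, v), w in edges.items():
    c[u, v] = c[v, u] = w

deg = c.sum(axis=1)                              # c(x): total conductance at x
P = 0.5 * np.eye(n) + 0.5 * c / deg[:, None]     # lazy weighted walk
pi = deg / deg.sum()                             # stationary distribution
db = pi[:, None] * P                             # db[x, y] = pi(x) P(x, y)
```

Symmetry of `db` is exactly the reversibility condition $\pi(x)P(x,y) = \pi(y)P(y,x)$ from slide 2.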

51 Trees (cont.). Can the tree structure be used to determine the identity of the worst sets?

52 Trees (cont.). Easier question: which set of $\pi$-measure $1/2$ is the hardest to hit for a birth-and-death chain with state space $[n] := \{1, 2, \ldots, n\}$?

53 Trees (cont.). Answer: take a state $m$ with $\pi([m]) \ge 1/2$ and $\pi([m-1]) < 1/2$. Then the worst set is either $[m]$ or $[n] \setminus [m-1]$.

54 Trees (cont.). How do we generalize this to trees?

55 Central vertex. Figure: a vertex $o \in V$ is called a central-vertex if each connected component $C_i$ of $T \setminus \{o\}$ has stationary probability $\pi(C_i) \le 1/2$.

56 Trees (cont.). There is always a central-vertex (and at most two). We fix one, denote it by $o$, and call it the root.
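Finding a central-vertex is a simple tree traversal; this is our own illustrative code (not from the talk), checking each candidate $o$ by measuring the components of $T \setminus \{o\}$:

```python
def central_vertices(adj, pi):
    """Central vertices of the tree given as an adjacency dict {v: [neighbors]}."""
    out = []
    for o in adj:
        ok = True
        seen = {o}
        for start in adj[o]:
            # explore the connected component of T \ {o} containing `start`
            comp, stack = 0.0, [start]
            seen.add(start)
            while stack:
                v = stack.pop()
                comp += pi[v]
                for w in adj[v]:
                    if w not in seen:
                        seen.add(w)
                        stack.append(w)
            if comp > 0.5:
                ok = False
                break
        if ok:
            out.append(o)
    return out

# Path 0-1-2-3-4 with uniform measure: only the middle vertex is central.
path5 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
centers5 = central_vertices(path5, {v: 0.2 for v in path5})

# Path 0-1-2-3 with uniform measure: both middle vertices ("at most 2" case).
path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
centers4 = central_vertices(path4, {v: 0.25 for v in path4})
```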

57 Trees (cont.). It follows from our analysis that for trees the Prod. Cond. holds iff $T_o$ is concentrated (started from the worst leaf); a counterintuitive result, in view of the unweighted trees of Peres–Sousi ('13).

58 Trees (cont.). Let $A$ be such that $\pi(A) \ge 1/2$. Partition $V$ into $B$ and $D = V \setminus B$ such that $B$ is connected, $o \in B$, and $\pi(A') \ge 1/4$, where $A' := (D \cup \{o\}) \cap A$. Then $P_o[T_A > s] \le P_o[T_{A'} > s] \le P_{\pi_B}[T_{A'} > s] \le 2 P_\pi[T_{A'} > s]$, where $\pi_B$ is $\pi$ conditioned on $B$. Taking $s := c\, t_{\mathrm{rel}} \log(1/\epsilon)$ gives $P_o[T_A > s] \le \epsilon$, whence $\mathrm{hit}_{1/2}(a + \epsilon) \le \min\{t : P_x[T_o > t] \le a \text{ for all } x\} + s$. Trivially, $\min\{t : P_x[T_o > t] \le a \text{ for all } x\} \le \mathrm{hit}_{1/2}(a)$. Figure: hitting the worst set is roughly like hitting $o$.

59 Trees (cont.). Cutoff would follow if we show that $T_o$ is concentrated (under the Prod. Cond.). More precisely, we need to show that $E_x[T_o] = \Omega(t_{\mathrm{mix}})$ implies $T_{y_\beta(x)}$ is concentrated if $X_0 = x$.

61 Trees (cont.). Figure: let $v_0 = x, v_1, \ldots, v_k = o$ be the vertices along the path from $x$ to $o$.

62 Trees: proof of concentration. Proof that $\mathrm{Var}_x[T_o] \le C\, t_{\mathrm{rel}}\, t_{\mathrm{mix}}$: it suffices to show that $\mathrm{Var}_x[T_o] \le 4 t_{\mathrm{rel}} E_x[T_o]$.

63 Trees: proof of concentration (cont.). If $X_0 = x$ then $T_o$ is the sum of the crossing times of the edges along the path from $x$ to $o$: $\tau_i := T_{v_i} - T_{v_{i-1}} \stackrel{d}{=} T_{v_i}$ under $X_0 = v_{i-1}$.

64 Trees: proof of concentration (cont.). The $\tau_1, \ldots, \tau_k$ are independent, so it suffices to bound the sum of their second moments: $\mathrm{Var}_x[T_o] = \sum_i \mathrm{Var}_x[\tau_i] = \sum_i \mathrm{Var}_{v_{i-1}}[T_{v_i}] \le \sum_i E_{v_{i-1}}[T_{v_i}^2]$.

65 Trees: proof of concentration (cont.). Denote by $W_v$ the subtree rooted at $v$ (the set of vertices whose path to $o$ passes through $v$). For $A \subseteq \Omega$ let $\pi_A$ be $\pi$ conditioned on $A$. Kac's formula implies that for any $A$ there exists a distribution $\mu$ on the external vertex boundary of $A$ such that $E_\mu[T_A^2] \le 2 E_\mu[T_A]\, E_{\pi_{A^c}}[T_A]$, so by the tree structure $E_{v_{i-1}}[T_{v_i}^2] \le 2 E_{v_{i-1}}[T_{v_i}]\, E_{\pi_{W_{v_{i-1}}}}[T_{v_i}]$. It is not hard to show $E_{\pi_{W_{v_{i-1}}}}[T_{v_i}] \le 2 t_{\mathrm{rel}}$ (generally $\pi(A^c) E_{\pi_{A^c}}[T_A] \le t_{\mathrm{rel}}$), so $\sum_i E_{v_{i-1}}[T_{v_i}^2] \le 4 t_{\mathrm{rel}} \sum_i E_{v_{i-1}}[T_{v_i}] = 4 t_{\mathrm{rel}} E_x[T_o]$.

66 Beyond trees. The tree assumption can be relaxed. In particular, we can treat jumps to vertices within bounded distance on a tree (i.e. if the length of the path from $u$ to $v$ in the tree, which is now just an auxiliary structure, is $> r$, then $P(u,v) = 0$), under some mild necessary assumption.

67 Beyond trees (cont.). Previously the BD assumption could not be relaxed, mainly because it was exploited via a representation-of-hitting-times result for BD chains.

68 Beyond trees (cont.). In particular, if $P(u,v) \ge \delta > 0$ for all $u, v$ with $d_T(u,v) \le r$ (and otherwise $P(u,v) = 0$), then cutoff holds iff the Prod. Cond. holds.

69 Last remark. Previously: good expansion of small sets can improve mixing. Now we know: considering the expansion only of small sets, together with $t_{\mathrm{rel}}$, suffices to bound $t_{\mathrm{mix}}$! Indeed, $t_{\mathrm{mix}}(\epsilon) \le \mathrm{hit}_{1-\epsilon/4}(3\epsilon/4) + \tfrac{3}{2} t_{\mathrm{rel}} \log(4/\epsilon)$, from which it follows that $t_{\mathrm{mix}} \le 5 \max_{x,\,A:\, \pi(A) \ge 1-\epsilon/4} E_x[T_A] + \tfrac{3}{2} t_{\mathrm{rel}} \log(4/\epsilon)$. For any $x$ and $A$ with $\pi(A) \ge 1 - \epsilon/4$ we can bound $E_x[T_A]$ using the expansion profile of sets of $\pi$-measure at most $\epsilon/4$ only (by an integral of the form used to bound the mixing time via the expansion profile).

70 Last remark (cont.). In practice, we can take $\epsilon = \exp[-c\, t_{\mathrm{mix}}/t_{\mathrm{rel}}]$ to determine $t_{\mathrm{mix}}$ up to a constant.

71 Open problems. What can be said about the geometry of the worst sets in some interesting particular cases (say, under transitivity or monotonicity)?

72 Open problems (cont.). When can the worst sets be described as $\{f_2 \le C\}$ (where $P f_2 = \lambda_2 f_2$)? (This would imply several new cutoff results if true in certain cases.) When can one relate the escape time from balls of $\pi$-measure $\epsilon$ to the escape time from sets of $\pi$-measure $\epsilon^{100}/100$? When can monotonicity w.r.t. a partial order (preserved by the chain) be used to describe the worst sets and their hitting-time distributions?

Glauber Dynamics for Ising Model I AMS Short Course

Glauber Dynamics for Ising Model I AMS Short Course Glauber Dynamics for Ising Model I AMS Short Course January 2010 Ising model Let G n = (V n, E n ) be a graph with N = V n < vertices. The nearest-neighbor Ising model on G n is the probability distribution

More information

Lecture 3: September 10

Lecture 3: September 10 CS294 Markov Chain Monte Carlo: Foundations & Applications Fall 2009 Lecture 3: September 10 Lecturer: Prof. Alistair Sinclair Scribes: Andrew H. Chan, Piyush Srivastava Disclaimer: These notes have not

More information

4: The Pandemic process

4: The Pandemic process 4: The Pandemic process David Aldous July 12, 2012 (repeat of previous slide) Background meeting model with rates ν. Model: Pandemic Initially one agent is infected. Whenever an infected agent meets another

More information

CUTOFF FOR THE ISING MODEL ON THE LATTICE

CUTOFF FOR THE ISING MODEL ON THE LATTICE CUTOFF FOR THE ISING MODEL ON THE LATTICE EYAL LUBETZKY AND ALLAN SLY Abstract. Introduced in 1963, Glauber dynamics is one of the most practiced and extensively studied methods for sampling the Ising

More information

Mixing Times and Hitting Times

Mixing Times and Hitting Times Mixing Times and Hitting Times David Aldous January 12, 2010 Old Results and Old Puzzles Levin-Peres-Wilmer give some history of the emergence of this Mixing Times topic in the early 1980s, and we nowadays

More information

Introduction to Markov Chains and Riffle Shuffling

Introduction to Markov Chains and Riffle Shuffling Introduction to Markov Chains and Riffle Shuffling Nina Kuklisova Math REU 202 University of Chicago September 27, 202 Abstract In this paper, we introduce Markov Chains and their basic properties, and

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Mean-field dual of cooperative reproduction

Mean-field dual of cooperative reproduction The mean-field dual of systems with cooperative reproduction joint with Tibor Mach (Prague) A. Sturm (Göttingen) Friday, July 6th, 2018 Poisson construction of Markov processes Let (X t ) t 0 be a continuous-time

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

CUTOFF FOR THE EAST PROCESS

CUTOFF FOR THE EAST PROCESS CUTOFF FOR THE EAST PROCESS S. GANGULY, E. LUBETZKY, AND F. MARTINELLI ABSTRACT. The East process is a D kinetically constrained interacting particle system, introduced in the physics literature in the

More information

Lecture 28: April 26

Lecture 28: April 26 CS271 Randomness & Computation Spring 2018 Instructor: Alistair Sinclair Lecture 28: April 26 Disclaimer: These notes have not been subjected to the usual scrutiny accorded to formal publications. They

More information

COUPLING TIMES FOR RANDOM WALKS WITH INTERNAL STATES

COUPLING TIMES FOR RANDOM WALKS WITH INTERNAL STATES COUPLING TIMES FOR RANDOM WALKS WITH INTERNAL STATES ELYSE AZORR, SAMUEL J. GHITELMAN, RALPH MORRISON, AND GREG RICE ADVISOR: YEVGENIY KOVCHEGOV OREGON STATE UNIVERSITY ABSTRACT. Using coupling techniques,

More information

On Markov Chain Monte Carlo

On Markov Chain Monte Carlo MCMC 0 On Markov Chain Monte Carlo Yevgeniy Kovchegov Oregon State University MCMC 1 Metropolis-Hastings algorithm. Goal: simulating an Ω-valued random variable distributed according to a given probability

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

G(t) := i. G(t) = 1 + e λut (1) u=2

G(t) := i. G(t) = 1 + e λut (1) u=2 Note: a conjectured compactification of some finite reversible MCs There are two established theories which concern different aspects of the behavior of finite state Markov chains as the size of the state

More information

CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS. Contents

CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS. Contents CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS ARI FREEDMAN Abstract. In this expository paper, I will give an overview of the necessary conditions for convergence in Markov chains on finite state spaces.

More information

Topic 1: a paper The asymmetric one-dimensional constrained Ising model by David Aldous and Persi Diaconis. J. Statistical Physics 2002.

Topic 1: a paper The asymmetric one-dimensional constrained Ising model by David Aldous and Persi Diaconis. J. Statistical Physics 2002. Topic 1: a paper The asymmetric one-dimensional constrained Ising model by David Aldous and Persi Diaconis. J. Statistical Physics 2002. Topic 2: my speculation on use of constrained Ising as algorithm

More information

CUTOFF FOR THE STAR TRANSPOSITION RANDOM WALK

CUTOFF FOR THE STAR TRANSPOSITION RANDOM WALK CUTOFF FOR THE STAR TRANSPOSITION RANDOM WALK JONATHAN NOVAK AND ARIEL SCHVARTZMAN Abstract. In this note, we give a complete proof of the cutoff phenomenon for the star transposition random walk on the

More information

Sensitivity of mixing times in Eulerian digraphs

Sensitivity of mixing times in Eulerian digraphs Sensitivity of mixing times in Eulerian digraphs Lucas Boczkowski Yuval Peres Perla Sousi Abstract Let X be a lazy random walk on a graph G. If G is undirected, then the mixing time is upper bounded by

More information

Discrete solid-on-solid models

Discrete solid-on-solid models Discrete solid-on-solid models University of Alberta 2018 COSy, University of Manitoba - June 7 Discrete processes, stochastic PDEs, deterministic PDEs Table: Deterministic PDEs Heat-diffusion equation

More information

Latent voter model on random regular graphs

Latent voter model on random regular graphs Latent voter model on random regular graphs Shirshendu Chatterjee Cornell University (visiting Duke U.) Work in progress with Rick Durrett April 25, 2011 Outline Definition of voter model and duality with

More information

CUTOFF FOR GENERAL SPIN SYSTEMS WITH ARBITRARY BOUNDARY CONDITIONS

CUTOFF FOR GENERAL SPIN SYSTEMS WITH ARBITRARY BOUNDARY CONDITIONS CUTOFF FOR GENERAL SPIN SYSTEMS WITH ARBITRARY BOUNDARY CONDITIONS EYAL LUBETZKY AND ALLAN SLY Abstract. The cutoff phenomenon describes a sharp transition in the convergence of a Markov chain to equilibrium.

More information

Edge Flip Chain for Unbiased Dyadic Tilings

Edge Flip Chain for Unbiased Dyadic Tilings Sarah Cannon, Georgia Tech,, David A. Levin, U. Oregon Alexandre Stauffer, U. Bath, March 30, 2018 Dyadic Tilings A dyadic tiling is a tiling on [0,1] 2 by n = 2 k dyadic rectangles, rectangles which are

More information

2. The averaging process

2. The averaging process 2. The averaging process David Aldous June 19, 2012 Write-up published as A Lecture on the Averaging Process (with Dan Lanoue) in recent Probability Surveys. Intended as illustration of how one might teach

More information

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1 Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).

More information

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past. 1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

ARCC WORKSHOP: SHARP THRESHOLDS FOR MIXING TIMES SCRIBES: SHARAD GOEL, ASAF NACHMIAS, PETER RALPH AND DEVAVRAT SHAH

ARCC WORKSHOP: SHARP THRESHOLDS FOR MIXING TIMES SCRIBES: SHARAD GOEL, ASAF NACHMIAS, PETER RALPH AND DEVAVRAT SHAH ARCC WORKSHOP: SHARP THRESHOLDS FOR MIXING TIMES SCRIBES: SHARAD GOEL, ASAF NACHMIAS, PETER RALPH AND DEVAVRAT SHAH Day I 1. Speaker I: Persi Diaconis Definition 1 (Sharp Threshold). A Markov chain with

More information

Lecture 8: Path Technology

Lecture 8: Path Technology Counting and Sampling Fall 07 Lecture 8: Path Technology Lecturer: Shayan Oveis Gharan October 0 Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

A Proof of Aldous Spectral Gap Conjecture. Thomas M. Liggett, UCLA. Stirring Processes

A Proof of Aldous Spectral Gap Conjecture. Thomas M. Liggett, UCLA. Stirring Processes A Proof of Aldous Spectral Gap Conjecture Thomas M. Liggett, UCLA Stirring Processes Given a graph G = (V, E), associate with each edge e E a Poisson process Π e with rate c e 0. Labels are put on the

More information

Some Definition and Example of Markov Chain

Some Definition and Example of Markov Chain Some Definition and Example of Markov Chain Bowen Dai The Ohio State University April 5 th 2016 Introduction Definition and Notation Simple example of Markov Chain Aim Have some taste of Markov Chain and

More information

Lecture 6: September 22

Lecture 6: September 22 CS294 Markov Chain Monte Carlo: Foundations & Applications Fall 2009 Lecture 6: September 22 Lecturer: Prof. Alistair Sinclair Scribes: Alistair Sinclair Disclaimer: These notes have not been subjected

More information

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains Markov Chains A random process X is a family {X t : t T } of random variables indexed by some set T. When T = {0, 1, 2,... } one speaks about a discrete-time process, for T = R or T = [0, ) one has a continuous-time

More information

MATH37012 Week 10. Dr Jonathan Bagley. Semester

MATH37012 Week 10. Dr Jonathan Bagley. Semester MATH37012 Week 10 Dr Jonathan Bagley Semester 2-2018 2.18 a) Finding and µ j for a particular category of B.D. processes. Consider a process where the destination of the next transition is determined by

More information

INFORMATION PERCOLATION AND CUTOFF FOR THE STOCHASTIC ISING MODEL

INFORMATION PERCOLATION AND CUTOFF FOR THE STOCHASTIC ISING MODEL INFORMATION PERCOLATION AND CUTOFF FOR THE STOCHASTIC ISING MODEL EYAL LUBETZKY AND ALLAN SLY Abstract. We introduce a new framework for analyzing Glauber dynamics for the Ising model. The traditional

More information

process on the hierarchical group

process on the hierarchical group Intertwining of Markov processes and the contact process on the hierarchical group April 27, 2010 Outline Intertwining of Markov processes Outline Intertwining of Markov processes First passage times of

More information

Cut-off and Escape Behaviors for Birth and Death Chains on Trees

Cut-off and Escape Behaviors for Birth and Death Chains on Trees ALEA, Lat. Am. J. Probab. Math. Stat. 8, 49 6 0 Cut-off and Escape Behaviors for Birth and Death Chains on Trees Olivier Bertoncini Department of Physics, University of Aveiro, 380-93 Aveiro, Portugal.

More information

Convergence Rate of Markov Chains

Convergence Rate of Markov Chains Convergence Rate of Markov Chains Will Perkins April 16, 2013 Convergence Last class we saw that if X n is an irreducible, aperiodic, positive recurrent Markov chain, then there exists a stationary distribution

More information

Lecture 2: September 8

Lecture 2: September 8 CS294 Markov Chain Monte Carlo: Foundations & Applications Fall 2009 Lecture 2: September 8 Lecturer: Prof. Alistair Sinclair Scribes: Anand Bhaskar and Anindya De Disclaimer: These notes have not been

More information

MIXING TIMES OF RANDOM WALKS ON DYNAMIC CONFIGURATION MODELS

MIXING TIMES OF RANDOM WALKS ON DYNAMIC CONFIGURATION MODELS MIXING TIMES OF RANDOM WALKS ON DYNAMIC CONFIGURATION MODELS Frank den Hollander Leiden University, The Netherlands New Results and Challenges with RWDRE, 11 13 January 2017, Heilbronn Institute for Mathematical

More information

Weighted Sums of Orthogonal Polynomials Related to Birth-Death Processes with Killing

Weighted Sums of Orthogonal Polynomials Related to Birth-Death Processes with Killing Advances in Dynamical Systems and Applications ISSN 0973-5321, Volume 8, Number 2, pp. 401 412 (2013) http://campus.mst.edu/adsa Weighted Sums of Orthogonal Polynomials Related to Birth-Death Processes

More information

curvature, mixing, and entropic interpolation Simons Feb-2016 and CSE 599s Lecture 13

curvature, mixing, and entropic interpolation Simons Feb-2016 and CSE 599s Lecture 13 curvature, mixing, and entropic interpolation Simons Feb-2016 and CSE 599s Lecture 13 James R. Lee University of Washington Joint with Ronen Eldan (Weizmann) and Joseph Lehec (Paris-Dauphine) Markov chain

More information

1 Independent increments

1 Independent increments Tel Aviv University, 2008 Brownian motion 1 1 Independent increments 1a Three convolution semigroups........... 1 1b Independent increments.............. 2 1c Continuous time................... 3 1d Bad

More information

1 Sequences of events and their limits

1 Sequences of events and their limits O.H. Probability II (MATH 2647 M15 1 Sequences of events and their limits 1.1 Monotone sequences of events Sequences of events arise naturally when a probabilistic experiment is repeated many times. For

More information

MIXING TIME OF CRITICAL ISING MODEL ON TREES IS POLYNOMIAL IN THE HEIGHT

MIXING TIME OF CRITICAL ISING MODEL ON TREES IS POLYNOMIAL IN THE HEIGHT MIXING TIME OF CRITICAL ISING MODEL ON TREES IS POLYNOMIAL IN THE HEIGHT JIAN DING, EYAL LUBETZKY AND YUVAL PERES Abstract. In the heat-bath Glauber dynamics for the Ising model on the lattice, physicists

More information

FAST INITIAL CONDITIONS FOR GLAUBER DYNAMICS

FAST INITIAL CONDITIONS FOR GLAUBER DYNAMICS FAST INITIAL CONDITIONS FOR GLAUBER DYNAMICS EYAL LUBETZKY AND ALLAN SLY Abstract. In the study of Markov chain mixing times, analysis has centered on the performance from a worst-case starting state.

More information

A GIBBS SAMPLER ON THE N-SIMPLEX. By Aaron Smith Stanford University

A GIBBS SAMPLER ON THE N-SIMPLEX. By Aaron Smith Stanford University Submitted to the Annals of Applied Probability arxiv: math.pr/0290964 A GIBBS SAMPLER ON THE N-SIMPLEX By Aaron Smith Stanford University We determine the mixing time of a simple Gibbs sampler on the unit

More information

Brownian Motion. Chapter Definition of Brownian motion

Brownian Motion. Chapter Definition of Brownian motion Chapter 5 Brownian Motion Brownian motion originated as a model proposed by Robert Brown in 1828 for the phenomenon of continual swarming motion of pollen grains suspended in water. In 1900, Bachelier

More information

Lecture 14: Random Walks, Local Graph Clustering, Linear Programming

Lecture 14: Random Walks, Local Graph Clustering, Linear Programming CSE 521: Design and Analysis of Algorithms I Winter 2017 Lecture 14: Random Walks, Local Graph Clustering, Linear Programming Lecturer: Shayan Oveis Gharan 3/01/17 Scribe: Laura Vonessen Disclaimer: These

More information

Model Counting for Logical Theories

Model Counting for Logical Theories Model Counting for Logical Theories Wednesday Dmitry Chistikov Rayna Dimitrova Department of Computer Science University of Oxford, UK Max Planck Institute for Software Systems (MPI-SWS) Kaiserslautern

More information

Lecture 21. David Aldous. 16 October David Aldous Lecture 21

Lecture 21. David Aldous. 16 October David Aldous Lecture 21 Lecture 21 David Aldous 16 October 2015 In continuous time 0 t < we specify transition rates or informally P(X (t+δ)=j X (t)=i, past ) q ij = lim δ 0 δ P(X (t + dt) = j X (t) = i) = q ij dt but note these

More information

WAITING FOR A BAT TO FLY BY (IN POLYNOMIAL TIME)

WAITING FOR A BAT TO FLY BY (IN POLYNOMIAL TIME) WAITING FOR A BAT TO FLY BY (IN POLYNOMIAL TIME ITAI BENJAMINI, GADY KOZMA, LÁSZLÓ LOVÁSZ, DAN ROMIK, AND GÁBOR TARDOS Abstract. We observe returns of a simple random wal on a finite graph to a fixed node,

More information

LECTURE NOTES FOR MARKOV CHAINS: MIXING TIMES, HITTING TIMES, AND COVER TIMES IN SAINT PETERSBURG SUMMER SCHOOL, 2012

LECTURE NOTES FOR MARKOV CHAINS: MIXING TIMES, HITTING TIMES, AND COVER TIMES IN SAINT PETERSBURG SUMMER SCHOOL, 2012 LECTURE NOTES FOR MARKOV CHAINS: MIXING TIMES, HITTING TIMES, AND COVER TIMES IN SAINT PETERSBURG SUMMER SCHOOL, 2012 By Júlia Komjáthy Yuval Peres Eindhoven University of Technology and Microsoft Research

More information

(K2) For sum of the potential differences around any cycle is equal to zero. r uv resistance of {u, v}. [In other words, V ir.]

(K2) For sum of the potential differences around any cycle is equal to zero. r uv resistance of {u, v}. [In other words, V ir.] Lectures 16-17: Random walks on graphs CSE 525, Winter 2015 Instructor: James R. Lee 1 Random walks Let G V, E) be an undirected graph. The random walk on G is a Markov chain on V that, at each time step,

More information

Lecture 14: October 22

Lecture 14: October 22 CS294 Markov Chain Monte Carlo: Foundations & Applications Fall 2009 Lecture 14: October 22 Lecturer: Alistair Sinclair Scribes: Alistair Sinclair Disclaimer: These notes have not been subjected to the

More information

Vertex and edge expansion properties for rapid mixing

Vertex and edge expansion properties for rapid mixing Vertex and edge expansion properties for rapid mixing Ravi Montenegro Abstract We show a strict hierarchy among various edge and vertex expansion properties of Markov chains. This gives easy proofs of

More information

Random Times and Their Properties

Random Times and Their Properties Chapter 6 Random Times and Their Properties Section 6.1 recalls the definition of a filtration (a growing collection of σ-fields) and of stopping times (basically, measurable random times). Section 6.2

More information

University of Chicago Autumn 2003 CS Markov Chain Monte Carlo Methods

University of Chicago Autumn 2003 CS Markov Chain Monte Carlo Methods University of Chicago Autumn 2003 CS37101-1 Markov Chain Monte Carlo Methods Lecture 4: October 21, 2003 Bounding the mixing time via coupling Eric Vigoda 4.1 Introduction In this lecture we ll use the

More information

INTRODUCTION TO MARKOV CHAINS AND MARKOV CHAIN MIXING

INTRODUCTION TO MARKOV CHAINS AND MARKOV CHAIN MIXING INTRODUCTION TO MARKOV CHAINS AND MARKOV CHAIN MIXING ERIC SHANG Abstract. This paper provides an introduction to Markov chains and their basic classifications and interesting properties. After establishing

More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2

n! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2 Order statistics Ex. 4.1 (*. Let independent variables X 1,..., X n have U(0, 1 distribution. Show that for every x (0, 1, we have P ( X (1 < x 1 and P ( X (n > x 1 as n. Ex. 4.2 (**. By using induction

More information

Lecture Introduction. 2 Brief Recap of Lecture 10. CS-621 Theory Gems October 24, 2012

Lecture Introduction. 2 Brief Recap of Lecture 10. CS-621 Theory Gems October 24, 2012 CS-62 Theory Gems October 24, 202 Lecture Lecturer: Aleksander Mądry Scribes: Carsten Moldenhauer and Robin Scheibler Introduction In Lecture 0, we introduced a fundamental object of spectral graph theory:

More information

Brownian Motion and Conditional Probability

Brownian Motion and Conditional Probability Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical

More information

The expansion of random regular graphs

The expansion of random regular graphs The expansion of random regular graphs David Ellis Introduction Our aim is now to show that for any d 3, almost all d-regular graphs on {1, 2,..., n} have edge-expansion ratio at least c d d (if nd is

More information

More complete intersection theorems

More complete intersection theorems More complete intersection theorems Yuval Filmus April 4, 2017 Abstract The seminal complete intersection theorem of Ahlswede and Khachatrian gives the maximum cardinality of a k-uniform t-intersecting

More information

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10. x n+1 = f(x n ),

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10. x n+1 = f(x n ), MS&E 321 Spring 12-13 Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10 Section 4: Steady-State Theory Contents 4.1 The Concept of Stochastic Equilibrium.......................... 1 4.2

More information

MIXING TIME OF NEAR-CRITICAL RANDOM GRAPHS

MIXING TIME OF NEAR-CRITICAL RANDOM GRAPHS MIXING TIME OF NEAR-CRITICAL RANDOM GRAPHS JIAN DING, EYAL LUBETZKY AND YUVAL PERES Abstract. Let C 1 be the largest component of the Erdős-Rényi random graph G(n, p). The mixing time of random walk on

More information

Probability and Measure

Probability and Measure Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability

More information

CONSTRAINED PERCOLATION ON Z 2

CONSTRAINED PERCOLATION ON Z 2 CONSTRAINED PERCOLATION ON Z 2 ZHONGYANG LI Abstract. We study a constrained percolation process on Z 2, and prove the almost sure nonexistence of infinite clusters and contours for a large class of probability

More information

2. The averaging process

2. The averaging process 2. The averaging process David Aldous July 10, 2012 The two models in lecture 1 has only rather trivial interaction. In this lecture I discuss a simple FMIE process with less trivial interaction. It is

More information

Mixing time and diameter in random graphs

Mixing time and diameter in random graphs October 5, 2009 Based on joint works with: Asaf Nachmias, and Jian Ding, Eyal Lubetzky and Jeong-Han Kim. Background The mixing time of the lazy random walk on a graph G is T mix (G) = T mix (G, 1/4) =

More information

Math Homework 5 Solutions

Math Homework 5 Solutions Math 45 - Homework 5 Solutions. Exercise.3., textbook. The stochastic matrix for the gambler problem has the following form, where the states are ordered as (,, 4, 6, 8, ): P = The corresponding diagram

More information

The cover time of random walks on random graphs. Colin Cooper Alan Frieze

The cover time of random walks on random graphs. Colin Cooper Alan Frieze Happy Birthday Bela Random Graphs 1985 The cover time of random walks on random graphs Colin Cooper Alan Frieze The cover time of random walks on random graphs Colin Cooper Alan Frieze and Eyal Lubetzky

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Lecture 9: Counting Matchings

Lecture 9: Counting Matchings Counting and Sampling Fall 207 Lecture 9: Counting Matchings Lecturer: Shayan Oveis Gharan October 20 Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.

More information

MATH 425, FINAL EXAM SOLUTIONS

MATH 425, FINAL EXAM SOLUTIONS MATH 425, FINAL EXAM SOLUTIONS Each exercise is worth 50 points. Exercise. a The operator L is defined on smooth functions of (x, y by: Is the operator L linear? Prove your answer. L (u := arctan(xy u

More information

Lecture Notes 3 Convergence (Chapter 5)

Lecture Notes 3 Convergence (Chapter 5) Lecture Notes 3 Convergence (Chapter 5) 1 Convergence of Random Variables Let X 1, X 2,... be a sequence of random variables and let X be another random variable. Let F n denote the cdf of X n and let

More information

25.1 Markov Chain Monte Carlo (MCMC)

25.1 Markov Chain Monte Carlo (MCMC) CS880: Approximations Algorithms Scribe: Dave Andrzejewski Lecturer: Shuchi Chawla Topic: Approx counting/sampling, MCMC methods Date: 4/4/07 The previous lecture showed that, for self-reducible problems,

More information

Lecture 12. F o s, (1.1) F t := s>t

Lecture 12. F o s, (1.1) F t := s>t Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let

More information

Mixing times via super-fast coupling

Mixing times via super-fast coupling s via super-fast coupling Yevgeniy Kovchegov Department of Mathematics Oregon State University Outline 1 About 2 Definition Coupling Result. About Outline 1 About 2 Definition Coupling Result. About Coauthor

More information

Notes 6 : First and second moment methods

Notes 6 : First and second moment methods Notes 6 : First and second moment methods Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Roc, Sections 2.1-2.3]. Recall: THM 6.1 (Markov s inequality) Let X be a non-negative

More information

arxiv: v1 [math.pr] 1 Jan 2013

arxiv: v1 [math.pr] 1 Jan 2013 The role of dispersal in interacting patches subject to an Allee effect arxiv:1301.0125v1 [math.pr] 1 Jan 2013 1. Introduction N. Lanchier Abstract This article is concerned with a stochastic multi-patch

More information