2015, Problem 1. (a) Define the total variation distance $\|\mu - \nu\|_{\mathrm{tv}}$ for probability distributions $\mu, \nu$ on a finite set $S$.

Problem 1. (a) Define the total variation distance $\|\mu - \nu\|_{\mathrm{tv}}$ for probability distributions $\mu, \nu$ on a finite set $S$. Show that
$$\|\mu - \nu\|_{\mathrm{tv}} = \frac12 \sum_{x \in S} |\mu(x) - \nu(x)| = \sum_{x \in S} (\mu(x) - \nu(x))_+,$$
where $a_+ = \max(a, 0)$. Show that if $P$ is the transition matrix of an irreducible, aperiodic Markov chain on a state space $S$ with invariant distribution $\pi$, and if $d(t) = \sup_x \|P^t(x, \cdot) - \pi(\cdot)\|_{\mathrm{tv}}$, then
$$d(t) \le \bar d(t) \le 2\, d(t),$$
where $\bar d(t) = \sup_{x, y} \|P^t(x, \cdot) - P^t(y, \cdot)\|_{\mathrm{tv}}$.

(b) Define what is meant by a coupling of $\mu$ and $\nu$, and show that if $(X, Y)$ is such a coupling then $\|\mu - \nu\|_{\mathrm{tv}} \le \mathbb{P}(X \ne Y)$.

(c) Using a coupling or otherwise, show that $\bar d(t+s) \le \bar d(t)\, \bar d(s)$. Hence deduce that $\rho = \lim_{t \to \infty} d(t)^{1/t}$ exists. [Hint: you can use without proof the following lemma: if $f$ is subadditive, i.e. if $f(t+s) \le f(t) + f(s)$ for all $s, t \ge 0$, then $\lim_{t \to \infty} f(t)/t$ exists in $\mathbb{R} \cup \{-\infty\}$.]

(d) In the above setting, if the chain is also assumed to be reversible, what is the value of $\rho$? [You can use without proof any result from the course, provided it is clearly stated.]

Part III Paper, Mixing times of Markov Chains, 2014/15. Draft 6th May 2017.

Marks / Comments

(a) Definition. The total variation distance between $\mu$ and $\nu$ is
$$\|\mu - \nu\|_{\mathrm{tv}} = \sup_{A \subseteq S} |\mu(A) - \nu(A)|.$$
[2] (Both stated identities follow by noting that the supremum is attained at $A = \{x : \mu(x) > \nu(x)\}$.)

For the proof of $d(t) \le \bar d(t)$, note that
$$\pi(A) = \sum_{y \in S} \pi(y)\, P^t(y, A).$$
Therefore, by the triangle inequality:
$$\|\pi - P^t(x, \cdot)\|_{\mathrm{tv}} = \max_{A \subseteq S} |P^t(x, A) - \pi(A)| = \max_{A \subseteq S} \Big| \sum_{y \in S} \pi(y) \big[ P^t(x, A) - P^t(y, A) \big] \Big| \le \max_{A \subseteq S} \sum_{y \in S} \pi(y)\, |P^t(x, A) - P^t(y, A)| \le \bar d(t) \sum_{y \in S} \pi(y) = \bar d(t).$$
The bound $\bar d(t) \le 2 d(t)$ is again the triangle inequality: $\|P^t(x, \cdot) - P^t(y, \cdot)\|_{\mathrm{tv}} \le \|P^t(x, \cdot) - \pi\|_{\mathrm{tv}} + \|\pi - P^t(y, \cdot)\|_{\mathrm{tv}} \le 2 d(t)$. [3] Lectures

(b) A coupling of $\mu$ and $\nu$ is the realisation of a pair of random variables $(X, Y)$ on the same probability space such that $X \sim \mu$ and $Y \sim \nu$. For all couplings $(X, Y)$ of $\mu$ and $\nu$, we have:
$$\|\mu - \nu\|_{\mathrm{tv}} \le \mathbb{P}(X \ne Y). \qquad (1)$$
Furthermore, there always is a coupling $(X, Y)$ which achieves equality in (1). We start with the proof of the inequality (1), which in practice is the only thing we use. (But it is reassuring to know that this inequality is sharp!) Let $(X, Y)$ denote a coupling of $\mu$ and $\nu$. If $A$ is any subset of $S$, then we have:
$$\mu(A) - \nu(A) = \mathbb{P}(X \in A) - \mathbb{P}(Y \in A) = \big[ \mathbb{P}(X \in A, X = Y) - \mathbb{P}(Y \in A, X = Y) \big] + \big[ \mathbb{P}(X \in A, X \ne Y) - \mathbb{P}(Y \in A, X \ne Y) \big].$$
Note that the first term is in fact equal to zero, while the second is less than or equal to $\mathbb{P}(X \ne Y)$, as desired. [3] Example class
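The identities in (a) and the coupling bound in (b) are easy to sanity-check numerically. The following sketch (pure Python; the distributions $\mu, \nu$ on $\{0,1,2,3\}$ are illustrative choices, not from the text) compares the three expressions for the total variation distance and checks that the coupling keeping mass $\min(\mu(x), \nu(x))$ on the diagonal attains equality in (1).

```python
# Numerical check of the identities in (a) and the coupling bound in (b).
# The distributions mu and nu on S = {0,1,2,3} are illustrative choices,
# not taken from the text.

from itertools import chain, combinations

S = [0, 1, 2, 3]
mu = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
nu = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# Definition: sup over events A of |mu(A) - nu(A)|.
events = chain.from_iterable(combinations(S, r) for r in range(len(S) + 1))
tv_sup = max(abs(sum(mu[x] for x in A) - sum(nu[x] for x in A)) for A in events)

tv_l1 = 0.5 * sum(abs(mu[x] - nu[x]) for x in S)    # (1/2) sum |mu - nu|
tv_pos = sum(max(mu[x] - nu[x], 0.0) for x in S)    # sum of positive parts

assert abs(tv_sup - tv_l1) < 1e-12
assert abs(tv_l1 - tv_pos) < 1e-12

# An optimal coupling keeps mass min(mu(x), nu(x)) on the diagonal {X = Y},
# so P(X != Y) = 1 - sum_x min(mu(x), nu(x)) = ||mu - nu||_tv: equality in (1).
p_equal = sum(min(mu[x], nu[x]) for x in S)
assert abs((1.0 - p_equal) - tv_l1) < 1e-12
```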

(c) We use the converse to the previous inequality (no need to prove it): there is a coupling which achieves equality. Fix $x, y \in S$. Let $(X_s, Y_s)$ be an optimal coupling of $P^s(x, \cdot)$ and $P^s(y, \cdot)$, so $\|P^s(x, \cdot) - P^s(y, \cdot)\|_{\mathrm{tv}} = \mathbb{P}(X_s \ne Y_s)$. Then, letting the chains run for a further $t$ steps from $X_s$ and $Y_s$ respectively,
$$\|P^{s+t}(x, \cdot) - P^{s+t}(y, \cdot)\|_{\mathrm{tv}} = \frac12 \sum_z |\mathbb{P}(X_{s+t} = z) - \mathbb{P}(Y_{s+t} = z)| = \frac12 \sum_z \big| \mathbb{E}[P^t(X_s, z)] - \mathbb{E}[P^t(Y_s, z)] \big| \le \mathbb{E}\big[ \|P^t(X_s, \cdot) - P^t(Y_s, \cdot)\|_{\mathrm{tv}} \big] \le \bar d(t)\, \mathbb{E}[1_{X_s \ne Y_s}] = \bar d(t)\, \mathbb{P}(X_s \ne Y_s) \le \bar d(t)\, \bar d(s).$$
Taking the sup over $x, y$ finishes the proof. [6] Unseen, not easy to get it right

Let $f(t) = \log \bar d(t)$. Then $f(t+s) \le f(t) + f(s)$, so by the hint $\lim_{t \to \infty} f(t)/t$ exists in $\mathbb{R} \cup \{-\infty\}$; call it $\alpha$. Then $\lim_{t \to \infty} \bar d(t)^{1/t} = \rho := e^{\alpha} \in [0, \infty)$. Since
$$\tfrac12\, \bar d(t) \le d(t) \le \bar d(t),$$
and since $(1/2)^{1/t} \to 1$ as $t \to \infty$, we deduce from the sandwich theorem that also $d(t)^{1/t} \to \rho$. [2] Unseen

(d) Recall that by a result of the course, since the chain is reversible, we have:
$$\frac{\mathbb{P}_x(X_t = y)}{\pi(y)} = \sum_{j=1}^n \lambda_j^t\, f_j(x) f_j(y),$$
where the $f_j$ are an orthonormal basis of eigenfunctions (w.r.t. $\ell^2(\pi)$) and the $\lambda_j$ are the eigenvalues. So
$$P^t(x, y) - \pi(y) = \sum_{j=2}^n \lambda_j^t\, f_j(x) f_j(y) \pi(y).$$
Taking absolute values and summing over $y$:
$$\|P^t(x, \cdot) - \pi(\cdot)\|_{\mathrm{tv}} = \frac12 \sum_y \Big| \sum_{j=2}^n \lambda_j^t\, f_j(x) f_j(y) \pi(y) \Big|.$$
Here $n$ is fixed and we let $t \to \infty$: since each of the summands is asymptotic to $C_{xy} \lambda_j^t$, we deduce that as $t \to \infty$,
$$\|P^t(x, \cdot) - \pi(\cdot)\|_{\mathrm{tv}} \asymp (1 - \gamma_*)^t,$$
where $\gamma_*$ is the absolute spectral gap and $a_n \asymp b_n$ means $c_1 b_n \le a_n \le c_2 b_n$ for some constants $c_1, c_2$. Taking the sup over all $x$, and raising to the power $1/t$, we deduce that $\rho = 1 - \gamma_*$. [4] Unseen, not easy

[Total: 20]
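The submultiplicativity in (c) and the limit $\rho = 1 - \gamma_*$ in (d) can be illustrated on a small chain. The sketch below uses an assumed 3-state lazy birth-and-death chain with eigenvalues $1, 1/2, 0$ (not from the text), checks $\bar d(t+s) \le \bar d(t)\, \bar d(s)$, and checks that $d(t)^{1/t}$ approaches $\rho = \lambda_* = 1/2$.

```python
# Sanity check of (c)-(d) on a small reversible chain: an illustrative
# 3-state lazy birth-and-death chain (eigenvalues 1, 1/2, 0), not from the
# text. We verify dbar(t+s) <= dbar(t)*dbar(s) and that d(t)^(1/t)
# approaches rho = lambda_* = 1 - gamma_* = 1/2.

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]          # stationary (detailed balance holds)
n = 3

def matpow(A, t):
    R = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(t):
        R = [[sum(R[i][k] * A[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return R

def tv(p, q):
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def d(t):        # max_x || P^t(x,.) - pi ||_tv
    return max(tv(row, pi) for row in matpow(P, t))

def dbar(t):     # max_{x,y} || P^t(x,.) - P^t(y,.) ||_tv
    Pt = matpow(P, t)
    return max(tv(Pt[x], Pt[y]) for x in range(n) for y in range(n))

for t in range(1, 6):           # submultiplicativity, part (c)
    for s in range(1, 6):
        assert dbar(t + s) <= dbar(t) * dbar(s) + 1e-12

# d(t)^(1/t) -> rho = 1/2; for this chain d(t) = (1/2)^(t+1) exactly for t >= 1.
assert abs(d(30) ** (1 / 30) - 0.5) < 0.02
```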

4 15 Problem. (a) Let P be the transition matrix of an irreducible, aperiodic and reversible Markov chain on a finite state space S of size n with invariant distribution (π(x)) x S, with eigenvalues λ 1... λ n. Define the Dirichlet form E(f, f) associated to P, and give without proof an equivalent expression. State and prove the variational characterisation of the spectral gap in terms of E(f, f). State without proof a similar characterisation for higher order eigenvalues. (b) Let P, P be two transitive Markov chains on S, with corresponding Dirichlet forms E, Ẽ respectively. Suppose that if A > 0 is such that Ẽ(f, f) AE(f, f). State and prove a theorem concerning their respective mixing behaviours in L, defining carefully the expressions you introduce. (You can use without proof a relation between eigenvalues and L distance to stationarity, provided that this is stated clearly). (c) Define the interchange process on a connected graph G = (V, E). State a theorem giving a bound of mixing time of the interchange on G in terms of geometric quantities associated with G. (d) Suppose G = (V, E) = [0, n) Z is the n n square, and E is the set of nearest neighbour edges, so (u, v) E if and only if u v 1 = 1 for u, v V (here u 1 = u 1 + u for u = (u 1, u )). Show that the interchange process (in continuous time) satisfies t mix = O(n 4 log n). On the other hand, explain briefly, e.g. by considering the position of a single card, why t mix cn 4 for some c > 0. Page 4017 Part III Paper Mixing times of Markov Chains//15Draft 6th May 017

(a) Let $f, g : S \to \mathbb{R}$. The Dirichlet form associated with $P$ is defined by
$$E(f, g) = \langle (I - P) f, g \rangle_\pi. \quad [1]$$
It can be checked that
$$E(f, f) = \frac12 \sum_{x, y} \pi(x) P(x, y) (f(y) - f(x))^2. \quad [1] \text{ Lectures}$$

Theorem. Assume $(P, \pi)$ is reversible, and let $\gamma$ be the spectral gap. Then
$$\gamma = \min_{\substack{f : S \to \mathbb{R} \\ E_\pi(f) = 0,\ \|f\|_2 = 1}} E(f, f) = \min_{\substack{f : S \to \mathbb{R},\ f \ne 0 \\ E_\pi(f) = 0}} \frac{E(f, f)}{\|f\|_2^2}.$$
Equality is attained for $f = f_2$.

Proof: By scaling it suffices to prove the first equality. Now note that $E_\pi(f) = \langle f, 1 \rangle_\pi$, so the condition $E_\pi(f) = 0$ means that $f \perp 1 \equiv f_1$. Thus consider any function $f$ with $\|f\|_2 = 1$ and $f \perp f_1$; we have
$$f = \sum_{j=1}^n \langle f, f_j \rangle_\pi f_j = \sum_{j=2}^n \langle f, f_j \rangle_\pi f_j,$$
since $\langle f, f_1 \rangle_\pi = 0$ by assumption. Using orthonormality of the eigenfunctions, and the fact that $\|f\|_2 = 1$:
$$E(f, f) = \langle (I - P) f, f \rangle_\pi = \sum_{j=2}^n \langle f, f_j \rangle_\pi^2 (1 - \lambda_j) \ge (1 - \lambda_2) \sum_{j=2}^n \langle f, f_j \rangle_\pi^2 = (1 - \lambda_2) \|f\|_2^2 = \gamma.$$
[3] On the other hand there is clearly equality for $f = f_2$. Lectures

Theorem. Suppose $P$ is irreducible, aperiodic, and reversible, and let the $\lambda_j$ be the eigenvalues ordered in nonincreasing order, so $1 = \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_n$. Then for $1 \le j \le n$,
$$1 - \lambda_j = \max_{\varphi_1, \dots, \varphi_{j-1}} \min \{ E(f, f) : \|f\|_2 = 1,\ f \perp \varphi_1, \dots, \varphi_{j-1} \}.$$
[1] Lectures

(b) We let $H_t(x, y)$ denote the transition probabilities of the Markov chain in continuous time, and introduce $d_2(t)$ defined by
$$d_2(t) = \sup_x \Big\| \frac{H_t(x, \cdot)}{\pi(\cdot)} - 1 \Big\|_2.$$
Then we have the following theorem.

Theorem. Assume that $P, \tilde P$ are two transitive Markov chains on a set $S$, and suppose that $\tilde E(f, f) \le A\, E(f, f)$ for all functions $f : S \to \mathbb{R}$. Then if $d_2(t), \tilde d_2(t)$ denote the respective $\ell^2$ distances to stationarity of their continuous-time versions,
$$d_2(t) \le \tilde d_2(t / A).$$
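The variational characterisation can be checked numerically: for every $f$ with $E_\pi(f) = 0$, the Rayleigh quotient $E(f,f)/\|f\|_2^2$ is at least $\gamma$, with equality at the second eigenfunction. The sketch below uses a small illustrative reversible chain (not from the text) with eigenvalues $1, 1/2, 0$.

```python
# Numerical illustration of the variational characterisation: for every f
# with E_pi(f) = 0, E(f,f) >= gamma * ||f||_2^2, with equality at the second
# eigenfunction f_2. The 3-state reversible chain is illustrative, not from
# the text; its eigenvalues are 1, 1/2, 0, so gamma = 1/2.

import random

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]
n = 3
gamma = 0.5

def dirichlet(f):    # E(f,f) = (1/2) sum_{x,y} pi(x) P(x,y) (f(y)-f(x))^2
    return 0.5 * sum(pi[x] * P[x][y] * (f[y] - f[x]) ** 2
                     for x in range(n) for y in range(n))

def norm2(f):        # ||f||_2^2 in l^2(pi)
    return sum(pi[x] * f[x] ** 2 for x in range(n))

random.seed(0)
for _ in range(200):
    g = [random.uniform(-1, 1) for _ in range(n)]
    m = sum(pi[x] * g[x] for x in range(n))
    f = [v - m for v in g]          # project out constants: E_pi(f) = 0
    if norm2(f) > 1e-12:
        assert dirichlet(f) >= gamma * norm2(f) - 1e-12

f2 = [2 ** 0.5, 0.0, -(2 ** 0.5)]   # eigenfunction for lambda_2 = 1/2
assert abs(norm2(f2) - 1.0) < 1e-12
assert abs(dirichlet(f2) - gamma) < 1e-12
```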

[2] Lectures

Proof: For a fixed $j \ge 1$, and for fixed functions $\varphi_1, \dots, \varphi_{j-1}$, if $W = \mathrm{Span}(\varphi_1, \dots, \varphi_{j-1})$,
$$\min\{\tilde E(f, f) : \|f\|_2 = 1,\ f \perp W\} \le A\, \min\{E(f, f) : \|f\|_2 = 1,\ f \perp W\}.$$
Taking the maximum over $\varphi_1, \dots, \varphi_{j-1}$, we get
$$1 - \tilde\lambda_j \le A (1 - \lambda_j),$$
so $\tilde\mu_j \le A\, \mu_j$, where $\mu_j = 1 - \lambda_j$ are the rates of the continuous-time chains. Since
$$d_2(t)^2 = \sum_{j=2}^n e^{-2 \mu_j t},$$
we get $d_2(t)^2 \le \sum_{j=2}^n e^{-2 \tilde\mu_j t / A} = \tilde d_2(t/A)^2$, and the result follows. [3] Lectures

[1] (c) Interchange process: Consider a fixed connected graph $G = (V, E)$ on $n$ vertices labelled $\{v_1, \dots, v_n\}$. One can define a random walk on the permutations of $V$ by imagining that there is a card on each vertex of $V$ and, at each step, we exchange two neighbouring cards at random, or do nothing with probability $1/n$ (where $n = |V|$). In other words the group is $G = S(V) \simeq S_n$ and the set $S$ of generators consists of the identity along with the transpositions $(v_i, v_j)$ for every pair of neighbouring vertices $v_i, v_j$. The kernel $p$ is defined by $p(\mathrm{id}) = 1/n$ and $p((v_i, v_j)) = (1 - 1/n)(1/|E|)$. Note that this set of generators is symmetric and the kernel is symmetric. Moreover, since $G$ is connected, every transposition $(v, w)$ can be built from neighbouring transpositions, and hence the set $S$ generates all of $S(V)$.

For each pair of vertices $x, y \in V$ fix $\gamma_{xy}$, a path on $G$ from $x$ to $y$, and set
$$\Delta = \max_{x, y} |\gamma_{xy}|,$$
where $|\gamma_{xy}|$ is the length of the path $\gamma_{xy}$, measured in the number of edges, and
$$K = \max_{e \in E} \#\{(x, y) \in V^2 : e \in \gamma_{xy}\}.$$

Theorem. The comparison $\tilde E \le A E$ holds with
$$A = \frac{8\, |E|\, K\, \Delta}{n(n-1)}.$$
[2] As a result, if $t = (A/2)\, n (\log n + c)$, then $d_2(t) \le \alpha e^{-c}$ for some universal constant $\alpha > 0$. Lectures

(d) Application. We take the canonical paths $\gamma_{xy}$ on the graph $G$ to be the ones where we go in two segments from $x$ to $y$: first changing only the first coordinate, and then changing only the second coordinate. Note that $|E| = O(n^2)$, $\Delta = O(n)$, and $K = O(n^3)$, since for a fixed edge $e$ (assume wlog only the second coordinate is changing on the edge $e$) there are at most $O(n)$ choices for $y$ and at most $O(n^2)$ choices for $x$. [4] So we get
$$A = O\Big( \frac{n^2 \cdot n \cdot n^3}{n^2 (n^2 - 1)} \Big) = O(n^2)$$
(recall that the base graph now has not $n$ elements but $n^2$).
Applying the Diaconis–Shahshahani result, this gives us a mixing time of at most of order $A\, n^2 \log(n^2)$ in $L^2$, and thus in total variation by Cauchy–Schwarz. This implies $t_{\mathrm{mix}} = O(n^4 \log n)$, as desired. Similar to lectures
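The counting behind $\Delta$, $K$ and $A$ in part (d) can be reproduced by brute force on small grids. The sketch below (illustrative code, not from the text) builds the two-segment canonical paths, computes the maximal edge congestion $K$, and checks that $A = 8|E| K \Delta / (N(N-1))$ with $N = n^2$ indeed grows like $n^2$; the constant 32 in the final check is a crude bound chosen only for the test.

```python
# Brute-force check of the path counting in part (d): two-segment canonical
# paths on the n x n grid (first coordinate first, then the second). Computes
# |E|, the maximal path length Delta, the maximal edge congestion K, and
# A = 8 |E| K Delta / (N (N - 1)) with N = n^2. The constant 32 in the final
# assertion is an illustrative bound expressing A = O(n^2).

def grid_stats(n):
    V = [(i, j) for i in range(n) for j in range(n)]

    def path(x, y):
        # canonical path: change the first coordinate, then the second
        pts = [x]
        while pts[-1][0] != y[0]:
            i, j = pts[-1]
            pts.append((i + (1 if y[0] > i else -1), j))
        while pts[-1][1] != y[1]:
            i, j = pts[-1]
            pts.append((i, j + (1 if y[1] > j else -1)))
        return pts

    congestion = {}
    Delta = 0
    for x in V:
        for y in V:
            if x == y:
                continue
            pts = path(x, y)
            Delta = max(Delta, len(pts) - 1)
            for e in zip(pts, pts[1:]):
                e = tuple(sorted(e))          # undirected edge
                congestion[e] = congestion.get(e, 0) + 1
    edge_count = 2 * n * (n - 1)              # nearest-neighbour edges
    assert len(congestion) == edge_count      # every edge lies on some path
    K = max(congestion.values())
    N = n * n
    A = 8 * edge_count * K * Delta / (N * (N - 1))
    return edge_count, Delta, K, A

for n in range(2, 7):
    ec, Delta, K, A = grid_stats(n)
    assert Delta == 2 * (n - 1)
    assert A <= 32 * n * n                    # A = O(n^2), as in the text
```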

Conversely, a single card performs a simple random walk on $G$, moving roughly every $n^2$ steps. Since it takes $t_{\mathrm{mix}}(G) \ge c n^2$ steps for simple random walk to mix on $G$, by a result of the course, we deduce that [2] the mixing time is at least $t_{\mathrm{mix}} \ge c n^4$, as desired. Unseen

[Total: 20]

Problem 3. (a) Let $(X_t, t = 0, 1, \dots)$ be an irreducible, aperiodic and reversible Markov chain on a finite state space $S$ with invariant distribution $\pi(y), y \in S$. Define the notion of mixing time $t_{\mathrm{mix}}(\alpha)$ at level $\alpha \in (0, 1)$. Give the definition of the absolute spectral gap $\gamma_*$ of the chain, as well as that of the relaxation time $t_{\mathrm{rel}}$, and give without proof the statement of a relation between relaxation time and mixing time at level $\varepsilon > 0$.

(b) Define the bottleneck (or isoperimetric) ratio $\Phi_*$ of an irreducible, reversible Markov chain on a finite state space $S$. State Cheeger's inequality, and prove that if $\gamma$ is the spectral gap, then $\gamma \le 2\Phi_*$.

(c) Let $S = \{1, \dots, n\}$ be the $n$-cycle and consider the Markov chain on $S$ which is the lazy simple random walk on $S$. Show that $\Phi_* = (1/n)(1 + o(1))$ as $n \to \infty$. Deduce that $\gamma \ge (1 + o(1))/(2n^2)$, and hence show that $t_{\mathrm{mix}}(1/4) = O(n^2 \log n)$.

(d) Compute all the eigenvalues of this Markov chain, and compare the estimate above with the actual value of the spectral gap.

(a) Definition. Let $P$ be an irreducible, aperiodic transition matrix on a finite state space $S$, and let $\pi(x)$ denote its stationary distribution, defined by the identity $\pi P = \pi$. Define the distance function, for all $t = 0, 1, \dots$, by:
$$d(t) = \max_{x \in S} \|P^t(x, \cdot) - \pi(\cdot)\|_{\mathrm{tv}}.$$
We also extend the definition of $d(t)$ to all $t \in [0, \infty)$ by setting $d(t) = d(\lceil t \rceil)$. Note that by the ergodic theorem, $d(t) \to 0$ as $t \to \infty$. Hence we can define, for $0 < \varepsilon < 1$:
$$t_{\mathrm{mix}}(\varepsilon) = \inf\{t \ge 0 : d(t) \le \varepsilon\}.$$
[2] Lectures

Definition. Suppose $P$ is irreducible and aperiodic. Let $\lambda_* = \max\{|\lambda| : \lambda \text{ an eigenvalue of } P,\ \lambda \ne 1\}$. Then $\gamma_* = 1 - \lambda_*$ is called the absolute spectral gap, and $\gamma = 1 - \lambda_2$ is called the spectral gap of $P$. The relaxation time $t_{\mathrm{rel}}$ is defined by $t_{\mathrm{rel}} = 1/\gamma_*$.

Theorem.
$$(t_{\mathrm{rel}} - 1) \log\Big(\frac{1}{2\varepsilon}\Big) \le t_{\mathrm{mix}}(\varepsilon) \le \log\Big(\frac{1}{\varepsilon\, \pi_{\min}}\Big)\, t_{\mathrm{rel}}.$$
[2] Lectures

(b) Recall our notation $Q(e) = \pi(x) P(x, y)$ for the equilibrium flow through the edge $e = (x, y)$. Let $Q(A, B) = \sum_{x \in A, y \in B} Q(x, y)$ and define the bottleneck ratio of a set $A$ to be
$$\Phi(A) = \frac{Q(A, A^c)}{\pi(A)}.$$
Essentially this is a measure of the size of the boundary relative to the total size of the set $A$.

Definition. The bottleneck ratio of the Markov chain is defined by
$$\Phi_* = \min_{A : \pi(A) \le 1/2} \Phi(A).$$
[3] We now state Cheeger's inequality:

Theorem. Suppose $P$ is reversible and let $\gamma = 1 - \lambda_2$ be the spectral gap. Then
$$\frac{\Phi_*^2}{2} \le \gamma \le 2 \Phi_*.$$
[2] Proof of upper bound:

We start with the proof that $\gamma \le 2\Phi_*$, which is easier. By the variational characterisation of the spectral gap,
$$\gamma = \min_{f \ne 0,\ E_\pi(f) = 0} \frac{E(f, f)}{\mathrm{var}_\pi(f)}.$$
Define $f$ to be the function which is constant on $A$ and $A^c$: $f(x) = -\pi(A^c)$ if $x \in A$, $f(x) = \pi(A)$ if $x \in A^c$. Then note that $E_\pi(f) = 0$. Moreover, since $|f(x) - f(y)| = \pi(A) + \pi(A^c) = 1$ exactly when $x, y$ lie on opposite sides of the partition,
$$E(f, f) = \frac12 \sum_{x, y} \pi(x) P(x, y) (f(x) - f(y))^2 = \frac12 \big[ Q(A, A^c) + Q(A^c, A) \big] = Q(A, A^c).$$
On the other hand,
$$\mathrm{var}_\pi(f) = E_\pi(f^2) = \sum_{x \in A} \pi(x) \pi(A^c)^2 + \sum_{x \in A^c} \pi(x) \pi(A)^2 = \pi(A) \pi(A^c)^2 + \pi(A^c) \pi(A)^2 = \pi(A) \pi(A^c) \ge \pi(A)/2.$$
[3] Consequently,
$$\gamma \le \frac{Q(A, A^c)}{\pi(A)/2} = 2\Phi(A).$$
Taking the minimum over all sets $A$ such that $\pi(A) \le 1/2$ gives $\gamma \le 2\Phi_*$, as desired.

(c) It is clear that the set $A$ which realises the infimum must be an arc, since given $A$, the hull of $A$ (the arc between its leftmost and rightmost points) or its complement has bigger mass and smaller boundary. Furthermore, we may assume that $1 \in A$ or $n \in A$, since translating $A$ to contain one or the other does not change $\Phi(A)$. We deduce that
$$\Phi_* = \min_{1 \le a \le n/2} \frac{1/2}{a} = \frac{1}{2 \lfloor n/2 \rfloor} = \frac{1}{n} (1 + o(1)). \quad \text{Lectures}$$
Hence $\gamma \ge \Phi_*^2 / 2 \ge (1/(2n^2))(1 + o(1))$. Since the walk is lazy, $\gamma_* = \gamma$ and $t_{\mathrm{rel}} = O(n^2)$. [4] Applying the above theorem, and since $\pi_{\min} = 1/n$, we deduce that $t_{\mathrm{mix}} = O(n^2 \log n)$. Unseen

(d) We view $S$ as a subset of the complex plane, $W_n = \{1, \omega, \omega^2, \dots, \omega^{n-1}\}$ with $\omega = e^{2i\pi/n}$. Let $P$ be the matrix of this walk. To be an eigenfunction $f$ with eigenvalue $\lambda$ for $P$ means that
$$\lambda f(\omega^k) = P f(\omega^k) = \frac12 \big( f(\omega^{k+1}) + f(\omega^{k-1}) \big)$$
for all $1 \le k \le n$. We claim that the functions $\varphi_j(z) = z^j$, $1 \le j \le n$, give us the $n$ eigenvalues. Indeed,
$$\varphi_j(\omega^{k+1}) + \varphi_j(\omega^{k-1}) = \omega^{jk} \big( \omega^j + \omega^{-j} \big) = 2 \varphi_j(\omega^k)\, \mathrm{Re}(\omega^j) = 2 \varphi_j(\omega^k) \cos\Big( \frac{2\pi j}{n} \Big).$$
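Cheeger's inequality and the value of $\Phi_*$ on the cycle can be verified by brute force for a small $n$. The sketch below (illustrative, with $n = 8$) scans all sets $A$ with $\pi(A) \le 1/2$, recovers $\Phi_* = 1/(2\lfloor n/2 \rfloor)$, and checks $\Phi_*^2/2 \le \gamma \le 2\Phi_*$, taking $\gamma$ from the known eigenvalues $(1 + \cos(2\pi j/n))/2$ of the lazy walk.

```python
# Brute-force check of Cheeger's inequality Phi^2/2 <= gamma <= 2*Phi for the
# lazy simple random walk on the n-cycle (illustrative n = 8, not from the
# text). Phi is minimised over all sets A with pi(A) <= 1/2; gamma comes from
# the known eigenvalues (1 + cos(2*pi*j/n))/2 of the lazy walk.

import math
from itertools import combinations

n = 8
pi_x = 1.0 / n                      # uniform stationary distribution

def P(x, y):
    if x == y:
        return 0.5
    if (x - y) % n in (1, n - 1):   # neighbours on the cycle
        return 0.25
    return 0.0

def phi(A):
    Aset = set(A)
    Q = sum(pi_x * P(x, y) for x in Aset for y in range(n) if y not in Aset)
    return Q / (pi_x * len(Aset))

Phi = min(phi(A)
          for r in range(1, n // 2 + 1)
          for A in combinations(range(n), r))
gamma = 1 - (1 + math.cos(2 * math.pi / n)) / 2

assert abs(Phi - 1.0 / (2 * (n // 2))) < 1e-12   # minimiser: arc of length n/2
assert Phi ** 2 / 2 <= gamma <= 2 * Phi
```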

Thus $\varphi_j$ is an eigenfunction with eigenvalue $\cos(2\pi j / n)$. Since the $\varphi_j$ are linearly independent, we deduce that these are all the eigenfunctions. If $n$ is even, the chain is periodic and the absolute spectral gap is $0$. If $n$ is odd, the chain is aperiodic and the absolute spectral gap is equal to
$$1 + \cos\Big( \frac{2\pi}{n} \Big\lceil \frac{n}{2} \Big\rceil \Big) = 1 - \cos\Big( \frac{\pi}{n} \Big) \sim \frac{\pi^2}{2 n^2}$$
[4] as $n \to \infty$. Thus
$$t_{\mathrm{rel}} \sim \frac{2 n^2}{\pi^2}.$$
So the estimate was off by a factor $1/\pi^2$.

[Total: 20]
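The eigenfunction computation in (d) is easy to confirm numerically: each $\varphi_j(z) = z^j$ satisfies the eigenvalue equation with eigenvalue $\cos(2\pi j/n)$, and for odd $n$ the absolute spectral gap is $1 - \cos(\pi/n)$. A short check (illustrative, with $n = 9$):

```python
# Check of part (d): phi_j(omega^k) = omega^{jk} is an eigenfunction of the
# simple random walk on the n-cycle with eigenvalue cos(2*pi*j/n); for odd n
# the absolute spectral gap is 1 - cos(pi/n). Illustrative n = 9.

import cmath
import math

n = 9                                   # odd, hence the walk is aperiodic
omega = cmath.exp(2j * cmath.pi / n)

for j in range(1, n + 1):
    lam = math.cos(2 * math.pi * j / n)
    phi = [omega ** (j * k) for k in range(n)]
    for k in range(n):
        lhs = 0.5 * (phi[(k + 1) % n] + phi[(k - 1) % n])   # (P phi)(omega^k)
        assert abs(lhs - lam * phi[k]) < 1e-9

eigs = [math.cos(2 * math.pi * j / n) for j in range(1, n)]  # lambda != 1
lam_star = max(abs(l) for l in eigs)
assert abs((1 - lam_star) - (1 - math.cos(math.pi / n))) < 1e-12
```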

Problem 4. (a) Let $(X_t, t = 0, 1, \dots)$ be an irreducible, aperiodic and reversible Markov chain on a finite state space $S$ with invariant distribution $\pi(y), y \in S$. Show that if $P^t(x, y)$ denotes the $t$-step transition probabilities of the chain,
$$\frac{P^t(x, y)}{\pi(y)} = \sum_{j=1}^n \lambda_j^t\, f_j(x) f_j(y),$$
where the $\lambda_j$ are the eigenvalues and the $f_j$ are functions which you should specify. [You can assume without proof that there exists an orthonormal basis of eigenfunctions for the inner product associated with the $\ell^2(\pi)$ norm $\|f\|_2 = (\sum_x f(x)^2 \pi(x))^{1/2}$.]

(b) Define the relaxation time $t_{\mathrm{rel}}$, and the $\ell^2$ distance $d_2(t)$ to equilibrium. Show that $d(t) \le (1/2)\, d_2(t)$, where $d(t)$ is the total variation distance to equilibrium, and show that
$$t_{\mathrm{mix}}(\varepsilon) \le \log\Big( \frac{1}{\varepsilon\, \pi_{\min}} \Big)\, t_{\mathrm{rel}},$$
where $\pi_{\min} = \min\{\pi(x) : x \in S\}$, and $t_{\mathrm{mix}}(\varepsilon)$ is the mixing time at level $\varepsilon$.

(c) Show that $P^{2t}(x, x)$ is a decreasing sequence (as a function of $t = 0, 1, \dots$). Show however with an example that $P^t(x, x)$ is not in general monotone, as a function of $t$.

(d) Show that $d_2$ is a contraction: for all $t, s \ge 0$,
$$d_2(t + s) \le d_2(s)\, e^{-t / t_{\mathrm{rel}}}.$$
[You can use freely without proof the inequality $1 - x \le e^{-x}$, valid for all $x \in \mathbb{R}$.]

(a) Let $\delta_y$ be the function $s \in S \mapsto \delta_y(s)$ equal to $1$ if $s = y$ and $0$ otherwise. By the spectral theorem (for symmetric matrices), we can expand this function on the orthonormal basis of eigenfunctions $f_j$ with respect to the inner product $\langle f, g \rangle_\pi = \sum_x f(x) g(x) \pi(x)$:
$$\delta_y = \sum_{j=1}^n \langle \delta_y, f_j \rangle_\pi f_j = \sum_{j=1}^n f_j(y) \pi(y) f_j.$$
Hence, since $P^t(x, y)$ is nothing else but $(P^t \delta_y)(x)$, and $\lambda_j^t$ is an eigenvalue of $P^t$, we get:
$$P^t(x, y) = \sum_{j=1}^n f_j(y) \pi(y) \lambda_j^t f_j(x),$$
[3] as required. Lectures

(b) Definition. Suppose $P$ is irreducible and aperiodic. Let $\lambda_* = \max\{|\lambda| : \lambda \text{ an eigenvalue of } P,\ \lambda \ne 1\}$. Then $\gamma_* = 1 - \lambda_*$ is called the absolute spectral gap, and $\gamma = 1 - \lambda_2$ is called the spectral gap of $P$. The relaxation time $t_{\mathrm{rel}}$ is defined by $t_{\mathrm{rel}} = 1/\gamma_*$. [1]

Definition. Let
$$d_2(t) = \sup_{x \in S} \Big\| \frac{P^t(x, \cdot)}{\pi(\cdot)} - 1 \Big\|_2.$$
We call $d_2(t)$ the $\ell^2$ distance to stationarity. [1]

Lemma. Assume that $P$ is irreducible and aperiodic (but not necessarily reversible). Then we have $d(t) \le (1/2)\, d_2(t)$.

Proof: Recall that one of the basic identities for the definition of the total variation distance is
$$\|P^t(x, \cdot) - \pi\|_{\mathrm{tv}} = \frac12 \sum_y |P^t(x, y) - \pi(y)| = \frac12 \sum_y \pi(y) \Big| \frac{P^t(x, y)}{\pi(y)} - 1 \Big| = \frac12 \Big\| \frac{P^t(x, \cdot)}{\pi(\cdot)} - 1 \Big\|_1,$$
where $\|\cdot\|_1$ refers to the $\ell^1(\pi)$ norm. Taking the square and using Jensen's inequality (the $\ell^1(\pi)$ norm is bounded by the $\ell^2(\pi)$ norm), we get
$$4\, \|P^t(x, \cdot) - \pi\|_{\mathrm{tv}}^2 \le \Big\| \frac{P^t(x, \cdot)}{\pi(\cdot)} - 1 \Big\|_2^2 \le d_2(t)^2.$$
Taking the maximum over $x$ gives the result. [3]

Theorem.
$$t_{\mathrm{mix}}(\varepsilon) \le \log\Big( \frac{1}{\varepsilon\, \pi_{\min}} \Big)\, t_{\mathrm{rel}}.$$

Proof: Expanding the function on the eigenfunction basis $f_j$, we get
$$\Big\| \frac{P^t(x, \cdot)}{\pi(\cdot)} - 1 \Big\|_2^2 = \Big\| \sum_{j=2}^n f_j(x) f_j(\cdot) \lambda_j^t \Big\|_2^2 = \sum_{j=2}^n \lambda_j^{2t} f_j(x)^2 \le \lambda_*^{2t} \sum_j f_j(x)^2.$$
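The expansion in (a) can be verified on a small example. In the sketch below the 3-state reversible chain, its eigenvalues, and its $\ell^2(\pi)$-orthonormal eigenfunctions are written down by hand (illustrative, not from the text); the identity $\sum_j f_j(x)^2 = 1/\pi(x)$ used in the proof of the mixing bound is checked too.

```python
# Check of the expansion P^t(x,y) = sum_j f_j(y) pi(y) lambda_j^t f_j(x) on a
# small reversible chain. The chain, its eigenvalues and its l^2(pi)-
# orthonormal eigenfunctions are illustrative and written down by hand; the
# identity sum_j f_j(x)^2 = 1/pi(x) from the proof is checked as well.

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]
lams = [1.0, 0.5, 0.0]
r2 = 2 ** 0.5
fs = [[1.0, 1.0, 1.0],          # f_1 = 1
      [r2, 0.0, -r2],           # f_2, eigenvalue 1/2
      [1.0, -1.0, 1.0]]         # f_3, eigenvalue 0
n = 3

def inner(f, g):
    return sum(pi[x] * f[x] * g[x] for x in range(n))

for i in range(n):               # orthonormality in l^2(pi)
    for j in range(n):
        assert abs(inner(fs[i], fs[j]) - (1.0 if i == j else 0.0)) < 1e-12

def matpow(A, t):
    R = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(t):
        R = [[sum(R[i][k] * A[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return R

for t in range(6):
    Pt = matpow(P, t)
    for x in range(n):
        for y in range(n):
            spectral = sum(fs[j][y] * pi[y] * lams[j] ** t * fs[j][x]
                           for j in range(n))
            assert abs(Pt[x][y] - spectral) < 1e-12

for x in range(n):               # sum_j f_j(x)^2 = 1/pi(x)
    assert abs(sum(f[x] ** 2 for f in fs) - 1.0 / pi[x]) < 1e-12
```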

Now, we claim that $\sum_{j=1}^n f_j(x)^2 = \pi(x)^{-1}$. Indeed, by the decomposition of $\delta_x$:
$$\pi(x) = \langle \delta_x, \delta_x \rangle_\pi = \sum_{j=1}^n f_j(x)^2\, \pi(x)^2.$$
Hence
$$4\, \|P^t(x, \cdot) - \pi\|_{\mathrm{tv}}^2 \le \lambda_*^{2t}\, \pi(x)^{-1} \le \lambda_*^{2t}\, \pi_{\min}^{-1} = (1 - \gamma_*)^{2t}\, \pi_{\min}^{-1} \le e^{-2 \gamma_* t}\, \pi_{\min}^{-1}.$$
Maximising over $x$ and taking the square root, we get
$$d(t) \le \frac12\, e^{-\gamma_* t}\, \pi_{\min}^{-1/2}.$$
Solving for the right-hand side equal to $\varepsilon$ gives us $d(t) \le \varepsilon$ as soon as
$$t \ge \frac{1}{\gamma_*} \log\Big( \frac{1}{2 \varepsilon \sqrt{\pi_{\min}}} \Big),$$
and since $2\sqrt{\pi_{\min}} \ge \pi_{\min}$ this is at most $\log(1/(\varepsilon\, \pi_{\min}))\, t_{\mathrm{rel}}$. [4] Lectures

(c) When we specialise the eigenfunction expansion to $y = x$, we get
$$P^{2t}(x, x) = \pi(x) \sum_j \lambda_j^{2t} f_j(x)^2.$$
[2] All terms are nonnegative and $|\lambda_j| \le 1$, so the sum is decreasing (nonincreasing). Unseen

Counterexample: take the $n$-cycle. Then $P^0(x, x) = 1$, $P^1(x, x) = 0$, but [2] $P^2(x, x) > 0$. (Basically this walk is locally periodic.) Unseen

(d) Observe that an exact expression (from the above proof) is
$$d_2(t)^2 = \sup_x \sum_{j=2}^n \lambda_j^{2t} f_j(x)^2,$$
so
$$d_2(t+s)^2 = \sup_x \sum_{j=2}^n \lambda_j^{2(t+s)} f_j(x)^2 \le \lambda_*^{2t}\, \sup_x \sum_{j=2}^n \lambda_j^{2s} f_j(x)^2 = (1 - \gamma_*)^{2t}\, d_2(s)^2 \le e^{-2 t \gamma_*}\, d_2(s)^2.$$
Taking the square root, and by definition of $t_{\mathrm{rel}}$,
$$d_2(t + s) \le e^{-t / t_{\mathrm{rel}}}\, d_2(s),$$
[4] as desired. Unseen

[Total: 20]
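Parts (c) and (d) can likewise be checked numerically. The sketch below (same illustrative 3-state chain as above, with $\lambda_* = 1/2$ and hence $t_{\mathrm{rel}} = 2$; the 3-cycle serves as the non-monotone counterexample) verifies that $P^{2t}(x,x)$ is nonincreasing, that $P^t(x,x)$ is not monotone, and the contraction $d_2(t+s) \le e^{-t/t_{\mathrm{rel}}}\, d_2(s)$.

```python
# Checks for parts (c) and (d): P^{2t}(x,x) is nonincreasing for a reversible
# chain; P^t(x,x) itself need not be monotone (simple random walk on the
# 3-cycle); and d_2(t+s) <= exp(-t/t_rel) d_2(s). The 3-state chain is an
# illustrative example (lambda_* = 1/2, so t_rel = 2), not from the text.

import math

P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]
n = 3
t_rel = 2.0

def matpow(A, t):
    R = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(t):
        R = [[sum(R[i][k] * A[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return R

# (c) P^{2t}(x,x) is nonincreasing in t:
vals = [matpow(P, 2 * t)[0][0] for t in range(8)]
assert all(a >= b - 1e-12 for a, b in zip(vals, vals[1:]))

# ... but P^t(x,x) is not monotone: on the 3-cycle, P^1(x,x) = 0 < P^2(x,x).
C = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
assert matpow(C, 1)[0][0] == 0.0 and matpow(C, 2)[0][0] > 0.0

# (d) contraction of the l^2 distance:
def d2(t):
    Pt = matpow(P, t)
    return max(sum(pi[y] * (Pt[x][y] / pi[y] - 1.0) ** 2
                   for y in range(n)) ** 0.5
               for x in range(n))

for s in range(5):
    for t in range(5):
        assert d2(t + s) <= math.exp(-t / t_rel) * d2(s) + 1e-12
```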


More information

APPENDIX A. Background Mathematics. A.1 Linear Algebra. Vector algebra. Let x denote the n-dimensional column vector with components x 1 x 2.

APPENDIX A. Background Mathematics. A.1 Linear Algebra. Vector algebra. Let x denote the n-dimensional column vector with components x 1 x 2. APPENDIX A Background Mathematics A. Linear Algebra A.. Vector algebra Let x denote the n-dimensional column vector with components 0 x x 2 B C @. A x n Definition 6 (scalar product). The scalar product

More information

Combinatorics in Banach space theory Lecture 12

Combinatorics in Banach space theory Lecture 12 Combinatorics in Banach space theory Lecture The next lemma considerably strengthens the assertion of Lemma.6(b). Lemma.9. For every Banach space X and any n N, either all the numbers n b n (X), c n (X)

More information

Markov Processes on Discrete State Spaces

Markov Processes on Discrete State Spaces Markov Processes on Discrete State Spaces Theoretical Background and Applications. Christof Schuette 1 & Wilhelm Huisinga 2 1 Fachbereich Mathematik und Informatik Freie Universität Berlin & DFG Research

More information

Detailed Proofs of Lemmas, Theorems, and Corollaries

Detailed Proofs of Lemmas, Theorems, and Corollaries Dahua Lin CSAIL, MIT John Fisher CSAIL, MIT A List of Lemmas, Theorems, and Corollaries For being self-contained, we list here all the lemmas, theorems, and corollaries in the main paper. Lemma. The joint

More information

Geometric ρ-mixing property of the interarrival times of a stationary Markovian Arrival Process

Geometric ρ-mixing property of the interarrival times of a stationary Markovian Arrival Process Author manuscript, published in "Journal of Applied Probability 50, 2 (2013) 598-601" Geometric ρ-mixing property of the interarrival times of a stationary Markovian Arrival Process L. Hervé and J. Ledoux

More information

Partial Differential Equations and Random Walks

Partial Differential Equations and Random Walks Partial Differential Equations and Random Walks with Emphasis on the Heat Equation Kevin Hu January 7, 2014 Kevin Hu PDE and Random Walks January 7, 2014 1 / 28 Agenda 1 Introduction Overview 2 Random

More information

C.7. Numerical series. Pag. 147 Proof of the converging criteria for series. Theorem 5.29 (Comparison test) Let a k and b k be positive-term series

C.7. Numerical series. Pag. 147 Proof of the converging criteria for series. Theorem 5.29 (Comparison test) Let a k and b k be positive-term series C.7 Numerical series Pag. 147 Proof of the converging criteria for series Theorem 5.29 (Comparison test) Let and be positive-term series such that 0, for any k 0. i) If the series converges, then also

More information

Optimization and Optimal Control in Banach Spaces

Optimization and Optimal Control in Banach Spaces Optimization and Optimal Control in Banach Spaces Bernhard Schmitzer October 19, 2017 1 Convex non-smooth optimization with proximal operators Remark 1.1 (Motivation). Convex optimization: easier to solve,

More information

A generalization of Dobrushin coefficient

A generalization of Dobrushin coefficient A generalization of Dobrushin coefficient Ü µ ŒÆ.êÆ ÆÆ 202.5 . Introduction and main results We generalize the well-known Dobrushin coefficient δ in total variation to weighted total variation δ V, which

More information

Markov Chains, Random Walks on Graphs, and the Laplacian

Markov Chains, Random Walks on Graphs, and the Laplacian Markov Chains, Random Walks on Graphs, and the Laplacian CMPSCI 791BB: Advanced ML Sridhar Mahadevan Random Walks! There is significant interest in the problem of random walks! Markov chain analysis! Computer

More information

(1) Consider the space S consisting of all continuous real-valued functions on the closed interval [0, 1]. For f, g S, define

(1) Consider the space S consisting of all continuous real-valued functions on the closed interval [0, 1]. For f, g S, define Homework, Real Analysis I, Fall, 2010. (1) Consider the space S consisting of all continuous real-valued functions on the closed interval [0, 1]. For f, g S, define ρ(f, g) = 1 0 f(x) g(x) dx. Show that

More information

Functional Analysis I

Functional Analysis I Functional Analysis I Course Notes by Stefan Richter Transcribed and Annotated by Gregory Zitelli Polar Decomposition Definition. An operator W B(H) is called a partial isometry if W x = X for all x (ker

More information

CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS. Contents

CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS. Contents CONVERGENCE THEOREM FOR FINITE MARKOV CHAINS ARI FREEDMAN Abstract. In this expository paper, I will give an overview of the necessary conditions for convergence in Markov chains on finite state spaces.

More information

Convergence Rate of Markov Chains

Convergence Rate of Markov Chains Convergence Rate of Markov Chains Will Perkins April 16, 2013 Convergence Last class we saw that if X n is an irreducible, aperiodic, positive recurrent Markov chain, then there exists a stationary distribution

More information

RECURRENCE IN COUNTABLE STATE MARKOV CHAINS

RECURRENCE IN COUNTABLE STATE MARKOV CHAINS RECURRENCE IN COUNTABLE STATE MARKOV CHAINS JIN WOO SUNG Abstract. This paper investigates the recurrence and transience of countable state irreducible Markov chains. Recurrence is the property that a

More information

Spectral Geometry of Riemann Surfaces

Spectral Geometry of Riemann Surfaces Spectral Geometry of Riemann Surfaces These are rough notes on Spectral Geometry and their application to hyperbolic riemann surfaces. They are based on Buser s text Geometry and Spectra of Compact Riemann

More information

Spectral properties of Markov operators in Markov chain Monte Carlo

Spectral properties of Markov operators in Markov chain Monte Carlo Spectral properties of Markov operators in Markov chain Monte Carlo Qian Qin Advisor: James P. Hobert October 2017 1 Introduction Markov chain Monte Carlo (MCMC) is an indispensable tool in Bayesian statistics.

More information

18.175: Lecture 30 Markov chains

18.175: Lecture 30 Markov chains 18.175: Lecture 30 Markov chains Scott Sheffield MIT Outline Review what you know about finite state Markov chains Finite state ergodicity and stationarity More general setup Outline Review what you know

More information

Math 321 Final Examination April 1995 Notation used in this exam: N. (1) S N (f,x) = f(t)e int dt e inx.

Math 321 Final Examination April 1995 Notation used in this exam: N. (1) S N (f,x) = f(t)e int dt e inx. Math 321 Final Examination April 1995 Notation used in this exam: N 1 π (1) S N (f,x) = f(t)e int dt e inx. 2π n= N π (2) C(X, R) is the space of bounded real-valued functions on the metric space X, equipped

More information

18.175: Lecture 3 Integration

18.175: Lecture 3 Integration 18.175: Lecture 3 Scott Sheffield MIT Outline Outline Recall definitions Probability space is triple (Ω, F, P) where Ω is sample space, F is set of events (the σ-algebra) and P : F [0, 1] is the probability

More information

MATH 5640: Fourier Series

MATH 5640: Fourier Series MATH 564: Fourier Series Hung Phan, UMass Lowell September, 8 Power Series A power series in the variable x is a series of the form a + a x + a x + = where the coefficients a, a,... are real or complex

More information

Convergence of Random Walks

Convergence of Random Walks Graphs and Networks Lecture 9 Convergence of Random Walks Daniel A. Spielman September 30, 010 9.1 Overview We begin by reviewing the basics of spectral theory. We then apply this theory to show that lazy

More information

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1)

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1) 1.4. CONSTRUCTION OF LEBESGUE-STIELTJES MEASURES In this section we shall put to use the Carathéodory-Hahn theory, in order to construct measures with certain desirable properties first on the real line

More information

Mixing times via super-fast coupling

Mixing times via super-fast coupling s via super-fast coupling Yevgeniy Kovchegov Department of Mathematics Oregon State University Outline 1 About 2 Definition Coupling Result. About Outline 1 About 2 Definition Coupling Result. About Coauthor

More information

Take-Home Final Examination of Math 55a (January 17 to January 23, 2004)

Take-Home Final Examination of Math 55a (January 17 to January 23, 2004) Take-Home Final Eamination of Math 55a January 17 to January 3, 004) N.B. For problems which are similar to those on the homework assignments, complete self-contained solutions are required and homework

More information

Math 61CM - Solutions to homework 6

Math 61CM - Solutions to homework 6 Math 61CM - Solutions to homework 6 Cédric De Groote November 5 th, 2018 Problem 1: (i) Give an example of a metric space X such that not all Cauchy sequences in X are convergent. (ii) Let X be a metric

More information

Lecture 5: Random Walks and Markov Chain

Lecture 5: Random Walks and Markov Chain Spectral Graph Theory and Applications WS 20/202 Lecture 5: Random Walks and Markov Chain Lecturer: Thomas Sauerwald & He Sun Introduction to Markov Chains Definition 5.. A sequence of random variables

More information

Lecture 1 Measure concentration

Lecture 1 Measure concentration CSE 29: Learning Theory Fall 2006 Lecture Measure concentration Lecturer: Sanjoy Dasgupta Scribe: Nakul Verma, Aaron Arvey, and Paul Ruvolo. Concentration of measure: examples We start with some examples

More information

Premiliminary Examination, Part I & II

Premiliminary Examination, Part I & II Premiliminary Examination, Part I & II Monday, August 9, 16 Dept. of Mathematics University of Pennsylvania Problem 1. Let V be the real vector space of continuous real-valued functions on the closed interval

More information

Economics 204 Summer/Fall 2011 Lecture 5 Friday July 29, 2011

Economics 204 Summer/Fall 2011 Lecture 5 Friday July 29, 2011 Economics 204 Summer/Fall 2011 Lecture 5 Friday July 29, 2011 Section 2.6 (cont.) Properties of Real Functions Here we first study properties of functions from R to R, making use of the additional structure

More information

NORMS ON SPACE OF MATRICES

NORMS ON SPACE OF MATRICES NORMS ON SPACE OF MATRICES. Operator Norms on Space of linear maps Let A be an n n real matrix and x 0 be a vector in R n. We would like to use the Picard iteration method to solve for the following system

More information

Vertex and edge expansion properties for rapid mixing

Vertex and edge expansion properties for rapid mixing Vertex and edge expansion properties for rapid mixing Ravi Montenegro Abstract We show a strict hierarchy among various edge and vertex expansion properties of Markov chains. This gives easy proofs of

More information

Edge Flip Chain for Unbiased Dyadic Tilings

Edge Flip Chain for Unbiased Dyadic Tilings Sarah Cannon, Georgia Tech,, David A. Levin, U. Oregon Alexandre Stauffer, U. Bath, March 30, 2018 Dyadic Tilings A dyadic tiling is a tiling on [0,1] 2 by n = 2 k dyadic rectangles, rectangles which are

More information

ABSTRACT CONDITIONAL EXPECTATION IN L 2

ABSTRACT CONDITIONAL EXPECTATION IN L 2 ABSTRACT CONDITIONAL EXPECTATION IN L 2 Abstract. We prove that conditional expecations exist in the L 2 case. The L 2 treatment also gives us a geometric interpretation for conditional expectation. 1.

More information

On Markov chains for independent sets

On Markov chains for independent sets On Markov chains for independent sets Martin Dyer and Catherine Greenhill School of Computer Studies University of Leeds Leeds LS2 9JT United Kingdom Submitted: 8 December 1997. Revised: 16 February 1999

More information

SPECTRAL PROPERTIES OF THE LAPLACIAN ON BOUNDED DOMAINS

SPECTRAL PROPERTIES OF THE LAPLACIAN ON BOUNDED DOMAINS SPECTRAL PROPERTIES OF THE LAPLACIAN ON BOUNDED DOMAINS TSOGTGEREL GANTUMUR Abstract. After establishing discrete spectra for a large class of elliptic operators, we present some fundamental spectral properties

More information

Math 361: Homework 1 Solutions

Math 361: Homework 1 Solutions January 3, 4 Math 36: Homework Solutions. We say that two norms and on a vector space V are equivalent or comparable if the topology they define on V are the same, i.e., for any sequence of vectors {x

More information

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989),

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989), Real Analysis 2, Math 651, Spring 2005 April 26, 2005 1 Real Analysis 2, Math 651, Spring 2005 Krzysztof Chris Ciesielski 1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer

More information

Spectral Analysis of Random Walks

Spectral Analysis of Random Walks Graphs and Networks Lecture 9 Spectral Analysis of Random Walks Daniel A. Spielman September 26, 2013 9.1 Disclaimer These notes are not necessarily an accurate representation of what happened in class.

More information

INTRODUCTION TO MCMC AND PAGERANK. Eric Vigoda Georgia Tech. Lecture for CS 6505

INTRODUCTION TO MCMC AND PAGERANK. Eric Vigoda Georgia Tech. Lecture for CS 6505 INTRODUCTION TO MCMC AND PAGERANK Eric Vigoda Georgia Tech Lecture for CS 6505 1 MARKOV CHAIN BASICS 2 ERGODICITY 3 WHAT IS THE STATIONARY DISTRIBUTION? 4 PAGERANK 5 MIXING TIME 6 PREVIEW OF FURTHER TOPICS

More information

Definition A finite Markov chain is a memoryless homogeneous discrete stochastic process with a finite number of states.

Definition A finite Markov chain is a memoryless homogeneous discrete stochastic process with a finite number of states. Chapter 8 Finite Markov Chains A discrete system is characterized by a set V of states and transitions between the states. V is referred to as the state space. We think of the transitions as occurring

More information

Numerical Methods for Differential Equations Mathematical and Computational Tools

Numerical Methods for Differential Equations Mathematical and Computational Tools Numerical Methods for Differential Equations Mathematical and Computational Tools Gustaf Söderlind Numerical Analysis, Lund University Contents V4.16 Part 1. Vector norms, matrix norms and logarithmic

More information

MATHS 730 FC Lecture Notes March 5, Introduction

MATHS 730 FC Lecture Notes March 5, Introduction 1 INTRODUCTION MATHS 730 FC Lecture Notes March 5, 2014 1 Introduction Definition. If A, B are sets and there exists a bijection A B, they have the same cardinality, which we write as A, #A. If there exists

More information

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias Notes on Measure, Probability and Stochastic Processes João Lopes Dias Departamento de Matemática, ISEG, Universidade de Lisboa, Rua do Quelhas 6, 1200-781 Lisboa, Portugal E-mail address: jldias@iseg.ulisboa.pt

More information

1.4 Outer measures 10 CHAPTER 1. MEASURE

1.4 Outer measures 10 CHAPTER 1. MEASURE 10 CHAPTER 1. MEASURE 1.3.6. ( Almost everywhere and null sets If (X, A, µ is a measure space, then a set in A is called a null set (or µ-null if its measure is 0. Clearly a countable union of null sets

More information

INTRODUCTION TO MCMC AND PAGERANK. Eric Vigoda Georgia Tech. Lecture for CS 6505

INTRODUCTION TO MCMC AND PAGERANK. Eric Vigoda Georgia Tech. Lecture for CS 6505 INTRODUCTION TO MCMC AND PAGERANK Eric Vigoda Georgia Tech Lecture for CS 6505 1 MARKOV CHAIN BASICS 2 ERGODICITY 3 WHAT IS THE STATIONARY DISTRIBUTION? 4 PAGERANK 5 MIXING TIME 6 PREVIEW OF FURTHER TOPICS

More information

MET Workshop: Exercises

MET Workshop: Exercises MET Workshop: Exercises Alex Blumenthal and Anthony Quas May 7, 206 Notation. R d is endowed with the standard inner product (, ) and Euclidean norm. M d d (R) denotes the space of n n real matrices. When

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 2

MATH 56A: STOCHASTIC PROCESSES CHAPTER 2 MATH 56A: STOCHASTIC PROCESSES CHAPTER 2 2. Countable Markov Chains I started Chapter 2 which talks about Markov chains with a countably infinite number of states. I did my favorite example which is on

More information

Embeddings of finite metric spaces in Euclidean space: a probabilistic view

Embeddings of finite metric spaces in Euclidean space: a probabilistic view Embeddings of finite metric spaces in Euclidean space: a probabilistic view Yuval Peres May 11, 2006 Talk based on work joint with: Assaf Naor, Oded Schramm and Scott Sheffield Definition: An invertible

More information

Analysis Comprehensive Exam Questions Fall 2008

Analysis Comprehensive Exam Questions Fall 2008 Analysis Comprehensive xam Questions Fall 28. (a) Let R be measurable with finite Lebesgue measure. Suppose that {f n } n N is a bounded sequence in L 2 () and there exists a function f such that f n (x)

More information

Homework set 3 - Solutions

Homework set 3 - Solutions Homework set 3 - Solutions Math 495 Renato Feres Problems 1. (Text, Exercise 1.13, page 38.) Consider the Markov chain described in Exercise 1.1: The Smiths receive the paper every morning and place it

More information

18.600: Lecture 32 Markov Chains

18.600: Lecture 32 Markov Chains 18.600: Lecture 32 Markov Chains Scott Sheffield MIT Outline Markov chains Examples Ergodicity and stationarity Outline Markov chains Examples Ergodicity and stationarity Markov chains Consider a sequence

More information

The parabolic Anderson model on Z d with time-dependent potential: Frank s works

The parabolic Anderson model on Z d with time-dependent potential: Frank s works Weierstrass Institute for Applied Analysis and Stochastics The parabolic Anderson model on Z d with time-dependent potential: Frank s works Based on Frank s works 2006 2016 jointly with Dirk Erhard (Warwick),

More information

1 Independent increments

1 Independent increments Tel Aviv University, 2008 Brownian motion 1 1 Independent increments 1a Three convolution semigroups........... 1 1b Independent increments.............. 2 1c Continuous time................... 3 1d Bad

More information