A mathematical framework for Exact Milestoning
1 A mathematical framework for Exact Milestoning
David Aristoff (joint work with Juan M. Bello-Rivas and Ron Elber)
Colorado State University
July 2015
2 Introduction
Milestoning is a technique for estimating mean first passage times (MFPTs). Exact Milestoning is a variant that yields exact times in a certain limit. Both are practical algorithms, appropriate for systems with rough energy landscapes.
Important application: in silico drug design, e.g. estimating the characteristic time for a drug to dissociate from a protein target.
3 Introduction
Efficiency is based on the use of short trajectories simulated in parallel. These trajectories start on a milestone and end at a neighboring milestone. The milestones are usually codimension-1 hypersurfaces.
Examples:
- 1D milestoning: milestones = level sets of a scalar reaction coordinate
- Network milestoning: milestones = faces of Voronoi cell boundaries
4 Introduction
The problem. Let (X_t) be a stochastic dynamics and R, P disjoint subsets of state space. We want to compute the mean first passage time of (X_t) from R to P.
Source and sink. When (X_t) reaches P, it immediately restarts at R. This assumption does not affect the MFPT, but it is useful for the theory.
Examples to keep in mind:
- state space is a torus and (X_t) is a diffusion;
- state space is discrete and (X_t) is a continuous-time Markov process.
Below we think of R as a point, though it can be a distribution too.
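As a minimal illustration of the problem (a hypothetical 5-state nearest-neighbor walk, discrete-time for simplicity, not from the talk), the MFPT from R to P can be computed by restricting the transition matrix to the complement of P and solving a linear system. The restart row P → R drops out of that restriction, which is why the source-sink convention does not affect the MFPT.

```python
# MFPT from R = {0} to P = {4} for a discrete-time chain, by solving
# (I - Q) t = 1 on the non-absorbing states, where Q is the transition
# matrix restricted outside P.
import numpy as np

def mfpt_to_P(T, P_states, start):
    """MFPT to the set P_states from `start`, for transition matrix T."""
    n = T.shape[0]
    keep = [i for i in range(n) if i not in P_states]
    Q = T[np.ix_(keep, keep)]            # dynamics restricted outside P
    t = np.linalg.solve(np.eye(len(keep)) - Q, np.ones(len(keep)))
    return t[keep.index(start)]

# Nearest-neighbor walk on {0,...,4}, reflecting at 0.
T = np.zeros((5, 5))
T[0, 1] = 1.0
for i in range(1, 4):
    T[i, i - 1] = T[i, i + 1] = 0.5
T[4, 0] = 1.0                            # source-sink: restart at R
print(mfpt_to_P(T, {4}, 0))              # the classical n^2 law gives 16 here
```

Changing the restart row `T[4, :]` to anything else leaves the answer unchanged, since row 4 never enters `Q`.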
5 Setup
Let (X_t) take values in a standard Borel space.
Our goal. To efficiently compute E_R[τ_P], where τ_P is the first time for (X_t) to hit P.
[Figure: a trajectory from R reaching P at time τ_P.]
6 Setup
We coarse-grain (X_t) using closed sets called milestones.
Milestones. The milestones are closed sets with pairwise disjoint interiors.
[Figure: trajectory from R to P crossing the milestones.]
7 Setup
We coarse-grain (X_t) using closed sets called milestones.
Milestones. P and R are two of the milestones; M is the union of all the milestones.
[Figure: trajectory from R to P crossing the milestones.]
8 Setup
Assume (X_t) is strong Markov with càdlàg paths.
Jump chain. Keep the first hit points, J_n, on the milestones. (J_n) is a Markov chain on M.
[Figure: successive hit points J_0, ..., J_5, with R = J_0 = J_6 after the restart.]
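Extracting the jump chain from a discrete trajectory takes only a few lines; the helper and the toy path below are hypothetical, for illustration only. Each entry of J is the first hit on a milestone different from the one currently occupied.

```python
# Extract the jump chain J_n from a discrete path by keeping each first hit
# on a milestone different from the current one.
def jump_chain(path, milestone_of):
    """path: iterable of states; milestone_of: state -> milestone id or None."""
    J, times = [], []
    current = None
    for t, x in enumerate(path):
        m = milestone_of(x)
        if m is not None and m != current:
            J.append((m, x))      # record milestone id and hit point
            times.append(t)
            current = m
    return J, times

# Toy path on the integers with milestones at even integers: {0}, {2}, {4}.
path = [0, 1, 0, 1, 2, 3, 2, 3, 4]
J, times = jump_chain(path, lambda x: x if x % 2 == 0 else None)
print([m for m, _ in J])   # → [0, 2, 4]
print(times)               # → [0, 4, 8]
```

The differences of `times` are the sojourn times τ_n of the next slide.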
9 Setup
Assume (X_t) is strong Markov with càdlàg paths.
Sojourn times. Keep the times, τ_n, between the first hit points. The law of τ_n depends only on J_{n-1} and J_n.
[Figure: sojourn times τ_1, ..., τ_5 along the trajectory, with τ_P = τ_1 + ··· + τ_5.]
10 Theory: Semi-Markov viewpoint
Define Y_t = J_n for τ_1 + ··· + τ_n ≤ t < τ_1 + ··· + τ_{n+1}, and let (Y_t) start on M.
Theorem. (Y_t) is a semi-Markov process on M with the same FPT to P as (X_t).
Proof: This is clear from the construction.
Definition. Let τ_P = inf{t > 0 : Y_t ∈ P} and σ_P = min{n ≥ 0 : J_n ∈ P}.
Assumption. E_ξ[τ_P] and E_ξ[σ_P] are finite for all initial distributions ξ on M. This ensures that (Y_t) has finite MFPTs to P and is nonexplosive.
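The definition of (Y_t) above can be sketched directly (hypothetical milestone labels and sojourn times, chosen only to illustrate the piecewise-constant reconstruction):

```python
# Y_t = J_n for tau_1 + ... + tau_n <= t < tau_1 + ... + tau_{n+1}.
import bisect, itertools

def make_Y(J, taus):
    """Return Y(t) given hit points J and sojourn times taus (len(J) - 1)."""
    cum = [0.0] + list(itertools.accumulate(taus))  # partial sums of tau
    def Y(t):
        return J[bisect.bisect_right(cum, t) - 1]   # right-continuous in t
    return Y

Y = make_Y(["R", "M1", "P"], [1.5, 2.0])
print(Y(0.0), Y(1.5), Y(3.5))   # → R M1 P
```

The right-continuity of Y matches the half-open intervals in the definition: at t = 1.5 the process has already jumped to M1.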
11 Theory: The invariant measure
Definition. Let K(x, dy) be the transition kernel for (J_n) and
  K̃(x, dy) = K(x, dy) if x ∉ P,  K̃(x, dy) = 0 if x ∈ P.
K and K̃ have left/right actions on measures/bounded functions in the usual way.
Theorem. (J_n) has an invariant probability measure µ, defined by
  µ = Z^{-1} Σ_{n=0}^∞ δ_R K̃^n,
where Z is a normalization constant. Moreover, µ(P) > 0.
Proof: Show that the residence measure E_R[Σ_{n=0}^{σ_P} 1_{J_n ∈ ·}] is invariant.
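On a finite state space the theorem can be checked directly. Below is a hypothetical 4-milestone jump kernel (states R, M1, M2, P), not from the talk: the stopped kernel K̃ zeroes the row of P, the series Σ_n δ_R K̃^n is summed via a linear solve, and the normalized result is verified to be invariant for the full kernel K.

```python
# mu = Z^{-1} sum_n delta_R Ktilde^n, computed as delta_R (I - Ktilde)^{-1}.
import numpy as np

K = np.array([            # jump kernel on milestones {R, M1, M2, P}
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0, 0.0], # source-sink: P restarts at R
])
P_idx = 3
Kt = K.copy()
Kt[P_idx, :] = 0.0        # Ktilde: kill the chain on P

delta_R = np.array([1.0, 0.0, 0.0, 0.0])
mu = np.linalg.solve((np.eye(4) - Kt).T, delta_R)  # row vector sum_n delta_R Kt^n
mu /= mu.sum()            # normalize by Z
print(mu, mu @ K)         # mu K = mu, and mu[P] > 0
```

For this kernel the unnormalized residence vector is (3, 4, 2, 1), so µ = (0.3, 0.4, 0.2, 0.1); one can check µK = µ by hand.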
12 Theory: The exact milestoning equation
Definition. Let M_x be the milestone containing x, and
  τ_M^x = inf{t > 0 : Y_t ∈ M \ int(M_x)}.
[Figure: a short trajectory from x on M_x to a neighboring milestone, taking time τ_M^x.]
13 Theory: The exact milestoning equation
Definition. Let M_x be the milestone containing x, and τ_M^x = inf{t > 0 : Y_t ∈ M \ int(M_x)}.
Theorem. With µ the invariant distribution for (J_n), we have
  µ(P) E_R[τ_P] = E_µ[τ_M] := ∫_M µ(dx) E_x[τ_M^x].
Proof: Write τ_P = (τ_P − τ_M^x) + τ_M^x, condition on Y_{τ_M^x} = y, and integrate with respect to µ(dy).
Note that E_x[τ_M^x] can be obtained from short trajectories running in parallel. So if we can sample efficiently from µ, we can efficiently estimate E_R[τ_P].
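The identity µ(P) E_R[τ_P] = E_µ[τ_M] can be verified numerically on a hypothetical 4-milestone jump chain (states R, M1, M2, P; all numbers invented for illustration). Here the mean sojourn time is taken to depend only on the current milestone, and the restart jump P → R is instantaneous, i.e. its sojourn time is 0.

```python
# Check mu(P) * E_R[tau_P] = sum_x mu(x) E_x[tau_M^x] on a toy jump chain.
import numpy as np

K = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [1.0, 0.0, 0.0, 0.0],   # source-sink restart
])
t = np.array([1.0, 2.0, 1.5, 0.0])   # mean sojourn after each milestone

# Invariant measure of the jump chain: left eigenvector for eigenvalue 1.
w, V = np.linalg.eig(K.T)
mu = np.real(V[:, np.argmin(np.abs(w - 1))])
mu /= mu.sum()

# E_R[tau_P] by first-step analysis on the non-P states {R, M1, M2}.
free = [0, 1, 2]
A = np.eye(3) - K[np.ix_(free, free)]
u = np.linalg.solve(A, t[free])      # u[x] = expected time to reach P from x
print(mu[3] * u[0], mu @ t)          # both sides equal 1.4 here
```

Setting the restart sojourn to 0 matters: with a nonzero time on the P row the two sides would no longer agree, since the semi-Markov process restarts at R immediately upon hitting P.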
14 Theory: Convergence to stationarity
Theorem. Assume (J_n) is aperiodic in the following sense:
  gcd{n ≥ 1 : P_R(σ_P = n − 1) > 0} = 1.
Then for any initial distribution ξ,
  lim_{n→∞} ‖ξK^n − µ‖_TV = 0.
Proof: Coupling argument, using the fact that R is recurrent for (J_n).
This suggests how to estimate µ: start with a guess ξ = µ_0, and iterate.
15 Theory: Convergence to stationarity
Aside: 1D milestoning. The milestones are R = M_1, M_2, ..., M_m = P. If J_n ∈ M_1, then J_{n+1} ∈ M_2. And if J_n ∈ M_j for j = 2, ..., m − 1, then J_{n+1} ∈ M_{j−1} or J_{n+1} ∈ M_{j+1}, both with positive probability.
  M_1 = R, M_2, M_3, M_4, ..., M_m = P
Here ‖ξK^n − µ‖_TV → 0 for all ξ if and only if the number of milestones is odd. Why? Say there are m milestones. Then (J_n) can go from R to P in m − 1 or m + 1 steps, hence (counting the restart jump from P) it can return to R in m or m + 2 steps, and m, m + 2 are coprime when m is odd. Conversely, if m is even and ξ is supported on the odd-indexed milestones, then after an even (resp. odd) number of steps the distribution of (J_n) is supported on the odd- (resp. even-) indexed milestones.
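The parity obstruction can be seen numerically. The sketch below builds the hypothetical nearest-neighbor jump kernel on m milestones with restart P → R (a toy model of the 1D picture above, with all nearest-neighbor probabilities set to 1/2) and tracks the total-variation distance to µ, starting from a point mass at R.

```python
# TV distance ||xi K^n - mu||_TV for the 1D milestoning jump chain:
# converges for odd m, stays bounded away from 0 for even m.
import numpy as np

def jump_kernel(m):
    K = np.zeros((m, m))
    K[0, 1] = 1.0                        # from R, jump to M_2
    for j in range(1, m - 1):
        K[j, j - 1] = K[j, j + 1] = 0.5  # nearest-neighbor in the interior
    K[m - 1, 0] = 1.0                    # restart at R from P
    return K

def tv_after(m, n):
    K = jump_kernel(m)
    w, V = np.linalg.eig(K.T)            # invariant measure of K
    mu = np.real(V[:, np.argmin(np.abs(w - 1))])
    mu /= mu.sum()
    xi = np.zeros(m); xi[0] = 1.0        # start at R = M_1
    return 0.5 * np.abs(xi @ np.linalg.matrix_power(K, n) - mu).sum()

print(tv_after(5, 400))   # essentially 0: odd m, aperiodic
print(tv_after(4, 400))   # order 1/2: even m, period-2 parity cycling
```

For even m every jump flips the parity of the milestone index (including the restart M_m → M_1), so the chain has period 2 and the distribution never settles.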
16 Theory: Convergence to stationarity
Error analysis. The error in milestoning has two sources:
- error in the approximation µ̂ of µ;
- error due to the time discretization X_{nδt} of (X_t).
Define
  τ̂_M^x = min{n > 0 : the line segment from X_{(n−1)δt} to X_{nδt} intersects M \ int(M_x)}.
Theorem. The error in the Milestoning approximation of the MFPT satisfies
  |E_R[τ_P] − µ̂(P)^{−1} E_µ̂[τ̂_M]| ≤ c_1 |µ(P)^{−1} − µ̂(P)^{−1}| + µ̂(P)^{−1} (c_2 ‖µ − µ̂‖_TV + φ(δt)),
where φ is a function depending only on the time-step error, and
  c_1 = E_µ[τ_M],  c_2 = sup_{x∈M} E_x[τ_M^x].
Proof: Triangle inequalities.
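The discretized first passage index τ̂ checks whether each time-step segment crosses a milestone. In 1D, with milestones given by level sets x = c (a hypothetical setting chosen to keep the geometry trivial), the segment test reduces to bracketing:

```python
# First step n whose segment [X_{(n-1)dt}, X_{n dt}] crosses the level c.
def discrete_first_crossing(xs, c):
    """Index of the first step whose segment crosses level c, else None."""
    for n in range(1, len(xs)):
        lo, hi = sorted((xs[n - 1], xs[n]))
        if lo <= c <= hi:          # segment brackets the milestone
            return n
    return None

xs = [0.0, 0.4, 0.3, 0.9, 0.7]
print(discrete_first_crossing(xs, 0.5))   # → 3: the segment 0.3 -> 0.9 crosses
```

Testing the segment rather than the endpoints avoids missing crossings that occur between samples, which is exactly the error the φ(δt) term above accounts for.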
17 Theory: Numerical error
Is an iteration scheme based on powers of a numerical approximant K̂ consistent?
Theorem (Ferré et al., 2013). Suppose (J_n) is geometrically ergodic (GE): there exists κ ∈ (0, 1) such that
  sup_{x∈M} ‖δ_x K^n − µ‖_TV = O(κ^n).
Given ε > 0, if K̂ is sufficiently close to K in operator norm, then
  sup_{x∈M} ‖δ_x K̂^n − µ̂‖_TV = O(κ̂^n),  ‖µ − µ̂‖_TV < ε,
where µ̂ is some probability measure on M and κ̂ ∈ (κ, 1).
Lemma. (J_n) is GE if there is a uniform (in the starting point) lower bound on the probability of reaching P in n steps.
Lemma. (J_n) is GE if it is strong Feller and aperiodic, and the state space is compact.
18 Example
Example. Let (X_t) be Brownian dynamics:
  dX_t = −∇V(X_t) dt + √(2β^{−1}) dB_t,
and for N = 10 let
  V(x_1, x_2) = Σ_{k_1=−N}^{N−1} Σ_{k_2=−N}^{N−1} C_{k_1,k_2} f_{k_1,k_2}(x_1, x_2),   (1)
where C_{k_1,k_2} is either 0 or sampled uniformly from (−1/(2π), 1/(2π)), each w.p. 1/2, and
  f_{k_1,k_2}(x_1, x_2) = cos(2πk_1 x_1) cos(2πk_2 x_2)  w.p. 1/3,
                          cos(2πk_1 x_1) sin(2πk_2 x_2)  w.p. 1/3,
                          sin(2πk_1 x_1) sin(2πk_2 x_2)  w.p. 1/3.
V defines a rough energy landscape on the torus R^2/Z^2.
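A random potential of the form (1) is easy to instantiate. The sketch below follows the coefficient distributions on this slide with N = 10; the RNG seed and the evaluation points are arbitrary choices, not from the talk.

```python
# Build one random realization of the rough potential V of Eq. (1)
# on the torus R^2/Z^2.
import numpy as np

rng = np.random.default_rng(0)
N = 10
modes = []
for k1 in range(-N, N):                   # k = -N, ..., N-1
    for k2 in range(-N, N):
        if rng.random() < 0.5:            # C = 0 with probability 1/2
            continue
        C = rng.uniform(-1 / (2 * np.pi), 1 / (2 * np.pi))
        kind = rng.integers(3)            # cos*cos, cos*sin, sin*sin, w.p. 1/3
        modes.append((C, k1, k2, kind))

def V(x1, x2):
    total = 0.0
    for C, k1, k2, kind in modes:
        a, b = 2 * np.pi * k1 * x1, 2 * np.pi * k2 * x2
        f = (np.cos(a) * np.cos(b),
             np.cos(a) * np.sin(b),
             np.sin(a) * np.sin(b))[kind]
        total += C * f
    return total

print(V(0.3, 0.7))    # V is 1-periodic in each coordinate, so it lives on the torus
```

Since every k_1, k_2 is an integer, each mode is 1-periodic in each coordinate, which is what makes V a function on R^2/Z^2.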
19 Example
Figure: Source R (center bottom), sink P (center), and other milestones (line segments).
20 Example
Figure: Contour map of the canonical Gibbs density Z^{−1} e^{−βV}.
21 Example
Figure: The stationary measure µ, superimposed on contour lines of V.
22 Example
Figure: The MFPT vs. the number of iterations, starting at µ_0(dx) = Z^{−1} e^{−βV(x)} dx.
More information