Chapter 6. Markov processes. 6.1 Introduction


It is not uncommon for a Markov process to be defined as a sextuple (Ω, F, F_t, X_t, θ_t, P^x), and for additional notation (e.g., Δ, ζ, S, P_t, R_λ, etc.) to be introduced rather rapidly. This can be intimidating for the beginner. We will explain this notation in as gentle a manner as possible.

We will consider a Markov process to be a pair (X_t, P^x) (rather than a sextuple), where X_t is a single stochastic process and {P^x} is a family of probability measures, one probability measure P^x corresponding to each element x of the state space. The idea that a Markov process consists of one process and many probabilities is one that takes some getting used to. To explain this, let us first look at an example.

Suppose X_1, X_2, ... is a Markov chain with stationary transition probabilities with 5 states: 1, 2, ..., 5. Everything we want to know about X can be determined if we know p(i, j) = P(X_1 = j | X_0 = i) for each i and j and μ(i) = P(X_0 = i) for each i. We sometimes think of having a different Markov chain for every choice of starting distribution μ = (μ(1), ..., μ(5)). But instead let us define a new probability space by taking Ω to be the collection of all sequences ω = (ω_0, ω_1, ...) such that each ω_n takes one of the values 1, ..., 5. Define X_n(ω) = ω_n. Define F_n to be the σ-field generated by X_0, ..., X_n; this is the same as the σ-field generated by sets of the form {ω : ω_0 = a_0, ..., ω_n = a_n}, where a_0, ..., a_n ∈ {1, 2, ..., 5}.
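The five-state example can be sketched in code. The uniform transition matrix and the simulation routine below are illustrative assumptions, not part of the text; the point is that a single coordinate process serves every starting state, while the measure P^x changes with x.

```python
import numpy as np

# Illustrative 5-state chain; the uniform transition matrix p is an
# assumption made up for this sketch, not taken from the text.
p = np.full((5, 5), 0.2)  # p[i-1, j-1] = P(X_1 = j | X_0 = i)

def sample_path(x, n, rng):
    """Draw (X_0, ..., X_n) under P^x, i.e. the chain started at state x."""
    path = [x]
    for _ in range(n):
        path.append(int(rng.choice(5, p=p[path[-1] - 1])) + 1)
    return path

rng = np.random.default_rng(0)
# One process, many measures: the same coordinate map X_n(omega) = omega_n,
# but a different measure P^x for each starting state x.
for x in range(1, 6):
    path = sample_path(x, 10, rng)
    # P^x is concentrated on the sequences omega with omega_0 = x.
    assert path[0] == x
```

Changing x changes nothing about the sampling mechanism, only which paths carry mass.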

For each x = 1, 2, ..., 5, define a probability measure P^x on Ω by

    P^x(X_0 = x_0, X_1 = x_1, ..., X_n = x_n) = 1_{x}(x_0) p(x_0, x_1) ··· p(x_{n-1}, x_n).   (6.1)

We have 5 different probability measures, one for each of x = 1, 2, ..., 5, and we can start with an arbitrary probability distribution μ if we define P^μ(A) = Σ_{i=1}^{5} P^i(A) μ(i). We have lost no information by this redefinition, and it turns out this works much better when doing technical details. The value of X_0(ω) = ω_0 can be any of 1, 2, ..., 5; the notion of starting at x is captured by P^x, not by X_0. The probability measure P^x is concentrated on those ω's for which ω_0 = x, and P^x gives no mass to any other ω.

Let us now look at a Lévy process and see how this framework plays out there. Let P be a probability measure and let Z_t be a Lévy process with respect to P started at 0. Then Z^x_t = x + Z_t is a Lévy process started at x. Let Ω be the set of right continuous left limit functions from [0, ∞) to R, so that each element ω in Ω is a right continuous left limit function. (We do not require that ω(0) = 0 or that ω(0) take any particular value x.) Define

    X_t(ω) = ω(t).   (6.2)

This will be our process. Let F be the σ-field on Ω, the right continuous left limit functions, generated by the cylindrical subsets. Now define P^x to be the law of Z^x. This means that P^x is the probability measure on (Ω, F) defined by

    P^x(X ∈ A) = P(Z^x ∈ A),   x ∈ R, A ∈ F.   (6.3)

The probability measure P^x is determined by the fact that if n ≥ 1, t_1 ≤ ··· ≤ t_n, and B_1, ..., B_n are Borel subsets of R, then

    P^x(X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n) = P(Z^x_{t_1} ∈ B_1, ..., Z^x_{t_n} ∈ B_n).

6.2 Definition of a Markov process

We want to allow our Markov processes to take values in spaces other than the Euclidean ones. For now, we take our state space S to be a separable metric space, furnished with the Borel σ-field. For the first time around, just think of R in place of S.
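Returning for a moment to the Lévy example: in code, the family {P^x} amounts to translating one driving path. A minimal sketch, with Brownian motion standing in for Z and an assumed discretization:

```python
import numpy as np

# Brownian motion as the Levy process Z started at 0; the time step and
# horizon below are illustrative assumptions.
rng = np.random.default_rng(1)
n, dt = 1000, 0.001
Z = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])

def Z_x(x):
    """A path with law P^x: the driving path Z translated to start at x."""
    return x + Z

# P^x charges only paths omega with omega(0) = x; the increments do not
# change, so every Z_x has exactly the same jumps and wiggles.
for x in (-1.0, 0.0, 2.5):
    assert Z_x(x)[0] == x
assert np.allclose(np.diff(Z_x(7.0)), np.diff(Z_x(-3.0)))
```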

To define a Markov process, we start with a measurable space (Ω, F) and we suppose we have a filtration {F_t} (not necessarily satisfying the usual conditions).

Definition 6.1. A Markov process (X_t, P^x) is a stochastic process X : [0, ∞) × Ω → S and a family of probability measures {P^x : x ∈ S} on (Ω, F) satisfying the following.
(1) For each t, X_t is F_t measurable.
(2) For each t and each Borel subset A of S, the map x → P^x(X_t ∈ A) is Borel measurable.
(3) For each s, t ≥ 0, each Borel subset A of S, and each x ∈ S, we have

    P^x(X_{s+t} ∈ A | F_s) = P^{X_s}(X_t ∈ A),   P^x-a.s.   (6.4)

Some explanation is definitely in order. Let

    φ(x) = P^x(X_t ∈ A),   (6.5)

so that φ is a function mapping S to R. Part of the definition of a filtration is that each F_t ⊂ F. Since we are requiring X_t to be F_t measurable, that means that (X_t ∈ A) is in F, and it makes sense to talk about P^x(X_t ∈ A). Definition 6.1(2) says that the function φ is Borel measurable. This is a very mild assumption, and it will be satisfied in the examples we look at.

The expression P^{X_s}(X_t ∈ A) on the right hand side of (6.4) is a random variable, and its value at ω ∈ Ω is defined to be φ(X_s(ω)), with φ given by (6.5). Note that the randomness in P^{X_s}(X_t ∈ A) is thus all due to the X_s term and not the X_t term. Definition 6.1(3) can be rephrased as saying that for each s, t, each A, and each x, there is a set N_{s,t,x,A} that is a null set with respect to P^x, and for ω ∉ N_{s,t,x,A} the conditional expectation P^x(X_{s+t} ∈ A | F_s) is equal to φ(X_s).

We have now explained all the terms in the sextuple (Ω, F, F_t, X_t, θ_t, P^x) except for the θ_t. These are called shift operators and are maps from Ω to Ω such that X_s ∘ θ_t = X_{s+t}. We defer the precise meaning of the θ_t and the rationale for them until Section 6.4, where they will appear in a natural way.
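For a finite-state chain, Definition 6.1(3) can be checked by brute force: condition on an entire history by enumerating paths, and compare with φ(X_s). The 3-state matrix below (with 0-based state labels) is an illustrative assumption.

```python
import numpy as np
from itertools import product

# Definition 6.1(3) for a finite chain: the conditional law of X_{s+t}
# given the whole history depends only on the present state X_s.
p = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.3, 0.5],
              [0.0, 0.4, 0.6]])

def cond_prob(history, t, j):
    """P(X_{s+t} = j | X_0, ..., X_s = history), summing over all tails."""
    s = len(history) - 1
    total = 0.0
    for tail in product(range(3), repeat=t):
        states = tuple(history) + tail
        if states[-1] == j:
            total += np.prod([p[states[k], states[k + 1]]
                              for k in range(s, s + t)])
    return total

# Two different pasts with the same present state give the same answer,
# and both agree with phi(X_s) = (p^t)[X_s, j] as in (6.4)-(6.5).
h1, h2, t = (0, 1, 2), (2, 1, 2), 2
for j in range(3):
    assert np.isclose(cond_prob(h1, t, j), cond_prob(h2, t, j))
    assert np.isclose(cond_prob(h1, t, j), np.linalg.matrix_power(p, t)[2, j])
```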

In the remainder of this section and in Section 6.3 we define some of the additional notation commonly used for Markov processes. The first one is almost self-explanatory. We use E^x for expectation with respect to P^x. As with P^{X_s}(X_t ∈ A), the notation E^{X_s} f(X_t), where f is bounded and Borel measurable, is to be taken to mean ψ(X_s) with ψ(y) = E^y f(X_t). If we want to talk about our Markov process started with distribution μ, we define

    P^μ(B) = ∫ P^x(B) μ(dx),

and similarly for E^μ; here μ is a probability on S.

6.3 Transition probabilities

If B is the Borel σ-field on a metric space S, a kernel Q(x, A) on S is a map from S × B → R satisfying the following.
(1) For each x ∈ S, Q(x, ·) is a measure on (S, B).
(2) For each A ∈ B, the function x → Q(x, A) is Borel measurable.

The definition of Markov transition probabilities, or simply transition probabilities, is the following.

Definition 6.2. A collection of kernels {P_t(x, A); t ≥ 0} are Markov transition probabilities for a Markov process (X_t, P^x) if
(1) P_t(x, S) = 1 for each t ≥ 0 and each x ∈ S.
(2) For each x ∈ S, each Borel subset A of S, and each s, t ≥ 0,

    P_{t+s}(x, A) = ∫_S P_t(y, A) P_s(x, dy).   (6.6)

(3) For each x ∈ S, each Borel subset A of S, and each t ≥ 0,

    P_t(x, A) = P^x(X_t ∈ A).   (6.7)

Definition 6.2(3) can be rephrased as saying that for each x, the measures P_t(x, dy) and P^x(X_t ∈ dy) are the same. We define

    P_t f(x) = ∫ f(y) P_t(x, dy)   (6.8)

when f : S → R is Borel measurable and either bounded or non-negative.

The equations (6.6) are known as the Chapman-Kolmogorov equations. They can be rephrased in terms of equality of measures: for each x,

    P_{s+t}(x, dz) = ∫_{y∈S} P_t(y, dz) P_s(x, dy).   (6.9)

Multiplying (6.9) by a bounded Borel measurable function f(z) and integrating gives

    P_{s+t} f(x) = ∫ P_t f(y) P_s(x, dy).   (6.10)

The right hand side is the same as P_s(P_t f)(x), so we have

    P_{s+t} f(x) = P_s P_t f(x),   (6.11)

i.e., the functions P_{s+t} f and P_s P_t f are the same. The equation (6.11) is known as the semigroup property.

P_t is a linear operator on the space of bounded Borel measurable functions on S. We can then rephrase (6.11) simply as

    P_{s+t} = P_s P_t.   (6.12)

Operators satisfying (6.12) are called a semigroup, and semigroups are much studied in functional analysis. One more observation about semigroups: if we take expectations in (6.4), we obtain

    P^x(X_{s+t} ∈ A) = E^x [P^{X_s}(X_t ∈ A)].

The left hand side is P_{s+t} 1_A(x) and the right hand side is

    E^x [P_t 1_A(X_s)] = P_s P_t 1_A(x),

and so (6.4) encodes the semigroup property.

The resolvent or λ-potential of a semigroup P_t is defined by

    R_λ f(x) = ∫_0^∞ e^{−λt} P_t f(x) dt,   λ ≥ 0, x ∈ S.

This can be recognized as the Laplace transform of P_t. By the Fubini theorem, we see that

    R_λ f(x) = E^x ∫_0^∞ e^{−λt} f(X_t) dt.

Resolvents are useful because they are typically easier to work with than semigroups.

When practitioners of stochastic calculus tire of a martingale, they stop it. Markov process theorists are a harsher lot, and they kill their processes. To be precise, attach an isolated point Δ to S. Thus one looks at S_Δ = S ∪ {Δ}, and the topology on S_Δ is the one generated by the open sets of S and {Δ}. Δ is called the cemetery point. All functions on S are extended to S_Δ by defining them to be 0 at Δ. At some random time ζ the Markov process is killed, which means that X_t = Δ for all t ≥ ζ. The time ζ is called the lifetime of the Markov process.

6.4 The canonical process and shift operators

Suppose we have a Markov process (X_t, P^x) where F_t = σ(X_s; s ≤ t). Suppose that X_t has right continuous left limit paths. For this to even make sense, we need the set {ω : t → X_t(ω) is not right continuous left limit} to be in F, and then we require this event to be P^x-null for each x.

Define Ω̃ to be the set of right continuous left limit functions on [0, ∞). If ω̃ ∈ Ω̃, set X̃_t(ω̃) = ω̃(t). Define F̃_t = σ(X̃_s; s ≤ t) and F̃_∞ = ∨_{t≥0} F̃_t. Finally define P̃^x on (Ω̃, F̃_∞) by

    P̃^x(X̃ ∈ ·) = P^x(X ∈ ·).

Thus P̃^x is specified uniquely by

    P̃^x(X̃_{t_1} ∈ A_1, ..., X̃_{t_n} ∈ A_n) = P^x(X_{t_1} ∈ A_1, ..., X_{t_n} ∈ A_n)

for n ≥ 1, A_1, ..., A_n Borel subsets of S, and t_1 < ··· < t_n. Clearly there is so far no loss (nor gain) in looking at the Markov process (X̃_t, P̃^x), which is called the canonical process.

Let us now suppose we are working with the canonical process, and we drop the tildes everywhere. We define the shift operators θ_t : Ω → Ω as follows. θ_t(ω) will be an element of Ω and therefore a right continuous left limit function from [0, ∞) to S. Define

    θ_t(ω)(s) = ω(t + s).

Then

    X_s ∘ θ_t(ω) = X_s(θ_t(ω)) = θ_t(ω)(s) = ω(t + s) = X_{t+s}(ω).

The shift operator θ_t takes the path of X and chops off and discards the part of the path before time t.

We will use expressions like f(X_s) ∘ θ_t. If we apply this to ω ∈ Ω, then

    (f(X_s) ∘ θ_t)(ω) = f(X_s(θ_t(ω))) = f(X_{s+t}(ω)),

or f(X_s) ∘ θ_t = f(X_{s+t}).

Even if we are not in this canonical setup, from now on we will suppose there exist shift operators mapping Ω into itself so that X_s ∘ θ_t = X_{s+t}.

6.5 Enlarging the filtration

Throughout the remainder of this chapter we assume that X has paths that are right continuous with left limits. To be more precise, if

    N = {ω : the function t → X_t(ω) is not right continuous with left limits},

then we assume N ∈ F and N is P^x-null for every x ∈ S.

Let us first introduce some notation. Define

    F^{00}_t = σ(X_s; s ≤ t),   t ≥ 0.   (6.13)

This is the smallest σ-field with respect to which each X_s is measurable for s ≤ t. We let F^0_t be the completion of F^{00}_t, but we need to be careful what we mean by completion here, because we have more than one probability measure present. Let 𝒩 be the collection of sets that are P^x-null for every x ∈ S. Thus N ∈ 𝒩 if (P^x)*(N) = 0 for each x ∈ S, where (P^x)* is the outer probability corresponding to P^x. The outer probability (P^x)* is defined by

    (P^x)*(A) = inf{P^x(B) : A ⊂ B, B ∈ F}.

Let

    F^0_t = σ(F^{00}_t ∪ 𝒩).   (6.14)

Finally, let

    F_t = F^0_{t+} = ∩_{ε>0} F^0_{t+ε}.   (6.15)

We call {F_t} the minimal augmented filtration generated by X. The reason for worrying about which filtrations to use is that {F^{00}_t} is too small to include many interesting sets (such as those arising in the law of the iterated logarithm, for example), while if the filtration is too large, the Markov property will not hold for that filtration. The filtration matters when defining a Markov process; see Definition 6.1(3).

We will make the following assumption.

Assumption 6.3. Suppose P_t f is continuous on S whenever f is bounded and continuous on S.

Markov processes satisfying Assumption 6.3 are called Feller processes or weak Feller processes. If P_t f is continuous whenever f is bounded and Borel measurable, then the Markov process is said to be a strong Feller process.

One can show that under Assumption 6.3 we have

    P^x(X_{s+t} ∈ A | F_s) = P^{X_s}(X_t ∈ A),   P^x-a.s.
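Assumption 6.3 can be probed numerically for Brownian motion, where P_t f(x) = E f(x + B_t). The choice of f, the time t, and the quadrature order below are assumptions for illustration.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

# For Brownian motion, P_t f(x) = E f(x + B_t) with B_t ~ N(0, t).
# Gauss-Hermite quadrature: E g(B_t) ~ sum_k w_k g(sqrt(2t) u_k) / sqrt(pi).
u, w = hermgauss(60)

def Pt_f(x, t, f):
    """Approximate P_t f(x) for Brownian motion by quadrature."""
    return float((w * f(x + np.sqrt(2.0 * t) * u)).sum() / np.sqrt(np.pi))

f = np.tanh  # bounded and continuous, as Assumption 6.3 requires
t = 1.0

# Feller property in action: x -> P_t f(x) varies continuously, so values
# at nearby starting points are close (increments shrink with the grid).
xs = np.linspace(-2.0, 2.0, 401)
vals = np.array([Pt_f(x, t, f) for x in xs])
assert np.max(np.abs(np.diff(vals))) < 0.02
```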

6.6 The Markov property

We start with the Markov property:

    E^x[f(X_{s+t}) | F_s] = E^{X_s}[f(X_t)],   P^x-a.s.   (6.16)

Since f(X_{s+t}) = f(X_t) ∘ θ_s, if we write Y for the random variable f(X_t), we have

    E^x[Y ∘ θ_s | F_s] = E^{X_s} Y,   P^x-a.s.   (6.17)

We wish to generalize this to other random variables Y.

Proposition 6.4. Let (X_t, P^x) be a Markov process and suppose (6.16) holds. Suppose Y = ∏_{i=1}^{n} f_i(X_{t_i − s}), where the f_i are bounded, Borel measurable, and s ≤ t_1 ≤ ··· ≤ t_n. Then (6.17) holds.

Proof. We will prove this by induction on n. The case n = 1 is (6.16), so we suppose the equality holds for n and prove it for n + 1. Let V = ∏_{j=2}^{n+1} f_j(X_{t_j − t_1}) and h(y) = E^y V. By the induction hypothesis,

    E^x[∏_{j=1}^{n+1} f_j(X_{t_j}) | F_s] = E^x[E^x[V ∘ θ_{t_1} | F_{t_1}] f_1(X_{t_1}) | F_s]
        = E^x[(E^{X_{t_1}} V) f_1(X_{t_1}) | F_s]
        = E^x[(h f_1)(X_{t_1}) | F_s].

By (6.16) this is E^{X_s}[(h f_1)(X_{t_1 − s})]. For any y,

    E^y[(h f_1)(X_{t_1 − s})] = E^y[(E^{X_{t_1 − s}} V) f_1(X_{t_1 − s})]
        = E^y[E^y[V ∘ θ_{t_1 − s} | F_{t_1 − s}] f_1(X_{t_1 − s})]
        = E^y[(V ∘ θ_{t_1 − s}) f_1(X_{t_1 − s})].

If we replace V by its definition, replace y by X_s, and use the definition of θ_{t_1 − s}, we get the desired equality for n + 1, and hence the induction step.

We now come to the general version of the Markov property. As usual, F_∞ = ∨_{t≥0} F_t. The expression Y ∘ θ_t for general Y may seem puzzling at first.

Theorem 6.5. Let (X_t, P^x) be a Markov process and suppose (6.16) holds. Suppose Y is bounded and measurable with respect to F_∞. Then

    E^x[Y ∘ θ_s | F_s] = E^{X_s} Y,   P^x-a.s.   (6.18)

The proof follows from the previous proposition by a monotone class argument.

6.7 Strong Markov property

Given a stopping time T, recall that the σ-field of events known up to time T is defined to be

    F_T = {A ∈ F_∞ : A ∩ (T ≤ t) ∈ F_t for all t > 0}.
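A discrete-time sketch of the idea behind F_T: for a hitting time T, the event {T ≤ n} is decided by the path up to time n, which is the membership condition in the definition. The paths and target set below are illustrative assumptions.

```python
# A stopping time for a discrete-time path: the first hitting time of a
# target set.  Whether T <= n holds is determined by coordinates 0..n
# alone, which is what A ∩ (T <= t) ∈ F_t formalizes.

def hitting_time(path, target):
    """T(omega) = inf{n : omega_n in target} (len(path) if never hit)."""
    for n, state in enumerate(path):
        if state in target:
            return n
    return len(path)

omega1 = [1, 3, 2, 5, 4]
omega2 = [1, 3, 2, 4, 5]  # agrees with omega1 up to time 2, differs after
T = lambda w: hitting_time(w, {5})

# Paths that agree up to time 2 must agree on the event {T <= 2}.
assert (T(omega1) <= 2) == (T(omega2) <= 2)
assert T(omega1) == 3 and T(omega2) == 4
```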

We define θ_T by θ_T(ω)(t) = ω(T(ω) + t). Thus, for example, X_t ∘ θ_T(ω) = X_{T(ω)+t}(ω) and X_T(ω) = X_{T(ω)}(ω).

Now we can state the strong Markov property. Suppose (X_t, P^x) is a Markov process with respect to {F_t}. The strong Markov property is said to hold if whenever T is a finite stopping time and Y is bounded and measurable with respect to F_∞, then

    E^x[Y ∘ θ_T | F_T] = E^{X_T} Y,   P^x-a.s.

Recall that we are restricting our attention to Markov processes whose paths are right continuous with left limits. If we have a Markov process (X_t, P^x) whose paths are right continuous with left limits, which has shift operators {θ_t}, and which satisfies the conclusion of Theorem 6.6, whether or not Assumption 6.3 holds, then we say that (X_t, P^x) is a strong Markov process.

A strong Markov process is said to be quasi-left continuous if X_{T_n} → X_T, a.s., on {T < ∞} whenever T_n are stopping times increasing up to T. Unlike in the definition of predictable stopping times, we are not requiring the T_n to be strictly less than T. A Hunt process is a strong Markov process that is quasi-left continuous. Quasi-left continuity does not imply left continuity; consider the Poisson process.
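The closing remark about the Poisson process can be illustrated directly: a counting path is right continuous and quasi-left continuous, yet not left continuous at its jumps. The fixed jump times below stand in for random exponential arrivals and are an assumption of the sketch.

```python
import numpy as np

# A Poisson-type counting path with illustrative fixed jump times.
jumps = np.array([0.7, 1.3, 2.1])

def N(t):
    """Right continuous path: N_t = number of jumps in [0, t]."""
    return int(np.searchsorted(jumps, t, side="right"))

# Deterministic times t_n increasing to the jump time 0.7 see the left
# limit 0, not N_{0.7} = 1: the path is not left continuous.  No sequence
# of stopping times can announce the jump, which is why quasi-left
# continuity survives.
t_n = 0.7 - 1.0 / np.arange(10, 20)
assert all(N(t) == 0 for t in t_n)
assert N(0.7) == 1
```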


More information

16 1 Basic Facts from Functional Analysis and Banach Lattices

16 1 Basic Facts from Functional Analysis and Banach Lattices 16 1 Basic Facts from Functional Analysis and Banach Lattices 1.2.3 Banach Steinhaus Theorem Another fundamental theorem of functional analysis is the Banach Steinhaus theorem, or the Uniform Boundedness

More information

APPENDIX C: Measure Theoretic Issues

APPENDIX C: Measure Theoretic Issues APPENDIX C: Measure Theoretic Issues A general theory of stochastic dynamic programming must deal with the formidable mathematical questions that arise from the presence of uncountable probability spaces.

More information

Introduction to Algebraic and Geometric Topology Week 3

Introduction to Algebraic and Geometric Topology Week 3 Introduction to Algebraic and Geometric Topology Week 3 Domingo Toledo University of Utah Fall 2017 Lipschitz Maps I Recall f :(X, d)! (X 0, d 0 ) is Lipschitz iff 9C > 0 such that d 0 (f (x), f (y)) apple

More information

ON THE REGULARITY OF SAMPLE PATHS OF SUB-ELLIPTIC DIFFUSIONS ON MANIFOLDS

ON THE REGULARITY OF SAMPLE PATHS OF SUB-ELLIPTIC DIFFUSIONS ON MANIFOLDS Bendikov, A. and Saloff-Coste, L. Osaka J. Math. 4 (5), 677 7 ON THE REGULARITY OF SAMPLE PATHS OF SUB-ELLIPTIC DIFFUSIONS ON MANIFOLDS ALEXANDER BENDIKOV and LAURENT SALOFF-COSTE (Received March 4, 4)

More information

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 Last compiled: November 6, 213 1. Conditional expectation Exercise 1.1. To start with, note that P(X Y = P( c R : X > c, Y c or X c, Y > c = P( c Q : X > c, Y

More information

Unconstrained minimization of smooth functions

Unconstrained minimization of smooth functions Unconstrained minimization of smooth functions We want to solve min x R N f(x), where f is convex. In this section, we will assume that f is differentiable (so its gradient exists at every point), and

More information

Construction of a general measure structure

Construction of a general measure structure Chapter 4 Construction of a general measure structure We turn to the development of general measure theory. The ingredients are a set describing the universe of points, a class of measurable subsets along

More information

Section 9.2 introduces the description of Markov processes in terms of their transition probabilities and proves the existence of such processes.

Section 9.2 introduces the description of Markov processes in terms of their transition probabilities and proves the existence of such processes. Chapter 9 Markov Processes This lecture begins our study of Markov processes. Section 9.1 is mainly ideological : it formally defines the Markov property for one-parameter processes, and explains why it

More information

Math 180C, Spring Supplement on the Renewal Equation

Math 180C, Spring Supplement on the Renewal Equation Math 18C Spring 218 Supplement on the Renewal Equation. These remarks supplement our text and set down some of the material discussed in my lectures. Unexplained notation is as in the text or in lecture.

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Introduction to Random Diffusions

Introduction to Random Diffusions Introduction to Random Diffusions The main reason to study random diffusions is that this class of processes combines two key features of modern probability theory. On the one hand they are semi-martingales

More information

Some Background Math Notes on Limsups, Sets, and Convexity

Some Background Math Notes on Limsups, Sets, and Convexity EE599 STOCHASTIC NETWORK OPTIMIZATION, MICHAEL J. NEELY, FALL 2008 1 Some Background Math Notes on Limsups, Sets, and Convexity I. LIMITS Let f(t) be a real valued function of time. Suppose f(t) converges

More information

PROBABILITY THEORY II

PROBABILITY THEORY II Ruprecht-Karls-Universität Heidelberg Institut für Angewandte Mathematik Prof. Dr. Jan JOHANNES Outline of the lecture course PROBABILITY THEORY II Summer semester 2016 Preliminary version: April 21, 2016

More information

The Lebesgue Integral

The Lebesgue Integral The Lebesgue Integral Brent Nelson In these notes we give an introduction to the Lebesgue integral, assuming only a knowledge of metric spaces and the iemann integral. For more details see [1, Chapters

More information

The Fundamental Group and Covering Spaces

The Fundamental Group and Covering Spaces Chapter 8 The Fundamental Group and Covering Spaces In the first seven chapters we have dealt with point-set topology. This chapter provides an introduction to algebraic topology. Algebraic topology may

More information

Jump Processes. Richard F. Bass

Jump Processes. Richard F. Bass Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov

More information

1. Stochastic Processes and filtrations

1. Stochastic Processes and filtrations 1. Stochastic Processes and 1. Stoch. pr., A stochastic process (X t ) t T is a collection of random variables on (Ω, F) with values in a measurable space (S, S), i.e., for all t, In our case X t : Ω S

More information

Probability Theory II. Spring 2014 Peter Orbanz

Probability Theory II. Spring 2014 Peter Orbanz Probability Theory II Spring 2014 Peter Orbanz Contents Chapter 1. Martingales, continued 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Notions of convergence for martingales 3 1.3. Uniform

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Do stochastic processes exist?

Do stochastic processes exist? Project 1 Do stochastic processes exist? If you wish to take this course for credit, you should keep a notebook that contains detailed proofs of the results sketched in my handouts. You may consult any

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

Introduction to Topology

Introduction to Topology Introduction to Topology Randall R. Holmes Auburn University Typeset by AMS-TEX Chapter 1. Metric Spaces 1. Definition and Examples. As the course progresses we will need to review some basic notions about

More information

An Introduction to Stochastic Processes in Continuous Time

An Introduction to Stochastic Processes in Continuous Time An Introduction to Stochastic Processes in Continuous Time Flora Spieksma adaptation of the text by Harry van Zanten to be used at your own expense May 22, 212 Contents 1 Stochastic Processes 1 1.1 Introduction......................................

More information

Countable state discrete time Markov Chains

Countable state discrete time Markov Chains Countable state discrete time Markov Chains Tuesday, March 18, 2014 2:12 PM Readings: Lawler Ch. 2 Karlin & Taylor Chs. 2 & 3 Resnick Ch. 1 Countably infinite state spaces are of practical utility in situations

More information

EC 521 MATHEMATICAL METHODS FOR ECONOMICS. Lecture 1: Preliminaries

EC 521 MATHEMATICAL METHODS FOR ECONOMICS. Lecture 1: Preliminaries EC 521 MATHEMATICAL METHODS FOR ECONOMICS Lecture 1: Preliminaries Murat YILMAZ Boğaziçi University In this lecture we provide some basic facts from both Linear Algebra and Real Analysis, which are going

More information

25.1 Ergodicity and Metric Transitivity

25.1 Ergodicity and Metric Transitivity Chapter 25 Ergodicity This lecture explains what it means for a process to be ergodic or metrically transitive, gives a few characterizes of these properties (especially for AMS processes), and deduces

More information

Classical and quantum Markov semigroups

Classical and quantum Markov semigroups Classical and quantum Markov semigroups Alexander Belton Department of Mathematics and Statistics Lancaster University United Kingdom http://www.maths.lancs.ac.uk/~belton/ a.belton@lancaster.ac.uk Young

More information

Alternative Characterizations of Markov Processes

Alternative Characterizations of Markov Processes Chapter 10 Alternative Characterizations of Markov Processes This lecture introduces two ways of characterizing Markov processes other than through their transition probabilities. Section 10.1 describes

More information

Markov Chains, Stochastic Processes, and Matrix Decompositions

Markov Chains, Stochastic Processes, and Matrix Decompositions Markov Chains, Stochastic Processes, and Matrix Decompositions 5 May 2014 Outline 1 Markov Chains Outline 1 Markov Chains 2 Introduction Perron-Frobenius Matrix Decompositions and Markov Chains Spectral

More information

From Stationary Methods to Krylov Subspaces

From Stationary Methods to Krylov Subspaces Week 6: Wednesday, Mar 7 From Stationary Methods to Krylov Subspaces Last time, we discussed stationary methods for the iterative solution of linear systems of equations, which can generally be written

More information

Finite-Horizon Statistics for Markov chains

Finite-Horizon Statistics for Markov chains Analyzing FSDT Markov chains Friday, September 30, 2011 2:03 PM Simulating FSDT Markov chains, as we have said is very straightforward, either by using probability transition matrix or stochastic update

More information

13 The martingale problem

13 The martingale problem 19-3-2012 Notations Ω complete metric space of all continuous functions from [0, + ) to R d endowed with the distance d(ω 1, ω 2 ) = k=1 ω 1 ω 2 C([0,k];H) 2 k (1 + ω 1 ω 2 C([0,k];H) ), ω 1, ω 2 Ω. F

More information

Lecture 2: Convex Sets and Functions

Lecture 2: Convex Sets and Functions Lecture 2: Convex Sets and Functions Hyang-Won Lee Dept. of Internet & Multimedia Eng. Konkuk University Lecture 2 Network Optimization, Fall 2015 1 / 22 Optimization Problems Optimization problems are

More information

Verona Course April Lecture 1. Review of probability

Verona Course April Lecture 1. Review of probability Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is

More information

The Central Limit Theorem: More of the Story

The Central Limit Theorem: More of the Story The Central Limit Theorem: More of the Story Steven Janke November 2015 Steven Janke (Seminar) The Central Limit Theorem:More of the Story November 2015 1 / 33 Central Limit Theorem Theorem (Central Limit

More information

COMBINATORIAL LÉVY PROCESSES

COMBINATORIAL LÉVY PROCESSES COMBINATORIAL LÉVY PROCESSES HARRY CRANE Abstract. Combinatorial Lévy processes evolve on general state spaces of countable combinatorial structures. In this setting, the usual Lévy process properties

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and

More information

4 Countability axioms

4 Countability axioms 4 COUNTABILITY AXIOMS 4 Countability axioms Definition 4.1. Let X be a topological space X is said to be first countable if for any x X, there is a countable basis for the neighborhoods of x. X is said

More information

Vector Calculus, Maths II

Vector Calculus, Maths II Section A Vector Calculus, Maths II REVISION (VECTORS) 1. Position vector of a point P(x, y, z) is given as + y and its magnitude by 2. The scalar components of a vector are its direction ratios, and represent

More information

2.2 Separable Equations

2.2 Separable Equations 2.2 Separable Equations Definition A first-order differential equation that can be written in the form Is said to be separable. Note: the variables of a separable equation can be written as Examples Solve

More information

Math-Stat-491-Fall2014-Notes-V

Math-Stat-491-Fall2014-Notes-V Math-Stat-491-Fall2014-Notes-V Hariharan Narayanan November 18, 2014 Martingales 1 Introduction Martingales were originally introduced into probability theory as a model for fair betting games. Essentially

More information

1 Stochastic Dynamic Programming

1 Stochastic Dynamic Programming 1 Stochastic Dynamic Programming Formally, a stochastic dynamic program has the same components as a deterministic one; the only modification is to the state transition equation. When events in the future

More information

2 Metric Spaces Definitions Exotic Examples... 3

2 Metric Spaces Definitions Exotic Examples... 3 Contents 1 Vector Spaces and Norms 1 2 Metric Spaces 2 2.1 Definitions.......................................... 2 2.2 Exotic Examples...................................... 3 3 Topologies 4 3.1 Open Sets..........................................

More information

Translation Invariant Exclusion Processes (Book in Progress)

Translation Invariant Exclusion Processes (Book in Progress) Translation Invariant Exclusion Processes (Book in Progress) c 2003 Timo Seppäläinen Department of Mathematics University of Wisconsin Madison, WI 53706-1388 December 11, 2008 Contents PART I Preliminaries

More information

Weak convergence and large deviation theory

Weak convergence and large deviation theory First Prev Next Go To Go Back Full Screen Close Quit 1 Weak convergence and large deviation theory Large deviation principle Convergence in distribution The Bryc-Varadhan theorem Tightness and Prohorov

More information

Independent random variables

Independent random variables CHAPTER 2 Independent random variables 2.1. Product measures Definition 2.1. Let µ i be measures on (Ω i,f i ), 1 i n. Let F F 1... F n be the sigma algebra of subsets of Ω : Ω 1... Ω n generated by all

More information