Reflected Brownian Motion
Chapter 6

Reflected Brownian Motion

Often we encounter diffusions in regions with a boundary. If the process can reach the boundary from the interior in finite time with positive probability, we need to decide what we want to do afterwards. If allowed to continue, the process may not remain inside the domain, or even inside the closed domain including the boundary. There are some choices. We could freeze it at the point where it exited, or define some motion on the boundary for it to follow. Is there a way to force it to return to the interior while still remaining a Markov process with continuous trajectories?

Reflected Brownian motion on the half line $[0,\infty)$ is a way of keeping Brownian motion in the half line $[0,\infty)$. It can be defined as the unique process $P_x$ supported on $\Omega = C[[0,\infty);[0,\infty)]$ with the following properties:

1. It starts from $x(0)=x$, i.e.
$$P_x[x(0)=x]=1. \tag{6.1}$$

2. It behaves locally like Brownian motion on $(0,\infty)$. This is formulated in the following manner. If $\tau$ is any stopping time and $\sigma=\inf\{t: t\ge\tau,\ x(t)=0\}$ is the hitting time of $0$ after $\tau$, then the conditional probability distribution of $P_x$ given $\mathcal F_\tau$ agrees with $Q_{x(\tau)}$, the distribution of Brownian motion starting at $x(\tau)$, on the $\sigma$-field $\mathcal F_{\tau(\omega)}^{\sigma}$.

3. Finally, the time spent at the boundary is $0$; i.e., for any $t$,
$$E^{P_x}\Big[\int_0^t \mathbf 1_{\{0\}}(x(s))\,ds\Big]=0. \tag{6.2}$$

**Lemma 6.1.** Behaving locally like Brownian motion is equivalent to the following condition:
$$f(x(t))-f(x(0))-\frac12\int_0^t f_{xx}(x(s))\,ds \tag{6.3}$$
is a martingale for any bounded smooth function $f$ that is constant (which can be taken to be $0$ without loss of generality) in some neighborhood of $0$.

*Proof.* The equivalence is easy to establish. Let us take $\epsilon>0$ and consider the exit time $\tau_\epsilon$ from $(\epsilon,\infty)$. Any smooth function on $[\epsilon,\infty)$ can be extended to $[0,\infty)$ in such a way that it has compact support in $(0,\infty)$. Therefore the martingale property (6.3) holds for arbitrary smooth bounded $f$ until $\tau_\epsilon$. Hence $P_x$ agrees with $Q_x$ on $\mathcal F_{\tau_\epsilon}$. But $\epsilon>0$ is arbitrary and we can let it go to $0$; $\mathcal F_{\tau_\epsilon}\uparrow\mathcal F_{\tau_0}$ as $\epsilon\downarrow 0$.

In the other direction, if $f$ is supported on $[a,\infty)$ we can consider the sequence of stopping times defined successively, starting from $\tau_0=0$, by
$$\tau_{2j+1}=\inf\{t: t\ge\tau_{2j},\ x(t)=0\}\quad\text{for } j\ge 0,\qquad
\tau_{2j}=\inf\{t: t\ge\tau_{2j-1},\ x(t)=a\}\quad\text{for } j\ge 1.$$
Clearly
$$Z(t)=f(x(t))-f(x(0))-\frac12\int_0^t f_{xx}(x(s))\,ds$$
is a martingale on each $[\tau_{2j},\tau_{2j+1}]$ and is constant on each $[\tau_{2j+1},\tau_{2j+2}]$, where $x(t)\in[0,a]$. It is now easy to verify that $Z(t)$ is a martingale: just write, for $s<t$,
$$Z(t)-Z(s)=\sum_j\big[Z((\tau_{j+1}\vee s)\wedge t)-Z((\tau_j\vee s)\wedge t)\big].$$

**Theorem 6.2.** There is a unique $P_x$ satisfying (6.1), (6.2) and (6.3). It can also be characterized by replacing (6.2) and (6.3) by the following: if $f$ is a smooth bounded function satisfying $f'(0)\ge 0$, then
$$Z_f(t)=f(x(t))-f(x(0))-\frac12\int_0^t f_{xx}(x(s))\,ds \tag{6.4}$$
is a sub-martingale with respect to $(\Omega,\mathcal F_t,P_x)$. In particular, if $f'(0)=0$, then $Z_f(t)$ is a martingale. If $\beta(t)$ is the ordinary Brownian motion starting from $x$, then $P_x$ is the distribution of $x(t)=|\beta(t)|$ and is a Markov process with transition probability density, for $t>0$ and $x,y\ge 0$,
$$p(t,x,y)=\frac{1}{\sqrt{2\pi t}}\Big[e^{-\frac{(y-x)^2}{2t}}+e^{-\frac{(y+x)^2}{2t}}\Big].$$

**Remark 6.1.** In both (6.3) and (6.4) the condition that $f$ is bounded can be dropped and replaced by $f$ having polynomial growth. If $f$ is $x^{2n}$ for large $x$, then $f_{xx}$ is $c_n x^{2n-2}$. If $n=1$,
$$E[[x(t)]^2]=E[[x(0)]^2]+\int_0^t ds.$$
This is easily justified by standard methods of approximating $x^2$ by bounded functions below it. By induction on $n$, one can show in a similar fashion that
$$E[[x(t)]^{2n}]=E[[x(0)]^{2n}]+n(2n-1)\int_0^t E[[x(s)]^{2n-2}]\,ds$$
and deduce bounds of the form $E[[x(t)]^{2n}]\le C_n[x^{2n}+t^n]$ for all $n$. Then the martingale property extends to smooth functions with polynomial growth.

*Proof.* First let us establish the equivalence between (6.2)-(6.3) and (6.4). Given a function $f(x)$ with $f'(0)=0$, we can approximate it with functions that are constant near $0$: let $f^{(n)}(x)=f(\frac1n)$ for $x\le\frac1n$ and $f^{(n)}(x)=f(x\vee\frac1n)=f(x)$ for $x\ge\frac1n$. It is then easy to check that
$$f^{(n)}(x(t))-f^{(n)}(x(0))-\frac12\int_0^t f^{(n)}_{xx}(x(s))\,ds$$
is a martingale. Let $n\to\infty$; then $f^{(n)}\to f$ and $f^{(n)}_{xx}\to f_{xx}$ except at $0$. But $f^{(n)}_{xx}$ is uniformly bounded, and because of (6.2) we obtain (6.4) for functions with $f'(0)=0$.

Conversely, if $Z_f(t)$ is a martingale whenever $f'(0)=0$, we can then show that (6.2) holds. Consider the function
$$f_\epsilon(x)=\begin{cases} x^2 & \text{if } x\le\epsilon\\ 2\epsilon x-\epsilon^2 & \text{if } x\ge\epsilon \end{cases} \tag{6.5}$$
It is piecewise smooth, with matching first derivatives. Therefore
$$E[f_\epsilon(x(t))]-f_\epsilon(x)=\frac12\,E\Big[\int_0^t f''_\epsilon(x(s))\,ds\Big]=E\Big[\int_0^t \mathbf 1_{[0,\epsilon]}(x(s))\,ds\Big].$$
Since $f_\epsilon(x)\le 2\epsilon x$ it follows that
$$\lim_{\epsilon\to 0} E\Big[\int_0^t \mathbf 1_{[0,\epsilon]}(x(s))\,ds\Big]=0,$$
proving (6.2).

To prove the sub-martingale property, let $\varphi(x)$ be a smooth, bounded function that is identically equal to $x$ in some interval $[0,\delta]$. Then $g(x)=f(x)-f'(0)\varphi(x)$ satisfies $g'(0)=0$ and therefore $Z_g$ is a martingale. It is then enough to prove that $Z_\varphi(t)$ is a sub-martingale. We repeat an argument used earlier in the proof of Lemma 6.1, writing
$$Z_\varphi(t)=\sum_j\big[Z_\varphi(\tau_{j+1}\wedge t)-Z_\varphi(\tau_j\wedge t)\big]$$
with $a=\delta$. The terms with even $j$ are martingales, and as for the terms with odd $j$, $\varphi(x(\tau_{j+1}\wedge t))\ge\varphi(x(\tau_j\wedge t))$. An easy calculation now shows that
$$E^{P_x}[Z_\varphi(t)-Z_\varphi(s)\mid\mathcal F_s]\ge 0.$$
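The explicit transition density stated in Theorem 6.2 lends itself to a quick numerical sanity check: since $x(t)=|\beta(t)|$ with $\beta(0)=x$, the distribution function $P_x(x(t)\le y)=\Phi\big(\frac{y-x}{\sqrt t}\big)-\Phi\big(\frac{-y-x}{\sqrt t}\big)$ is exactly the integral of $p(t,x,\cdot)$ over $[0,y]$. The following sketch (not part of the notes; the sample sizes are illustrative) compares this with a Monte Carlo estimate:

```python
import numpy as np
from math import erf, sqrt

# x(t) = |beta(t)| with beta(0) = x, so P_x(x(t) <= y) is a difference of
# normal CDFs -- the integral of the reflected density p(t, x, .) over [0, y].
rng = np.random.default_rng(0)
t, x, n = 1.0, 0.5, 200_000

beta_t = x + sqrt(t) * rng.standard_normal(n)      # Brownian motion at time t
x_t = np.abs(beta_t)                               # reflected value

Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF
for y in (0.25, 0.5, 1.0, 2.0):
    empirical = np.mean(x_t <= y)
    exact = Phi((y - x) / sqrt(t)) - Phi((-y - x) / sqrt(t))
    assert abs(empirical - exact) < 0.01
```

The agreement within Monte Carlo error is just the reflection principle in disguise: the two Gaussian terms in $p(t,x,y)$ collect the paths ending near $y$ and near $-y$.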
Finally we identify any $P_x$ satisfying (6.1), (6.2) and (6.3), or (6.4), as the distribution of $x(t)=|\beta(t)|$ with $\beta$ starting from $\beta(0)=x$. Indeed, $p(t,x,y)$ is clearly the transition probability density of $|\beta(t)|$, which is Markov because Brownian motion has the $x\to -x$ symmetry. The function $u(t,x)=\int_0^\infty p(t,x,y)f(y)\,dy$ satisfies
$$u_t=\tfrac12 u_{xx};\qquad u(0,x)=f(x),\qquad u_x(t,0)=0.$$
Just as in the case of diffusions without boundary, if $f(x(t))-f(x(0))-\frac12\int_0^t f_{xx}(x(s))\,ds$ is a martingale for all smooth $f$ satisfying $f'(0)=0$, then it follows that for all smooth functions $v=v(s,x)$ satisfying $v_x(s,0)=0$ for all $s$,
$$v(t,x(t))-v(0,x(0))-\int_0^t\big[v_s(s,x(s))+\tfrac12 v_{xx}(s,x(s))\big]\,ds$$
is a martingale. Therefore, for fixed $T$, $u(T-t,x(t))$ is a martingale under $P_x$ and $E^{P_x}[f(x(T))]=u(T,x)$. This identifies the marginal distributions of $P_x$ as those of $|\beta(\cdot)|$. A regular conditioning argument allows us to complete the identification.

The reflected Brownian motion can also be defined in terms of the ordinary Brownian motion $\beta(t)$ as a solution of the equation
$$x(t)=x+\beta(t)+A(t)$$
where $A(t)$ is continuous and increasing, but is allowed to increase only when $x(s)=0$, i.e.
$$\int_{\{s:\,x(s)>0\}}dA(s)=0.$$
Such a solution is unique. If $x_i(t)$, $i=1,2$, are two solutions,
$$d[x_1(t)-x_2(t)]^2=2(x_1(t)-x_2(t))\,d(A_1(t)-A_2(t))=-2x_1(t)\,dA_2(t)-2x_2(t)\,dA_1(t)\le 0.$$
But $x_1(0)=x_2(0)$, which implies $x_1(t)\equiv x_2(t)$ for all $t$. Existence is easy:
$$x(t)=x+\beta(t)-\inf_{0\le s\le t}\big[(x+\beta(s))\wedge 0\big]$$
is clearly non-negative, and $A(t)=-\inf_{0\le s\le t}[(x+\beta(s))\wedge 0]$ is increasing, and does so only when $x+\beta(t)=-A(t)$, i.e. $x(t)=0$. The reflected Brownian motion thus comes with a representation
$$x(t)=\beta(t)+A(t)$$
where $A(t)$ is a local time at $0$: almost surely continuous in $t$, acting as an infinitely strong drift when $x(t)=0$, which is a set of times of Lebesgue measure $0$. This leads to an Itô formula of the form
$$du(t,x(t))=u_t(t,x(t))\,dt+u_x(t,x(t))\,dx(t)+\tfrac12 u_{xx}(t,x(t))\,dt
=u_t(t,x(t))\,dt+u_x(t,x(t))\,d\beta(t)+u_x(t,0)\,dA(t)+\tfrac12 u_{xx}(t,x(t))\,dt,$$
which is a way of seeing the effect of the boundary condition satisfied by $u$.

There is a way of slowing down the process at the boundary. Let
$$\sigma(s)=\rho A(s)+s.$$
Define an increasing family of stopping times $\{\tau_t\}$ by $\sigma(\tau_t)=t$, i.e. $\tau_{\sigma(s)}=s$, which has a unique solution that is strictly increasing and continuous in $t$. Since $ds$ and $dA(s)$ are orthogonal ($A(s)$ increases only when $x(s)=0$, which is a set of times of Lebesgue measure $0$), we have
$$dA(s)=\frac1\rho\,\mathbf 1_{\{0\}}(x(s))\,d\sigma(s)
\qquad\text{and}\qquad
ds=\mathbf 1_{(0,\infty)}(x(s))\,d\sigma(s).$$
We can now define a new process
$$y(t)=x(\tau_t).$$
We will identify the martingales that go with this process. By Doob's optional stopping theorem,
$$\begin{aligned}
Z_f(t)&=f(x(\tau_t))-f(x(0))-\frac12\int_0^{\tau_t} f''(x(s))\,ds-f'(0)A(\tau_t)\\
&=f(x(\tau_t))-f(x(0))-\frac12\int_0^{\tau_t} f''(x(s))\,\mathbf 1_{(0,\infty)}(x(s))\,d\sigma(s)-\frac1\rho f'(0)\int_0^{\tau_t}\mathbf 1_{\{0\}}(x(s))\,d\sigma(s)\\
&=f(y(t))-f(y(0))-\frac12\int_0^t f''(y(s))\,\mathbf 1_{(0,\infty)}(y(s))\,ds-\frac1\rho f'(0)\int_0^t\mathbf 1_{\{0\}}(y(s))\,ds
\end{aligned}$$
is a martingale. The new process is then a Markov process with generator
$$(Lf)(x)=\begin{cases}\frac12 f''(x) & \text{if } x>0\\[2pt] \frac1\rho f'(0) & \text{if } x=0\end{cases}$$
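The Skorokhod construction $x(t)=x+\beta(t)+A(t)$ above discretizes directly, and its defining properties can be checked on the discrete paths. A minimal sketch (illustrative parameters, not part of the notes):

```python
import numpy as np

# Discretized Skorokhod construction: A(t) = -inf_{s<=t}[(x0 + beta(s)) ^ 0]
# increases only on {x = 0}, and x = beta + A stays non-negative.
rng = np.random.default_rng(1)
x0, T, n_steps, n_paths = 0.3, 1.0, 1_000, 5_000
dt = T / n_steps

db = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
beta = x0 + np.cumsum(db, axis=1)                        # Brownian paths from x0
A = -np.minimum.accumulate(np.minimum(beta, 0.0), axis=1)
x = beta + A                                             # reflected paths

assert (x >= -1e-12).all()                               # stays in [0, infinity)
dA = np.diff(A, axis=1)                                  # A grows only where x = 0
assert np.all(x[:, 1:][dA > 0] == 0.0)
# marginal law at T agrees with |beta(T)| up to discretization error
assert abs(x[:, -1].mean() - np.abs(beta[:, -1]).mean()) < 0.05
```

The second assertion is the discrete version of $\int_{\{x(s)>0\}}dA(s)=0$: whenever the running minimum is renewed, the reflected path sits exactly at $0$.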
We can calculate the Laplace transform in $t$ of the probability $P_x[x(t)=0]$. Denoting $\frac1\rho$ by $\gamma$, this requires us to solve the resolvent equation
$$\lambda\psi(x)-\tfrac12\psi''(x)=0\ \text{ for } x>0,\qquad \lambda\psi(0)-\gamma\psi'(0)=1,$$
which can be done explicitly to give
$$\int_0^\infty e^{-\lambda t}\,p(t,x,\{0\})\,dt=\frac{e^{-\sqrt{2\lambda}\,x}}{\lambda+\gamma\sqrt{2\lambda}}.$$
In particular $p(t,0,\{0\})\to 1$ as $t\to 0$, but $1-p(t,0,\{0\})$ does not go to $0$ linearly in $t$: $\{0\}$ is an instantaneous state. As $\gamma\to\infty$ this becomes the reflected process and as $\gamma\to 0$ the absorbed process. For the characterization of these processes $\{P_{\gamma,x}\}$ through martingales we have the following theorem.

**Theorem 6.3.** The process $P_{\gamma,x}$ is characterized uniquely as the process with respect to which
$$u(t,x(t))-u(0,x(0))-\int_0^t\big[u_s(s,x(s))+\tfrac12 u_{xx}(s,x(s))\big]\,\mathbf 1_{(0,\infty)}(x(s))\,ds \tag{6.6}$$
is a sub-martingale provided $(u_t+\gamma u_x)(t,0)\ge 0$ for all $t$.

*Proof.* First let us prove that for any smooth bounded $u(t,x)$,
$$u(t,x(t))-u(0,x(0))-\int_0^t u_s(s,x(s))\,ds-\gamma\int_0^t u_x(s,0)\,\mathbf 1_{\{0\}}(x(s))\,ds-\frac12\int_0^t u_{xx}(s,x(s))\,\mathbf 1_{(0,\infty)}(x(s))\,ds$$
is a martingale. We saw earlier that the time-changed process $P_x$ has the property that
$$f(x(t))-f(x(0))-\int_0^t\Big[\gamma f'(0)\,\mathbf 1_{\{0\}}(x(s))+\tfrac12 f_{xx}(x(s))\,\mathbf 1_{(0,\infty)}(x(s))\Big]\,ds$$
is a martingale for smooth bounded functions. It follows from this that for smooth functions $u=u(t,x)$,
$$u(t,x(t))-u(0,x(0))-\int_0^t u_s(s,x(s))\,ds-\int_0^t\Big[\gamma u_x(s,x(s))\,\mathbf 1_{\{0\}}(x(s))+\tfrac12 u_{xx}(s,x(s))\,\mathbf 1_{(0,\infty)}(x(s))\Big]\,ds$$
is a martingale. If $u_s(s,0)+\gamma u_x(s,0)=0$ for all $s$, we can replace $\int_0^t u_s(s,x(s))\,ds$ by
$$\int_0^t u_s(s,0)\,\mathbf 1_{\{0\}}(x(s))\,ds+\int_0^t u_s(s,x(s))\,\mathbf 1_{(0,\infty)}(x(s))\,ds
=-\gamma\int_0^t u_x(s,0)\,\mathbf 1_{\{0\}}(x(s))\,ds+\int_0^t u_s(s,x(s))\,\mathbf 1_{(0,\infty)}(x(s))\,ds,$$
proving that the expression (6.6) is a martingale. On the other hand, if $u_s(s,0)+\gamma u_x(s,0)=h(s)\ge 0$, then $v(s,x)=u(s,x)-H(s)$ with $H'(s)=h(s)$ satisfies the boundary condition $v_s(s,0)+\gamma v_x(s,0)=0$, and
$$v(t,x(t))-v(0,x(0))-\int_0^t\big[v_s(s,x(s))+\tfrac12 v_{xx}(s,x(s))\big]\,\mathbf 1_{(0,\infty)}(x(s))\,ds$$
is a martingale. Substituting $v(s,x)=u(s,x)-H(s)$ with $H(s)$ nondecreasing, we obtain the result.

Finally we will identify the processes $P_{\gamma,x}$ through their associated ODEs. Consider the solution $g(x)$ of
$$\lambda g(x)-\tfrac12 g''(x)=f(x)\ \text{ for } x>0,\qquad \lambda g(0)-\gamma g'(0)=a.$$
Then with $u(t,x)=e^{-\lambda t}g(x)$ we see that
$$e^{-\lambda t}g(x(t))-g(x(0))+\int_0^t e^{-\lambda s}\big[f(x(s))\,\mathbf 1_{(0,\infty)}(x(s))+a\,\mathbf 1_{\{0\}}(x(s))\big]\,ds$$
is a martingale. Equating expectations at $t=0$ and $t=\infty$ we identify
$$g(x)=E_x\Big[\int_0^\infty e^{-\lambda s}\big[f(x(s))\,\mathbf 1_{(0,\infty)}(x(s))+a\,\mathbf 1_{\{0\}}(x(s))\big]\,ds\Big].$$

The situation corresponding to $\frac12 a(x)D_x^2+b(x)D_x$ is similar so long as $a(x)\ge c>0$. The Girsanov formula works, so $b$ can be taken care of, and one can do a random time change as well, reducing the problem to $a=1$, $b=0$.

**Markov chain approximations.** We have a transition probability $\pi_h(x,dy)$. We assume that for any $\delta>0$,
$$\sup_x \pi_h\big(x,\{y:|x-y|\ge\delta\}\big)=o(h)$$
uniformly in $x$. This implies immediately that the processes $P_{h,x}$ of piecewise constant (on intervals of size $h$) versions of the Markov chain are compact in $D[[0,T];\mathbb R]$. We impose conditions on the behavior of
$$a_h(x)=\frac1h\int_{|y-x|\le\delta}(y-x)^2\,\pi_h(x,dy)
\qquad\text{and}\qquad
b_h(x)=\frac1h\int_{|y-x|\le\delta}(y-x)\,\pi_h(x,dy)$$
in order to determine any limit. We assume that
$$\sup_{0<h,\ 0\le x\le l} a_h(x)\le C(l)<\infty,\qquad \inf_{0<h,\ 0\le x\le l} b_h(x)\ge -C(l),$$
and, uniformly on compact subsets of $(0,\infty)$,
$$\lim_{h\to 0} a_h(x)=1,\qquad \lim_{h\to 0} b_h(x)=0.$$
For smooth functions $f$ with compact support in $(0,\infty)$, by a Taylor expansion,
$$\lim_{h\to 0}\frac1h\int[f(y)-f(x)]\,\pi_h(x,dy)=\frac12 f''(x)$$
locally uniformly. Any limit is therefore Brownian motion away from $0$; we need only determine what happens at $0$. The parameter $\gamma$ is to be determined from the behavior of $a_h(x)$, $b_h(x)$ as $x\to 0$, $h\to 0$.

**Theorem 6.4.**
1. If
$$\liminf_{h\to 0,\ x\to 0} a_h(x)\ge c>0,$$
then the limit exists and is the reflected Brownian motion, i.e. $\gamma=\infty$.
2. If $b_h(x)\to+\infty$ whenever $x\to 0$, $h\to 0$ and $a_h(x)\to 0$, then again the limit is the reflected Brownian motion.
3. If $b_h(x)$ remains uniformly bounded as $h\to 0$, $x\to 0$, and tends to $\gamma$ whenever $h\to 0$, $x\to 0$ and $a_h(x)\to 0$, then the limit exists and is the process with generator
$$(L_\gamma f)(x)=\tfrac12\,\mathbf 1_{(0,\infty)}(x)\,f''(x)+\gamma\,\mathbf 1_{\{0\}}(x)\,f'(0).$$

*Proof.* For proving 1 or 2, we need to show that for any limit $P$ of $P_{h,x}$ as $h\to 0$, for some $l>0$,
$$E^P\Big[\int_0^{\tau_l\wedge t}\mathbf 1_{\{0\}}(x(s))\,ds\Big]=0,$$
where $\tau_l$ is the exit time from $[0,l]$. Take $l=1$. By a Taylor expansion we see that
$$\int[f(y)-f(x)]\,\pi_h(x,dy)=h\,b_h(x)f'(x)+\frac h2\,a_h(x)f''(x)+o(h).$$
In particular, if we consider $f=f_\epsilon(x)$ as defined in (6.5), it is uniformly bounded by $2\epsilon$ on $[0,1]$, and
$$\int[f(y)-f(x)]\,\pi_h(x,dy)=h\Big[b_h(x)f'(x)+\frac12 a_h(x)f''(x)\Big]+o(h)=h\,(L_h f)(x)+o(h)$$
uniformly on $[0,1]$, where
$$(L_h f)(x)=b_h(x)f'(x)+\tfrac12 a_h(x)f''(x)\ \ge\ c\,\mathbf 1_{[0,\epsilon]}(x)-2C(1)\epsilon.$$
Since
$$f(x(nh))-f(x(0))-\sum_{j=0}^{n-1}\int[f(y)-f(x(jh))]\,\pi_h(x(jh),dy)$$
is a martingale, we obtain in the limit that
$$f(x(t\wedge\tau_1))-f(x(0))-\int_0^{t\wedge\tau_1}\big[c\,\mathbf 1_{[0,\epsilon]}(x(s))-2C(1)\epsilon\big]\,ds$$
is a sub-martingale. This provides an estimate
$$E^P\Big[\int_0^{t\wedge\tau_1}\mathbf 1_{[0,\epsilon]}(x(s))\,ds\Big]\le K\epsilon(t+1),$$
which proves 1.

To prove 2 we need a different test function $\varphi$. Take $\varphi(x)=f_\epsilon(x)+\delta x$ on $[0,1]$. Then on $[0,1]$, $\varphi$ and $\varphi'$ are bounded by $C(\epsilon+\delta)$, and for some $L_\theta\to\infty$ as $\theta\to 0$,
$$\liminf_{h\to 0}\Big[b_h(x)\varphi'(x)+\frac12 a_h(x)\varphi''(x)\Big]\ \ge\ (L_\theta\delta-\theta)\,\mathbf 1_{[0,\epsilon]}(x)-C(\epsilon+\delta).$$
This provides, for the limiting $P$, the estimate
$$E^P\Big[\int_0^{t\wedge\tau_1}\big[L_\theta\delta-\theta\big]\,\mathbf 1_{[0,\epsilon]}(x(s))\,ds\Big]\le C(\epsilon+\delta),$$
and this is enough. Let $\epsilon\to 0$ to get
$$\limsup_{\epsilon\to 0}\ E^P\Big[\int_0^{t\wedge\tau_1}\mathbf 1_{[0,\epsilon]}(x(s))\,ds\Big]\le \frac{C\,\delta}{L_\theta\delta-\theta},$$
and now pick $\theta=\delta$ and let $\delta\to 0$.

Finally we turn to 3. We need to consider, for smooth $u(s,x)$ with $u_s(s,0)+\gamma u_x(s,0)\ge 0$,
$$\int[u(s+h,y)-u(s,x)]\,\pi_h(x,dy)$$
and determine its sign when $x$ is close to $0$ and $h$ is small. We see that this is nearly
$$h\Big[u_s(s,x)+b_h(x)u_x(s,x)+\frac12 a_h(x)u_{xx}(s,x)\Big].$$
From the earlier estimate we know that the chain will not spend much time close to $0$ where $a_h(x)$ is not small; and where $a_h(x)$ is small, $b_h(x)$ is close to $\gamma$. Since $u_s(s,0)+\gamma u_x(s,0)\ge 0$, this contribution is nonnegative.
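Cases 1 and 3 of the theorem above on Markov chain approximations can both be illustrated with lattice walks of step $\pm\sqrt h$. The sketch below is my own illustration, not from the notes: the walk reflected at $0$ has $a_h(x)\equiv 1$ (case 1, limit reflected Brownian motion), while a chain that leaves $0$ only with probability $\gamma\sqrt h$ has $b_h(0)=\gamma$ and $a_h(0)=\gamma\sqrt h\to 0$ (case 3, a sticky boundary):

```python
import numpy as np

rng = np.random.default_rng(7)

# Case 1: +-sqrt(h) walk reflected at 0, so a_h(x) = 1 for every x.
# The limit should be reflected BM; check E[x(1)] against E|N(0,1)| = sqrt(2/pi).
h = 1e-4
n_steps, n_paths = int(1.0 / h), 10_000
k = np.zeros(n_paths, dtype=np.int64)       # position in units of sqrt(h)
for _ in range(n_steps):
    k = np.abs(k + 2 * rng.integers(0, 2, n_paths) - 1)   # reflect at 0
x1 = k * np.sqrt(h)
assert abs(x1.mean() - np.sqrt(2.0 / np.pi)) < 0.03

# Case 3: same walk away from 0, but the chain exits 0 to sqrt(h) only with
# probability gamma*sqrt(h) (a scaling chosen to match case 3), so that
# b_h(0) = gamma while a_h(0) = gamma*sqrt(h) -> 0.
gamma, h3 = 2.0, 1e-3
sqh3 = np.sqrt(h3)
p0 = gamma * sqh3                           # one-step exit probability from 0
assert abs(p0 * sqh3 / h3 - gamma) < 1e-12  # b_h(0) = gamma exactly

n_steps, n_paths = 20_000, 2_000
k = np.zeros(n_paths, dtype=np.int64)
time_at_0 = 0.0
for _ in range(n_steps):
    at0 = (k == 0)
    time_at_0 += at0.mean() * h3            # occupation time of {0} per path
    u = rng.random(n_paths)
    k = k + np.where(at0, (u < p0).astype(np.int64), np.where(u < 0.5, -1, 1))
assert (k >= 0).all()
assert time_at_0 > 10 * h3                  # sticky chain: real time spent at 0
```

The contrast matches the dichotomy in the proof: in case 1 the occupation of $[0,\epsilon]$ is $O(\epsilon)$ and vanishes in the limit, while in case 3 the limiting process charges $\{0\}$ with positive Lebesgue time, as the Laplace transform $1/(\lambda+\gamma\sqrt{2\lambda})$ predicts.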
GEORGIAN MATHEMATICAL JOURNAL: Vol. 3, No. 2, 1996, 153-176 THE SKOROKHOD OBLIQUE REFLECTION PROBLEM IN A CONVEX POLYHEDRON M. SHASHIASHVILI Abstract. The Skorokhod oblique reflection problem is studied
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More informationConvergence of Feller Processes
Chapter 15 Convergence of Feller Processes This chapter looks at the convergence of sequences of Feller processes to a iting process. Section 15.1 lays some ground work concerning weak convergence of processes
More informationON THE OPTIMAL STOPPING PROBLEM FOR ONE DIMENSIONAL DIFFUSIONS
ON THE OPTIMAL STOPPING PROBLEM FOR ONE DIMENSIONAL DIFFUSIONS SAVAS DAYANIK Department of Operations Research and Financial Engineering and the Bendheim Center for Finance Princeton University, Princeton,
More informationA Concise Course on Stochastic Partial Differential Equations
A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original
More informationAsymptotic statistics using the Functional Delta Method
Quantiles, Order Statistics and L-Statsitics TU Kaiserslautern 15. Februar 2015 Motivation Functional The delta method introduced in chapter 3 is an useful technique to turn the weak convergence of random
More informationPartial Differential Equations with Applications to Finance Seminar 1: Proving and applying Dynkin s formula
Partial Differential Equations with Applications to Finance Seminar 1: Proving and applying Dynkin s formula Group 4: Bertan Yilmaz, Richard Oti-Aboagye and Di Liu May, 15 Chapter 1 Proving Dynkin s formula
More informationContents. 1. Introduction
FUNDAMENTAL THEOREM OF THE LOCAL THEORY OF CURVES KAIXIN WANG Abstract. In this expository paper, we present the fundamental theorem of the local theory of curves along with a detailed proof. We first
More informationOn the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem
On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family
More information3. Identify and find the general solution of each of the following first order differential equations.
Final Exam MATH 33, Sample Questions. Fall 7. y = Cx 3 3 is the general solution of a differential equation. Find the equation. Answer: y = 3y + 9 xy. y = C x + C x is the general solution of a differential
More informationLecture 4: Numerical solution of ordinary differential equations
Lecture 4: Numerical solution of ordinary differential equations Department of Mathematics, ETH Zürich General explicit one-step method: Consistency; Stability; Convergence. High-order methods: Taylor
More informationA Note on the Differential Equations with Distributional Coefficients
MATEMATIKA, 24, Jilid 2, Bil. 2, hlm. 115 124 c Jabatan Matematik, UTM. A Note on the Differential Equations with Distributional Coefficients Adem Kilicman Department of Mathematics, Institute for Mathematical
More informationPath Decomposition of Markov Processes. Götz Kersting. University of Frankfurt/Main
Path Decomposition of Markov Processes Götz Kersting University of Frankfurt/Main joint work with Kaya Memisoglu, Jim Pitman 1 A Brownian path with positive drift 50 40 30 20 10 0 0 200 400 600 800 1000-10
More informationKrzysztof Burdzy Robert Ho lyst Peter March
A FLEMING-VIOT PARTICLE REPRESENTATION OF THE DIRICHLET LAPLACIAN Krzysztof Burdzy Robert Ho lyst Peter March Abstract: We consider a model with a large number N of particles which move according to independent
More informationThe Pedestrian s Guide to Local Time
The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments
More informationLEGENDRE POLYNOMIALS AND APPLICATIONS. We construct Legendre polynomials and apply them to solve Dirichlet problems in spherical coordinates.
LEGENDRE POLYNOMIALS AND APPLICATIONS We construct Legendre polynomials and apply them to solve Dirichlet problems in spherical coordinates.. Legendre equation: series solutions The Legendre equation is
More informationMath 321 Final Examination April 1995 Notation used in this exam: N. (1) S N (f,x) = f(t)e int dt e inx.
Math 321 Final Examination April 1995 Notation used in this exam: N 1 π (1) S N (f,x) = f(t)e int dt e inx. 2π n= N π (2) C(X, R) is the space of bounded real-valued functions on the metric space X, equipped
More informationWiener Measure and Brownian Motion
Chapter 16 Wiener Measure and Brownian Motion Diffusion of particles is a product of their apparently random motion. The density u(t, x) of diffusing particles satisfies the diffusion equation (16.1) u
More informationOrdinary Differential Equation Theory
Part I Ordinary Differential Equation Theory 1 Introductory Theory An n th order ODE for y = y(t) has the form Usually it can be written F (t, y, y,.., y (n) ) = y (n) = f(t, y, y,.., y (n 1) ) (Implicit
More informationIn class midterm Exam - Answer key
Fall 2013 In class midterm Exam - Answer key ARE211 Problem 1 (20 points). Metrics: Let B be the set of all sequences x = (x 1,x 2,...). Define d(x,y) = sup{ x i y i : i = 1,2,...}. a) Prove that d is
More informationPartial Derivatives. w = f(x, y, z).
Partial Derivatives 1 Functions of Several Variables So far we have focused our attention of functions of one variable. These functions model situations in which a variable depends on another independent
More informationMathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )
Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca November 26th, 2014 E. Tanré (INRIA - Team Tosca) Mathematical
More informationLecture 21 Representations of Martingales
Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let
More informationLecture 7. 1 Notations. Tel Aviv University Spring 2011
Random Walks and Brownian Motion Tel Aviv University Spring 2011 Lecture date: Apr 11, 2011 Lecture 7 Instructor: Ron Peled Scribe: Yoav Ram The following lecture (and the next one) will be an introduction
More informationODE Homework 1. Due Wed. 19 August 2009; At the beginning of the class
ODE Homework Due Wed. 9 August 2009; At the beginning of the class. (a) Solve Lẏ + Ry = E sin(ωt) with y(0) = k () L, R, E, ω are positive constants. (b) What is the limit of the solution as ω 0? (c) Is
More information