PROBABILITY: LIMIT THEOREMS II, SPRING 2018 HOMEWORK PROBLEMS


PROBABILITY: LIMIT THEOREMS II, SPRING 2018. HOMEWORK PROBLEMS

PROF. YURI BAKHTIN

Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please give complete solutions; all claims need to be justified. Late homework will not be accepted. Please let me know if you find any misprints or mistakes.

1. Due by Feb 21, 11:00am

1. Let (X_n)_{n∈N} be an i.i.d. positive sequence, and S_n = X_1 + ... + X_n. Let N_t = sup{n : S_n ≤ t}. Prove that (N_t)_{t∈R_+} is a stochastic process.

2. Prove that every Borel set B in R^d is regular, i.e., for every Borel probability measure µ and every ε > 0, there are a compact set K and an open set U such that K ⊂ B ⊂ U and µ(U \ K) < ε.

3. Prove that the cylinders C(t_1, ..., t_n, B), B ∈ B(R^n), t_1, ..., t_n ∈ T, n ∈ N, form an algebra.

4. We can define the cylindrical σ-algebra as the σ-algebra generated by elementary cylinders or by cylinders. Prove that these definitions are equivalent.

5. Let F_{T'} = σ{C(t_1, ..., t_n, B) : t_1, ..., t_n ∈ T'} for T' ⊂ T. Prove that

B(R^T) = ∪_{countable T' ⊂ T} F_{T'}.

6. Use characteristic functions to prove the existence of a Wiener process (up to continuity of paths).

7. Let (X_t)_{t∈[0,1]} be an (uncountable) family of i.i.d. r.v.'s with nondegenerate distribution. Prove that no modification of this process can be continuous.

8. A multidimensional version of the Kolmogorov–Chentsov theorem. Suppose d ≥ 1, and there is a stochastic field X : [0,1]^d × Ω → R that satisfies

E|X(s) − X(t)|^α ≤ C|s − t|^{d+β}

for some α, β, C > 0 and all s, t ∈ [0,1]^d. Prove that there is a continuous modification of X on [0,1]^d.

9. Show that the Kolmogorov–Chentsov theorem cannot be relaxed: the inequality E|X_t − X_s| ≤ C|t − s| is not sufficient for the existence of a continuous modification. Hint: consider the following process: let τ be a r.v. with exponential distribution, and define X_t = 1_{τ ≤ t}.

10. Prove that there exists a Poisson process (a process with independent increments that have Poisson distribution with parameter proportional to the length of the time increment) such that: (a) its realizations are nondecreasing, taking only whole values, a.s.; (b) its realizations are continuous on the right a.s.; (c) all jumps of the realizations are equal to 1 a.s.

11. Give an example of a non-Gaussian 2-dimensional random vector with Gaussian marginal distributions.

12. Let Y ~ N(a, C) be a d-dimensional random vector. Let Z = AY, where A is an n × d matrix. Prove that Z is Gaussian and find its mean and covariance matrix.

13. Prove that an R^d-valued random vector X is Gaussian iff for every vector b ∈ R^d, the r.v. ⟨b, X⟩ is Gaussian.

14. Prove that (s, t) ↦ t ∧ s, defined for s, t ≥ 0, is positive semi-definite. Hint: ⟨1_{[0,t]}, 1_{[0,s]}⟩_{L²(R_+)} = t ∧ s.

15. Prove that (s, t) ↦ e^{−|t−s|} is positive semi-definite.

16. Prove that if X is a Gaussian vector in R^d with parameters (a, C) and C is non-degenerate, then the distribution of X is absolutely continuous w.r.t. Lebesgue measure with density

p_X(x) = (det C)^{−1/2} (2π)^{−d/2} e^{−⟨C^{−1}(x−a), (x−a)⟩/2}.

17. Find a condition on the mean a(t) and covariance function r(s, t) that guarantees the existence of a continuous Gaussian process with these parameters.

18. Suppose (X_0, X_1, ..., X_n) is a (not necessarily centered) Gaussian vector. Show that there are constants c_0, c_1, ..., c_n such that E(X_0 | X_1, ..., X_n) = c_0 + c_1 X_1 + ... + c_n X_n. Your proof should be valid even if the covariance matrix of (X_1, ..., X_n) is degenerate.

19. Consider the standard Ornstein–Uhlenbeck process X (the Gaussian process with mean 0 and covariance function c(s, t) = e^{−|t−s|}). (a) Prove that X has a continuous modification. (b) Find E(X_4 | X_1, X_2, X_3).

20. Prove that for every centered Gaussian process X with independent increments on R_+ = [0, ∞) (X(t) − X(s) is required to be independent of σ(X(r), r ∈ [0, s]) for t ≥ s ≥ 0), there is a nondecreasing nonrandom function f : R_+ → R_+ such that X has the same f.d.d.'s as the process Y defined by Y(t) = W(f(t)) for a Wiener process W.

21. Let µ be a σ-finite Borel measure on R^d. Prove the existence of a Poisson point process and of (Gaussian) white noise with leading measure µ. In both cases, the process X we are interested in is indexed by Borel sets A with µ(A) < ∞, has independent values on disjoint sets, is finitely additive over disjoint sets, and is such that (a) X(B) is Poisson with parameter µ(B) (for the Poisson point process); (b) X(B) is centered Gaussian with variance µ(B) (for the white noise).
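Problems 14 and 15 ask for proofs, but a quick numerical sanity check can guide intuition: every Gram matrix built from the kernels t ∧ s and e^{−|t−s|} should have nonnegative eigenvalues. A minimal Python/NumPy sketch (illustrative only, not part of the assignment; the sample points are arbitrary):

```python
import numpy as np

# Sanity check for Problems 14-15: the kernels k1(s, t) = min(s, t) and
# k2(s, t) = exp(-|t - s|) should be positive semi-definite, i.e., every
# Gram matrix K[i, j] = k(s_i, s_j) has only nonnegative eigenvalues.
rng = np.random.default_rng(0)
s = np.sort(rng.uniform(0.0, 10.0, size=25))

K1 = np.minimum.outer(s, s)                    # Brownian-motion covariance
K2 = np.exp(-np.abs(np.subtract.outer(s, s)))  # Ornstein-Uhlenbeck covariance

min_eig1 = np.linalg.eigvalsh(K1).min()
min_eig2 = np.linalg.eigvalsh(K2).min()
# Both minima should be >= 0 up to floating-point error.
assert min_eig1 > -1e-10 and min_eig2 > -1e-10
```

Of course, a passing check for one random set of points is not a proof; the L² hint in Problem 14 is what turns this observation into one.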

2. Due by March 28, 11:00am

1. Suppose X_t is a stationary Gaussian process, and let H be the Hilbert space generated by (X_t)_{t∈R}, i.e., the space consisting of L²-limits of linear combinations of values of X_t. Prove that every element of H is a Gaussian r.v.

2. Let A, η be random variables, and let a r.v. φ be independent of (A, η) and uniform on [0, 2π]. Prove that X_t = A cos(ηt + φ), t ∈ R, is a strictly stationary process.

3. Suppose X_t is a stationary Gaussian process, and let H be the Hilbert space generated by (X_t)_{t∈R}, i.e., the space consisting of L²-limits of linear combinations of values of X_t. Prove that every element of H is a Gaussian r.v.

4. Find the covariance function of a stationary process whose spectral measure is ρ(dx) = dx/(1 + x²).

5. Give an example of a weakly stationary stochastic process (X_n)_{n∈N} such that (X_1 + ... + X_n)/n converges in L² to a limit that is not a constant.

6. Let (X_n)_{n∈Z} be a weakly stationary process. Prove that for any K ∈ N and any numbers a_{−K}, a_{−K+1}, ..., a_{K−1}, a_K, the process (Y_n)_{n∈Z} defined by

Y_n = Σ_{k=−K}^{K} a_k X_{n+k}

is weakly stationary. Express the spectral measure of Y in terms of the spectral measure of X.

7. Describe the stationary sequence

X_n = ∫_{[0,2π)} cos(nλ) Z(dλ), n ∈ Z,

where Z(·) is the standard white noise on [0, 2π).

8. Let a stationary process (X_n)_{n∈Z} satisfy E|X_0| < ∞. Prove that with probability 1, lim_{n→∞}(X_n/n) = 0.

9. Find the spectral measure and a spectral representation of the stationary process (X_t)_{t∈R} given by X_t = A cos(t + φ), t ∈ R, where A and φ are independent, A ∈ L², and φ is uniform on [0, 2π].

10. Consider a map θ : Ω → Ω. A set A is called (backward) invariant if θ^{−1}A = A, and forward invariant if θA = A. Prove that the collection of backward invariant sets forms a σ-algebra. Give an example of Ω and θ such that the collection of forward invariant sets does not form a σ-algebra.

11. Consider the transformation θ : ω ↦ {ω + λ} on Ω = [0, 1) equipped with Lebesgue measure. Here {...} denotes the fractional part of a number. This map can be interpreted as a rotation of the circle (seen as [0, 1) with the endpoints 0 and 1 identified).

A. Prove that this dynamical system is ergodic (i.e., there are no backward invariant Borel sets besides ∅ and Ω) if and only if λ ∉ Q. Hint: for λ ∈ Q construct a backward invariant subset; for λ ∉ Q take the indicator of an invariant set and write down its Fourier series (w.r.t. e^{2πinx}). What happens to this expansion under θ?

B. Using the Birkhoff ergodic theorem and Part A, describe the limiting behavior of (f(ω) + f(θ^{−1}ω) + ... + f(θ^{−(n−1)}ω))/n as n → ∞, where f is any L¹ function on [0, 1].

12. Prove that every Gaussian martingale is a process with independent increments.

13. Prove that if W_t is a standard Brownian motion, then W_t² − t is a martingale.

14. Show that the function

P(s, x, t, Γ) = P(t − s, x, Γ) = ∫_Γ (2π(t − s))^{−1/2} e^{−(x−y)²/(2(t−s))} dy

is a Markov transition probability function for the standard Wiener process.

15. Let (X_t)_{t≥0} be a stochastic process. We define F_{≤t} = σ(X_s, s ≤ t) and F_{≥t} = σ(X_s, s ≥ t). Prove that the following two versions of the Markov property are, in fact, equivalent:
1. For all A ∈ F_{≥t} and B ∈ F_{≤t}, P(AB | X_t) = P(A | X_t) P(B | X_t).
2. For all B ∈ F_{≥t}, P(B | F_{≤t}) = P(B | X_t).

16. Let W be a Wiener process. The previous problem shows that the process X_t = W_{−t}, t ∈ (−∞, 0], is Markov. Find the transition probability function for X_t.

17. Prove that the process (X_t) given by X_t = |W_t| is Markov. Find its transition probability function.

18. Let (N_t)_{t≥0} be the standard Poisson process. Let X_t = (−1)^{N_t}. Prove that (X_t) is a Markov process and find the transition probability function.

19. For a Wiener process W and b > 0, compute the density of τ_b = inf{t : W_t = b} and find Eτ_b.

20. We say that a r.v. τ is a stopping time w.r.t. a filtration (F_t)_{t≥0} if for every t ≥ 0, {τ ≤ t} ∈ F_t. Let (X_t, F_t)_{t≥0} be a continuous process such that X_0 = 0. Suppose a > 0 and let τ = inf{t : X(t) > a} and ν = inf{t : X(t) ≥ a}. Show that ν is a stopping time w.r.t. (F_t)_{t≥0} and τ is a stopping time w.r.t. (F_{t+})_{t≥0}, where F_{t+} = ∩_{ε>0} F_{t+ε}.

21. Show that if τ_1 ≤ τ_2 ≤ ... are stopping times w.r.t. a filtration (F_t), then τ = lim_{n→∞} τ_n is also a stopping time w.r.t. (F_t).

22. Let F_τ = {A : A ∩ {τ ≤ t} ∈ F_t for all t ≥ 0} for a filtration (F_t) and a stopping time τ. Show that F_τ is a σ-algebra.
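Part A of Problem 11 can be explored numerically before proving it: for irrational λ, Birkhoff time averages of a test function along the rotation orbit should approach its space average. A short Python/NumPy sketch (illustrative only; the choice of λ, starting point, and observable is ours):

```python
import numpy as np

# Problem 11 sketch: circle rotation theta(w) = {w + lam} on [0, 1).
# For irrational lam the system is ergodic, so time averages of f
# along an orbit should converge to the space average of f.
lam = np.sqrt(2.0) - 1.0            # an irrational rotation number
n = 100_000
omega = 0.3                         # arbitrary starting point
orbit = (omega + lam * np.arange(n)) % 1.0
f = np.cos(2.0 * np.pi * orbit)     # observable with space average 0
time_average = f.mean()
assert abs(time_average) < 1e-3     # matches the space average 0
```

For a rational λ the same average depends on the starting point, which is exactly the non-ergodic behavior the hint asks you to exhibit.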

23. Give an example of the following: for a random variable τ that is not a stopping time, F_τ is not a σ-algebra.

24. Suppose τ is a stopping time w.r.t. (F_{t+}). Let us define

τ_n = ([2^n τ] + 1)/2^n = Σ_{k∈N} (k/2^n) 1_{τ ∈ [(k−1)/2^n, k/2^n)}, n ∈ N.

Prove that for every n ∈ N, τ_n is a stopping time w.r.t. (F_t)_{t≥0}, F_{τ_n} ⊃ F_{τ+}, and τ_n ↓ τ.

3. Due by May 2, 11:00am

1. Prove: if (X_t, F_t)_{t≥0} is a continuous process, then for any stopping time τ, X_τ is a r.v. measurable w.r.t. F_τ.

2. Let (X_t, F_t) be a continuous martingale and let τ be a stopping time w.r.t. (F_t). Prove that the stopped process (X_t^τ, F_t)_{t≥0}, where X_t^τ = X_{τ∧t}, is also a martingale.

3. Prove the following theorem (Kolmogorov, 1931) using Taylor expansions of test functions. Suppose (P_x)_{x∈R^d} is a (homogeneous) Markov family on R^d with transition probabilities P(·, ·, ·). Suppose there are continuous functions a_{ij}(x), b_i(x), i, j = 1, ..., d, such that for every ε > 0, the following relations hold uniformly in x as t ↓ 0:

P(t, x, B_ε(x)^c) = o(t),
∫_{B_ε(x)} (y_i − x_i) P(t, x, dy) = b_i(x) t + o(t),
∫_{B_ε(x)} (y_i − x_i)(y_j − x_j) P(t, x, dy) = a_{ij}(x) t + o(t),

where B_ε(x) is the Euclidean ball of radius ε centered at x. Then the infinitesimal generator A of the Markov semigroup associated to the Markov family is defined on all functions f such that f itself and all its partial derivatives of first and second order are bounded and uniformly continuous. For such functions,

Af = (1/2) Σ_{i,j} a_{ij} ∂_i ∂_j f + Σ_i b_i ∂_i f.

4. Consider a Markov process X in R² given by

X_1(t) = X_1(0) + W(t), X_2(t) = X_2(0) + ∫_0^t X_1(s) ds.

Find its generator on C²-functions with compact support.
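Problem 4 can be sanity-checked by Monte Carlo before deriving the generator analytically: the theorem of Problem 3 suggests that for small t, (E_x f(X_t) − f(x))/t should be close to (1/2) ∂²f/∂x_1² + x_1 ∂f/∂x_2. A hedged Python/NumPy sketch; the test function, starting point, and step sizes are our illustrative choices:

```python
import numpy as np

# Monte Carlo sketch for Problem 4: for the degenerate diffusion
#   dX1 = dW,  dX2 = X1 dt,
# check (E_x f(X_t) - f(x)) / t against 0.5 * f_x1x1 + x1 * f_x2
# for small t, using the test function f(x1, x2) = x1**2 + x2.
rng = np.random.default_rng(1)
x1, x2 = 0.5, -1.0
t, n_steps, n_paths = 0.01, 20, 400_000
dt = t / n_steps

X1 = np.full(n_paths, x1)
X2 = np.full(n_paths, x2)
for _ in range(n_steps):                  # Euler-Maruyama, left endpoints
    X2 += X1 * dt
    X1 += np.sqrt(dt) * rng.standard_normal(n_paths)

f = lambda a, b: a**2 + b
estimate = (f(X1, X2).mean() - f(x1, x2)) / t
exact = 0.5 * 2.0 + x1 * 1.0              # 0.5*f_x1x1 + x1*f_x2 = 1 + x1
assert abs(estimate - exact) < 0.1
```

The agreement for one test function is only a plausibility check; the problem asks for the generator on all C² functions with compact support.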

5. Consider the Poisson transition probabilities: fix a number λ > 0 and, for i ∈ Z and t ≥ 0, let P(i, t, ·) be the distribution of i + π_{λt}, where π_s denotes a random variable with Poisson distribution with parameter s > 0. In other words,

P(i, t, {j}) = e^{−λt} (λt)^{j−i} / (j − i)!, i ∈ Z, j ∈ {i, i + 1, ...}, t > 0.

Find the generator of the Markov semigroup on all bounded test functions f : Z → R.

6. Find the transition probabilities and the generator associated to the Ornstein–Uhlenbeck process.

7. Let W¹ and W² be two independent Wiener processes w.r.t. a filtration (F_t)_{t≥0}, and let X be a bounded process adapted to (F_t)_{t≥0}. For a partition t of the time interval [0, T] (i.e., a sequence of times t = (t_0, t_1, ..., t_n) such that 0 = t_0 < t_1 < t_2 < ... < t_n = T), we define

Q(t) = Σ_j X_{t_j} (W¹_{t_{j+1}} − W¹_{t_j})(W²_{t_{j+1}} − W²_{t_j}).

Prove: Q(t) → 0 in L² as max_j (t_{j+1} − t_j) → 0.

8. Let (F_t) be a filtration. Suppose that 0 = A_0(t) + A_1(t) W(t) for all t ≥ 0, where (A_0, F_t) and (A_1, F_t) are processes with C¹ trajectories, and W(t) is a Wiener process w.r.t. (F_t). Prove that A_0 ≡ 0 and A_1 ≡ 0.

9. Suppose f ∈ L²([0, T], B, Leb). For every h > 0, we define

f_h(t) = 0, t < h; f_h(t) = (1/h) ∫_{(k−1)h}^{kh} f(s) ds, kh ≤ t < (k + 1)h ∧ T.

Prove that f_h → f in L² and ∥f_h∥_{L²} ≤ ∥f∥_{L²}. Hint: do not work with generic L² functions right away; consider a smaller supply of functions f that behave nicely under this kind of averaging.

10. Compute ∫_0^T W²(t) dW(t) directly as the limit of Σ_k W²(t_k)(W(t_{k+1}) − W(t_k)).

11. The so-called Stratonovich stochastic integral may be defined for a broad class of adapted processes X_t via

∫_0^T X_t ∘ dW_t := lim_{max(t_{j+1}−t_j)→0} Σ_j ((X_{t_{j+1}} + X_{t_j})/2)(W_{t_{j+1}} − W_{t_j}) in L².

Impose any conditions you need on X and express the difference between the Itô and Stratonovich integrals in terms of the quadratic covariation of X and W. Compute ∫_0^T W_t ∘ dW_t. Is the answer a martingale?
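The contrast in Problem 11 is easy to see on one fine partition: the left-point (Itô) and averaged-endpoint (Stratonovich) sums for ∫ W dW telescope to exact discrete identities, and their difference is half the accumulated quadratic variation. A Python/NumPy sketch (illustrative, with our choice of T and partition size):

```python
import numpy as np

# Problem 11 sketch: Ito (left-point) vs Stratonovich (averaged-endpoint)
# Riemann sums for the integral of W dW on one fine partition of [0, T].
# Telescoping gives the exact discrete identities
#   strat = W_T**2 / 2,
#   ito   = (W_T**2 - sum (dW)**2) / 2,
# and sum (dW)**2 -> T, so the two integrals differ by T/2 in the limit.
rng = np.random.default_rng(2)
T, n = 1.0, 200_000
dW = np.sqrt(T / n) * rng.standard_normal(n)
W = np.concatenate(([0.0], np.cumsum(dW)))      # W at the partition points

ito = np.sum(W[:-1] * dW)                       # left-point sum
strat = np.sum(0.5 * (W[:-1] + W[1:]) * dW)     # averaged-endpoint sum

assert np.isclose(strat, W[-1]**2 / 2)                   # exact identity
assert np.isclose(ito, (W[-1]**2 - np.sum(dW**2)) / 2)   # exact identity
assert abs(strat - ito - T / 2) < 0.02          # quadratic variation near T
```

This also previews the answer to the martingale question: W_T²/2 lacks the compensating −T/2 term, which is what makes the Itô integral, not the Stratonovich one, a martingale.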

12. Let M²_c = {square-integrable martingales with continuous paths}. Prove that if (M_t, F_t) ∈ M²_c, then

E[(M_t − M_s)² | F_s] = E[M_t² − M_s² | F_s] = E[⟨M⟩_t − ⟨M⟩_s | F_s], s < t.

13. Suppose (M_t, F_t) ∈ M²_c, X is a simple process, and (X·M)_t = ∫_0^t X_s dM_s. Prove that

E[((X·M)_t − (X·M)_s)² | F_s] = E[∫_s^t X_r² d⟨M⟩_r | F_s], s < t.

14. Let M ∈ M²_c. Prove that Y·(X·M) = (YX)·M for simple processes X, Y. Find reasonable weaker conditions on X and Y guaranteeing the correctness of this identity in the sense of square-integrable martingales.

15. Suppose (M_t, F_t) ∈ M²_c, and X, Y are bounded processes. Prove that

⟨X·M, Y·M⟩_t = ∫_0^t X_s Y_s d⟨M⟩_s.

Here, for two processes M, N ∈ M²_c, the cross-variation ⟨M, N⟩_t is defined by

⟨M, N⟩_t = (⟨M + N⟩_t − ⟨M − N⟩_t)/4.

16. Let us define the process X by

X_t = e^{−λt} X_0 + ε e^{−λt} ∫_0^t e^{λs} dW_s, t ≥ 0.

Here λ ∈ R, ε > 0, W is a standard Wiener process, and X_0 is a square-integrable r.v. independent of W. Prove that dX_t = −λ X_t dt + ε dW_t.

17. Prove that if f : [0, ∞) → R is a deterministic function bounded on every interval [0, t], then X_t = ∫_0^t f(s) dW_s, t ≥ 0, is a Gaussian process. Find its mean and covariance function.

18. In the context of Problem 16, find all values of λ with the following property: there are a and σ² such that if X_0 ~ N(a, σ²), then (X_t) is a stationary process.

19. [Feynman–Kac formula.] Suppose u_0 : R → R and φ : [0, ∞) × R → R are smooth bounded functions. Suppose that u : [0, ∞) × R → R is a smooth function solving the Cauchy problem for the following equation:

∂_t u(t, x) = (1/2) ∂_{xx} u(t, x) + φ(t, x) u(t, x), u(0, x) = u_0(x).

Using the Itô formula and martingales, prove that

u(t, x) = E[e^{∫_0^t φ(t−s, x+W_s) ds} u_0(x + W_t)], t > 0, x ∈ R,

where W is a standard Wiener process. (If you need further assumptions on u_0 and/or its derivatives, use PDE theory or just impose the assumptions you need.)

20. Let D be an open bounded subset of R^d with smooth boundary ∂D. Let g : D → R and f : ∂D → R be continuous functions. Suppose that u ∈ C(D̄) ∩ C²(D) and

(1/2) Δu(x) = −g(x), x ∈ D,
u(x) = f(x), x ∈ ∂D.

Let W be a standard d-dimensional Wiener process (a process with independent components, each of which is a standard 1-dimensional Wiener process). Prove that if τ = inf{t ≥ 0 : x + W_t ∈ ∂D}, then

u(x) = E[f(x + W_τ) + ∫_0^τ g(x + W_t) dt].

21. Suppose a ∈ R, σ > 0, x_0 > 0, and W is the standard Wiener process. Find constants A, B ∈ R such that the process S, defined for all t ≥ 0 by S_t = x_0 exp(at + σW_t) (and often called geometric Brownian motion), satisfies the stochastic equation

dS_t = A S_t dt + B S_t dW_t, t ≥ 0.

Find necessary and sufficient conditions on a and σ for (S_t) to be a martingale.

22. [The Girsanov density.] Suppose (W_t, F_t) is a Wiener process and (X_t, F_t) is a bounded process. Use the Itô formula to prove that

Z_t = exp[∫_0^t X_s dW_s − (1/2) ∫_0^t X_s² ds], t ≥ 0,

is a local martingale w.r.t. (F_t). (In fact, it is a true martingale.)
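Problem 21 can also be probed by simulation before applying Itô's formula: the formula suggests drift A = a + σ²/2, so E[S_t] should grow like x_0 e^{(a+σ²/2)t}, and the mean should stay at x_0 precisely for the martingale choice a = −σ²/2. A hedged Python/NumPy sketch (the parameter values are illustrative):

```python
import numpy as np

# Problem 21 sketch: for S_t = x0 * exp(a*t + sigma*W_t), Ito's formula
# suggests dS = (a + sigma**2/2) S dt + sigma S dW, hence
#   E[S_t] = x0 * exp((a + sigma**2/2) * t),
# with a constant mean exactly when a = -sigma**2/2 (martingale case).
rng = np.random.default_rng(3)
x0, t, n = 2.0, 1.0, 1_000_000
W_t = np.sqrt(t) * rng.standard_normal(n)   # exact sampling of W_t

# Generic drift: the mean grows at rate a + sigma^2/2.
a, sigma = 0.1, 0.3
S_t = x0 * np.exp(a * t + sigma * W_t)
assert abs(S_t.mean() - x0 * np.exp((a + sigma**2 / 2) * t)) < 0.01

# Martingale choice a = -sigma^2/2: the mean stays at x0.
a = -sigma**2 / 2
S_t = x0 * np.exp(a * t + sigma * W_t)
assert abs(S_t.mean() - x0) < 0.01
```

A constant mean is of course only a necessary condition for the martingale property; the problem asks you to establish the full equivalence.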