Problems 5: Continuous Markov process and the diffusion equation


Roman Belavkin, Middlesex University (MSO42)

Question 1

Give a definition of a Markov stochastic process. What is a continuous Markov process?

Answer: A process x(t) is Markov if for any moments t_1 < ... < t_n < t_{n+1} the conditional probability density has the property:

    p(x(t_{n+1}) | x(t_n), ..., x(t_1)) = p(x(t_{n+1}) | x(t_n))

Such a process is sometimes referred to as a process with no memory, or with no aftereffect. A Markov process is called continuous if, writing Δx(t) = x(t + Δt) − x(t), all the coefficients

    K_n(x, t) = lim_{Δt→0} (1/Δt) E{(Δx(t))^n | x(t)},   n > 2

tend to zero. The first and second coefficients are called the drift and diffusion coefficients:

    K_1(x, t) = lim_{Δt→0} (1/Δt) E{Δx(t) | x(t)},
    K_2(x, t) = lim_{Δt→0} (1/Δt) E{(Δx(t))² | x(t)}

Continuous Markov processes are described by the Fokker-Planck equation:

    ∂p(x, t)/∂t = −(∂/∂x)[K_1(x, t) p(x, t)] + (1/2)(∂²/∂x²)[K_2(x, t) p(x, t)]

Question 2

Let {x_n}_{n ∈ ℕ} be a sequence of independent identically distributed random variables. Which of the following processes is Markov?

a) y_n = x_1 + ... + x_n
b) y_n = x_1 · ... · x_n
c) y_n = max{0, x_1, ..., x_n}
d) y_n = (n, (x_1 + ... + x_n)/n)

Hint: convert to an iteration y_{n+1} = f(y_n, x_{n+1}).

Answer: All of the processes {y_n}_{n ∈ ℕ} are Markov, as they can be represented by the following iterations:

a) y_{n+1} = y_n + x_{n+1}, y_0 = 0.
b) y_{n+1} = y_n x_{n+1}, y_0 = 1.
c) y_{n+1} = max{y_n, x_{n+1}}, y_0 = 0.
d) y_{n+1} = (z_{n+1}, v_{n+1}) = (z_n + 1, (z_n v_n + x_{n+1})/z_{n+1}), y_1 = (1, x_1).

Question 3

Let {x_n}_{n ∈ ℕ} be a sequence of independent identically distributed random variables. Why is the following process not Markov?

    y_n = [x_1 + ... + x_n]

where [ · ] denotes the integer part.

Answer: Although the sum of independent random variables is Markov (see the previous example), the integer part y_{n+1} of the sum x_1 + ... + x_{n+1} depends on the fractional part of x_1 + ... + x_n, which is not retained in y_n. Therefore, y_{n+1} cannot be computed from y_n and x_{n+1} alone.

Question 4

Let {m(t)}_{t ≥ 0} be a stochastic process defined by

    m(t) = (n(t) − λt)/√λ

where n(t) is the value of a stationary Poisson process {n(t)}_{t ≥ 0} with expected value E{n(t)} = λt (see Appendix A). Use properties of the Poisson process to

a) Show that the expected value and variance of m(t) are:

    E{m(t)} = 0,   σ²(m(t)) = t
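Returning briefly to Question 2: the iteration representations given in the answer can be sanity-checked numerically. The following sketch is illustrative and not part of the original sheet; the choice of i.i.d. Gaussian variables, the sample size and the seed are all arbitrary. It verifies, for a random draw, that each y_n computed directly from x_1, ..., x_n coincides with the value produced by the one-step iteration y_{n+1} = f(y_n, x_{n+1}):

```python
import random

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(100)]  # i.i.d. sample (arbitrary choice)

# a) partial sums: y_{n+1} = y_n + x_{n+1}, y_0 = 0
y = 0.0
for n, x in enumerate(xs, start=1):
    y += x
    assert abs(y - sum(xs[:n])) < 1e-9

# b) partial products: y_{n+1} = y_n * x_{n+1}, y_0 = 1
y = 1.0
for n, x in enumerate(xs, start=1):
    y *= x
    direct = 1.0
    for v in xs[:n]:
        direct *= v
    assert abs(y - direct) < 1e-12

# c) running maximum: y_{n+1} = max(y_n, x_{n+1}), y_0 = 0
y = 0.0
for n, x in enumerate(xs, start=1):
    y = max(y, x)
    assert y == max([0.0] + xs[:n])

# d) pair (count, running average): z_{n+1} = z_n + 1,
#    v_{n+1} = (z_n * v_n + x_{n+1}) / z_{n+1}, starting from y_1 = (1, x_1)
z, v = 1, xs[0]
for n, x in enumerate(xs[1:], start=2):
    z, v = z + 1, (z * v + x) / (z + 1)
    assert z == n and abs(v - sum(xs[:n]) / n) < 1e-9
```

By contrast, the integer part in Question 3 admits no such iteration: two paths with the same y_n but different fractional parts of the underlying sum can yield different y_{n+1} for the same x_{n+1}.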

Answer: Expected value:

    E{m(t)} = E{(n(t) − λt)/√λ} = (E{n(t)} − λt)/√λ = (λt − λt)/√λ = 0

Variance:

    σ²(m(t)) = E{m²(t)} − (E{m(t)})²
             = E{m²(t)}                                  (by E{m(t)} = 0)
             = E{[(n(t) − λt)/√λ]²}
             = E{n²(t) − 2λt n(t) + (λt)²}/λ
             = [E{n²(t)} − 2λt E{n(t)} + (λt)²]/λ
             = [E{n²(t)} − 2(λt)² + (λt)²]/λ             (by E{n(t)} = λt)
             = [E{n²(t)} − (λt)²]/λ
             = [E{n²(t)} − (E{n(t)})²]/λ                 (by E{n(t)} = λt)
             = λt/λ                                      (by σ²(n(t)) = λt)
             = t

b) Prove that the differential dm(t) = m(t + dt) − m(t) has the property

    lim_{λ→∞} dm² = dt

Answer: The differential dm is

    dm(t) = m(t + dt) − m(t) = [n(t + dt) − λ(t + dt) − (n(t) − λt)]/√λ = (dn(t) − λ dt)/√λ

The formula for dm² is

    dm² = (dn − λ dt)²/λ
        = (dn² − 2λ dn dt + λ² dt²)/λ
        = dn²/λ                          (by dn = 0 a.e. and dt² = 0)
        = dn/λ                           (by dn² = dn)
        = (dn − λ dt)/λ + dt
        = (1/√λ)(dn − λ dt)/√λ + dt
        = dm/√λ + dt

Using dm² = dm/√λ + dt, and noting that dm has zero mean and variance dt, so that dm/√λ vanishes as λ grows:

    lim_{λ→∞} dm² = lim_{λ→∞} (dm/√λ + dt) = dt

c) Show that the stochastic process {m(t)}_{t ≥ 0} becomes a Wiener process as λ → ∞.

Answer: Recall that a Wiener process {w(t)}_{t ≥ 0} is a stationary Gaussian stochastic process with independent increments Δw(t), with expected value and variance equal to

    E{w(t)} = 0,   σ²(w(t)) = t

and with stochastic differentials dw(t) having the property:

    dw² = dt

We have shown that the expected value and variance of the stochastic process {m(t)}_{t ≥ 0} are also equal to

    E{m(t)} = 0,   σ²(m(t)) = t

The increments

    Δm(t) = (Δn(t) − λ Δt)/√λ

are independent, because Δn(t) are independent increments of the Poisson process.
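The relation dm² = dm/√λ + dt derived in part (b) can be illustrated by simulation: the pathwise deviation dm² − dm/√λ − dt equals (dn² − dn)/λ, which vanishes whenever dn ≤ 1, and the mean of dm² equals dt exactly. This is a sketch, not part of the original sheet; the values of λ, dt, the sample size and the seed are arbitrary choices:

```python
import math
import random

rng = random.Random(42)
lam, dt, N = 1000.0, 1e-4, 200_000   # arbitrary illustration parameters

def poisson_sample(mu, rng):
    # Knuth's inversion-by-multiplication method; adequate for small mu
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

mean_dm2 = 0.0
max_dev = 0.0
for _ in range(N):
    dn = poisson_sample(lam * dt, rng)        # increment of n over [t, t + dt]
    dm = (dn - lam * dt) / math.sqrt(lam)     # dm = (dn - λ dt)/√λ
    mean_dm2 += dm * dm
    # deviation from dm² = dm/√λ + dt is (dn² - dn)/λ,
    # zero except on the rare event dn >= 2
    max_dev = max(max_dev, abs(dm * dm - (dm / math.sqrt(lam) + dt)))
mean_dm2 /= N
# mean_dm2 should be close to dt, and max_dev should be small of order 1/λ
```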

We have shown also that, as the rate of events increases, the differentials dm² tend to dt, which agrees with the property dw² = dt of the Wiener process. It only remains to show that the probability distribution of m(t) tends to a Gaussian distribution as λ → ∞. This follows from the fact that m(t) is a transformation of the values n(t) of a Poisson process, so that the distribution of m(t) is a push-forward probability P(n(t) = √λ m(t) + λt), where P(n(t)) is the Poisson distribution with expected rate λ. It is well known that the Poisson distribution with a large rate (and hence a large mean E{n(t)} = λt) can be approximated by a Gaussian distribution with mean and variance equal to λt. This can be understood from the fact that both the Poisson and the Gaussian distributions can be obtained from the binomial distribution in the limit n → ∞ (these facts are known as the Poisson theorem and the De Moivre-Laplace theorem, respectively). Therefore, in the limit λ → ∞, the stochastic process m(t) becomes Gaussian with expected value E{m(t)} = 0 and variance σ²(m(t)) = t.

Appendix A: Poisson process

The Poisson point process {n(t)}_{t ≥ 0} is a discrete-valued stochastic process counting the number n ∈ ℕ₀ = {0, 1, 2, 3, ...} of occurrences of some event during the time interval [0, t], and satisfying the following properties:

Independent increments: The number Δn(Δt) = n(t + Δt) − n(t) of events in the interval Δt is independent of the number of events in any other interval non-overlapping with Δt (e.g. [0, t]). This property implies that {n(t)}_{t ≥ 0} is a Markov process.

Orderliness: The probability of two or more events during a sufficiently small interval Δt is essentially zero:

    P{Δn(Δt) ≥ 2} = o(Δt)

(note the use of the small-o notation).

These two properties imply that the number n(t) of events in [0, t] has a Poisson distribution. For a stationary (or homogeneous) Poisson process this distribution is

    P(n(t)) = (λt)^n e^{−λt} / n!

where λ is the expected rate or intensity parameter (the expected number of events in a unit interval). The expected value and the variance of a stationary Poisson process are, respectively,

    E{n(t)} = λt,   σ²(n(t)) = λt
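These moment formulas, together with the moments of m(t) from Question 4(a), can be checked by direct simulation. A minimal sketch, not part of the original sheet, with arbitrary choices of λ, t, sample size and seed:

```python
import math
import random

rng = random.Random(1)
lam, t, N = 4.0, 2.5, 100_000        # arbitrary illustration parameters

def poisson_sample(mu, rng):
    # Knuth's inversion-by-multiplication method; adequate for moderate mu
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

samples = [poisson_sample(lam * t, rng) for _ in range(N)]
n_mean = sum(samples) / N
n_var = sum((s - n_mean) ** 2 for s in samples) / N
# both n_mean and n_var should be close to λt = 10

ms = [(s - lam * t) / math.sqrt(lam) for s in samples]   # m(t) = (n(t) - λt)/√λ
m_mean = sum(ms) / N
m_var = sum((m - m_mean) ** 2 for m in ms) / N
# m_mean should be close to 0, and m_var close to t = 2.5
```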

The orderliness property also implies that the differential dn(t) = n(t + dt) − n(t) can take only two values: dn(t) = 0 (almost everywhere) or dn(t) = 1 (on a set of measure zero). This means that the differential dn of a Poisson process satisfies the property

    dn² = dn
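The Gaussian limit claimed in Question 4(c) can also be observed numerically: the skewness of m(t) equals that of n(t), namely 1/√(λt), and so shrinks toward the Gaussian value 0 as λ grows. An illustrative sketch, not part of the original sheet (the λ values, sample size and seed are arbitrary choices):

```python
import math
import random

rng = random.Random(7)
t, N = 1.0, 20_000                   # arbitrary illustration parameters

def poisson_sample(mu, rng):
    # Knuth's inversion-by-multiplication method; adequate for mu up to a few hundred
    L = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_skewness(vals):
    n = len(vals)
    mean = sum(vals) / n
    m2 = sum((v - mean) ** 2 for v in vals) / n
    m3 = sum((v - mean) ** 3 for v in vals) / n
    return m3 / m2 ** 1.5

skews = {}
for lam in (1.0, 10.0, 100.0):
    ms = [(poisson_sample(lam * t, rng) - lam * t) / math.sqrt(lam)
          for _ in range(N)]
    skews[lam] = sample_skewness(ms)
# theoretical skewness 1/sqrt(λt): 1.0, ~0.316, 0.1 — decreasing toward 0
```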