Lecture 1: Brief Review on Stochastic Processes


A stochastic process is a collection of random variables $\{X_t(s) : t \in T, s \in S\}$, where $T$ is an index set and $S$ is the common sample space of the random variables. For each fixed $t \in T$, $X_t(s)$ is a single random variable defined on $S$. For each fixed $s \in S$, $X_t(s)$ is a function defined on $T$, called a sample path or stochastic realization of the process [1]. Unless otherwise stated, the variable $s$ is omitted when speaking of a stochastic process, and we follow this convention in this lecture.

In many stochastic processes the index set $T$ represents time, and we refer to $X_t$ as the state of the process at time $t \in T$. A realization or sample path is also called a sample function of the process. For example, if events occur randomly in time and $X_t$ counts the number of events in $[0, t]$, as shown in Fig. 1, then one sample path corresponds to a first event occurring at $t = 1$, a second at $t = 3$, and a third at $t = 4$, with no events occurring anywhere else [4].

Figure 1: A sample path of $X_t$, the number of events in $[0, t]$. From [4].

The main elements that distinguish stochastic processes are the state space $S$, the index set $T$, and the dependence relations among the random variables $X_t$.

State space $S$ is the space in which the possible values of each $X_t$ lie:
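As a small illustrative sketch of the counting process in Fig. 1 (the event times $1, 3, 4$ are taken from the figure; the helper name is our own), the sample path $t \mapsto X_t$ can be evaluated directly:

```python
import bisect

def counting_path(event_times, t):
    """Number of events in [0, t] for a list of event times."""
    return bisect.bisect_right(sorted(event_times), t)

# Sample path from Fig. 1: events at t = 1, 3, 4
events = [1, 3, 4]
print([counting_path(events, t) for t in [0.5, 1, 2, 3.5, 4, 10]])
# -> [0, 1, 1, 2, 3, 3]
```

The path is piecewise constant and jumps by one at each event time, exactly the staircase shape sketched in the figure.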

- If the state space is countable, such as $S = \{0, 1, 2, 3, \dots\}$, we refer to the process as an integer-valued or discrete-state stochastic process.
- If $S = (-\infty, \infty)$, the process is called a real-valued stochastic process.
- If $S$ is Euclidean $k$-space, it is a $k$-vector stochastic process.

The index set $T$ classifies stochastic processes as follows:

- If $T$ is a countable set (such as a set of integers) representing specific time points, the stochastic process is said to be a discrete-time process. For instance, $\{X_t, t = 0, 1, 2, \dots\}$ is a discrete-time stochastic process indexed by the nonnegative integers.
- If $T$ is an interval of the real line, the stochastic process is said to be a continuous-time process. To distinguish it from the discrete-time case, we change the notation slightly for continuous-time processes, writing $X(t)$ rather than $X_t$. Hence, $\{X(t), 0 \le t < \infty\}$ is a continuous-time stochastic process indexed by the nonnegative real numbers.

The values of $X_t$ may be one-dimensional, two-dimensional, or generally $m$-dimensional. Two major examples of stochastic processes are Brownian motion and the Poisson process.

The classification of stochastic processes falls into classical and modern classifications. The classical classification is characterized by the dependence relationships among the random variables $X_t$ [2]:

(1) Processes with stationary independent increments

If the real-valued random variables $X_{t_2} - X_{t_1}, X_{t_3} - X_{t_2}, \dots, X_{t_n} - X_{t_{n-1}}$ are independent for all choices of $t_1, t_2, \dots, t_n$, $t_i \ge 0$, satisfying $t_1 < t_2 < \cdots < t_n$, then $\{X_t\}$ is said to be a process with independent increments. If the index set contains a smallest index $t_0$, it is also assumed that $X_{t_0}, X_{t_1} - X_{t_0}, X_{t_2} - X_{t_1}, \dots, X_{t_n} - X_{t_{n-1}}$ are independent. If the index set is discrete, that is, $T = \{0, 1, 2, \dots\}$, then a process with independent increments reduces to a sequence of independent random variables
$$Z_0 = X_0, \qquad Z_i = X_i - X_{i-1}, \quad i = 1, 2, 3, \dots$$
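In discrete time, then, a process with independent increments is just the sequence of partial sums of independent variables $Z_i$. A minimal simulation sketch (the Gaussian step distribution and its mean $0.5$ are arbitrary choices for illustration): with $X_0 = 0$ and $E(Z_i) = 0.5$, the mean of $X_n$ grows linearly, $E(X_n) = 0.5\,n$.

```python
import random

def independent_increment_path(n_steps, x0=0.0, seed=None):
    """Build X_0, X_1, ..., X_n as cumulative sums of independent steps Z_i."""
    rng = random.Random(seed)
    x = [float(x0)]                      # X_0
    for _ in range(n_steps):
        z = rng.gauss(0.5, 1.0)          # i.i.d. increments Z_i with mean 0.5
        x.append(x[-1] + z)              # X_i = X_{i-1} + Z_i
    return x

# Monte Carlo estimate of E(X_100); should be close to 0.5 * 100 = 50.
paths = [independent_increment_path(100, seed=s) for s in range(2000)]
mean_X100 = sum(p[100] for p in paths) / len(paths)
print(round(mean_X100, 1))
```

The same construction with mean-zero steps produces the martingale example discussed later in these notes.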

If the distribution of the increments
$$X(t_1 + h) - X(t_1) \tag{1}$$
depends only on the length $h$ of the interval and not on the time $t_1$, the process is said to have stationary increments. For a process with stationary increments, the distribution of $X(t_1 + h) - X(t_1)$ is the same as the distribution of $X(t_2 + h) - X(t_2)$, regardless of the values of $t_1$, $t_2$, and $h$.

If a process $\{X_t, t \in T\}$, where $T$ is either continuous or discrete, has stationary independent increments and a finite mean, then the expectation of $X_t$ is given by
$$E(X_t) = m_0 + m_1 t, \tag{2}$$
where $m_0 = E(X_0)$ and $m_1 = E(X_1) - m_0$. A similar assertion holds for the variance,
$$\sigma^2_{X_t} = \sigma^2_0 + \sigma^2_1 t, \tag{3}$$
where
$$\sigma^2_0 = E\big((X_0 - m_0)^2\big) \quad \text{and} \quad \sigma^2_1 = \mathrm{Var}(X_1) - \sigma^2_0.$$
Both Brownian motion and the Poisson process have stationary independent increments, as we shall see in the next lectures.

(2) Martingales

Suppose $X$ is a random variable measuring the outcome of some random experiment. If one knows nothing about the outcome of the experiment, the best guess for the value of $X$ is the expectation $E(X)$. Conditional expectation is concerned with making the best guess for $X$ given some, but not all, information about the outcome.

Let $\{X_t\}$ be a real-valued stochastic process with discrete or continuous index set $T$. The process $\{X_t\}$ is called a martingale if
$$E(|X_t|) < \infty \quad \text{for } t = 0, 1, 2, \dots \tag{4}$$
and
$$E(X_{t_{n+1}} \mid X_{t_0} = a_0, X_{t_1} = a_1, \dots, X_{t_n} = a_n) = a_n \tag{5}$$
for any $t_0 < t_1 < \cdots < t_{n+1}$ and all values $a_0, a_1, \dots, a_n$. Condition (5) states that the expected value of $X_{t_{n+1}}$, given the past and present values $X_{t_0}, X_{t_1}, \dots, X_{t_n}$, equals the present value $X_{t_n}$.
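The martingale condition (5) can be probed numerically. For a symmetric $\pm 1$ random walk the increments have mean zero, so $E(X_6 \mid X_5 = a)$ should equal $a$ for every reachable state $a$; a quick Monte Carlo sketch (sample sizes and helper names are our own):

```python
import random
from collections import defaultdict

def walk(n, rng):
    """Symmetric +/-1 random walk: X_n = Z_1 + ... + Z_n with E(Z_i) = 0."""
    x = 0
    for _ in range(n):
        x += rng.choice((-1, 1))
    return x

rng = random.Random(0)
by_state = defaultdict(list)     # group observed X_6 values by the value of X_5
for _ in range(100_000):
    x5 = walk(5, rng)
    x6 = x5 + rng.choice((-1, 1))
    by_state[x5].append(x6)

# E(X_6 | X_5 = a) should be close to a for each state a in {-5, -3, ..., 5}.
for a in sorted(by_state):
    est = sum(by_state[a]) / len(by_state[a])
    print(a, round(est, 2))
```

The conditional averages track the conditioning value $a$, which is exactly the "best guess equals the present value" property that defines a martingale.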

We may rewrite Eq. (5) as
$$E(X_{t_{n+1}} \mid X_{t_0}, X_{t_1}, \dots, X_{t_n}) = X_{t_n}.$$
Taking expectations on both sides and using the law of total expectation in the form
$$E\big(E(X_{t_{n+1}} \mid X_{t_0}, X_{t_1}, \dots, X_{t_n})\big) = E(X_{t_{n+1}})$$
gives us $E(X_{t_{n+1}}) = E(X_{t_n})$, and consequently a martingale has constant mean:
$$E(X_{t_0}) = E(X_{t_k}) = E(X_{t_n}), \quad 0 \le k \le n.$$
It can be verified that the process $X_n = Z_1 + Z_2 + \cdots + Z_n$, for $n = 1, 2, 3, \dots$, is a discrete-time martingale if the $Z_i$ are independent and have mean zero. Similarly, if $X_t$ for $0 \le t < \infty$ has independent increments whose means are zero, then $\{X_t\}$ is a continuous-time martingale.

(3) Markov processes

A Markov process is a process with the property that, given the present state $X_{t_n}$, the future state $X_{t_{n+1}}$ depends only on the present state, not on the past states $X_{t_1}, X_{t_2}, \dots, X_{t_{n-1}}$. However, if our knowledge of the present state of the process is imprecise, then the probability of some future behavior may be altered by additional information about the past behavior of the system. Formally, a process is said to be Markov if
$$P\{a < X_{t_{n+1}} \le b \mid X_{t_n} = x_n, \dots, X_{t_2} = x_2, X_{t_1} = x_1\} = P\{a < X_{t_{n+1}} \le b \mid X_{t_n} = x_n\}$$
whenever $t_1 < t_2 < \cdots < t_n < t_{n+1}$.

A Markov process having a countable or denumerable state space is called a Markov chain. A Markov process for which all sample functions $\{X_t, 0 \le t < \infty\}$ are continuous functions is called a diffusion process. The Poisson process is a continuous-time Markov chain, and Brownian motion is a diffusion process.

(4) Stationary processes

A stochastic process $\{X_t\}$ for $t \in T$, where $T$ is either discrete or continuous, is

said to be strictly stationary if the joint distribution functions of the families of random variables
$$(X_{t_1+h}, X_{t_2+h}, \dots, X_{t_n+h}) \quad \text{and} \quad (X_{t_1}, X_{t_2}, \dots, X_{t_n})$$
are the same for all $h > 0$ and arbitrary selections $t_1, t_2, \dots, t_n$ from $T$. This condition asserts that the process is in probabilistic equilibrium and that the particular times at which the process is examined are of no relevance.

A stochastic process $\{X_t, t \in T\}$ is said to be wide-sense stationary or covariance stationary if it possesses finite second moments and if
$$\mathrm{Cov}(X_t, X_{t+h}) = E(X_t X_{t+h}) - E(X_t)\,E(X_{t+h})$$
depends only on $h$ for all $t \in T$.

Neither the Poisson process nor Brownian motion is stationary; indeed, no non-constant process with stationary independent increments is stationary. However, if $\{X_t, t \in [0, \infty)\}$ is a Brownian motion or a Poisson process, then $Z_t = X_{t+h} - X_t$ is a stationary process for any fixed $h > 0$.

(5) Renewal processes

Let $T_1, T_2, \dots$ be independent, identically distributed, nonnegative random variables with distribution function $F(x) = P\{T_i \le x\}$. We may think of the random variables $T_i$ as the lifetimes of components, or as the times between occurrences of some event: the first component is placed in operation at time zero; it fails at time $T_1$ and is immediately replaced by a new component, which then fails at time $T_1 + T_2$, and so on, motivating the name renewal process. The time of the $n$th renewal is
$$S_n = T_1 + T_2 + \cdots + T_n.$$
A renewal counting process $N_t$ counts the number of renewals in the interval $[0, t]$:
$$N_t = n \quad \text{for } S_n \le t < S_{n+1}, \quad n = 0, 1, 2, \dots$$
The Poisson process with parameter $\lambda$ is the renewal counting process whose lifetimes are exponentially distributed with mean $\mu = 1/\lambda$.

(6) Point processes

Let $S$ be a set in $n$-dimensional space and let $\mathcal{A}$ be a family of subsets of $S$. A point process is a stochastic process indexed by the sets $A \in \mathcal{A}$ and having as its state space the set $\{0, 1, 2, \dots\}$ of nonnegative integers.
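The renewal counting process can be sketched directly from its definition. Taking exponential lifetimes of rate $\lambda$ (the rate $\lambda = 2$ and horizon $t = 10$ are arbitrary illustrative parameters), $N_t$ is then Poisson with $E(N_t) = \lambda t$:

```python
import random

def renewal_count(t, lifetime_sampler):
    """N_t: number of renewals S_n = T_1 + ... + T_n occurring by time t."""
    n, s = 0, 0.0
    while True:
        s += lifetime_sampler()          # next lifetime T_i
        if s > t:
            return n
        n += 1

rng = random.Random(1)
lam, t = 2.0, 10.0                       # illustrative rate and horizon
samples = [renewal_count(t, lambda: rng.expovariate(lam))
           for _ in range(5000)]
print(sum(samples) / len(samples))       # close to lam * t = 20
```

Swapping in any other nonnegative lifetime distribution (uniform, gamma, deterministic) gives a general renewal process, for which $E(N_t)/t \to 1/\mu$ with $\mu = E(T_i)$, but the counts are no longer Poisson.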
Meanwhile, the modern classification of stochastic processes depends on whether the random variables and the index set are discrete or continuous [1, 3]:

(1) Stochastic processes whose time and random variables are both discrete-valued. Examples include discrete-time Markov chain models and discrete-time branching processes.

(2) Stochastic processes whose time is continuous but whose random variables are discrete-valued, such as continuous-time Markov chain models.

(3) Stochastic processes whose random variables are continuous but whose time is discrete-valued.

(4) Stochastic processes whose time and random variables are both continuous-valued. Examples are continuous-time, continuous-state Markov processes. These models are also referred to as diffusion processes, where the stochastic realization is a solution of a stochastic differential equation.

In this lecture, we follow the modern classification of stochastic processes. Our material is based on well-known textbooks on stochastic processes and modeling [1, 2, 3], and we include stochastic processes that have a wide variety of applications. As these lecture notes grow, we will add more material and techniques over time.

References

[1] Linda J. S. Allen. An Introduction to Stochastic Processes with Applications to Biology, 2nd Edition. CRC Press, 2010.

[2] Samuel Karlin, Howard M. Taylor. A First Course in Stochastic Processes. Academic Press, 1975.

[3] Sheldon M. Ross. Introduction to Probability Models, 9th Edition. Elsevier, 2007.

[4] Sheldon M. Ross. Stochastic Processes, 2nd Edition. John Wiley & Sons, 1996.