Lecture 4: Stochastic Processes (I)
Miranda Holmes-Cerfon                              Applied Stochastic Analysis, Spring 2015

Readings

Recommended:
- Pavliotis [2014], sections 1.1, 1.2
- Grimmett and Stirzaker [2010], section 9.4 (spectral representation and stochastic integration)

Optional:
- Grimmett and Stirzaker [2010], sections 8.1, 8.2, 8.6, 9.1, 9.3, 9.5, 9.6
- Koralov and Sinai [2010], Ch. 12, 15.3, 15.7, other selections from Ch. 15, 16
- Yaglom [1962], Ch. 1, 2: a nice short book with many details about stationary random functions

We want to be able to describe families of random functions. These functions could represent, for example:
- velocity of a particle in a smooth chaotic flow
- # of molecules in the reactant state of a chemical reaction
- bond angles of a folding protein
- randomly perturbed harmonic oscillator (swingset)
- size of a self-assembling viral capsid
- etc.

Such random functions are called stochastic processes. They are written as $X(t,\omega)$ or $X_t(\omega)$ or $X(t)(\omega)$, where $t \in T$ is a parameter taking values in an ordered parameter set $T$, such as $T = \mathbb{R}, \mathbb{Z}^+, [a,b]$, etc., and $\omega$ is an element of the sample space $\Omega$. For each fixed $t \in T$, $X_t$ is a random variable defined on some probability space $(\Omega,\mathcal{F},P)$.

We can think of a stochastic process in two ways:

(1) A parameterized family of random variables: consider the function $t \to X_t(\omega)$.

(2) A probability distribution on path space: consider the function $\omega \to X_t(\omega)$. Each realization is a function. Events are sets of paths, e.g. all paths that pass through $[x, x+h]$ at time $t$, all paths whose maximum value is bigger than $M$, etc. From measure theory, we know we cannot construct a probability density on this space, because it is infinite-dimensional.
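As a concrete (if very simple) numerical sketch of these two views, the snippet below simulates many realizations of a random walk: each row of the array is one path $\omega \to (X_t(\omega))_t$, each column is one random variable $X_t$, and a path-space event ("the maximum exceeds $M$") is estimated by its empirical frequency. All parameters here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_paths, n_steps = 500, 100
# View (1): each column (fixed t) is a random variable X_t.
# View (2): each row is one realization omega -> (X_t(omega))_t, i.e. one path.
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

# An event in path space: all paths whose maximum exceeds M.
M = 10.0
p_hat = np.mean(X.max(axis=1) > M)
print(f"empirical P(max_t X_t > {M}) ~ {p_hat:.3f}")
```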
4.1 Finite-dimensional distributions

How can we construct / characterize stochastic processes? We can't characterize the joint distribution of all values $X_t$, $t \in T$, at once. However, we can look at a finite number of points $(t_1,t_2,\dots,t_k)$ and consider the distribution of $(X_{t_1},X_{t_2},\dots,X_{t_k})$.

Definition. The finite-dimensional distributions (fdds) of a stochastic process $X_t$ are the set of measures $\mu_{t_1,t_2,\dots,t_k}$ given by
$$\mu_{t_1,t_2,\dots,t_k}(F_1\times F_2\times\cdots\times F_k) = P(X_{t_1}\in F_1,\,X_{t_2}\in F_2,\,\dots,\,X_{t_k}\in F_k),$$
where $(t_1,\dots,t_k)$ is any point in $T^k$, $k$ is any integer, and the $F_i$ are measurable events in $\mathbb{R}$.

Examples. Here are some examples of processes and their fdds.

1. A sequence of i.i.d. random variables: $t \in \mathbb{Z}^+$, and $P_{t_1,\dots,t_k}(F_1\times\cdots\times F_k) = \prod_{i=1}^k P(F_i)$.

2. Markov chains. Let $X_t$ be a continuous-time, homogeneous Markov chain with generator $Q$, and let $0 = t_0 < t_1 < \cdots < t_k$. Then the fdd is given by
$$P(X_{t_0}\in F_0,\,X_{t_1}\in F_1,\,\dots,\,X_{t_k}\in F_k) = \sum_{i_0\in F_0}\mu^{(0)}(i_0)\sum_{i_1\in F_1}P_{i_0 i_1}(t_1-t_0)\cdots\sum_{i_k\in F_k}P_{i_{k-1} i_k}(t_k-t_{k-1}).$$

3. Poisson process. For any $t_1\le\cdots\le t_k$, let $\eta_1,\eta_2,\dots,\eta_k$ be independent Poisson random variables with rates $\lambda t_1,\,\lambda(t_2-t_1),\,\dots,\,\lambda(t_k-t_{k-1})$. Let $P_{t_1,\dots,t_k}$ be the joint distribution of $(\eta_1,\,\eta_1+\eta_2,\,\dots,\,\eta_1+\eta_2+\cdots+\eta_k)$.

Suppose we are given a set of finite-dimensional distributions. Is there a corresponding stochastic process $X_t$? Is it unique? In order for the first to be true, we must at least require the fdds to be consistent. Two criteria that must be satisfied are:

(1) Invariance under permutations. For any permutation $\sigma$ of $\{1,2,\dots,k\}$, we have
$$\mu_{t_{\sigma(1)},\dots,t_{\sigma(k)}}(F_{\sigma(1)}\times\cdots\times F_{\sigma(k)}) = \mu_{t_1,\dots,t_k}(F_1\times\cdots\times F_k).$$

(2) Marginal distributions are consistent. For any $m\in\mathbb{N}$,
$$\mu_{t_1,\dots,t_k}(F_1\times\cdots\times F_k) = \mu_{t_1,\dots,t_k,t_{k+1},\dots,t_{k+m}}(F_1\times\cdots\times F_k\times\mathbb{R}\times\cdots\times\mathbb{R}).$$

Kolmogorov Extension Theorem. If the family of measures $\{\mu_{t_1,\dots,t_k}\}$ satisfies the consistency conditions (1), (2) above, then there exists a stochastic process with the corresponding finite-dimensional distributions.

Proof.
See Koralov and Sinai [2010].

Unfortunately, the construction of a stochastic process via fdds does not let us ask many questions that we may be interested in, such as continuity, differentiability, boundedness, first-passage times, etc.: none of these properties can be asked about using sets that are measurable in the σ-algebra generated by the fdds. Therefore, in what follows we will usually restrict our attention to a smaller class of functions, such as those that are continuous, or right-continuous, and ask what can be said about these.
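Although the extension theorem itself is abstract, fdds like those of the Poisson process (Example 3 above) are easy to sample directly: draw independent Poisson increments $\eta_i$ and take cumulative sums. A sketch, with a made-up rate and made-up time points:

```python
import numpy as np

rng = np.random.default_rng(1)

lam = 2.0
t = np.array([0.5, 1.0, 2.5, 4.0])          # t_1 < t_2 < ... < t_k
gaps = np.diff(np.concatenate([[0.0], t]))  # t_1, t_2 - t_1, ...

n_samples = 200_000
# Independent increments eta_i ~ Poisson(lambda * (t_i - t_{i-1}))
eta = rng.poisson(lam * gaps, size=(n_samples, len(t)))
# (X_{t_1}, ..., X_{t_k}) = (eta_1, eta_1 + eta_2, ...)
X = np.cumsum(eta, axis=1)

# Sanity check: each marginal X_{t_i} should be Poisson(lambda * t_i),
# so its sample mean should be close to lambda * t_i.
print(X.mean(axis=0), lam * t)
```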
4.2 Gaussian processes

A very important class of processes that can be defined by their fdds are Gaussian processes.

Definition. $X_t$ is Gaussian iff all its fdds are Gaussian, i.e. if $(X_{t_1},\dots,X_{t_k})$ is a multivariate Gaussian for any $(t_1,\dots,t_k)$.

Note. Recall that a multivariate Gaussian $(X_1,\dots,X_k)^T$ is defined by its mean $\mu = (\mu_1,\dots,\mu_k)^T = (EX_1,\dots,EX_k)^T$ and covariance matrix $C$, which has entries $C_{ij} = E(X_i-EX_i)(X_j-EX_j)$. The density function is
$$p(x_1,\dots,x_k) = \frac{1}{(2\pi)^{k/2}}\,\frac{1}{(\det C)^{1/2}}\,e^{-\frac{1}{2}(x-\mu)^T C^{-1}(x-\mu)}, \qquad (1)$$
where $x = (x_1,\dots,x_k)^T$.

This means that given functions $m(t)$, $B(s,t)$, such that $B$ is positive semi-definite (see below), we can construct a Gaussian process with mean $EX_t = m(t)$ and whose finite-dimensional distributions have covariance matrix $(B(t_i,t_j))_{i,j=1}^k$. This gives us a lot of flexibility in constructing a Gaussian process. We will investigate properties of them later, when we study covariance matrices and when we consider the Karhunen-Loève decomposition.

4.3 Stationary processes

Another important class of processes are those for which the distribution is invariant with time. For example, one might assume this is the case for
- waves in the ocean
- fluctuations in annual mean temperature (prior to 1800...)
- the bond angles of a protein that is in thermodynamic equilibrium
- any Markov chain that has been run for a long time
- turbulent velocities in the atmosphere
- etc.

It would not be the case for a process that still retains memory of its initial condition, for example (such as a protein shortly after we observe it in a particular configuration), or a process that can wander far away without bound (such as a random walker on a line or infinite lattice).

Definition. A stochastic process is strongly stationary, or strictly stationary, if all the fdds are invariant under shifts in time, i.e.
$$\mu_{t_1+h,\,t_2+h,\,\dots,\,t_k+h} = \mu_{t_1,\dots,t_k} \qquad \forall\,(t_1,\dots,t_k),\ \forall\,h\in\mathbb{R}.$$
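Returning briefly to the Gaussian construction of section 4.2: on a finite time grid it can be carried out directly by building the covariance matrix $(B(t_i,t_j))$, taking its Cholesky factor, and multiplying by i.i.d. standard normals. A sketch; the particular choices $m(t)\equiv 0$ and $B(s,t) = e^{-|s-t|}$ are illustrative (any positive semi-definite $B$ works), and the tiny jitter is a numerical convenience, not part of the theory.

```python
import numpy as np

rng = np.random.default_rng(2)

t = np.linspace(0.0, 5.0, 200)
m = np.zeros_like(t)                          # mean function m(t) = 0
B = np.exp(-np.abs(t[:, None] - t[None, :]))  # B(s,t) = exp(-|s-t|), psd

# Cholesky factor of the covariance (jitter added for numerical safety).
L = np.linalg.cholesky(B + 1e-10 * np.eye(len(t)))
X = m + L @ rng.standard_normal(len(t))       # one realization (X_{t_1},...,X_{t_k})
```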
The definition of strict stationarity is usually too hard to work with in practice:
- it is hard to check
- it may be too strong in many cases, when the information we are interested in depends only on the one-point and two-point fdds (mean and covariance)

A weaker definition is more commonly used in practice. To introduce this we need the following:
Definition. The covariance function of a real-valued stochastic process $X_t$ is
$$B(s,t) \equiv E\big[(X_s-EX_s)(X_t-EX_t)\big] = EX_sX_t - m(s)m(t),$$
where $m(t) = EX_t$ is the mean of the process.

Definition. A function $B(s,t)$ is positive (semi-)definite if the matrix $(B(t_i,t_j))_{i,j=1}^k$ is positive (semi-)definite for all finite slices $\{t_i\}_{i=1}^k$.

Lemma. The covariance function $B(s,t)$ of a stochastic process is positive semi-definite.

Proof. Left for reader.

Now consider what happens to the mean and covariance function of a strongly stationary process when we shift the process in time.
- Mean: we have $EX_t = EX_{t+h}$, so $m(t) = m(t+h)$ for all $h$, so the mean must be constant.
- Covariance: we have $EX_{s+h}X_{t+h} = EX_sX_t$, so $B(s+h,t+h) = B(s,t)$ for all $h$. Therefore $B(s,t) = f(s-t)$ for some function $f$.

This motivates the weaker definition of stationarity:

Definition. A stochastic process is weakly stationary, second-order stationary, or wide-sense stationary if
$$m(t) \equiv m, \qquad B(s,t) = C(s-t).$$
In other words, a process is weakly stationary if its mean and covariance function do not change with time. The covariance function is $C(t) = EX_tX_0 = EX_{s+t}X_s$ for all $s$.

Notes
- A weakly stationary process is not strongly stationary in general.
- If $X_t$ is Gaussian and weakly stationary, then it is strongly stationary. This is because the mean and covariance completely characterize a Gaussian process.

Covariance function and spectral measure for weakly stationary processes

The covariance function is a very useful way to analyze properties of weakly stationary processes. Let's look at some of its properties. (Without loss of generality, we can assume $m = 0$, since we can consider $Y_t = X_t - m$, which has mean zero.)
- The variance of the process is $C(0) = EX_t^2$.
- $C(0) \ge C(h)$ for all $h$ (with equality only for a trivial process which is identically one constant number). To see why, calculate: $|C(t)| = |EX_{t+s}X_s| \le (EX_{t+s}^2\,EX_s^2)^{1/2}$, by Cauchy-Schwarz. But the RHS is $C(0)$.
- By definition, $C(t) = C(-t)$. (For a complex-valued process $X_t = X_t^{(1)} + iX_t^{(2)}$ we have $C(t) = \overline{C(-t)}$; see the section on the spectral representation below.)
- $C(t)$ is a positive semi-definite function: the matrix $(C(t_j-t_i))_{i,j=1}^k$ is positive semi-definite for all finite slices $\{t_i\}_{i=1}^k$. You can show this yourself as an exercise, or see Pavliotis [2014], p. 7.

Here's another nice property:

Lemma. If the covariance function $C(t)$ of a weakly stationary stochastic process $X_t$ is continuous at $t=0$, then it is uniformly continuous for all $t\in\mathbb{R}$. Furthermore, $C(t)$ is continuous at $0$ if and only if $X_t$ is continuous in the $L^2$ sense.

Proof. (Mostly from Pavliotis [2014], p. 6.) To show that continuity of $C(t)$ at $t=0$ implies uniform continuity, fix $t\in\mathbb{R}$, and suppose $EX_t = 0$. Then
$$|C(t+h)-C(t)|^2 = |EX_{t+h}X_0 - EX_tX_0|^2 = \big|E\big((X_{t+h}-X_t)X_0\big)\big|^2 \le EX_0^2\;E(X_{t+h}-X_t)^2 \quad \text{(Cauchy-Schwarz)}$$
$$= C(0)\left(EX_{t+h}^2 + EX_t^2 - 2E(X_tX_{t+h})\right) = 2C(0)\big(C(0)-C(h)\big) \to 0 \quad \text{as } h\to 0.$$

Now, suppose $C(t)$ is continuous at $0$. The above showed that
$$E|X_{t+h}-X_t|^2 = 2\big(C(0)-C(h)\big), \qquad (2)$$
which converges to $0$ as $h\to 0$. Conversely, if $X_t$ is $L^2$-continuous, then the above implies $\lim_{h\to 0} C(h) = C(0)$.

Note that (2) implies that $C(0) > C(h)$ (unless $X_{t+h} = X_t$ a.s.). Therefore the maximum of a covariance function is always at $0$.

There is a very beautiful result in Fourier analysis, which turns out to provide a highly practical way to infer and analyze covariance functions in practice. The main result is given in the following theorem:

Bochner's Theorem. There is a one-to-one correspondence between the set of continuous positive semi-definite functions and the set of finite measures on the Borel σ-algebra of $\mathbb{R}$: if $\rho$ is a finite measure, then
$$f(x) = \int_{\mathbb{R}} e^{i\lambda x}\,d\rho(\lambda) \qquad (3)$$
is positive semi-definite. Conversely, any continuous positive semi-definite function can be represented in this form.¹

Proof. See Koralov and Sinai [2010] and references within.

Remark. This theorem was discovered independently by Khinchin slightly after its publication by Bochner, so it is sometimes called the Bochner-Khinchin theorem.

¹ Cited from Koralov and Sinai [2010], p. 213.
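Bochner's theorem can be sanity-checked numerically. A sketch, with an illustrative choice of measure: take $d\rho(\lambda) = (2\pi)^{-1/2}e^{-\lambda^2/2}\,d\lambda$, whose Fourier transform is known in closed form to be $f(x) = e^{-x^2/2}$, and check both the quadrature and the positive semi-definiteness of the matrices $(f(t_j-t_i))$.

```python
import numpy as np

# Finite measure d rho(lambda) = (2 pi)^{-1/2} exp(-lambda^2/2) d lambda.
lam = np.linspace(-10.0, 10.0, 4001)
rho = np.exp(-lam**2 / 2.0) / np.sqrt(2.0 * np.pi)

def f_numeric(x):
    # f(x) = int e^{i lambda x} d rho(lambda), by simple quadrature
    return np.sum(np.exp(1j * lam * x) * rho).real * (lam[1] - lam[0])

t = np.linspace(-3.0, 3.0, 40)
K = np.exp(-((t[None, :] - t[:, None]) ** 2) / 2.0)  # f(t_j - t_i), closed form

print(abs(f_numeric(1.3) - np.exp(-1.3**2 / 2)))  # quadrature matches closed form
print(np.linalg.eigvalsh(K).min())                # nonnegative, up to roundoff
```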
Bochner's theorem means that if $C(t)$ is the covariance function of a weakly stationary stochastic process $X(t)$, and $C(t)$ is continuous at $0$, then there exists a measure $F$ such that $C(t) = \int_{\mathbb{R}} e^{i\lambda t}\,dF(\lambda)$. The measure $F$ is called the spectral measure of the process $X_t$. It measures the amount of energy, or power, in different frequencies of $X_t$: the total amount of power in an interval of frequencies $[\lambda_1,\lambda_2]$ is $F(\lambda_2)-F(\lambda_1)$. Therefore $F(\lambda)$ can be thought of as a distribution function in frequency space. If it is absolutely continuous with respect to the Lebesgue measure on $\mathbb{R}$, then we can write $dF(\lambda) = S(\lambda)\,d\lambda$, where $S(\lambda)$ is called the spectral density of the process. The functions $S(\lambda)$, $C(t)$ are Fourier transforms of each other:
$$C(t) = \int_{\mathbb{R}} e^{i\lambda t}\,S(\lambda)\,d\lambda, \qquad S(\lambda) = \frac{1}{2\pi}\int_{\mathbb{R}} e^{-it\lambda}\,C(t)\,dt. \qquad (4)$$

The spectral density $S(\lambda)$ is also a way to measure the amount of energy in different frequencies of the stochastic process. It is actually somewhat remarkable that it exists: we know, from Fourier theory, that $X_t$ itself does not have a Fourier transform; indeed, it cannot be $L^1$-integrable, since the fluctuations are statistically constant for all time, so it never decays. Yet the spectral measure still lets us talk about the frequencies that make up $X_t$ (and we will see later that actually $X_t$ has a sort of generalized spectral representation).

Example. Consider a process with covariance function $C(t) = Ae^{-\alpha|t|}$, where $A,\alpha > 0$ are parameters. We can calculate the Fourier transform to be (see Pavliotis [2014], p. 8)
$$S(\lambda) = \frac{A}{\pi}\,\frac{\alpha}{\lambda^2+\alpha^2}.$$
If the process it represents is Gaussian, then it is a stationary Ornstein-Uhlenbeck process. We will return to this example throughout the course, because it is the only example of a stationary, Gaussian, Markov process, and it is used extremely frequently in modelling.

Example. How can we make up covariance functions, e.g. for testing code or theory?
It is hard to do this directly, because it is hard to quickly tell whether a function is positive semi-definite. However, it is easy to make up functions (or measures) that are non-negative, and provided these are integrable we can take the Fourier transform and obtain an example of a positive semi-definite covariance function. For example, all of these are spectral densities of some covariance function:
$$S(\lambda) = \frac{1}{1+\lambda^2}, \qquad S(\lambda) = (1-\lambda^2)\,\mathbb{1}_{|\lambda|\le 1}, \qquad S(\lambda) = A e^{-B\lambda^2/2}, \qquad S(\lambda) = A e^{-B\lambda^2/2}\,\mathbb{1}_{\lambda_1\le\lambda\le\lambda_2}.$$

Example. Turbulent fluids are frequently characterized by their spectrum, which says what are the energy-containing scales. This is calculated by looking at one component of the velocity field, computing its covariance function, and finding the spectral density of this. [Figures omitted in this transcription: a typical velocity field, and a schematic of a typical spectrum, for turbulence in the atmosphere.²]

² From Barbara Ryden's notes on Basic Turbulence, see ryden/ast825/ch7.pdf
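The make-up-a-spectral-density recipe above (pick a non-negative integrable $S$, Fourier transform it) is easy to carry out numerically. A sketch using the second example, $S(\lambda) = (1-\lambda^2)\mathbb{1}_{|\lambda|\le 1}$: the resulting $C(t)$ has total power $C(0) = \int S\,d\lambda = 4/3$ and its maximum at $t=0$, as it must.

```python
import numpy as np

# Nonnegative spectral density S(lambda) = 1 - lambda^2 on [-1, 1].
lam = np.linspace(-1.0, 1.0, 2001)
S = 1.0 - lam**2
dlam = lam[1] - lam[0]

def C(t):
    # C(t) = int e^{i lambda t} S(lambda) d lambda (real, since S is even)
    return np.sum(np.cos(lam * t) * S) * dlam

print(C(0.0))                  # total power: int S d lambda = 4/3
print(C(0.0) >= abs(C(2.0)))   # the maximum of a covariance is at t = 0
```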
Here is another example, showing measurements of the three components of the magnetic field and the corresponding velocity field in the sun (left), and a typical computed spectrum for magnetic-field turbulence (right).³ [Figures omitted in this transcription.]

³ From cairns/teaching/lecture12/node2.html

Ergodic properties of weakly stationary processes

If a process is stationary, then we may hope to be able to extract all the statistical information we need from a single trajectory of the process. This would be useful, for example, if we only have access to one trajectory: perhaps it is the temperature fluctuations in New York, the velocity field at a single location in the ocean, etc.; or if it is hard to obtain enough representative samples, such as when running a molecular dynamics or Monte Carlo simulation, which requires a large equilibration time before one can start collecting statistics. However, it is not obvious that the time average of a function of a single trajectory should equal the average over an ensemble of trajectories, and in fact this is only true when the covariance function decays quickly enough.

Theorem (Ergodic Theorem for stationary processes).⁴ Let $X_t$ be a weakly stationary process with mean $\mu$ and covariance $C(t)$, and suppose that $C(t) \in L^1(0,\infty)$. Then
$$\lim_{T\to\infty} E\left|\frac{1}{T}\int_0^T X_s\,ds - \mu\right|^2 = 0. \qquad (5)$$

⁴ Theorem, Lemma, and proof as quoted in Pavliotis [2014].
Notes
- This says that the time average of $X_t$ converges in mean-square to its mean $\mu$. A sequence of random variables $X_t$ converges in mean-square to another random variable $Y$, written $X_t \xrightarrow{m.s.} Y$, if $\lim_{t\to\infty} E|X_t - Y|^2 = 0$.
- This theorem generalizes trivially to weakly stationary sequences $X_0,X_1,X_2,\dots$
- If $X_t$ is strongly stationary then the convergence in the Ergodic Theorem is almost sure; see Grimmett and Stirzaker [2010], section 9.5.
- For a weaker version of the theorem, see Grimmett and Stirzaker [2010], section 9.5. This shows that, with no restrictions on the covariance function, there exists a random variable $Y$ such that $\frac{1}{T}\int_0^T X_t\,dt \xrightarrow{m.s.} Y$.

To prove this ergodic theorem, we need the following result.

Lemma. Let $C(t)$ be an integrable symmetric function. Then
$$\int_0^T\!\!\int_0^T C(t-s)\,dt\,ds = 2\int_0^T (T-s)\,C(s)\,ds.$$

Proof. Change variables to $u = t-s$, $v = t+s$. The domain of integration in $(u,v)$ is $[-T,T]\times[|u|,\,2T-|u|]$, and the Jacobian is $\big|\frac{\partial(t,s)}{\partial(u,v)}\big| = \frac{1}{2}$. The integral becomes
$$\int_0^T\!\!\int_0^T C(t-s)\,dt\,ds = \int_{-T}^{T}\int_{|u|}^{2T-|u|} \tfrac{1}{2}\,C(u)\,dv\,du = \int_{-T}^{T} (T-|u|)\,C(u)\,du = 2\int_0^T (T-u)\,C(u)\,du,$$
where the last step follows because $C(u)$ is symmetric.

Proof of Ergodic Theorem. Calculate:
$$E\left|\frac{1}{T}\int_0^T X_s\,ds - \mu\right|^2 = \frac{1}{T^2}\,E\left|\int_0^T (X_s-\mu)\,ds\right|^2 = \frac{1}{T^2}\int_0^T\!\!\int_0^T E(X_t-\mu)(X_s-\mu)\,dt\,ds \quad \text{(Fubini's Theorem to interchange $E$, $\int$)}$$
$$= \frac{1}{T^2}\int_0^T\!\!\int_0^T C(t-s)\,dt\,ds = \frac{2}{T^2}\int_0^T (T-u)\,C(u)\,du \quad \text{(using the Lemma)}$$
$$= \frac{2}{T}\int_0^T \left(1-\frac{u}{T}\right)C(u)\,du \;\longrightarrow\; 0 \quad \text{as } T\to\infty,$$
since $C \in L^1(0,\infty)$: by the Dominated Convergence Theorem, $\int_0^T (1-\frac{u}{T})\,C(u)\,du \to \int_0^\infty C(u)\,du$, which is finite, while the prefactor $\frac{2}{T} \to 0$.

Remark. We see from the proof that we can actually weaken the condition that $C \in L^1$: it is enough to know that $\lim_{T\to\infty} \frac{1}{T}\int_0^T C(u)\left(1-\frac{u}{T}\right)du = 0$.
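The convergence (5) can be illustrated numerically. A sketch: the AR(1) recursion below is a discrete-time stand-in for a stationary process whose covariance $C(k) \propto a^{|k|}$ is summable, so the time average of a single long trajectory should be close to the mean $\mu$. All parameters are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

mu, a, T = 5.0, 0.9, 200_000
x = mu                      # start at the mean, close to stationarity
traj = np.empty(T)
for n in range(T):
    # X_{n+1} - mu = a (X_n - mu) + noise: covariance decays like a^{|k|}
    x = mu + a * (x - mu) + rng.standard_normal()
    traj[n] = x

time_avg = traj.mean()
print(time_avg)             # close to mu = 5.0
```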
Notes
- The ergodic theorem shows that if the correlation function decays quickly enough, then time-averaging is like computing an expectation. We can replace one with the other: either we can simulate (or observe) a system for a very long time, or we can average over many short simulations.
- A similar result holds for the empirical calculation of covariances. Here, we need 4th moments to decay sufficiently rapidly to get convergence to the ensemble covariances.
- Suppose we want to calculate the average value of a function of a process, $Ef(X_t)$. Can we do this by taking the time-average? We can, if we know that $Y_t = f(X_t)$ is weakly stationary with a covariance function that decays quickly enough. In general this will not hold if $X_t$ is only weakly stationary, but if $X_t$ is strongly stationary then $Y_t$ will also be strongly stationary, so we can apply the ergodic theorem provided $C_f(t) = E[(f(X_t)-Ef(X_0))(f(X_0)-Ef(X_0))]$ is $L^1$-integrable.
- Notice that the Ergodic Theorem implies the Law of Large Numbers for i.i.d. random variables (you can think about why).

Spectral representation of stationary processes

It is actually possible to use the spectral measure obtained from the covariance function to represent $X_t$ in a spectral form. This is possible despite the fact that $X_t$ has no Fourier transform, as discussed before. The spectral representation of $X_t$ is like a random Fourier transform, and is possible because fluctuations in nearby mode numbers nearly cancel, to give a function that is finite in the end.

Definition. A complex-valued stochastic process has the form $X_t = X_t^{(r)} + iX_t^{(i)}$, where $X_t^{(r)}, X_t^{(i)}$ are real-valued stochastic processes. The covariance function of a mean-zero complex-valued stochastic process $X_t$ is $C(s,t) = EX_s\overline{X_t}$, and for a stationary process it is $C(t) = EX_{s+t}\overline{X_s}$.

Theorem (Spectral Theorem).
Given a stationary, mean-zero stochastic process $X_t$ with spectral distribution function $F(\lambda)$, there exists a complex-valued process $Z(\lambda)$ such that
$$X_t = \int e^{i\lambda t}\,dZ(\lambda). \qquad (6)$$
$Z$ has the following properties:

(i) Orthogonal increments: if the intervals $[\lambda_1,\lambda_2]$, $[\lambda_3,\lambda_4]$ are disjoint, then
$$E(Z(\lambda_2)-Z(\lambda_1))\overline{(Z(\lambda_4)-Z(\lambda_3))} = 0.$$

(ii) Spectral weight: if $\lambda_1 \le \lambda_2$, then $E|Z(\lambda_2)-Z(\lambda_1)|^2 = F(\lambda_2)-F(\lambda_1)$.

Proof. See Grimmett and Stirzaker [2010], or Yaglom [1962].

Notes
- The process $Z$ is called the spectral process of $X$.
- If $X_t$ is Gaussian, then $Z(\lambda)$ is also Gaussian.
- A discrete-time stationary process $X = \dots,X_{-1},X_0,X_1,\dots$ also has a spectral representation; the only difference is that the domain of the spectral process can be taken to be $(-\pi,\pi]$.
- The spectral process is found by
$$Z(\lambda) = \lim_{T\to\infty}\int_{-T}^{T} \frac{e^{-i\lambda t}-1}{-it}\,X_t\,dt,$$
exactly as for the Fourier transform or for characteristic function inversion.
- The integral in (6) is defined to be $\lim_{\Lambda\to\infty}\int_{-\Lambda}^{\Lambda} e^{i\lambda t}\,dZ(\lambda)$, with
$$\int_{-\Lambda}^{\Lambda} e^{i\lambda t}\,dZ(\lambda) = \lim_{\max_k|\Delta\lambda_k|\to 0}\;\sum_k e^{i\lambda_k t}\,\big(Z(\lambda_k)-Z(\lambda_{k-1})\big),$$
where $\Delta\lambda_k = [\lambda_{k-1},\lambda_k]$. It can be verified that this definition is consistent; see e.g. Grimmett and Stirzaker [2010].
- Another form of the Spectral Theorem is
$$X_t = \int e^{i\lambda t}\,Z(d\lambda),$$
where $Z(\Delta\lambda)$ is a random interval function. This means it associates a random variable to each interval $\Delta\lambda$. It has the following properties:
  (i) $E\,Z(\Delta\lambda) = 0$;
  (ii) $Z(\Delta) = Z(\Delta_1) + Z(\Delta_2)$ if $\Delta_1,\Delta_2$ are disjoint (with $\Delta = \Delta_1\cup\Delta_2$);
  (iii) $E\,Z(\Delta_1)\overline{Z(\Delta_2)} = 0$ if $\Delta_1,\Delta_2$ are disjoint;
  (iv) $E|Z([\lambda_1,\lambda_2])|^2 = \Delta F = F(\lambda_2)-F(\lambda_1)$ if $\lambda_1\le\lambda_2$.
This is related to the previous formulation by $Z(\lambda) = Z([0,\lambda])$.

Example. Suppose the spectral measure is concentrated on a finite set of points $\lambda_1,\lambda_2,\dots,\lambda_N$, i.e.
$$dF(\lambda) = \sum_{i=1}^N a_i\,\delta(\lambda-\lambda_i)\,d\lambda,$$
where the $a_i$ are positive weights. Then $dZ(\lambda)$ will also be concentrated on these points, so we can write $dZ(\lambda) = 0$ if $\lambda\notin\{\lambda_i\}$, and
$$dZ(\lambda) = \sum_i \xi_i\,\delta(\lambda-\lambda_i)\,d\lambda,$$
where $\{\xi_i\}_{i=1}^N$ are uncorrelated random variables with $E|\xi_i|^2 = a_i$. (Let's check we have all the δ-functions in the right places: with $dF$ and $dZ$ as above, $E\,dZ(\lambda)\overline{dZ(\lambda')} = \sum_i a_i\,\delta(\lambda-\lambda_i)\,\delta(\lambda'-\lambda_i)\,d\lambda\,d\lambda'$. We need this to equal $dF(\lambda)\,\delta(\lambda-\lambda')\,d\lambda' = \sum_i a_i\,\delta(\lambda-\lambda_i)\,\delta(\lambda-\lambda')\,d\lambda\,d\lambda'$. But $\delta(\lambda-\lambda_i)\,\delta(\lambda'-\lambda_i)\,d\lambda\,d\lambda' = \delta(\lambda-\lambda_i)\,\delta(\lambda'-\lambda)\,d\lambda\,d\lambda'$, so this works.) This means that $Z(\lambda_b)-Z(\lambda_a) = \sum_{\lambda_i\in[\lambda_a,\lambda_b]}\xi_i$. Then we can write
$$X_t = \sum_{i=1}^N \xi_i\,e^{i\lambda_i t},$$
i.e. the process is a random sum of Fourier components. This sum makes perfect sense and converges pointwise; however, notice that it doesn't have a Fourier transform in the traditional sense, since it is not in
$L^1$, and it may not have a discrete Fourier transform either, if the frequencies $\{\lambda_i\}$ are not rational multiples of some common increment $\Delta\lambda$.

Example (Covariance function). Let's check that the covariance function computed from the spectral representation is consistent:
$$EX_{s+t}\overline{X_s} = \int\!\!\int e^{i\lambda(s+t)}\,e^{-i\tilde\lambda s}\,E\big(dZ(\lambda)\overline{dZ(\tilde\lambda)}\big) = \int\!\!\int e^{is(\lambda-\tilde\lambda)}\,e^{i\lambda t}\,\delta(\lambda-\tilde\lambda)\,dF(\lambda)\,d\tilde\lambda = \int e^{i\lambda t}\,dF(\lambda) = C(t).$$
We used the formal result that $E\big(dZ(\lambda)\overline{dZ(\tilde\lambda)}\big) = \delta(\lambda-\tilde\lambda)\,dF(\lambda)\,d\tilde\lambda$, which is a consequence of the orthogonal increments property.

The spectral representation is a very useful way to determine relationships between a process and its derivatives, its integrals, its convolution with various functions, and linear functionals of the process in general. For example, once we know the spectral representation of $X_t$, we can very easily calculate the spectral representation of $X_t'$ and hence its covariance function. We will explore some of these relationships on the homework.

References

G. Grimmett and D. Stirzaker. Probability and Random Processes. Oxford University Press, 2010.
L. B. Koralov and Y. G. Sinai. Theory of Probability and Random Processes. Springer, 2010.
G. A. Pavliotis. Stochastic Processes and Applications. Springer, 2014.
A. M. Yaglom. An Introduction to the Theory of Stationary Random Functions. Dover, 1962.
More informationLet (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t
2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition
More informationStatistical signal processing
Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable
More informationFunctional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals
Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico
More information1. Fundamental concepts
. Fundamental concepts A time series is a sequence of data points, measured typically at successive times spaced at uniform intervals. Time series are used in such fields as statistics, signal processing
More information4 Sums of Independent Random Variables
4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables
More information18.175: Lecture 15 Characteristic functions and central limit theorem
18.175: Lecture 15 Characteristic functions and central limit theorem Scott Sheffield MIT Outline Characteristic functions Outline Characteristic functions Characteristic functions Let X be a random variable.
More information7.1 Coupling from the Past
Georgia Tech Fall 2006 Markov Chain Monte Carlo Methods Lecture 7: September 12, 2006 Coupling from the Past Eric Vigoda 7.1 Coupling from the Past 7.1.1 Introduction We saw in the last lecture how Markov
More informationGaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008
Gaussian processes Chuong B Do (updated by Honglak Lee) November 22, 2008 Many of the classical machine learning algorithms that we talked about during the first half of this course fit the following pattern:
More information1 Tridiagonal matrices
Lecture Notes: β-ensembles Bálint Virág Notes with Diane Holcomb 1 Tridiagonal matrices Definition 1. Suppose you have a symmetric matrix A, we can define its spectral measure (at the first coordinate
More informationExample 4.1 Let X be a random variable and f(t) a given function of time. Then. Y (t) = f(t)x. Y (t) = X sin(ωt + δ)
Chapter 4 Stochastic Processes 4. Definition In the previous chapter we studied random variables as functions on a sample space X(ω), ω Ω, without regard to how these might depend on parameters. We now
More informationMultivariate Generalized Ornstein-Uhlenbeck Processes
Multivariate Generalized Ornstein-Uhlenbeck Processes Anita Behme TU München Alexander Lindner TU Braunschweig 7th International Conference on Lévy Processes: Theory and Applications Wroclaw, July 15 19,
More informationSTAT 331. Martingale Central Limit Theorem and Related Results
STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal
More informationTheorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1
Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be
More informationChapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan
Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition
More informationBrownian Motion. Chapter Stochastic Process
Chapter 1 Brownian Motion 1.1 Stochastic Process A stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ,P and a real valued stochastic
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More information6 Markov Chain Monte Carlo (MCMC)
6 Markov Chain Monte Carlo (MCMC) The underlying idea in MCMC is to replace the iid samples of basic MC methods, with dependent samples from an ergodic Markov chain, whose limiting (stationary) distribution
More informationGaussian Random Fields: Geometric Properties and Extremes
Gaussian Random Fields: Geometric Properties and Extremes Yimin Xiao Michigan State University Outline Lecture 1: Gaussian random fields and their regularity Lecture 2: Hausdorff dimension results and
More informationRandom Process. Random Process. Random Process. Introduction to Random Processes
Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,
More informationStochastic Processes
Elements of Lecture II Hamid R. Rabiee with thanks to Ali Jalali Overview Reading Assignment Chapter 9 of textbook Further Resources MIT Open Course Ware S. Karlin and H. M. Taylor, A First Course in Stochastic
More informationSequence convergence, the weak T-axioms, and first countability
Sequence convergence, the weak T-axioms, and first countability 1 Motivation Up to now we have been mentioning the notion of sequence convergence without actually defining it. So in this section we will
More informationSTAT 7032 Probability Spring Wlodek Bryc
STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,
More informationLesson 4: Stationary stochastic processes
Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@univaq.it Stationary stochastic processes Stationarity is a rather intuitive concept, it means
More information18.175: Lecture 8 Weak laws and moment-generating/characteristic functions
18.175: Lecture 8 Weak laws and moment-generating/characteristic functions Scott Sheffield MIT 18.175 Lecture 8 1 Outline Moment generating functions Weak law of large numbers: Markov/Chebyshev approach
More informationJoint Probability Distributions and Random Samples (Devore Chapter Five)
Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete
More informationhttp://www.math.uah.edu/stat/markov/.xhtml 1 of 9 7/16/2009 7:20 AM Virtual Laboratories > 16. Markov Chains > 1 2 3 4 5 6 7 8 9 10 11 12 1. A Markov process is a random process in which the future is
More informationClass Meeting # 2: The Diffusion (aka Heat) Equation
MATH 8.52 COURSE NOTES - CLASS MEETING # 2 8.52 Introduction to PDEs, Fall 20 Professor: Jared Speck Class Meeting # 2: The Diffusion (aka Heat) Equation The heat equation for a function u(, x (.0.). Introduction
More informationErgodic Theory and Topological Groups
Ergodic Theory and Topological Groups Christopher White November 15, 2012 Throughout this talk (G, B, µ) will denote a measure space. We call the space a probability space if µ(g) = 1. We will also assume
More informationSTA 294: Stochastic Processes & Bayesian Nonparametrics
MARKOV CHAINS AND CONVERGENCE CONCEPTS Markov chains are among the simplest stochastic processes, just one step beyond iid sequences of random variables. Traditionally they ve been used in modelling a
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More information21 Gaussian spaces and processes
Tel Aviv University, 2010 Gaussian measures : infinite dimension 1 21 Gaussian spaces and processes 21a Gaussian spaces: finite dimension......... 1 21b Toward Gaussian random processes....... 2 21c Random
More informationIntroduction to Machine Learning CMU-10701
Introduction to Machine Learning CMU-10701 Markov Chain Monte Carlo Methods Barnabás Póczos & Aarti Singh Contents Markov Chain Monte Carlo Methods Goal & Motivation Sampling Rejection Importance Markov
More informationLecture 12: Detailed balance and Eigenfunction methods
Lecture 12: Detailed balance and Eigenfunction methods Readings Recommended: Pavliotis [2014] 4.5-4.7 (eigenfunction methods and reversibility), 4.2-4.4 (explicit examples of eigenfunction methods) Gardiner
More informationPart II Probability and Measure
Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationMetric spaces and metrizability
1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively
More informationStochastic process. X, a series of random variables indexed by t
Stochastic process X, a series of random variables indexed by t X={X(t), t 0} is a continuous time stochastic process X={X(t), t=0,1, } is a discrete time stochastic process X(t) is the state at time t,
More informationStationary particle Systems
Stationary particle Systems Kaspar Stucki (joint work with Ilya Molchanov) University of Bern 5.9 2011 Introduction Poisson process Definition Let Λ be a Radon measure on R p. A Poisson process Π is a
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationModeling and testing long memory in random fields
Modeling and testing long memory in random fields Frédéric Lavancier lavancier@math.univ-lille1.fr Université Lille 1 LS-CREST Paris 24 janvier 6 1 Introduction Long memory random fields Motivations Previous
More information6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )
6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined
More information18.175: Lecture 2 Extension theorems, random variables, distributions
18.175: Lecture 2 Extension theorems, random variables, distributions Scott Sheffield MIT Outline Extension theorems Characterizing measures on R d Random variables Outline Extension theorems Characterizing
More informationStat 516, Homework 1
Stat 516, Homework 1 Due date: October 7 1. Consider an urn with n distinct balls numbered 1,..., n. We sample balls from the urn with replacement. Let N be the number of draws until we encounter a ball
More informationStochastic Modelling Unit 1: Markov chain models
Stochastic Modelling Unit 1: Markov chain models Russell Gerrard and Douglas Wright Cass Business School, City University, London June 2004 Contents of Unit 1 1 Stochastic Processes 2 Markov Chains 3 Poisson
More informationStochastic Processes
Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.
More informationProbability and Measure
Probability and Measure Robert L. Wolpert Institute of Statistics and Decision Sciences Duke University, Durham, NC, USA Convergence of Random Variables 1. Convergence Concepts 1.1. Convergence of Real
More informationTopics in fractional Brownian motion
Topics in fractional Brownian motion Esko Valkeila Spring School, Jena 25.3. 2011 We plan to discuss the following items during these lectures: Fractional Brownian motion and its properties. Topics in
More informationOptional Stopping Theorem Let X be a martingale and T be a stopping time such
Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10
More informationMET Workshop: Exercises
MET Workshop: Exercises Alex Blumenthal and Anthony Quas May 7, 206 Notation. R d is endowed with the standard inner product (, ) and Euclidean norm. M d d (R) denotes the space of n n real matrices. When
More informationMean-field dual of cooperative reproduction
The mean-field dual of systems with cooperative reproduction joint with Tibor Mach (Prague) A. Sturm (Göttingen) Friday, July 6th, 2018 Poisson construction of Markov processes Let (X t ) t 0 be a continuous-time
More informationLecture 4: Introduction to stochastic processes and stochastic calculus
Lecture 4: Introduction to stochastic processes and stochastic calculus Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London
More informationLecture 19 L 2 -Stochastic integration
Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationECO 513 Fall 2009 C. Sims CONDITIONAL EXPECTATION; STOCHASTIC PROCESSES
ECO 513 Fall 2009 C. Sims CONDIIONAL EXPECAION; SOCHASIC PROCESSES 1. HREE EXAMPLES OF SOCHASIC PROCESSES (I) X t has three possible time paths. With probability.5 X t t, with probability.25 X t t, and
More informationLarge Deviations for Weakly Dependent Sequences: The Gärtner-Ellis Theorem
Chapter 34 Large Deviations for Weakly Dependent Sequences: The Gärtner-Ellis Theorem This chapter proves the Gärtner-Ellis theorem, establishing an LDP for not-too-dependent processes taking values in
More informationName of the Student: Problems on Discrete & Continuous R.Vs
Engineering Mathematics 05 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA6 MATERIAL NAME : University Questions MATERIAL CODE : JM08AM004 REGULATION : R008 UPDATED ON : Nov-Dec 04 (Scan
More informationwhere r n = dn+1 x(t)
Random Variables Overview Probability Random variables Transforms of pdfs Moments and cumulants Useful distributions Random vectors Linear transformations of random vectors The multivariate normal distribution
More informationA Concise Course on Stochastic Partial Differential Equations
A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original
More information