Stochastic Processes


Stochastic Processes

Stochastic: from Greek stochastikos, "proceeding by guesswork"; literally, "skillful in aiming."

A stochastic process is simply a collection of random variables labelled by some parameter:

1. $\{X_t(\zeta),\ t \in [a, b]\}$ is an infinite set of random variables defined over an interval $[a, b]$, all of which map from the same set of events $\{\zeta\}$.
2. $\{X_n(\zeta),\ n = \text{integer}\}$ is a discrete-parameter random process.

A stochastic process is therefore a function of two variables:

1. the parameter $t$ or $n$ in the examples above (e.g. time); and
2. $\zeta \in S$ = the sample or event space.

At a fixed time, $X_t(\zeta)$ is a simple random variable. For a fixed event $\zeta$, $X_t(\zeta)$ is a simple function of time; such a function is called a realization of the random process. The collection of all possible realizations (perhaps infinite) is an assembly or ensemble.
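To make the two views concrete, here is a minimal NumPy sketch (an invented illustration, not from the notes): the process $X_t(\zeta) = A(\zeta)\sin(2\pi f_0 t)$ draws one random amplitude per event $\zeta$, so each row of the array below is a realization and each column is a random variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example process: X_t(zeta) = A(zeta) * sin(2*pi*f0*t), where the
# amplitude A is a random variable -- one draw of A per event zeta.
f0 = 2.0                            # fixed frequency (arbitrary choice)
t = np.linspace(0.0, 1.0, 500)      # the parameter t on [a, b] = [0, 1]
n_events = 1000                     # finite stand-in for the ensemble {zeta}

A = rng.normal(size=(n_events, 1))          # one random amplitude per event
X = A * np.sin(2 * np.pi * f0 * t)          # shape (n_events, len(t))

x_of_t = X[0, :]     # fixed zeta: a realization, an ordinary function of time
x_at_t = X[:, 250]   # fixed t: samples of a simple random variable
```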

Ensemble vs. Realization Averages Revisited

Two kinds of averages may be defined, according to which variable the process is averaged over:

1. Averages over $t$ for a single realization are sample averages or (for $t$ = time) time averages.
2. Averages over $\zeta$ are ensemble averages.

In general, ensemble averages $\neq$ sample averages. If a sample average converges to the ensemble average as the length of the realization tends to infinity, the process is said to be ergodic and to have stationary statistics. The Gibbs ensemble in statistical mechanics has stationary statistics, since it is an infinite assembly of oscillators or systems in equilibrium with a temperature bath.

Why is this important? One reason is that we will be considering estimators for ensemble-average quantities that are based on a single realization (or a finite number of them). In some cases the estimator will converge to the ensemble-average quantity; in others it will not.
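A minimal numerical contrast (both processes are invented for illustration): white noise about a fixed mean is ergodic, while a random constant level is a textbook non-ergodic case, since each realization's time average sticks at its own draw.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, mu = 100_000, 1_000, 3.0     # realization length, ensemble size, true mean

# Ergodic case: white noise with mean mu. The time average of one realization
# converges to the ensemble average <X> = mu as N -> infinity.
x = mu + rng.normal(size=N)
print(x.mean())                    # ~3.0

# Non-ergodic case: X_t(zeta) = A(zeta), a random constant level. Each
# realization's time average is its own A(zeta) and never converges to the
# ensemble average <A> = mu, no matter how long we observe.
A = mu + rng.normal(size=M)
print(A[0], A.mean())              # one "time average" vs the ensemble average
```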

Recall the graphics that show realizations of time series:

[Figure 1: Realizations of stationary stochastic white noise (left) and a nonstationary random walk (right).]
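The two panels are easy to regenerate (a sketch, not the original figure code): integrating white noise gives a random walk whose ensemble variance grows with $t$, the hallmark of nonstationarity.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 500, 1000                   # ensemble size and series length (arbitrary)

steps = rng.normal(size=(M, N))
white_noise = steps                        # stationary: moments do not drift
random_walk = np.cumsum(steps, axis=1)     # nonstationary: variance grows with t

# Across the ensemble, Var[walk(t)] ~ t * sigma^2, while the white-noise
# variance is the same at every t:
print(white_noise[:, 10].var(), white_noise[:, -1].var())   # ~1, ~1
print(random_walk[:, 10].var(), random_walk[:, -1].var())   # ~11, ~1000
```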

Characterization of Stochastic Processes

To totally specify a random process, we must know the multivariate pdf (or distribution function) of a large number (possibly infinite) of random variables. For a discrete process $\{X(t_j),\ j = 1, \ldots, n\}$ we would need to know the $2n$-dimensional distribution function

$$F_{X(t_1)\cdots X(t_n)}(x_1, \ldots, x_n;\ t_1, \ldots, t_n) \equiv P\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\}.$$

In practice we will be much less ambitious and will be satisfied with knowing (or constraining) only a few low-order moments of the process. These include first-order moments like

$$\langle X^n(t) \rangle = \int dz\, z^n f_{X(t)}(z; t),$$

which are ensemble averages that may be functions of time. Second-order moments include the autocorrelation function

$$R_X(t_1, t_2) \equiv \langle X(t_1)\, X^*(t_2) \rangle = \int\!\!\int dw\, dz\, w z^*\, f_{X(t_1)X(t_2)}(w, z;\ t_1, t_2)$$

and the autocovariance function

$$C_X(t_1, t_2) \equiv \langle [X(t_1) - \langle X(t_1)\rangle]\,[X(t_2) - \langle X(t_2)\rangle]^* \rangle.$$
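With a finite ensemble, these moments can be estimated by averaging across realizations at fixed times. A minimal sketch (the random-walk ensemble is an invented example of a process whose moments depend on time):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 2000, 200          # ensemble size and number of time samples (arbitrary)

# Hypothetical ensemble: zero-mean random walks, whose moments depend on time.
X = np.cumsum(rng.normal(size=(M, N)), axis=1)

mean_fn = X.mean(axis=0)            # <X(t)>: an ensemble average at each t
Xc = X - mean_fn                    # subtract the mean function

t1, t2 = 50, 150
R = np.mean(X[:, t1] * X[:, t2])    # autocorrelation estimate R_X(t1, t2)
C = np.mean(Xc[:, t1] * Xc[:, t2])  # autocovariance estimate C_X(t1, t2)
print(R, C)   # ~min(t1, t2) + 1 for this walk; R ~ C since <X(t)> ~ 0
```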

Stationarity

If any moments of a process are functions of time, the process is nonstationary. Different orders of stationarity are defined according to the order of moment.

1. Stationarity of order 1:
$$F_{X(t_1)}(x; t_1) = F_{X(t_2)}(x; t_2).$$

2. Second-order stationarity: for any $\Delta t$,
$$F(x_1, x_2;\ t_1, t_2) = F(x_1, x_2;\ t_1 + \Delta t,\ t_2 + \Delta t).$$
In particular, for $\Delta t = -t_1$, the right-hand side depends only on the difference or lag $t_2 - t_1$.

3. Strict stationarity: time or lag invariance of the distribution function holds for all orders.

4. Wide-sense stationarity (WSS) is defined up to only second order. Note the congruence with the complete determination of Gaussian processes by their first and second moments. The constraints for WSS are (see the sketch after this list):

   i. $\langle X^2(t) \rangle < \infty\ \ \forall t$.
   ii. $\langle X(t) \rangle =$ constant.
   iii. $R(t_1, t_2) = \langle X(t_1)\, X^*(t_2) \rangle = R(t_2 - t_1)$, i.e. the autocorrelation function depends only on time differences.
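A quick empirical check of constraints (ii) and (iii) for a hypothetical WSS process (a stationary AR(1) with invented parameters):

```python
import numpy as np

rng = np.random.default_rng(4)
M, N, a = 5000, 300, 0.8    # ensemble size, length, AR coefficient (all invented)

# A WSS example: the stationary AR(1) process X_n = a X_{n-1} + e_n with
# unit-variance innovations, initialized from its stationary distribution.
X = np.empty((M, N))
X[:, 0] = rng.normal(scale=1 / np.sqrt(1 - a**2), size=M)
for n in range(1, N):
    X[:, n] = a * X[:, n - 1] + rng.normal(size=M)

# (ii) <X(t)> is the same constant (~0) at every t:
print(X.mean(axis=0)[[10, 150, 290]])

# (iii) R(t1, t2) depends only on the lag t2 - t1, not on absolute time:
lag = 5
print(np.mean(X[:, 50] * X[:, 50 + lag]),
      np.mean(X[:, 200] * X[:, 200 + lag]))   # both ~ a^lag / (1 - a^2)
```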

Correlation Functions and Power Spectra of WSS Processes

Autocorrelation functions of WSS processes (distinct from autocorrelations of deterministic functions),
$$R(\tau) = \langle X(t)\, X^*(t + \tau) \rangle,$$
have the properties:

1. Hermiticity: $R_X(-\tau) = R_X^*(\tau)$.
2. $R_X(0) = \langle |X|^2 \rangle$.
3. $|R_X(\tau)| \le R_X(0)$.

Autocorrelation (and autocovariance) functions are useful as:

1. probes of characteristic time (or length or velocity, etc.) scales of a process;
2. quantities used in estimation procedures (via the covariance matrix);
3. a means for calculating the power spectrum of a process.
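These properties can be checked against sample estimates from a single long realization (a sketch reusing the invented AR(1) process above; `acf` is a simple hand-rolled estimator, not a library routine):

```python
import numpy as np

rng = np.random.default_rng(5)
N, a = 200_000, 0.8       # realization length and AR coefficient (invented)

# One long realization of the AR(1) process above; by ergodicity we estimate
# the ensemble ACF with time averages.
x = np.empty(N)
x[0] = rng.normal(scale=1 / np.sqrt(1 - a**2))
for n in range(1, N):
    x[n] = a * x[n - 1] + rng.normal()

def acf(x, tau):
    """Sample estimate of R(tau) = <x(t) x(t + tau)> for a real process."""
    tau = abs(tau)                 # real process: R(-tau) = R(tau) (Hermiticity)
    return np.mean(x[: len(x) - tau] * x[tau:])

R0 = acf(x, 0)                                   # R(0) = <x^2>, the maximum
print([round(acf(x, k) / R0, 3) for k in (0, 1, 2, 5)])   # ~1, a, a^2, a^5: all <= 1
```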

Wiener-Khinchin theorem

The power spectrum $S(f)$ is simply the Fourier transform of the autocorrelation function (sometimes the autocovariance function):
$$S(f) = \int d\tau\, e^{-2\pi i f \tau} R_X(\tau).$$
As such it (as well as the ACF) is an ensemble-average quantity. With finite measurements of realization(s) of a process, the best we can do is to estimate the power spectrum.

Properties of $S(f)$:

1. $S(f) \ge 0$.
2. $S(f)$ is real, since $R(\tau)$ is Hermitian.
3. It is the distribution of the second moment (or variance) in frequency space.
4. It partakes of the analogy $S(f) : R(\tau) \;::\; f_X(x) : \Phi_X(\omega)$.

In some contexts (e.g. maximum entropy spectral estimation), it is convenient to view the power spectrum as a probability distribution of frequency components. In some Bayesian treatments, the PDF of the frequency is explicitly calculated.
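As a numerical check (a sketch, not from the notes), the discrete-time version of the theorem, $S(f) = \sum_\tau R(\tau)\, e^{-2\pi i f \tau}$, applied to the textbook AR(1) autocorrelation $R(\tau) = a^{|\tau|}/(1 - a^2)$ reproduces the known AR(1) spectrum, and the result is real and nonnegative as properties 1 and 2 require:

```python
import numpy as np

# Discrete-time Wiener-Khinchin sketch: S(f) = sum_tau R(tau) e^{-2 pi i f tau},
# checked against the known AR(1) spectrum. Parameters are invented.
a, max_lag = 0.8, 200
taus = np.arange(-max_lag, max_lag + 1)
R = a ** np.abs(taus) / (1 - a**2)        # exact AR(1) ACF (unit innovations)

f = np.linspace(-0.5, 0.5, 501)           # frequency in cycles per sample
S = np.real(np.exp(-2j * np.pi * np.outer(f, taus)) @ R)

S_exact = 1 / np.abs(1 - a * np.exp(-2j * np.pi * f))**2   # analytic AR(1) PSD
print(np.max(np.abs(S - S_exact)), S.min() >= 0)           # tiny error; S >= 0
```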

Correlation Functions and Power Spectra

Recall from Fourier transform theorems for deterministic functions we have the relationships:

$$\begin{array}{ccc}
f(t) & \xrightarrow{\ \mathrm{FT}\ } & F(f) \\[2pt]
\big\downarrow{\scriptstyle\ \mathrm{irreversible}} & & \big\downarrow{\scriptstyle\ \mathrm{irreversible}} \\[2pt]
\displaystyle\int dt\, f(t)\, f^*(t + \tau) & \xrightarrow{\ \mathrm{FT}\ } & |F(f)|^2
\end{array}$$

For stochastic processes the situation is different. We need to distinguish the power spectrum of a realization from the ensemble-average (true) power spectrum:

$$\begin{array}{ccc}
x(t) & \xrightarrow{\ \mathrm{FT}\ } & X(f) \\[2pt]
\big\downarrow{\scriptstyle\ \mathrm{irreversible}} & & \big\downarrow{\scriptstyle\ \mathrm{irreversible}} \\[2pt]
\langle x(t)\, x^*(t + \tau) \rangle & \xrightarrow{\ \mathrm{FT}\ } & S(f) = \langle |X(f)|^2 \rangle
\end{array}$$
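A small sketch of the distinction (an invented white-noise example, where the true spectrum is flat): the squared-magnitude spectrum of a single realization scatters wildly around $S(f)$, while averaging periodograms over the ensemble converges toward it.

```python
import numpy as np

rng = np.random.default_rng(6)
M, N = 500, 1024          # number of realizations and samples each (invented)

# Unit-variance white noise: the true (ensemble) spectrum is flat, S(f) = 1.
x = rng.normal(size=(M, N))
pgram = np.abs(np.fft.rfft(x, axis=1))**2 / N     # |X(f)|^2 for each realization

single = pgram[0]                  # one realization: ~100% scatter around S(f)
averaged = pgram.mean(axis=0)      # estimate of the ensemble average <|X(f)|^2>

print(single.std(), averaged.std())   # scatter shrinks roughly as 1/sqrt(M)
```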

Cross-correlation Functions

Suppose we have two random processes $X(t)$ and $Y(t)$ and we wish to test whether they are statistically related, e.g.

   $X(t)$ = sunspot number, $Y(t)$ = number of airline accidents
   $X(t)$ = pressure, $Y(t)$ = temperature
   $X(t)$ = seismic activity, $Y(t)$ = animal behavior

Useful statistics are the cross-correlation function (CCF)
$$R_{XY}(t_1, t_2) \equiv \langle X(t_1)\, Y^*(t_2) \rangle$$
and the cross-covariance function (CCV)
$$C_{XY}(t_1, t_2) = \langle [X(t_1) - \langle X(t_1)\rangle]\,[Y(t_2) - \langle Y(t_2)\rangle]^* \rangle.$$

Two random processes are uncorrelated if $C_{XY}(t_1, t_2) = 0\ \ \forall t_1, t_2$. Two random processes are orthogonal if $R_{XY}(t_1, t_2) = 0\ \ \forall t_1, t_2$.
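A sketch of how the CCF is used in practice (entirely invented data: `y` is a delayed, noisy copy of `x`, and `ccf` is a simple sample estimator): the peak of $R_{XY}(\tau)$ recovers the delay.

```python
import numpy as np

rng = np.random.default_rng(7)
N, d = 100_000, 7          # series length and true delay (both invented)

# Hypothetical related pair: Y is a delayed, noisy copy of X, so the CCF
# should peak at lag tau = d.
s = rng.normal(size=N + d)
x = s[d:]                                  # x(t) = s(t + d)
y = s[:N] + 0.5 * rng.normal(size=N)       # y(t) = x(t - d) + noise

def ccf(x, y, tau):
    """Sample cross-correlation R_XY(tau) = <x(t) y(t + tau)>."""
    n = min(len(x), len(y)) - abs(tau)
    if tau >= 0:
        return np.mean(x[:n] * y[tau:tau + n])
    return np.mean(x[-tau:-tau + n] * y[:n])

lags = range(-3, 12)
vals = [ccf(x, y, k) for k in lags]
print(max(zip(vals, lags)))                # peak value ~1 at lag tau = 7
```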