Module 23: Independence and Stationarity

Objective: To introduce the concepts of statistical independence, stationarity and its types with respect to random processes. This module also presents the concept of ergodicity.

Introduction

In probability theory, two events are independent (statistically independent, or stochastically independent) if the occurrence of one does not affect the probability of occurrence of the other, i.e., the probability of their joint occurrence equals the product of their individual probabilities.

A random process becomes a random variable when time is fixed at some particular value. That random variable possesses statistical properties, such as a mean value, moments, and variance, that are related to its density function. If two random variables are obtained from the process at two time instants, they have statistical properties related to their joint density function. More generally, N random variables possess statistical properties related to their N-dimensional joint density function.

A random process is said to be stationary if all its statistical properties do not change with time; other processes are called non-stationary. Equivalently, a stationary time series is one whose statistical properties, such as mean, variance, and autocorrelation, are all constant over time.

Fig 1: Stationary and Non-Stationary Time Series

As an example, white noise is stationary. The sound of a cymbal clashing, if hit only once, is not stationary, because the acoustic power of the clash (and hence its variance) diminishes with time. The sketch below illustrates this contrast numerically.
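A minimal numerical sketch of the contrast just described, assuming NumPy is available; the decay constant and window sizes are illustrative choices, not taken from the module.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
t = np.arange(n)

white = rng.normal(0.0, 1.0, n)       # stationary: constant mean and variance
cymbal = np.exp(-t / 2000.0) * white  # cymbal-like: variance decays with time

# Compare the sample variance over an early and a late window.
for name, x in [("white noise", white), ("cymbal-like", cymbal)]:
    early, late = x[:2000], x[-2000:]
    print(f"{name:12s} var(early) = {early.var():.3f}  var(late) = {late.var():.3f}")
```

The white-noise variances agree to within sampling error, while the decaying signal's late-window variance collapses toward zero, which is exactly the non-stationarity described above.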

Statistical Independence

Two random variables X and Y, with the events A = {X ≤ x} and B = {Y ≤ y}, are statistically independent when P(A ∩ B) = P(A) P(B).

Consider two processes X(t) and Y(t). The two processes are said to be statistically independent if the random variable group X(t_1), X(t_2), ..., X(t_N) is independent of the group Y(t'_1), Y(t'_2), ..., Y(t'_M) for every choice of the times. Independence requires that the joint density function be factorable by groups:

    f_{X,Y}(x_1, ..., x_N, y_1, ..., y_M; t_1, ..., t_N, t'_1, ..., t'_M)
        = f_X(x_1, ..., x_N; t_1, ..., t_N) · f_Y(y_1, ..., y_M; t'_1, ..., t'_M)

Statistical independence means one event conveys no information about the other; statistical dependence means there is some information.

Stationarity

To define stationarity, the distribution and density functions of a random process X(t) must first be defined. For a particular time t_1, the distribution function associated with the random variable X_1 = X(t_1) is denoted F_X(x_1; t_1) and is defined, for any real number x_1, as

    F_X(x_1; t_1) = P{X(t_1) ≤ x_1}

For two random variables X_1 = X(t_1) and X_2 = X(t_2), the second-order joint distribution function is the two-dimensional extension of the above equation:

    F_X(x_1, x_2; t_1, t_2) = P{X(t_1) ≤ x_1, X(t_2) ≤ x_2}

In a similar manner, for N random variables X_i = X(t_i), i = 1, 2, ..., N, the Nth-order joint distribution function is

    F_X(x_1, ..., x_N; t_1, ..., t_N) = P{X(t_1) ≤ x_1, ..., X(t_N) ≤ x_N}

Joint density functions are found from derivatives of the joint distribution functions:

    f_X(x_1; t_1) = ∂F_X(x_1; t_1)/∂x_1
    f_X(x_1, x_2; t_1, t_2) = ∂²F_X(x_1, x_2; t_1, t_2)/(∂x_1 ∂x_2)
    f_X(x_1, ..., x_N; t_1, ..., t_N) = ∂^N F_X(x_1, ..., x_N; t_1, ..., t_N)/(∂x_1 ··· ∂x_N)

First-Order Stationary Processes

A random process is called stationary to order one if its first-order density function does not change with a shift in the time origin. In other words,

    f_X(x_1; t_1) = f_X(x_1; t_1 + Δ)

must hold for any t_1 and any real number Δ if X(t) is to be a first-order stationary process. If f_X(x_1; t_1) is independent of t_1, the process mean value is constant:

    E[X(t)] = X̄ = constant

The sketch below checks this constancy of the mean numerically.
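A hedged sketch of the first-order condition: the ensemble mean E[X(t)] is estimated at several time instants. The process X(t) = cos(ωt + Θ), with Θ uniform on [0, 2π), is an illustrative choice (a standard example of a stationary process), and the frequency and sample counts are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_times = 50_000, 5
omega = 2 * np.pi * 0.3

# One random phase per sample function; rows are realizations, columns are times.
theta = rng.uniform(0.0, 2 * np.pi, size=(n_paths, 1))
t = np.linspace(0.0, 10.0, n_times)
X = np.cos(omega * t + theta)

# For a first-order stationary process the ensemble mean is the same at every t.
print(X.mean(axis=0))   # all entries near 0, independent of t
```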

Second-Order and Wide-Sense Stationarity

A process is called stationary to order two if its second-order density function satisfies

    f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_1 + Δ, t_2 + Δ)

for all t_1, t_2 and Δ. The correlation E[X(t_1) X(t_2)] of a random process is, in general, a function of t_1 and t_2. Denote this function by R_XX(t_1, t_2) and call it the autocorrelation function of the random process X(t):

    R_XX(t_1, t_2) = E[X(t_1) X(t_2)]

The autocorrelation function of a second-order stationary process is a function only of the time difference and not of absolute time; that is, with τ = t_2 − t_1,

    R_XX(t_1, t_1 + τ) = E[X(t_1) X(t_1 + τ)] = R_XX(τ)

A process for which the following two conditions hold is called wide-sense stationary (every second-order stationary process is wide-sense stationary, but the converse need not be true):

    E[X(t)] = X̄ = constant
    E[X(t_1) X(t_1 + τ)] = R_XX(τ)

Most practical problems deal with the autocorrelation and mean value of a random process. If these quantities do not depend on absolute time, problem solutions are greatly simplified; the sketch after this subsection illustrates the point numerically.

Nth-Order and Strict-Sense Stationarity

A random process is stationary to order N if its Nth-order density function is invariant to a shift of the time origin; that is, if

    f_X(x_1, ..., x_N; t_1, ..., t_N) = f_X(x_1, ..., x_N; t_1 + Δ, ..., t_N + Δ)

for all t_1, ..., t_N and Δ. Stationarity of order N implies stationarity to all orders k ≤ N. A process stationary to all orders N = 1, 2, 3, ... is called strict-sense stationary.
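Returning to the wide-sense conditions above, this sketch estimates R_XX(t_1, t_1 + τ) over the ensemble at two different absolute times t_1 for the random-phase cosine used earlier; theory gives R_XX(τ) = ½ cos(ωτ) regardless of t_1. The specific times, lag, and frequency are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
omega = 2 * np.pi * 0.3
theta = rng.uniform(0.0, 2 * np.pi, 100_000)

def R_hat(t1, tau):
    """Ensemble estimate of E[X(t1) X(t1 + tau)] for X(t) = cos(omega*t + Theta)."""
    return np.mean(np.cos(omega * t1 + theta) * np.cos(omega * (t1 + tau) + theta))

tau = 0.4
for t1 in (0.0, 3.7):
    print(f"t1 = {t1}: estimate = {R_hat(t1, tau):.4f}, theory = {0.5 * np.cos(omega * tau):.4f}")
```

Both estimates agree with the theoretical value, confirming that the autocorrelation depends only on the lag τ, not on absolute time.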

Time Averages and Ergodicity

In probability theory, an ergodic system is one that has the same behaviour averaged over time as averaged over the ensemble of sample functions. A random process is ergodic if its time averages are the same as its averages over the probability space. The time average of a quantity is defined as

    A[·] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} [·] dt

Here A is used to denote the time average in a manner analogous to E for the statistical average. Specific averages of interest are the mean value x̄ = A[x(t)] of a sample function and the time autocorrelation function, denoted R_xx(τ) = A[x(t) x(t + τ)]. These functions are defined by

    x̄ = A[x(t)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) dt

    R_xx(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) x(t + τ) dt

By taking the expected value on both sides of the above two equations, and assuming the expectation can be brought inside the integrals,

    E[x̄] = X̄
    E[R_xx(τ)] = R_XX(τ)

Ergodic Theorem

The ergodic theorem states that all the time averages of the sample functions of X(t) are equal to the corresponding statistical, or ensemble, averages of X(t); i.e.,

    x̄ = X̄
    R_xx(τ) = R_XX(τ)

Ergodic Processes

Random processes that satisfy the ergodic theorem are called ergodic processes. The analysis of ergodicity is extremely complex; in most physical applications, it is simply assumed that the stationary processes involved are ergodic.

Mean-Ergodic Processes

A process X(t) with a constant mean value X̄ is called mean ergodic, or ergodic in the mean, if its statistical average X̄ equals the time average x̄ of any sample function x(t) with probability 1 for all sample functions; that is, if

    E[X(t)] = X̄ = A[x(t)] = x̄ with probability 1 for all x(t).

Correlation-Ergodic Processes

Analogous to a mean-ergodic process, a stationary continuous process X(t) with autocorrelation function R_XX(τ) is called autocorrelation ergodic, or ergodic in the autocorrelation, if and only if, for all τ,

    lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) x(t + τ) dt = R_XX(τ)
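The time averages above can be approximated from one long, discretely sampled record by finite sums. This sketch does so for white noise; the record length, sampling step, and lags are assumed values.

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.01
x = rng.normal(0.0, 1.0, 200_000)   # one long sample function (white noise)

# x_bar approximates A[x(t)] = lim (1/(2T)) * integral of x(t) dt.
x_bar = x.mean()

def time_autocorr(x, lag):
    """Approximate R_xx(tau) = A[x(t) x(t + tau)] at tau = lag * dt."""
    return np.mean(x * x) if lag == 0 else np.mean(x[:-lag] * x[lag:])

print(f"x_bar = {x_bar:.4f}")       # near 0, matching the ensemble mean
for lag in (0, 1, 5):
    print(f"tau = {lag * dt:.2f}: R_xx = {time_autocorr(x, lag):.4f}")
# R_xx(0) is near 1 (the variance); nonzero lags are near 0, as white noise demands.
```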

Example: Applying the Concept of Ergodicity

Ergodicity means the ensemble average equals the time average. Every resistor has thermal noise associated with it, and that noise depends on the temperature. Take N resistors (N very large) and record the voltage across each of them for a long period; each resistor yields one waveform. Calculating the average value of one such waveform over time gives the time average. The N waveforms together are known as an ensemble. Now take a particular instant of time in all those waveforms and find the average voltage across them; that gives the ensemble average at that instant. If the ensemble average and the time average are the same, the process is ergodic. The sketch below carries out this thought experiment with simulated noise.
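A numeric version of the resistor experiment, with Gaussian white noise standing in for thermal noise (an assumption, as are the values of N and the record length).

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 500, 10_000
v = rng.normal(0.0, 1.0, size=(N, T))   # v[i, k]: voltage of resistor i at sample k

time_avg = v[0].mean()                  # average one waveform over its whole record
ensemble_avg = v[:, T // 2].mean()      # average across all resistors at one instant

print(f"time average = {time_avg:.4f}, ensemble average = {ensemble_avg:.4f}")
# Both are near the true mean (0 here), so the simulated process behaves ergodically.
```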