Introduction to Random Processes


Lecture 12, Spring 2002

Random Process

A random variable is a function $X(e)$ that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome $e$ of an experiment to a function $X(t, e)$.

A random process is usually conceived of as a function of time, but there is no reason not to consider random processes that are functions of other independent variables, such as spatial coordinates. The function $X(u, v, e)$ would be a function whose value depends on the location $(u, v)$ and the outcome $e$, and could be used to represent random variations in an image.

The domain of $e$ is the set of outcomes of the experiment. We assume that a probability distribution is known for this set. The domain of $t$ is a set, $T$, of real numbers. If $T$ is the real axis then $X(t, e)$ is a continuous-time random process; if $T$ is the set of integers then $X(t, e)$ is a discrete-time random process. We will often suppress the display of the variable $e$ and write $X(t)$ for a continuous-time RP and $X[n]$ or $X_n$ for a discrete-time RP.

A RP is a family of functions, $X(t, e)$. Imagine a giant strip-chart recording in which each pen is identified with a different $e$. This family of functions is traditionally called an ensemble. A single function $X(t, e_k)$ is selected by the outcome $e_k$; this is just a time function that we could call $X_k(t)$. Different outcomes give us different time functions. If $t$ is fixed, say $t = t_1$, then $X(t_1, e)$ is a random variable; its value depends on the outcome $e$. If both $t$ and $e$ are given, then $X(t, e)$ is just a number.
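To make the ensemble picture concrete, here is a minimal IDL sketch (our addition, in the spirit of the lecture's Poisson program below; the function name is ours). Each row is one sample function $X_k(t) = \cos(2\pi t + \Theta_k)$, with the random phase $\Theta_k$ playing the role of the outcome $e_k$: fixing a row gives a time function, while fixing a column (a time $t_1$) gives a random variable.

FUNCTION RandomPhaseEnsemble, nfun, nt
; Returns an nfun x nt array: one sample function per row, sampled at
; nt times on [0,1).  Each outcome e_k contributes one random phase.
t = FINDGEN(nt)/nt
theta = 2*!PI*RANDOMU(seed, nfun)
x = FLTARR(nfun, nt)
FOR k = 0, nfun-1 DO x[k,*] = COS(2*!PI*t + theta[k])
RETURN, x
END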

Moments and Averages

$X(t_1, e)$ is a random variable that represents the set of samples across the ensemble at time $t_1$. If it has a probability density function $f_X(x; t_1)$ then the moments are
$$m_n(t_1) = E[X^n(t_1)] = \int_{-\infty}^{\infty} x^n f_X(x; t_1)\,dx$$
The notation $f_X(x; t_1)$ may be necessary because the probability density may depend upon the time the samples are taken. The mean value is $\mu_X = m_1$, which may be a function of time. The central moments are
$$E\left[(X(t_1) - \mu_X(t_1))^n\right] = \int_{-\infty}^{\infty} (x - \mu_X(t_1))^n f_X(x; t_1)\,dx$$

Pairs of Samples

The numbers $X(t_1, e)$ and $X(t_2, e)$ are samples from the same time function at different times. They are a pair of random variables $(X_1, X_2)$ with a joint probability density function $f(x_1, x_2; t_1, t_2)$. From the joint density function one can compute the marginal densities, conditional probabilities and other quantities that may be of interest.

Covariance and Correlation

The covariance of the samples is
$$C(t_1, t_2) = E[(X_1 - \mu_1)(X_2 - \mu_2)^*] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 - \mu_1)(x_2 - \mu_2)^*\, f(x_1, x_2; t_1, t_2)\,dx_1\,dx_2$$
The correlation function is
$$R(t_1, t_2) = E[X_1 X_2^*] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2^*\, f(x_1, x_2; t_1, t_2)\,dx_1\,dx_2$$
so that
$$C(t_1, t_2) = R(t_1, t_2) - \mu_1 \mu_2^*$$
Note that both the covariance and correlation functions are conjugate symmetric in $t_1$ and $t_2$: $C(t_1, t_2) = C^*(t_2, t_1)$ and $R(t_1, t_2) = R^*(t_2, t_1)$.
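These ensemble moments can be estimated directly by averaging across realizations at fixed times. A minimal sketch (ours), reusing the RandomPhaseEnsemble function above:

FUNCTION EnsembleCorrelation, x, i1, i2
; Estimate R(t1,t2) for a real process from an ensemble array x
; (one sample function per row) by averaging across the rows at
; the fixed column indices i1 and i2.
RETURN, MEAN(x[*,i1] * x[*,i2])
END

; Usage:
;   x = RandomPhaseEnsemble(10000, 100)
;   PRINT, MEAN(x[*,20])                  ; estimate of mu(t_20); near 0
;   PRINT, EnsembleCorrelation(x, 20, 30)

For this particular ensemble the exact value is $R(t_1, t_2) = \tfrac{1}{2}\cos 2\pi(t_1 - t_2)$, so the estimate can be checked directly.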

Mean-Squared Value

The average power in the process at time $t$ is represented by $R(t, t) = E[|X(t)|^2]$, and $C(t, t)$ represents the power in the fluctuation about the mean value.

Example: Poisson Random Process

Let $N(t_1, t_2)$ be the number of events produced by a Poisson process in the interval $(t_1, t_2)$ when the average rate is $\lambda$ events per second. The probability that $N = n$ is
$$P[N = n] = \frac{(\lambda\tau)^n e^{-\lambda\tau}}{n!}$$
where $\tau = t_2 - t_1$. Then $E[N(t_1, t_2)] = \lambda\tau$. A random process can be defined as the number of events in the interval $(0, t)$. Thus, $X(t) = N(0, t)$, and the expected number of events in $t$ is $E[X(t)] = \lambda t$.

Example, continued

For a Poisson distribution we know that the variance is
$$E[(X(t) - \lambda t)^2] = E[X^2(t)] - (\lambda t)^2 = \lambda t$$
The average power in the function $X(t)$ is
$$E[X^2(t)] = \lambda t + \lambda^2 t^2$$
A graph of $X(t)$ would show a function fluctuating about an average trend line with slope $\lambda$.
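For concreteness, a quick numerical check (our addition, not in the original notes): with $\lambda = 2$ events per second and $t = 3$ seconds,
$$E[X(3)] = \lambda t = 6, \qquad \operatorname{Var}[X(3)] = \lambda t = 6, \qquad E[X^2(3)] = \lambda t + (\lambda t)^2 = 6 + 36 = 42.$$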

Program for Poisson Random Process

FUNCTION PoissonProcess, t, lambda, p
; S = PoissonProcess(t, lambda, p)
; Divides the interval [0,t] into intervals of size
; deltaT = p/lambda, where p is sufficiently small so that
; the Poisson assumptions are satisfied.
;
; The interval (0,t) is divided into n = t*lambda/p intervals,
; and the number of events in the interval (0, k*deltaT) is
; returned in the array S. The maximum length of S is 10000.
;
; USAGE
;   S = PoissonProcess(10, 1, 0.1)
;   PLOT, S
;   FOR m = 1, 10 DO OPLOT, PoissonProcess(10, 1, 0.1)
NP = N_PARAMS()
IF NP LT 3 THEN p = 0.1
n = lambda*t/p
u = RANDOMN(seed, n, POISSON=p)
s = INTARR(n+1)
FOR k = 1, n DO s[k] = s[k-1] + u[k-1]
RETURN, s
END

Example: Random Telegraph Signal

Consider a random process that has the following properties:

1. $X(t) = \pm 1$;
2. the number of zero crossings in the interval $(0, t)$ is described by a Poisson process; and
3. $X(0) = 1$ (a condition to be removed later).

Find the expected value at time $t$.

Let $N(t)$ equal the number of zero crossings in the interval $(0, t)$, with $t \ge 0$, so that
$$P(N = n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}$$
Then
$$P[X(t) = 1] = P[N \text{ is even}] = e^{-\lambda t}\left[1 + \frac{(\lambda t)^2}{2!} + \frac{(\lambda t)^4}{4!} + \cdots\right] = e^{-\lambda t}\cosh \lambda t$$
$$P[X(t) = -1] = P[N \text{ is odd}] = e^{-\lambda t}\left[\lambda t + \frac{(\lambda t)^3}{3!} + \frac{(\lambda t)^5}{5!} + \cdots\right] = e^{-\lambda t}\sinh \lambda t$$
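Building on the program above, here is a quick Monte-Carlo check (our addition; the procedure name and the choice p = 0.01 are ours) that the simulated telegraph signal matches $P[X(t) = 1] = e^{-\lambda t}\cosh\lambda t$:

PRO CheckTelegraph, t, lambda, ntrial
; Estimate P[X(t)=1] for the telegraph signal by simulation and
; compare with exp(-lambda*t)*cosh(lambda*t).
hits = 0L
FOR m = 1L, ntrial DO BEGIN
  s = PoissonProcess(t, lambda, 0.01)
  n = N_ELEMENTS(s)
  IF (s[n-1] MOD 2) EQ 0 THEN hits = hits + 1   ; X(t) = +1 iff N(0,t) is even
ENDFOR
PRINT, 'simulated: ', FLOAT(hits)/ntrial
PRINT, 'theory:    ', EXP(-lambda*t)*COSH(lambda*t)
END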

The expected value is
$$E[X(t) \mid X(0) = 1] = e^{-\lambda t}\cosh\lambda t - e^{-\lambda t}\sinh\lambda t = e^{-2\lambda t}$$
Note that the expected value decays toward $x = 0$ for large $t$. That happens because the influence of knowing the value at $t = 0$ decays exponentially.

The autocorrelation function is computed by finding $R(t_1, t_2) = E[X(t_1)X(t_2)]$. Let $x_0 = 1$ and $x_1 = -1$ denote the two values that $X$ can attain. For the moment assume that $t_2 \ge t_1$. Then
$$R(t_1, t_2) = \sum_{j=0}^{1}\sum_{k=0}^{1} x_j x_k\, P[X(t_1) = x_k]\, P[X(t_2) = x_j \mid X(t_1) = x_k]$$
The first factor in each product is given above. To find the conditional probabilities we take note of the fact that the number of sign changes in $(t_1, t_2)$ is a Poisson process. Hence, in a manner similar to the analysis above,
$$P[X(t_2) = 1 \mid X(t_1) = 1] = P[X(t_2) = -1 \mid X(t_1) = -1] = e^{-\lambda(t_2 - t_1)}\cosh\lambda(t_2 - t_1)$$
$$P[X(t_2) = -1 \mid X(t_1) = 1] = P[X(t_2) = 1 \mid X(t_1) = -1] = e^{-\lambda(t_2 - t_1)}\sinh\lambda(t_2 - t_1)$$
Hence
$$R(t_1, t_2) = e^{-\lambda t_1}\cosh\lambda t_1 \left[e^{-\lambda(t_2 - t_1)}\cosh\lambda(t_2 - t_1) - e^{-\lambda(t_2 - t_1)}\sinh\lambda(t_2 - t_1)\right] + e^{-\lambda t_1}\sinh\lambda t_1 \left[e^{-\lambda(t_2 - t_1)}\cosh\lambda(t_2 - t_1) - e^{-\lambda(t_2 - t_1)}\sinh\lambda(t_2 - t_1)\right]$$
Since $\cosh x + \sinh x = e^{x}$ and $\cosh x - \sinh x = e^{-x}$, this reduces to
$$R(t_1, t_2) = e^{-2\lambda(t_2 - t_1)} \qquad \text{for } t_2 \ge t_1$$
A parallel analysis applies to the case $t_2 \le t_1$, so that
$$R(t_1, t_2) = e^{-2\lambda|t_2 - t_1|}$$
The autocorrelation for the telegraph signal depends only upon the time difference, not the location of the time interval. We will see soon that this is a very important characteristic of stationary random processes.

We can now remove condition (3) on the telegraph process. Let $Y(t) = A X(t)$, where $A$ is a random variable independent of $X$ that takes on the values $\pm 1$ with equal probability. Then $Y(0)$ will equal $\pm 1$ with equal probability, and the telegraph process will no longer have the restriction of being positive at $t = 0$. Since $A$ and $X$ are independent, the autocorrelation for $Y(t)$ is given by
$$E[Y(t_1)Y(t_2)] = E[A^2]\,E[X(t_1)X(t_2)] = e^{-2\lambda|t_2 - t_1|}$$
since $E[A^2] = 1$.
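The derived autocorrelation can be checked the same way. This sketch (ours, with assumed names) averages $X(t_1)X(t_2)$ over independent realizations generated by PoissonProcess:

PRO CheckTelegraphR, t1, t2, lambda, ntrial
; Estimate R(t1,t2) = E[X(t1)X(t2)] for the telegraph signal by
; ensemble averaging and compare with exp(-2*lambda*(t2-t1)).
; Assumes t2 >= t1.
p = 0.01
acc = 0.0
FOR m = 1L, ntrial DO BEGIN
  s = PoissonProcess(t2, lambda, p)
  k1 = LONG(t1*lambda/p)                     ; grid index of time t1
  x1 = 1 - 2*(s[k1] MOD 2)                   ; X(t1) = (-1)^N(0,t1)
  x2 = 1 - 2*(s[N_ELEMENTS(s)-1] MOD 2)      ; X(t2)
  acc = acc + x1*x2
ENDFOR
PRINT, 'simulated: ', acc/ntrial
PRINT, 'theory:    ', EXP(-2*lambda*(t2-t1))
END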

Stationary Random Process

The random telegraph is one example of a process that has at least some statistics that are independent of time. Random processes whose statistics do not depend on time are called stationary. In general, random processes can have joint statistics of any order; if the process is stationary, they are independent of time shift.

The first-order statistics are described by the cumulative distribution function $F(x; t)$. If the process is stationary then the distribution functions at times $t = t_1$ and $t = t_2$ will be identical. If a process is strict-sense stationary then the joint probability distributions of all orders are independent of the time origin.

Wide-Sense Stationary Processes

We are often particularly interested in processes that are stationary up to at least order $n = 2$. Such processes are called wide-sense stationary (wss). If a process is wss then its mean, variance, autocorrelation function and other first- and second-order statistical measures are independent of time.

We have seen that a Poisson random process has mean $\mu(t) = \lambda t$, so it is not stationary in any sense. The telegraph signal has mean $\mu = 0$, variance $\sigma^2 = 1$ and autocorrelation function $R(t_1, t_2) = R(\tau) = e^{-2\lambda|\tau|}$, where $\tau = t_2 - t_1$. It is a wss process.

Correlation and Covariance

The autocorrelation function of a wss process satisfies $R(\tau) = E[X(t)X(t + \tau)]$ for any value of $t$. Then $R(0) = E[X^2]$. The covariance function is
$$C(\tau) = E[(X(t) - \mu)(X(t + \tau) - \mu)] = R(\tau) - \mu^2$$
and
$$C(0) = E[(X(t) - \mu)^2] = \sigma^2$$

Two random processes $X(t)$ and $Y(t)$ are called jointly wide-sense stationary if each is wss and their cross-correlation depends only on $\tau = t_2 - t_1$. Then
$$R_{xy}(\tau) = E[X(t)Y(t + \tau)]$$
is called the cross-correlation function and
$$C_{xy}(\tau) = R_{xy}(\tau) - \mu_x \mu_y$$
is called the cross-covariance function.
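As a small worked example (our addition, not in the original notes): if $Y(t) = X(t - d)$ is a delayed copy of a wss process $X(t)$, then the pair is jointly wss and
$$R_{xy}(\tau) = E[X(t)Y(t + \tau)] = E[X(t)X(t + \tau - d)] = R_{xx}(\tau - d)$$
Since $|R_{xx}(\tau)| \le R_{xx}(0)$, the cross-correlation peaks at $\tau = d$; this is the basis of correlation-based delay estimation.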

Simplification with Wide-Sense Stationarity

A stochastic process $x(t)$ is wss if its mean is constant, $E[x(t)] = \mu$, and its autocorrelation depends only on $\tau = t_1 - t_2$:
$$R_{xx}(t_1, t_2) = E[x(t_1)x^*(t_2)], \qquad E[x(t + \tau)x^*(t)] = R_{xx}(\tau)$$
Because the result is indifferent to the time origin, it can also be written as
$$R_{xx}(\tau) = E\left[x\left(t + \frac{\tau}{2}\right) x^*\left(t - \frac{\tau}{2}\right)\right]$$
Note that $R_{xx}(-\tau) = R_{xx}^*(\tau)$ and $R_{xx}(0) = E[|x(t)|^2]$.

Example

Suppose that $x(t)$ is wss with $R_{xx}(\tau) = A e^{-b|\tau|}$. Determine the second moment of the random variable $x(6) - x(2)$:
$$E[(x(6) - x(2))^2] = E[x^2(6)] - 2E[x(6)x(2)] + E[x^2(2)] = R_{xx}(0) - 2R_{xx}(4) + R_{xx}(0) = 2A - 2Ae^{-4b}$$
Determine the second moment of $x(12) - x(8)$:
$$E[(x(12) - x(8))^2] = E[x^2(12)] - 2E[x(12)x(8)] + E[x^2(8)] = R_{xx}(0) - 2R_{xx}(4) + R_{xx}(0) = 2A - 2Ae^{-4b}$$
The two answers agree because for a wss process only the time difference matters.

Ergodic Random Process

A practical problem arises when we want to calculate parameters such as the mean or variance of a random process. The definition would require that we have a large number of examples of the random process and that we calculate the parameters for different values of $t$ by averaging across the ensemble. Often we are faced with the situation of having only one member of the ensemble, that is, one of the time functions. Under what circumstances is it appropriate to use it to draw conclusions about the whole ensemble?

A random process is ergodic if every member of the process carries with it the complete statistics of the whole process. Then its ensemble averages will equal appropriate time averages. Of necessity, an ergodic process must be stationary, but not all stationary processes are ergodic.
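To make the ergodic idea concrete, a final sketch (our addition; the procedure name and step size are ours): time-average estimates from a single long realization of the telegraph signal, which for this ergodic process should approach the ensemble values $\mu = 0$ and $R(\tau) = e^{-2\lambda|\tau|}$.

PRO TimeAverages, t, lambda, tau
; Time-average estimates of the mean and autocorrelation from ONE long
; telegraph realization, using PoissonProcess above.  The random sign A
; removes the X(0) = 1 condition as in the lecture.
p = 0.01
s = PoissonProcess(t, lambda, p)
a = 2*(RANDOMU(seed) LT 0.5) - 1            ; A = +/-1 with equal probability
x = a * (1 - 2*(s MOD 2))                   ; Y(t) = A * (-1)^N(0,t)
k = LONG(tau*lambda/p)                      ; lag tau in grid samples
n = N_ELEMENTS(x)
PRINT, 'time-average mean:   ', MEAN(FLOAT(x))
PRINT, 'time-average R(tau): ', MEAN(FLOAT(x[0:n-1-k]*x[k:n-1]))
PRINT, 'ensemble R(tau):     ', EXP(-2*lambda*tau)
END

With, say, TimeAverages, 100, 1.0, 0.5, the two $R(\tau)$ values should agree to within sampling error, while a non-ergodic process would show a persistent discrepancy.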