ENSC327 Communications Systems 19: Random Processes. Jie Liang School of Engineering Science Simon Fraser University


Outline
- Random processes
- Stationary random processes
- Autocorrelation of random processes

Definition of Random Process
A deterministic process has only one possible 'reality' of how it evolves over time. In a stochastic (random) process there is some uncertainty in its future evolution, described by probability distributions: even if the initial condition (starting point) is known, there are many paths the process might take, some more probable than others. http://en.wikipedia.org/wiki/stochastic_process

Definition of Random Process
Many time-varying signals are random in nature: noises; image and audio signals, which are usually unknown to the distant receiver. A random process is the mathematical model of such random signals.
Definition: A random process (or stochastic process) is a collection of random variables (functions) indexed by time.
Notation: X(t, s), where s is the sample point of the random experiment and t is time. Simplified notation: X(t).

Random Processes
The difference between a random variable and a random process:
- Random variable: an outcome is mapped to a number.
- Random process: an outcome is mapped to a random waveform that is a function of time.
We are interested in how these time functions evolve: correlation, spectra, and behavior in linear systems.

Cont'd
For a fixed sample point s_j, X(t, s_j) is a realization or sample function of the random process. Simplified notation: X(t, s_j) is denoted x_j(t).
For a fixed time t_k, the set of numbers {X(t_k, s_1), ..., X(t_k, s_n)} = {x_1(t_k), ..., x_n(t_k)} is a random variable, denoted X(t_k).
For fixed s_j and t_k, X(t_k, s_j) is a number.
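A minimal sketch of these two views, using a toy process of my own choosing (a cosine with a random phase, not a process from the slides): fixing the sample point gives an ordinary function of time, while fixing the time gives a random variable.

```python
import math
import random

random.seed(0)

# Toy process (illustrative assumption): X(t, s) = cos(2*pi*t + phase(s)),
# where each sample point s fixes one random phase.
phases = [random.uniform(0, 2 * math.pi) for _ in range(5)]  # sample points s_1..s_5

def X(t, j):
    """Value of the process at time t for sample point s_j."""
    return math.cos(2 * math.pi * t + phases[j])

# Fix the sample point s_1: x_1(t) is an ordinary function of time.
sample_function = [X(k / 10, 0) for k in range(10)]

# Fix the time t_k = 0.3: {X(t_k, s_1), ..., X(t_k, s_5)} is a random variable.
values_at_tk = [X(0.3, j) for j in range(5)]

print(len(sample_function), len(values_at_tk))  # 10 5
```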

Pictorial View
Each sample point represents a time-varying function. Ensemble: the set of all time-varying functions.

Examples of Random Processes
1. X(t, s) = Y(s) f(t), or X(t) = Y f(t) for short, where Y is a random variable and f is a deterministic function of parameter t.
2. X(t, s) = A(s) cos(2πf_0 t + Θ(s)), or X(t) = A cos(2πf_0 t + Θ) for short, where A and Θ are random variables.
3. X(n), T(n): random sequences; p_n(t): deterministic waveforms.
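The second example can be sketched as a simulation: each draw of (A, Θ) yields one realization, a fully determined waveform of t. The value of f_0 and the distributions of A and Θ below are my own assumptions for illustration.

```python
import math
import random

random.seed(1)

f0 = 5.0  # carrier frequency (illustrative choice)

def draw_sample_function():
    """One outcome of the random experiment: draw (A, Theta) once,
    then return the deterministic waveform t -> A*cos(2*pi*f0*t + Theta)."""
    A = random.gauss(0, 1)                   # random amplitude (assumed Gaussian)
    Theta = random.uniform(0, 2 * math.pi)   # random phase (assumed uniform)
    return lambda t: A * math.cos(2 * math.pi * f0 * t + Theta)

x1 = draw_sample_function()  # one realization
x2 = draw_sample_function()  # another realization with its own (A, Theta)

# Within a realization, the waveform is deterministic: same t gives same value.
print(x1(0.1) == x1(0.1))  # True
```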

Probability Distribution of a Random Process
For any random process, its probability distribution is uniquely determined by its finite-dimensional distributions. The k-dimensional cumulative distribution function of a process is defined by
F_{X(t_1), ..., X(t_k)}(x_1, ..., x_k) = P(X(t_1) ≤ x_1, ..., X(t_k) ≤ x_k)
for any t_1, ..., t_k and any real numbers x_1, ..., x_k. The cumulative distribution function tells us everything we need to know about the process {X(t)}.

Outline
- Random processes
- Stationary random processes
- Autocorrelation of random processes

Stationarity
In general, the time-dependent N-fold joint pdfs are needed to describe a random process for all possible N, and obtaining all of these pdfs is very difficult. The analysis can be simplified if the statistics are time-independent. The random process is called first-order stationary if the CDF F_{X(t)}(x) of the random process X(t) at a fixed time t does not depend on t.

Stationarity
Equivalently, the pdf f_{X(t)}(x) is independent of time. The mean and variance of a first-order stationary random process are therefore independent of time:
µ_X(t_1) = µ_X(t_1 + τ) = µ_X,  σ²_X(t_1) = σ²_X(t_1 + τ) = σ²_X.
Proof:
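A Monte-Carlo sanity check of the time-independent mean, using the uniform-phase cosine process (my own illustrative choice; this process is in fact stationary): the ensemble mean comes out near 0 at any two times.

```python
import math
import random

random.seed(2)

# Process: X(t) = cos(2*pi*f*t + Theta), Theta uniform on [0, 2*pi).
# For a first-order stationary process the ensemble mean must not depend on t.
f, n = 1.0, 200_000
thetas = [random.uniform(0, 2 * math.pi) for _ in range(n)]

def ensemble_mean(t):
    """Average X(t) over n independent realizations (sample points)."""
    return sum(math.cos(2 * math.pi * f * t + th) for th in thetas) / n

m1, m2 = ensemble_mean(0.0), ensemble_mean(0.37)
print(abs(m1) < 0.01, abs(m2) < 0.01)  # True True: mean is ~0 at both times
```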

Stationarity
The random process is called second-order stationary if the 2nd-order CDF is independent of the time origin:
F_{X(t_1), X(t_2)}(x_1, x_2) = F_{X(t_1+τ), X(t_2+τ)}(x_1, x_2) for all t_1, t_2, and τ.

Strict Stationarity
A strictly stationary process (or strongly stationary process, or simply stationary process) is a stochastic process whose joint pdf does not change when shifted in time.
Definition: a random process X(t) is said to be stationary if, for all k, for all τ, and for all t_1, t_2, ..., t_k,
F_{X(t_1), ..., X(t_k)}(x_1, ..., x_k) = F_{X(t_1+τ), ..., X(t_k+τ)}(x_1, ..., x_k).

Strict Stationarity
An example of a strictly stationary process is one in which all X(t_i) are mutually independent and identically distributed. Such a random process is called an IID random process. In this case the joint pdf factors as
f_{X(t_1), ..., X(t_k)}(x_1, ..., x_k) = f_X(x_1) ··· f_X(x_k).
Since this joint pdf does not depend on the times {t_i}, the process is strictly stationary. An example of an IID process is white noise (studied later), which is widely used in communications theory.
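A minimal sketch of an IID process: every time sample is an independent draw from the same distribution, here standard Gaussian "white noise" (the Gaussian choice and the array sizes are my own assumptions).

```python
import random

random.seed(3)

# Each row is one realization of the process; each column is one time index.
# Every entry is an independent N(0, 1) draw, so the joint distribution of
# (X(t_1), ..., X(t_k)) is unchanged by any time shift tau.
n_times, n_realizations = 50, 10
noise = [[random.gauss(0, 1) for _ in range(n_times)]
         for _ in range(n_realizations)]

print(len(noise), len(noise[0]))  # 10 50
```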

Outline
- Random processes
- Stationary random processes
- Autocorrelation of random processes

Correlation of Random Processes
Recall the covariance of two random variables:
Cov(X, Y) = E{[X − µ_X][Y − µ_Y]} = E{XY} − µ_X µ_Y.
Consider X(t_1) and X(t_2), samples of X(t) at times t_1 and t_2. They are both random variables, so we can also define their covariance:
Cov(X(t_1), X(t_2)) = E{X(t_1) X(t_2)} − µ_X(t_1) µ_X(t_2).

Correlation of Random Processes
Recall the autocorrelation of a deterministic energy signal:
R_x(τ) = ∫ x(t) x*(t − τ) dt.
Similarly, the autocorrelation of a deterministic power signal is
R_x(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x*(t − τ) dt.
The limit is necessary since the energy of a power signal can be infinite.
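The time-average definition can be checked numerically by replacing the limit with a long finite window. The signal x(t) = cos(2πt), the step size, and the window length below are my own choices; for a unit cosine the exact answer is 0.5·cos(2πτ).

```python
import math

# Riemann-sum approximation of (1/2T) * integral_{-T}^{T} x(t) x(t - tau) dt
# for the real power signal x(t) = cos(2*pi*t).
dt, T = 0.001, 50.0            # step and half-window (illustrative choices)
n = int(2 * T / dt)

def R_time(tau):
    acc = 0.0
    for k in range(n):
        t = -T + k * dt
        acc += math.cos(2 * math.pi * t) * math.cos(2 * math.pi * (t - tau)) * dt
    return acc / (2 * T)

# Compare against the exact 0.5*cos(2*pi*tau) at two lags.
print(abs(R_time(0.0) - 0.5) < 1e-3, abs(R_time(0.25)) < 1e-3)  # True True
```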

Correlation of Random Processes
For random processes we need to consider probability distributions. The autocorrelation function of a random process is
R_X(t_1, t_2) = E{X(t_1) X*(t_2)}.
Steps to get E{X(t_1) X*(t_2)}:
1. For each sample function X(t, s_j), calculate X(t_1, s_j) X*(t_2, s_j).
2. Take the weighted average over all possible sample functions s_j. (See Example 1 later.)
If X(t) is stationary to the 2nd or higher order, R_X(t_1, t_2) depends only on the time difference t_1 − t_2, so it can be written as a function of a single variable:
R_X(τ) = E{X(t + τ) X*(t)}, with τ = t_1 − t_2.
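The two steps above can be sketched as an ensemble estimate. The process below (a uniform-phase cosine) is my own illustrative choice; for it the exact autocorrelation is 0.5·cos(2π(t_1 − t_2)), which indeed depends only on the time difference.

```python
import math
import random

random.seed(4)

# Process: X(t) = cos(2*pi*t + Theta), Theta uniform on [0, 2*pi), real-valued.
n = 100_000
thetas = [random.uniform(0, 2 * math.pi) for _ in range(n)]

def R_hat(t1, t2):
    """Ensemble estimate of R_X(t1, t2) = E{X(t1) X(t2)}.
    Step 1: per realization s_j, form X(t1, s_j) * X(t2, s_j).
    Step 2: average over all realizations."""
    return sum(math.cos(2 * math.pi * t1 + th) * math.cos(2 * math.pi * t2 + th)
               for th in thetas) / n

est = R_hat(0.6, 0.35)
exact = 0.5 * math.cos(2 * math.pi * (0.6 - 0.35))
print(abs(est - exact) < 0.01)  # True
```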

Wide-Sense Stationarity (WSS)
In many cases we do not require a random process to have all of the properties of 2nd-order stationarity. A random process is said to be wide-sense stationary (or weakly stationary) if and only if its mean is constant, µ_X(t) = µ_X, and its autocorrelation depends only on the time difference: R_X(t_1, t_2) = R_X(t_1 − t_2).

Property of Autocorrelation
For a wide-sense stationary process, R_X(t, s) = E{X(t) X*(s)} = R_X(t − s). For real-valued wide-sense stationary X(t), we have:
1. R_X(0) = E{X²(t)}.
2. R_X(τ) has even symmetry: R_X(−τ) = R_X(τ). Proof:
3. R_X(τ) is maximum at the origin τ = 0. Proof:
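The three properties can be verified numerically for a concrete WSS process. For the uniform-phase cosine (my own illustrative choice), R_X(τ) = 0.5·cos(2πτ) in closed form, and this closed form is used directly below.

```python
import math

# R_X(tau) for X(t) = cos(2*pi*t + Theta), Theta uniform on [0, 2*pi).
def R(tau):
    return 0.5 * math.cos(2 * math.pi * tau)

taus = [k / 20 for k in range(-10, 11)]  # lags in [-0.5, 0.5]

# 1. R(0) equals the mean power E{X^2(t)} = 0.5 for this process.
# 2. Even symmetry: R(-tau) == R(tau).
# 3. Maximum at the origin: |R(tau)| <= R(0).
print(R(0.0),
      all(math.isclose(R(-t), R(t)) for t in taus),
      all(abs(R(t)) <= R(0.0) + 1e-12 for t in taus))  # 0.5 True True
```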

Property of Autocorrelation
Example of the autocorrelation function R_X(t, s) = E{X(t) X*(s)} = R_X(t − s): symmetric, with its peak at 0; it changes slowly if X(t) changes slowly.

Example 1
X(t) = A cos(2πft + Θ), where A is a constant and Θ is a uniform random variable in [0, 2π]. Find the autocorrelation of X. Is X wide-sense stationary? Solution:

Example 1 Y = g(x) µ Y g( x) f X( x) = dx 24

Example 2
X(t) = A cos(2πft), where A is a uniform random variable in [0, 1]. Find the autocorrelation of X. Is X WSS? Solution:

Example 2
X(t) = A cos(2πft). Note that the values of A at times t_1 and t_2 are the same in this example, because A is determined once by the random experiment.
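A Monte-Carlo sketch of why this process is not WSS: its mean E{X(t)} = E{A} cos(2πft) = 0.5·cos(2πft) depends on t (a standard computation for A uniform on [0, 1]; the value of f below is my own choice).

```python
import math
import random

random.seed(7)

f, n = 1.0, 100_000
amps = [random.random() for _ in range(n)]  # A uniform on [0, 1], drawn once per realization

def mean_at(t):
    """Ensemble estimate of E{X(t)} = E{A} * cos(2*pi*f*t)."""
    return sum(a * math.cos(2 * math.pi * f * t) for a in amps) / n

m0, m_half = mean_at(0.0), mean_at(0.5)
# The mean flips sign between t = 0 and t = 0.5: it is time-dependent, so not WSS.
print(abs(m0 - 0.5) < 0.01, abs(m_half + 0.5) < 0.01)  # True True
```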

Example 3
A random process X(t) consists of three possible sample functions: x_1(t) = 1, x_2(t) = 3, and x_3(t) = sin(t), each occurring with equal probability. Find its mean and autocorrelation. Is it wide-sense stationary? Solution:
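A sketch of the ensemble averages for this example: averaging the three equally likely sample functions gives µ_X(t) = (1 + 3 + sin t)/3 and R_X(t_1, t_2) = (1 + 9 + sin t_1 sin t_2)/3, both of which are time-dependent, so the process is not wide-sense stationary.

```python
import math

# Mean: average of x_1(t)=1, x_2(t)=3, x_3(t)=sin(t), each with probability 1/3.
def mu(t):
    return (1 + 3 + math.sin(t)) / 3

# Autocorrelation: average of x_i(t1)*x_i(t2) over the three sample functions.
def R(t1, t2):
    return (1 * 1 + 3 * 3 + math.sin(t1) * math.sin(t2)) / 3

# The mean varies with t, and R depends on t1 and t2 separately
# (not only on t1 - t2), so the process is not WSS.
print(mu(0.0) != mu(math.pi / 2), R(0.0, 0.0) != R(1.0, 1.0))  # True True
```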