Introduction to Probability and Stochastic Processes I


Introduction to Probability and Stochastic Processes I
Lecture 3
Henrik Vie Christensen, vie@control.auc.dk
Department of Control Engineering, Institute of Electronic Systems, Aalborg University, Denmark
Slides originally by: Line Ørtoft Endelt

Random processes and sequences I
Signals can be classified into two main groups: deterministic and random. Random signals can be described by average properties, e.g.:
1. Average power.
2. Spectral distribution on the average.
3. The probability that the signal amplitude exceeds a given value.
The probabilistic model used to describe random signals is called a random process (stochastic process or time series).

Random processes and sequences II
Consider the communication system shown in Figure 3.1. The input to the system is a random signal, and during transmission (random) noise is added; the output is therefore also random. Knowledge of x_i(t) for t ∈ [t_1, t_2] tells nothing about x_i(t) for any other t ∉ [t_1, t_2], and knowledge of a member function x_i(t) tells nothing about another member function x_j(t). If the channel is linear with known impulse response h(t) and the noise n_i(t) is additive, then
y_i(t) = x_i(t) * h(t) + n_i(t)
where * denotes convolution.

Random processes and sequences III
Mapping of the outcomes of a random experiment to:
Random variable: S → a set of real numbers.
Random process: S → a set of waveforms or functions of time.

Example
At time t = 0 a die is tossed, and a time function x_i(t) is assigned to each possible outcome of the experiment:

Outcome   Waveform
1         x_1(t) = -4
2         x_2(t) = -2
3         x_3(t) = 2
4         x_4(t) = 4
5         x_5(t) = -t/2
6         x_6(t) = t/2

X is a random process: outcome of experiment → set of waveforms.

Example (continued)
[Figure: the six member functions x_1(t) = -4, x_2(t) = -2, x_3(t) = 2, x_4(t) = 4, x_5(t) = -t/2, x_6(t) = t/2 plotted for 0 ≤ t ≤ 12, with sample times t_1 and t_2 marked on the time axis.]
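
The mapping from experiment outcomes to waveforms is easy to make concrete in code. A minimal sketch (Python/NumPy; the dictionary-of-lambdas representation is just one convenient choice, not anything prescribed by the slides): each outcome selects one deterministic member function, and fixing a time and sweeping the ensemble gives a random variable.

```python
import numpy as np

# Member functions of the die ensemble: outcome i -> waveform x_i(t).
member = {
    1: lambda t: -4.0 * np.ones_like(t),
    2: lambda t: -2.0 * np.ones_like(t),
    3: lambda t:  2.0 * np.ones_like(t),
    4: lambda t:  4.0 * np.ones_like(t),
    5: lambda t: -t / 2.0,
    6: lambda t:  t / 2.0,
}

rng = np.random.default_rng(0)
t = np.linspace(0.0, 12.0, 121)

# One realization: toss the die once; the whole waveform is then determined.
outcome = int(rng.integers(1, 7))
x_realization = member[outcome](t)     # a deterministic function of time

# X(6): fix t = 6 and sweep the ensemble -> a random variable.
print(sorted(float(member[i](np.array([6.0]))[0]) for i in member))
# [-4.0, -3.0, -2.0, 2.0, 3.0, 4.0]
```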

Notation I
A random process is denoted by X(t, Λ), where t represents time and Λ is a variable that represents an outcome in the sample space S. With each λ_i ∈ Λ is associated a member function (sample function or realization) x_i(t) of the ensemble (the collection of waveforms). The member functions are deterministic functions of time. For t = t_0, X(t_0, Λ) is a set of numerical values corresponding to the values of each member function at t = t_0. The probability distribution of X(t_0, Λ) can be derived from the probability distribution of the outcome of the random experiment. X(t_0, λ_i) is a numerical value.

Notation II
X(t, Λ) can denote the following quantities:
1. X(t, Λ) = {X(t, λ_i) | λ_i ∈ S} = {x_1(t), x_2(t), ...}, a collection of functions of time.
2. X(t, λ_i) = x_i(t), a specific member function.
3. X(t_0, Λ) = {X(t_0, λ_i) | λ_i ∈ S} = {x_1(t_0), x_2(t_0), ...}, a collection of numerical values.
4. X(t_0, λ_i) = x_i(t_0), the numerical value of x_i at time t_0.
Instead of using the above notations, X(t) is used to denote all of them; usually the meaning is clear from the context.

Example
S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}, with the member functions of the die example above.

X(t, λ_1) = X(t, Λ = 1) = x_1(t) = -4, 0 ≤ t, a specific member function.
X(t, λ_5) = X(t, Λ = 5) = x_5(t) = -t/2, 0 ≤ t, a specific member function.
X(6, Λ) = X(6) is a random variable with values in {-4, -3, -2, 2, 3, 4}.
X(t = 6, Λ = 5) = X(6, λ_5) = -3, a numerical value.

Probabilistic Structure
If we know
1. the probability of each outcome of the experiment E, and
2. the member function each outcome corresponds to,
then properties like P[X(t_1) ≤ a_1] and P[X(t_1) ≤ a_1, X(t_2) ≤ a_2] can be derived. If A_1 = {λ_i | X(t_1, λ_i) ≤ a_1}, then
P[X(t_1) ≤ a_1] = P(A_1)
Joint and conditional probabilities can be found in the same way, using the probabilities of the underlying experiment E.

Example
S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}, with the member functions of the die example above.

P(X(4) = -2) = P({2, 5}) = 1/3
P(X(4) ≤ 0) = P({1, 2, 5}) = 1/2
P(X(0) = 0, X(4) = -2) = P({5}) = 1/6
P(X(4) = -2 | X(0) = 0) = P(X(0) = 0, X(4) = -2) / P(X(0) = 0) = (1/6) / (2/6) = 1/2
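
Because the underlying experiment has six equally likely outcomes, each of these probabilities can be checked by brute-force enumeration. A minimal sketch (Python; the helper `prob` is hypothetical, introduced only for this illustration):

```python
# Member functions of the die example, evaluated at a scalar time t.
member = {1: lambda t: -4, 2: lambda t: -2, 3: lambda t: 2,
          4: lambda t: 4, 5: lambda t: -t / 2, 6: lambda t: t / 2}

def prob(event):
    """P(event), where the event is a predicate on the outcome i."""
    return sum(1 for i in member if event(i)) / 6

print(prob(lambda i: member[i](4) == -2))    # 1/3
print(prob(lambda i: member[i](4) <= 0))     # 1/2
p_joint = prob(lambda i: member[i](0) == 0 and member[i](4) == -2)
p_cond = p_joint / prob(lambda i: member[i](0) == 0)
print(p_joint, p_cond)                       # 1/6  1/2
```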

Classification of Random Processes

                  t continuous                t discrete
X(t) continuous   continuous random process   continuous random sequence
X(t) discrete     discrete random process     discrete random sequence

Stationarity is another attribute used to classify random processes, when certain probability distributions or averages do not depend on time. More about stationarity in the next lecture.

Example
A random process is given by
Z(t) = A(t) cos[2πf_c t + Θ(t)]
where A(t) and Θ(t) are real-valued random processes. Writing
Z(t) = Re{A(t) exp[jΘ(t)] exp[j2πf_c t]} = Re{W(t) exp[j2πf_c t]}
the envelope W(t) is a complex random process:
W(t) = A(t) cos Θ(t) + jA(t) sin Θ(t) = X(t) + jY(t)

Predictability
A random process is predictable if future values of its member functions can be predicted from their past values; otherwise it is unpredictable.
Example: In the ensemble of binary waveforms X(t) shown in Figure 3.1, randomness is evident, and no future value of a member function can be predicted from knowledge of its past values. Hence the random process is unpredictable.

Formal definition of Random Processes
Let S be the sample space of a random experiment, and let t be a variable with values in Γ ⊆ R. A real-valued random process X(t) is a function X(t): Γ × S → R. If Γ is a subset of the integers, then X(t) is a random sequence; otherwise X(t) is a random process.
The nth-order distribution of the random process is
F_{X(t_1), X(t_2), ..., X(t_n)}(x_1, x_2, ..., x_n) = P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_n) ≤ x_n]
for all n and t_1, ..., t_n ∈ Γ. These functions satisfy all the requirements of joint probability distribution functions.

Joint Distribution
The joint distribution function is derived from the probability distribution of the experiment and the mapping of the sample space into member functions. It is NOT possible to go the other way: there is no technique for constructing member functions from joint distribution functions.

Example I
Using the above die example of a discrete random process, the joint and marginal probability mass functions for X(0) and X(6) can be found:

X(0) \ X(6)         -4     -3     -2     2      3      4      Marginals of X(0)
-4                  1/6    0      0      0      0      0      1/6
-2                  0      0      1/6    0      0      0      1/6
0                   0      1/6    0      0      1/6    0      2/6
2                   0      0      0      1/6    0      0      1/6
4                   0      0      0      0      0      1/6    1/6
Marginals of X(6)   1/6    1/6    1/6    1/6    1/6    1/6

Example II
If an FM station broadcasts a tone X(t) = 100 cos(10^8 t), then listener i receives x_i(t) = a_i cos(10^8 t + θ_i), and the received signals can be modeled by a random process
Y(t) = A cos(10^8 t + Θ)

Average Values
For a random process (or sequence) X(t):
The mean value: µ_X(t) = E{X(t)}
The autocorrelation: R_XX(t_1, t_2) = E{X*(t_1) X(t_2)}
The autocovariance: C_XX(t_1, t_2) = R_XX(t_1, t_2) - µ_X(t_1) µ_X(t_2)
The correlation coefficient: r_XX(t_1, t_2) = C_XX(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_XX(t_2, t_2))

Example I
For the discrete random process based on the die example, find µ_X(t), R_XX(t_1, t_2), C_XX(t_1, t_2), and r_XX(t_1, t_2).
µ_X(t) = E{X(t)} = (1/6) Σ_{i=1}^{6} x_i(t) = 0
R_XX(t_1, t_2) = E{X(t_1) X(t_2)} = (1/6) Σ_{i=1}^{6} x_i(t_1) x_i(t_2)
= (1/6) {16 + 4 + 4 + 16 + (t_1 t_2)/4 + (t_1 t_2)/4} = (1/6) {40 + (t_1 t_2)/2}

Example I (continued)
C_XX(t_1, t_2) = R_XX(t_1, t_2) - µ_X(t_1) µ_X(t_2) = R_XX(t_1, t_2)
r_XX(t_1, t_2) = C_XX(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_XX(t_2, t_2)) = (40 + (t_1 t_2)/2) / sqrt((40 + t_1²/2)(40 + t_2²/2))
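
The closed forms above can be sanity-checked by computing the ensemble averages directly from the six member functions. A sketch (Python/NumPy, under the same die ensemble):

```python
import numpy as np

def x(i, t):
    """Member function x_i(t) of the die ensemble."""
    return [-4, -2, 2, 4, -t / 2, t / 2][i - 1]

def mean(t):
    return sum(x(i, t) for i in range(1, 7)) / 6

def R(t1, t2):
    return sum(x(i, t1) * x(i, t2) for i in range(1, 7)) / 6

t1, t2 = 3.0, 7.0
print(mean(t1))                               # 0.0
print(R(t1, t2), (40 + t1 * t2 / 2) / 6)      # both ~8.4167
# C = R here (zero mean), so the correlation coefficient is:
print(R(t1, t2) / np.sqrt(R(t1, t1) * R(t2, t2)))
```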

Example II
X: a random process described by X(t) = A cos(100t + Θ), where A is a normal random variable with mean 0 and variance 1, and Θ is uniformly distributed on [-π, π]. Assume that A and Θ are independent. Then
µ_X(t) = E{A} E{cos(100t + Θ)} = 0

Example II (continued)
Let t_1 = t and t_2 = t + τ. Then
R_XX(t, t + τ) = E{X(t_1) X(t_2)} = E{X(t) X(t + τ)}
= E{A cos(100t + Θ) · A cos(100t + 100τ + Θ)}
= E{(A²/2) (cos(100τ) + cos(200t + 100τ + 2Θ))}
= (1/2) cos(100τ) + (1/2) E{cos(200t + 100τ + 2Θ)}
= (1/2) cos(100τ)
since 2 cos(x) cos(y) = cos(x + y) + cos(x - y), E{A²} = 1, and E{cos(200t + 100τ + 2Θ)} = 0.
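
A quick Monte Carlo check of this derivation (a sketch; the sample size and probed times are arbitrary choices): the estimated correlation should match (1/2) cos(100τ) for every t, illustrating that the result does not depend on t.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
A = rng.standard_normal(N)                # A ~ N(0, 1)
Theta = rng.uniform(-np.pi, np.pi, N)     # Theta ~ U[-pi, pi], independent of A

def R_hat(t, tau):
    """Monte Carlo estimate of E{X(t) X(t + tau)}."""
    return np.mean(A * np.cos(100 * t + Theta)
                   * A * np.cos(100 * (t + tau) + Theta))

tau = 0.013
for t in (0.0, 0.4, 2.7):
    print(R_hat(t, tau), 0.5 * np.cos(100 * tau))
```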

Two or More Random Processes I
Two random processes have a joint distribution function
P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n, Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m]
The relation between X(t) and Y(t) is described by:
The cross-correlation function: R_XY(t_1, t_2) = E{X*(t_1) Y(t_2)}
The cross-covariance function: C_XY(t_1, t_2) = R_XY(t_1, t_2) - µ_X(t_1) µ_Y(t_2)
The correlation coefficient: r_XY(t_1, t_2) = C_XY(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_YY(t_2, t_2))

Two or More Random Processes II
Equality: Two random processes are equal if they are defined on the same random experiment and their member functions are identical for each outcome λ ∈ S.
Uncorrelated: Two random processes X(t) and Y(t) are uncorrelated if C_XY(t_1, t_2) = 0 for all t_1, t_2 ∈ Γ.
Orthogonal: Two random processes X(t) and Y(t) are orthogonal if R_XY(t_1, t_2) = 0 for all t_1, t_2 ∈ Γ.

Two or More Random Processes III
Independent: Two random processes X(t) and Y(t) are independent if
P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n, Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m]
= P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n] P[Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m]
for all n, m and t_1, ..., t_n, t'_1, ..., t'_m ∈ Γ.
As in the case of random variables, independent implies uncorrelated, but not conversely.

Example
E_1 has sample space S_1 = {1, 2, 3, 4, 5, 6} = {λ_1, ..., λ_6}; E_2 has sample space S_2 = {head, tail} = {q_1, q_2}.

λ_i   x_i(t)   y_i(t)     q_j        z_j(t)
1     -4       2          1 (head)   cos t
2     -2       -4         2 (tail)   sin t
3     2        4
4     4        -2
5     -t/2     0
6     t/2      0

The two random processes X(t) and Y(t) are defined on the same experiment, but they are not equal.

Example (continued)
With the table above,
R_XY(t_1, t_2) = E{X(t_1) Y(t_2)} = Σ_{i=1}^{6} x_i(t_1) y_i(t_2) P(λ_i) = (1/6)(-8 + 8 + 8 - 8 + 0 + 0) = 0
So X and Y are orthogonal.

Example (continued)
Since C_XY(t_1, t_2) = R_XY(t_1, t_2) - µ_X(t_1) µ_Y(t_2) = 0, X and Y are uncorrelated, but they are clearly not independent.
X and Z are independent, since they are based on two unrelated experiments, so P(λ_i and q_j) = P(λ_i) P(q_j).

Introduction to Probability and Stochastic Processes I
Lecture 4
Henrik Vie Christensen, vie@control.auc.dk
Department of Control Engineering, Institute of Electronic Systems, Aalborg University, Denmark
Slides originally by: Line Ørtoft Endelt

Definition of Random Process
Let S be the sample space of a random experiment, and let t be a variable with values in Γ ⊆ R. A real-valued random process X(t) is a function X(t): Γ × S → R. If Γ is a subset of the integers, then X(t) is a random sequence; otherwise X(t) is a random process.
The random process is a mapping X(t) from the sample space S to a space of continuous-time functions {x_i(t)}_{i∈I} (the member functions). At each fixed time t_0 the mapping X(t_0) is a random variable (with values in the set {x_i(t_0)}_{i∈I}).

Strict-sense Stationarity I
A random process X(t) is called stationary in the strict sense (SSS) if for all t_1, t_2, ..., t_k, t_1 + τ, t_2 + τ, ..., t_k + τ ∈ Γ and for all k = 1, 2, ...,
P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_k) ≤ x_k] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2, ..., X(t_k + τ) ≤ x_k]
If this definition only holds for k ≤ N but not necessarily for k > N, then the process is called Nth-order stationary.

Strict-sense Stationarity II
Consider a SSS random process. Then for any τ,
P[X(t) ≤ x] = P[X(t + τ) ≤ x]
which gives
E{X(t)} = µ_X = constant

Strict-sense Stationarity III
The second-order distribution
P[X(t_1) ≤ x_1, X(t_2) ≤ x_2] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2]
depends only on the difference t_2 - t_1. So the autocorrelation function of a SSS random process can be written as
R_XX(t_1, t_2) = E{X*(t_1) X(t_2)} = R_XX(t_2 - t_1)
Remark: A random process with a constant mean and an autocorrelation function that depends only on the time difference is not necessarily SSS; it need not even be first-order stationary.

Wide-sense Stationary I
A random process X(t) is called wide-sense stationary (WSS) if
E{X(t)} = µ_X and E{X*(t) X(t + τ)} = R_XX(τ)
Two random processes X(t) and Y(t) are jointly WSS if
E{X(t)} = µ_X, E{Y(t)} = µ_Y,
E{X*(t) X(t + τ)} = R_XX(τ), E{Y*(t) Y(t + τ)} = R_YY(τ), and E{X*(t) Y(t + τ)} = R_XY(τ)

Wide-sense Stationary II
A random sequence X(k) is wide-sense stationary if
E{X(k)} = µ_X and E{X*(n) X(n + k)} = R_XX(k)
Note: SSS ⇒ WSS, but WSS ⇏ SSS.

Example I

λ_i   x_i(t)   φ_j   y_j(t)
1     5        1     6
2     3        2     3 sin(t)
3     1        3     -3 sin(t)
4     -1       4     3 cos(t)
5     -3       5     -3 cos(t)
6     -5       6     -6

E{X(t)} = 0
R_XX(t_1, t_2) = (1/6)(25 + 9 + 1 + 1 + 9 + 25) = 70/6
X is SSS, since the member functions do not change under time shifts.

Example I (continued)
With the table above,
E{Y(t)} = 0
R_YY(t_1, t_2) = (1/6) Σ_{j=1}^{6} y_j(t_1) y_j(t_2) = (1/6){72 + 18 cos(t_2 - t_1)} = R_YY(t_2 - t_1)
Y is WSS but not SSS!
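
The difference between the two notions can be seen numerically: the ensemble autocorrelation of Y depends only on t_2 - t_1, but the first-order distribution of Y(t) changes with t (for instance, P(Y(0) = 0) = 2/6 while P(Y(π/4) = 0) = 0). A sketch (Python/NumPy, using the table above):

```python
import numpy as np

def y(j, t):
    """Member functions y_j(t) from the table above."""
    return [6, 3 * np.sin(t), -3 * np.sin(t),
            3 * np.cos(t), -3 * np.cos(t), -6][j - 1]

def R_YY(t1, t2):
    return sum(y(j, t1) * y(j, t2) for j in range(1, 7)) / 6

tau = 0.8
# WSS: R_YY(t, t + tau) is the same for every t ...
print([round(R_YY(t, t + tau), 6) for t in (0.0, 1.0, 2.5)])
print((72 + 18 * np.cos(tau)) / 6)                    # closed form
# ... but the set of ensemble values at time t moves with t, so not SSS:
print(sorted(round(y(j, 0.0), 3) for j in range(1, 7)))
print(sorted(round(y(j, np.pi / 4), 3) for j in range(1, 7)))
```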

Example II
X(n) is a binary Markov sequence, for n ∈ Z, described by:
P[X(n) = 0, X(n+1) = 0] = 0.2
P[X(n) = 0, X(n+1) = 1] = 0.2
P[X(n) = 1, X(n+1) = 0] = 0.2
P[X(n) = 1, X(n+1) = 1] = 0.4
P[X(n) = 0] = P[X(n) = 0, X(n+1) = 0] + P[X(n) = 0, X(n+1) = 1] = 0.4
P[X(n) = 1] = P[X(n) = 1, X(n+1) = 0] + P[X(n) = 1, X(n+1) = 1] = 0.6
µ_X = 0 · P[X(n) = 0] + 1 · P[X(n) = 1] = 0.6
R_XX(n, n) = 0² · P[X(n) = 0] + 1² · P[X(n) = 1] = 0.6

Example II (continued)
R_XX(n, n+1) = Σ_{i=0}^{1} Σ_{j=0}^{1} i j P[X(n) = i, X(n+1) = j] = P[X(n) = 1, X(n+1) = 1] = 0.4
R_XX(n, n+2) = Σ_{i=0}^{1} Σ_{j=0}^{1} i j P[X(n) = i, X(n+2) = j] = P[X(n) = 1, X(n+2) = 1]
= P[X(n) = 1, X(n+1) = 0, X(n+2) = 1] + P[X(n) = 1, X(n+1) = 1, X(n+2) = 1]
= P[X(n) = 1] P[X(n+1) = 0 | X(n) = 1] P[X(n+2) = 1 | X(n) = 1, X(n+1) = 0]
  + P[X(n) = 1] P[X(n+1) = 1 | X(n) = 1] P[X(n+2) = 1 | X(n) = 1, X(n+1) = 1]
= P[X(n) = 1] P[X(n+1) = 0 | X(n) = 1] P[X(n+2) = 1 | X(n+1) = 0]
  + P[X(n) = 1] P[X(n+1) = 1 | X(n) = 1] P[X(n+2) = 1 | X(n+1) = 1]   (by the Markov property)

Example II (continued)
so R_XX(n, n+2) = 0.6 · (0.2/0.6) · (0.2/0.4) + 0.6 · (0.4/0.6) · (0.4/0.6) ≈ 0.367
Summary: E{X} = 0.6, R_XX(n, n) = 0.6, R_XX(n, n+1) = 0.4, R_XX(n, n+2) ≈ 0.367
It can be shown that R_XX(n, n+k) is independent of n for all k ∈ Z, so the process is wide-sense stationary.
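
These moments can be verified by simulating a long sample path of the chain. A sketch (Python/NumPy), using the conditional probabilities implied by the joint pmf, P[X(n+1) = 1 | X(n) = 0] = 0.2/0.4 = 0.5 and P[X(n+1) = 1 | X(n) = 1] = 0.4/0.6 = 2/3:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000
p1_given = {0: 0.5, 1: 2 / 3}            # P(next = 1 | current state)

x = np.empty(N, dtype=int)
x[0] = rng.random() < 0.6                # start from the stationary law P(1) = 0.6
for n in range(1, N):
    x[n] = rng.random() < p1_given[x[n - 1]]

for k in (0, 1, 2):
    R_hat = np.mean(x[:N - k] * x[k:])   # estimate of R_XX(n, n + k)
    print(k, round(R_hat, 4))            # expect ~0.6, ~0.4, ~0.367
```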

Example III
A_i and B_i have a joint Gaussian distribution for i = 1, 2, 3, ..., n, with µ_{A_i} = µ_{B_i} = 0 and σ²_{A_i} = σ²_{B_i} = σ². They are assumed to be uncorrelated. A random process is defined as
X(t) = Σ_{i=1}^{n} (A_i cos ω_i t + B_i sin ω_i t)
Show that the process is WSS:
E{X(t)} = Σ_{i=1}^{n} (E{A_i} cos ω_i t + E{B_i} sin ω_i t) = 0
E{X(t) X(t + τ)} = E{ Σ_{i=1}^{n} Σ_{j=1}^{n} [A_i cos ω_i t + B_i sin ω_i t][A_j cos ω_j (t + τ) + B_j sin ω_j (t + τ)] }
= Σ_{i=1}^{n} ( E{A_i²} cos ω_i t cos ω_i (t + τ) + E{B_i²} sin ω_i t sin ω_i (t + τ) )
= σ² Σ_{i=1}^{n} cos ω_i τ = R_XX(τ)

Example III (continued)
A_i and B_i have a joint Gaussian distribution for i = 1, 2, 3, ..., n, with µ_{A_i} = µ_{B_i} = 0 and σ²_{A_i} = σ²_{B_i} = σ². They are assumed to be uncorrelated. A random process is defined as
X(t) = Σ_{i=1}^{n} (A_i cos ω_i t + B_i sin ω_i t)
Is it SSS?

Other forms of Stationarity I
A random process X(t) is asymptotically stationary if the distribution of X(t_1 + τ), X(t_2 + τ), ..., X(t_k + τ) does not depend on τ when τ is large.
A random process is stationary in an interval if
P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_k) ≤ x_k] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2, ..., X(t_k + τ) ≤ x_k]
holds for all τ for which t_1, t_2, ..., t_k, t_1 + τ, t_2 + τ, ..., t_k + τ lie in an interval that is a subset of Γ.

Other forms of Stationarity II
A random process X(t) is said to have stationary increments if its increments Y(t) = X(t + τ) - X(t) form a stationary process for every τ.
A random process is cyclostationary (periodically stationary) if it is stationary under a shift of the time origin by integer multiples of a constant T_0 (the period of the process).

Autocorrelation Fct. of a Real WSS RP
The autocorrelation function of a real WSS random process is given by
R_XX(τ) = E{X(t) X(t + τ)}
The autocorrelation function satisfies the following properties:
1. If X(t) is a voltage waveform across a 1-Ω resistance, then the ensemble average value of X²(t) is the average value of the power delivered to the 1-Ω resistance by X(t):
E{X²(t)} = Average Power = R_XX(0) ≥ 0

Autocorrelation Fct. of a Real WSS RP
2. The autocorrelation function R_XX(τ) is an even function of τ:
R_XX(τ) = R_XX(-τ)
3. The autocorrelation function R_XX(τ) is bounded:
|R_XX(τ)| ≤ R_XX(0)

Proof of 3
E{[X(t + τ) - X(t)]²} ≥ 0 and E{[X(t + τ) + X(t)]²} ≥ 0
which imply
E{X²(t + τ)} + E{X²(t)} - 2R_XX(τ) ≥ 0
E{X²(t + τ)} + E{X²(t)} + 2R_XX(τ) ≥ 0
Since E{X²(t + τ)} = E{X²(t)} = R_XX(0), we get
2R_XX(0) - 2R_XX(τ) ≥ 0
2R_XX(0) + 2R_XX(τ) ≥ 0
Hence |R_XX(τ)| ≤ R_XX(0).

Cross-correlation Function
The cross-correlation function of two real random processes that are jointly WSS is
R_XY(τ) = E{X(t) Y(t + τ)}
It has the following properties:
1. R_XY(τ) = R_YX(-τ)
2. |R_XY(τ)| ≤ sqrt(R_XX(0) R_YY(0))
3. |R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]
4. R_XY(τ) = 0 if the processes are orthogonal, and R_XY(τ) = µ_X µ_Y if the processes are independent.

Power Spectral Density Function I
For a deterministic signal x(t) the average power in the signal is defined as
P_x = lim_{T→∞} (1/2T) ∫_{-T}^{T} x²(t) dt
If the deterministic signal is periodic with period T_0, then the time-averaged autocorrelation function is defined as
<R_xx(τ)>_{T_0} = (1/T_0) ∫_{0}^{T_0} x(t) x(t + τ) dt
If S_xx(f) is the Fourier transform of <R_xx(τ)>_{T_0}, then
P_x = ∫_{-∞}^{∞} S_xx(f) df

Power Spectral Density Function II
The power spectral density function (psd) of a WSS random process X(t) is defined as
S_XX(f) = F{R_XX(τ)} = ∫_{-∞}^{∞} R_XX(τ) exp(-j2πfτ) dτ
called the Wiener-Khinchine relation. The autocorrelation is recovered by
R_XX(τ) = F^{-1}{S_XX(f)} = ∫_{-∞}^{∞} S_XX(f) exp(j2πfτ) df
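
The transform pair is easy to exercise numerically. A sketch for the illustrative (assumed) autocorrelation R_XX(τ) = exp(-|τ|), whose psd has the closed form S_XX(f) = 2/(1 + (2πf)²):

```python
import numpy as np

tau = np.linspace(-50.0, 50.0, 20_001)    # R has decayed to ~0 at the edges
dtau = tau[1] - tau[0]
R = np.exp(-np.abs(tau))                  # example autocorrelation function

def S(f):
    """Numerical Wiener-Khinchine integral of R at frequency f."""
    return (np.sum(R * np.exp(-2j * np.pi * f * tau)) * dtau).real

for f in (0.0, 0.1, 0.5):
    print(S(f), 2 / (1 + (2 * np.pi * f) ** 2))   # numeric vs closed form
```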

Properties of psd
The psd function is also called the spectrum of X(t), and has the following properties:
1. S_XX(f) is real and nonnegative.
2. The average power in X(t) is given by E{X²(t)} = R_XX(0) = ∫_{-∞}^{∞} S_XX(f) df
3. For X(t) real, R_XX(τ) is an even function and hence S_XX(-f) = S_XX(f)
4. If X(t) has periodic components, then S_XX(f) will have impulses.

Lowpass and Bandpass Processes
A random process is lowpass if its psd is zero for |f| > B; B is called the bandwidth of the process.
A random process is bandpass if its psd is zero outside the band f_c - B/2 ≤ |f| ≤ f_c + B/2 (see the figure on p. 47 of the text).

Power and bandwidth Calculations I
The power in a band of frequencies, f_1 to f_2 with 0 < f_1 < f_2, is for a real random process X(t)
P_X[f_1, f_2] = 2 ∫_{f_1}^{f_2} S_XX(f) df
For a zero-mean random process with continuous psd, the effective bandwidth is defined as
B_eff = ∫_{-∞}^{∞} S_XX(f) df / (2 max[S_XX(f)])

Power and bandwidth Calculations II
The effective bandwidth is related to the correlation time
τ_c = ∫_{-∞}^{∞} R_XX(τ) dτ / R_XX(0)
If S_XX(f) is continuous and has its maximum at f = 0, then
B_eff = 1/(2τ_c)
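
Continuing the exp(-|τ|) example from above (again an assumption made only for illustration), both quantities are straightforward to evaluate on a grid, and they satisfy B_eff = 1/(2τ_c):

```python
import numpy as np

tau = np.linspace(-50.0, 50.0, 20_001)
dtau = tau[1] - tau[0]
R = np.exp(-np.abs(tau))                       # example R_XX(tau)

f = np.linspace(-5.0, 5.0, 2001)               # grid covering most of the psd
df = f[1] - f[0]
S = np.array([(np.sum(R * np.exp(-2j * np.pi * fk * tau)) * dtau).real
              for fk in f])

tau_c = np.sum(R) * dtau / R.max()             # correlation time, ~2.0
B_eff = np.sum(S) * df / (2 * S.max())         # effective bandwidth, ~0.25
print(tau_c, B_eff, 1 / (2 * tau_c))
```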

Cross-power Spectral Density Function
For two real-valued random processes X(t) and Y(t), the cross-power spectral density (cpsd) function S_XY(f) is defined by
S_XY(f) = ∫_{-∞}^{∞} R_XY(τ) exp(-j2πfτ) dτ
and the cross-correlation function is recovered as
R_XY(τ) = ∫_{-∞}^{∞} S_XY(f) exp(j2πfτ) df
The cross-power spectral density will in general be a complex-valued function.

Cross-power Spectral Density Function
Some properties of the cpsd are:
1. S_XY(f) = S*_YX(f)
2. Re(S_XY(f)) is an even function of f, and Im(S_XY(f)) is an odd function of f.
3. S_XY(f) = 0 if X(t) and Y(t) are orthogonal, and S_XY(f) = µ_X µ_Y δ(f) if X(t) and Y(t) are independent.

Coherence function
The real-valued coherence function between two random processes is defined as
ρ²_XY(f) = |S_XY(f)|² / (S_XX(f) S_YY(f))
When ρ²_XY(f_0) = 0, X(t) and Y(t) are incoherent at f_0. When ρ²_XY(f_0) = 1, X(t) and Y(t) are fully coherent at f_0.
If X(t) and Y(t) are statistically independent, then ρ²_XY(f) = 0 at all frequencies except at f = 0.
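
In practice ρ²_XY(f) is estimated from sample paths with Welch-type averaging; scipy.signal.coherence implements such an estimator. A sketch with a toy pair (the filter cutoff and noise level are arbitrary choices): Y is a noisy lowpass-filtered version of X, so the coherence should be high in the passband and near zero where only noise remains.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 1000.0                                   # sampling frequency in Hz
x = rng.standard_normal(200_000)              # white-noise input process

b, a = signal.butter(4, 100 / (fs / 2))       # lowpass filter, cutoff ~100 Hz
y = signal.lfilter(b, a, x) + 0.2 * rng.standard_normal(x.size)

f, coh = signal.coherence(x, y, fs=fs, nperseg=1024)
print(coh[f < 50].mean())                     # close to 1 in the passband
print(coh[f > 300].mean())                    # close to 0 in the stopband
```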

PSD of Random Sequences
The power spectral density (psd) of a random sequence X(nT_s) with uniform sampling time of one second (T_s = 1) is defined by the Fourier transform of the autocorrelation sequence:
S_XX(f) = Σ_{n=-∞}^{∞} R_XX(n) exp(-j2πfn), -1/2 < f < 1/2
The autocorrelation is recovered as
R_XX(n) = ∫_{-1/2}^{1/2} S_XX(f) exp(j2πfn) df
If the sampling time is not 1, then the psd is defined for -1/(2T_s) < f < 1/(2T_s).
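
A sketch of this pair for a sequence with the (assumed, for illustration) autocorrelation R_XX(n) = a^{|n|}, whose psd has the well-known closed form S_XX(f) = (1 - a²)/(1 - 2a cos(2πf) + a²):

```python
import numpy as np

a = 0.5
n = np.arange(-200, 201)                  # a^|n| is negligible beyond this range
R = a ** np.abs(n)                        # autocorrelation sequence R_XX(n)

def S(f):
    """psd of the sequence at frequency f, |f| < 1/2."""
    return np.sum(R * np.exp(-2j * np.pi * f * n)).real

for f in (0.0, 0.25, 0.5):
    closed = (1 - a**2) / (1 - 2 * a * np.cos(2 * np.pi * f) + a**2)
    print(S(f), closed)                   # numeric sum vs closed form
```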