ECE353: Probability and Random Processes. Lecture 18 - Stochastic Processes


ECE353: Probability and Random Processes Lecture 18 - Stochastic Processes Xiao Fu School of Electrical Engineering and Computer Science Oregon State University E-mail: xiao.fu@oregonstate.edu

From RV to Stochastic Process. Recall that a RV X is a mapping from the sample space to a real number, i.e., X(s).

From RV to Stochastic Process. A random pair is a mapping from an outcome to two random variables.

From RV to Stochastic Process. A random vector is a mapping from an outcome to a sequence of random variables.

From RV to Stochastic Process. A stochastic process is a mapping X(t, s) that maps an outcome to an infinite-length sequence indexed by time.

Sample Path. (Figure: sample paths of a stochastic process.)

Sample Path. Fixing time t = t_1, X(t_1, s) is a single RV.

Examples. Example 1: pick a video on YouTube at random to play; every video is a unique stream of bits. Example 2: random sinusoid X(t, s) = A(s) sin(ω(s)t + φ(s)), e.g., modulation in communications. (Figure: two sample paths of X(t, s) versus t.)

Types of Stochastic Processes. Continuous-time process: t is continuous; written X(t, s). Discrete-time process: t is discrete (e.g., digital signal processing); written X_n(s). Discrete-valued process: X(t, s) is a discrete RV. Continuous-valued process: X(t, s) is a continuous RV. Q: what is the type of the following? (Figure: two sample paths.)

Types of Stochastic Processes (continued). Q: what about this one? (Figure: two sample paths.)

Poisson Processes of Rate λ
Motivation: we wish to model the number of data packets arriving at a data center over time, or the number of customers arriving at a mall over time.
Definition: a Poisson process of rate λ, denoted N(t, s) (notation abused to N(t), since we know s is always playing a role), satisfies:
1. N(t) = 0 for t < 0;
2. for all t_1 > t_0 ≥ 0, the increment N(t_1) − N(t_0) is a Poisson RV with mean λ(t_1 − t_0);
3. if [t_0, t_1] and [t_0', t_1'] are non-overlapping, then the corresponding increments N(t_1) − N(t_0) and N(t_1') − N(t_0') are independent RVs.
Note: a Poisson RV with mean α > 0 has PMF P_N(n) = α^n e^(−α)/n! for n = 0, 1, 2, ..., and 0 otherwise.

Poisson Processes of Rate λ. (Figure: illustration of a Poisson process sample path.)

Poisson Processes of Rate λ. Example: let us assume t_1 ≤ t_2 ≤ t_3 and n_1 ≤ n_2 ≤ n_3. What is the joint PMF P_{N(t_1),N(t_2),N(t_3)}(n_1, n_2, n_3)?

Poisson Processes of Rate λ
We are interested in the joint probability
P[N(t_1) = n_1, N(t_2) = n_2, N(t_3) = n_3]
= P[N(t_1) − N(0) = n_1, N(t_2) − N(t_1) = n_2 − n_1, N(t_3) − N(t_2) = n_3 − n_2]
= P[N(t_1) − N(0) = n_1] · P[N(t_2) − N(t_1) = n_2 − n_1] · P[N(t_3) − N(t_2) = n_3 − n_2]
= ((λt_1)^(n_1)/n_1!) e^(−λt_1) · ((λ(t_2 − t_1))^(n_2 − n_1)/(n_2 − n_1)!) e^(−λ(t_2 − t_1)) · ((λ(t_3 − t_2))^(n_3 − n_2)/(n_3 − n_2)!) e^(−λ(t_3 − t_2)),
where the second equality rewrites the events in terms of increments and the third uses independence of the increments.
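The factorization can be checked by simulation. The sketch below (with assumed values λ = 1, (t_1, t_2, t_3) = (1, 2, 3), (n_1, n_2, n_3) = (1, 2, 3)) builds arrival times from i.i.d. exponential gaps, a construction justified on the inter-arrival-time slides later:

```python
import math, random

random.seed(1)
lam, (t1, t2, t3), (n1, n2, n3) = 1.0, (1.0, 2.0, 3.0), (1, 2, 3)

def counts():
    """Simulate one path up to t3 and return (N(t1), N(t2), N(t3))."""
    t, arrivals = 0.0, []
    while True:
        t += random.expovariate(lam)  # i.i.d. exponential inter-arrival gaps
        if t > t3:
            break
        arrivals.append(t)
    return tuple(sum(a <= m for a in arrivals) for m in (t1, t2, t3))

trials = 100000
hits = sum(counts() == (n1, n2, n3) for _ in range(trials)) / trials

def pois(n, a):  # Poisson(a) PMF at n
    return a**n * math.exp(-a) / math.factorial(n)

exact = pois(n1, lam*t1) * pois(n2 - n1, lam*(t2 - t1)) * pois(n3 - n2, lam*(t3 - t2))
print(round(exact, 3))  # → 0.05 (e^{-3}); the empirical `hits` lands nearby
```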

Arrival Time. From the Poisson process, one can also characterize the arrival times. Let N(t) denote the number of customers that one observes by time t, which is a Poisson process. The time at which the first customer arrives is a random variable. The inter-arrival times are also random.

Arrival Time
Let us consider the arrival time of the first customer, X_1. What is its PDF? We start with the CDF P[X_1 ≤ x_1]. This is not easy to compute directly, but we may compute P[X_1 > x_1] = P[no arrival up to time x_1]. Note that the number of arrivals between t = 0 and t = x_1 is a Poisson RV with mean λ(x_1 − 0):
P[X_1 > x_1] = P[N(x_1) − N(0) = 0] = ((λx_1)^0/0!) e^(−λx_1) = e^(−λx_1).
Hence F_{X_1}(x_1) = P[X_1 ≤ x_1] = 1 − e^(−λx_1), and the PDF is f_{X_1}(x_1) = dF_{X_1}(x_1)/dx_1 = λe^(−λx_1) for x_1 ≥ 0. Beautiful! X_1 is exponential:
f_{X_1}(x_1) = λe^(−λx_1) for x_1 ≥ 0, and 0 otherwise.
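One way to watch the exponential law emerge without assuming it (a rough sketch with assumed parameters): approximate the process by independent Bernoulli(λΔ) arrivals on a fine grid and compare the empirical CDF of the first arrival time with 1 − e^(−λx):

```python
import math, random

random.seed(2)
lam, dt, horizon = 1.5, 0.001, 10.0

def first_arrival():
    """First arrival time in a Bernoulli(lam*dt) grid approximation of the process."""
    t = 0.0
    while t < horizon:
        t += dt
        if random.random() < lam * dt:
            return t
    return horizon  # no arrival within the horizon (very rare here)

x, trials = 0.8, 10000
emp = sum(first_arrival() <= x for _ in range(trials)) / trials
print(round(1 - math.exp(-lam * x), 3))  # → 0.699, the theoretical F_X1(0.8)
```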

Inter-arrival Time
What is the PDF of X_2, the first inter-arrival time? What we know is that {X_1 = x_1} has already happened, i.e., N(x_1) = 1.
P[X_2 > x_2 | X_1 = x_1] = P[N(x_1 + x_2) − N(x_1) = 0 | N(x_1) = 1]
= P[N(x_1 + x_2) − N(x_1) = 0 | N(x_1) − N(0) = 1]
= P[N(x_1 + x_2) − N(x_1) = 0]   (independent increments)
= e^(−λx_2).
The above has nothing to do with x_1, so X_2 and X_1 are independent, and
F_{X_2}(x_2) = 1 − e^(−λx_2) for x_2 > 0, and 0 otherwise.
So X_2 is also an exponentially distributed RV! For a Poisson process N(t) of rate λ, the inter-arrival times {X_i}_{i=1}^∞ are i.i.d. exponential RVs.
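The last statement suggests a standard way to simulate the process: accumulate i.i.d. Exp(λ) gaps and count arrivals. A minimal sketch (λ = 2 and t = 4 are arbitrary) checking that E[N(t)] = λt:

```python
import random

random.seed(3)
lam, t = 2.0, 4.0

def count_at(t):
    """N(t): number of arrivals by time t, built from i.i.d. Exp(lam) gaps."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)  # next inter-arrival time X_i
        if s > t:
            return n
        n += 1

trials = 20000
mean_n = sum(count_at(t) for _ in range(trials)) / trials
print(round(mean_n))  # ≈ lam * t = 8
```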

Brownian Motion Process
Definition: the continuous-time Brownian motion W(t) satisfies W(0) = 0 and W(t + τ) − W(t) ~ N(0, σ^2 = ατ), i.e., the increment W(t + τ) − W(t) is a zero-mean Gaussian RV with variance ατ.
Discrete-time Brownian motion: X_{n+1} = X_n + W_{n+1} with X_0 = 0 and {W_n}_{n=1}^∞ i.i.d., W_n ~ N(0, σ^2). Unrolling,
X_1 = X_0 + W_1 = W_1
X_2 = X_1 + W_2 = W_1 + W_2
X_3 = X_2 + W_3 = W_1 + W_2 + W_3
...
so X_n = Σ_{i=1}^n W_i for n ≥ 1, and X_n = 0 for n ≤ 0.

Brownian Motion Process
E[X_n] = E[Σ_{i=1}^n W_i] = Σ_{i=1}^n E[W_i] = 0.
Var[X_n] = Var[Σ_{i=1}^n W_i] = nσ^2 (the variance grows without bound as n → ∞).
Let Z_n = (1/n)X_n = (1/n)Σ_{i=1}^n W_i. The factor 1/n matters a great deal: E[Z_n] = 0, but Var[Z_n] = (1/n)^2 Var[X_n] = σ^2/n.
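A short simulation (a sketch; σ = 1 and n = 50 are arbitrary choices) illustrating that Var[X_n] grows like nσ^2 while Var[Z_n] shrinks like σ^2/n:

```python
import random, statistics

random.seed(4)
sigma, n, trials = 1.0, 50, 5000

# Each path: X_n = W_1 + ... + W_n with W_i ~ N(0, sigma^2).
xs = []
for _ in range(trials):
    x = 0.0
    for _ in range(n):
        x += random.gauss(0.0, sigma)
    xs.append(x)

var_x = statistics.pvariance(xs)                    # ≈ n * sigma^2 = 50
var_z = statistics.pvariance([x / n for x in xs])   # ≈ sigma^2 / n = 0.02
print(round(var_x), round(var_z, 2))
```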

Basic Statistics of a Stochastic Process
Definition: the expected value function of a stochastic process X(t) is defined as μ_X(t) = E[X(t)].
Note: μ_X(t) is a deterministic function that gives the mean of X(t) for each t. Discrete-time: μ_X[n] = E[X_n] for all n ∈ Z.

Basic Statistics of a Stochastic Process
Example: random-amplitude cosine process X(t) = A cos(ωt + φ), where the amplitude A = A(s) is random. (Figure: three sample paths of X(t) versus t.)

Basic Statistics of a Stochastic Process
We can compute μ_X(t) = E[X(t)] = E[A cos(ωt + φ)] = E[A] cos(ωt + φ). E.g., if A ~ N(0, 1), then μ_X(t) = 0 for all t.
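A quick Monte Carlo check of this mean function (a sketch; ω, φ, and the time point t are assumed values):

```python
import math, random

random.seed(5)
w, phi, t, trials = 2.0, 0.5, 1.3, 100000

# X(t) = A cos(w*t + phi) with A ~ N(0, 1); average many realizations at fixed t.
m = sum(random.gauss(0.0, 1.0) * math.cos(w * t + phi) for _ in range(trials)) / trials
print(abs(m) < 0.02)  # empirical mean is near mu_X(t) = E[A] cos(wt + phi) = 0
```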

Basic Statistics of a Stochastic Process
Definition: the auto-covariance of a random process is C_X(t, τ) = Cov[X(t), X(t + τ)]; discrete-time: C_X[m, k] = Cov[X_m, X_{m+k}].
Definition: the auto-correlation of a random process is R_X(t, τ) = E[X(t)X(t + τ)]; discrete-time: R_X[m, k] = E[X_m X_{m+k}].
The two are related by C_X(t, τ) = R_X(t, τ) − μ_X(t)μ_X(t + τ) and C_X[m, k] = R_X[m, k] − μ_X[m]μ_X[m + k].
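The relation C_X[m, k] = R_X[m, k] − μ_X[m]μ_X[m + k] can be seen numerically on a toy process. The process below (X_n = B + W_n with a shared random offset B) is a hypothetical example of my own, not one from the lecture:

```python
import random

random.seed(6)

# Hypothetical example process: X_n = B + W_n with a shared B ~ N(1, 1) and
# i.i.d. W_n ~ N(0, 1), so Cov[X_m, X_{m+k}] = Var[B] = 1 for any m, k != 0.
trials = 200000
xm, xmk = [], []
for _ in range(trials):
    b = random.gauss(1.0, 1.0)
    xm.append(b + random.gauss(0.0, 1.0))    # X_m
    xmk.append(b + random.gauss(0.0, 1.0))   # X_{m+k}

mu_m = sum(xm) / trials
mu_mk = sum(xmk) / trials
R = sum(u * v for u, v in zip(xm, xmk)) / trials                     # R_X[m, k]
C = sum((u - mu_m) * (v - mu_mk) for u, v in zip(xm, xmk)) / trials  # C_X[m, k]
print(round(C, 1), round(R - mu_m * mu_mk, 1))  # both ≈ Cov = Var[B] = 1
```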

Stationary Process
Consider a particular time t_1: X(t_1) is a RV, and its PDF f_{X(t_1)}(x) is, generally speaking, a function of t_1.
Definition: X(t) is stationary if and only if the joint PDFs satisfy
f_{X(t_1),...,X(t_m)}(x_1, ..., x_m) = f_{X(t_1+τ),...,X(t_m+τ)}(x_1, ..., x_m) for all τ and all m.
Hence, if X(t) is stationary, f_{X(t)}(x) is the same for all t.

Stationary Process
Example: {W_n}_{n=−∞}^∞ i.i.d. Gaussian (white Gaussian noise). Is it stationary? How to check? Verify f_{W_1}(w_1) = f_{W_{1+q}}(w_1), f_{W_1,W_2}(w_1, w_2) = f_{W_{1+q},W_{2+q}}(w_1, w_2), and so on. In fact, i.i.d. implies stationary (the converse is not true).
Example: X_n(s) = A(s). Given s, A is fixed, so the joint distribution of (X_{n_1}, X_{n_2}) is just that of (A, A): the process is always stationary, but its samples are not independent.
Example: discrete-time Brownian motion X_n. Var[X_n] = nσ^2, so Var[X_1] = σ^2 while Var[X_100] = 100σ^2; X_1 and X_100 cannot have the same PDF, so the process is not stationary.

Stationary Process
Theorem: if X(t) is a stationary process, then μ_X(t) = μ_X for all t, and R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) = R_X(τ). These are necessary conditions for stationarity.
Proof: μ_X(t) = E[X(t)] = ∫_{x=−∞}^{∞} x f_{X(t)}(x) dx = ∫_{x=−∞}^{∞} x f_{X(0)}(x) dx = μ_X for all t, where we have used stationarity: f_{X(t)}(x) = f_{X(0)}(x).

Stationary Process
For the auto-correlation part:
R_X(t, τ) = E[X(t)X(t + τ)] = ∫_{x_1=−∞}^{∞} ∫_{x_2=−∞}^{∞} x_1 x_2 f_{X(t),X(t+τ)}(x_1, x_2) dx_1 dx_2
= ∫_{x_1=−∞}^{∞} ∫_{x_2=−∞}^{∞} x_1 x_2 f_{X(0),X(τ)}(x_1, x_2) dx_1 dx_2 = R_X(0, τ).

Stationary Process
The necessary conditions are useful for disqualifying X(t) as a stationary process.
Example: Y(t) = A cos(2πf_c t + θ), where A ~ N(0, 1) is random. Is Y(t) stationary? Sanity check: E[Y(t)] = E[A] cos(2πf_c t + θ) = 0. Now consider the time points t' with 2πf_c t' + θ = π/2 + 2kπ for some k ∈ Z: at every such t', Y(t', s) = 0 for all s. Can this be stationary? Check R_Y(t, τ) = E[Y(t)Y(t + τ)] = ?

Wide Sense Stationary (WSS) Process
Definition: X(t) is WSS if and only if E[X(t)] = μ_X for all t ∈ R, and R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) for all t, τ.
Example: Y(t) = A cos(2πf_c t + θ), where θ ~ U[0, 2π] is random and A is a constant. Is Y(t) WSS? Let α(t) = 2πf_c t. Then
E[Y(t)] = A E[cos(α(t) + θ)] = A ∫_{θ=0}^{2π} cos(α(t) + θ) (1/2π) dθ = (A/2π) ∫_{θ=0}^{2π} cos(α(t) + θ) dθ = 0.

Wide Sense Stationary (WSS) Process
In addition, we have R_Y(t, τ) = E[Y(t)Y(t + τ)] = A^2 E[cos(2πf_c t + θ) cos(2πf_c(t + τ) + θ)].
Recall that cos A cos B = (1/2) cos(A − B) + (1/2) cos(A + B). Hence,
R_Y(t, τ) = (A^2/2)(1/2π) ∫_{θ=0}^{2π} cos(4πf_c t + 2πf_c τ + 2θ) dθ + (A^2/2)(1/2π) ∫_{θ=0}^{2π} cos(2πf_c τ) dθ
= 0 + (A^2/2) cos(2πf_c τ) = R_Y(0, τ).
The mean is constant and R_Y(t, τ) does not depend on t, so Y(t) is WSS.
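The t-independence can be verified numerically. A sketch with arbitrary values of A, f_c, and τ, comparing the empirical autocorrelation at two different times t against (A^2/2) cos(2πf_c τ):

```python
import math, random

random.seed(7)
A, fc, tau, trials = 2.0, 0.5, 0.3, 200000

def R_hat(t):
    """Empirical R_Y(t, tau) for Y(t) = A cos(2*pi*fc*t + theta), theta ~ U[0, 2pi]."""
    acc = 0.0
    for _ in range(trials):
        th = random.uniform(0.0, 2.0 * math.pi)
        acc += (A * math.cos(2 * math.pi * fc * t + th)
                * A * math.cos(2 * math.pi * fc * (t + tau) + th))
    return acc / trials

theory = (A**2 / 2) * math.cos(2 * math.pi * fc * tau)
r0, r1 = R_hat(0.0), R_hat(1.7)   # two different time origins
print(round(theory, 3))  # → 1.176; r0 and r1 both land near this value
```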