ECE353: Probability and Random Processes
Lecture 18 - Stochastic Processes
Xiao Fu, School of Electrical Engineering and Computer Science, Oregon State University
E-mail: xiao.fu@oregonstate.edu
From RV to Stochastic Process

Recall that a RV X is a mapping from the sample space to a real number (i.e., X(s)).
From RV to Stochastic Process

A random pair is a mapping from an outcome to two random variables.
From RV to Stochastic Process

A random vector is a mapping from an outcome to a finite sequence of random variables.
From RV to Stochastic Process

A stochastic process is a mapping X(t, s) that maps an outcome to an infinite-length sequence (or function) indexed by time.
Sample Path

[Figure: a sample path of a stochastic process.]
Sample Path

Fixing time t = t_1, X(t_1, s) is a single RV.
Examples

Example 1: pick a video on YouTube at random to play. Every video is a unique stream of bits.

Example 2: random sinusoid X(t, s) = A(s) sin(ω(s)t + φ(s)); used in modulation in communications.

[Figure: two sample paths X(t, s), plotted for t from -5 to 10.]
Types of Stochastic Processes

Continuous-time process: t is continuous; written X(t, s).
Discrete-time process: t takes discrete values (e.g., in digital signal processing); written X_n(s).
Discrete-valued process: X(t, s) is a discrete RV.
Continuous-valued process: X(t, s) is a continuous RV.

Q: what is the type of the following?

[Figure: two sample paths, t from -5 to 10.]
Types of Stochastic Processes

Q: what about this?

[Figure: two sample paths, t from -5 to 10.]
Poisson Processes of Rate λ

Motivation: we wish to model the number of data packets arriving at a data center over time, or the number of customers arriving at a mall over time.

Definition: a Poisson process of rate λ, denoted N(t, s) (notation abused to N(t), since we know that s is always playing a role), satisfies:
1. N(t) = 0 for t < 0;
2. for all t_1 > t_0, the increment N(t_1) - N(t_0) is a Poisson RV with mean λ(t_1 - t_0);
3. if [t_0, t_1] and [t_0', t_1'] are non-overlapping, then the corresponding increments N(t_1) - N(t_0) and N(t_1') - N(t_0') are independent RVs.

Note: a Poisson RV with mean α > 0 has PMF
P_N(n) = α^n e^{-α} / n! for n = 0, 1, 2, ..., and 0 otherwise.
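As a quick numerical sketch (not from the lecture; the rate and grid are illustrative choices), one way to generate a sample path of a Poisson process is to draw independent Poisson increments over small disjoint intervals, exactly as property 3 of the definition prescribes:

```python
import numpy as np

# Sketch: generate one sample path of a Poisson process of rate lam on [0, 10]
# by drawing independent Poisson increments over disjoint sub-intervals of
# length dt (each increment has mean lam * dt, per the definition).
rng = np.random.default_rng(1)
lam, dt, steps = 3.0, 0.01, 1000                    # illustrative parameters
increments = rng.poisson(lam * dt, steps)           # independent increments
N = np.concatenate(([0], np.cumsum(increments)))    # N(0) = 0, running count
print(N[-1])   # N(10): a Poisson RV with mean lam * 10 = 30
```

The resulting path is a non-decreasing staircase, which matches the counting-process interpretation.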
Poisson Processes of Rate λ

[Figure: illustration of a Poisson process sample path.]
Poisson Processes of Rate λ

Example: assume t_1 ≤ t_2 ≤ t_3 and n_1 ≤ n_2 ≤ n_3. What is the joint PMF P_{N(t_1), N(t_2), N(t_3)}(n_1, n_2, n_3)?
Poisson Processes of Rate λ

We are interested in the joint probability

P[N(t_1) = n_1, N(t_2) = n_2, N(t_3) = n_3]
= P[N(t_1) - N(0) = n_1, N(t_2) - N(t_1) = n_2 - n_1, N(t_3) - N(t_2) = n_3 - n_2]
= P[N(t_1) - N(0) = n_1] P[N(t_2) - N(t_1) = n_2 - n_1] P[N(t_3) - N(t_2) = n_3 - n_2]
= ((λt_1)^{n_1} / n_1!) e^{-λt_1} · ((λ(t_2 - t_1))^{n_2 - n_1} / (n_2 - n_1)!) e^{-λ(t_2 - t_1)} · ((λ(t_3 - t_2))^{n_3 - n_2} / (n_3 - n_2)!) e^{-λ(t_3 - t_2)},

where the second equality uses the independence of increments over non-overlapping intervals.
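The factorization above can be checked by Monte Carlo. This is a sketch (not part of the lecture; λ, the times, and the counts are illustrative): we simulate the process by drawing independent Poisson increments and compare the empirical joint probability at two time points with the product of increment probabilities.

```python
import numpy as np
from math import exp, factorial

# Monte Carlo check: for a Poisson process of rate lam, P[N(t1)=n1, N(t2)=n2]
# should equal the product of the two increment probabilities.
rng = np.random.default_rng(0)
lam, t1, t2 = 2.0, 1.0, 2.5
trials = 200_000

inc1 = rng.poisson(lam * t1, trials)           # N(t1) - N(0)
inc2 = rng.poisson(lam * (t2 - t1), trials)    # N(t2) - N(t1), independent
N1, N2 = inc1, inc1 + inc2                     # counts at t1 and t2

n1, n2 = 2, 5
empirical = np.mean((N1 == n1) & (N2 == n2))

def pois(n, a):
    # Poisson PMF with mean a
    return a**n * exp(-a) / factorial(n)

theoretical = pois(n1, lam * t1) * pois(n2 - n1, lam * (t2 - t1))
print(empirical, theoretical)   # the two values should be close
```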
Arrival Time

From the Poisson process, one can also derive characteristics of the arrival times. Let N(t) denote the number of customers that one observes by time t, which is a Poisson process. The time at which the first customer arrives is a random variable. The inter-arrival times are also random.
Arrival Time

Let us consider the arrival time of the first customer, X_1. What is its PDF? We start with the CDF, P[X_1 ≤ x_1]. This is not easy to compute directly, but we may compute P[X_1 > x_1] = P[no arrival until time point x_1] (note that the number of arrivals between t = 0 and t = x_1 is a Poisson RV with mean λ(x_1 - 0)):

P[X_1 > x_1] = P[N(x_1) - N(0) = 0] = ((λx_1)^0 / 0!) e^{-λx_1} = e^{-λx_1}.

Hence F_{X_1}(x_1) = P[X_1 ≤ x_1] = 1 - e^{-λx_1}. The PDF is f_{X_1}(x_1) = dF_{X_1}(x_1)/dx_1 = λe^{-λx_1} for x_1 ≥ 0. Beautiful!

f_{X_1}(x_1) = λe^{-λx_1} for x_1 ≥ 0, and 0 otherwise.
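A small numerical check of this result (a sketch with an assumed rate λ = 4): the first-arrival time is exponential with mean 1/λ, and NumPy can draw such samples directly, so we compare the empirical mean and CDF against the formula above.

```python
import numpy as np

# Sketch: X1 ~ Exp(lam) has mean 1/lam and CDF F(x) = 1 - e^{-lam x}.
# numpy's exponential sampler is parameterized by the mean (scale = 1/lam).
rng = np.random.default_rng(2)
lam, trials = 4.0, 100_000
x1 = rng.exponential(1 / lam, trials)    # samples of the first arrival time

print(x1.mean())                         # close to 1/lam = 0.25
x = 0.5
print(np.mean(x1 <= x), 1 - np.exp(-lam * x))   # empirical vs F(0.5)
```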
Inter-arrival Time

What is the PDF of X_2, the first inter-arrival time? What we know is that {X_1 = x_1} has already happened, and N(x_1) = 1.

P[X_2 > x_2 | X_1 = x_1]
= P[N(x_1 + x_2) - N(x_1) = 0 | N(x_1) = 1]
= P[N(x_1 + x_2) - N(x_1) = 0 | N(x_1) - N(0) = 1]
= P[N(x_1 + x_2) - N(x_1) = 0]   (by independence of increments)
= e^{-λx_2}.

The above has nothing to do with x_1, so X_2 and X_1 are independent, and

F_{X_2}(x_2) = 1 - e^{-λx_2} for x_2 > 0, and 0 otherwise.

X_2 is also an exponentially distributed RV! For a Poisson process N(t) of rate λ, the inter-arrival times {X_i}_{i=1}^∞ are i.i.d. exponential RVs.
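The i.i.d. exponential characterization gives a second way to simulate the process: sum inter-arrival times and count how many arrivals land before t. As a sketch (parameters are illustrative), the count N(t) obtained this way should have Poisson mean and variance λt:

```python
import numpy as np

# Sketch: build a Poisson process from i.i.d. exponential inter-arrival times
# {X_i}, then check that N(t) has mean and variance lam * t (Poisson).
rng = np.random.default_rng(3)
lam, t, trials = 2.0, 5.0, 20_000
counts = np.empty(trials, dtype=int)
for k in range(trials):
    # draw far more inter-arrivals than could plausibly fit in [0, t]
    inter = rng.exponential(1 / lam, size=100)
    arrivals = np.cumsum(inter)          # arrival times T_1 < T_2 < ...
    counts[k] = np.sum(arrivals <= t)    # N(t)

print(counts.mean(), counts.var())       # both close to lam * t = 10
```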
Brownian Motion Process

Definition: a continuous-time Brownian motion is a process W(t) such that W(0) = 0 and W(t + τ) - W(t) ~ N(0, σ² = ατ), i.e., W(t + τ) - W(t) is a zero-mean Gaussian RV with variance ατ.

Discrete-time Brownian motion: X_{n+1} = X_n + W_{n+1}, X_0 = 0, where W_n ~ N(0, σ²) and {W_n}_{n=1}^∞ are i.i.d. Then

X_1 = X_0 + W_1 = W_1
X_2 = X_1 + W_2 = W_1 + W_2
X_3 = X_2 + W_3 = W_1 + W_2 + W_3
...

so X_n = Σ_{i=1}^n W_i for n ≥ 1, and X_n = 0 for n ≤ 0.
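The recursion above is just a cumulative sum of i.i.d. Gaussian steps, which makes it a one-liner to simulate. A sketch with assumed σ and horizon:

```python
import numpy as np

# Sketch: discrete-time Brownian motion X_n = W_1 + ... + W_n with i.i.d.
# W_i ~ N(0, sigma^2), so E[X_n] = 0 and Var[X_n] = n * sigma^2.
rng = np.random.default_rng(4)
sigma, n, trials = 1.5, 50, 50_000
W = rng.normal(0.0, sigma, size=(trials, n))   # i.i.d. Gaussian steps
X = np.cumsum(W, axis=1)                       # each row: one sample path

print(X[:, -1].mean())   # close to 0
print(X[:, -1].var())    # close to n * sigma^2 = 112.5
```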
Brownian Motion Process

E[X_n] = E[Σ_{i=1}^n W_i] = Σ_{i=1}^n E[W_i] = 0.
Var[X_n] = Var[Σ_{i=1}^n W_i] = nσ² (the variance grows unbounded as n → ∞).

Let Z_n = (1/n)X_n = (1/n) Σ_{i=1}^n W_i. The factor 1/n matters so much!

E[Z_n] = 0, Var[Z_n] = (1/n)² Var[X_n] = σ²/n.
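The contrast between Var[X_n] = nσ² (growing) and Var[Z_n] = σ²/n (shrinking) can be seen numerically. A sketch with an assumed σ:

```python
import numpy as np

# Sketch: the scaled sum Z_n = (1/n) * sum_{i=1}^n W_i has variance
# sigma^2 / n, which shrinks as n grows.
rng = np.random.default_rng(5)
sigma, trials = 2.0, 20_000
vars_emp = {}
for n in (10, 50, 200):
    W = rng.normal(0.0, sigma, size=(trials, n))
    Z = W.sum(axis=1) / n                  # Z_n for each of the trials
    vars_emp[n] = Z.var()
    print(n, vars_emp[n], sigma**2 / n)    # empirical vs sigma^2 / n
```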
Basic Statistics of Stochastic Process

Definition: the expected value function of a stochastic process X(t) is defined as µ_X(t) = E[X(t)].

Note: µ_X(t) is a deterministic function that gives the mean of X(t) for all t. Discrete-time: µ_X[n] = E[X_n] for all n ∈ Z.
Basic Statistics of Stochastic Process

Example: random amplitude cosine process: X(t) = A cos(ωt + φ) = A(s) cos(ωt + φ), where A is random.

[Figure: three sample paths X(t) for different realizations of A, t from -5 to 10.]
Basic Statistics of Stochastic Process

We can compute

µ_X(t) = E[X(t)] = E[A cos(ωt + φ)] = E[A] cos(ωt + φ).

E.g., if A ~ N(0, 1), then µ_X(t) = 0 for all t.
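A Monte Carlo sketch of this mean function (ω and φ are assumed values, not from the slides): draw many amplitudes A ~ N(0, 1) and average X(t) at a few fixed times; the sample means should all be near zero.

```python
import numpy as np

# Sketch: for X(t) = A cos(omega t + phi) with A ~ N(0, 1),
# mu_X(t) = E[A] cos(omega t + phi) = 0 for every t.
rng = np.random.default_rng(6)
omega, phi, trials = 2 * np.pi, 0.3, 200_000   # illustrative omega, phi
A = rng.normal(0.0, 1.0, trials)               # one amplitude per outcome s
means = []
for t in (0.0, 0.7, 3.0):
    X_t = A * np.cos(omega * t + phi)          # the RV X(t) across outcomes
    means.append(X_t.mean())
    print(t, means[-1])                        # close to 0 at every t
```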
Basic Statistics of Stochastic Process

Definition: the auto-covariance of a random process is C_X(t, τ) = Cov[X(t), X(t + τ)]. Discrete-time: C_X[m, k] = Cov[X_m, X_{m+k}].

Definition: the auto-correlation of a random process is R_X(t, τ) = E[X(t)X(t + τ)]. Discrete-time: R_X[m, k] = E[X_m X_{m+k}].

The two are related by

C_X(t, τ) = R_X(t, τ) - µ_X(t)µ_X(t + τ),
C_X[m, k] = R_X[m, k] - µ_X[m]µ_X[m + k].
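As a sketch of how R_X[m, k] can be estimated (the i.i.d.-noise test process is my choice, not from the slides): average X_m · X_{m+k} over many sample paths. For i.i.d. N(0, 1) noise, the auto-correlation is 1 at lag k = 0 and 0 at any other lag.

```python
import numpy as np

# Sketch: estimate R_X[m, k] = E[X_m X_{m+k}] by averaging across sample
# paths of i.i.d. N(0, 1) noise (for which R = 1 at k = 0, else 0).
rng = np.random.default_rng(7)
trials, length = 100_000, 20
X = rng.normal(0.0, 1.0, size=(trials, length))   # each row: one sample path

m = 5
R_0 = np.mean(X[:, m] * X[:, m])       # estimate of R_X[5, 0], near 1
R_3 = np.mean(X[:, m] * X[:, m + 3])   # estimate of R_X[5, 3], near 0
print(R_0, R_3)
```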
Stationary Process

Let us look at a particular time t_1: X(t_1) is a RV. The PDF f_{X(t_1)}(x), generally speaking, is a function of t_1.

Definition: X(t) is stationary if and only if the joint PDFs satisfy

f_{X(t_1),...,X(t_m)}(x_1, ..., x_m) = f_{X(t_1+τ),...,X(t_m+τ)}(x_1, ..., x_m) for all τ, all m, and all t_1, ..., t_m.

Hence, if X(t) is stationary, f_{X(t)}(x) is the same for all t.
Stationary Process

Example: {W_n}_{n=-∞}^∞ i.i.d. Gaussian (white Gaussian noise). Is it stationary? How to check? f_{W_1}(w_1) = f_{W_{1+q}}(w_1)? f_{W_1,W_2}(w_1, w_2) = f_{W_{1+q},W_{2+q}}(w_1, w_2)? i.i.d. implies stationary. (The converse is not true.)

Example: X_n(s) = A(s). Given s, A is fixed, so the joint PMF P_{X_{n_1},X_{n_2}}(x_1, x_2) puts all of its mass on x_1 = x_2; the process is always stationary, but its samples are not independent.

Example: discrete-time Brownian motion X_n: Var[X_n] = nσ², so Var[X_1] = σ² while Var[X_100] = 100σ². The marginal PDFs cannot all be the same, so X_n is not stationary.
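The Brownian-motion disqualification in the last example can be checked numerically. A sketch with σ = 1: compare the empirical variances at n = 1 and n = 100.

```python
import numpy as np

# Sketch: discrete-time Brownian motion cannot be stationary, since
# Var[X_1] = sigma^2 while Var[X_100] = 100 * sigma^2.
rng = np.random.default_rng(8)
sigma, trials = 1.0, 50_000
W = rng.normal(0.0, sigma, size=(trials, 100))
X = np.cumsum(W, axis=1)                 # each row: X_1, ..., X_100
print(X[:, 0].var(), X[:, 99].var())     # close to 1 and 100
```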
Stationary Process

Theorem: if X(t) is a stationary process, then µ_X(t) = µ_X for all t, and R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) = R_X(τ). These are necessary conditions for being stationary.

Proof: for the mean,

µ_X(t) = E[X(t)] = ∫_{-∞}^{∞} x f_{X(t)}(x) dx = ∫_{-∞}^{∞} x f_{X(0)}(x) dx = µ_X for all t,

where we have used stationarity: f_{X(t)}(x) = f_{X(0)}(x).
Stationary Process

For the auto-correlation part:

R_X(t, τ) = E[X(t)X(t + τ)]
= ∫_{-∞}^{∞} ∫_{-∞}^{∞} x_1 x_2 f_{X(t),X(t+τ)}(x_1, x_2) dx_1 dx_2
= ∫_{-∞}^{∞} ∫_{-∞}^{∞} x_1 x_2 f_{X(0),X(τ)}(x_1, x_2) dx_1 dx_2
= R_X(0, τ).
Stationary Process

The necessary conditions are used for disqualifying X(t) as a stationary process.

Example: Y(t) = A cos(2πf_c t + θ), where A ~ N(0, 1) is random and θ is a fixed phase. Is Y(t) stationary?

Sanity check: E[Y(t)] = E[A] cos(2πf_c t + θ) = 0.

There exist points t* with 2πf_c t* + θ = π/2 + 2kπ for some k ∈ Z, where Y(t*, s) = 0 for all s. Can this be stationary? Check R_Y(t, τ) = E[Y(t)Y(t + τ)] = E[A²] cos(2πf_c t + θ) cos(2πf_c(t + τ) + θ), which depends on t, so Y(t) is not stationary.
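The t-dependence of the auto-correlation can be seen numerically. A sketch (f_c, θ, and τ are illustrative choices): estimate E[Y(t)Y(t + τ)] at two different times and observe that the values differ.

```python
import numpy as np

# Sketch: for Y(t) = A cos(2 pi fc t + theta) with random A ~ N(0, 1) and a
# FIXED phase theta, E[Y(t) Y(t+tau)] depends on t, so Y(t) is not stationary.
rng = np.random.default_rng(9)
fc, theta, tau, trials = 1.0, 0.0, 0.1, 200_000   # illustrative parameters
A = rng.normal(0.0, 1.0, trials)

def R(t):
    # Monte Carlo estimate of E[Y(t) Y(t + tau)]
    return np.mean(A * np.cos(2*np.pi*fc*t + theta)
                   * A * np.cos(2*np.pi*fc*(t + tau) + theta))

print(R(0.0), R(0.25))   # noticeably different values => depends on t
```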
Wide Sense Stationary (WSS) Process

Definition: X(t) is WSS if and only if

E[X(t)] = µ_X for all t ∈ R, and
R_X(t, τ) = E[X(t)X(t + τ)] = R_X(0, τ) for all t, τ.

Example: Y(t) = A cos(2πf_c t + θ), where θ ~ U[0, 2π] is random and A is a constant. Is Y(t) WSS?

Let α(t) = 2πf_c t. Then

E[Y(t)] = A E[cos(α(t) + θ)] = A ∫_0^{2π} cos(α(t) + θ) (1/(2π)) dθ = (A/(2π)) ∫_0^{2π} cos(α(t) + θ) dθ = 0.
Wide Sense Stationary (WSS) Process

In addition, we have

R_Y(t, τ) = E[Y(t)Y(t + τ)] = A² E[cos(2πf_c t + θ) cos(2πf_c(t + τ) + θ)].

Recall that

cos A cos B = (1/2) cos(A - B) + (1/2) cos(A + B).

Hence, we have

R_Y(t, τ) = (A²/2) ∫_0^{2π} cos(4πf_c t + 2πf_c τ + 2θ) (1/(2π)) dθ + (A²/2) ∫_0^{2π} cos(-2πf_c τ) (1/(2π)) dθ
= 0 + (A²/2) cos(2πf_c τ)
= R_Y(0, τ).

The mean and auto-correlation do not depend on t, so Y(t) is WSS.
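A Monte Carlo sketch confirming the derivation (A, f_c, and τ are illustrative): with a random uniform phase, the estimated auto-correlation matches (A²/2) cos(2πf_c τ) and is the same at different times t.

```python
import numpy as np

# Sketch: for Y(t) = A cos(2 pi fc t + theta) with theta ~ Uniform[0, 2 pi],
# the autocorrelation is (A^2 / 2) cos(2 pi fc tau), independent of t (WSS).
rng = np.random.default_rng(10)
A, fc, trials = 2.0, 1.0, 500_000
theta = rng.uniform(0.0, 2 * np.pi, trials)   # random phase, one per outcome

def R(t, tau):
    # Monte Carlo estimate of E[Y(t) Y(t + tau)]
    return np.mean(A * np.cos(2*np.pi*fc*t + theta)
                   * A * np.cos(2*np.pi*fc*(t + tau) + theta))

tau = 0.2
expected = (A**2 / 2) * np.cos(2 * np.pi * fc * tau)
print(R(0.0, tau), R(0.33, tau), expected)   # all three close
```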