Chapter 6: Random Processes


Yunghsiang S. Han
Graduate Institute of Communication Engineering, National Taipei University, Taiwan
E-mail: yshan@mail.ntpu.edu.tw
(Modified from the lecture notes by Prof. Mao-Ching Chiu)

Definition of a Random Process

Consider a random experiment with sample space S. To every outcome ζ ∈ S, we assign a function of time according to some rule:

    X(t, ζ),  t ∈ I.

For fixed ζ, the graph of the function X(t, ζ) versus t is a sample function of the random process. For each fixed t_k from the index set I, X(t_k, ζ) is a random variable.

The indexed family of random variables {X(t, ζ), t ∈ I} is called a random process or stochastic process.

A stochastic process is said to be discrete-time if the index set I is a countable set. A continuous-time stochastic process is one in which I is continuous.

Example: Let ζ be a number selected at random from the interval S = [0, 1], and let 0.b_1 b_2 ... be the binary expansion of ζ:

    ζ = Σ_{i=1}^∞ b_i 2^{-i},  b_i ∈ {0, 1}.

Define the discrete-time random process X(n, ζ) by

    X(n, ζ) = b_n,  n = 1, 2, ....

A sequence of binary numbers is obtained.
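As a quick numerical illustration (a minimal Python sketch, not part of the original notes; the function name and parameter choices are my own), one sample path of this process can be generated from a randomly chosen ζ:

```python
import numpy as np

def binary_expansion_process(zeta, n_max):
    """Return b_1, ..., b_{n_max}, the binary expansion digits of zeta in [0, 1)."""
    bits = []
    x = zeta
    for _ in range(n_max):
        x *= 2
        b = int(x)       # integer part is the next bit
        bits.append(b)
        x -= b           # keep the fractional part
    return np.array(bits)

rng = np.random.default_rng(0)
zeta = rng.uniform(0.0, 1.0)               # pick an outcome at random from S = [0, 1)
print(binary_expansion_process(zeta, 10))  # one sample path X(1), ..., X(10)
```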

Example:

1. Let ζ ∈ S = [-1, +1] be selected at random. Define the continuous-time random process X(t, ζ) by

    X(t, ζ) = ζ cos(2πt),  -∞ < t < ∞.

2. Let ζ ∈ S = (-π, π) be selected at random, and let

    Y(t, ζ) = cos(2πt + ζ).

Specifying a Random Process

Joint Distributions of Time Samples

Let X_1, X_2, ..., X_k be the k random variables obtained by sampling the random process X(t, ζ) at the times t_1, t_2, ..., t_k:

    X_1 = X(t_1, ζ), X_2 = X(t_2, ζ), ..., X_k = X(t_k, ζ).

The joint behavior of the random process at these k time instants is specified by the joint cdf of (X_1, X_2, ..., X_k).

A stochastic process is specified by the collection of kth-order joint cumulative distribution functions:

    F_{X_1,...,X_k}(x_1, ..., x_k) = P[X_1 ≤ x_1, ..., X_k ≤ x_k]

for any k and any choice of sampling instants t_1, ..., t_k.

If the stochastic process is discrete-valued, then a collection of probability mass functions can be used to specify the stochastic process:

    p_{X_1,...,X_k}(x_1, ..., x_k) = P[X_1 = x_1, ..., X_k = x_k].

If the stochastic process is continuous-valued, then a collection of probability density functions can be used instead: f_{X_1,...,X_k}(x_1, ..., x_k).

Example: Let X_n be a sequence of independent, identically distributed Bernoulli random variables with p = 1/2. The joint pmf for any k time samples is then

    P[X_1 = x_1, X_2 = x_2, ..., X_k = x_k] = 2^{-k},  x_i ∈ {0, 1} for all i.
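A short Monte Carlo check (my own sketch, not from the notes; the particular pattern is an arbitrary choice) that any fixed pattern of k samples occurs with relative frequency close to 2^{-k}:

```python
import numpy as np

rng = np.random.default_rng(1)
k, trials = 4, 200_000
pattern = np.array([1, 0, 1, 1])                 # any fixed choice of x_1, ..., x_k

samples = rng.integers(0, 2, size=(trials, k))   # iid Bernoulli(1/2) time samples
freq = np.mean(np.all(samples == pattern, axis=1))
print(freq, 2.0**-k)   # relative frequency should be close to 2^{-k} = 0.0625
```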

A random process X(t) is said to have independent increments if, for any k and any choice of sampling instants t_1 < t_2 < ... < t_k, the random variables

    X(t_2) - X(t_1), X(t_3) - X(t_2), ..., X(t_k) - X(t_{k-1})

are independent random variables.

A random process X(t) is said to be Markov if the future of the process given the present is independent of the past; that is, for any k, any choice of sampling instants t_1 < t_2 < ... < t_k, and any x_1, x_2, ..., x_k,

    f_{X(t_k)}(x_k | X(t_{k-1}) = x_{k-1}, ..., X(t_1) = x_1) = f_{X(t_k)}(x_k | X(t_{k-1}) = x_{k-1})

if X(t) is continuous-valued, and

    P[X(t_k) = x_k | X(t_{k-1}) = x_{k-1}, ..., X(t_1) = x_1] = P[X(t_k) = x_k | X(t_{k-1}) = x_{k-1}]

if X(t) is discrete-valued.

Independent increments ⇒ Markov; Markov does NOT imply independent increments.

The Mean, Autocorrelation, and Autocovariance Functions

The mean m_X(t) of a random process X(t) is defined by

    m_X(t) = E[X(t)] = ∫_{-∞}^{+∞} x f_{X(t)}(x) dx,

where f_{X(t)}(x) is the pdf of X(t). In general, m_X(t) is a function of time. The autocorrelation R_X(t_1, t_2) of a random process

X(t) is defined as

    R_X(t_1, t_2) = E[X(t_1) X(t_2)] = ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} x y f_{X(t_1),X(t_2)}(x, y) dx dy.

In general, the autocorrelation is a function of t_1 and t_2.

The autocovariance C_X(t_1, t_2) of a random process X(t) is defined as the covariance of X(t_1) and X(t_2):

    C_X(t_1, t_2) = E[{X(t_1) - m_X(t_1)}{X(t_2) - m_X(t_2)}] = R_X(t_1, t_2) - m_X(t_1) m_X(t_2).

The variance of X(t) can be obtained from C_X(t_1, t_2):

    VAR[X(t)] = E[(X(t) - m_X(t))²] = C_X(t, t).

The correlation coefficient of X(t) is given by

    ρ_X(t_1, t_2) = C_X(t_1, t_2) / √(C_X(t_1, t_1) C_X(t_2, t_2)),  |ρ_X(t_1, t_2)| ≤ 1.

Example: Let X(t) = A cos 2πt, where A is some random variable. The mean of X(t) is given by

    m_X(t) = E[A cos 2πt] = E[A] cos 2πt.

The autocorrelation is

    R_X(t_1, t_2) = E[A cos(2πt_1) A cos(2πt_2)] = E[A²] cos(2πt_1) cos(2πt_2),

and the autocovariance is

    C_X(t_1, t_2) = R_X(t_1, t_2) - m_X(t_1) m_X(t_2) = {E[A²] - E[A]²} cos(2πt_1) cos(2πt_2) = VAR[A] cos(2πt_1) cos(2πt_2).

Example: Let X(t) = cos(ωt + Θ), where Θ is uniformly distributed in the interval (-π, π). The mean of X(t) is given by

    m_X(t) = E[cos(ωt + Θ)] = (1/2π) ∫_{-π}^{π} cos(ωt + θ) dθ = 0.

The autocorrelation and autocovariance are then

    C_X(t_1, t_2) = R_X(t_1, t_2) = E[cos(ωt_1 + Θ) cos(ωt_2 + Θ)]
                  = (1/2π) ∫_{-π}^{π} (1/2){cos(ω(t_1 - t_2)) + cos(ω(t_1 + t_2) + 2θ)} dθ
                  = (1/2) cos(ω(t_1 - t_2)).
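The closed form (1/2) cos(ω(t_1 - t_2)) is easy to verify numerically. Below is a minimal Monte Carlo sketch (my own, assuming NumPy; the values of ω, t_1, t_2 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
omega, t1, t2, trials = 2 * np.pi, 0.3, 0.8, 500_000

theta = rng.uniform(-np.pi, np.pi, size=trials)   # Theta ~ Uniform(-pi, pi)
R_hat = np.mean(np.cos(omega * t1 + theta) * np.cos(omega * t2 + theta))
print(R_hat, 0.5 * np.cos(omega * (t1 - t2)))     # the two values should nearly agree
```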

Gaussian Random Process

A random process X(t) is a Gaussian random process if the samples X_1 = X(t_1), X_2 = X(t_2), ..., X_k = X(t_k) are jointly Gaussian random variables for all k and all choices of t_1, ..., t_k:

    f_{X_1,X_2,...,X_k}(x_1, ..., x_k) = exp{-(1/2)(x - m)^T K^{-1} (x - m)} / ((2π)^{k/2} |K|^{1/2}),

where m = (m_X(t_1), ..., m_X(t_k))^T and K is the k × k covariance matrix with entries

    K_{ij} = C_X(t_i, t_j),  i, j = 1, ..., k.

The joint pdfs of a Gaussian random process are completely specified by the mean and the covariance function.

A linear operation on a Gaussian random process results in another Gaussian random process.

Example: Let the discrete-time random process X_n be a sequence of independent Gaussian random variables with mean m and variance σ². The covariance matrix for the times t_1, ..., t_k is

    {C_X(t_i, t_j)} = {σ² δ_{ij}} = σ² I,

where δ_{ij} = 1 when i = j and 0 otherwise, and I is the identity matrix. The corresponding joint pdf is

    f_{X_1,...,X_k}(x_1, ..., x_k) = (2πσ²)^{-k/2} exp{-Σ_{i=1}^k (x_i - m)² / (2σ²)} = f_X(x_1) f_X(x_2) ⋯ f_X(x_k).
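A small numerical sanity check (my own sketch, not from the notes; the sample point x and the values of m, σ are arbitrary) that the joint pdf indeed factors into the product of the marginal pdfs:

```python
import numpy as np

m, sigma = 1.0, 2.0
x = np.array([0.5, -1.0, 2.5])        # an arbitrary sample point (x_1, x_2, x_3)
k = len(x)

# kth-order joint Gaussian pdf with covariance sigma^2 * I
joint = (2 * np.pi * sigma**2) ** (-k / 2) * np.exp(-np.sum((x - m) ** 2) / (2 * sigma**2))
# product of the k univariate Gaussian marginals
marginals = np.prod((2 * np.pi * sigma**2) ** -0.5 * np.exp(-(x - m) ** 2 / (2 * sigma**2)))
print(joint, marginals)               # identical: the joint pdf factors for iid samples
```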

Multiple Random Processes

To specify the joint behavior of X(t) and Y(t), we must specify all possible joint density functions of X(t_1), ..., X(t_k) and Y(t'_1), ..., Y(t'_j) for all k, j and all choices of t_1, ..., t_k and t'_1, ..., t'_j.

X(t) and Y(t) are said to be independent if the vector random variables (X(t_1), ..., X(t_k)) and (Y(t'_1), ..., Y(t'_j)) are independent for all k, j and all choices of t_1, ..., t_k and t'_1, ..., t'_j.

The cross-correlation R_{X,Y}(t_1, t_2) of X(t) and Y(t) is defined by

    R_{X,Y}(t_1, t_2) = E[X(t_1) Y(t_2)].

The processes X(t) and Y(t) are said to be orthogonal if R_{X,Y}(t_1, t_2) = 0 for all t_1 and t_2.

The cross-covariance C_{X,Y}(t_1, t_2) of X(t) and Y(t) is defined by

    C_{X,Y}(t_1, t_2) = E[{X(t_1) - m_X(t_1)}{Y(t_2) - m_Y(t_2)}] = R_{X,Y}(t_1, t_2) - m_X(t_1) m_Y(t_2).

The processes X(t) and Y(t) are said to be uncorrelated if C_{X,Y}(t_1, t_2) = 0 for all t_1 and t_2. Note that

    C_{X,Y}(t_1, t_2) = 0 ⇔ R_{X,Y}(t_1, t_2) = E[X(t_1) Y(t_2)] = m_X(t_1) m_Y(t_2) = E[X(t_1)] E[Y(t_2)].

Example: Let X(t) = cos(ωt + Θ) and Y(t) = sin(ωt + Θ), where Θ is a random variable uniformly distributed in [-π, π]. Find the cross-covariance of X(t) and Y(t).

Sol: Since X(t) and Y(t) are zero-mean, the cross-covariance is equal to the cross-correlation:

    R_{X,Y}(t_1, t_2) = E[cos(ωt_1 + Θ) sin(ωt_2 + Θ)]
                      = E[-(1/2) sin(ω(t_1 - t_2)) + (1/2) sin(ω(t_1 + t_2) + 2Θ)]
                      = -(1/2) sin(ω(t_1 - t_2)),

since E[sin(ω(t_1 + t_2) + 2Θ)] = 0.

Example: Suppose we observe a process Y(t) = X(t) + N(t). Find the cross-correlation between the observed signal and the desired signal, assuming that X(t) and N(t) are independent random processes.

    R_{X,Y}(t_1, t_2) = E[X(t_1) Y(t_2)] = E[X(t_1){X(t_2) + N(t_2)}]
                      = E[X(t_1) X(t_2)] + E[X(t_1) N(t_2)]
                      = R_X(t_1, t_2) + E[X(t_1)] E[N(t_2)]
                      = R_X(t_1, t_2) + m_X(t_1) m_N(t_2).

Examples of Discrete-Time Random Processes

iid Random Processes

The sequence X_n is called an independent, identically distributed (iid) random process if the joint cdf for any time instants n_1, ..., n_k can be expressed as

    F_{X_{n_1},...,X_{n_k}}(x_{n_1}, ..., x_{n_k}) = P[X_{n_1} ≤ x_{n_1}, ..., X_{n_k} ≤ x_{n_k}] = F_{X_{n_1}}(x_{n_1}) ⋯ F_{X_{n_k}}(x_{n_k}).

The mean of an iid process is

    m_X(n) = E[X_n] = m  for all n.

Autocovariance function: If n_1 ≠ n_2,

    C_X(n_1, n_2) = E[(X_{n_1} - m)(X_{n_2} - m)] = E[X_{n_1} - m] E[X_{n_2} - m] = 0,

since X_{n_1} and X_{n_2} are independent. If n_1 = n_2 = n,

    C_X(n_1, n_2) = E[(X_n - m)²] = σ².

Therefore,

    C_X(n_1, n_2) = σ² δ_{n_1,n_2}.

The autocorrelation function of an iid process is

    R_X(n_1, n_2) = C_X(n_1, n_2) + m².

Sum Processes: The Binomial Counting and Random Walk Process

Consider a random process S_n that is the sum of a sequence of iid random variables X_1, X_2, ...:

    S_n = X_1 + X_2 + ⋯ + X_n = S_{n-1} + X_n,  n = 1, 2, ...,

where S_0 = 0.

We call S_n the sum process. S_n is independent of the past when S_{n-1} is known.

Example: Let I_i be a sequence of independent Bernoulli random variables, and let S_n be the corresponding sum process. S_n is a binomial random variable with parameters n and p = P[I = 1]:

    P[S_n = j] = C(n, j) p^j (1 - p)^{n-j}  for 0 ≤ j ≤ n,

and zero otherwise, where C(n, j) denotes the binomial coefficient. Thus S_n has mean np and variance np(1 - p).
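A brief simulation sketch (my own, assuming NumPy; the values of n and p are arbitrary) comparing the empirical mean and variance of S_n with np and np(1 - p):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, trials = 50, 0.3, 100_000

I = rng.random((trials, n)) < p          # iid Bernoulli(p) increments
S_n = I.sum(axis=1)                      # S_n for each realization
print(S_n.mean(), n * p)                 # empirical vs. theoretical mean
print(S_n.var(), n * p * (1 - p))        # empirical vs. theoretical variance
```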

The sum process S_n has independent increments in nonoverlapping time intervals. For example, consider n_0 < n ≤ n_1 and n_2 < n ≤ n_3, where n_1 ≤ n_2. We have

    S_{n_1} - S_{n_0} = X_{n_0+1} + ⋯ + X_{n_1},
    S_{n_3} - S_{n_2} = X_{n_2+1} + ⋯ + X_{n_3}.

The independence of the X_n's implies that (S_{n_1} - S_{n_0}) and (S_{n_3} - S_{n_2}) are independent random variables. For n > n', S_n - S_{n'} is the sum of n - n' iid random variables, so it has the same distribution as S_{n-n'}:

    P[S_n - S_{n'} = y] = P[S_{n-n'} = y].

Thus, increments in intervals of the same length have the same distribution regardless of when the interval begins. We say that S_n has stationary increments.

Compute the joint pmf of S_n at times n_1, n_2, and n_3:

    P[S_{n_1} = y_1, S_{n_2} = y_2, S_{n_3} = y_3]
      = P[S_{n_1} = y_1, S_{n_2} - S_{n_1} = y_2 - y_1, S_{n_3} - S_{n_2} = y_3 - y_2]
      = P[S_{n_1} = y_1] P[S_{n_2} - S_{n_1} = y_2 - y_1] P[S_{n_3} - S_{n_2} = y_3 - y_2].

The stationary increments property implies that

    P[S_{n_1} = y_1, S_{n_2} = y_2, S_{n_3} = y_3] = P[S_{n_1} = y_1] P[S_{n_2-n_1} = y_2 - y_1] P[S_{n_3-n_2} = y_3 - y_2].

In general, we have

    P[S_{n_1} = y_1, S_{n_2} = y_2, ..., S_{n_k} = y_k]

      = P[S_{n_1} = y_1] P[S_{n_2-n_1} = y_2 - y_1] ⋯ P[S_{n_k-n_{k-1}} = y_k - y_{k-1}].

If the X_n are continuous-valued random variables, then

    f_{S_{n_1},...,S_{n_k}}(y_1, ..., y_k) = f_{S_{n_1}}(y_1) f_{S_{n_2-n_1}}(y_2 - y_1) ⋯ f_{S_{n_k-n_{k-1}}}(y_k - y_{k-1}).

Example: Find the joint pmf for the binomial counting process at times n_1 and n_2.

    P[S_{n_1} = y_1, S_{n_2} = y_2] = P[S_{n_1} = y_1] P[S_{n_2-n_1} = y_2 - y_1]
      = C(n_1, y_1) p^{y_1} (1 - p)^{n_1-y_1} · C(n_2 - n_1, y_2 - y_1) p^{y_2-y_1} (1 - p)^{n_2-n_1-y_2+y_1}
      = C(n_1, y_1) C(n_2 - n_1, y_2 - y_1) p^{y_2} (1 - p)^{n_2-y_2}.

The mean and variance of a sum process are

    m_S(n) = E[S_n] = n E[X] = nm,  VAR[S_n] = n VAR[X] = nσ².

The autocovariance of S_n is

    C_S(n, k) = E[(S_n - E[S_n])(S_k - E[S_k])] = E[(S_n - nm)(S_k - km)]
      = E[{Σ_{i=1}^n (X_i - m)}{Σ_{j=1}^k (X_j - m)}]
      = Σ_{i=1}^n Σ_{j=1}^k E[(X_i - m)(X_j - m)]

      = Σ_{i=1}^n Σ_{j=1}^k C_X(i, j)
      = Σ_{i=1}^n Σ_{j=1}^k σ² δ_{i,j}
      = Σ_{i=1}^{min(n,k)} C_X(i, i) = min(n, k) σ².

The property of independent increments allows us to compute the autocovariance in another way. Suppose n ≤ k, so n = min(n, k):

    C_S(n, k) = E[(S_n - nm)(S_k - km)]

      = E[(S_n - nm){(S_n - nm) + (S_k - km) - (S_n - nm)}]
      = E[(S_n - nm)²] + E[(S_n - nm)(S_k - S_n - (k - n)m)]
      = E[(S_n - nm)²] + E[S_n - nm] E[S_k - S_n - (k - n)m]
      = E[(S_n - nm)²] = VAR[S_n] = nσ².
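The identity C_S(n, k) = min(n, k) σ² can also be checked by simulation. A minimal sketch (my own, not from the notes; it uses zero-mean Gaussian increments so that the covariance reduces to E[S_n S_k]):

```python
import numpy as np

rng = np.random.default_rng(4)
n, k, trials = 20, 35, 200_000
sigma = 1.5

X = sigma * rng.standard_normal((trials, k))   # iid zero-mean increments
S = X.cumsum(axis=1)                           # S_1, ..., S_k for each realization
C_hat = np.mean(S[:, n - 1] * S[:, k - 1])     # zero mean: covariance = E[S_n S_k]
print(C_hat, min(n, k) * sigma**2)             # approximately min(n, k) * sigma^2
```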

First-Order Autoregressive Random Process

Moving Average Process

Examples of Continuous-Time Random Processes

Poisson Process

Events occur at random instants of time at an average rate of λ events per second. Let N(t) be the number of event occurrences in the time interval [0, t].

Divide [0, t] into n subintervals of duration δ = t/n. Assume that the following two conditions hold:

1. The probability of more than one event occurrence in a subinterval is negligible compared to the probability of observing one or zero events; each subinterval is a Bernoulli trial.

2. Whether or not an event occurs in a subinterval is independent of the outcomes in other subintervals; the Bernoulli trials are independent.

N(t) can then be approximated by the binomial counting process. Let p be the probability of an event occurrence in each subinterval. Then the expected number of event occurrences in the interval [0, t] is np.

The average number of events in the interval [0, t] is also λt; thus λt = np. Let n → ∞; then p → 0 while np = λt remains fixed, and the binomial distribution approaches the Poisson distribution with parameter λt:

    P[N(t) = k] = ((λt)^k / k!) e^{-λt}  for k = 0, 1, ....

N(t) is called the Poisson process.
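The binomial-to-Poisson limit can be seen numerically. A short sketch (my own, not from the notes; the values of λ, t, k are arbitrary choices) evaluating the binomial pmf for increasingly fine subdivisions of [0, t]:

```python
from math import comb, exp, factorial

lam, t, k = 2.0, 3.0, 4          # rate, interval length, and count of interest
for n in (10, 100, 10_000):      # finer and finer subdivisions of [0, t]
    p = lam * t / n              # per-subinterval success probability
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    print(n, binom)
print("Poisson:", (lam * t) ** k / factorial(k) * exp(-lam * t))
```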

We will show that if n is large and p is small, then for α = np,

    p_k = C(n, k) p^k (1 - p)^{n-k} ≈ (α^k / k!) e^{-α},  k = 0, 1, ....

Consider the probability that no events occur in n trials:

    p_0 = (1 - p)^n = (1 - α/n)^n → e^{-α}  as n → ∞.

Let q = 1 - p. Noting that

    p_{k+1} / p_k = [C(n, k+1) p^{k+1} q^{n-k-1}] / [C(n, k) p^k q^{n-k}]

      = (n - k)p / ((k + 1)q) = (1 - k/n)α / ((k + 1)(1 - α/n)) → α / (k + 1)  as n → ∞.

Thus

    p_{k+1} ≈ (α / (k + 1)) p_k  for k = 0, 1, 2, ...,  and  p_0 = e^{-α}.

A simple induction argument then shows that

    p_k = (α^k / k!) e^{-α}  for k = 0, 1, 2, ....

Both the mean and the variance of N(t) equal λt.

Independent and stationary increments:

    P[N(t_1) = i, N(t_2) = j] = P[N(t_1) = i] P[N(t_2) - N(t_1) = j - i]
      = P[N(t_1) = i] P[N(t_2 - t_1) = j - i]
      = ((λt_1)^i e^{-λt_1} / i!) · ((λ(t_2 - t_1))^{j-i} e^{-λ(t_2-t_1)} / (j - i)!).

Covariance of N(t): suppose t_1 ≤ t_2; then

    C_N(t_1, t_2) = E[(N(t_1) - λt_1)(N(t_2) - λt_2)]
      = E[(N(t_1) - λt_1){N(t_2) - N(t_1) - λt_2 + λt_1 + (N(t_1) - λt_1)}]
      = E[N(t_1) - λt_1] E[N(t_2) - N(t_1) - λ(t_2 - t_1)] + VAR[N(t_1)]
      = VAR[N(t_1)] = λt_1 = λ min(t_1, t_2).

Interevent Times

Consider the time T between event occurrences in a Poisson process. [0, t] is divided into n subintervals of length δ = t/n. The probability that T > t is

    P[T > t] = P[no events in t seconds] = (1 - p)^n = (1 - λt/n)^n → e^{-λt}  as n → ∞.

The cdf of T is then 1 - e^{-λt}; T is an exponential random variable with parameter λ. Since the times between event occurrences in the underlying binomial process are independent geometric random variables, it follows that the interevent times in a Poisson process form an iid sequence of exponential random variables with mean 1/λ.
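This suggests a standard way to simulate a Poisson process: generate iid Exp(λ) interevent times and accumulate them. A minimal sketch (my own, assuming NumPy; λ and t are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t = 2.0, 10.0

# One Poisson sample path on [0, t] built from iid Exp(lam) interevent times.
gaps = rng.exponential(1 / lam, size=int(5 * lam * t))  # more gaps than needed
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= t]
print(len(arrivals), lam * t)          # N(t) vs. its mean lambda * t
print(gaps.mean(), 1 / lam)            # interevent times have mean 1/lambda
```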

Individual Arrival Times

In applications where the Poisson process models customer interarrival times, it is customary to say that arrivals occur at random. Suppose we are given that only one arrival occurred in an interval [0, t], and let X be the arrival time of the single customer. For 0 < x < t, let N(x) be the number of events up to time x, and let N(t) - N(x) be the increment in the interval (x, t]. Then

    P[X ≤ x] = P[N(x) = 1 | N(t) = 1]

      = P[N(x) = 1 and N(t) = 1] / P[N(t) = 1]
      = P[N(x) = 1 and N(t) - N(x) = 0] / P[N(t) = 1]
      = P[N(x) = 1] P[N(t) - N(x) = 0] / P[N(t) = 1]
      = (λx e^{-λx} e^{-λ(t-x)}) / (λt e^{-λt}) = x/t.

It can be shown that if the number of arrivals in the interval [0, t] is k, then the individual arrival times are distributed independently and uniformly in the interval.
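A quick simulation check of this conditional uniformity (my own sketch, not from the notes; it builds the event {N(t) = 1} from the first two exponential interarrival times):

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t, trials = 0.5, 2.0, 500_000

T1 = rng.exponential(1 / lam, trials)            # first arrival time
T2 = rng.exponential(1 / lam, trials)            # second interarrival time
one_arrival = (T1 <= t) & (T1 + T2 > t)          # exactly one arrival in [0, t]
X = T1[one_arrival]                              # its arrival time

# Conditioned on N(t) = 1, X should be Uniform(0, t): mean t/2, variance t^2/12.
print(X.mean(), t / 2)
print(X.var(), t**2 / 12)
```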

Stationary Random Processes

We now consider random processes whose randomness does not change with time; that is, observations over (t_0, t_1) and over (t_0 + τ, t_1 + τ) have the same statistical behavior. A discrete-time or continuous-time random process X(t) is stationary if the joint distribution of any set of samples does not depend on the placement of the time origin. That is,

    F_{X(t_1),...,X(t_k)}(x_1, ..., x_k) = F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k)

for all time shifts τ, all k, and all choices of sample times t_1, ..., t_k.

Two processes X(t) and Y(t) are said to be jointly stationary if the joint cdfs of X(t_1), ..., X(t_k) and Y(t'_1), ..., Y(t'_j) do not depend on the placement of the time origin for all k and j and all choices of sampling times t_1, ..., t_k and t'_1, ..., t'_j.

The first-order cdf of a stationary random process must be independent of time, i.e.,

    F_{X(t)}(x) = F_{X(t+τ)}(x) = F_X(x)  for all t and τ;
    m_X(t) = E[X(t)] = m  for all t;
    VAR[X(t)] = E[(X(t) - m)²] = σ²  for all t.

The second-order cdf of a stationary random process

satisfies

    F_{X(t_1),X(t_2)}(x_1, x_2) = F_{X(0),X(t_2-t_1)}(x_1, x_2)  for all t_1, t_2;

it depends on t_1 and t_2 only through the difference t_2 - t_1.

The autocorrelation and autocovariance of a stationary random process X(t) depend only on t_2 - t_1:

    R_X(t_1, t_2) = R_X(t_2 - t_1)  for all t_1, t_2;
    C_X(t_1, t_2) = C_X(t_2 - t_1)  for all t_1, t_2.

Example: Is the sum process S_n a discrete-time stationary process? We have S_n = X_1 + X_2 + ⋯ + X_n, where X_i is an iid sequence. Since

    m_S(n) = nm  and  VAR[S_n] = nσ²,

the mean and variance of S_n are not constant in n. Thus, S_n cannot be a stationary process.

Wide-Sense Stationary Random Processes

A discrete-time or continuous-time random process X(t) is wide-sense stationary (WSS) if it satisfies

    m_X(t) = m  for all t,  and  C_X(t_1, t_2) = C_X(t_1 - t_2)  for all t_1 and t_2.

Two processes X(t) and Y(t) are said to be jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance depends only on t_1 - t_2. When X(t) is wide-sense stationary, we have

    C_X(t_1, t_2) = C_X(τ)  and  R_X(t_1, t_2) = R_X(τ),

where τ = t_1 - t_2. Stationary random process ⇒ wide-sense stationary process.

Assume that X(t) is a wide-sense stationary process.

The average power of X(t) is given by E[X(t)²] = R_X(0) for all t.

The autocorrelation function of X(t) is an even function, since

    R_X(τ) = E[X(t + τ)X(t)] = E[X(t)X(t + τ)] = R_X(-τ).

The autocorrelation function is a measure of the rate of change of a random process. Consider the change in the process from time t to t + τ:

    P[|X(t + τ) - X(t)| > ε] = P[(X(t + τ) - X(t))² > ε²]

      ≤ E[(X(t + τ) - X(t))²] / ε² = 2(R_X(0) - R_X(τ)) / ε²,

where we apply the Markov inequality to obtain the upper bound. If R_X(0) - R_X(τ) is small, then the probability of a large change in X(t) in τ seconds is small.

The autocorrelation function is maximum at τ = 0, since

    R_X(τ)² = E[X(t + τ)X(t)]² ≤ E[X²(t + τ)] E[X²(t)] = R_X(0)²,

where we use the fact that E[XY]² ≤ E[X²]E[Y²].

If R_X(0) = R_X(d), then R_X(τ) is periodic with period d and X(t) is mean square periodic, i.e., E[(X(t + d) - X(t))²] = 0. To see this, apply

    E[(X(t + τ + d) - X(t + τ))X(t)]² ≤ E[(X(t + τ + d) - X(t + τ))²] E[X²(t)],

which implies that

    {R_X(τ + d) - R_X(τ)}² ≤ 2{R_X(0) - R_X(d)} R_X(0).

Therefore, R_X(τ + d) = R_X(τ). The fact that X(t) is mean square periodic follows from

    E[(X(t + d) - X(t))²] = 2{R_X(0) - R_X(d)} = 0.

Let X(t) = m + N(t), where N(t) is a zero-mean process for which R_N(τ) → 0 as τ → ∞. Then

    R_X(τ) = E[(m + N(t + τ))(m + N(t))] = m² + 2m E[N(t)] + R_N(τ) = m² + R_N(τ) → m²  as τ → ∞.

In summary, the autocorrelation function can have three types of components: (1) a component that

approaches zero as τ → ∞; (2) a periodic component; and (3) a component due to a nonzero mean.

Wide-Sense Stationary Gaussian Random Processes

If a Gaussian random process is wide-sense stationary, then it is also stationary. This follows from the fact that the joint pdf of a Gaussian random process is completely determined by the mean m_X(t) and the autocovariance C_X(t_1, t_2).

Example: Let X_n be an iid sequence of Gaussian random variables with zero mean and variance σ², and let Y_n be

    Y_n = (X_n + X_{n-1}) / 2.

The mean of Y_n is zero since E[X_i] = 0 for all i. The covariance of Y_n is

    C_Y(i, j) = E[Y_i Y_j] = (1/4) E[(X_i + X_{i-1})(X_j + X_{j-1})]
      = (1/4){E[X_i X_j] + E[X_i X_{j-1}] + E[X_{i-1} X_j] + E[X_{i-1} X_{j-1}]}
      = (1/2)σ²  if i = j,
        (1/4)σ²  if |i - j| = 1,
        0        otherwise.

Y_n is a wide-sense stationary process since it has a constant mean and a covariance function that depends

only on i - j. Y_n is Gaussian since it is defined by a linear function of Gaussian random variables.
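A short simulation sketch (my own, assuming NumPy; the lag positions chosen are arbitrary) comparing the empirical covariance of Y_n at lags 0, 1, 2 with σ²/2, σ²/4, and 0:

```python
import numpy as np

rng = np.random.default_rng(7)
sigma, n, trials = 1.0, 6, 300_000

X = sigma * rng.standard_normal((trials, n + 1))   # X_0, ..., X_n (zero mean)
Y = 0.5 * (X[:, 1:] + X[:, :-1])                   # Y_n = (X_n + X_{n-1}) / 2

for lag, target in [(0, sigma**2 / 2), (1, sigma**2 / 4), (2, 0.0)]:
    C_hat = np.mean(Y[:, 3] * Y[:, 3 + lag])       # zero mean: covariance = E[Y_i Y_j]
    print(lag, C_hat, target)
```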

Cyclostationary Random Processes

A random process X(t) is said to be cyclostationary with period T if the joint cdfs of X(t_1), X(t_2), ..., X(t_k) and X(t_1 + mT), X(t_2 + mT), ..., X(t_k + mT) are the same for all k, m and all choices of t_1, ..., t_k:

    F_{X(t_1),X(t_2),...,X(t_k)}(x_1, x_2, ..., x_k) = F_{X(t_1+mT),X(t_2+mT),...,X(t_k+mT)}(x_1, x_2, ..., x_k).

X(t) is said to be wide-sense cyclostationary if

    m_X(t + mT) = m_X(t)  and  C_X(t_1 + mT, t_2 + mT) = C_X(t_1, t_2).

Example: Consider a random amplitude sinusoid with period T: X(t) = A cos(2πt/T). Is X(t) cyclostationary? Wide-sense cyclostationary? We have

    P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_k) ≤ x_k]
      = P[A cos(2πt_1/T) ≤ x_1, ..., A cos(2πt_k/T) ≤ x_k]
      = P[A cos(2π(t_1 + mT)/T) ≤ x_1, ..., A cos(2π(t_k + mT)/T) ≤ x_k]
      = P[X(t_1 + mT) ≤ x_1, X(t_2 + mT) ≤ x_2, ..., X(t_k + mT) ≤ x_k].

Thus, X(t) is a cyclostationary random process. Since the joint cdfs are invariant to shifts by mT, the mean and autocovariance satisfy the shift conditions as well, so X(t) is also wide-sense cyclostationary.

Time Averages of Random Processes and Ergodic Theorems

We consider the measurement of repeated random experiments and want to take the arithmetic average of the quantities of interest. To estimate the mean m_X(t) of a random process X(t, ζ), we have

    m̂_X(t) = (1/N) Σ_{i=1}^N X(t, ζ_i),

where N is the number of repetitions of the experiment. The time average of a single realization is given by

    ⟨X(t)⟩_T = (1/2T) ∫_{-T}^{T} X(t, ζ) dt.

An ergodic theorem states conditions under which a time average converges as the observation interval becomes large. We are interested in ergodic theorems that state when time averages converge to the ensemble average.

The strong law of large numbers,

    P[lim_{n→∞} (1/n) Σ_{i=1}^n X_i = m] = 1,

is one of the most important ergodic theorems, where X_n is an iid discrete-time random process with finite mean E[X_i] = m.

Example: Let X(t) = A for all t, where A is a zero-mean, unit-variance random variable. Find the limiting value of the time average.

The mean of the process is m_X(t) = E[X(t)] = E[A] = 0. The time average gives

    ⟨X(t)⟩_T = (1/2T) ∫_{-T}^{T} A dt = A.

The time average does not converge to m_X(t) = 0: stationary processes need not be ergodic.

Let X(t) be a WSS process. Then

    E[⟨X(t)⟩_T] = E[(1/2T) ∫_{-T}^{T} X(t) dt] = (1/2T) ∫_{-T}^{T} E[X(t)] dt = m.

Hence, ⟨X(t)⟩_T is an unbiased estimator for m. The variance of ⟨X(t)⟩_T is given by

    VAR[⟨X(t)⟩_T] = E[(⟨X(t)⟩_T - m)²]
      = E[(1/2T) ∫_{-T}^{T} (X(t) - m) dt · (1/2T) ∫_{-T}^{T} (X(t') - m) dt']
      = (1/4T²) ∫_{-T}^{T} ∫_{-T}^{T} E[(X(t) - m)(X(t') - m)] dt dt'

      = (1/4T²) ∫_{-T}^{T} ∫_{-T}^{T} C_X(t, t') dt dt'.

Since the process X(t) is WSS, we have

    VAR[⟨X(t)⟩_T] = (1/4T²) ∫_{-T}^{T} ∫_{-T}^{T} C_X(t - t') dt dt'
      = (1/4T²) ∫_{-2T}^{2T} (2T - |u|) C_X(u) du
      = (1/2T) ∫_{-2T}^{2T} (1 - |u|/2T) C_X(u) du.

⟨X(t)⟩_T will approach m in the mean square sense, that is, E[(⟨X(t)⟩_T - m)²] → 0, if VAR[⟨X(t)⟩_T] approaches zero.

Theorem: Let X(t) be a WSS process with m_X(t) = m; then lim_{T→∞} ⟨X(t)⟩_T = m in the mean square sense if and only if

    lim_{T→∞} (1/2T) ∫_{-2T}^{2T} (1 - |u|/2T) C_X(u) du = 0.

The time-average estimate for the autocorrelation function is given by

    ⟨X(t + τ)X(t)⟩_T = (1/2T) ∫_{-T}^{T} X(t + τ)X(t) dt.

E[⟨X(t + τ)X(t)⟩_T] = R_X(τ) if X(t) is a WSS random process. The time-average autocorrelation converges to R_X(τ) in the mean square sense if VAR[⟨X(t + τ)X(t)⟩_T] converges to zero.

If the random process is discrete-time, then

    ⟨X_n⟩_T = (1/(2T + 1)) Σ_{n=-T}^{T} X_n;
    ⟨X_{n+k}X_n⟩_T = (1/(2T + 1)) Σ_{n=-T}^{T} X_{n+k} X_n.

If X_n is a WSS random process, then E[⟨X_n⟩_T] = m. The variance of ⟨X_n⟩_T is given by

    VAR[⟨X_n⟩_T] = (1/(2T + 1)) Σ_{k=-2T}^{2T} (1 - |k|/(2T + 1)) C_X(k).

⟨X_n⟩_T approaches m in the mean square sense, and X_n is mean ergodic, if

    (1/(2T + 1)) Σ_{k=-2T}^{2T} (1 - |k|/(2T + 1)) C_X(k) → 0.
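As a concrete check (my own sketch, not from the notes), the variance formula above can be evaluated for the moving-average process Y_n = (X_n + X_{n-1})/2 from the earlier Gaussian example with σ = 1; the variance of the time average vanishes as T grows, so that process is mean ergodic:

```python
import numpy as np

def var_time_average(C, T):
    """Variance of <X_n>_T for a WSS process with autocovariance function C(k)."""
    k = np.arange(-2 * T, 2 * T + 1)
    return np.sum((1 - np.abs(k) / (2 * T + 1)) * C(k)) / (2 * T + 1)

def C_Y(k):
    # Autocovariance of Y_n = (X_n + X_{n-1})/2 with sigma = 1:
    # 1/2 at lag 0, 1/4 at lag +/-1, 0 otherwise.
    k = np.abs(k)
    return np.where(k == 0, 0.5, np.where(k == 1, 0.25, 0.0))

for T in (10, 100, 1000):
    print(T, var_time_average(C_Y, T))   # tends to 0: the process is mean ergodic
```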
