Chapter 4 Random processes
4.1 Random process
Random process (stochastic process)
An infinite set {X_t, t ∈ T} of random variables is called a random process when the index set T is infinite. In other words, a random vector with an infinite number of elements is a random process.

Discrete time process, continuous time process
A random process is said to be discrete time if the index set is countably infinite. When the index set is uncountable, the random process is called a continuous time random process.
Discrete (alphabet) process, continuous (alphabet) process
A random process is called a discrete alphabet, discrete amplitude, or discrete state process if all finite length random vectors drawn from the process are discrete random vectors. It is called a continuous alphabet, continuous amplitude, or continuous state process if all finite length random vectors drawn from it are continuous random vectors.

A random process {X(·)} maps each element ω of the sample space to a time function X(ω, t). Equivalently, {X(·)} is the collection of time functions X(t), called sample functions; this collection is called an ensemble. At each fixed t, the value X(t) is a random variable.
(Figure: a random process and its sample functions.)
Since a random process is a collection of random variables, with X(t) denoting the random variable at time t, the statistical characteristics of the process can be described via the cdf and pdf of X(t). For example, the first-order cdf, first-order pdf, second-order cdf, and nth-order cdf of the random process {X(t)} are
F_{X(t)}(x) = Pr{X(t) ≤ x},
f_{X(t)}(x) = dF_{X(t)}(x)/dx,
F_{X(t_1),X(t_2)}(x_1, x_2) = Pr{X(t_1) ≤ x_1, X(t_2) ≤ x_2},
F_{X(t_1),...,X(t_n)}(x_1, ..., x_n) = Pr{X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n},
respectively.
Mean function
The mean function m_X(t) of a random process {X(t)} is defined by
m_X(t) = E{X(t)} = ∫ x f_{X(t)}(x) dx.

Autocorrelation function
The autocorrelation function R_X(t_1, t_2) of a random process {X(t)} is defined by
R_X(t_1, t_2) = E{X(t_1) X*(t_2)}.
Known signal
An extreme example of a random process is a known, or deterministic, signal. When X(t) = s(t) is a known signal, we have
m(t) = E{s(t)} = s(t),
R(t_1, t_2) = E{s(t_1) s(t_2)} = s(t_1) s(t_2).

Consider the random process {X(t)} with mean E{X(t)} = 3 and autocorrelation function R(t_1, t_2) = 9 + 4 exp(−0.2|t_1 − t_2|). If Z = X(5) and W = X(8), we can easily obtain
E{Z} = E{X(5)} = 3, E{W} = 3,
E{Z^2} = R(5, 5) = 13, E{W^2} = R(8, 8) = 13,
Var{Z} = 13 − 3^2 = 4, Var{W} = 13 − 3^2 = 4,
and E{ZW} = R(5, 8) = 9 + 4e^{−0.6}. In other words, the random variables Z and W have variance σ^2 = 4 and covariance Cov(Z, W) = 4e^{−0.6}.
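The arithmetic in this example can be checked directly; a minimal sketch, where the functions m and R below simply encode the given mean and autocorrelation functions:

```python
import math

def m(t):
    return 3.0                                        # m_X(t) = 3

def R(t1, t2):
    # R_X(t1, t2) = 9 + 4*exp(-0.2*|t1 - t2|)
    return 9.0 + 4.0 * math.exp(-0.2 * abs(t1 - t2))

EZ, EW = m(5), m(8)                  # both 3
EZ2, EW2 = R(5, 5), R(8, 8)          # second moments, both 13
varZ = EZ2 - EZ ** 2                 # Var{Z} = 13 - 9 = 4
varW = EW2 - EW ** 2
cov_ZW = R(5, 8) - EZ * EW           # Cov(Z, W) = 4*exp(-0.6)
```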
The autocorrelation function of X(t) = A e^{jωt}, defined with a random variable A, is
R_X(t_1, t_2) = E{A e^{jωt_1} A* e^{−jωt_2}} = e^{jω(t_1 − t_2)} E{|A|^2}.

Autocovariance function
The autocovariance function K_X(t_1, t_2) of a random process {X(t)} is defined by
K_X(t_1, t_2) = E{[X(t_1) − m_X(t_1)][X(t_2) − m_X(t_2)]*}.
In general, the autocovariance and autocorrelation functions are functions of both t_1 and t_2. The autocovariance function can be expressed in terms of the autocorrelation and mean functions as
K_X(t_1, t_2) = R_X(t_1, t_2) − m_X(t_1) m_X*(t_2).
Uncorrelated random process
A random process {X_t} is said to be uncorrelated if R_X(t, s) = E{X_t} E{X_s*} or, equivalently, K_X(t, s) = 0 for t ≠ s. If a random process {X(t)} is uncorrelated, the autocorrelation and autocovariance functions are
R_X(t, s) = E{X_t X_s*} = { E{|X_t|^2},      t = s,
                            E{X_t} E{X_s*},  t ≠ s,
and
K_X(t, s) = { σ^2_{X_t},  t = s,
              0,          t ≠ s,
respectively.
Correlation coefficient function
The correlation coefficient function ρ_X(t_1, t_2) of a random process {X(t)} is defined by
ρ_X(t_1, t_2) = K_X(t_1, t_2) / √(K_X(t_1, t_1) K_X(t_2, t_2)) = K_X(t_1, t_2) / (σ(t_1) σ(t_2)),
where σ(t_i) is the standard deviation of X(t_i). We can show that |ρ_X(t_1, t_2)| ≤ 1 and ρ_X(t, t) = 1.
Crosscorrelation function
The crosscorrelation function R_XY(t_1, t_2) of random processes {X(t)} and {Y(t)} is defined by
R_XY(t_1, t_2) = E{X(t_1) Y*(t_2)}.
The autocorrelation and crosscorrelation functions satisfy
R_X(t, t) = E{X(t) X*(t)} = σ^2_X(t) + |E{X(t)}|^2 ≥ 0,
R_X(t_1, t_2) = R_X*(t_2, t_1),
R_XY(t_1, t_2) = R_YX*(t_2, t_1).
The autocorrelation function R_X(t_1, t_2) is positive semi-definite; that is,
Σ_i Σ_j a_i a_j* R_X(t_i, t_j) ≥ 0
for any constants {a_k}.
Crosscovariance function
The crosscovariance function K_XY(t_1, t_2) of random processes {X(t)} and {Y(t)} is defined by
K_XY(t_1, t_2) = E{[X(t_1) − m_X(t_1)][Y(t_2) − m_Y(t_2)]*} = R_XY(t_1, t_2) − m_X(t_1) m_Y*(t_2).

Two random processes which are uncorrelated
The random processes {X(t)} and {Y(t)} are said to be uncorrelated if R_XY(t_1, t_2) = E{X(t_1)} E{Y*(t_2)} or, equivalently, K_XY(t_1, t_2) = 0 for all t_1 and t_2.

Orthogonality
The two random processes {X(t)} and {Y(t)} are said to be orthogonal if R_XY(t_1, t_2) = 0 for all t_1 and t_2.
4.2 Properties of random processes
Stationary process and independent process
A random process is said to be stationary if its probabilistic properties do not change under time shifts.

Stationary process
A random process {X(t)} is stationary, strict-sense stationary (s.s.s.), or strongly stationary if the joint cdf of {X(t_1), X(t_2), ..., X(t_n)} is the same as the joint cdf of {X(t_1 + τ), X(t_2 + τ), ..., X(t_n + τ)} for all n, τ, t_1, t_2, ..., t_n.
Wide-sense stationary (w.s.s.) process
A random process {X(t)} is w.s.s., weakly stationary, or second-order stationary if (1) the mean function is constant and (2) the autocorrelation function R_X(t, s) depends only on t − s, not on t and s individually. The mean and autocorrelation functions of a w.s.s. process {X(t)} are thus
m_X(t) = m
and
R_X(t_1, t_2) = R(t_1 − t_2),
respectively. In other words, the autocorrelation function of a w.s.s. process is a function of τ = t_1 − t_2: for all t and τ,
R_X(τ) = E{X(t + τ) X*(t)}.
When τ = 0, R_X(0) = E{|X(t)|^2}.
Consider two sequences A_0, A_1, ..., A_m and B_0, B_1, ..., B_m of uncorrelated random variables with mean zero and variances σ_k^2, and assume that the two sequences are uncorrelated with each other. Let ω_0, ω_1, ..., ω_m be distinct frequencies in [0, 2π), and let
X_n = Σ_{k=0}^{m} {A_k cos nω_k + B_k sin nω_k}, n = 0, ±1, ±2, ....
Then we can obtain
E{X_n X_{n+l}} = Σ_{k=0}^{m} σ_k^2 cos lω_k,
E{X_n} = 0.
Thus, {X_n} is w.s.s.
Properties of the autocorrelation function R_X(τ) of a real stationary process {X(t)}
R_X(−τ) = R_X(τ): R_X(τ) is an even function.
|R_X(τ)| ≤ R_X(0): R_X(τ) is maximum at the origin.
If R_X(τ) is continuous at τ = 0, then it is continuous at every value of τ.
If there is a constant T > 0 such that R_X(0) = R_X(T), then R_X(τ) is periodic.
Independent random process
A random process is said to be independent if the joint cdf satisfies
F_{X_{t_1}, X_{t_2}, ..., X_{t_n}}(x_1, x_2, ..., x_n) = Π_{i=1}^{n} F_{X_{t_i}}(x_i)
for all n and t_1, t_2, ..., t_n, x_1, x_2, ..., x_n.

Independent and identically distributed (i.i.d.) process
A random process is said to be i.i.d. if the joint cdf satisfies
F_{X_{t_1}, X_{t_2}, ..., X_{t_n}}(x_1, x_2, ..., x_n) = Π_{i=1}^{n} F_X(x_i)
for all n and t_1, t_2, ..., t_n, x_1, x_2, ..., x_n.
The i.i.d. process is sometimes called a memoryless process or a white noise. The i.i.d. process is the simplest process, and yet it is the most stochastic process in the sense that past outputs carry no information about the future.
Bernoulli process
An i.i.d. random process with two possible values is called a Bernoulli process. For example, consider the random process {X_n} defined by
X_n = { 1, if the nth result is a head,
        0, if the nth result is a tail,
when we toss a coin infinitely many times. The random process {X_n} is a discrete-time, discrete-amplitude random process. The success ("head") and failure ("tail") probabilities are P{X_n = 1} = p and P{X_n = 0} = 1 − p, respectively.
We can easily obtain
m_X(n) = E{X_n} = p
and
K_X(n_1, n_2) = E{X_{n_1} X_{n_2}} − m_X(n_1) m_X(n_2) = { p(1 − p),  n_1 = n_2,
                                                           0,          n_1 ≠ n_2.
The mean function m_X(n) of a Bernoulli process is not a function of time but a constant. The autocovariance K_X(n_1, n_2) depends not on n_1 and n_2 individually, but only on the difference n_1 − n_2. Clearly, the Bernoulli process is w.s.s.
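These moments are easy to confirm by Monte Carlo; a small sketch, where p = 0.3 and the number of trials are illustrative choices, not values from the text:

```python
import random

random.seed(0)
p, trials = 0.3, 200000

# Draw many independent (X_1, X_2) pairs from a Bernoulli(p) process.
pairs = [(1 if random.random() < p else 0,
          1 if random.random() < p else 0) for _ in range(trials)]

m0 = sum(a for a, _ in pairs) / trials                    # estimate of E{X_n} = p
m1 = sum(b for _, b in pairs) / trials
var0 = sum(a * a for a, _ in pairs) / trials - m0 ** 2    # ~ p(1 - p) at n1 = n2
cov01 = sum(a * b for a, b in pairs) / trials - m0 * m1   # ~ 0 at n1 != n2
```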
Two random processes independent of each other
The random processes {X(t)} and {Y(t)} are said to be independent of each other if the random vector (X_{t_1}, X_{t_2}, ..., X_{t_k}) is independent of the random vector (Y_{s_1}, Y_{s_2}, ..., Y_{s_l}) for all k, l, and t_1, t_2, ..., t_k, s_1, s_2, ..., s_l.
Normal process, Gaussian process
A random process {X_t} is said to be normal if (X_{t_1}, X_{t_2}, ..., X_{t_k}) is a k-dimensional normal random vector for all k and t_1, t_2, ..., t_k.

A stationary process is always w.s.s., but the converse is not always true. On the other hand, a w.s.s. normal process is s.s.s. This result can be obtained from the pdf
f_X(x) = (2π)^{−n/2} |K_X|^{−1/2} exp{ −(1/2)(x − m_X)^T K_X^{−1} (x − m_X) }
of a jointly normal random vector.
Jointly w.s.s. processes
Two random processes are said to be jointly w.s.s. if (1) their mean functions are constants and (2) the autocorrelation functions and the crosscorrelation function are all functions of time differences only. If two random processes {X(t)} and {Y(t)} are jointly w.s.s., then {X(t)} and {Y(t)} are both w.s.s. The crosscorrelation function of {X(t)} and {Y(t)} is
R_XY(t + τ, t) = R_XY(τ) = E{X(t + τ) Y*(t)} = (E{Y(t) X*(t + τ)})* = R_YX*(−τ).
The crosscorrelation function R_XY of two jointly w.s.s. random processes has the following properties:
1. R_YX(τ) = R_XY*(−τ).
2. |R_XY(τ)| ≤ √(R_XX(0) R_YY(0)) ≤ (1/2){R_XX(0) + R_YY(0)}.
3. R_XY(τ) is not always maximum at the origin.

Linear transformation and jointly w.s.s. processes
Two processes {X(t)} and {Y(t)} are jointly w.s.s. if the linear combination Z(t) = aX(t) + bY(t) is w.s.s. for all a and b. The converse is also true.
Moving average (MA) process
Let a_1, a_2, ..., a_l be a sequence of real numbers and W_0, W_1, W_2, ... be a sequence of uncorrelated random variables with mean E{W_n} = m and variance Var{W_n} = σ^2. Then the following process {X_n} is called a moving average process:
X_n = a_1 W_n + a_2 W_{n−1} + ... + a_l W_{n−l+1} = Σ_{i=1}^{l} a_i W_{n−i+1}.
The mean and variance of X_n are
E{X_n} = (a_1 + a_2 + ... + a_l) m,
Var{X_n} = (a_1^2 + a_2^2 + ... + a_l^2) σ^2.
Since E{X̂_n^2} = σ^2 when X̂_n = W_n − m, {X_n} is w.s.s. from
Cov(X_n, X_{n+k}) = E{ (X_n − m Σ_{i=1}^{l} a_i)(X_{n+k} − m Σ_{i=1}^{l} a_i) }
 = E{ (Σ_{i=1}^{l} a_i X̂_{n−i+1})(Σ_{j=1}^{l} a_j X̂_{n+k−j+1}) }
 = { E{ a_l a_{l−k} X̂^2_{n+k−l+1} + a_{l−1} a_{l−k−1} X̂^2_{n+k−l+2} + ... + a_{k+1} a_1 X̂^2_n },  k ≤ l − 1,
     0,  k ≥ l,
 = { (a_l a_{l−k} + ... + a_{k+1} a_1) σ^2,  k ≤ l − 1,
     0,  k ≥ l.
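The piecewise covariance formula can be sanity-checked by simulating an MA process; the weights a and the zero-mean, unit-variance Gaussian inputs below are illustrative assumptions (m = 0, σ² = 1):

```python
import random

random.seed(1)
a = [0.5, 0.3, 0.2]          # illustrative MA weights (l = 3)
l, sigma2 = len(a), 1.0

def cov_theory(k):
    # Cov(X_n, X_{n+k}) = sigma^2 * (a_1 a_{1+k} + ... + a_{l-k} a_l) for k <= l-1, else 0
    if k >= l:
        return 0.0
    return sigma2 * sum(a[j] * a[j + k] for j in range(l - k))

# Monte Carlo estimate from a simulated MA path
N = 200000
W = [random.gauss(0.0, 1.0) for _ in range(N + l)]
X = [sum(a[i] * W[n - i] for i in range(l)) for n in range(l, N + l)]
mX = sum(X) / len(X)

def cov_hat(k):
    return sum((X[n] - mX) * (X[n + k] - mX)
               for n in range(len(X) - k)) / (len(X) - k)
```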
Autoregressive (AR) process
Let the variance of an uncorrelated zero-mean random sequence Z_0, Z_1, ... be
Var{Z_n} = { σ^2 / (1 − λ^2),  n = 0,
             σ^2,              n ≥ 1,
where λ^2 < 1. Then the random process {X_n} defined by
X_0 = Z_0,
X_n = λ X_{n−1} + Z_n, n ≥ 1,
is called the first order autoregressive process. We can obtain
X_n = λ(λ X_{n−2} + Z_{n−1}) + Z_n = λ^2 X_{n−2} + λ Z_{n−1} + Z_n = ... = Σ_{i=0}^{n} λ^{n−i} Z_i.
Thus the autocovariance function of {X_n} is
Cov(X_n, X_{n+m}) = Cov( Σ_{i=0}^{n} λ^{n−i} Z_i, Σ_{i=0}^{n+m} λ^{n+m−i} Z_i )
 = Σ_{i=0}^{n} λ^{n−i} λ^{n+m−i} Cov(Z_i, Z_i)
 = σ^2 λ^{2n+m} ( 1/(1 − λ^2) + Σ_{i=1}^{n} λ^{−2i} )
 = σ^2 λ^m / (1 − λ^2).
Now, from the result above and the fact that the mean of {X_n} is E{X_n} = 0, it follows that {X_n, n ≥ 0} is w.s.s.
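A quick simulation check of Cov(X_n, X_{n+m}) = σ²λ^m/(1 − λ²); the values of λ and σ and the Gaussian innovations are illustrative assumptions:

```python
import random

random.seed(2)
lam, sigma = 0.6, 1.0
var_stat = sigma ** 2 / (1 - lam ** 2)   # stationary variance sigma^2 / (1 - lambda^2)

# X_0 = Z_0 with Var{Z_0} = sigma^2/(1-lambda^2); X_n = lam*X_{n-1} + Z_n otherwise.
N = 400000
x = random.gauss(0.0, var_stat ** 0.5)
xs = [x]
for _ in range(N - 1):
    x = lam * x + random.gauss(0.0, sigma)
    xs.append(x)

mean_hat = sum(xs) / N
cov1_hat = sum(xs[n] * xs[n + 1] for n in range(N - 1)) / (N - 1)
cov1_theory = sigma ** 2 * lam / (1 - lam ** 2)   # m = 1 case of the formula
```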
Square-law detector
Let Y(t) = X^2(t), where {X(t)} is a Gaussian random process with mean 0 and autocorrelation R_X(τ). Then the expectation of Y(t) is
E{Y(t)} = E{X^2(t)} = R_X(0).
Since X(t + τ) and X(t) are jointly Gaussian with mean 0, the autocorrelation of Y(t) can be found as
R_Y(t, t + τ) = E{X^2(t) X^2(t + τ)} = E{X^2(t + τ)} E{X^2(t)} + 2 E^2{X(t + τ) X(t)} = R_X^2(0) + 2 R_X^2(τ).
Thus E{Y^2(t)} = R_Y(0) = 3 R_X^2(0) and σ_Y^2 = 3 R_X^2(0) − R_X^2(0) = 2 R_X^2(0). Clearly, {Y(t)} is w.s.s.
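The fourth-moment identity E{X²(t)X²(t+τ)} = R_X²(0) + 2R_X²(τ) can be verified for a single jointly Gaussian pair; here r plays the role of R_X(τ) with R_X(0) = 1, and both values are illustrative:

```python
import random

random.seed(3)
r, N = 0.5, 400000   # r = R_X(tau)/R_X(0), with R_X(0) = 1

acc = 0.0
for _ in range(N):
    u = random.gauss(0.0, 1.0)
    # v is standard normal with corr(u, v) = r
    v = r * u + (1 - r * r) ** 0.5 * random.gauss(0.0, 1.0)
    acc += (u * u) * (v * v)

moment_hat = acc / N
moment_theory = 1.0 + 2.0 * r * r   # R_X(0)^2 + 2 R_X(tau)^2
```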
Limiter
Let {Y(t)} = {g(X(t))} be a random process which is defined by a random process {X(t)} and a limiter
g(x) = { 1,   x > 0,
         −1,  x < 0.
Then we can easily obtain P{Y(t) = 1} = P{X(t) > 0} = 1 − F_X(0) and P{Y(t) = −1} = P{X(t) < 0} = F_X(0). Thus the mean and autocorrelation of {Y(t)} are
E{Y(t)} = 1·P{Y(t) = 1} + (−1)·P{Y(t) = −1} = 1 − 2F_X(0),
R_Y(τ) = E{Y(t) Y(t + τ)} = P{Y(t) Y(t + τ) = 1} − P{Y(t) Y(t + τ) = −1}
       = P{X(t) X(t + τ) > 0} − P{X(t) X(t + τ) < 0}.
Now, if {X(t)} is a stationary Gaussian random process with mean 0, then X(t + τ) and X(t) are jointly Gaussian with mean 0, variance R_X(0), and correlation coefficient R_X(τ)/R_X(0). Clearly, F_X(0) = 1/2.
We have (refer to (3.100), p. 156, Random Processes, Park, Song, Nam, 2004)
P{X(t) X(t + τ) < 0} = P{X(t)/X(t + τ) < 0} = F_Z(0)
 = 1/2 − (1/π) tan^{−1}( r / √(1 − r^2) )
 = 1/2 − (1/π) sin^{−1} r
 = 1/2 − (1/π) sin^{−1}( R_X(τ)/R_X(0) ),
P{X(t) X(t + τ) > 0} = 1 − P{X(t) X(t + τ) < 0} = 1/2 + (1/π) sin^{−1}( R_X(τ)/R_X(0) ).
Thus the autocorrelation of the limiter output is
R_Y(τ) = (2/π) sin^{−1}( R_X(τ)/R_X(0) ),
from which we have E{Y^2(t)} = R_Y(0) = 1 and σ_Y^2 = 1 − {1 − 2F_X(0)}^2 = 1.
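This arcsine-law result R_Y(τ) = (2/π) sin⁻¹(R_X(τ)/R_X(0)) can be checked by simulating one jointly Gaussian pair and hard-limiting it; r below is an illustrative normalized correlation:

```python
import math
import random

random.seed(4)
r, N = 0.7, 400000   # r = R_X(tau)/R_X(0), illustrative

acc = 0
for _ in range(N):
    u = random.gauss(0.0, 1.0)
    v = r * u + (1 - r * r) ** 0.5 * random.gauss(0.0, 1.0)  # corr(u, v) = r
    # hard limiter g(x) = sign(x)
    acc += (1 if u > 0 else -1) * (1 if v > 0 else -1)

ry_hat = acc / N
ry_theory = (2.0 / math.pi) * math.asin(r)   # arcsine law for the limiter output
```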
Power of a random process

Power spectrum
The Fourier transform of the autocorrelation function of a w.s.s. random process is called the power spectrum, power spectral density, or spectral density. It is usually assumed that the mean is zero. When the autocorrelation is R_X(τ), the power spectrum is
S_X(ω) = F{R_X} = Σ_{k=−∞}^{∞} R_X(k) e^{−jωk}  (discrete time),
S_X(ω) = ∫ R_X(τ) e^{−jωτ} dτ  (continuous time).
If the mean is not zero, the power spectral density is defined by the Fourier transform of the autocovariance instead of the autocorrelation.
White noise, white process
Suppose that a discrete time random process {X_n} is uncorrelated, so that R_X(k) = σ^2 δ_k. Then we have
S_X(ω) = Σ_k σ^2 δ_k e^{−jωk} = σ^2.
Such a process is called a white noise or white process. If {X_n} is in addition Gaussian, it is called a white Gaussian noise.
Telegraph signal
Consider Poisson points with parameter λ, and let N(t) be the number of points in the interval (0, t]. As shown in the accompanying figure, consider the continuous-time random process X(t) = (−1)^{N(t)} with X(0) = 1, where W_i is the time between adjacent Poisson points. Assuming τ > 0, the autocorrelation of X(t) is
R_X(τ) = E{X(t + τ) X(t)}
 = 1·P{the number of points in (t, t + τ] is even} + (−1)·P{the number of points in (t, t + τ] is odd}
 = e^{−λτ} { 1 + (λτ)^2/2! + ... } − e^{−λτ} { λτ + (λτ)^3/3! + ... }
 = e^{−λτ} cosh λτ − e^{−λτ} sinh λτ = e^{−2λτ}.
We can obtain a similar result when τ < 0. Combining the two results, we have
R_X(τ) = e^{−2λ|τ|}.
Consider a random variable A which is independent of X(t) and takes on 1 or −1 with equal probability, and let Y(t) = A X(t). We then have
E{Y(t)} = E{A} E{X(t)} = 0,
E{Y(t_1) Y(t_2)} = E{A^2} E{X(t_1) X(t_2)} = E{A^2} R_X(t_1 − t_2) = e^{−2λ|t_1 − t_2|},
since E{A} = 0 and E{A^2} = 1. Thus {Y(t)} is w.s.s., and the power spectral density of {Y(t)} is
S_Y(ω) = F{e^{−2λ|τ|}} = 4λ / (ω^2 + 4λ^2).
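The Fourier pair e^{−2λ|τ|} ↔ 4λ/(ω² + 4λ²) can be confirmed numerically at a single frequency; λ, ω, and the integration grid below are arbitrary illustrative choices:

```python
import math

lam, omega = 1.5, 2.0

# Integrate R_Y(tau) = exp(-2*lam*|tau|) against cos(omega*tau) on [-T, T];
# the imaginary part of the transform vanishes because R_Y is even.
T, n = 20.0, 100000
dt = 2 * T / n
s = sum(math.exp(-2 * lam * abs(-T + k * dt)) * math.cos(omega * (-T + k * dt))
        for k in range(n)) * dt

s_theory = 4 * lam / (omega ** 2 + 4 * lam ** 2)
```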
Band limited noise, colored noise
When W > 0, let us consider a random process with the power spectral density
S_X(ω) = { 1,  ω ∈ [−W, W],
           0,  otherwise.
Such a process is called a colored noise. The autocorrelation of a colored noise is thus
R_X(τ) = F^{−1}{S_X(ω)} = sin(Wτ) / (πτ).
The power spectral density is nonnegative: S_X(ω) ≥ 0.

Cross power spectral density
The cross power spectral density S_XY(ω) of jointly w.s.s. processes {X(t)} and {Y(t)} is
S_XY(ω) = ∫ R_XY(τ) e^{−jωτ} dτ.
Thus S_XY(ω) = S_YX*(ω), and the inverse Fourier transform of S_XY(ω) is
R_XY(τ) = (1/2π) ∫ S_XY(ω) e^{jωτ} dω.
Time delay process
Consider a w.s.s. process {X(t)} with power spectral density S_X(ω). Letting Y(t) = X(t − d), we have
R_Y(t, s) = E{Y(t) Y*(s)} = E{X(t − d) X*(s − d)} = R_X(t − s).
Thus the process {Y(t)} is w.s.s. and its power spectral density S_Y(ω) equals S_X(ω). In addition, the crosscorrelation and cross power spectral density of {X(t)} and {Y(t)} are
R_XY(t, s) = E{X(t) Y*(s)} = E{X(t) X*(s − d)} = R_X(t − s + d) = R_X(τ + d)
and
S_XY(ω) = F{R_X(τ + d)} = ∫ R_X(τ + d) e^{−jωτ} dτ = ∫ R_X(u) e^{−jωu} e^{jωd} du = S_X(ω) e^{jωd}.
That is, {X(t)} and {Y(t)} are jointly w.s.s.
Random process and linear systems
If the input random process is two-sided and w.s.s., the output of a linear time invariant (LTI) filter is also w.s.s. If the input random process is one-sided and w.s.s., however, the output of an LTI filter is not w.s.s. in general.
Let h(t) be the impulse response of an LTI system and let H(ω) = F{h(t)} be the transfer function. Then the crosscorrelation R_XY(t_1, t_2) of the input random process {X(t)} and output random process {Y(t)}, and the autocorrelation R_Y(t_1, t_2) of the output, are
R_XY(t_1, t_2) = E{X(t_1) Y*(t_2)} = E{ X(t_1) ∫ X*(t_2 − α) h*(α) dα }
 = ∫ E{X(t_1) X*(t_2 − α)} h*(α) dα = ∫ R_X(t_1, t_2 − α) h*(α) dα
and
R_Y(t_1, t_2) = E{Y(t_1) Y*(t_2)} = E{ ∫ X(t_1 − α) h(α) dα ∫ X*(t_2 − β) h*(β) dβ }
 = ∫∫ R_X(t_1 − α, t_2 − β) h(α) h*(β) dα dβ = ∫ R_XY(t_1 − α, t_2) h(α) dα,
respectively.
If the input and output are jointly w.s.s., we can obtain
R_XY(τ) = ∫ R_X(τ + α) h*(α) dα = R_X(τ) * h*(−τ),
R_Y(τ) = R_XY(τ) * h(τ),
since R_X(t_1, t_2 − α) = R_X(t_1 − t_2 + α) = R_X(τ + α) and R_XY(t_1 − α, t_2) = R_XY(t_1 − t_2 − α) = R_XY(τ − α). The cross power spectral density and power spectral density of the output are
S_XY(ω) = S_X(ω) H*(ω),
S_Y(ω) = S_XY(ω) H(ω),
respectively.
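For a white input with σ² = 1 (so S_X(ω) = 1), the relation S_Y(ω) = |H(ω)|² S_X(ω) reduces, at each frequency, to the identity |H(ω)|² = Σ_k ρ_h(k) e^{−jωk}. A deterministic single-frequency check, with an illustrative FIR impulse response h (an assumption, not from the text):

```python
import cmath

h = [1.0, 0.5, 0.25]   # illustrative FIR impulse response
omega = 0.8

# Transfer function H(w) = sum_n h[n] e^{-jwn}
H = sum(h[n] * cmath.exp(-1j * omega * n) for n in range(len(h)))

def rho_h(k):
    # deterministic autocorrelation rho_h(k) = sum_i h[i] h[i + |k|]
    k = abs(k)
    return sum(h[i] * h[i + k] for i in range(len(h) - k))

# S_Y(w) computed from rho_h should equal |H(w)|^2 (white input, sigma^2 = 1)
S_Y = sum(rho_h(k) * cmath.exp(-1j * omega * k)
          for k in range(-(len(h) - 1), len(h))).real
```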
We can express the autocorrelation and power spectral density of the output in terms of those of the input. Specifically, we have
R_Y(τ) = R_X(τ) * ρ_h(τ)
and
S_Y(ω) = S_X(ω) |H(ω)|^2,
where ρ_h(t) is called the deterministic autocorrelation of h(t) and is defined by
ρ_h(t) = F^{−1}( |H(ω)|^2 ) = h(t) * h*(−t) = ∫ h(t + τ) h*(τ) dτ.
Let S_Y(ω) be the power spectral density of the output process {Y_t}. Then we can obtain
R_Y(τ) = F^{−1}{S_Y(ω)} = F^{−1}{ |H(ω)|^2 S_X(ω) }.
Coherence function
A measure of the degree to which two w.s.s. processes are related by an LTI transformation is the coherence function ρ_XY(ω) defined by
ρ_XY(ω) = S_XY(ω) / [S_X(ω) S_Y(ω)]^{1/2}.
Here, |ρ_XY(ω)| = 1 if and only if {X(t)} and {Y(t)} are linearly related, that is, Y(t) = X(t) * h(t). Note the similarity between the coherence function ρ_XY(ω) and the correlation coefficient ρ, a measure of the degree to which two random variables are linearly related. The coherence function exhibits the property |ρ_XY(ω)| ≤ 1, similar to the correlation coefficient between two random variables.
Ergodic theorem*
A random process {X_n, n ∈ I} (or {X(t)}) is said to satisfy an ergodic theorem if there exists a random variable X̂ such that
(1/n) Σ_{i=0}^{n−1} X_i → X̂ as n → ∞  (discrete time random process),
(1/T) ∫_0^T X(t) dt → X̂ as T → ∞  (continuous time random process).
When a process satisfies an ergodic theorem, the sample mean converges to something, which may be different from the expectation. In some cases, a random process with time-varying mean satisfies an ergodic theorem, as shown in the example below.
Suppose that nature at the beginning of time randomly selects one of two coins with equal probability, one having bias p and the other having bias q. After the coin is selected, it is flipped once per second forever. The output random process is a one-zero sequence depending on the face of the coin. Clearly, the time average will converge: it will converge to p if the first coin was selected and to q if the second coin was selected. That is, the time average will converge to a random variable. In particular, it will not converge to the expected value p/2 + q/2.

If lim_{n→∞} E{(Y_n − Y)^2} = 0, the sequence Y_n, n = 1, 2, ..., is said to converge to Y in mean square, which is denoted as
l.i.m._{n→∞} Y_n = Y,
where l.i.m. denotes limit in the mean.
Mean ergodic theorem
Let {X_n} be an uncorrelated discrete time random process with finite mean E{X_n} = m and finite variance σ^2_{X_n} = σ_X^2. Then the sample mean S_n = (1/n) Σ_{i=0}^{n−1} X_i converges to the expected value E{X_n} = m in mean square. That is,
l.i.m._{n→∞} (1/n) Σ_{i=0}^{n−1} X_i = m.
A sufficient condition for a w.s.s. discrete time random process {X_n, n ∈ I} to satisfy a mean ergodic theorem is K_X(0) < ∞ and lim_{n→∞} K_X(n) = 0.
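A small illustration of the mean ergodic theorem with an i.i.d. (hence uncorrelated) Gaussian sequence; the values of m, σ, and n are arbitrary choices:

```python
import random

random.seed(6)
m_true, sigma, n = 2.0, 1.0, 100000

# One long sample path of an uncorrelated process with mean m_true
xs = [random.gauss(m_true, sigma) for _ in range(n)]
sample_mean = sum(xs) / n

# For an uncorrelated process, E{(S_n - m)^2} = sigma^2 / n -> 0 as n grows
mse_bound = sigma ** 2 / n
```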
Let {X_n} be a discrete time random process with mean E{X_n} and autocovariance function K_X(i, j); the process need not be stationary in any sense. A necessary and sufficient condition for
l.i.m._{n→∞} (1/n) Σ_{i=0}^{n−1} X_i = m
is
lim_{n→∞} (1/n) Σ_{i=0}^{n−1} E{X_i} = m
and
lim_{n→∞} (1/n^2) Σ_{i=0}^{n−1} Σ_{k=0}^{n−1} K_X(i, k) = 0.
That is, the sample mean converges in mean square if and only if the process is asymptotically uncorrelated and its sample averages converge.
Mean square ergodic theorem
Let X̄_n = (1/n) Σ_{i=1}^{n} X_i, where {X_n, n ≥ 1} is a second order stationary process with mean m and autocovariance K(i) = Cov(X_n, X_{n+i}). Then
lim_{n→∞} E{(X̄_n − m)^2} = 0
if and only if lim_{n→∞} (1/n) Σ_{i=0}^{n−1} K(i) = 0.

Let K(i) be the autocovariance of {X_n}, a second order stationary Gaussian process with mean 0. If
lim_{T→∞} (1/T) Σ_{i=1}^{T} K^2(i) = 0,
then
lim_{T→∞} E{ |K̂_T(i) − K(i)|^2 } = 0
for i = 1, 2, ..., where K̂_T(i) = (1/T) Σ_{l=1}^{T} X_l X_{l+i} is the sample autocovariance.
Ergodicity*

Shift operator
An operator T for which Tω = {x_{t+1}, t ∈ I}, where ω = {x_t, t ∈ I} is an infinite sequence in the probability space (R^I, B(R)^I, P), is called a shift operator.

Stationary process
A discrete time random process with process distribution P is stationary if P(T^{−1}F) = P(F) for any element F in B(R)^I.
Invariant event
An event F is said to be invariant with respect to the shift operator T if and only if T^{−1}F = F.

Ergodicity, ergodic process
A random process is said to be ergodic if P(F) = 0 or P(F) = 1 for any invariant event F.

Consider a two-sided process with distribution
P(..., x_{−1} = 1, x_0 = 0, x_1 = 1, x_2 = 0, ...) = p,
P(..., x_{−1} = 0, x_0 = 1, x_1 = 0, x_2 = 1, ...) = 1 − p.
Clearly, F = {sequences of alternating 0 and 1} is an invariant event, and it has probability P(F) = 1. Any other invariant event that does not include F (for example, the all-1 sequence) has probability 0. Thus the random process is ergodic. Ergodicity has nothing to do with stationarity or convergence of sample averages.
4.3 Process with i.s.i.
Process with independent increments
A random process {Y_t, t ∈ I} is said to have independent increments, or to be an independent increment process, if for all choices of k = 1, 2, ... and all choices of ordered sample times {t_0, t_1, ..., t_k}, the k increments Y_{t_i} − Y_{t_{i−1}}, i = 1, 2, ..., k, are independent random variables.

Process with stationary increments
When the increments {Y_t − Y_s} are stationary, the random process {Y_t} is called a stationary increment random process.

Process with i.s.i.
A random process is called an independent and stationary increment (i.s.i.) process if its increments are independent and stationary.
A discrete time random process is an i.s.i. process if and only if it can be represented as the sum of i.i.d. random variables.

Mean and autocovariance of an i.s.i. process
The mean and autocovariance of a discrete time i.s.i. process are
E{Y_t} = t E{Y_1}, t ≥ 0,
and
K_Y(t, s) = σ^2_{Y_1} min(t, s), t, s ≥ 0.
Clearly, an i.s.i. process itself is not stationary.
Let the process {X_t, t ∈ T} be an i.s.i. process. If we define
m_0 = E{X_0}, m_1 = E{X_1} − m_0,
σ_0^2 = E{(X_0 − m_0)^2}, σ_1^2 = E{(X_1 − E{X_1})^2} − σ_0^2,
then
E{X_t} = m_0 + m_1 t,
Var{X_t} = σ_0^2 + σ_1^2 t.
Point process and counting process
A sequence T_1 ≤ T_2 ≤ T_3 ≤ ... of ordered random variables is called a point process. For example, the set of times defined by Poisson points is a point process. A counting process Y(t) can be defined as the number of points in the interval [0, t). With T_0 = 0, we have
Y(t) = i, T_i ≤ t < T_{i+1}, i = 0, 1, ....
A counting process constructed by summing the outputs of a Bernoulli process
Let {X_n, n = 1, 2, ...} be a Bernoulli process with parameter p. Define the random process {Y_n, n = 1, 2, ...} by
Y_0 = 0,
Y_n = Σ_{i=1}^{n} X_i = Y_{n−1} + X_n, n = 1, 2, ....
Since the random variable Y_n represents the number of 1's in {X_1, X_2, ..., X_n}, we have Y_n = Y_{n−1} or Y_n = Y_{n−1} + 1 for n = 2, 3, .... In general, a discrete time process satisfying this relation is called a counting process, since it is nondecreasing and, when it jumps, it is always with an increment of 1.
Properties of the random process {Y_n}
E{Y_n} = np, Var{Y_n} = np(1 − p), K_Y(k, j) = p(1 − p) min(k, j).

Marginal pmf of Y_n
p_{Y_n}(y) = Pr{there are exactly y ones in X_1, X_2, ..., X_n} = C(n, y) p^y (1 − p)^{n−y}, y = 0, 1, ..., n.
Since the marginal pmf is binomial, the process {Y_n} is called a binomial counting process. The process {Y_n} is not stationary since the marginal pmf depends on the time index n.
Random walk process

One dimensional random walk, random walk
Consider the modified Bernoulli process for which the event failure is represented by −1 instead of 0:
Z_n = { +1, for success in the nth trial,
        −1, for failure in the nth trial.
Let p = Pr{Z_n = 1}, and consider the sum
W_n = Σ_{i=1}^{n} Z_i
of the variables Z_n. The process {W_n} is referred to as a one dimensional random walk, or simply a random walk.
Since Z_i = 2X_i − 1, it follows that the random walk process {W_n} is related to the binomial counting process {Y_n} by W_n = 2Y_n − n. Using the results on the mean and autocovariance functions of the binomial counting process and the linearity of expectation, we have
m_W(n) = (2p − 1) n,
K_W(n_1, n_2) = 4p(1 − p) min(n_1, n_2),
σ^2_{W_n} = 4p(1 − p) n.
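The mean and variance formulas for the random walk can be checked by simulation; the values of p, n, and the number of trials are illustrative:

```python
import random

random.seed(7)
p, n, trials = 0.6, 100, 20000

# Simulate many independent n-step walks W_n = sum of +/-1 steps
finals = []
for _ in range(trials):
    w = sum(1 if random.random() < p else -1 for _ in range(n))
    finals.append(w)

mean_hat = sum(finals) / trials
var_hat = sum((w - mean_hat) ** 2 for w in finals) / trials
mean_theory = (2 * p - 1) * n        # m_W(n) = (2p - 1) n = 20
var_theory = 4 * p * (1 - p) * n     # sigma^2_{W_n} = 4p(1-p) n = 96
```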
4.4 Discrete time process with i.s.i.
Discrete time discrete alphabet process {Y_n} with i.s.i.
As mentioned before, {Y_n} can be defined as the sum of i.i.d. random variables {X_i}. Let us consider the following conditional pmf:
p_{Y_n | Y^{n−1}}(y_n | y^{n−1}) = p_{Y_n | Y_{n−1}, ..., Y_1}(y_n | y_{n−1}, ..., y_1) = Pr(Y_n = y_n | Y^{n−1} = y^{n−1}),
where Y^n = (Y_n, Y_{n−1}, ..., Y_1) and y^n = (y_n, y_{n−1}, ..., y_1). The conditioning event {Y_i = y_i, i = 1, 2, ..., n−1} above is the same as the event {X_1 = y_1, X_i = y_i − y_{i−1}, i = 2, ..., n−1}. In addition, under the conditioning event, we have Y_n = y_n if and only if X_n = y_n − y_{n−1}.
Assuming y_0 = 0,

p_{Y_n | Y^{n-1}}(y_n | y^{n-1}) = Pr(Y_n = y_n | X_1 = y_1, X_i = y_i - y_{i-1}, i = 2, 3, ..., n-1)
 = Pr(X_n = y_n - y_{n-1} | X_i = y_i - y_{i-1}, i = 1, 2, ..., n-1)
 = p_{X_n | X^{n-1}}(y_n - y_{n-1} | y_{n-1} - y_{n-2}, ..., y_2 - y_1, y_1),

where X^{n-1} = (X_{n-1}, X_{n-2}, ..., X_1). If the {X_n} are i.i.d.,

p_{Y_n | Y^{n-1}}(y_n | y^{n-1}) = p_X(y_n - y_{n-1})

since X_n is independent of X_k for k < n. Thus the joint pmf is

p_{Y^n}(y^n) = p_{Y_n | Y^{n-1}}(y_n | y^{n-1}) p_{Y^{n-1}}(y^{n-1})
 = p_{Y_1}(y_1) prod_{i=2}^n p_{Y_i | Y_{i-1}, ..., Y_1}(y_i | y_{i-1}, ..., y_1) = prod_{i=1}^n p_X(y_i - y_{i-1}).
Applying the result above to the binomial counting process, we obtain

p_{Y^n}(y^n) = prod_{i=1}^n p^{(y_i - y_{i-1})} (1-p)^{1-(y_i - y_{i-1})},

where y_i - y_{i-1} = 0 or 1 for i = 1, 2, ..., n and y_0 = 0.

Properties of processes with i.s.i.

We can express the conditional pmf of Y_n given Y_{n-1} as follows:

p_{Y_n | Y_{n-1}}(y_n | y_{n-1}) = Pr(Y_n = y_n | Y_{n-1} = y_{n-1}) = Pr(X_n = y_n - y_{n-1} | Y_{n-1} = y_{n-1}).

The conditioning event {Y_{n-1} = y_{n-1}} depends only on X_k for k < n, and X_n is independent of X_k for k < n. Thus, this conditioning event does not affect X_n. Consequently,

p_{Y_n | Y_{n-1}}(y_n | y_{n-1}) = p_X(y_n - y_{n-1}).
Discrete time i.s.i. processes (such as the binomial counting process and the discrete random walk) have the following property:

p_{Y_n | Y^{n-1}}(y_n | y^{n-1}) = p_{Y_n | Y_{n-1}}(y_n | y_{n-1}),
Pr{Y_n = y_n | Y^{n-1} = y^{n-1}} = Pr{Y_n = y_n | Y_{n-1} = y_{n-1}}.

Roughly speaking, given the most recent past sample (or the current sample), the other past samples do not affect the probability of what happens next. A discrete time discrete alphabet random process with this property is called a Markov process. Thus all i.s.i. processes are Markov processes.
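The Markov property of the binomial counting process can be checked numerically by computing both conditional pmfs from the exact joint distribution of (Y_1, Y_2, Y_3) (a small sketch; p = 0.3 is an arbitrary choice):

```python
from itertools import product

p = 0.3
# Exact joint pmf of (Y_1, Y_2, Y_3) for the binomial counting process.
joint = {}
for x in product((0, 1), repeat=3):
    prob = 1.0
    for b in x:
        prob *= p if b == 1 else 1 - p
    y = (x[0], x[0] + x[1], x[0] + x[1] + x[2])
    joint[y] = joint.get(y, 0.0) + prob

def cond(y3, y2, y1=None):
    """Pr(Y_3 = y3 | Y_2 = y2 [, Y_1 = y1])."""
    num = sum(q for y, q in joint.items()
              if y[2] == y3 and y[1] == y2 and (y1 is None or y[0] == y1))
    den = sum(q for y, q in joint.items()
              if y[1] == y2 and (y1 is None or y[0] == y1))
    return num / den

# Conditioning on the extra past sample Y_1 does not change the pmf.
print(abs(cond(2, 1, 0) - cond(2, 1)) < 1e-12)
print(abs(cond(2, 1, 1) - cond(2, 1)) < 1e-12)
```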
Gambler's ruin problem

A person wants to buy a new car whose price is N won. The person has k (0 < k < N) won and intends to earn the difference by gambling. The game is as follows: if a toss of a coin results in a head, he earns 1 won, and if it results in a tail, he loses 1 won. Let p denote the probability of heads, and let q = 1 - p. Assuming the man continues to play until he either earns enough money for the car or loses all his money, what is the probability that he loses all the money he has?

Let A_k be the event that the man loses all his money when he starts with k won, and let B be the event that he wins a game. Then

P(A_k) = P(A_k | B) P(B) + P(A_k | B^c) P(B^c).

Since the game starts again with k+1 won if a toss of the coin results in a head and with k-1 won if it results in a tail, it is easy to see that

P(A_k | B) = P(A_{k+1}), P(A_k | B^c) = P(A_{k-1}).
Let p_k = P(A_k); then p_0 = 1, p_N = 0, and

p_k = p p_{k+1} + q p_{k-1}, 1 <= k <= N-1.

Assuming p_k = θ^k, we get from the equation above

p θ^2 - θ + q = 0,

which gives θ_1 = 1 and θ_2 = q/p.

If p ≠ 0.5, p_k = A_1 θ_1^k + A_2 θ_2^k. Thus, using the boundary conditions p_0 = 1 and p_N = 0, we can find

p_k = ((q/p)^k - (q/p)^N) / (1 - (q/p)^N).

If p = 0.5, we have p_k = (A_1 + A_2 k) θ_1^k since q/p = 1. Thus, using the boundary conditions p_0 = 1 and p_N = 0, we can find

p_k = 1 - k/N.
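The closed-form solution can be cross-checked by solving the boundary-value recursion p_k = p p_{k+1} + q p_{k-1} directly with repeated relaxation sweeps (a sketch; N = 10, p = 0.45, and the sweep count are arbitrary choices):

```python
def ruin_prob(k, N, p):
    """Closed-form ruin probability p_k."""
    if p == 0.5:
        return 1 - k / N
    r = (1 - p) / p
    return (r**k - r**N) / (1 - r**N)

def ruin_by_relaxation(N, p, sweeps=20000):
    """Solve p_k = p*p_{k+1} + q*p_{k-1}, p_0 = 1, p_N = 0, by repeated sweeps."""
    q = 1 - p
    v = [1.0] + [0.0] * N
    for _ in range(sweeps):
        for k in range(1, N):
            v[k] = p * v[k + 1] + q * v[k - 1]
    return v

N, p = 10, 0.45
v = ruin_by_relaxation(N, p)
print(all(abs(v[k] - ruin_prob(k, N, p)) < 1e-9 for k in range(N + 1)))
```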
Discrete time Wiener process

Let {X_n} be an i.i.d. zero-mean Gaussian process with variance σ^2. The discrete time Wiener process {Y_n} is defined by

Y_0 = 0,
Y_n = sum_{i=1}^n X_i = Y_{n-1} + X_n, n = 1, 2, ....

The discrete time Wiener process is also called the discrete time diffusion process or discrete time Brownian motion. Since the discrete time Wiener process is formed as sums of an i.i.d. process, it has i.s.i. Thus we have E{Y_n} = 0 and K_Y(k, j) = σ^2 min(k, j). The Wiener process is a first-order autoregressive process.
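The covariance formula follows from writing Y = L X with L the lower triangular matrix of ones, so that K_Y = σ^2 L L^T; the identity (L L^T)_{kj} = min(k, j) can be checked directly (a sketch; n = 6 and σ^2 = 2.5 are arbitrary choices):

```python
# Y_n = X_1 + ... + X_n can be written Y = L X with L lower triangular of ones,
# so K_Y = sigma^2 * L L^T; its (k, j) entry is sigma^2 * min(k, j).
n, sigma2 = 6, 2.5
L = [[1.0 if j <= i else 0.0 for j in range(n)] for i in range(n)]
K = [[sigma2 * sum(L[i][m] * L[j][m] for m in range(n)) for j in range(n)]
     for i in range(n)]
matches = all(abs(K[i][j] - sigma2 * min(i + 1, j + 1)) < 1e-12
              for i in range(n) for j in range(n))
print(matches)
```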
The discrete time Wiener process is a Gaussian process with mean function m(t) = 0 and autocovariance function K_X(t, s) = σ^2 min(t, s). Since the discrete time Wiener process is an i.s.i. process, we have

f_{Y_n | Y_{n-1}}(y_n | y_{n-1}) = f_X(y_n - y_{n-1}),
f_{Y_n | Y^{n-1}}(y_n | y^{n-1}) = f_{Y_n | Y_{n-1}}(y_n | y_{n-1}) = f_X(y_n - y_{n-1}).

As in the discrete alphabet case, a process with this property is called a Markov process.

Markov process

A discrete time random process {Y_n} is called a first order Markov process if it satisfies

Pr{Y_n <= y_n | y_{n-1}, y_{n-2}, ...} = Pr{Y_n <= y_n | y_{n-1}}

for all n, y_n, y_{n-1}, y_{n-2}, ....
4.5 Continuous time i.s.i. process
Continuous time i.s.i. process

When we deal with a continuous time process with i.s.i., we need to consider a more general collection of sample times than in the case of a discrete time process. In the continuous time case, we assume that we are given the cdf of the increments as

F_{Y_t - Y_s}(y) = F_{Y_{t-s} - Y_0}(y) = F_{Y_{t-s}}(y), t > s.
The joint probability functions of a continuous time process

Define the random variables {X_n} by X_n = Y_{t_n} - Y_{t_{n-1}}. Then the {X_n} are independent and

Y_{t_n} = sum_{i=1}^n X_i,
Pr{Y_{t_n} <= y_n | Y_{t_{n-1}} = y_{n-1}, Y_{t_{n-2}} = y_{n-2}, ...} = F_{X_n}(y_n - y_{n-1}) = F_{Y_{t_n} - Y_{t_{n-1}}}(y_n - y_{n-1}).

As in the case of discrete time processes, these can be used to find the joint pmf or pdf as

p_{Y_{t_1}, ..., Y_{t_n}}(y_1, ..., y_n) = prod_{i=1}^n p_{Y_{t_i} | Y_{t_{i-1}}}(y_i | y_{i-1}),
f_{Y_{t_1}, ..., Y_{t_n}}(y_1, ..., y_n) = prod_{i=1}^n f_{Y_{t_i} | Y_{t_{i-1}}}(y_i | y_{i-1}).
If a process {Y_t} has i.s.i., and the cdf, pdf, or pmf of the increment Y_t = Y_t - Y_0 is given, the process can be completely described as shown above. As in the discrete time case, a continuous time random process {Y_t} is called a Markov process if we have

Pr{Y_{t_n} <= y_n | Y_{t_{n-1}} = y_{n-1}, Y_{t_{n-2}} = y_{n-2}, ...} = Pr{Y_{t_n} <= y_n | Y_{t_{n-1}} = y_{n-1}},

or

f_{Y_{t_n} | Y_{t_{n-1}}, ..., Y_{t_1}}(y_n | y_{n-1}, ..., y_1) = f_{Y_{t_n} | Y_{t_{n-1}}}(y_n | y_{n-1}),
p_{Y_{t_n} | Y_{t_{n-1}}, ..., Y_{t_1}}(y_n | y_{n-1}, ..., y_1) = p_{Y_{t_n} | Y_{t_{n-1}}}(y_n | y_{n-1})

for all n, y_n, y_{n-1}, ..., and t_1, t_2, ..., t_n. A continuous time i.s.i. process is a Markov process.
Wiener process

A process is called a Wiener process if it satisfies:
The initial position is zero, that is, W(0) = 0.
The mean is zero, that is, E{W(t)} = 0, t >= 0.
The increments of W(t) are independent, stationary, and Gaussian.

The Wiener process is a continuous time i.s.i. process. The increments of the Wiener process are Gaussian random variables with zero mean. The first order pdf of the Wiener process is

f_{W_t}(x) = (1 / sqrt(2π t σ^2)) exp{-x^2 / (2 t σ^2)}.
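That this pdf is a proper density with Var{W_t} = t σ^2 can be checked by direct numerical integration (a sketch; t = 2.0, σ^2 = 1.5, and the grid parameters are arbitrary choices):

```python
from math import exp, pi, sqrt

def wiener_pdf(x, t, sigma2):
    """First-order pdf of the Wiener process at time t."""
    return exp(-x * x / (2 * t * sigma2)) / sqrt(2 * pi * t * sigma2)

t, sigma2 = 2.0, 1.5
h = 0.001
xs = [-40.0 + i * h for i in range(80001)]  # grid spanning many std devs
mass = h * sum(wiener_pdf(x, t, sigma2) for x in xs)
var = h * sum(x * x * wiener_pdf(x, t, sigma2) for x in xs)

print(abs(mass - 1.0) < 1e-6)        # density integrates to one
print(abs(var - t * sigma2) < 1e-4)  # Var{W_t} = t * sigma^2
```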
Wiener process, Brownian motion

The Wiener process is the limit of the random walk process.

Properties of the Wiener process

The distribution of X_{t_2} - X_{t_1}, t_2 > t_1, depends only on t_2 - t_1, not on t_1 and t_2 individually.
Let us show that the Wiener process is Gaussian. From the definition of the Wiener process, the random variables W(t_1), W(t_2) - W(t_1), W(t_3) - W(t_2), ..., W(t_k) - W(t_{k-1}) are independent Gaussian random variables. Thus the random variables W(t_1), W(t_2), W(t_3), ..., W(t_k) can be obtained from the following linear transformation of W(t_1) and the increments:

W(t_1) = W(t_1),
W(t_2) = W(t_1) + {W(t_2) - W(t_1)},
W(t_3) = W(t_1) + {W(t_2) - W(t_1)} + {W(t_3) - W(t_2)},
...
W(t_k) = W(t_1) + {W(t_2) - W(t_1)} + ... + {W(t_k) - W(t_{k-1})}.

Since W(t_1), W(t_2), W(t_3), ..., W(t_k) are jointly Gaussian, {W(t)} is Gaussian.
Poisson counting process

A continuous time counting process {N_t, t >= 0} with the following properties is called the Poisson counting process.
N_0 = 0.
The process has i.s.i. Hence, the increments over non-overlapping time intervals are independent random variables.
In a very small time interval, the probability of an increment of 1 is proportional to the length of the interval, and the probability of an increment larger than 1 is negligible. Thus,

Pr{N_{t+Δt} - N_t = 1} = λΔt + o(Δt),
Pr{N_{t+Δt} - N_t >= 2} = o(Δt), and
Pr{N_{t+Δt} - N_t = 0} = 1 - λΔt + o(Δt),

where λ is a proportionality constant.
The Poisson counting process is a continuous time discrete alphabet i.s.i. process. We have obtained the Wiener process as the limit of a discrete time discrete amplitude random walk process. Similarly, the Poisson counting process can be derived as the limit of a binomial counting process using the Poisson approximation.
The probability mass function p_{N_t - N_0}(k) = p_{N_t}(k) of the increment N_t - N_0 between the starting time 0 and t > 0

Let us use the notation p(k, t) = p_{N_t - N_0}(k), t > 0. Using the independence of increments and the third property of the Poisson counting process, we have

p(k, t + Δt) = sum_{n=0}^k Pr{N_t = n} Pr{N_{t+Δt} - N_t = k - n | N_t = n}
 = sum_{n=0}^k Pr(N_t = n) Pr(N_{t+Δt} - N_t = k - n)
 ≈ p(k, t)(1 - λΔt) + p(k-1, t)λΔt,

which yields

(p(k, t + Δt) - p(k, t)) / Δt = p(k-1, t)λ - p(k, t)λ.
When Δt → 0, the equation above becomes the following differential equation:

(d/dt) p(k, t) + λ p(k, t) = λ p(k-1, t), t > 0,

where the initial conditions are

p(k, 0) = 1 for k = 0, and p(k, 0) = 0 for k ≠ 0,

since Pr{N_0 = 0} = 1. Solving the differential equation gives

p_{N_t}(k) = p(k, t) = ((λt)^k / k!) e^{-λt}, k = 0, 1, 2, ..., t >= 0.

The pmf for k jumps in an arbitrary interval (s, t), t >= s:

p_{N_t - N_s}(k) = ((λ(t-s))^k / k!) e^{-λ(t-s)}, k = 0, 1, ..., t >= s.
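One can confirm numerically that the Poisson pmf solves this differential equation, using a central-difference approximation to d/dt (a sketch; the values of λ, t, and the step size are arbitrary choices):

```python
from math import exp, factorial

def p(k, t, lam):
    """Poisson pmf p(k, t), with the convention p(-1, t) = 0."""
    if k < 0:
        return 0.0
    return (lam * t) ** k * exp(-lam * t) / factorial(k)

lam, t, h = 2.0, 1.3, 1e-6
ok = True
for k in range(5):
    # d/dt p(k, t) + lam * p(k, t)  should equal  lam * p(k-1, t)
    lhs = (p(k, t + h, lam) - p(k, t - h, lam)) / (2 * h) + lam * p(k, t, lam)
    rhs = lam * p(k - 1, t, lam)
    ok = ok and abs(lhs - rhs) < 1e-6
print(ok)
```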
Martingale

Martingale property

An independent increment process {X_t} with zero mean satisfies

E{X(t_n) - X(t_{n-1}) | X(t_1), X(t_2), ..., X(t_{n-1})} = 0

for all t_1 < t_2 < ... < t_n and integers n >= 2. This property can be rewritten as

E{X(t_n) | X(t_1), X(t_2), ..., X(t_{n-1})} = X(t_{n-1}),

which is called the martingale property.

Martingale

A process {X_n, n >= 0} with the following properties is called a martingale process.
E{|X_n|} < ∞.
E{X_{n+1} | X_0, ..., X_n} = X_n.
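For example, the symmetric random walk (p = 1/2) is a zero-mean independent increment process, and its martingale property can be verified by exhaustive enumeration of paths (a small sketch with n = 4 steps, an arbitrary choice):

```python
from itertools import product

# Symmetric random walk (p = 1/2): every +/-1 path of length n is equally likely.
n = 4
groups = {}
for steps in product((1, -1), repeat=n):
    w = [0]
    for s in steps:
        w.append(w[-1] + s)
    history = tuple(w[:n])  # the past samples (X_0, ..., X_{n-1})
    groups.setdefault(history, []).append(w[n])

# E{X_n | X_0, ..., X_{n-1}} should equal X_{n-1} for every history.
is_martingale = all(sum(last) / len(last) == hist[-1]
                    for hist, last in groups.items())
print(is_martingale)
```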
4.6 Compound process*
Discrete time compound process*

Discrete time compound process

Let {N_k, k = 0, 1, ...} be a discrete time counting process such as the binomial counting process, and let {X_k, k = 0, 1, ...} be an i.i.d. random process. Assume that the two processes are independent of each other. Define the random process {Y_k, k = 0, 1, ...} by

Y_0 = 0,
Y_k = sum_{i=1}^{N_k} X_i, k = 1, 2, ...,

where we assume Y_k = 0 if N_k = 0. The process {Y_k} is called a discrete time compound process. The process is also referred to as a doubly stochastic process because of the two sources of randomness, {N_k} and {X_k}.
The expectation of Y_k can be obtained using conditional expectation as

E{Y_k} = E{sum_{i=1}^{N_k} X_i} = E{E{Y_k | N_k}} = sum_n p_{N_k}(n) E{Y_k | N_k = n} = sum_n p_{N_k}(n) n E{X} = E{N_k} E{X}.    (1)

Let X_1, X_2, ... be an i.i.d. sequence of random variables and M_X be their common moment generating function. Let the random variable N be independent of the X_i, and let G_N(z) = E{z^N} be the probability generating function of N. Then the moment generating function of S = sum_{i=1}^N X_i is

M_S(t) = G_N(M_X(t)).
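Both identities can be verified exactly on a small discrete example by enumerating the distribution of S (a sketch; the toy distributions for N and X and the value t = 0.7 are arbitrary assumptions):

```python
from itertools import product
from math import exp

# Toy example: N uniform on {0, 1, 2}, X uniform on {1, 2}, independent.
N_pmf = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}
X_pmf = {1: 0.5, 2: 0.5}

# Exact distribution of S = X_1 + ... + X_N by enumeration.
S_pmf = {}
for n, pn in N_pmf.items():
    for xs in product(X_pmf, repeat=n):
        prob = pn
        for x in xs:
            prob *= X_pmf[x]
        S_pmf[sum(xs)] = S_pmf.get(sum(xs), 0.0) + prob

EN = sum(n * q for n, q in N_pmf.items())
EX = sum(x * q for x, q in X_pmf.items())
ES = sum(s * q for s, q in S_pmf.items())
print(abs(ES - EN * EX) < 1e-12)  # E{S} = E{N} E{X}

# M_S(t) = G_N(M_X(t)), where G_N(z) = E{z^N} is the pgf of N.
t = 0.7
MX = sum(q * exp(t * x) for x, q in X_pmf.items())
GN_at_MX = sum(q * MX**n for n, q in N_pmf.items())
MS = sum(q * exp(t * s) for s, q in S_pmf.items())
print(abs(MS - GN_at_MX) < 1e-12)
```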
Continuous time compound process*

When a continuous time counting process {N_t, t >= 0} and an i.i.d. process {X_i, i = 1, 2, ...} are independent of each other, the process

Y_t = Y(t) = sum_{i=1}^{N_t} X_i

is called a continuous time compound process. Here, we put Y(t) = 0 when N_t = 0. We have

E{Y_t} = E{N_t} E{X},
M_{Y_t}(u) = E{M_X(u)^{N_t}}.
A compound process is continuous and differentiable except at the jumps, where a new random variable is added. If {N_t} is a Poisson counting process, we have

E{Y_t} = λt E{X},
M_{Y_t}(u) = sum_{k=0}^∞ ((λt)^k e^{-λt} / k!) M_X(u)^k = e^{-λt} sum_{k=0}^∞ (λt M_X(u))^k / k! = e^{-λt(1 - M_X(u))}.
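The closed form can be checked against a truncated version of the series (a sketch; the choice X ~ Exponential(rate 2), for which M_X(u) = 2/(2-u) for u < 2, and the other parameter values are arbitrary assumptions):

```python
from math import exp, factorial

lam, t, u = 1.5, 2.0, 0.3
MX = 2.0 / (2.0 - u)  # mgf of the assumed X ~ Exponential(rate 2), valid for u < 2

# Truncate the series sum_k Pr{N_t = k} * M_X(u)^k at k = 60.
series = sum((lam * t) ** k * exp(-lam * t) / factorial(k) * MX ** k
             for k in range(60))
closed = exp(-lam * t * (1 - MX))
print(abs(series - closed) < 1e-10)
```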
More informationStochastic Process II Dr.-Ing. Sudchai Boonto
Dr-Ing Sudchai Boonto Department of Control System and Instrumentation Engineering King Mongkuts Unniversity of Technology Thonburi Thailand Random process Consider a random experiment specified by the
More informationParametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes
Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes Electrical & Computer Engineering North Carolina State University Acknowledgment: ECE792-41 slides were adapted
More informationSpectral representations and ergodic theorems for stationary stochastic processes
AMS 263 Stochastic Processes (Fall 2005) Instructor: Athanasios Kottas Spectral representations and ergodic theorems for stationary stochastic processes Stationary stochastic processes Theory and methods
More informationDeterministic. Deterministic data are those can be described by an explicit mathematical relationship
Random data Deterministic Deterministic data are those can be described by an explicit mathematical relationship Deterministic x(t) =X cos r! k m t Non deterministic There is no way to predict an exact
More informationChapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.
Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept
More informationSignals and Spectra (1A) Young Won Lim 11/26/12
Signals and Spectra (A) Copyright (c) 202 Young W. Lim. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version.2 or any later
More informationFig 1: Stationary and Non Stationary Time Series
Module 23 Independence and Stationarity Objective: To introduce the concepts of Statistical Independence, Stationarity and its types w.r.to random processes. This module also presents the concept of Ergodicity.
More informationECE534, Spring 2018: Solutions for Problem Set #4 Due Friday April 6, 2018
ECE534, Spring 2018: s for Problem Set #4 Due Friday April 6, 2018 1. MMSE Estimation, Data Processing and Innovations The random variables X, Y, Z on a common probability space (Ω, F, P ) are said to
More informationwhere r n = dn+1 x(t)
Random Variables Overview Probability Random variables Transforms of pdfs Moments and cumulants Useful distributions Random vectors Linear transformations of random vectors The multivariate normal distribution
More information1: PROBABILITY REVIEW
1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following
More informationLecture 15. Theory of random processes Part III: Poisson random processes. Harrison H. Barrett University of Arizona
Lecture 15 Theory of random processes Part III: Poisson random processes Harrison H. Barrett University of Arizona 1 OUTLINE Poisson and independence Poisson and rarity; binomial selection Poisson point
More informationFundamentals of Applied Probability and Random Processes
Fundamentals of Applied Probability and Random Processes,nd 2 na Edition Oliver C. Ibe University of Massachusetts, LoweLL, Massachusetts ip^ W >!^ AMSTERDAM BOSTON HEIDELBERG LONDON NEW YORK OXFORD PARIS
More informationMassachusetts Institute of Technology
Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.011: Introduction to Communication, Control and Signal Processing QUIZ, April 1, 010 QUESTION BOOKLET Your
More informationRandom Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay
1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued
More informationThings to remember when learning probability distributions:
SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions
More informationECE 630: Statistical Communication Theory
ECE 630: Statistical Communication Theory Dr. B.-P. Paris Dept. Electrical and Comp. Engineering George Mason University Last updated: April 6, 2017 2017, B.-P. Paris ECE 630: Statistical Communication
More information3. Probability and Statistics
FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important
More informationEEM 409. Random Signals. Problem Set-2: (Power Spectral Density, LTI Systems with Random Inputs) Problem 1: Problem 2:
EEM 409 Random Signals Problem Set-2: (Power Spectral Density, LTI Systems with Random Inputs) Problem 1: Consider a random process of the form = + Problem 2: X(t) = b cos(2π t + ), where b is a constant,
More information1 Elementary probability
1 Elementary probability Problem 1.1 (*) A coin is thrown several times. Find the probability, that at the n-th experiment: (a) Head appears for the first time (b) Head and Tail have appeared equal number
More informationMA6451 PROBABILITY AND RANDOM PROCESSES
MA6451 PROBABILITY AND RANDOM PROCESSES UNIT I RANDOM VARIABLES 1.1 Discrete and continuous random variables 1. Show that the function is a probability density function of a random variable X. (Apr/May
More informationChapter 3: Random Variables 1
Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.
More informationLecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable
Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed
More informationIntroduction to Stochastic processes
Università di Pavia Introduction to Stochastic processes Eduardo Rossi Stochastic Process Stochastic Process: A stochastic process is an ordered sequence of random variables defined on a probability space
More informationECE Homework Set 3
ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3
More informationLecture 2: Review of Probability
Lecture 2: Review of Probability Zheng Tian Contents 1 Random Variables and Probability Distributions 2 1.1 Defining probabilities and random variables..................... 2 1.2 Probability distributions................................
More informationCommunication Systems Lecture 21, 22. Dong In Kim School of Information & Comm. Eng. Sungkyunkwan University
Communication Systems Lecture 1, Dong In Kim School of Information & Comm. Eng. Sungkyunkwan University 1 Outline Linear Systems with WSS Inputs Noise White noise, Gaussian noise, White Gaussian noise
More information