Unit: 1. Random Process, 2. Markov Chain, 3. Poisson Process


Unit: 1. Random Process, 2. Markov Chain, 3. Poisson Process

Chapter 1. Random Process

1.1 Classification of random processes

Consider a random experiment with outcomes s ∈ S, where S is the sample space. If to every outcome s ∈ S we assign a real-valued time function X(t, s), then the collection of such time functions is called a random process or stochastic process.

Definition 1.1.1 A random process is a collection of random variables {X(t, s)} that are functions of a real variable, namely time t, where s ∈ S and t ∈ T (S : sample space, T : parameter set or index set).

Types of random process

Definition 1.1.2 (Discrete random sequence) If both T and S are discrete, the random process is called a discrete random sequence.

Example 1.1.3 If X_n represents the outcome of the n-th toss of a fair die, then {X_n, n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, ...} and S = {1, 2, 3, 4, 5, 6}. Similarly, if X(t) is the number of defective items found at trial t = 1, 2, 3, ..., then {X(t)} is a discrete random sequence.

Definition 1.1.4 (Discrete random process) If T is continuous and S is discrete, the random process is called a discrete random process.

Example 1.1.5 If X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {1, 2, 3, ...}. Also, if X(t) is the number of defective items found at time t ≥ 0, then {X(t)} is a discrete random process.

Definition 1.1.6 (Continuous random sequence) If T is discrete and S is continuous, the random process is called a continuous random sequence.

Example 1.1.7 If X(t) is the amount of rainfall measured at trial t = 1, 2, 3, ..., then {X(t)} is a continuous random sequence.

Example 1.1.8 If X_n represents the temperature at the end of the n-th hour of a day, then {X_n, n ≥ 1} is a continuous random sequence.

Definition 1.1.9 (Continuous random process) If both T and S are continuous, the random process is called a continuous random process.

Example 1.1.10 If X(t) represents the maximum temperature at a place in the interval (0, t), then {X(t)} is a continuous random process.

Example 1.1.11 If X(t) is the amount of rainfall measured at time t ≥ 0, then {X(t)} is a continuous random process.

Statistical Averages or Ensemble Averages

Let {X(t)} be a random process.

1. The mean of {X(t)} is defined by E[X(t)] = μ_X(t) = ∫ x f_X(x, t) dx, where X(t) is treated as a random variable for a fixed value of t.

2. The auto-correlation (A.C.F.) of {X(t)} is defined by R_XX(t_1, t_2) = E[X(t_1) X(t_2)] = ∫∫ x_1 x_2 f_X(x_1, x_2; t_1, t_2) dx_1 dx_2.

3. The auto-covariance of {X(t)} is defined by C_XX(t_1, t_2) = E[{X(t_1) − μ_X(t_1)}{X(t_2) − μ_X(t_2)}] = R_XX(t_1, t_2) − μ_X(t_1) μ_X(t_2), where μ_X(t_1) = E[X(t_1)] and μ_X(t_2) = E[X(t_2)].
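These ensemble averages can be approximated by averaging over many independently simulated sample functions. The sketch below (an illustration added here, not part of the original notes) estimates the mean and autocorrelation of X(t) = A sin(ωt + φ) with φ uniform on (0, 2π), a process of the kind studied in Section 1.4; the amplitude, frequency and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Process under study: X(t) = A*sin(w*t + phi), phi ~ Uniform(0, 2*pi).
# A, w and the time points are illustrative choices, not values from the text.
A, w = 2.0, 1.0
n_samples = 100_000
phi = rng.uniform(0.0, 2.0 * np.pi, size=n_samples)

def X(t):
    """One value of the process per simulated sample function."""
    return A * np.sin(w * t + phi)

t1, t2 = 0.7, 1.9

mean_t1 = X(t1).mean()                 # estimate of mu_X(t1), should be ~0
acf_t1_t2 = (X(t1) * X(t2)).mean()     # estimate of R_XX(t1, t2)

print("ensemble mean :", mean_t1)
print("ensemble ACF  :", acf_t1_t2)
print("theory A^2/2 * cos(w*(t2-t1)) =", A**2 / 2 * np.cos(w * (t2 - t1)))
```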

Definition 1.1.12 (Stationary process) If certain probability distributions or averages do not depend on t, the random process {X(t)} is called stationary.

1.2 Types of stationary processes

Definition 1.2.1 (Strict Sense Stationary (SSS)) A random process is called strict sense stationary if all its finite dimensional distributions are invariant under translation of the time parameter, i.e.
f_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n) = f_X(x_1, x_2, ..., x_n; t_1 + δ, t_2 + δ, ..., t_n + δ)
for any t_1, ..., t_n and any real number δ.

First order stationary

Definition 1.2.2 A random process {X(t)} is said to be a first order stationary process if f(x_1, t_1 + c) = f(x_1, t_1) for any c. That is, the first order density of a stationary process {X(t)} is independent of time t. Thus E[X(t)] = μ, a constant, in a first order stationary random process.

Second order stationary

Definition 1.2.3 A random process {X(t)} is said to be a second order stationary process if f(x_1, x_2; t_1, t_2) = f(x_1, x_2; t_1 + c, t_2 + c) for any c. That is, the second order density must be invariant under translation of time.

Wide-Sense Stationary (or) Weakly Stationary (or) Covariance Stationary

Definition 1.2.4 A random process {X(t)} is called Wide-Sense Stationary (WSS) if its mean is a constant and its auto-correlation depends only on the time difference. That is, E[X(t)] = a constant and E[X(t) X(t + τ)] = R_XX(τ) depends only on τ. It follows that the auto-covariance of a WSS process also depends only on the time difference τ. Thus C_XX(t, t + τ) = R_XX(τ) − μ_X².

Definition 1.2.5 (Jointly WSS processes) Two processes {X(t)} and {Y(t)} are called jointly WSS if each is WSS and their cross-correlation depends only on the time difference τ. That is, R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ). It follows that the cross-covariance of jointly WSS processes {X(t)} and {Y(t)} also depends only on the time difference τ. Thus C_XY(t, t + τ) = R_XY(τ) − μ_X μ_Y.

Remark 1.2.6 For a two dimensional random process {X(t), Y(t); t ≥ 0}, we define the following.

1. Mean = E[X(t)] = μ_X(t)
2. Auto-correlation = E[X(t_1) X(t_2)]

3. Cross-correlation = E[X(t_1) Y(t_2)]
4. Auto-covariance = E[{X(t_1) − μ_X(t_1)}{X(t_2) − μ_X(t_2)}]
5. Cross-covariance = E[{X(t_1) − μ_X(t_1)}{Y(t_2) − μ_Y(t_2)}]

Definition 1.2.7 A random process that is not stationary in any sense is called an evolutionary random process.

1.3 Ergodic random process

Time averages of a random process

1. The time average of a sample function X(t) of a random process {X(t)} is defined as
X̄_T = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt.

2. The time-averaged auto-correlation of the random process {X(t)} is defined by
Z_T = R̄_XX(t, t + τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) X(t + τ) dt.

Definition 1.3.1 (Ergodic random process) A random process {X(t)} is said to be ergodic if its ensemble averages are equal to the appropriate time averages.

Definition 1.3.2 (Mean-ergodic process) A random process {X(t)} is said to be mean-ergodic if the ensemble mean is equal to the time-averaged mean. That is, if E[X(t)] = μ and X̄_T = lim_{T→∞} (1/2T) ∫_{−T}^{T} X(t) dt, then μ = X̄_T with probability one.

Definition 1.3.3 (Correlation-ergodic process) A random process {X(t)} is said to be correlation-ergodic if the ensemble A.C.F. is equal to the time-averaged A.C.F.

Definition 1.3.4 (Cross-correlation function) For a two dimensional random process {X(t), Y(t); t ≥ 0}, the cross-correlation function R_XY(τ) is defined by R_XY(t, t + τ) = E[X(t) Y(t + τ)].

Properties of cross-correlation
(i) R_YX(τ) = R_XY(−τ).
(ii) |R_XY(τ)|² ≤ R_XX(0) R_YY(0).
(iii) |R_XY(τ)| ≤ (1/2){R_XX(0) + R_YY(0)}.
(iv) The processes {X(t)} and {Y(t)} are orthogonal if R_XY(τ) = 0.
(v) If the processes {X(t)} and {Y(t)} are independent, then R_XY(τ) = μ_X μ_Y.
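Mean-ergodicity can be illustrated numerically. The sketch below (added here, not part of the original notes) compares the time average of a single long sample function of X(t) = A sin(ωt + φ), φ uniform on (0, 2π), with the ensemble mean at a fixed time; both should be close to zero. All parameter values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative check of mean-ergodicity for X(t) = A*sin(w*t + phi),
# phi ~ Uniform(0, 2*pi): ensemble mean is 0, and the time average of a
# single sample function over [-T, T] should also be close to 0.
A, w = 2.0, 1.0

# One sample function observed on a fine grid over [-T, T]
T = 500.0
t = np.linspace(-T, T, 200_001)
phi_single = rng.uniform(0.0, 2.0 * np.pi)
x_path = A * np.sin(w * t + phi_single)
time_avg = x_path.mean()            # approximates (1/2T) * integral over [-T, T]

# Ensemble mean at a fixed time, averaged over many sample functions
phi_many = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
ensemble_avg = (A * np.sin(w * 0.3 + phi_many)).mean()

print("time average     :", time_avg)       # ~ 0
print("ensemble average :", ensemble_avg)   # ~ 0
```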

1.4 Examples of Stationary Processes

Example 1.4.1 Examine whether the Poisson process {X(t)} given by the probability law
P{X(t) = r} = e^{−λt} (λt)^r / r!, r = 0, 1, 2, ...
is covariance stationary.

Solution: The Poisson process has the probability distribution of a Poisson distribution with parameter λt. Then E[X(t)] = λt and E[X²(t)] = λ²t² + λt. Both are functions of t. Since E[X(t)] is not constant, the Poisson process is not covariance stationary.

Example 1.4.2 Show that the random process X(t) = A sin(ωt + φ), where A and ω are constants and φ is a random variable uniformly distributed in (0, 2π), is first order stationary. Also find the auto-correlation function of the process.

Solution: Given X(t) = A sin(ωt + φ).
Claim: E[X(t)] is constant.
E[X(t)] = ∫ X(t) f(φ) dφ    (1)

10 Since φ is uniformly distributed in (0, π), 0 < φ < π () π f(φ) = 0 otherwise Substituting () in (), we have E[X(t)] = π X(t) dφ (3) 0 π = A π = A π π sin(ωt + φ) dφ 0 [ ] π cos(ωt + φ) = A π[ cos(ωt + π) + cos(ωt)] 0 = A π[ cos(ωt) + cos(ωt)] = 0, a constant. Hence {X(t)} is a first order stationary process. Find auto correlation function: By definition, R XX (t, t + τ) = E[X(t)X(t + τ)] = E [ A sin(ωt + φ) A sin(ω(t + τ) + φ) ] = A E [ sin(ωt + φ) sin(+ωτ + φ) ] () = A E[ cos(ωt + φ (ωt + ωτ + φ)) cos(ωt + ωτ + φ + ωt) ] = A E[ cos( ωτ) cos(ωt + ωτ + φ) ] = A E[ cos( ωτ)] A E[cos(ωt + ωτ + φ)] (5) Applying the formula of (3) in the second term of (5) we get R XX (t, t + τ) = A E[ cosωτ] A π cos(ωt + ωτ + φ) dφ 0 π 0

11 = A = A [ ) A cosωτ 8π sin(ωt + ωτ + φ ] π 0 cosωτ A 8π = A cosωτ. Hence auto correlation function = A cosωτ [ sin(ωt + ωτ) sin(ωt + ωτ) ] Example..3 Given a random variable Y with characteristic function φ(ω) = E[e iωy ] = E[cos ωy +i sinωy ] and a random process defined by X(t) = cos(λt+ Y ), show that the random process {X(t)} is stationary in the wide sense if φ() = φ() = 0. Solution: E{X(t)} = E{cos(λt + Y )} = E{cosλt cosy sinλt siny } = cosλt E[cosY ] sinλt E[sinY ] () Since φ() = 0, E{cosY + isiny } = 0. Therefore E(cosY ) = E(sinY ) = 0 () using() in (), we get E{X(t)} = 0 (3) E{X(t ) X(t )} = E{cos(λt + Y ) cos(λt + Y )} = E{(cosλt cosy sinλt siny )(cosλt cosy sinλt siny )} = E{cosλt cosλt cos Y + sinλt sinλt sin Y cosλt sinλt cosy siny sinλt cosλt cosy siny )} = cosλt cosλt E[cos Y ] + sinλt sinλt E[sin Y ] sinλ)t + t ) E[cosY siny ]

12 = cosλt cosλt E[ +cosy ]+sinλt sinλt E[ cosy ] sinλ)t +t ) E[sinY ] () Since φ() = 0, E{cos Y + isiny } = 0 and so E[cosY ] = 0 = E[sinY ] (5) using (5) in (), we get, E{X(t )X(t )} = R(t, t ) = cosλt cosλt + sinλt sinλt = cosλ(t t ). (6) From (3) and (6), mean is a constant and ACF is a function of τ = t t only. Hence {X(t)} is WSS process. Example.. Two random processes {X(t)} and {Y (t)} are defined by X(t) = A cos ωt+b sin ωt and Y (t) = A cos ωt B sin ωt. Show that X(t) are jointly WSS if A and B are uncorrelated random variables with zero means and the same variances and ω is constant. Solution: Given E(A) = E(B) = 0 V ar(a) = V ar(b) E(A ) = E(B ) = k(say) Since A and B are uncorrelated, E(AB) = 0 Let us prove that X(t) and Y (t) are individually WSS processes. E[X(t)] = E[A cos ωt + B sin ωt] = cos ωt E(A) + sin ωt E(B) = 0, a constant R(t, t ) = E[X(t ) X(t )] = E[(A cos ωt + B sin ωt )(A cos ωt + B sin ωt )] = E[A cos ωt cos ωt +AB cos ωt sin ωt +AB sin ωt cos ωt +B sin ωt sin ωt ]

13 E(AB) = 0 = E(A ) cos ωt cos ωt + E(B ) sin ωt sin ωt = k[cos ωt cos ωt + sin ωt sin ωt ], since E(A ) = E(B ) = k and R(t, t ) = kcosω(t t ),a function of τ = t t Hence {X(t)} is a WSS process. Now prove that Y (t) is a WSS process: R XY (t, t ) = E[X(t ) Y (t )] = E[(A cos ωt + B sin ωt )(B cos ωt A sin ωt )] = E[AB cos ωt cos ωt A cos ωt sin ωt +B sin ωt cos ωt AB sin ωt sin ωt ] = E[B sin ωt cos ωt A cos ωt sin ωt ] = k sin ω(t t ). Therefore R XY (t, t ) is a function of τ = t t and so {X(t)} and {Y (t)} are jointly WSS process. Example..5 If {X(t)} is a WSS process with auto correlation R(τ) = Ae α τ, determine the second order moment of the RV X(8) X(5). Solution: The second moment of X(8) X(5) is given by E[{X(8) X(5)} ] = E {X (8)} + E {X (5)} E {X(8)X(5)} Given R(τ) = Ae α τ. Then R(t, t ) = Ae τ t t E[X (t)] = R(t, t) = A 3

14 E[X (8)] = E[X (5)] = A - E[X(8)X(5)] = R(8, 5) = Ae 3α -3 Using and 3 in, we get E[{X(8) X(5)} ] = A Ae 3α = A( e 3α ). Example..6 Show that the random process χ(t) = Acos(ωt + θ) is a WSS process if A and ω are constants and θ is a uniformly distributed random variable in (0, π). Proof. The Pdf of uniformal distribution is f(θ) = π, 0 θ π We have to show that the mean is a constant. E[x(t)] = π 0 = π X(t) π dθ π 0 Acos(+θ)dθ = A π [sin(ωt + θ)]0 π = A [sinωt sinωt] π Therefore Mean is a constant. = 0, a constant. Now E[X(t) X(t + τ)] = R XX (t, t + τ) = E[Acos(ωt + θ)acos(ωt + ωτ + θ)] = E[A cos(ωt + θ)cos(ωt + ωτ + θ)] = A E[cosωt + cos(ωt + ωτ + θ)] = A A cosωτ + E[cos(ωt + ωτ + θ)]

15 = A A π cosωτ + cos( + ωτ + θ)dθ π 0 = A cosωτ + A π = A cosωτ + A π(0) [ sin(ωt+ωτ+θ) = A cosωτ, a function τ only. Therefore mean is a constant and Auto correlation function depends only τ and so X(t) is a WSS process. ] π 0 Example..7 Consider a random process {X(t)} defined by X(t) = Y cos(ωt+ φ) where Y and φ are independent random variables and are uniformly distributed over ( A, A) and ( π, π) respectively. a) Find E[X(t)]. b) Find the auto correlation function R XX (t, t + τ) of X(t). c) Is the process X(t) W.S.S. Solution (a) The p.d.f of Y is f(y ) = π The p.d.f of θ is f(θ) = π E[X(t)] = E[Y cos(ωt + θ)] = E[Y ]E[cos(ωt + θ)]. Now E[Y ] = A A Y A dy = A [ Y Therefore E[X(t)] = 0, a constant. ] A A = A[ A A ] = 0. (b) R XX (t, t + τ) = E [ X(t) X(t + τ)] = E[Y cos(ωt + θ) Y cos(ωt + ωτ + θ)] 5

16 = E[Y ] E[cos(ωt + θ) cos(ωt + ωτ + θ)] Since V ar(y ) = σ = E[Y ] (E[Y ]), σ = E[Y ]. Therefore R XX (t, t + τ) = σ E[ cos(ωt + θ) cos(ωt + ωτ + θ) ] = σ E[ cos(ωt + ωτ + θ) + cos ωτ ] = σ cos ωτ + σ E[ cos(ωt + ωτ + θ) ] = σ cos ωτ + σ π π π cos(ωt + ωτ + θ) dθ = σ cos ωτ + σ 8π = σ cos ωτ + σ 8π [ sin(ωt + ωτ + θ) ] π π [0] = σ cos ωτ. Hence R XX (τ) = σ cos ωτ, a function of τ only. (c) Yes, X(t) is W.S.S process, because mean is constant and R XX (τ) is function of τ only. Example..8 The probability distribution of the process {X(t)} is given by (at) n if n =,, 3,... (+at) P (X(t) = n) = n+ at, n = 0 +at Show that it is not stationary. Solution E [ X(t) ] = n P (n) = 0 at + at (at) at (+at) (+at) 3 (+at) = (+at) [ + u + 3u +... ], where u = = (+at) [ u ] = (+at) [ ] at +at 6 at +at

17 [ = +at at ] (+at) +at =, Therefore E[X(t)] =, a constant. E[X (t)] = n P (n) = (at)n n = {n(n + ) n} (+at) n+ [ = (+at) n = n(n + )( ) at n +at n = n( ) at n ] +at = (+at) [ n = = (+at) [ ( at +at n(n+) ( at ) n +at n = n( at +at ) 3 ( at +at ) n ] ) ], since ( x) 3 = n(n+) n= x n = (+at)3 (+at) (+at) (+at) = ( + at) = + at. Therefore V ar(x(t)) = at, a function of t and so {X(t)} is not stationary. Example..9 Assume that a random process {X(t)} with four sample functions X(t, s ) = cos t ; X(t, s ) = cos t, X(t, s 3 ) = sin t, X(t, s ) = sin t, all of which are equally likely. Show that it is WSS process. Solution E[X(t)] = i= X(t, s i) = [cos t cos t + sin t sin t] = 0 A.C.F R XX (t, t ) = E[X(t ) X(t )] = i= X(t, s i ) X(t, s i ) = [ cos t cos t + cos t cos t + sin t sin t + sin t sin t ] = [ cos t cos t + sin t sin t ] = cos(t t ) = cos τ, where τ = t t Since E[X(t)] is constant and R XX (t, t ) = a function of t t, the process {X(t)} is WSS. Example..0 Consider a random process {X(t)} = P + Qt, where P and 7

18 Q are independent random variables E(P ) = p, E(Q) = q, V ar(p ) = σ and V ar(q) = σ. Find E {X(t)}, R(t, t ) and C(t, t ). Is the {X(t)} stationary? Solution: Given the Random process X(t) = P+Qt. Then E[{X(t)}] = E(P ) + te(q) = p + qt A.C.F. R(t, t ) = E[X(t )X(t )] = E[(P + Qt )(P + Qt )] = E[P + P Q(t + t ) + Q t t ] = E[P ] + E[P Q](t + t ) + E[Q ]t t Since P and Q are independent, E(P Q) = E(P ) E(Q) E(P ) = V (P ) + [E(P )] = σ + p E(Q ) = V (Q) + [E(Q)] = σ + q Therefore R(t, t ) = σ + p + pq(t + t ) + t t (σ + q ) R(t, t ) = σ + t t σ + p + t t q + pq(t + t ) E[X (t)] = E[P + Q t + P Qt] = E(P ) + t E(Q ) + te(p Q) = σ + p + t (σ + q ) + te(p )E(Q) = σ + p + t (σ + q ) + tpq V ar[x(t)] = E[X (t)] E[{X(t)}] = σ + p + t (σ + q ) + tpq (P + qt) = σ + t σ 8

19 Since E[X(t)] and V ar[x(t)] are functions of time t, the Random process {X(t)} is not stationary in any sense. It is evolutionary. Auto covariance: C XX (t, t ) = R XX (t, t ) E[{X(t )}] E[{X(t )}] = σ + t t σ + p + t t q + pq(t + t ) (p + qt )(p + qt ). Therefore C XX (t, t ) = σ + t t σ. Example.. If X(t) = Y cos t+z sin t for all where Y and Z are independent binary random variables, each of which assumes the values and with probabilities 3 and 3 respectively, prove that {X(t)} is wide sense stationary. Solution: E(Y ) = ( )( ) + ( ) = E(Y ) = ( ) ( ) + ( ) = 3 3 V ar(y ) = E(Y ) [E(Y )] = Similarly, E(Z) = 0 ; V ar(z) = E[X(t)] = E[Y cos t + Z sin t] = E[Y ] cos t + E[Z] sin t = 0 R XX (t, t + τ) = E[X(t)X(t + τ)] = E[(Y cos t + Z sin t)(y cos(t + τ) + Z sin(t + τ))] = E[Y cos t cos(t+τ)]+e[z sin t sin(t+τ)]+e[y Z]cos t sin(t + τ) + sin t cos(t + τ) = E[Y ] cos(t + τ) cos t + E(Z ) sin(t + τ) sin t + E(Y )E(Z) sin(t + τ) = [cos(t + τ) cos t + sin(t + τ) sin t] + 0 Therefore R XX (τ) = cos τ. 9

20 Since E[{X(t)}] is a constant and A.C.F is a function of τ only, {X(t)} is WSS process. Example.. If X(t) = R cos(ωt + φ), where R and φ are independent random variables with E(R) = and V ar(r) = 6 and φ is uniformly distributed in ( π, π). Prove that {X(t)} is WSS process. Solution: Since φ is uniformly distributed in ( π, π), the P.d.f of φ is if π < φ < π π f(φ) = 0 otherwise. Now E[X(t)] = E(R)E[cos(ωt + φ)] = π (cos(ωt + φ)dφ π π = π [sin(ωt + φ)]π π = [sin(ωt + π) + sin(π ωt)] π = [ sin(ωt) + sin(ωt)] = 0 π R XX (t, t + τ) = E[X(t)X(t + τ)] = E[R cos(ωt + φ) cos(ωt + ωτ + φ)] = E[R ]E[cos(ωt + φ) cos(ωt + ωτ + φ)] Since V ar(r) = 6, we get E(R ) = V (R) + [E(R)] = 6 + = 0 Therefore R XX (t, t + τ) = 0 E[ cos(ωt + φ) cos(ωt + ωτ + φ)] = 5E[cos(ωt + ωτ + φ) + cos(ωt + φ)] = 5 cos ωτ + 5 π cos(ωt + ωτ + φ)dφ (π) π = 5 cos ωτ + 5 (π) [sin(ωt + ωτ + φ)]π π = 5 cos ωτ + 0 0

21 R XX (t, t + τ) = 5 cos ωτ, a function of τ only. Therefore E[X(t)] is a constant and A.C.F is a function of τ only, {X(t)} is a W.S.S process. Example..3 Given a random variable Ω with density f(ω) and another random variable φ is uniformly distributed in ( π, π) and is independent of Ω and X(t) = a cos(ωt + φ), prove that {X(t)} is a WSS process. Solution: E[X(t)] = ae[cos(ωt + φ)] = ae [ E cos(ωt + φ)/ω}] = ae[cos ΩtE(cos φ) sin ΩtE(sin φ)] Therefore E[X(t)] = ae[cos Ωt ( ) π cos φ dφ sin Ωt ( ) π sin φdφ] π π π π = ae[cos Ωt(0) sin Ωt(0)] = 0. E[X(t )X(t )] = a E[cos(Ωt + φ) cos(ωt + φ)] = a E[E { cos Ωt cos Ωt cos φ + sin Ωt sin Ωt sin φ sin Ω(t + t ) sin φ cos φ } /Ω] = a E [ { π cos Ωt cos Ωt φdφ} (π) π cos { π + sin Ωt sin Ωt (π) π sin π φ dφ sin Ω(t + t )} sin φ dφ] π π = a E[cos Ωt cos Ωt + sin Ωt sin Ωt ] = a E[cos Ω(t t )]. Therefore R XX (t, t ) is a function of t t whatever be the value of f(ω). Hence {X(t)} is a W.S.S process.

22 Example.. Show that the random process X(t) = A sin t + B cos t, where A and B are independent random variables with zero means and equal standard deviations is stationary of the second order. Solution: The A.C.F of a second order stationary process of a function of time difference τ only and not on absolute time. Consider R XX (t, t + τ) = E[X(t)X(t + τ)]. Therefore R X X (t, t + τ) = E[(A sin t + B cos t) (A sin(t + τ) + B cos(t + τ))] = E[A ] sin(t+τ) sin t+e[b ] cos(t+τ) cos t+e[ab] {sin t cos(t + τ) + sin(t + τ) cos t} Given E(A) = E(B) = 0 E(AB) = 0 and E(A ) = E(B ) = σ Since V (A) = V (B) = σ, Therefore R X X (t, t + τ) = σ [sin(t + τ) sin t + cos(t + τ) cos t] = σ cos(t + τ t) = σ cos τ Therefore A.C.F is a function of time difference τ only. Hence {X(t)} is stationary of order two. Example..5 Consider a random process Y (t) = X(t) cos(ωt + θ), where X(t) is wide stationary and θ is uniformly distributed in ( π, π ) and is independent of X(t) and ω is a constant. Prove that Y (t) is wide sense stationary.

23 Solution: E[Y (t)] = E[X(t)]E[cos(ωt + θ)] { } π = E[X(t)] cos(ωt + θ)dθ = E[X(t)](0) = 0. π (π) R Y Y (t, t + τ) = E[X(t)X(t + τ)]e[cos(ωt + ωτ + θ) cos(ωt + θ)] = R XX(τ) E[(cos ωτ) + cos(ωt + ωτ + θ)], since {X(t)} is W.S.S. Therefore R Y Y (t, t + τ) = R X X(τ) [cos ωτ + (π) = R X X(τ) cos ωτ + R X X(τ) (0) = R X X(τ) cos ωτ π π Therefore A.C.F of Y (t) is a function of τ only. cos(ωt + ωτ + θ)dθ] Since E[Y (t)] is a constant and A.C.F of Y (t) is a function of τ only, {Y (t)} is WSS process. Example..6 Let X(t) = A cos λt + B sin λt, where A and B are independent normally distributed random variables N(0, σ ). Obtain the covariance function of the process {X(t) : < t < }. Is {X(t)} covariance stationary? Solution: E[X(t)] = E[A] cos λt + E[B] sin λt Since A and B are independent normal random variables N(0, σ ), E(A) = E(B) = 0 and V (A) = V (B) = σ. and E(A ) = E(B ) = σ. Thus E[X(t)] = 0. 3

24 The A.C.F of {X(t)} is given by E[X(t)X(s)] = E[{A cos λt + B sin λt} {A cos λs + B sin λs}] = E[A ] cos λt cos λs + E[B ] sin λt sin λs + E[AB]{cos λt sin λs + sin λt cos λs} Since E(A ) = E(B ) = σ and E(AB) = E(A)E(B) = 0, R(t, s) = σ {cos λt cos λs + sin λt sin λs} = σ cos λ(t s) Covariance C(t, s) = R(t, s) E[X(t)]E[X(s)] = R(t, s) = σ cos λ(t s) Since covariance function is a function of τ = t s only, and E[X(t)] is constant, {X(t)} is covariance stationary. Example..7 Consider a random process X(t) = B cos(50t + φ), where B and φ are independent random variables. B is a random variable with mean 0 and variance. φ is uniformly distributed in the interval ( π, π ). Show that {X(t)} is WSS. Solution: E(B) = 0 and V ar(b) = E(B ) =. E[X(t)] = E[B]E[cos(50t + φ)] E[X(t)] = 0. R X X (t, t ) = E[B cos(50t + φ)b cos(50t + φ)] = E[B ] E[ cos(50t + φ) cos(50t + φ)] = E[cos 50(t t )] + E[cos(50t + 50t + φ)] R X X (t, t ) = cos 50(t t ) + π cos(50t (π) π + 50t + φ)dφ = cos 50(t t ) + [sin(50t (8π) + 50t + φ)] π π = cos 50(t t )

25 Therefore R X X (t, t ) = a function of time difference t t. Since E[X(t)] is a constant and A.C.F is a function of τ = t t only, {X(t)} is WSS. Example..8 Consider a random process defined on a finite sample space with three sample points and is defined by the three sample functions X(t, s ) = 3, X(t, s ) = 3 cos t and X(t, s 3 ) = sin t and all of which are equally likely ie, P (S ) = 3 for all i, compute the mean and a.c.f. Is tha process strict-sense stationary? Is it wide sense stationary? Solution: Mean = 3 i= X(t, s i) P (s i ) = (3) + (3 cos t) + sin t ( sin t) = + cos t A.C.F = E[X(t )X(t )] = 3 i= X(t, s i )X(t, s i ) P (s i ) = 3 [9 + 9 cos t + 6 sin t ] = 3 + cos(t t ) sin t sin t R XX (t, t ) is not a function of t t only. Since E[X(t)] is not a constant X(t) is not WSS. Also R XX (t, t ) is not a function t t, X(t) is not S.S.S. 5

26 .5 Examples of Ergodic Process Example.5. If the WSS process {X(t)} is given by X(t) = 0 cos(00t + θ) where θ is uniformly distributed over ( π, π). Prove that {X(t)} is correlation ergodic. Solution: R XX (t, t + τ) = [ X(t)X(t + τ) ] = E [ 0 cos(00t + 00τ + θ) 0 cos(00t + θ) ] = 50E [ cos(00t + 00τ + θ) cos(00t + θ) ] = 50 cos(00τ) + 50 π cos(00t + 00τ + θ) dθ π π = 50 cos(00τ) + 50 π[ sin(00t + 00τ + θ) ] π π R XX (t, t + τ) = 50 cos(00τ) + 50 [0] = 50 cos(00τ) π Consider the time averaged ACF lim T Z T = lim T T T X(t) X(t + τ) dt T = lim T 00 cos(00t + θ) cos(00t + 00τ + θ) dt T = lim T 5 T T T cos(00τ)dt + lim T 5 T [ = 50 cos 00τ + lim sin(00t+00τ+θ) ] T T 8T T T T cos(00t + 00τ + θ) dt = 50 cos(00τ) Therefore lim T Z T = R(XX)(τ) = 50 cos(00τ) Since the ensembled A.C.F= Time averaged ACF, {X(t)} is correlation ergodic. 6
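As a numerical illustration (not part of the original notes), the following sketch estimates the time-averaged autocorrelation of a single sample path of X(t) = 10 cos(100t + θ), θ uniform on (−π, π), and compares it with the ensemble value 50 cos(100τ) obtained above; the lag, observation window and grid spacing are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlation-ergodicity check for X(t) = 10*cos(100*t + theta),
# theta ~ Uniform(-pi, pi): the time-averaged ACF of one sample path
# should approach the ensemble ACF 50*cos(100*tau).
theta = rng.uniform(-np.pi, np.pi)
tau = 0.013

T, dt = 100.0, 1e-4
t = np.arange(-T, T, dt)
x = 10 * np.cos(100 * t + theta)
x_shift = 10 * np.cos(100 * (t + tau) + theta)

time_acf = (x * x_shift).mean()      # approximates (1/2T) * integral over [-T, T]
print("time-averaged ACF :", time_acf)
print("ensemble ACF      :", 50 * np.cos(100 * tau))
```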

27 Example.5. Prove that the random process {X(t)} defined by X(t) = A cos(ωt+ θ) where A and ω are constants and θ is uniformly distributed over (0, π) is ergodic in both the mean and the auto correlation function. Solution: Ensembled Mean and ACF E[X(t)] = A π [ ] cos(ωt + θ) dθ = A π ] π 0 π sin(ωt + θ) = A 0 π[ sin ωt sin ωt = 0 ACF R XX (t, t + τ) = [ X(t)X(t + τ) ] = A E[ cos(ωt + ωτ + θ) cos(ωt + θ) ] = A E[ cos ωτ + cos(ωt + ωτ + θ) ] = A cos ωτ + A π π 0 cos(ωt + ωτ + θ) dθ = A cos ωτ + A 8π Therefore R XX (τ) = A cos ωτ Time averaged ACF and Mean: [ sin(ωt + ωτ + θ) dθ ] π 0 = A cos ωτ Time Averaged Mean = X T = lim T T T T X(t) dt A = lim T T [ A = lim T T T ] T sin(ωt+θ) T Therefore X T = 0. cos(ωt + θ) dt ωt = 0. Time averaged ACF x(t + τ)x(t) = lim T T T T x(t + τ)x(t) dt = lim T A T = lim T A T T T T T A cos(ωt + ωτ + θ) cos(ωt + θ) dt {cos ωτ + cos(ωt + ωτ + θ)} dt A = lim T (cos ωτ)(t ) + lim T T A T T T cos(ωt + ωτ + θ) dt 7

28 [ = A A cos ωτ + lim sin(ωt+ωτ+θ) ] T T T ω T = A cos ωτ Therefore x(t + τ)x(t) = A cos ωτ Since ensembled mean=time aveaged mean, {X(t)} is mean ergodic. Also since ensembled ACF=Time averaged ACF, {X(t)} is correlation ergodic. Example.5.3 {X(t)} is the random telegraph signal process with E[X(t)] = 0 and R(τ) = e λ τ. Find the mean and variance of the time average of {X(t)} over ( T, T ). Is it mean ergodic? Solution: Mean of the time average of {X(t)} X T = T T T X(t) dt Therefore E[ X T ] = T T T E[X(t)] dt = 0, since E[X(t)] = 0. To find V ar(x T ) V ar(x T ) = T T 0 ( τ ) T C(τ) dτ = T ( ) T 0 τ T e λτ dτ = T T e λτ dτ T 0 T 0 τ T e λτ dτ = T [ e λτ λ ] T [ 0 T τe λτ λ ] T e λτ λ 0 = λt ( e λτ ) + λt e λt + 8λ T (e λτ ). Therefore V ar(x T ) = λt 8λ T (e λτ ). Since lim T V ar(x T ) = 0, the process is mean ergodic. 8
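As an added illustration (not in the original notes), the sketch below evaluates the variance of the time average obtained in this example, using the closed form Var(X̄_T) = 1/(2λT) − (1 − e^{−4λT})/(8λ²T²) reconstructed from the expression above for the telegraph signal with R(τ) = e^{−2λ|τ|}, and confirms that it tends to zero as T grows, which is the mean-ergodicity conclusion. The value of λ is an arbitrary choice.

```python
import numpy as np

# Variance of the time average of the random telegraph signal with
# R(tau) = exp(-2*lambda*|tau|) (cf. the example above). The closed form is
# the standard result reconstructed from the example; lam is arbitrary.
lam = 1.5

def var_time_average(T, lam):
    return 1.0 / (2 * lam * T) - (1 - np.exp(-4 * lam * T)) / (8 * lam**2 * T**2)

for T in (1.0, 10.0, 100.0, 1000.0):
    print(f"T = {T:7.1f}  Var(X_T) = {var_time_average(T, lam):.6f}")
# Var -> 0 as T -> infinity, so the process is mean-ergodic.
```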

29 Example.5. Let {X(t) : t 0} be a random process where X(t) = total if k is even number of points in the interval (0, t) = k say and X(t) =. if k is odd Find the ACF of X(t). Also if P (A = ) = P (A = ) = and A is independent of X(t), find the ACF of Y (t) = A X(t). Solution: Probability law of {X(t)} is given by P [X(t) = k] = e λt (λt) k k!, k =,, 3,... Then P [X(t) = ] = P [X(t) is even] = P [X(t) = 0] + P [X(t) = ] +... = e λt[ + λt! + (λt)! +... ] = e λt cosh λt. P [X(t) = ] = P [X(t) is odd] = P [X(t) = ] + P [X(t) = 3] +... = e λt[ λt! + (λt)3 3! +... ] = e λt sinh λt. P [X(t ) =, X(t ) = ] = P [X(t ) = /X(t ) = ] P [X(t ) = ] = (e λτ cosh λτ)(e λt cosh λt ), where τ = t t P [X(t ) =, X(t ) = ] = (e λτ cosh λτ)(e λt sinh λt ) P [X(t ) =, X(t ) = ] = (e λτ sinh λτ)(e λt sinh λt ) P [X(t ) =, X(t ) = ] = (e λτ sinh λτ)(e λt cosh λt ). Then P [X(t )X(t ) = ] = e λτ cosh λτ and P [X(t )X(t ) = ] = e λτ sinh λτ Therefore R(t, t ) = e λτ cosh λτ e λτ sinh λτ = e λτ = e λ(t t ). To find ACF of Y (t) = AX(t) E(A) = ()P [X = ] + ( )P [X = ] = = 0 9

30 E(A ) = () P [X = ] + ( ) P [X = ] = + = R Y Y (t, t ) = E[A X(t )X(t )] = E(A )R XX (t, t ) = e λτ = e λ(t t ). 30

Chapter 2. Markov Process and Markov Chain

2.1 Basic Definitions

Definition 2.1.1 (Markov process) A random process {X(t)} is called a Markov process if
P[X(t_n) = a_n | X(t_{n−1}) = a_{n−1}, X(t_{n−2}) = a_{n−2}, ..., X(t_2) = a_2, X(t_1) = a_1] = P[X(t_n) = a_n | X(t_{n−1}) = a_{n−1}]
for all t_1 < t_2 < ... < t_n. In other words, if the future behaviour of a process depends on the present value but not on the past, then the process is called a Markov process.

Example 2.1.2 The probability of rain today depends only on the weather conditions that existed during the last two days and not on earlier weather conditions.

Definition 2.1.3 (Markov chain) If the above condition is satisfied for all n, then the process {X_n; n = 0, 1, 2, ...} is called a Markov chain, and the constants (a_1, a_2, ..., a_n) are called the states of the Markov chain. In other words, a discrete parameter Markov process is called a Markov chain.

Definition 2.1.4 (One-step transition probability) The conditional probability P[X_n = a_j | X_{n−1} = a_i] is called the one-step transition probability from state a_i to state a_j at the n-th step and is denoted by P_ij(n−1, n).

Definition 2.1.5 (Homogeneous Markov chain) If the one-step transition probability does not depend on the step, i.e., P_ij(n−1, n) = P_ij(m−1, m), the Markov chain is called a homogeneous Markov chain.

Definition 2.1.6 (Transition probability matrix) When the Markov chain is homogeneous, the one-step transition probability is denoted by p_ij. The matrix P = (p_ij) is called the transition probability matrix (tpm); it satisfies (i) p_ij ≥ 0 and (ii) Σ_j p_ij = 1 for all i, i.e., the sum of the elements of any row of the tpm is 1.

Definition 2.1.7 (n-step transition probability) The conditional probability that the process is in state a_j at step n, given that it was in state a_i at step

0, i.e., P[X_n = a_j | X_0 = a_i], is called the n-step transition probability and is denoted by P^(n)_ij. That is, P^(n)_ij = P[X_n = a_j | X_0 = a_i].

Chapman-Kolmogorov theorem: If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^(n) is equal to P^n. Thus [P^(n)_ij] = [P_ij]^n.

Definition 2.1.8 (Regular Markov chain) A stochastic matrix P is said to be a regular matrix if all the entries of P^m (for some positive integer m) are positive. A homogeneous Markov chain is said to be regular if its tpm is regular.

Definition 2.1.9 (Steady state distribution) If a homogeneous Markov chain is regular, then every sequence of state probability distributions approaches a unique fixed distribution of the Markov chain. That is, lim_{n→∞} {P^(n)} = π, where P^(n) = (p_1(n), p_2(n), ..., p_k(n)) and π = (π_1, π_2, π_3, ..., π_k). If P is the tpm of the regular Markov chain and π = (π_1, π_2, π_3, ..., π_k) is the steady state distribution, then πP = π and π_1 + π_2 + ... + π_k = 1.

2.2 Classification of states of a Markov chain

Definition 2.2.1 If p^(n)_ij > 0 for some n and for all i and j, then every state can be reached from every other state. When this condition is satisfied, the Markov

chain is said to be irreducible. The tpm of an irreducible chain is an irreducible matrix. Otherwise the chain is said to be non-irreducible or reducible.

Definition 2.2.2 A state i of a Markov chain is called a return state if p^(n)_ii > 0 for some n > 1.

Definition 2.2.3 The period d_i of a return state i is defined as the greatest common divisor of all m such that p^(m)_ii > 0, i.e., d_i = GCD{m : p^(m)_ii > 0}. State i is said to be periodic with period d_i if d_i > 1 and aperiodic if d_i = 1. Obviously state i is aperiodic if p_ii ≠ 0.

Definition 2.2.4 (Recurrence time probability) The probability that the chain returns to state i, starting from state i, for the first time at the n-th step is called the recurrence time probability or the first return time probability. It is denoted by f^(n)_ii. {n, f^(n)_ii}, n = 1, 2, 3, ..., is the distribution of the recurrence time of state i. If F_ii = Σ_{n=1}^{∞} f^(n)_ii = 1, then return to state i is certain, and μ_ii = Σ_{n=1}^{∞} n f^(n)_ii is called the mean recurrence time of state i.

Definition 2.2.5 A state i is said to be persistent or recurrent if the return to state i is certain, i.e., if F_ii = 1.

The state i is said to be transient if the return to state i is uncertain, i.e., F_ii < 1. The state i is said to be non-null persistent if its mean recurrence time μ_ii is finite, and null persistent if μ_ii = ∞. A non-null persistent and aperiodic state is called ergodic.

Remark 2.2.6
1. If a Markov chain is irreducible, all its states are of the same type. They are all transient, all null persistent or all non-null persistent. All its states are either aperiodic or periodic with the same period.
2. If a Markov chain is finite and irreducible, all its states are non-null persistent.

Definition 2.2.7 (Absorbing state) A state i is called an absorbing state if p_ii = 1 and p_ij = 0 for i ≠ j.

Calculation of joint probability (see the sketch below):
P[X_0 = a, X_1 = b, ..., X_{n−2} = i, X_{n−1} = j, X_n = k]
= P[X_n = k | X_{n−1} = j] P[X_{n−1} = j | X_{n−2} = i] ... P[X_1 = b | X_0 = a] P[X_0 = a]
= P_jk P_ij ... P_ab P(X_0 = a)
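The following sketch (an illustration added here, not part of the original notes) shows how the Chapman-Kolmogorov relation P^(n) = P^n and the joint-probability formula above are used in practice; the transition matrix and initial distribution are arbitrary examples.

```python
import numpy as np

# Illustrative sketch: n-step transition probabilities via P^(n) = P^n,
# and the probability of a particular path of the chain.
P = np.array([[0.5, 0.3, 0.2],        # arbitrary example tpm, rows sum to 1
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
p0 = np.array([0.7, 0.2, 0.1])        # arbitrary initial distribution

n = 3
Pn = np.linalg.matrix_power(P, n)     # n-step tpm
print("P^(3) =\n", Pn)
print("distribution after 3 steps:", p0 @ Pn)

# Joint probability of the path X0=0, X1=1, X2=1, X3=2:
path = [0, 1, 1, 2]
prob = p0[path[0]]
for i, j in zip(path[:-1], path[1:]):
    prob *= P[i, j]
print("P[X0=0, X1=1, X2=1, X3=2] =", prob)
```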

36 .3 Examples Example.3. Consider a Markov chain with transition probability matrix P = Find the steady state probabilities of the system Solution : Let the invariant probabilities of P be π = (π, π, π 3 ) By the property of π, πp = π (π π π 3 ) = (π π π 3 ) π + 0.3π + 0.π 3 = π 0.5π 0.3π 0.π 3 = 0 () 0.π + 0.π + 0.3π 3 = π 0.π 0.6π 0.3π 3 = 0 () 0.π + 0.3π + 0.5π 3 = π 3 0.π + 0.3π 0.5π 3 = 0 (3) ()+(3) 0.5 π -0.3 π -0. π 3 =0 which is () 36

37 Since π is the probability distribution π + π + π 3 =, () ()x π +0.3 π +0.3 π 3 =0.3 (5) ()+(5) 0.8π +0. π 3 =0.3 (6) ()+(3) 0.6π -0.7 π 3 =0 (7) (6)x7 5.6π +0.7 π 3 =. (8) (7)+(8) 6.π =. π =. 6. = 0.3 put π =0.3 in (7), π 3 = 0 π 3 = = 0.3 put π = π = 0.3 in () we get, 0.6+ π = π =0. Hence the invariant probabilities of P are (0.3, 0., 0.3). Example.3. At an intersection, a working traffic light will be out of order the next day with probability 0.07, and out of order traffic light will be working the next day with probability Let X n = if a day n the traffic will work; X n = 0 if on day n the traffic light will not work.is {X n ; n = 0,,...} a Markov chain?. If so, write the transition probability matrix. Solution: Yes, {X n ; n = 0,,...} is a Markov chain with state space {0, } Transition probability matrix Example.3.3 Let {X n } be a Markov chain with state space {0,, } with ini- 37

38 tial probability vector P (0) = (0.7, 0., 0.) and the one step transition probability matrix P = Compute P (X = 3) and P (X 3 =, X = 3, X = 3, X 0 = ). Solution: P () = P = P = (i) P [X = 3] = 3 i= P [X = 3/X 0 = i]p [X 0 = i] 38

39 given P (0) = (0.7, 0., 0.) This gives P [X 0 = ] = 0.7 ; P [X 0 = ] = 0. and P [X 0 = 3] = 0. Therefore P [X = 3] = P () 3 P (X 0 = ) + P () 3 (X 0 = ) + P () 33 P (X 0 = 3) = = = 0.79 (ii) P {X = 3 0 = } = P 3 = 0. () P {X = 3/X 0 = } = P {X = 3/X 0 = } P {X 0 = } = = 0.0 (by()) () P {X = 3, X = 3, X 0 = } = P {X = 3/X = 3, X 0 = } P {X = 3, X 0 = } = P {X = 3/X = 3} P {X = 3, X 0 = } = = 0.0 (by()) (by Markov property) P {X 3 =, X = 3, X = 3, X 0 = } = P {X 3 = /X = 3, X = 3, X 0 = } P{X = 3, X = 3, X 0 = } = P {X 3 = /X = 3} P {X = 3, X = 3, X 0 = } = = Example.3. A fair dice is tossed repeatedly. If X n denotes the maximum of the number occuring in the first n tosses,find the transition probability matrix P of the Markov chain {X n }. Find also P and P (X = 6). Solution: State space {,, 3,, 5, 6}. Let X n = the maximum of the numbers 39

40 occuring in the first n trails = 3(say) X n+ =3 if the (n+)th trail results in, or 3 = if the (n+)th trail results in =5 if the (n+)th trail results in 5 =6 if the (n+)th trail results in 6 Therefore P {X n+ = 3/X n = 3} = = 3 6 P {X n+ = 3/X n = 3} = 6 when i=,5,6 The transition probability matrix of the chain is P =

41 P = Initial state probability distribution is P (0) = (,,,,, ) since all the values ,, 3..., 6 are equally likely. P {X = 6} = 6 i= P {X = 6/X 0 = i} x P {X 0 = i} = 6 6 i= P i6 = 6 36 = 9 6 ( ) Example.3.5 Three girls G, G, G 3 are throwing ball to each other G always throws the ball to G and G always throws the ball to G 3, but G 3 is just

42 as likely to throws the ball to G as to G. Prove that the process is Markovian.Find the transition matrix and classify the states. Solution: The transition probability matrix of the process {X n } is given by States of X n depend only on states of X n, but not on states of X n, X n 3... or earlier states. Therefore {X n } is a Markov chain. Now P = , P 3 = 0 0 P (3) > 0, P () 3 > 0, P () > 0, P () > 0, P () 33 > 0 and all other P () ij > 0. Therefore the chain is irreducible.

43 P = 0, P 5 = and P 6 = and so on. P () ii, P (3) ii, P (5) ii, P (6) ii etc are > 0 for i =,, 3 and the GCD of, 3, 5, 6... = Therefore the state (state G ) is periodic with period (aperiodic). Example.3.6 Find the nature of the states of the Markov chain with the tpm P = Solution : 3

44 P = = P 3 = P.P = = = P P = P.P = = = P and so on. In general, P n = P, P n+ = P Also, P () 00 > 0, P () 0 > 0, P () 0 > 0

45 P () 0 > 0, P () > 0, P () > 0 P () 0 > 0, P () > 0, P () > 0. Therefore, the Markov chain is irreducible. Also P () ii = P () ii = P (6) ii... > 0, for all i, all the states of the chain are periodic, with period. Since the chain is finite and irreducible, all its states are non-null persistent. All states are not ergodic. Example.3.7 A gambler has Rs. He bets Rs. at a time and wins Rs with probability. He stops playing if he loses Rs or wins Rs. (a) What is the tpm of the related Markov chain? (b) What is the probability that he lost his money at the end of 5 plays? (c)what is the probability that the game lasts more than 7 plays? Solution: Let X n represent the amount with the player at the end of the n th round of the play. State space of {X n } = (0,,, 3,, 5, 6), as the game ends, if the player loses all the money (X n = 0) or wins Rs. that is has Rs.6 (X n = 6). 5

46 (a)the tpm of the Markov chain is P = Since the player has got Rs. initially the initial probability distribution of {X n } is 6

47 P (0) = =

48 P = P P = = similiary, P = P 3) P = P = P 3 P = P (5) = P () P =

49 P (6) = P (5) P = P (7) = P (6) P = (b) P{the man has lost money at the end of 5 plays} = P {X 5 = 0} 8 =the entry corresponding to state 0 in P (5) = 3 8 (c) P{the game lasts more than 7 days} =P{the system is neither in state 0 nor in 6 at the end of the seventh round} = P {X 7,,, 3, or5} = = = 5 8 = 7 6. Example.3.8 On a given day, a retired professor Dr.Charles Fish, amuses himself with only one of the following activities. Reading(activity ), gardening(activitity )or working on his book about a river vally(activity 3). For i 3, let X n = i if Dr. Fish devotes day n to activity i. Suppose that {X n : n =,, 3...} is a Markov chain and depending on which of these ac- 9

50 tivities on the next day is given by the tpm P = Solution: Let π, π and π 3 be the proportion of days Dr.Fish devotes to reading, gardening and writing respectively. Then,( π π π 3 )P = (π π π 3 ) ( π π π 3 ) = (π π π 3 ) π + 0.0π + 0.5π 3 = π... () 0.5π + 0.0π + 0.0π 3 = π... () 0.5π π π 3 = π 3... (3) π + π + π 3 =... () from (), we have π 3 = π π... (5) substituting π 3 in () we get, 50

51 0.30π + 0.0π + 0.5( π π ) = π 0.30π + 0.0π π 0.5π = π 0.05π + 0.5π π = π + 0.5π = (6) subtituting π 3 in () we get, 0.5π + 0.0π + 0.0( π π ) = π 0.5π + 0.0π π 0.0π = π 0.5π.30π = (7) (6) 6 + (7) 3 [ ]π = [ ] π = π =-7.70 π = substituting π in (6) we get, -0.95(0.306)+ 0.5 π = π =0.0 π =0.67 π 3 = π π = π 3 =0.7 Thererfore Dr.Charles devotes approximately 30 percentage of the days to read- 5

52 ing, 7 percentage of the days to gardening and 3 percentage of the days to writing. Example.3.9 The tpm of a Markov chain with three states 0,, is P = and the initial state distribution of the chain is P [X 0 = i] = 3, i = 0,,. Find (i) P [X = ] (ii) P [X 3 =, X =, X =, X 0 = ] (iii). P [X =, X 0 = 0]. Solution: Given P =

53 P () = = From the definition of conditional probability, (i) P [X = ] = i=0 P [X = /X 0 = i]p [X 0 = i] = P [X = /X 0 = 0]P [X 0 = 0] + P [X = /X 0 = ]P [X 0 = ] + P [X = /X 0 = ]P [X 0 = ] P = P [X = ] = P 0P [X 0 = 0] + P P [X 0 = ] + P P [X 0 = ] = 3 [ ] = 6 53

54 (ii) P = P [X 3 =.X =, X =, X 0 = ] = P [X 3 = /X = ] P [X = /X = ]P [X = /X 0 = ]P [X 0 = ] = P () P () P () P [X 0 = ] = 3 3 = From P,we get P [X = /X 0 = 0] = P () 0 = 5 6 P [X = ; X 0 = 0] = P [ X = /X 0 = 0]P [X 0 = 0] = 5 = Example.3.0 There are white balls in bag A and 3 red balls in bag B. At each step of the process,a ball is selected from each bag and the balls selected are interchanged. Let the state a i of the system be the number of red ball in A after i change. What is the probability that there are red balls in A after 3 steps? In the long run, what is the probability that there are red balls in bag A? Solution: State space of the chain {X n } = (0,, ), since the number of balls in the bag A is always. Let the transition probability matrix of the chain {X n } 5

55 P 00 P 0 P 0 be P = P 0 P P P 0 P P P 00 = 0 [the state 0, interchange of balls] P 0 = P 0 = 0 (After the process of interchanging,the number of red balls in bag cannot increase or decrease by ) A 0 red balls(before interchange) A red balls(after interchange) Let X n =, that is A contains red ball(and white ball) and B contains white and red balls. P {X n+ = 0/X n = } = P 0 = = 3 6 P = = 3 3 Since P is a stochastic matrix, P 0 + P + P = P = P = 3 and P = (P 0 + P ) = 3 55

56 Therefore P = Now P (0) = (, 0, 0) as there is no red ball in A in the beginning. P () = P (0) P = (0,, 0) P () = P () P = ( 6,, 3 ) P (3) = P () P = (, 3 6, 5 8 ) P { there are red balls in bag A after 3 steps } = P {X 3 = } = P (3) = 5 8. Let the stationary probability distribution of the chain be π = (π 0, π, π ). By the property of π, we have πp = π and π 0 + π + π = π0 π π = π0 π π π 6 = π 0 π 0 + π + π 3 = π 56

57 π 3 + π 3 = π Therefore π = 6π ;6 π +3 π + π 3 =6 π and π = π 3 Therefore 3 π = π 3 ; π = π 3 ; π + π + π 3 = π π 3 + π 3 = 0π 3 = 3 Therefore π 3 = 3, π 0 = 6 and π 0 =. Therefore the steady state probability 0 distribution is π = (, 6, 3 ). Thus in the long run, 30 percentage of the time, there will be two red marbles in urn A.. Exercise Two marks questions. Define Markov chain and one-step transition probability.. What is meant by steady-state distribution of Markov chain. 3. Define Markov process and example.. What is stochastic matrix? when it is said to be regular? 5. Define irreducible Markov chain and state Chapman-Kolmogorov theorem. 6. Find the invariant probabilities for the Markov chain [X n ; n ] with state 57

58 0 space S = [0, ] and one-step TPM P =. 7. At an intersection, a working traffic light will be out of order the next day with probability 0.07, and out of order traffic light will be working the next day with probability Let X n = if a day n the traffic will work; X n =0 if on day n the traffic light will not work. Is {X n ; n = 0,,...} a Markov chain?. If so, write the transition probability matrix. 8. The tpm ofa Markov chain with three states 0,, is P = and the initial state distribution of the chain is P [X 0 = i] =, 3 i=0,,. Find P [X 3 =, X =, X =, X 0 = ]. 9. Define recurrent state, absorbing state and transient state of a Markov chain. 0. Define regular and ergodic Markov chains. 58

Choose the Correct Answer

(1) All regular Markov chains are (a) ergodic (b) stationary (c) WSS (d) none. Ans: (a)
(2) If a Markov chain is finite and irreducible, all its states are (a) transient (b) null persistent (c) non-null persistent (d) return states. Ans: (c)
(3) If d_i = 1, then state i is said to be (a) periodic (b) a return state (c) recurrent (d) aperiodic. Ans: (d)
(4) A non-null persistent and aperiodic state is (a) regular (b) ergodic (c) transient (d) of mean recurrence time. Ans: (b)
(5) The sum of the elements of any row in the transition probability matrix of a finite state Markov chain is (a) 0 (b) 1 (c) 2 (d) 3. Ans: (b)
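As a computational companion to the steady-state examples of this chapter (added here, not part of the original notes), the sketch below solves πP = π together with Σ π_i = 1 numerically. The transition matrix used is the ball-interchange chain from the last worked example of Section 2.3 (two white balls in bag A, three red balls in bag B), with entries reconstructed from the text; any regular tpm can be substituted.

```python
import numpy as np

# Steady-state distribution of a finite regular Markov chain:
# solve pi P = pi together with sum(pi) = 1.
# tpm reconstructed from the ball-interchange example; states = red balls in A.
P = np.array([[0.0, 1.0, 0.0],
              [1/6, 1/2, 1/3],
              [0.0, 2/3, 1/3]])

k = P.shape[0]
A = np.vstack([P.T - np.eye(k), np.ones(k)])   # (P^T - I) pi = 0 and sum(pi) = 1
b = np.concatenate([np.zeros(k), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("steady state pi =", pi)                 # approx (0.1, 0.6, 0.3)
print("check pi @ P    =", pi @ P)
```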

Chapter 3. Poisson Process

3.1 Basic Definitions

Definition 3.1.1 If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called a Poisson process, provided the following postulates are satisfied:
(i) P[1 occurrence in (t, t + Δt)] = λ Δt
(ii) P[0 occurrences in (t, t + Δt)] = 1 − λ Δt
(iii) P[2 or more occurrences in (t, t + Δt)] = 0
(iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after the interval (0, t)
(v) The probability that the event occurs a specified number of times in (t_0, t_0 + t)

depends only on t, but not on t_0.

Example 3.1.2 (i) The arrival of a customer at a bank. (ii) The occurrence of a lightning strike within some prescribed area. (iii) The failure of some component in a system. (iv) The emission of an electron from the surface of a light-sensitive material.

Probability law for the Poisson process {X(t)}

Let λ be the mean number of occurrences of the event in unit time. Let P_n(t) = P[X(t) = n]. Then
P_n(t + Δt) = P[X(t + Δt) = n]
= P[(n − 1) calls in (0, t) and 1 call in (t, t + Δt)] + P[n calls in (0, t) and no call in (t, t + Δt)].
Therefore P_n(t + Δt) = P_{n−1}(t) λΔt + P_n(t)(1 − λΔt), so
P_n(t + Δt) − P_n(t) = P_{n−1}(t) λΔt − λ P_n(t) Δt = λΔt [P_{n−1}(t) − P_n(t)].
Therefore [P_n(t + Δt) − P_n(t)] / Δt = λ [P_{n−1}(t) − P_n(t)].
Taking the limit as Δt → 0,
P'_n(t) = λ [P_{n−1}(t) − P_n(t)]    (1)
Let the solution of equation (1) be
P_n(t) = ((λt)^n / n!) f(t)    (2)
Differentiating (2) with respect to t,
P'_n(t) = (λ^n / n!) [n t^{n−1} f(t) + t^n f'(t)]    (3)
Using (2) and (3) in (1),

we get
(λ^n / n!) [n t^{n−1} f(t) + t^n f'(t)] = λ (λt)^{n−1} f(t) / (n − 1)! − λ (λt)^n f(t) / n!.
Therefore ((λt)^n / n!) f'(t) = −λ ((λt)^n / n!) f(t), i.e., f'(t) = −λ f(t).
Therefore f'(t) / f(t) = −λ. Integrating, log f(t) = −λt + log k, so f(t) = k e^{−λt}.
From (2), P_0(t) = f(t), i.e., f(0) = P_0(0) = P[X(0) = 0] = P[no event occurs in (0, 0)] = 1. But f(0) = k, therefore k = 1. Hence f(t) = e^{−λt}.
Therefore P[X(t) = n] = P_n(t) = e^{−λt} (λt)^n / n!, n = 0, 1, 2, ...
This is the probability law for the Poisson process. It is to be observed that the probability distribution of X(t) is the Poisson distribution with parameter λt.

Mean of the Poisson process:
Mean = E[X(t)] = Σ_{n=0}^{∞} n P_n(t) = Σ_{n=0}^{∞} n e^{−λt} (λt)^n / n!
= e^{−λt} Σ_{n=1}^{∞} (λt)^n / (n − 1)!
= e^{−λt} [λt + λ²t² + λ³t³/2! + ...]
= λt e^{−λt} [1 + λt/1! + λ²t²/2! + ...]
= λt e^{−λt} e^{λt} = λt.
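The probability law just derived can be checked by simulation. The following sketch (an illustration, not from the original notes) generates Poisson-process sample paths from exponential inter-arrival times (see Property 4 in Section 3.2) and verifies that the mean and variance of X(t) are both close to λt; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate counts X(t_end) of a Poisson process with rate lam by accumulating
# exponential inter-arrival times; compare mean and variance with lam * t_end.
lam, t_end, n_paths = 2.0, 5.0, 20_000

counts = np.empty(n_paths, dtype=int)
for i in range(n_paths):
    arrivals, s = 0, 0.0
    while True:
        s += rng.exponential(1.0 / lam)   # inter-arrival time ~ Exp(lam)
        if s > t_end:
            break
        arrivals += 1
    counts[i] = arrivals

print("E[X(t)] estimate  :", counts.mean(), "  theory:", lam * t_end)
print("Var[X(t)] estimate:", counts.var(), "  theory:", lam * t_end)
```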

Variance of the Poisson process:
Variance = E[X²(t)] − [E[X(t)]]²    (1)
E[X²(t)] = Σ_{n=0}^{∞} n² P_n(t) = Σ_{n=0}^{∞} n² e^{−λt} (λt)^n / n!
Now n² = n(n − 1) + n. Hence
E[X²(t)] = Σ_{n=0}^{∞} [n(n − 1) + n] e^{−λt} (λt)^n / n!
= Σ_{n=2}^{∞} n(n − 1) e^{−λt} (λt)^n / n! + Σ_{n=0}^{∞} n e^{−λt} (λt)^n / n!
= e^{−λt} (λt)² Σ_{n=2}^{∞} (λt)^{n−2} / (n − 2)! + λt
= e^{−λt} λ²t² e^{λt} + λt = λ²t² + λt.
Hence Var[X(t)] = λ²t² + λt − (λt)² = λt.

Auto-correlation of the Poisson process:
R_XX(t_1, t_2) = E[X(t_1) X(t_2)], with t_1 ≤ t_2
= E[X(t_1){X(t_2) − X(t_1) + X(t_1)}]
= E[X(t_1)] E[X(t_2) − X(t_1)] + E[X²(t_1)]
(since {X(t)} is a process of independent increments)
= λt_1 · λ(t_2 − t_1) + λ²t_1² + λt_1 = λ²t_1 t_2 + λt_1, if t_2 ≥ t_1.
Hence R_XX(t_1, t_2) = λ² t_1 t_2 + λ min(t_1, t_2).

Auto-covariance of the Poisson process:
C(t_1, t_2) = R(t_1, t_2) − E[X(t_1)] E[X(t_2)] = λ²t_1 t_2 + λt_1 − λ²t_1 t_2 = λt_1, if t_1 ≤ t_2.
Therefore C(t_1, t_2) = λ min{t_1, t_2}.

3.2 Properties of the Poisson Process

Property 1: The Poisson process is a Markov process.
Proof: Consider
P[X(t_3) = n_3 | X(t_2) = n_2, X(t_1) = n_1]
= P[X(t_1) = n_1, X(t_2) = n_2, X(t_3) = n_3] / P[X(t_1) = n_1, X(t_2) = n_2]
= e^{−λ(t_3 − t_2)} λ^{n_3 − n_2} (t_3 − t_2)^{n_3 − n_2} / (n_3 − n_2)!
= P[X(t_3) = n_3 | X(t_2) = n_2].
This means that the conditional probability distribution of X(t_3), given all the past values X(t_1) = n_1, X(t_2) = n_2, depends only on the most recent value X(t_2) = n_2. Therefore the Poisson process possesses the Markov property. Hence the Poisson process is a Markov process.

Property 2: The sum of two independent Poisson processes is a Poisson process.
Proof: Let X(t) = X_1(t) + X_2(t), where X_1 and X_2 are independent Poisson processes with parameters λ_1 and λ_2. Then
P[X(t) = n] = Σ_{k=0}^{n} P[X_1(t) = k] P[X_2(t) = n − k]
= Σ_{k=0}^{n} [e^{−λ_1 t} (λ_1 t)^k / k!] [e^{−λ_2 t} (λ_2 t)^{n−k} / (n − k)!]

= (e^{−(λ_1 + λ_2)t} / n!) Σ_{k=0}^{n} [n! / (k! (n − k)!)] (λ_1 t)^k (λ_2 t)^{n−k}
so P[X(t) = n] = e^{−(λ_1 + λ_2)t} (λ_1 t + λ_2 t)^n / n!, n = 0, 1, 2, ...
Therefore X(t) = X_1(t) + X_2(t) is a Poisson process with parameter (λ_1 + λ_2)t. Hence the sum of two independent Poisson processes is also a Poisson process.

Property 3: The difference of two independent Poisson processes is not a Poisson process.
Proof: Let X(t) = X_1(t) − X_2(t). Then E[X(t)] = λ_1 t − λ_2 t = (λ_1 − λ_2)t.
E[X²(t)] = E[X_1²(t) − 2 X_1(t) X_2(t) + X_2²(t)] = E[X_1²(t)] − 2 E[X_1(t)] E[X_2(t)] + E[X_2²(t)]
(since X_1(t) and X_2(t) are independent)
= λ_1 t + λ_1²t² + λ_2 t + λ_2²t² − 2 λ_1 λ_2 t²
= (λ_1 + λ_2)t + [(λ_1 − λ_2)t]²    (1)
We know that E[X²(t)] for a Poisson process with parameter λt is given by E[X²(t)] = λt + λ²t². If X(t) = X_1(t) − X_2(t) were a Poisson process, we would therefore have E[X²(t)] = (λ_1 − λ_2)t + (λ_1 − λ_2)²t². Expression (1) shows that this is not the case, so X_1(t) − X_2(t) is not a Poisson process.
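A quick simulation (added here, not part of the original notes) illustrates Properties 2 and 3: the sum of two independent Poisson counts behaves like a Poisson count with parameter (λ_1 + λ_2)t, while the difference has unequal mean and variance and therefore cannot be Poisson; the rates and time horizon are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)

# Counts of two independent Poisson processes at time t are Poisson(lam1*t)
# and Poisson(lam2*t); their sum should behave like Poisson((lam1+lam2)*t),
# their difference should not (mean and variance differ).
lam1, lam2, t = 1.0, 2.5, 4.0
n = 200_000

x1 = rng.poisson(lam1 * t, size=n)
x2 = rng.poisson(lam2 * t, size=n)

s, d = x1 + x2, x1 - x2
print("sum : mean =", s.mean(), " var =", s.var(), " (both ~", (lam1 + lam2) * t, ")")
print("diff: mean =", d.mean(), " var =", d.var(),
      " (mean ~", (lam1 - lam2) * t, ", var ~", (lam1 + lam2) * t, "-> not Poisson)")
```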

Property 4: The inter-arrival time of a Poisson process, i.e., the interval between two successive occurrences of a Poisson process with parameter λ, has an exponential distribution with mean 1/λ, i.e., with parameter λ.
Proof: Let two consecutive occurrences of the event be E_i and E_{i+1}. Let E_i take place at time instant t_i and let T be the interval between the occurrences of E_i and E_{i+1}. T is a continuous random variable.
P[T > t] = P{E_{i+1} did not occur in (t_i, t_i + t)} = P[no event occurs in an interval of length t] = P[X(t) = 0] = e^{−λt}.
Therefore the cumulative distribution function of T is given by F(t) = P[T ≤ t] = 1 − P[T > t] = 1 − e^{−λt}.
Therefore the p.d.f. of T is the exponential distribution with parameter λ, given by f(t) = λe^{−λt} (t ≥ 0); i.e., T has an exponential distribution with mean 1/λ.

Property 5: If the number of occurrences of an event E in an interval of length t is a Poisson process {X(t)} with parameter λ, and if each occurrence of E has a constant probability p of being recorded and the recordings are independent of each other, then the number N(t) of recorded occurrences in t is also a

67 Poisson process with parameter λp. Hence P [N(t) = n] = e λpt (λpt) n n!, n = 0,,, Example Example 3.3. If {N (t)} and {N (t)} are two independent Poisson process with parameters λ and λ respectively, show that P [N (t) = k/n (t) + N (t) = n] nc r p k q n k where p = λ λ +λ and λ λ +λ. Solution: By definition, P [N (t) = r] = e λ t (λ t) r r!, r = 0,,... and P [N (t) = r] = e λ t (λ t) r r!, r = 0,,.... P [N (t) = k/n (t) + N (t) = n] = P [N (t)=k N (t)+n (t)=n] P [N (t)+n (t)=n] = P [N (t)=k N (t)=n N (t)] P [N (t)+n (t)=n] = P [N (t)=k N (t)=n k] P [N (t)+n (t)=n] = P [N (t)=k]p [N (t)=n k] P [N (t)+n (t)=n] = n!e λ t (λ t) k e λ t (tλ ) n k k!(n k)!e (λ +λ )t [(λ +λ )t] n λ = nc k λn k k (λ +λ ) n = nc k ( λ λ +λ ) k ( λ λ +λ ) n k Taking p = λ λ +λ, q = p = λ λ +λ, we have P [N (t) = k/n (t) + N (t) = n] = nc k p k q n k. Example 3.3. A radio active source emits particles at a rate 6 per minute in 67

68 accordance with Poisson process. Each particle emitted has a probability 3 of being recorded. Find the probability that at least 5 particles are recorded in a 5 minute period. Solution: Let N(t)be the number of recorded particles. Then {N(t)} is a Poisson process with λp as parameter. Now, λp = 6( 3 ) = P [N(t) = n] = e t (t) n n!, n = 0,,... Therefore P[at least 5 particles are recorded in a 5 minute period]= P [X(5) 5] = P [X(5) < 5] =- {P [X(5) = 0] + P [X(5) = ] + P [X(5) = ] + P [X(5) = 3] + P [X(5) = ]} = e 0 [ ! ! + 0! ] = Example If customers arrive at a counter in accordance with a Poisson process with a rate of 3 per minute, find the probability that the interval between consecutive arrivals is. more than minute. between minute and minutes 3. minutes or less Solution: By property of the Poisson process, we have the interval T between 68

69 consecutive arrivals follows an exponential distribution with parameter λ = 3. The pdf of the exponential distribution = λe λx.. P (T > ) = 3 e 3t dt = 3 [ ] e 3t 3 = e 3. P ( < T < ) = 3e 3t dt = 3 [ ] e 3t = 3 e 3 e 6 3. P (T ) = 3 0 e 3t dt = 3 [ ] e 3t 3 0 = [e e 0 ] = e. Example 3.3. A machine goes out of order, whenever a component fails. The failure of this part follows a Poisson process with a mean rate of per week. Find the probability that weeks have elapsed since last failure. If there are 5 spare parts of this component in an inventory and that the next supply is not due in 0 weeks, find the probability that the machine will not be out of order in the next 0 weeks. Solution:Here the unit time t = week. Mean failure rate = mean number of failures in the week i.e., λ = 69


More information

Stochastic Processes

Stochastic Processes Elements of Lecture II Hamid R. Rabiee with thanks to Ali Jalali Overview Reading Assignment Chapter 9 of textbook Further Resources MIT Open Course Ware S. Karlin and H. M. Taylor, A First Course in Stochastic

More information

STOCHASTIC PROCESSES Basic notions

STOCHASTIC PROCESSES Basic notions J. Virtamo 38.3143 Queueing Theory / Stochastic processes 1 STOCHASTIC PROCESSES Basic notions Often the systems we consider evolve in time and we are interested in their dynamic behaviour, usually involving

More information

Chapter 16 focused on decision making in the face of uncertainty about one future

Chapter 16 focused on decision making in the face of uncertainty about one future 9 C H A P T E R Markov Chains Chapter 6 focused on decision making in the face of uncertainty about one future event (learning the true state of nature). However, some decisions need to take into account

More information

MATH HOMEWORK PROBLEMS D. MCCLENDON

MATH HOMEWORK PROBLEMS D. MCCLENDON MATH 46- HOMEWORK PROBLEMS D. MCCLENDON. Consider a Markov chain with state space S = {0, }, where p = P (0, ) and q = P (, 0); compute the following in terms of p and q: (a) P (X 2 = 0 X = ) (b) P (X

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

PROBABILITY AND RANDOM PROCESSESS

PROBABILITY AND RANDOM PROCESSESS PROBABILITY AND RANDOM PROCESSESS SOLUTIONS TO UNIVERSITY QUESTION PAPER YEAR : JUNE 2014 CODE NO : 6074 /M PREPARED BY: D.B.V.RAVISANKAR ASSOCIATE PROFESSOR IT DEPARTMENT MVSR ENGINEERING COLLEGE, NADERGUL

More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes? IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2006, Professor Whitt SOLUTIONS to Final Exam Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm Open Book: but only

More information

Some Definition and Example of Markov Chain

Some Definition and Example of Markov Chain Some Definition and Example of Markov Chain Bowen Dai The Ohio State University April 5 th 2016 Introduction Definition and Notation Simple example of Markov Chain Aim Have some taste of Markov Chain and

More information

UCSD ECE 153 Handout #46 Prof. Young-Han Kim Thursday, June 5, Solutions to Homework Set #8 (Prepared by TA Fatemeh Arbabjolfaei)

UCSD ECE 153 Handout #46 Prof. Young-Han Kim Thursday, June 5, Solutions to Homework Set #8 (Prepared by TA Fatemeh Arbabjolfaei) UCSD ECE 53 Handout #46 Prof. Young-Han Kim Thursday, June 5, 04 Solutions to Homework Set #8 (Prepared by TA Fatemeh Arbabjolfaei). Discrete-time Wiener process. Let Z n, n 0 be a discrete time white

More information

Problem Sheet 1 Examples of Random Processes

Problem Sheet 1 Examples of Random Processes RANDOM'PROCESSES'AND'TIME'SERIES'ANALYSIS.'PART'II:'RANDOM'PROCESSES' '''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''''Problem'Sheets' Problem Sheet 1 Examples of Random Processes 1. Give

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains

8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8. Statistical Equilibrium and Classification of States: Discrete Time Markov Chains 8.1 Review 8.2 Statistical Equilibrium 8.3 Two-State Markov Chain 8.4 Existence of P ( ) 8.5 Classification of States

More information

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno

Stochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.

More information

Markov Chains. Chapter 16. Markov Chains - 1

Markov Chains. Chapter 16. Markov Chains - 1 Markov Chains Chapter 16 Markov Chains - 1 Why Study Markov Chains? Decision Analysis focuses on decision making in the face of uncertainty about one future event. However, many decisions need to consider

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes

Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Fundamentals of Digital Commun. Ch. 4: Random Variables and Random Processes Klaus Witrisal witrisal@tugraz.at Signal Processing and Speech Communication Laboratory www.spsc.tugraz.at Graz University of

More information

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory Chapter 1 Statistical Reasoning Why statistics? Uncertainty of nature (weather, earth movement, etc. ) Uncertainty in observation/sampling/measurement Variability of human operation/error imperfection

More information

Stochastic Processes. Monday, November 14, 11

Stochastic Processes. Monday, November 14, 11 Stochastic Processes 1 Definition and Classification X(, t): stochastic process: X : T! R (, t) X(, t) where is a sample space and T is time. {X(, t) is a family of r.v. defined on {, A, P and indexed

More information

STAT STOCHASTIC PROCESSES. Contents

STAT STOCHASTIC PROCESSES. Contents STAT 3911 - STOCHASTIC PROCESSES ANDREW TULLOCH Contents 1. Stochastic Processes 2 2. Classification of states 2 3. Limit theorems for Markov chains 4 4. First step analysis 5 5. Branching processes 5

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

Markov Processes Hamid R. Rabiee

Markov Processes Hamid R. Rabiee Markov Processes Hamid R. Rabiee Overview Markov Property Markov Chains Definition Stationary Property Paths in Markov Chains Classification of States Steady States in MCs. 2 Markov Property A discrete

More information

Part I Stochastic variables and Markov chains

Part I Stochastic variables and Markov chains Part I Stochastic variables and Markov chains Random variables describe the behaviour of a phenomenon independent of any specific sample space Distribution function (cdf, cumulative distribution function)

More information

The distribution inherited by Y is called the Cauchy distribution. Using that. d dy ln(1 + y2 ) = 1 arctan(y)

The distribution inherited by Y is called the Cauchy distribution. Using that. d dy ln(1 + y2 ) = 1 arctan(y) Stochastic Processes - MM3 - Solutions MM3 - Review Exercise Let X N (0, ), i.e. X is a standard Gaussian/normal random variable, and denote by f X the pdf of X. Consider also a continuous random variable

More information

CDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes

CDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes CDA6530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic process X = {X(t), t2 T} is a collection of random variables (rvs); one rv

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 19 28 March 2012 We have been studying stochastic processes; i.e., systems whose time evolution has an element

More information

3F1 Random Processes Examples Paper (for all 6 lectures)

3F1 Random Processes Examples Paper (for all 6 lectures) 3F Random Processes Examples Paper (for all 6 lectures). Three factories make the same electrical component. Factory A supplies half of the total number of components to the central depot, while factories

More information

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities

More information

Chapter 6. Random Processes

Chapter 6. Random Processes Chapter 6 Random Processes Random Process A random process is a time-varying function that assigns the outcome of a random experiment to each time instant: X(t). For a fixed (sample path): a random process

More information

The exponential distribution and the Poisson process

The exponential distribution and the Poisson process The exponential distribution and the Poisson process 1-1 Exponential Distribution: Basic Facts PDF f(t) = { λe λt, t 0 0, t < 0 CDF Pr{T t) = 0 t λe λu du = 1 e λt (t 0) Mean E[T] = 1 λ Variance Var[T]

More information

Figure 10.1: Recording when the event E occurs

Figure 10.1: Recording when the event E occurs 10 Poisson Processes Let T R be an interval. A family of random variables {X(t) ; t T} is called a continuous time stochastic process. We often consider T = [0, 1] and T = [0, ). As X(t) is a random variable

More information

Probability and Statistics

Probability and Statistics Probability and Statistics 1 Contents some stochastic processes Stationary Stochastic Processes 2 4. Some Stochastic Processes 4.1 Bernoulli process 4.2 Binomial process 4.3 Sine wave process 4.4 Random-telegraph

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition

More information

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes

EAS 305 Random Processes Viewgraph 1 of 10. Random Processes EAS 305 Random Processes Viewgraph 1 of 10 Definitions: Random Processes A random process is a family of random variables indexed by a parameter t T, where T is called the index set λ i Experiment outcome

More information

LTCC. Exercises. (1) Two possible weather conditions on any day: {rainy, sunny} (2) Tomorrow s weather depends only on today s weather

LTCC. Exercises. (1) Two possible weather conditions on any day: {rainy, sunny} (2) Tomorrow s weather depends only on today s weather 1. Markov chain LTCC. Exercises Let X 0, X 1, X 2,... be a Markov chain with state space {1, 2, 3, 4} and transition matrix 1/2 1/2 0 0 P = 0 1/2 1/3 1/6. 0 0 0 1 (a) What happens if the chain starts in

More information

Lecture - 30 Stationary Processes

Lecture - 30 Stationary Processes Probability and Random Variables Prof. M. Chakraborty Department of Electronics and Electrical Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 30 Stationary Processes So,

More information

Probability and Statistics Concepts

Probability and Statistics Concepts University of Central Florida Computer Science Division COT 5611 - Operating Systems. Spring 014 - dcm Probability and Statistics Concepts Random Variable: a rule that assigns a numerical value to each

More information

CDA5530: Performance Models of Computers and Networks. Chapter 3: Review of Practical

CDA5530: Performance Models of Computers and Networks. Chapter 3: Review of Practical CDA5530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic ti process X = {X(t), t T} is a collection of random variables (rvs); one

More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

TMA4265 Stochastic processes ST2101 Stochastic simulation and modelling

TMA4265 Stochastic processes ST2101 Stochastic simulation and modelling Norwegian University of Science and Technology Department of Mathematical Sciences Page of 7 English Contact during examination: Øyvind Bakke Telephone: 73 9 8 26, 99 4 673 TMA426 Stochastic processes

More information

MAS275 Probability Modelling Exercises

MAS275 Probability Modelling Exercises MAS75 Probability Modelling Exercises Note: these questions are intended to be of variable difficulty. In particular: Questions or part questions labelled (*) are intended to be a bit more challenging.

More information

Random Processes. DS GA 1002 Probability and Statistics for Data Science.

Random Processes. DS GA 1002 Probability and Statistics for Data Science. Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Modeling quantities that evolve in time (or space)

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Discrete time Markov chains. Discrete Time Markov Chains, Limiting. Limiting Distribution and Classification. Regular Transition Probability Matrices

Discrete time Markov chains. Discrete Time Markov Chains, Limiting. Limiting Distribution and Classification. Regular Transition Probability Matrices Discrete time Markov chains Discrete Time Markov Chains, Limiting Distribution and Classification DTU Informatics 02407 Stochastic Processes 3, September 9 207 Today: Discrete time Markov chains - invariant

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl. E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Chapter 5 Random Variables and Processes

Chapter 5 Random Variables and Processes Chapter 5 Random Variables and Processes Wireless Information Transmission System Lab. Institute of Communications Engineering National Sun Yat-sen University Table of Contents 5.1 Introduction 5. Probability

More information

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10. x n+1 = f(x n ),

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10. x n+1 = f(x n ), MS&E 321 Spring 12-13 Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10 Section 4: Steady-State Theory Contents 4.1 The Concept of Stochastic Equilibrium.......................... 1 4.2

More information

FINAL EXAM: 3:30-5:30pm

FINAL EXAM: 3:30-5:30pm ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.

More information

MARKOV PROCESSES. Valerio Di Valerio

MARKOV PROCESSES. Valerio Di Valerio MARKOV PROCESSES Valerio Di Valerio Stochastic Process Definition: a stochastic process is a collection of random variables {X(t)} indexed by time t T Each X(t) X is a random variable that satisfy some

More information

STAT 516 Midterm Exam 2 Friday, March 7, 2008

STAT 516 Midterm Exam 2 Friday, March 7, 2008 STAT 516 Midterm Exam 2 Friday, March 7, 2008 Name Purdue student ID (10 digits) 1. The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

Chapter 6 - Random Processes

Chapter 6 - Random Processes EE385 Class Notes //04 John Stensby Chapter 6 - Random Processes Recall that a random variable X is a mapping between the sample space S and the extended real line R +. That is, X : S R +. A random process

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

MATH/STAT 3360, Probability Sample Final Examination Model Solutions

MATH/STAT 3360, Probability Sample Final Examination Model Solutions MATH/STAT 3360, Probability Sample Final Examination Model Solutions This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are

More information

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

EE 3025 S2010 Demo 10 Apr 19-20, Reading Assignment: Read Sections 9.5 and of the EE 3025 Matlab Notes.

EE 3025 S2010 Demo 10 Apr 19-20, Reading Assignment: Read Sections 9.5 and of the EE 3025 Matlab Notes. EE 3025 S2010 Demo 10 Apr 19-20, 2010 Reading Assignment: Read Sections 9.5 and 10.1-10.5 of the EE 3025 Matlab Notes. Part I(25 min): Matlab Part II(25 min): Worked Problems on Chap 10 1 Matlab 1.1 Estimating

More information

Continuous Time Processes

Continuous Time Processes page 102 Chapter 7 Continuous Time Processes 7.1 Introduction In a continuous time stochastic process (with discrete state space), a change of state can occur at any time instant. The associated point

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Exercises Stochastic Performance Modelling. Hamilton Institute, Summer 2010

Exercises Stochastic Performance Modelling. Hamilton Institute, Summer 2010 Exercises Stochastic Performance Modelling Hamilton Institute, Summer Instruction Exercise Let X be a non-negative random variable with E[X ]

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

Introduction to Information Entropy Adapted from Papoulis (1991)

Introduction to Information Entropy Adapted from Papoulis (1991) Introduction to Information Entropy Adapted from Papoulis (1991) Federico Lombardo Papoulis, A., Probability, Random Variables and Stochastic Processes, 3rd edition, McGraw ill, 1991. 1 1. INTRODUCTION

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information