Probability and Statistics
Contents
- Some Stochastic Processes
- Stationary Stochastic Processes
4. Some Stochastic Processes
4.1 Bernoulli process
4.2 Binomial process
4.3 Sine wave process
4.4 Random-telegraph process
4.1 Bernoulli Process

Definition
The infinite sequence of random variables $\{X_n, n = 1, 2, 3, \dots\}$ is called a random sequence. In particular, if the $X_n$ take only the values 1 and 0, are statistically independent, and satisfy
$$P[X_n = 1] = p, \qquad P[X_n = 0] = 1 - p = q,$$
then $X_n$ is a Bernoulli random variable when $n$ is fixed, and $\{X_n, n = 1, 2, 3, \dots\}$ is called a Bernoulli random process.
Probability Distribution
Consider an unending sequence of flips of a coin: flip a coin at each positive-integer value of time (starting at time 1) and observe the result. To determine the joint probability of flipping a coin a finite number of times $n$, set
$$X_n = \begin{cases} 1 & \text{if a head results from the coin flip at time } n, \\ 0 & \text{if not.} \end{cases}$$
Since the $X_n$ are statistically independent, the joint probability distribution is the set of probabilities
$$P[X_1 = x_1, X_2 = x_2, \dots, X_n = x_n] = P[X_1 = x_1]\,P[X_2 = x_2] \cdots P[X_n = x_n],$$
where $x_i \in \{0, 1\}$ for $i = 1, 2, \dots, n$.
Example: two-dimensional case
The joint probability distribution is the set of probabilities
$$P[X_1 = 1, X_2 = 1] = p^2, \qquad P[X_1 = 1, X_2 = 0] = pq,$$
$$P[X_1 = 0, X_2 = 1] = qp, \qquad P[X_1 = 0, X_2 = 0] = q^2.$$

Exercise: Determine the probabilities of occurrence of three particular Bernoulli sample sequences.
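The product rule above extends directly to longer sample sequences. A minimal sketch (the value $p = 0.6$ is an assumption for illustration, not from the text):

```python
# Minimal sketch: the probability of a particular Bernoulli sample
# sequence is the product of the individual probabilities (independence).
# The value p = 0.6 is an assumed illustration.
def sequence_probability(xs, p):
    """P[X_1 = x_1, ..., X_n = x_n] for independent Bernoulli trials."""
    prob = 1.0
    for x in xs:
        prob *= p if x == 1 else 1.0 - p
    return prob

p = 0.6
print(sequence_probability([1, 1], p))   # p*p = 0.36
print(sequence_probability([1, 0], p))   # p*q = 0.24
print(sequence_probability([0, 0], p))   # q*q = 0.16
```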
Statistical Averages
$$E[X_n] = p, \qquad \operatorname{var}(X_n) = pq = p(1 - p).$$
This expectation and variance are called the mean value and variance of the Bernoulli process.
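These moments can be checked empirically. A Monte Carlo sketch, with $p = 0.3$ and the sample size chosen purely for illustration:

```python
import random

# Minimal sketch: the empirical mean and variance of a simulated
# Bernoulli process approach p and pq = p(1 - p).
# p = 0.3 and the sample size are assumptions for illustration.
random.seed(0)
p = 0.3
samples = [1 if random.random() < p else 0 for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(mean, var)   # near p = 0.3 and pq = 0.21
```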
4.2 Binomial Process

Definition
A random process $\{Y_n, n = 1, 2, 3, \dots\}$ in which the counting random variable
$$Y_n = \sum_{i=1}^{n} X_i$$
is defined to be a sum of independent Bernoulli random variables.

Example: model an unending sequence of coin flips by a zero-one Bernoulli process and count the number of heads in the first $n$ flips.
Probability Distribution
If the $X_i$ can assume only the values 1 or 0, then
$$P[Y_n = k] = \binom{n}{k} p^k (1 - p)^{n-k}, \qquad \binom{n}{k} = \frac{n!}{k!(n-k)!}, \qquad k = 0, 1, 2, \dots, n.$$
Notice: while each counting random variable $Y_n$ is a sum of independent random variables, the various $Y_n$ themselves are not independent.
Statistical Averages
$$E[Y_n] = np, \qquad \operatorname{var}(Y_n) = npq.$$
For two counting random variables $Y_m$ and $Y_n$,
$$\operatorname{cov}(Y_m, Y_n) = pq \min(m, n), \qquad \operatorname{var}(Y_m - Y_n) = pq\,|m - n|,$$
where $\min(m, n)$ denotes the smaller of the indices $m$ and $n$.
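The covariance formula can be verified numerically. A Monte Carlo sketch, with $p$, $m$, $n$, and the number of trials assumed for illustration:

```python
import random

# Minimal sketch: estimate cov(Y_m, Y_n) for the counting process
# Y_n = X_1 + ... + X_n and compare with pq * min(m, n).
# p, m, n and the trial count are assumptions for illustration.
random.seed(1)
p, m, n, trials = 0.5, 3, 7, 200_000
q = 1.0 - p

ym_vals, yn_vals = [], []
for _ in range(trials):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    ym_vals.append(sum(xs[:m]))      # Y_m uses the first m trials
    yn_vals.append(sum(xs))          # Y_n uses all n trials

mean_m = sum(ym_vals) / trials
mean_n = sum(yn_vals) / trials
cov = sum((a - mean_m) * (b - mean_n)
          for a, b in zip(ym_vals, yn_vals)) / trials
print(cov)   # theory: pq * min(m, n) = 0.25 * 3 = 0.75
```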
4.3 Sine Wave Process

Definition
A random process $\{X(t), t \in T\}$, where the index set $T$ is continuous, and where
$$X(t) = V \sin(\omega t + \Theta)$$
for all values of $t$ in $T$; here $V$, $\omega$, and $\Theta$ are random variables.
Example in practice
An electronics instrument manufacturer produces sine wave oscillators.
- The output of a particular oscillator at any time can be characterized by the sample function $x(t) = v_1 \sin(\omega_1 t + \theta_1)$.
- The outputs of various oscillators at a specified time $t_1$ can be characterized by the random variable $X_{t_1} = V \sin(\omega t_1 + \Theta)$.
- The outputs of various oscillators at any time can be characterized by the sine wave process $\{X_t = V \sin(\omega t + \Theta), t \ge 0\}$.
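The three viewpoints above can be sketched in code: drawing $v$, $\omega$, $\theta$ once yields one sample function, and repeating the draw models the population of oscillators. All distributions below are assumptions for illustration:

```python
import math
import random

# Minimal sketch (all distributions assumed for illustration): each call
# builds one oscillator, i.e. one sample function x(t) = v*sin(w*t + theta)
# of the sine wave process, with v, w, theta drawn once per oscillator.
random.seed(2)

def make_oscillator():
    v = random.uniform(0.9, 1.1)                    # amplitude V
    w = 2.0 * math.pi * random.uniform(0.99, 1.01)  # angular frequency
    theta = random.uniform(0.0, 2.0 * math.pi)      # phase
    return lambda t: v * math.sin(w * t + theta)

x = make_oscillator()       # one sample function of the process
print([round(x(t), 3) for t in (0.0, 0.25, 0.5)])
```

Evaluating `x` at a fixed $t_1$ across many oscillators realizes the random variable $X_{t_1}$.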
4.4 Random-Telegraph Process

This is a real random process whose sample functions at any instant of time $t$ may assume only the values zero or one, and it is assumed that
$$P[X(t) = 0] = P[X(t) = 1] = \tfrac{1}{2}.$$
The probability $P[k, \tau]$ that $k$ transitions from one value to the other occur in a time interval of length $\tau$ is given by the Poisson probability distribution
$$P[k, \tau] = \frac{(\lambda \tau)^k e^{-\lambda \tau}}{k!}, \qquad k = 0, 1, 2, \dots$$
The occurrence of $k$ transitions in an interval of length $\tau$ is statistically independent of the value assumed by any particular sample function at the start of the given interval.

[Figure: a random-telegraph sample function, switching between the levels 0 and 1 as a function of $t$.]
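A sample function like the one in the figure can be simulated by drawing the transition instants from a Poisson process, i.e. with exponential gaps between transitions. The rate $\lambda$ and horizon are assumptions for illustration:

```python
import random

# Minimal sketch: simulate one random-telegraph sample function. The
# transition instants form a Poisson process of rate lam (an assumed
# value), so the gaps between transitions are exponential, and the
# level flips between 0 and 1 at each transition.
random.seed(3)
lam = 2.0    # transition rate (assumption, not from the text)

def telegraph_path(t_end, lam):
    """Return the initial level and the transition times in (0, t_end]."""
    level0 = random.randint(0, 1)       # P[X(0)=0] = P[X(0)=1] = 1/2
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)    # exponential gap <=> Poisson counts
        if t > t_end:
            return level0, times
        times.append(t)

def value_at(level0, times, t):
    """X(t): the level after k transitions is level0 flipped k mod 2 times."""
    k = sum(1 for s in times if s <= t)
    return level0 ^ (k % 2)

level0, times = telegraph_path(10.0, lam)
print(level0, len(times), value_at(level0, times, 5.0))
```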
Exercise-1: For the above random-telegraph process,
a. Show that $E[X(t)] = \tfrac{1}{2}$.
b. Show that $R_X(t_1, t_2) = P[X(t_1) = 1, X(t_2) = 1]$.
c. Show further that
$$R_X(t_1, t_2) = \tfrac{1}{2} P[k \text{ even}],$$
where $P[k \text{ even}]$ is the probability that the number of transitions which occur in an interval of duration $|t_1 - t_2|$ is even.
5. Strict-Sense Stationarity (S.S.S.)

Definition
A process $X(t)$ is said to be strict-sense stationary if
$$f(x_1, x_2, \dots, x_n;\ t_1, t_2, \dots, t_n) = f(x_1, x_2, \dots, x_n;\ t_1 + c, t_2 + c, \dots, t_n + c)$$
for any $c$, where the left side represents the joint density function of the random variables $X(t_1), X(t_2), \dots, X(t_n)$ and the right side corresponds to the joint density function of the random variables $X(t_1 + c), X(t_2 + c), \dots, X(t_n + c)$. The equality must hold for all $t_i$, $i = 1, 2, \dots, n$, all $n = 1, 2, \dots$, and any $c$.

Stationary processes exhibit statistical properties that are invariant to a shift in the time index; in strict terms, the statistical properties are governed by the joint probability density function.
Properties of the Probability Distributions
For a first-order strict-sense stationary process, $f(x, t) = f(x, t + c)$ for any $c$. In particular, $c = -t$ gives
$$f(x, t) = f(x),$$
i.e., the first-order density of $X(t)$ is independent of $t$.

Similarly, for a second-order strict-sense stationary process we have, from the definition on the previous page,
$$f(x_1, x_2;\ t_1, t_2) = f(x_1, x_2;\ t_1 + c, t_2 + c)$$
for any $c$. For $c = -t_2$ we get
$$f(x_1, x_2;\ t_1, t_2) = f(x_1, x_2;\ t_1 - t_2).$$
Properties of the Statistical Averages
$$E[X(t)] = \int_{-\infty}^{\infty} x f(x)\, dx = \mu, \ \text{a constant.}$$
The autocorrelation function is given by
$$R_X(t_1, t_2) = E\{X(t_1) X^*(t_2)\} = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2^*\, f(x_1, x_2;\ t_1 - t_2)\, dx_1\, dx_2 = R_X(t_1 - t_2) = R_X(\tau),$$
i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices, $\tau = t_1 - t_2$.
Exercise-2: Consider the sine wave process $\{X_t = V \cos \omega t, t \ge 0\}$. Show whether or not this random process is stationary in the strict sense.
6. Wide-Sense Stationarity (W.S.S.)

Definition
A process $X(t)$ is said to be wide-sense stationary if
(i) $E\{X(t)\} = \mu$, a constant,
(ii) $E\{|X(t)|^2\} < \infty$,
(iii) $E\{X(t_1) X^*(t_2)\} = R_X(t_1 - t_2) = R_X(\tau)$.

For wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that these conditions do not say anything about the nature of the probability density functions; they deal only with the average behavior of the process.
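A standard wide-sense stationary example (assumed here for illustration, not taken from the text) is the phase-randomized cosine $X(t) = \cos(t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi]$: its mean is 0 and its autocorrelation is $R(\tau) = \tfrac{1}{2}\cos\tau$, independent of the absolute time. A Monte Carlo sketch:

```python
import math
import random

# Minimal sketch (assumed example): X(t) = cos(t + Theta),
# Theta ~ Uniform[0, 2*pi], is wide-sense stationary with
# E[X(t)] = 0 and R(tau) = cos(tau)/2, independent of t.
random.seed(4)
trials = 200_000
t1, t2 = 1.3, 2.0                # two arbitrary time instants (assumed)

acc_mean = acc_corr = 0.0
for _ in range(trials):
    theta = random.uniform(0.0, 2.0 * math.pi)
    acc_mean += math.cos(t1 + theta)
    acc_corr += math.cos(t1 + theta) * math.cos(t2 + theta)

mean_est = acc_mean / trials     # should be near 0
corr_est = acc_corr / trials     # should be near cos(t1 - t2)/2
print(mean_est, corr_est)
```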
Relationship between S.S.S. and W.S.S.
By the definitions above, strict-sense stationarity implies wide-sense stationarity, provided the process has finite power (finite second moments). The converse is not true in general; the only exception is the Gaussian process, for which wide-sense stationarity does imply strict-sense stationarity.
Properties of a Stationary Stochastic Process: the Autocorrelation Function
If the real random process $\{X(t), t \in T\}$ is stationary, then
$$R_X(-\tau) = R_X(\tau), \qquad |R_X(\tau)| \le R_X(0), \qquad R_X(0) = m^2 + \sigma^2,$$
and in practice, if $X(t)$ contains no periodic component, then
$$\lim_{|\tau| \to \infty} R_X(\tau) = m^2.$$
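The first two properties can be checked on a closed-form autocorrelation. Taking $R(\tau) = \tfrac{1}{2}\cos\tau$ for the phase-randomized cosine process (an assumed illustrative example, not a general proof), where $m = 0$ and $\sigma^2 = \tfrac{1}{2}$:

```python
import math

# Minimal sketch: check the listed properties on the closed-form
# autocorrelation R(tau) = cos(tau)/2 of the phase-randomized cosine
# process (an assumed illustrative example, not a general proof).
# Here m = 0 and sigma^2 = 1/2, so R(0) = m**2 + sigma**2 = 0.5.
def R(tau):
    return math.cos(tau) / 2.0

taus = [0.1 * k for k in range(-50, 51)]
even_ok = all(abs(R(-t) - R(t)) < 1e-12 for t in taus)     # R(-tau) = R(tau)
bounded_ok = all(abs(R(t)) <= R(0) + 1e-12 for t in taus)  # |R(tau)| <= R(0)
print(even_ok, bounded_ok, R(0))
```

Note that this particular process is periodic, so $R(\tau)$ does not decay to $m^2$; that limit property applies only when there is no periodic component, as stated above.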
7. Joint Stationarity

1) Definition
A pair of real random processes $\{X(t), t \in T\}$ and $\{Y(t), t \in T\}$ are jointly stationary in the wide sense when
$$E[X(t)] = m_X, \qquad E[Y(t)] = m_Y,$$
$$R_X(t, t + \tau) = R_X(\tau), \qquad R_Y(t, t + \tau) = R_Y(\tau),$$
$$R_{XY}(t, t + \tau) = R_{XY}(\tau), \qquad R_{YX}(t, t + \tau) = R_{YX}(\tau)$$
for all values of $t$.
Exercise-3: Let the two random processes $\{U(t), t \in T\}$ and $\{V(t), t \in T\}$ be such that
$$U(t) = X \cos t + Y \sin t, \qquad V(t) = Y \cos t + X \sin t$$
for all $t$, where $X$ and $Y$ are independent real random variables for which
$$E[X] = E[Y] = 0 \quad \text{and} \quad E[X^2] = E[Y^2] = 1.$$
a. Show that the two processes are individually stationary in the wide sense.
b. Show that they are not jointly stationary in the wide sense.
2) Properties of the Cross-correlation Function
Suppose the two real random processes $\{X(t), t \in T\}$ and $\{Y(t), t \in T\}$ are individually stationary in the wide sense and jointly stationary in the wide sense. Then
$$R_{XY}(\tau) = R_{YX}(-\tau), \qquad R_{XY}^2(\tau) \le R_X(0) R_Y(0),$$
the second property following from the Cauchy-Schwarz inequality $E^2[XY] \le E[X^2]\,E[Y^2]$.
If the two processes are jointly stationary, then $Z(t) = X(t) + Y(t)$ is stationary.
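Both properties can be checked on a jointly stationary pair with known closed forms. For $X(t) = \cos(t + \Theta)$ and $Y(t) = \sin(t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi]$ (an assumed illustrative pair, using the convention $R_{XY}(\tau) = E[X(t+\tau)Y(t)]$), one finds $R_{XY}(\tau) = -\tfrac{1}{2}\sin\tau$, $R_{YX}(\tau) = \tfrac{1}{2}\sin\tau$, and $R_X(0) = R_Y(0) = \tfrac{1}{2}$:

```python
import math

# Minimal sketch (assumed illustrative pair, not from the text): for
# X(t) = cos(t + Theta), Y(t) = sin(t + Theta), Theta ~ Uniform[0, 2*pi],
# with R_XY(tau) = E[X(t + tau) Y(t)], the closed forms are
#   R_XY(tau) = -sin(tau)/2,  R_YX(tau) = sin(tau)/2,  R_X(0) = R_Y(0) = 1/2.
def R_XY(tau):
    return -math.sin(tau) / 2.0

def R_YX(tau):
    return math.sin(tau) / 2.0

R_X0 = R_Y0 = 0.5

taus = [0.1 * k for k in range(-40, 41)]
symmetry_ok = all(abs(R_XY(t) - R_YX(-t)) < 1e-12 for t in taus)
bound_ok = all(R_XY(t) ** 2 <= R_X0 * R_Y0 + 1e-12 for t in taus)
print(symmetry_ok, bound_ok)
```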
3) Cross-covariance Function
Suppose the two random processes $\{X(t), t \in T\}$ and $\{Y(t), t \in T\}$ are individually stationary in the wide sense and jointly stationary in the wide sense. Then
$$C_{XY}(\tau) = C_{YX}(-\tau), \qquad C_{XY}^2(\tau) \le \sigma_X^2 \sigma_Y^2.$$
Homework: 10.6, 10.8, 10.9, 10.10, 10.13, 10.20