Introduction to Probability and Stochastic Processes I


1 Introduction to Probability and Stochastic Processes I, Lecture 3. Henrik Vie Christensen, vie@control.auc.dk. Department of Control Engineering, Institute of Electronic Systems, Aalborg University, Denmark. Slides originally by: Line Ørtoft Endelt

2 Random processes and sequences I. Signals can be classified into two main groups: deterministic and random. Random signals can be described by properties such as: 1. Average power. 2. Spectral distribution on the average. 3. The probability that the signal amplitude exceeds a given value. The probabilistic model used to describe random signals is called a random process (stochastic process or time series).

3 Random processes and sequences II. Consider the communication system shown in figure 3.1. The input to the system is a random signal, and during the transmission noise (random) is added, so the output is also random. Knowledge of x_i(t) for t ∈ [t_1, t_2] does not tell anything about x_i(t) for any other t ∉ [t_1, t_2]. Knowledge of a member function x_i(t) does not tell anything about another member function x_j(t). If the channel is linear, its impulse response h_i(t) is known and the noise n_i(t) is additive, then y_i(t) = x_i(t) * h_i(t) + n_i(t), where * denotes convolution.
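A minimal discrete-time sketch of this channel model (the sampling rate, impulse response, noise level, and input waveform below are illustrative assumptions, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.arange(0, 1, 1e-3)               # assumed 1 kHz sampling over 1 s
x = np.cos(2 * np.pi * 5 * t)           # one member function of the random input
h = np.array([1.0, 0.5, 0.25])          # assumed FIR impulse response of the linear channel
n = 0.1 * rng.standard_normal(t.size)   # additive random noise

# y_i(t) = x_i(t) * h_i(t) + n_i(t), with * denoting convolution
y = np.convolve(x, h)[: t.size] + n
```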

4 Random processes and sequences III. Mapping of the outcomes of a random experiment to... Random Variable: S → a set of real numbers. Random Process: S → a set of waveforms or functions of time.

5 Example. At time t = 0 a die is tossed, and a time function x_i(t) is assigned to each possible outcome of the experiment:
Outcome 1: x_1(t) = -4
Outcome 2: x_2(t) = -2
Outcome 3: x_3(t) = 2
Outcome 4: x_4(t) = 4
Outcome 5: x_5(t) = -t/2
Outcome 6: x_6(t) = t/2
X is a Random Process: outcome of experiment → set of waveforms

6 Example (continued). [Figure: the six member functions plotted against t: the constants x_4(t) = 4, x_3(t) = 2, x_2(t) = -2, x_1(t) = -4, and the ramps x_6(t) = t/2 and x_5(t) = -t/2, with time instants t_1 and t_2 marked on the time axis.]
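A small sketch of this ensemble using the member functions listed above: tossing the die selects one member function, and evaluating it at a fixed time gives a random variable.

```python
import numpy as np

rng = np.random.default_rng(1)

# Member functions of the die ensemble, indexed by outcome 1..6.
members = {
    1: lambda t: np.full_like(t, -4.0),
    2: lambda t: np.full_like(t, -2.0),
    3: lambda t: np.full_like(t, 2.0),
    4: lambda t: np.full_like(t, 4.0),
    5: lambda t: -t / 2,
    6: lambda t: t / 2,
}

t = np.linspace(0, 10, 101)
outcome = int(rng.integers(1, 7))               # toss the die once
realization = members[outcome](t)               # X(t, lambda_outcome): one waveform
X_at_6 = members[outcome](np.array([6.0]))[0]   # the random variable X(6) for this outcome
```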

7 Notation I. A random process is denoted by X(t, Λ), where t represents time and Λ is a variable that represents an outcome in the sample space S. With each λ_i is associated a member function (sample function or realization) x_i(t) of the ensemble (the collection of waveforms). The member functions are deterministic functions of time. For t = t_0, X(t_0, Λ) is a set of numerical values corresponding to the values of each member function at t = t_0. The probability distribution of X(t_0, Λ) can be derived from the probability distribution of the outcome of the random experiment. X(t_0, λ_i) is a numerical value.

8 Notation II. X(t, Λ) can denote the following quantities: 1. X(t, Λ) = {X(t, λ_i) | λ_i ∈ S} = {x_1(t), x_2(t), ...}, a collection of functions of time. 2. X(t, λ_i) = x_i(t), a specific member function. 3. X(t_0, Λ) = {X(t_0, λ_i) | λ_i ∈ S} = {x_1(t_0), x_2(t_0), ...}, a collection of numerical values. 4. X(t_0, λ_i) = x_i(t_0), the numerical value of x_i at time t_0. Instead of using the above notations, X(t) is used to denote all of them. Usually the meaning can be understood from the context.

9 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t, λ_1) = ?

10 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t, λ_1) = X(t, Λ = 1) = x_1(t) = -4, 0 ≤ t

11 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t, λ_5) = ?

12 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t, λ_5) = X(t, Λ = 5) = x_5(t) = -t/2, 0 ≤ t

13 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(6, Λ) = ?

14 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(6, Λ) = X(6) is a random variable with values in {-4, -3, -2, 2, 3, 4}

15 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t = 6, Λ = 5) = X(6, λ_5) = ?

16 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] X(t = 6, Λ = 5) = X(6, λ_5) = -3

17 Probabilistic Structure. If we know 1. the probability of each outcome of the experiment E, and 2. the member function it corresponds to, then properties like P[X(t_1) ≤ a_1] and P[X(t_1) ≤ a_1, X(t_2) ≤ a_2] can be derived. If A_1 = {λ_i | X(t_1, λ_i) ≤ a_1}, then P[X(t_1) ≤ a_1] = P(A_1). Joint and conditional probabilities can be found in the same way using the probabilities of the underlying experiment E.

18 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) = -2) = ?

19 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) = -2) = P({2, 5}) = 1/3

20 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) ≤ 0) = ?

21 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) ≤ 0) = P({1, 2, 5}) = 1/2

22 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(0) = 0, X(4) = -2) = ?

23 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(0) = 0, X(4) = -2) = P({5}) = 1/6

24 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) = -2 | X(0) = 0) = ?

25 Example. S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. [Ensemble figure as on slide 6.] P(X(4) = -2 | X(0) = 0) = P(X(0) = 0, X(4) = -2) / P(X(0) = 0) = (1/6)/(2/6) = 1/2
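These probabilities follow directly from enumerating the six equally likely outcomes; a short sketch, reusing the member functions of the die example as listed above, reproduces them:

```python
from fractions import Fraction

# Member functions of the die example, evaluated at time t (outcomes 1..6, probability 1/6 each).
def x(i, t):
    return {1: -4, 2: -2, 3: 2, 4: 4, 5: -t / 2, 6: t / 2}[i]

outcomes = range(1, 7)
p = Fraction(1, 6)

P_X4_is_m2 = sum(p for i in outcomes if x(i, 4) == -2)                   # 1/3
P_X4_le_0  = sum(p for i in outcomes if x(i, 4) <= 0)                    # 1/2
P_joint    = sum(p for i in outcomes if x(i, 0) == 0 and x(i, 4) == -2)  # 1/6
P_X0_is_0  = sum(p for i in outcomes if x(i, 0) == 0)                    # 2/6
P_cond     = P_joint / P_X0_is_0                                         # 1/2
```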

26 Classification of Random processes. A random process is classified according to whether the amplitude X(t) and the time index t are continuous or discrete:
X(t) continuous, t continuous: continuous random process.
X(t) continuous, t discrete: continuous random sequence.
X(t) discrete, t continuous: discrete random process.
X(t) discrete, t discrete: discrete random sequence.
Stationarity is an attribute also used to classify random processes, in case certain probability distributions or averages do not depend on time. More about stationarity in the next lecture.

27 Example. A random process is given by Z(t) = A(t) cos[2π f_c t + Θ(t)], where A(t) and Θ(t) are real-valued random processes. Z(t) = Re{A(t) exp[jΘ(t)] exp[j2π f_c t]} = Re{W(t) exp[j2π f_c t]}, where the envelope W(t) is a complex random process W(t) = A(t) cos Θ(t) + j A(t) sin Θ(t) = X(t) + jY(t)

28 Predictability. A random process is predictable if the values of the member functions can be predicted based on their past values. Otherwise it is unpredictable. Example: In the case of the ensemble of binary waveforms X(t) shown in Figure 3.1, randomness is evident, and no future values of a member function can be predicted knowing its past values. Hence the random process is unpredictable.

29 Formal definition of Random Processes. Let S be the sample space of a random experiment, and let t be a variable with values in Γ ⊆ R. A real-valued random process X(t) is a function X(t): Γ × S → R. If Γ is a subset of the integers, then X(t) is a random sequence; otherwise X(t) is a random process. The n-th order distribution of the random process is F_{X(t_1), X(t_2), ..., X(t_n)}(x_1, x_2, ..., x_n) = P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_n) ≤ x_n] for all n and t_1, ..., t_n ∈ Γ. These functions satisfy all the requirements of joint probability distribution functions.

30 Joint Distribution. The joint distribution function is derived from the probability distribution of the experiment and the mapping of the sample space into member functions. It is NOT possible to go the other way: there is no technique for constructing member functions from joint distribution functions.

31 Example I. Using the above die example of a discrete random process, the joint and marginal probability mass functions for X(0) and X(6) can be found. Each of the six outcomes has probability 1/6, so the joint pmf places probability 1/6 on each of the pairs (X(0), X(6)) = (-4, -4), (-2, -2), (2, 2), (4, 4), (0, -3), (0, 3). The marginals of X(6) are 1/6 on each of {-4, -3, -2, 2, 3, 4}; the marginals of X(0) are 1/6 on each of {-4, -2, 2, 4} and 2/6 on 0.
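A quick way to tabulate this joint pmf is to count outcomes; a sketch using the same member functions (the resulting probabilities are multiples of 1/6):

```python
from collections import Counter
from fractions import Fraction

def x(i, t):
    return {1: -4, 2: -2, 3: 2, 4: 4, 5: -t / 2, 6: t / 2}[i]

p = Fraction(1, 6)

# Joint pmf of (X(0), X(6)): each of the six equally likely outcomes contributes 1/6.
joint = Counter()
for i in range(1, 7):
    joint[(x(i, 0), x(i, 6))] += p

marg_X0, marg_X6 = Counter(), Counter()
for (a, b), prob in joint.items():
    marg_X0[a] += prob
    marg_X6[b] += prob
# e.g. joint[(0.0, -3.0)] == 1/6 and marg_X0[0.0] == 2/6
```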

32 Example II. If an FM station broadcasts a tone/frequency X(t) = 100 cos(10^8 t), then the listeners will receive X_i(t) = a_i cos(10^8 t + θ_i), and the received signals can be modeled by a random process Y(t) = A cos(10^8 t + Θ)

33 Average Values. For a random process (or sequence) X(t): The mean value: µ_X(t) = E{X(t)}. The auto-correlation: R_XX(t_1, t_2) = E{X*(t_1) X(t_2)}. The auto-covariance: C_XX(t_1, t_2) = R_XX(t_1, t_2) - µ_X(t_1) µ_X(t_2). The correlation coefficient: r_XX(t_1, t_2) = C_XX(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_XX(t_2, t_2)).

34 Example I. For the discrete random process based on the die example, find µ_X(t), R_XX(t_1, t_2), C_XX(t_1, t_2), and r_XX(t_1, t_2). µ_X(t) = E{X(t)} = (1/6) Σ_{i=1}^{6} x_i(t) = 0. R_XX(t_1, t_2) = E{X(t_1) X(t_2)} = (1/6) Σ_{i=1}^{6} x_i(t_1) x_i(t_2) = (1/6){16 + 4 + 4 + 16 + t_1 t_2/4 + t_1 t_2/4} = (1/6){40 + t_1 t_2/2}

35 Example I (continued). C_XX(t_1, t_2) = R_XX(t_1, t_2) - µ_X(t_1) µ_X(t_2) = R_XX(t_1, t_2). r_XX(t_1, t_2) = C_XX(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_XX(t_2, t_2)) = (40 + t_1 t_2/2) / sqrt((40 + t_1^2/2)(40 + t_2^2/2))
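A quick numerical check of the formula R_XX(t_1, t_2) = (1/6)(40 + t_1 t_2/2), evaluating the six member functions at two arbitrarily chosen time instants:

```python
import numpy as np

def x(i, t):
    # member functions of the die example
    return [-4.0, -2.0, 2.0, 4.0, -t / 2, t / 2][i - 1]

t1, t2 = 2.0, 6.0  # two illustrative (arbitrarily chosen) time instants

mu = np.mean([x(i, t1) for i in range(1, 7)])             # ensemble mean -> 0
R = np.mean([x(i, t1) * x(i, t2) for i in range(1, 7)])   # ensemble autocorrelation
assert np.isclose(R, (40 + t1 * t2 / 2) / 6)              # matches (1/6)(40 + t1*t2/2)
```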

36 Example II. X: a random process described by X(t) = A cos(100t + Θ), where A is a normal random variable with mean 0 and variance 1, and Θ is uniformly distributed in [-π, π]. Assume that A and Θ are independent. µ_X(t) = E{A} E{cos(100t + Θ)} = 0

37 Example II (continued). Let t_1 = t and t_2 = t + τ; then R_XX(t, t + τ) = E{X(t_1) X(t_2)} = E{X(t) X(t + τ)} = E{A cos(100t + Θ) A cos(100t + 100τ + Θ)} = E{(A^2/2)(cos(100τ) + cos(200t + 100τ + 2Θ))} = (1/2) cos(100τ) + (1/2) E{cos(200t + 100τ + 2Θ)} = (1/2) cos(100τ), since 2 cos(x) cos(y) = cos(x + y) + cos(x - y) and E{cos(200t + 100τ + 2Θ)} = 0.
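A Monte Carlo sketch of this result (the number of draws and the values of t and τ are illustrative choices): averaging X(t)X(t + τ) over many independent draws of (A, Θ) should come close to (1/2)cos(100τ), independently of t.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000                          # number of Monte Carlo ensemble members

A = rng.standard_normal(N)           # A ~ N(0, 1)
Theta = rng.uniform(-np.pi, np.pi, N)

t, tau = 0.3, 0.01                   # illustrative time and lag
X_t = A * np.cos(100 * t + Theta)
X_t_tau = A * np.cos(100 * (t + tau) + Theta)

R_est = np.mean(X_t * X_t_tau)       # ensemble estimate of R_XX(t, t + tau)
R_theory = 0.5 * np.cos(100 * tau)   # (1/2) cos(100 tau), independent of t
```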

38 Two or More Random Processes I. Two random processes have a joint distribution function P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n, Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m]. The relation between X(t) and Y(t) is described by: The cross-correlation function R_XY(t_1, t_2) = E{X*(t_1) Y(t_2)}. The cross-covariance function C_XY(t_1, t_2) = R_XY(t_1, t_2) - µ_X(t_1) µ_Y(t_2). The correlation coefficient r_XY(t_1, t_2) = C_XY(t_1, t_2) / sqrt(C_XX(t_1, t_1) C_YY(t_2, t_2)).

39 Two or More Random Processes II. Equality: Two random processes are equal if their member functions are identical for each outcome λ ∈ S and they are defined on the same random experiment. Uncorrelated: Two random processes X(t) and Y(t) are uncorrelated if C_XY(t_1, t_2) = 0 for all t_1, t_2 ∈ Γ. Orthogonal: Two random processes X(t) and Y(t) are orthogonal if R_XY(t_1, t_2) = 0 for all t_1, t_2 ∈ Γ.

40 Two or More Random Processes III. Independent: Two random processes X(t) and Y(t) are independent if P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n, Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m] = P[X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n] P[Y(t'_1) ≤ y_1, ..., Y(t'_m) ≤ y_m] for all n, m and t_1, ..., t_n, t'_1, t'_2, ..., t'_m ∈ Γ. As in the case of random variables, independence implies uncorrelatedness, but not conversely.

41 Example. E having sample space S = {1, 2, 3, 4, 5, 6} = {λ_1, λ_2, λ_3, λ_4, λ_5, λ_6}. E_2 having sample space S_2 = {head, tail} = {q_1, q_2}. [Table: member functions x_i(t) and y_i(t) assigned to the die outcomes λ_i, and z_1(t) = cos t (head), z_2(t) = sin t (tail) assigned to the coin outcomes q_j.] Two random processes X(t) and Y(t) are defined on the same experiment, but they are not equal.

42 Example (continued). [Table as on slide 41.] R_XY(t_1, t_2) = E{X(t_1) Y(t_2)} = Σ_{i=1}^{6} x_i(t_1) y_i(t_2) P(λ_i) = (1/6)( ... ) = 0. So X and Y are orthogonal.

43 Example (continued). [Table as on slide 41.] Since C_XY(t_1, t_2) = R_XY(t_1, t_2) - µ_X(t_1) µ_Y(t_2) = 0, X and Y are uncorrelated, but they are clearly not independent. X and Z are independent, since they are based on two unrelated experiments, so P(λ_i and q_j) = P(λ_i) P(q_j)

44 Introduction to Probability and Stochastic Processes I, Lecture 4. Henrik Vie Christensen, vie@control.auc.dk. Department of Control Engineering, Institute of Electronic Systems, Aalborg University, Denmark. Slides originally by: Line Ørtoft Endelt

45 Definition of Random Process. Let S be the sample space of a random experiment, and let t be a variable with values in Γ ⊆ R. A real-valued random process X(t) is a function X(t): Γ × S → R. If Γ is a subset of the integers, then X(t) is a random sequence; otherwise X(t) is a random process. The random process is a mapping X(t) from the sample space S to a space of continuous time functions {x_i(t)}, i ∈ I (the member functions). At each fixed time t_0 the mapping X(t_0) is a random variable (with values in the set {x_i(t_0)}, i ∈ I).

46 Strict-sense Stationarity I. A random process X(t) is called stationary in the strict sense (SSS) if for all t_1, t_2, ..., t_k, t_1 + τ, t_2 + τ, ..., t_k + τ ∈ Γ and for all k = 1, 2, ..., P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_k) ≤ x_k] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2, ..., X(t_k + τ) ≤ x_k]. If the above definition only holds for k ≤ N but not necessarily for k > N, then the process is called N-th order stationary.

47 Strict-sense Stationarity II. Consider a SSS random process; then for any τ, P[X(t) ≤ x] = P[X(t + τ) ≤ x], which gives E{X(t)} = µ_X = constant

48 Strict-sense Stationarity III. The second-order distribution P[X(t_1) ≤ x_1, X(t_2) ≤ x_2] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2] only depends on the difference t_2 - t_1. So the autocorrelation function of a SSS random process can be written as R_XX(t_1, t_2) = E{X*(t_1) X(t_2)} = R_XX(t_2 - t_1). Remark: A random process with a constant mean and an autocorrelation function that only depends on the time difference is not necessarily SSS; it need not even be first order stationary.

49 Wide-sense Stationary I. A random process X(t) is called wide-sense stationary (WSS) if E{X(t)} = µ_X and E{X*(t) X(t + τ)} = R_XX(τ). Two random processes X(t) and Y(t) are jointly WSS if E{X(t)} = µ_X, E{Y(t)} = µ_Y, E{X*(t) X(t + τ)} = R_XX(τ), E{Y*(t) Y(t + τ)} = R_YY(τ), and E{X*(t) Y(t + τ)} = R_XY(τ).

50 Wide-sense Stationary II. A random sequence X(k) is wide-sense stationary if E{X(k)} = µ_X and E{X*(n) X(n + k)} = R_XX(k). Note: SSS implies WSS, but WSS does not imply SSS.

51 Example I. [Table: member functions x_i(t) assigned to the outcomes λ_i of one experiment, and member functions y_j(t) built from sin(t) and cos(t) assigned to the outcomes φ_j of another.] E{X(t)} = 0 and R_XX(t_1, t_2) = 70/6, a constant. X is SSS, since the member functions do not change under time shifts.

52 Example I (continued). [Table as on slide 51.] E{Y(t)} = 0. R_YY(t_1, t_2) = (1/6) Σ_{j=1}^{6} y_j(t_1) y_j(t_2) = (1/6){ ... cos(t_2 - t_1)} = R_YY(t_2 - t_1). Y is WSS but not SSS!

53 Example II. X(n) is a binary Markov sequence, for n ∈ Z. The sequence is described by:
P[X(n) = 0, X(n + 1) = 0] = 0.2
P[X(n) = 0, X(n + 1) = 1] = 0.2
P[X(n) = 1, X(n + 1) = 0] = 0.2
P[X(n) = 1, X(n + 1) = 1] = 0.4
P[X(n) = 0] = P[X(n) = 0, X(n + 1) = 0] + P[X(n) = 0, X(n + 1) = 1] = 0.4
P[X(n) = 1] = P[X(n) = 1, X(n + 1) = 0] + P[X(n) = 1, X(n + 1) = 1] = 0.6
µ_X = 0 · P[X(n) = 0] + 1 · P[X(n) = 1] = 0.6
R_XX(n, n) = 0^2 · P[X(n) = 0] + 1^2 · P[X(n) = 1] = 0.6

54 Example II (continued).
R_XX(n, n + 1) = Σ_{i=0}^{1} Σ_{j=0}^{1} i j P[X(n) = i, X(n + 1) = j] = P[X(n) = 1, X(n + 1) = 1] = 0.4
R_XX(n, n + 2) = Σ_{i=0}^{1} Σ_{j=0}^{1} i j P[X(n) = i, X(n + 2) = j] = P[X(n) = 1, X(n + 2) = 1]
= P[X(n) = 1, X(n + 1) = 0, X(n + 2) = 1] + P[X(n) = 1, X(n + 1) = 1, X(n + 2) = 1]
= P[X(n) = 1] P[X(n + 1) = 0 | X(n) = 1] P[X(n + 2) = 1 | X(n) = 1, X(n + 1) = 0] + P[X(n) = 1] P[X(n + 1) = 1 | X(n) = 1] P[X(n + 2) = 1 | X(n) = 1, X(n + 1) = 1]
= P[X(n) = 1] P[X(n + 1) = 0 | X(n) = 1] P[X(n + 2) = 1 | X(n + 1) = 0] + P[X(n) = 1] P[X(n + 1) = 1 | X(n) = 1] P[X(n + 2) = 1 | X(n + 1) = 1]

55 Example II (continued). So R_XX(n, n + 2) = (0.6)(1/3)(1/2) + (0.6)(2/3)(2/3) ≈ 0.367. Summarizing: E{X} = 0.6, R_XX(n, n) = 0.6, R_XX(n, n + 1) = 0.4, R_XX(n, n + 2) ≈ 0.367. It can be shown that R_XX(n, n + k) is independent of n for k ∈ Z, so the process is wide-sense stationary.
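The values above can be checked by simulating the chain; a sketch, where the transition probabilities are derived from the given joint probabilities (P[X(n+1) = 1 | X(n) = 0] = 0.2/0.4 = 1/2 and P[X(n+1) = 1 | X(n) = 1] = 0.4/0.6 = 2/3):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Transition probabilities P[X(n+1) = 1 | X(n)] derived from the joint probabilities above.
p1_given = {0: 0.5, 1: 2 / 3}

x = np.empty(N, dtype=int)
x[0] = rng.random() < 0.6            # stationary start: P[X(n) = 1] = 0.6
for n in range(N - 1):
    x[n + 1] = rng.random() < p1_given[x[n]]

mu_est = x.mean()                    # close to 0.6
R0_est = np.mean(x * x)              # close to 0.6
R1_est = np.mean(x[:-1] * x[1:])     # close to 0.4
R2_est = np.mean(x[:-2] * x[2:])     # close to 0.367
```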

56 Example III. A_i and B_i have a joint Gaussian distribution for i = 1, 2, 3, ..., n, with µ_Ai = µ_Bi = 0 and σ^2_Ai = σ^2_Bi = σ^2. They are assumed to be uncorrelated. A random process is defined as X(t) = Σ_{i=1}^{n} (A_i cos ω_i t + B_i sin ω_i t). Show that the process is WSS: E{X(t)} = Σ_{i=1}^{n} E{A_i} cos ω_i t + E{B_i} sin ω_i t = 0. E{X(t) X(t + τ)} = E{ Σ_{i=1}^{n} Σ_{j=1}^{n} [A_i cos ω_i t + B_i sin ω_i t][A_j cos ω_j (t + τ) + B_j sin ω_j (t + τ)] } = Σ_{i=1}^{n} E{A_i^2} cos ω_i t cos ω_i (t + τ) + E{B_i^2} sin ω_i t sin ω_i (t + τ) = σ^2 Σ_{i=1}^{n} cos ω_i τ = R_XX(τ)
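A simulation sketch of this process (the number of terms n = 3, the frequencies ω_i, and σ = 1 are illustrative assumptions): the ensemble average of X(t)X(t + τ) estimated at two different values of t gives nearly the same number, consistent with wide-sense stationarity.

```python
import numpy as np

rng = np.random.default_rng(4)
M = 100_000                               # number of realizations
omega = np.array([1.0, 2.5, 4.0])         # assumed frequencies omega_i
sigma = 1.0

A = sigma * rng.standard_normal((M, omega.size))   # A_i ~ N(0, sigma^2)
B = sigma * rng.standard_normal((M, omega.size))   # B_i ~ N(0, sigma^2)

def X(t):
    # X(t) = sum_i A_i cos(omega_i t) + B_i sin(omega_i t), one value per realization
    return (A * np.cos(omega * t) + B * np.sin(omega * t)).sum(axis=1)

tau = 0.7
R_at_t1 = np.mean(X(1.0) * X(1.0 + tau))            # estimate of E{X(t)X(t+tau)} at t = 1
R_at_t2 = np.mean(X(5.0) * X(5.0 + tau))            # same estimate at t = 5
R_theory = sigma**2 * np.sum(np.cos(omega * tau))   # sigma^2 * sum_i cos(omega_i tau)
```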

57 Example III (continued). A_i and B_i have a joint Gaussian distribution for i = 1, 2, 3, ..., n, with µ_Ai = µ_Bi = 0 and σ^2_Ai = σ^2_Bi = σ^2. They are assumed to be uncorrelated. A random process is defined as X(t) = Σ_{i=1}^{n} (A_i cos ω_i t + B_i sin ω_i t). Is it SSS?

58 Other forms of Stationarity I. A random process X(t) is asymptotically stationary if the distribution of X(t_1 + τ), X(t_2 + τ), ..., X(t_k + τ) does not depend on τ when τ is large. A random process is stationary in an interval if P[X(t_1) ≤ x_1, X(t_2) ≤ x_2, ..., X(t_k) ≤ x_k] = P[X(t_1 + τ) ≤ x_1, X(t_2 + τ) ≤ x_2, ..., X(t_k + τ) ≤ x_k] holds for all τ for which t_1, t_2, ..., t_k, t_1 + τ, t_2 + τ, ..., t_k + τ lie in an interval that is a subset of Γ.

59 Other forms of Stationarity II. A random process X(t) is said to have stationary increments if its increments Y(t) = X(t + τ) - X(t) form a stationary process for every τ. A random process is cyclostationary or periodically stationary if it is stationary under a shift of the time origin by integer multiples of a constant T_0 (the period of the process).

60 Autocorrelation Fct. of a Real WSS RP. The autocorrelation function of a real WSS random process is given by R_XX(τ) = E{X(t) X(t + τ)}. The autocorrelation function satisfies the following properties: 1. If X(t) is a voltage waveform across a 1-Ω resistance, then the ensemble average value of X^2(t) is the average value of the power delivered to the 1-Ω resistance by X(t): E{X^2(t)} = Average Power = R_XX(0) ≥ 0

61 Autocorrelation Fct. of a Real WSS RP. 2. The autocorrelation function R_XX(τ) is an even function of τ: R_XX(τ) = R_XX(-τ). 3. The autocorrelation function R_XX(τ) is bounded: |R_XX(τ)| ≤ R_XX(0)

62 Proof of 3. E{[X(t + τ) - X(t)]^2} ≥ 0 and E{[X(t + τ) + X(t)]^2} ≥ 0, which implies E{X^2(t + τ)} + E{X^2(t)} - 2R_XX(τ) ≥ 0 and E{X^2(t + τ)} + E{X^2(t)} + 2R_XX(τ) ≥ 0. Since E{X^2(t + τ)} = E{X^2(t)} = R_XX(0), hence 2R_XX(0) - 2R_XX(τ) ≥ 0 and 2R_XX(0) + 2R_XX(τ) ≥ 0, i.e. |R_XX(τ)| ≤ R_XX(0)

63 Cross-correlation Function. The cross-correlation function of two real random processes that are jointly WSS is R_XY(τ) = E{X(t) Y(t + τ)}. It has the following properties: 1. R_XY(τ) = R_YX(-τ). 2. |R_XY(τ)| ≤ sqrt(R_XX(0) R_YY(0)). 3. |R_XY(τ)| ≤ (1/2)[R_XX(0) + R_YY(0)]. 4. R_XY(τ) = 0 if the processes are orthogonal, and R_XY(τ) = µ_X µ_Y if the processes are independent.

64 Power Spectral Density Function I. For a deterministic signal the average power in the signal is defined as P_x = lim_{T→∞} (1/2T) ∫_{-T}^{T} x^2(t) dt. If the deterministic signal is periodic with period T_0, then the time-averaged autocorrelation function is defined as R_xx(τ)_{T_0} = (1/T_0) ∫_0^{T_0} x(t) x(t + τ) dt. If S_xx(f) is the Fourier transform of R_xx(τ)_{T_0}, then P_x = ∫_{-∞}^{∞} S_xx(f) df

65 Power Spectral Density Function II. The Power Spectral Density Function (psd) of a WSS random process X(t) is defined as S_XX(f) = F{R_XX(τ)} = ∫_{-∞}^{∞} R_XX(τ) exp(-j2πfτ) dτ, called the Wiener-Khinchine relation. The autocorrelation is recovered by R_XX(τ) = F^{-1}{S_XX(f)} = ∫_{-∞}^{∞} S_XX(f) exp(j2πfτ) df

66 Properties of psd. The psd function is also called the spectrum of X(t), and has the following properties: 1. S_XX(f) is real and nonnegative. 2. The average power in X(t) is given by E{X^2(t)} = R_XX(0) = ∫_{-∞}^{∞} S_XX(f) df. 3. For X(t) real, R_XX(τ) is an even function and hence S_XX(-f) = S_XX(f). 4. If X(t) has periodic components, then S_XX(f) will have impulses.
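A numerical illustration of the Wiener-Khinchine relation and properties 1-3, using an assumed autocorrelation R_XX(τ) = exp(-|τ|) (not from the slides): its transform comes out real, nonnegative and even, and it integrates to approximately R_XX(0).

```python
import numpy as np

# Assumed example autocorrelation of a WSS process (not from the slides): R(tau) = exp(-|tau|).
dtau = 0.01
tau = np.arange(-50, 50, dtau)
R = np.exp(-np.abs(tau))

f = np.linspace(-2, 2, 401)
# Wiener-Khinchine: S(f) = integral of R(tau) exp(-j 2 pi f tau) dtau, here as a Riemann sum
S = np.array([np.sum(R * np.exp(-2j * np.pi * fk * tau)) * dtau for fk in f])

S_real = S.real                           # imaginary part is ~0 since R is real and even
power_in_band = np.trapz(S_real, f)       # close to R(0) = 1; most power lies in |f| < 2
S_exact = 2 / (1 + (2 * np.pi * f) ** 2)  # closed form for comparison
```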

67 Lowpass and Bandpass Processes. A random process is lowpass if its psd is zero for |f| > B, and B is called the bandwidth of the process. A random process is bandpass if its psd is zero outside the band f_c - B/2 ≤ |f| ≤ f_c + B/2 (see the figure on p. 47 of the text).

68 Power and bandwidth Calculations I. The power in a band of frequencies, f_1 to f_2, for 0 < f_1 < f_2, is for a real random process X(t): P_X[f_1, f_2] = 2 ∫_{f_1}^{f_2} S_XX(f) df. For a zero mean random process with continuous psd, the effective bandwidth is defined as B_eff = ∫_{-∞}^{∞} S_XX(f) df / (2 max[S_XX(f)])

69 Power and bandwidth Calculations II. The effective bandwidth is related to the correlation time τ_c = ∫_{-∞}^{∞} R_XX(τ) dτ / R_XX(0). If S_XX(f) is continuous and has its maximum at f = 0, then B_eff = 1/(2τ_c)
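Continuing the assumed example R_XX(τ) = exp(-|τ|) from the previous sketch: its psd is continuous with its maximum at f = 0, so τ_c and B_eff can be computed directly and should satisfy B_eff = 1/(2τ_c).

```python
import numpy as np

dtau = 0.001
tau = np.arange(-50, 50, dtau)
R = np.exp(-np.abs(tau))          # assumed autocorrelation, R(0) = 1

tau_c = np.trapz(R, tau) / 1.0    # correlation time: integral of R divided by R(0) -> 2
S_max = np.trapz(R, tau)          # S_XX(0) = integral of R(tau) dtau = 2, the maximum of the psd
total_power = 1.0                 # integral of S_XX(f) df = R(0) = 1
B_eff = total_power / (2 * S_max) # -> 0.25
assert np.isclose(B_eff, 1 / (2 * tau_c))
```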

70 Cross-power Spectral Density Function. For two real-valued random processes X(t) and Y(t), the cross-power spectral density (cpsd) function S_XY(f) is defined by S_XY(f) = ∫_{-∞}^{∞} R_XY(τ) exp(-j2πfτ) dτ, and the cross-correlation function is recovered as R_XY(τ) = ∫_{-∞}^{∞} S_XY(f) exp(j2πfτ) df. The cross-power spectral density function will in general be a complex-valued function.

71 Cross-power Spectral Density Function. Some properties of the cpsd are: 1. S_XY(f) = S_YX(-f) = S*_YX(f). 2. Re(S_XY(f)) is an even function of f, and Im(S_XY(f)) is an odd function of f. 3. S_XY(f) = 0 if X(t) and Y(t) are orthogonal, and S_XY(f) = µ_X µ_Y δ(f) if X(t) and Y(t) are independent.

72 Coherence function. The real-valued coherence function between two random processes is defined as ρ^2_XY(f) = |S_XY(f)|^2 / (S_XX(f) S_YY(f)). When ρ^2_XY(f_0) = 0, X(t) and Y(t) are incoherent at f_0. When ρ^2_XY(f_0) = 1, X(t) and Y(t) are fully coherent at f_0. If X(t) and Y(t) are statistically independent, then ρ^2_XY(f) = 0 at all frequencies except at f = 0.

73 PSD of Random Sequences. The Power Spectral Density (psd) of a random sequence X(nT_s) with a uniform sampling time of one second (T_s = 1) is defined by the Fourier transform of the autocorrelation sequence: S_XX(f) = Σ_{n=-∞}^{∞} exp(-j2πfn) R_XX(n), for -1/2 < f < 1/2. The auto-correlation is recovered as R_XX(n) = ∫_{-1/2}^{1/2} S_XX(f) exp(j2πfn) df. If the sampling time is not 1, then the psd is defined for -1/(2T_s) < f < 1/(2T_s).
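A sketch evaluating this sum for the binary Markov sequence of Example II (T_s = 1). The autocovariance is assumed to decay geometrically with ratio 1/6, which reproduces the values R(0) = 0.6, R(1) = 0.4, R(2) ≈ 0.367 computed earlier; the constant mean contributes an impulse 0.36 δ(f) that is omitted here.

```python
import numpy as np

# Autocovariance of the binary Markov sequence from Example II, assumed to decay
# geometrically: C(k) = R_XX(k) - mu^2 = 0.24 * (1/6)^|k| with mu = 0.6, which
# reproduces R(0) = 0.6, R(1) = 0.4, R(2) ~ 0.367.
def C(k):
    return 0.24 * (1 / 6) ** np.abs(k)

k = np.arange(-30, 31)              # the terms decay fast, so a short sum suffices
f = np.linspace(-0.5, 0.5, 201)     # T_s = 1, so the psd lives on -1/2 < f < 1/2

# S(f) = sum_k C(k) exp(-j 2 pi f k); the result is real and nonnegative
S = np.array([np.sum(C(k) * np.exp(-2j * np.pi * fk * k)) for fk in f]).real

# Sanity check: C(0) equals the integral of S over one period
assert np.isclose(np.trapz(S, f), C(0), atol=1e-3)
```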
