
1 Chapter 5 Random Variables and Processes Wireless Information Transmission System Lab. Institute of Communications Engineering, National Sun Yat-sen University

2 Table of Contents 5.1 Introduction 5.2 Probability 5.3 Random Variables 5.4 Statistical Averages 5.5 Random Processes 5.6 Mean, Correlation and Covariance Functions 5.7 Transmission of a Random Process through a Linear Filter 5.8 Power Spectral Density 5.9 Gaussian Process 5.10 Noise 5.11 Narrowband Noise

3 5.1 Introduction The Fourier transform is a mathematical tool for the representation of deterministic signals. Deterministic signals: the class of signals that may be modeled as completely specified functions of time. A signal is random if it is not possible to predict its precise value in advance. A random process consists of an ensemble (family) of sample functions, each of which varies randomly with time. A random variable is obtained by observing a random process at a fixed instant of time.

4 5.2 Probability Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance. Example: the experiment may be the observation of the result of tossing a fair coin. In this experiment, the possible outcomes of a trial are heads or tails. If an experiment has K possible outcomes, then for the kth possible outcome we have a point called the sample point, which we denote by s_k. With this basic framework, we make the following definitions: The set of all possible outcomes of the experiment is called the sample space, which we denote by S. An event corresponds to either a single sample point or a set of sample points in the space S.

5 5.2 Probability A single sample point is called an elementary event. The entire sample space S is called the sure event, and the null set φ is called the null or impossible event. Two events are mutually exclusive if the occurrence of one event precludes the occurrence of the other event. A probability measure P is a function that assigns a non-negative number to an event A in the sample space S and satisfies the following three properties (axioms):
1. 0 ≤ P[A] ≤ 1 (5.1)
2. P[S] = 1 (5.2)
3. If A and B are two mutually exclusive events, then P[A ∪ B] = P[A] + P[B] (5.3)

6 5.2 Probability

7 5.2 Probability The following properties of the probability measure P may be derived from the above axioms:
1. P[Ā] = 1 − P[A], where Ā denotes the complement of A (5.4)
2. When events A and B are not mutually exclusive: P[A ∪ B] = P[A] + P[B] − P[A ∩ B] (5.5)
3. If A_1, A_2, ..., A_m are mutually exclusive events that include all possible outcomes of the random experiment, then P[A_1] + P[A_2] + ... + P[A_m] = 1 (5.6)

8 5.2 Probability Let P[B|A] denote the probability of event B, given that event A has occurred. The probability P[B|A] is called the conditional probability of B given A, defined by
P[B|A] = P[A ∩ B] / P[A] (5.7)
We may write Eq. (5.7) as P[A ∩ B] = P[B|A]P[A] (5.8)
It is apparent that we may also write P[A ∩ B] = P[A|B]P[B] (5.9)
From Eqs. (5.8) and (5.9), provided P[A] ≠ 0, we may determine P[B|A] by using Bayes' rule
P[B|A] = P[A|B]P[B] / P[A] (5.10)

9 5.2 Conditional Probability Suppose that the conditional probability P[B|A] is simply equal to the elementary probability of occurrence of event B, that is
P[B|A] = P[B], so that P[A ∩ B] = P[A]P[B]
and therefore
P[A|B] = P[A ∩ B] / P[B] = P[A]P[B] / P[B] = P[A] (5.13)
Events A and B that satisfy this condition are said to be statistically independent.

10 5.2 Conditional Probability Example 5.1 Binary Symmetric Channel This channel is said to be discrete in that it is designed to handle discrete messages. The channel is memoryless in the sense that the channel output at any time depends only on the channel input at that time. The channel is symmetric, which means that the probability of receiving symbol 1 when symbol 0 is sent is the same as the probability of receiving symbol 0 when symbol 1 is sent.

11 5.2 Conditional Probability Example 5.1 Binary Symmetric Channel (continued) The a priori probabilities of sending binary symbols 0 and 1:
P[A_0] = p_0, P[A_1] = p_1
The conditional probabilities of error: P[B_1|A_0] = P[B_0|A_1] = p
The probability of receiving symbol 0 is given by:
P[B_0] = P[B_0|A_0]P[A_0] + P[B_0|A_1]P[A_1] = (1 − p)p_0 + p·p_1
The probability of receiving symbol 1 is given by:
P[B_1] = P[B_1|A_0]P[A_0] + P[B_1|A_1]P[A_1] = p·p_0 + (1 − p)p_1

12 5.2 Conditional Probability Example 5.1 Binary Symmetric Channel (continued) The a posteriori probabilities P[A_0|B_0] and P[A_1|B_1]:
P[A_0|B_0] = P[B_0|A_0]P[A_0] / P[B_0] = (1 − p)p_0 / ((1 − p)p_0 + p·p_1)
P[A_1|B_1] = P[B_1|A_1]P[A_1] / P[B_1] = (1 − p)p_1 / (p·p_0 + (1 − p)p_1)
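The a posteriori formulas above are easy to check numerically. A minimal Python sketch follows; the values of p0, p1, and p are illustrative assumptions, not taken from the example:

```python
# Hypothetical numbers for the binary symmetric channel of Example 5.1.
p0, p1 = 0.6, 0.4   # a priori probabilities P[A0], P[A1] (assumed values)
p = 0.1             # crossover probability P[B1|A0] = P[B0|A1] (assumed)

# Total probability of each received symbol.
P_B0 = (1 - p) * p0 + p * p1
P_B1 = p * p0 + (1 - p) * p1

# A posteriori probabilities via Bayes' rule.
P_A0_given_B0 = (1 - p) * p0 / P_B0
P_A1_given_B1 = (1 - p) * p1 / P_B1

print(P_B0 + P_B1)                      # 1.0 (sanity check)
print(P_A0_given_B0, P_A1_given_B1)     # posteriors given each received symbol
```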

13 5.3 Random Variables We denote the random variable as X(s) or just X. X is a function. A random variable may be discrete or continuous. Consider the random variable X and the probability of the event X ≤ x. We denote this probability by P[X ≤ x]. To simplify our notation, we write
F_X(x) = P[X ≤ x] (5.15)
The function F_X(x) is called the cumulative distribution function (cdf) or simply the distribution function of the random variable X. The distribution function F_X(x) has the following properties: 0 ≤ F_X(x) ≤ 1, and F_X(x_1) ≤ F_X(x_2) if x_1 < x_2.

14 5.3 Random Variables There may be more than one random variable associated with the same random experiment.

15 5.3 Random Variables If the distribution function is continuously differentiable, then
f_X(x) = dF_X(x)/dx (5.17)
f_X(x) is called the probability density function (pdf) of the random variable X. The probability of the event x_1 < X ≤ x_2 equals
P[x_1 < X ≤ x_2] = P[X ≤ x_2] − P[X ≤ x_1] = F_X(x_2) − F_X(x_1) = ∫_{x_1}^{x_2} f_X(ξ) dξ (5.19)
The probability density function must always be a nonnegative function with a total area of one.

16 5.3 Random Variables Example 5.2 Uniform Distribution
f_X(x) = 0 for x ≤ a; 1/(b − a) for a < x ≤ b; 0 for x > b
F_X(x) = 0 for x ≤ a; (x − a)/(b − a) for a < x ≤ b; 1 for x > b

17 5.3 Random Variables Several Random Variables Consider two random variables X and Y. We define the joint distribution function F_{X,Y}(x,y) as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y:
F_{X,Y}(x,y) = P[X ≤ x, Y ≤ y] (5.23)
Suppose that the joint distribution function F_{X,Y}(x,y) is continuous everywhere, and that the partial derivative
f_{X,Y}(x,y) = ∂²F_{X,Y}(x,y) / (∂x ∂y) (5.24)
exists and is continuous everywhere. We call the function f_{X,Y}(x,y) the joint probability density function of the random variables X and Y.

18 5.3 Random Variables Several Random Variables The joint distribution function F_{X,Y}(x,y) is a monotone nondecreasing function of both x and y, and
∫∫ f_{X,Y}(ξ,η) dξ dη = 1
Marginal density f_X(x):
F_X(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} f_{X,Y}(ξ,η) dη dξ, so f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x,η) dη (5.27)
Suppose that X and Y are two continuous random variables with joint probability density function f_{X,Y}(x,y). The conditional probability density function of Y given that X = x is defined by
f_Y(y|x) = f_{X,Y}(x,y) / f_X(x) (5.28)

19 5.3 Random Variables Several Random Variables If the random variables X and Y are statistically independent, then knowledge of the outcome of X can in no way affect the distribution of Y:
f_Y(y|x) = f_Y(y), so f_{X,Y}(x,y) = f_X(x) f_Y(y) (5.32)
P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B] (5.33)

20 5.3 Random Variables Example 5.3 Binomial Random Variable Consider a sequence of coin-tossing experiments where the probability of a head is p, and let X_n be the Bernoulli random variable representing the outcome of the nth toss. Let Y be the number of heads that occur on N tosses of the coin:
Y = Σ_{n=1}^{N} X_n
P[Y = y] = (N choose y) p^y (1 − p)^{N−y}, where (N choose y) = N! / (y!(N − y)!)
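A short Python sketch of Example 5.3: the closed-form PMF is compared against a Monte Carlo coin-tossing experiment. N, p, and the number of trials are assumed values:

```python
import math
import random

N, p, trials = 10, 0.3, 200_000   # assumed parameters for illustration

def binom_pmf(y, N, p):
    # P[Y = y] = C(N, y) p^y (1 - p)^(N - y)
    return math.comb(N, y) * p**y * (1 - p)**(N - y)

counts = [0] * (N + 1)
for _ in range(trials):
    heads = sum(random.random() < p for _ in range(N))  # Y = sum of Bernoullis
    counts[heads] += 1

for y in range(N + 1):
    print(y, binom_pmf(y, N, p), counts[y] / trials)    # theory vs. simulation
```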

21 5.4 Statistical Averages The expected value or mean of a random variable X is defined by
μ_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx (5.36)
Function of a Random Variable Let X denote a random variable, and let g(X) denote a real-valued function defined on the real line. We denote
Y = g(X) (5.37)
To find the expected value of the random variable Y:
E[Y] = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−∞}^{∞} g(x) f_X(x) dx (5.38)

22 5.4 Statistical Averages Example 5.4 Cosinusoidal Random Variable Let Y = g(X) = cos(X), where X is a random variable uniformly distributed in the interval (−π, π):
f_X(x) = 1/(2π) for −π < x < π; 0 otherwise
E[Y] = ∫_{−π}^{π} (cos x)(1/2π) dx = (1/2π) sin x |_{x=−π}^{π} = 0
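A quick Monte Carlo check of this expectation (the sample size is an arbitrary choice):

```python
import math
import random

# E[cos X] for X uniform on (-pi, pi) should be 0, as derived above.
n = 1_000_000
total = sum(math.cos(random.uniform(-math.pi, math.pi)) for _ in range(n))
print(total / n)   # close to the analytical value 0
```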

23 5.4 Statistical Averages Moments For the special case of g(X) = X^n, we obtain the nth moment of the probability distribution of the random variable X; that is,
E[X^n] = ∫_{−∞}^{∞} x^n f_X(x) dx (5.39)
Mean-square value of X:
E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx (5.40)
The nth central moment is
E[(X − μ_X)^n] = ∫_{−∞}^{∞} (x − μ_X)^n f_X(x) dx (5.41)

24 5.4 Statistical Averages For n = 2 the second central moment is referred to as the variance of the random variable X, written as
var[X] = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx (5.42)
The variance of a random variable X is commonly denoted as σ_X². The square root of the variance is called the standard deviation of the random variable X.
σ_X² = var[X] = E[(X − μ_X)²] = E[X² − 2μ_X X + μ_X²] = E[X²] − 2μ_X E[X] + μ_X² = E[X²] − μ_X² (5.44)

25 5.4 Statistical Averages Chebyshev Inequality Suppose X is an arbitrary random variable with finite mean m_x and finite variance σ_x². For any positive number δ:
P[|X − m_x| ≥ δ] ≤ σ_x² / δ²
Proof:
σ_x² = ∫_{−∞}^{∞} (x − m_x)² p(x) dx ≥ ∫_{|x−m_x|≥δ} (x − m_x)² p(x) dx ≥ δ² ∫_{|x−m_x|≥δ} p(x) dx = δ² P[|X − m_x| ≥ δ]

26 5.4 Statistical Averages Chebyshev Inequality Another way to view the Chebyshev bound is to work with the zero-mean random variable Y = X − m_x. Define a function g(Y) as:
g(Y) = 1 for |Y| ≥ δ; 0 for |Y| < δ, so that E[g(Y)] = P[|Y| ≥ δ]
Upper-bound g(Y) by the quadratic (Y/δ)², i.e., g(Y) ≤ (Y/δ)². The tail probability is then
P[|Y| ≥ δ] = E[g(Y)] ≤ E[(Y/δ)²] = E[Y²]/δ² = σ_y²/δ² = σ_x²/δ²

27 5.4 Statistical Averages Chebyshev Inequality A quadratic upper bound on g(Y) is used in obtaining the tail probability (Chebyshev bound). For many practical applications, the Chebyshev bound is extremely loose.
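A small Python sketch illustrating how loose the bound can be, here for a Gaussian random variable (the parameters are assumed values):

```python
import random

# Compare the Chebyshev bound sigma^2/delta^2 with the actual tail
# probability P[|X - m| >= delta] for X ~ N(m, sigma^2).
m, sigma, n = 0.0, 1.0, 200_000
xs = [random.gauss(m, sigma) for _ in range(n)]

for delta in (1.0, 2.0, 3.0):
    actual = sum(abs(x - m) >= delta for x in xs) / n
    bound = sigma**2 / delta**2
    print(f"delta={delta}: actual tail ~ {actual:.4f}, Chebyshev bound = {bound:.4f}")
```

For delta = 3 the true Gaussian tail is about 0.0027 while the bound is 0.111, roughly forty times larger.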

28 5.4 Statistical Averages The characteristic function ψ_X(jv) is defined as the expectation of the complex exponential function exp(jvX), as shown by
ψ_X(jv) = E[exp(jvX)] = ∫_{−∞}^{∞} f_X(x) exp(jvx) dx (5.45)
In other words, the characteristic function ψ_X(jv) is the Fourier transform of the probability density function f_X(x). Analogous with the inverse Fourier transform:
f_X(x) = (1/2π) ∫_{−∞}^{∞} ψ_X(jv) exp(−jvx) dv (5.46)

29 5.4 Statistical Averages Characteristic Functions The first moment (mean) can be obtained by:
E[X] = m_x = −j dψ_X(jv)/dv |_{v=0}
Since the differentiation process can be repeated, the nth moment can be calculated by:
E[X^n] = (−j)^n d^n ψ_X(jv)/dv^n |_{v=0}

30 5.4 Statistical Averages Characteristic Functions Determining the PDF of a sum of statistically independent random variables:
Y = Σ_{i=1}^{n} X_i
ψ_Y(jv) = E[e^{jvY}] = E[exp(jv Σ_{i=1}^{n} X_i)] = E[Π_{i=1}^{n} e^{jvX_i}] = ∫...∫ Π_{i=1}^{n} e^{jvx_i} p(x_1, x_2, ..., x_n) dx_1 dx_2 ... dx_n
Since the random variables are statistically independent, p(x_1, x_2, ..., x_n) = p(x_1)p(x_2)...p(x_n), so
ψ_Y(jv) = Π_{i=1}^{n} ψ_{X_i}(jv)
If the X_i are iid (independent and identically distributed),
ψ_Y(jv) = [ψ_X(jv)]^n

31 5.4 Statistical Averages Characteristic Functions The PDF of Y is determined from the inverse Fourier transform of ψ_Y(jv). Since the characteristic function of the sum of n statistically independent random variables is equal to the product of the characteristic functions of the individual random variables, it follows that the PDF of Y is the n-fold convolution of the PDFs of the X_i. Usually, the n-fold convolution is more difficult to perform than the characteristic-function method in determining the PDF of Y.

32 5.4 Statistical Averages Example 5.5 Gaussian Random Variable The probability density function of a Gaussian random variable is defined by:
f_X(x) = (1/(√(2π)σ)) exp(−(x − μ_X)²/(2σ²)), −∞ < x < ∞
The characteristic function of a Gaussian random variable with mean m_x and variance σ² is (Problem 5.1):
ψ(jv) = ∫_{−∞}^{∞} e^{jvx} (1/(√(2π)σ)) e^{−(x−m_x)²/(2σ²)} dx = e^{jvm_x − v²σ²/2}
It can be shown that the central moments of a Gaussian random variable are given by:
E[(X − m_x)^k] = μ_k = 1·3·5···(k − 1)σ^k for even k; 0 for odd k

33 5.4 Statistical Averages Example 5.5 Gaussian Random Variable (cont.) The sum of n statistically independent Gaussian random variables is also a Gaussian random variable. Proof:
Y = Σ_{i=1}^{n} X_i
ψ_Y(jv) = Π_{i=1}^{n} ψ_{X_i}(jv) = Π_{i=1}^{n} e^{jvm_i − v²σ_i²/2} = e^{jvm_y − v²σ_y²/2}
where m_y = Σ_{i=1}^{n} m_i and σ_y² = Σ_{i=1}^{n} σ_i². Therefore, Y is Gaussian-distributed with mean m_y and variance σ_y².

34 5.4 Statistical Averages Joint Moments Consider next a pair of random variables X and Y. A set of statistical averages of importance in this case are the joint moments, namely, the expected value of X^i Y^k, where i and k may assume any positive integer values. We may thus write
E[X^i Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^i y^k f_{X,Y}(x,y) dx dy (5.51)
A joint moment of particular importance is the correlation defined by E[XY], which corresponds to i = k = 1. Covariance of X and Y:
cov[XY] = E[(X − E[X])(Y − E[Y])] = E[XY] − μ_X μ_Y

35 5.4 Statistical Averages Correlation coefficient of X and Y:
ρ = cov[XY] / (σ_X σ_Y) (5.54)
σ_X and σ_Y denote the standard deviations of X and Y. We say X and Y are uncorrelated if and only if cov[XY] = 0. Note that if X and Y are statistically independent, then they are uncorrelated. The converse of the above statement is not necessarily true. We say X and Y are orthogonal if and only if E[XY] = 0.

36 5.4 Statistical Averages Example 5.6 Moments of a Bernoulli Random Variable Consider the coin-tossing experiment where the probability of a head is p. Let X be a random variable that takes the value 0 if the result is a tail and 1 if it is a head. We say that X is a Bernoulli random variable.
P(X = x) = 1 − p for x = 0; p for x = 1; 0 otherwise
E[X] = Σ_{k=0}^{1} k P[X = k] = 0·(1 − p) + 1·p = p
E[X²] = Σ_{k=0}^{1} k² P[X = k] = p
σ_X² = Σ_{k=0}^{1} (k − p)² P[X = k] = p²(1 − p) + (1 − p)²p = p(1 − p)
For the outcomes X_j and X_k of two tosses: E[X_j X_k] = p for j = k; p² for j ≠ k.

37 5.5 Random Processes An ensemble of sample functions. For a fixed time instant t_k,
{x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}
constitutes a random variable.

38 5.5 Random Processes At any given time instant, the value of a stochastic process is a random variable indexed by the parameter t. We denote such a process by X(t). In general, the parameter t is continuous, whereas X may be either continuous or discrete, depending on the characteristics of the source that generates the stochastic process. The noise voltage generated by a single resistor or a single information source represents a single realization of the stochastic process. It is called a sample function.

39 5.5 Random Processes The set of all possible sample functions constitutes an ensemble of sample functions or, equivalently, the stochastic process X(t). In general, the number of sample functions in the ensemble is assumed to be extremely large; often it is infinite. Having defined a stochastic process X(t) as an ensemble of sample functions, we may consider the values of the process at any set of time instants t_1 > t_2 > t_3 > ... > t_n, where n is any positive integer. In general, the random variables X_{t_i} = X(t_i), i = 1, 2, ..., n, are characterized statistically by their joint PDF p(x_{t_1}, x_{t_2}, ..., x_{t_n}).

40 5.5 Random Processes Stationary stochastic processes Consider another set of n random variables X_{t_i + t} = X(t_i + t), i = 1, 2, ..., n, where t is an arbitrary time shift. These random variables are characterized by the joint PDF p(x_{t_1 + t}, x_{t_2 + t}, ..., x_{t_n + t}). The joint PDFs of the random variables X_{t_i} and X_{t_i + t}, i = 1, 2, ..., n, may or may not be identical. When they are identical, i.e., when
p(x_{t_1}, x_{t_2}, ..., x_{t_n}) = p(x_{t_1 + t}, x_{t_2 + t}, ..., x_{t_n + t})
for all t and all n, the process is said to be stationary in the strict sense (SSS). When the joint PDFs are different, the stochastic process is non-stationary.

41 5.5 Random Processes Averages for a stochastic process are called ensemble averages. The nth moment of the random variable X_{t_i} is defined as:
E[X_{t_i}^n] = ∫_{−∞}^{∞} x_{t_i}^n p(x_{t_i}) dx_{t_i}
In general, the value of the nth moment will depend on the time instant t_i if the PDF of X_{t_i} depends on t_i. When the process is stationary,
p(x_{t_i + t}) = p(x_{t_i}) for all t.
Therefore, the PDF is independent of time, and, as a consequence, the nth moment is independent of time.

42 5.5 Random Processes Two random variables: X_{t_i} = X(t_i), i = 1, 2. The correlation is measured by the joint moment:
E[X_{t_1} X_{t_2}] = ∫∫ x_{t_1} x_{t_2} p(x_{t_1}, x_{t_2}) dx_{t_1} dx_{t_2}
Since this joint moment depends on the time instants t_1 and t_2, it is denoted by R(t_1, t_2). R(t_1, t_2) is called the autocorrelation function of the stochastic process. For a stationary stochastic process, the joint moment is:
E[X_{t_1} X_{t_2}] = R(t_1, t_2) = R(t_1 − t_2) = R(τ)
Moreover, R(−τ) = E[X_{t_1} X_{t_1 + τ}] = R(τ), so the autocorrelation is an even function of τ.
Average power in the process X(t): R(0) = E[X_t²].

43 5.5 Random Processes Wide-sense stationary (WSS) A wide-sense stationary process has the property that the mean value of the process is independent of time (a constant) and that the autocorrelation function satisfies the condition R(t_1, t_2) = R(t_1 − t_2). Wide-sense stationarity is a less stringent condition than strict-sense stationarity.

44 5.5 Random Processes Auto-covariance function The auto-covariance function of a stochastic process is defined as:
μ(t_1, t_2) = E[(X_{t_1} − m(t_1))(X_{t_2} − m(t_2))] = R(t_1, t_2) − m(t_1)m(t_2)
When the process is stationary, the auto-covariance function simplifies to:
μ(t_1, t_2) = μ(t_1 − t_2) = μ(τ) = R(τ) − m²
For a Gaussian random process, higher-order moments can be expressed in terms of first and second moments. Consequently, a Gaussian random process is completely characterized by its first two moments.

45 5.6 Mean, Correlation and Covariance Functions Consider a random process X(t). We define the mean of the process X(t) as the expectation of the random variable obtained by observing the process at some time t, as shown by
μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx (5.57)
A random process is said to be stationary to first order if the distribution function (and therefore density function) of X(t) does not vary with time:
f_{X(t_1)}(x) = f_{X(t_2)}(x) for all t_1 and t_2, and μ_X(t) = μ_X for all t (5.59)
The mean of the random process is a constant. The variance of such a process is also constant.

46 5.6 Mean, Correlation and Covariance Functions We define the autocorrelation function of the process X(t) as the expectation of the product of two random variables X(t_1) and X(t_2):
R_X(t_1, t_2) = E[X(t_1)X(t_2)] = ∫∫ x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2 (5.60)
We say a random process X(t) is stationary to second order if the joint distribution f_{X(t_1),X(t_2)}(x_1, x_2) depends only on the difference between the observation times t_1 and t_2:
R_X(t_1, t_2) = R_X(t_2 − t_1) for all t_1 and t_2 (5.61)
The autocovariance function of a stationary random process X(t) is written as
C_X(t_1, t_2) = E[(X(t_1) − μ_X)(X(t_2) − μ_X)] = R_X(t_2 − t_1) − μ_X² (5.62)

47 5.6 Mean, Correlation and Covariance Functions For convenience of notation, we redefine the autocorrelation function of a stationary process X(t) as
R_X(τ) = E[X(t + τ)X(t)] for all t (5.63)
This autocorrelation function has several important properties:
1. R_X(0) = E[X²(t)] (5.64)
2. R_X(τ) = R_X(−τ) (5.65)
3. |R_X(τ)| ≤ R_X(0) (5.67)
Proof of (5.64) can be obtained from (5.63) by putting τ = 0.

48 5.6 Mean, Correlation and Covariance Functions Proof of (5.65):
R_X(−τ) = E[X(t − τ)X(t)] = E[X(t)X(t + τ)] = R_X(τ)
Proof of (5.67):
E[(X(t + τ) ± X(t))²] ≥ 0
E[X²(t + τ)] ± 2E[X(t + τ)X(t)] + E[X²(t)] ≥ 0
2R_X(0) ± 2R_X(τ) ≥ 0
−R_X(0) ≤ R_X(τ) ≤ R_X(0), i.e., |R_X(τ)| ≤ R_X(0)

49 5.6 Mean, Correlation and Covariance Functions The physical significance of the autocorrelation function R_X(τ) is that it provides a means of describing the interdependence of two random variables obtained by observing a random process X(t) at times τ seconds apart.

50 5.6 Mean, Correlation and Covariance Functions Example 5.7 Sinusoidal Signal with Random Phase Consider a sinusoidal signal with random phase:
X(t) = A cos(2πf_c t + Θ), f_Θ(θ) = 1/(2π) for −π ≤ θ ≤ π; 0 elsewhere
R_X(τ) = E[X(t + τ)X(t)] = (A²/2) E[cos(4πf_c t + 2πf_c τ + 2Θ)] + (A²/2) E[cos(2πf_c τ)]
= (A²/2)(1/2π) ∫_{−π}^{π} cos(4πf_c t + 2πf_c τ + 2θ) dθ + (A²/2) cos(2πf_c τ)
= (A²/2) cos(2πf_c τ)
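A Monte Carlo sketch of this result: averaging over the random phase Θ at a fixed time t reproduces (A²/2)cos(2πf_cτ). A, f_c, t, and the τ grid are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)
A, fc, t = 2.0, 5.0, 0.123          # any fixed t works: the process is WSS
thetas = rng.uniform(-np.pi, np.pi, size=100_000)   # ensemble of phases

for tau in np.linspace(0, 0.5, 6):
    # ensemble average of X(t + tau) X(t) over the random phase
    products = (A * np.cos(2*np.pi*fc*(t + tau) + thetas)
                * A * np.cos(2*np.pi*fc*t + thetas))
    print(f"tau={tau:.2f}: estimate {products.mean():+.4f}, "
          f"theory {(A**2 / 2) * np.cos(2*np.pi*fc*tau):+.4f}")
```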

51 5.6 Mean, Correlation and Covariance Functions Averages for joint stochastic processes Let X(t) and Y(t) denote two stochastic processes and let X_{t_i} = X(t_i), i = 1, 2, ..., n, and Y_{t'_j} = Y(t'_j), j = 1, 2, ..., m, represent the random variables at times t_1 > t_2 > t_3 > ... > t_n and t'_1 > t'_2 > t'_3 > ... > t'_m, respectively. The two processes are characterized statistically by their joint PDF:
p(x_{t_1}, x_{t_2}, ..., x_{t_n}, y_{t'_1}, y_{t'_2}, ..., y_{t'_m})
The cross-correlation function of X(t) and Y(t), denoted by R_{xy}(t_1, t_2), is defined as the joint moment:
R_{xy}(t_1, t_2) = E[X_{t_1} Y_{t_2}] = ∫∫ x_{t_1} y_{t_2} p(x_{t_1}, y_{t_2}) dx_{t_1} dy_{t_2}
The cross-covariance is:
μ_{xy}(t_1, t_2) = R_{xy}(t_1, t_2) − m_x(t_1) m_y(t_2)

52 5.6 Mean, Correlation and Covariance Functions Averages for joint stochastic processes When the processes are jointly and individually stationary, we have R_{xy}(t_1, t_2) = R_{xy}(t_1 − t_2) and μ_{xy}(t_1, t_2) = μ_{xy}(t_1 − t_2):
R_{xy}(−τ) = E[X_{t_1} Y_{t_1 + τ}] = E[X_{t'_1 − τ} Y_{t'_1}] = R_{yx}(τ)
The stochastic processes X(t) and Y(t) are said to be statistically independent if and only if
p(x_{t_1}, x_{t_2}, ..., x_{t_n}, y_{t'_1}, y_{t'_2}, ..., y_{t'_m}) = p(x_{t_1}, x_{t_2}, ..., x_{t_n}) p(y_{t'_1}, y_{t'_2}, ..., y_{t'_m})
for all choices of t_i and t'_j and for all positive integers n and m. The processes are said to be uncorrelated if
R_{xy}(t_1, t_2) = E[X_{t_1}] E[Y_{t_2}], i.e., μ_{xy}(t_1, t_2) = 0

53 5.6 Mean, Correlation and Covariance Functions Example 5.9 Quadrature-Modulated Processes Consider a pair of quadrature-modulated processes X_1(t) and X_2(t):
X_1(t) = X(t) cos(2πf_c t + Θ), X_2(t) = X(t) sin(2πf_c t + Θ)
R_{12}(τ) = E[X_1(t) X_2(t − τ)]
= E[X(t) X(t − τ)] E[cos(2πf_c t + Θ) sin(2πf_c t − 2πf_c τ + Θ)]
= (1/2) R_X(τ) E[sin(4πf_c t − 2πf_c τ + 2Θ)] − (1/2) R_X(τ) sin(2πf_c τ)
= −(1/2) R_X(τ) sin(2πf_c τ)
R_{12}(0) = E[X_1(t) X_2(t)] = 0

54 5.6 Mean, Correlation and Covariance Functions Ergodic Processes In many instances, it is difficult or impossible to observe all sample functions of a random process at a given time. It is often more convenient to observe a single sample function for a long period of time. For a sample function x(t), the time average of the mean value over an observation period 2T is
μ_{x,T} = (1/2T) ∫_{−T}^{T} x(t) dt
For many stochastic processes of interest in communications, the time averages and ensemble averages are equal, a property known as ergodicity. This property implies that whenever an ensemble average is required, we may estimate it by using a time average.
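A minimal sketch of this idea for the random-phase sinusoid of Example 5.7, which is ergodic in the mean and autocorrelation: time averages over one realization are compared with the ensemble results. The sampling rate, record length, and lag are assumed values:

```python
import numpy as np

rng = np.random.default_rng(1)
A, fc, fs, T = 1.0, 10.0, 1000.0, 100.0       # one long observation of length T
t = np.arange(0, T, 1/fs)
x = A * np.cos(2*np.pi*fc*t + rng.uniform(-np.pi, np.pi))  # single sample function

print("time-average mean:", x.mean())          # ~ 0, the ensemble mean

tau = 0.02
lag = int(tau * fs)
r_time = np.mean(x[lag:] * x[:-lag])           # time-averaged autocorrelation
print("time-average R(tau):", r_time,
      "theory:", (A**2 / 2) * np.cos(2*np.pi*fc*tau))
```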

55 5.6 Mean, Correlation and Covariance Functions Cyclostationary Processes (in the wide sense) There is another important class of random processes commonly encountered in practice, the mean and autocorrelation function of which exhibit periodicity:
μ_X(t_1 + T) = μ_X(t_1)
R_X(t_1 + T, t_2 + T) = R_X(t_1, t_2)
for all t_1 and t_2. Modeling the process X(t) as cyclostationary adds a new dimension, namely, the period T, to the partial description of the process.

56 5.7 Transmission of a Random Process Through a Linear Filter Suppose that a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output. Assume that X(t) is a wide-sense stationary random process. The mean of the output random process Y(t) is given by
μ_Y(t) = E[Y(t)] = E[∫_{−∞}^{∞} h(τ_1) X(t − τ_1) dτ_1] = ∫_{−∞}^{∞} h(τ_1) E[X(t − τ_1)] dτ_1 = ∫_{−∞}^{∞} h(τ_1) μ_X(t − τ_1) dτ_1

57 5.7 Transmission of a Random Process Through a Linear Filter When the input random process X(t) is wide-sense stationary, the mean μ_X(t) is a constant μ_X, and the mean μ_Y(t) is also a constant μ_Y:
μ_Y = μ_X ∫_{−∞}^{∞} h(τ_1) dτ_1 = μ_X H(0)
where H(0) is the zero-frequency (dc) response of the system. The autocorrelation function of the output random process Y(t) is given by:
R_Y(t, u) = E[Y(t)Y(u)] = E[∫ h(τ_1) X(t − τ_1) dτ_1 ∫ h(τ_2) X(u − τ_2) dτ_2]
= ∫ dτ_1 h(τ_1) ∫ dτ_2 h(τ_2) E[X(t − τ_1) X(u − τ_2)]
= ∫ dτ_1 h(τ_1) ∫ dτ_2 h(τ_2) R_X(t − τ_1, u − τ_2)

58 5.7 Transmission of a Random Process Through a Linear Filter When the input X(t) is a wide-sense stationary random process, the autocorrelation function of X(t) is only a function of the difference between the observation times:
R_Y(τ) = ∫∫ h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) dτ_1 dτ_2 (5.90)
If the input to a stable linear time-invariant filter is a wide-sense stationary random process, then the output of the filter is also a wide-sense stationary random process.
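A discrete-time sketch of this result, assuming a white WSS input and an arbitrary FIR impulse response h[n]; for a white input, the discrete analogue of Eq. (5.90) gives output power R_Y(0) = σ² Σ_k h²[k]:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
h = np.array([0.5, 1.0, 0.3, -0.2])        # assumed impulse response
x = rng.normal(0, sigma, size=500_000)     # white input: R_X[m] = sigma^2 delta[m]
y = np.convolve(x, h, mode="valid")        # filter output

print("measured output power :", y.var())
print("theory sigma^2*sum(h^2):", sigma**2 * np.sum(h**2))
```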

59 5.8 Power Spectral Density The Fourier transform of the autocorrelation function R_X(τ) is called the power spectral density S_X(f) of the random process X(t):
S_X(f) = ∫_{−∞}^{∞} R_X(τ) exp(−j2πfτ) dτ (5.91)
R_X(τ) = ∫_{−∞}^{∞} S_X(f) exp(j2πfτ) df (5.92)
Equations (5.91) and (5.92) are basic relations in the theory of spectral analysis of random processes, and together they constitute what are usually called the Einstein-Wiener-Khintchine relations.

60 5.8 Power Spectral Density Properties of the Power Spectral Density
Property 1: S_X(0) = ∫_{−∞}^{∞} R_X(τ) dτ (5.93) Proof: Let f = 0 in Eq. (5.91).
Property 2: E[X²(t)] = ∫_{−∞}^{∞} S_X(f) df (5.94) Proof: Let τ = 0 in Eq. (5.92) and note that R_X(0) = E[X²(t)].
Property 3: S_X(f) ≥ 0 for all f (5.95)
Property 4: S_X(−f) = S_X(f) (5.96) Proof: From (5.91), S_X(−f) = ∫ R_X(τ) exp(j2πfτ) dτ; substituting τ → −τ and using R_X(−τ) = R_X(τ) gives S_X(−f) = ∫ R_X(τ) exp(−j2πfτ) dτ = S_X(f).

61 Proof of Eq. (5.95) It can be shown that (see Eq. (5.106))
S_Y(f) = S_X(f)|H(f)|²
R_Y(τ) = ∫ S_Y(f) exp(j2πfτ) df = ∫ S_X(f)|H(f)|² exp(j2πfτ) df
R_Y(0) = E[Y²(t)] = ∫ S_X(f)|H(f)|² df ≥ 0 for any H(f)
Suppose we let |H(f)| = 1 for an arbitrarily small interval f_1 ≤ f ≤ f_2, and H(f) = 0 outside this interval. Then we have:
∫_{f_1}^{f_2} S_X(f) df ≥ 0
This is possible if and only if S_X(f) ≥ 0 for all f. Conclusion: S_X(f) ≥ 0 for all f.

62 5.8 Power Spectral Density Example 5.10 Sinusoidal Signal with Random Phase Consider the random process X(t) = A cos(2πf_c t + Θ), where Θ is a uniformly distributed random variable over the interval (−π, π). The autocorrelation function of this random process is given in Example 5.7:
R_X(τ) = (A²/2) cos(2πf_c τ) (5.74)
Taking the Fourier transform of both sides of this relation:
S_X(f) = (A²/4)[δ(f − f_c) + δ(f + f_c)] (5.97)

63 5.8 Power Spectral Density Example 5.12 Mixing of a Random Process with a Sinusoidal Process A situation that often arises in practice is that of mixing (i.e., multiplication) of a WSS random process X(t) with a sinusoidal signal cos(2πf_c t + Θ), where the phase Θ is a random variable that is uniformly distributed over the interval (0, 2π). We determine the power spectral density of the random process Y(t) defined by:
Y(t) = X(t) cos(2πf_c t + Θ) (5.101)
We note that the random variable Θ is independent of X(t).

64 5.8 Power Spectral Density Example 5.12 Mixing of a Random Process with a Sinusoidal Process (continued) The autocorrelation function of Y(t) is given by:
R_Y(τ) = E[Y(t + τ)Y(t)]
= E[X(t + τ) cos(2πf_c t + 2πf_c τ + Θ) X(t) cos(2πf_c t + Θ)]
= E[X(t + τ)X(t)] E[cos(2πf_c t + 2πf_c τ + Θ) cos(2πf_c t + Θ)]
= (1/2) R_X(τ) E[cos(2πf_c τ) + cos(4πf_c t + 2πf_c τ + 2Θ)]
= (1/2) R_X(τ) cos(2πf_c τ)
Fourier transform:
S_Y(f) = (1/4)[S_X(f − f_c) + S_X(f + f_c)] (5.103)

65 5.8 Power Spectral Density Relation among the Power Spectral Densities of the Input and Output Random Processes Let S_Y(f) denote the power spectral density of the output random process Y(t) obtained by passing the random process X(t) through a linear filter of transfer function H(f):
S_Y(f) = ∫ R_Y(τ) e^{−j2πfτ} dτ
R_Y(τ) = ∫∫ h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) dτ_1 dτ_2 (5.90)
S_Y(f) = ∫∫∫ h(τ_1) h(τ_2) R_X(τ − τ_1 + τ_2) e^{−j2πfτ} dτ_1 dτ_2 dτ
Let τ_0 = τ − τ_1 + τ_2:
S_Y(f) = ∫∫∫ h(τ_1) h(τ_2) R_X(τ_0) e^{−j2πf(τ_0 + τ_1 − τ_2)} dτ_1 dτ_2 dτ_0
= ∫ h(τ_1) e^{−j2πfτ_1} dτ_1 ∫ h(τ_2) e^{j2πfτ_2} dτ_2 ∫ R_X(τ_0) e^{−j2πfτ_0} dτ_0
= H(f) H*(f) S_X(f) = |H(f)|² S_X(f) (5.106)
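A numerical sketch of Eq. (5.106): white noise is passed through an FIR filter and the ratio of the estimated output PSD to the estimated input PSD is compared with |H(f)|². The filter, record length, and Welch settings are assumed values:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 1.0
b = signal.firwin(64, 0.25)                    # an arbitrary low-pass FIR filter
x = rng.normal(0, 1.0, size=2**20)             # white WSS input
y = signal.lfilter(b, [1.0], x)                # filter output

f, Sxx = signal.welch(x, fs=fs, nperseg=4096)  # input PSD estimate
_, Syy = signal.welch(y, fs=fs, nperseg=4096)  # output PSD estimate
_, H = signal.freqz(b, worN=f, fs=fs)          # filter frequency response

# S_Y(f)/S_X(f) should track |H(f)|^2, up to estimation error.
err = np.max(np.abs(Syy / Sxx - np.abs(H)**2))
print("max deviation from |H(f)|^2:", err)
```

Taking the ratio of the two Welch estimates sidesteps one-sided/two-sided PSD scaling conventions, since any common scale factor cancels.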

66 5.8 Power Spectral Density Example 5.13 Comb Filter Consider the filter of Figure (a) consisting of a delay line and a summing device. We wish to evaluate the power spectral density of the filter output Y(t).

67 5.8 Power Spectral Density Example 5.13 Comb Filter (continued) The transfer function of this filter is
H(f) = 1 − exp(−j2πfT) = 1 − cos(2πfT) + j sin(2πfT)
|H(f)|² = [1 − cos(2πfT)]² + sin²(2πfT) = 2[1 − cos(2πfT)] = 4 sin²(πfT)
Because of the periodic form of this frequency response (Fig. (b)), the filter is sometimes referred to as a comb filter. The power spectral density of the filter output is:
S_Y(f) = |H(f)|² S_X(f) = 4 sin²(πfT) S_X(f)
If fT is very small: S_Y(f) ≈ 4π²f²T² S_X(f) (5.107)

68 5.9 Gaussian Process A random variable Y is defined by:
Y = ∫_0^T g(t) X(t) dt
We refer to Y as a linear functional of X(t). By comparison, Y = Σ_{i=1}^{N} a_i X_i, where the a_i are constants and the X_i are random variables, is a linear function of the X_i. If the weighting function g(t) is such that the mean-square value of the random variable Y is finite, and if the random variable Y is a Gaussian-distributed random variable for every g(t) in this class of functions, then the process X(t) is said to be a Gaussian process. In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable. The Gaussian process has many properties that make analytic results possible. The random processes produced by physical phenomena are often such that a Gaussian model is appropriate.

69 5.9 Gaussian Process The random variable Y has a Gaussian distribution if its probability density function has the form
f_Y(y) = (1/(√(2π)σ_Y)) exp(−(y − μ_Y)²/(2σ_Y²))
μ_Y: the mean of the random variable Y; σ_Y²: the variance of the random variable Y
If the Gaussian random variable Y is normalized to have a mean of zero and a variance of one, such a normalized Gaussian distribution is commonly written as N(0,1):
f_Y(y) = (1/√(2π)) exp(−y²/2)

70 5.9 Gaussian Process Central Limit Theorem Let X_i, i = 1, 2, ..., N, be a set of random variables that satisfies the following requirements: The X_i are statistically independent. The X_i have the same probability distribution with mean μ_X and variance σ_X². The X_i so described are said to constitute a set of independent and identically distributed (i.i.d.) random variables. Define:
Y_i = (X_i − μ_X)/σ_X, i = 1, 2, ..., N, so that E[Y_i] = 0 and var[Y_i] = 1
V_N = (1/√N) Σ_{i=1}^{N} Y_i
The central limit theorem states that the probability distribution of V_N approaches a normalized Gaussian distribution N(0,1) in the limit as N approaches infinity.
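A Monte Carlo sketch of the theorem for uniform X_i (N and the number of runs are assumed values):

```python
import numpy as np

rng = np.random.default_rng(4)
N, runs = 30, 100_000
mu, sigma = 0.5, np.sqrt(1/12)                # mean and std of Uniform(0,1)

X = rng.uniform(0, 1, size=(runs, N))
V = ((X - mu) / sigma).sum(axis=1) / np.sqrt(N)   # V_N for each run

print("mean ~ 0 :", V.mean())
print("var  ~ 1 :", V.var())
print("P[V > 2] ~", (V > 2).mean())   # compare with the N(0,1) value 0.0228
```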

71 5.9 Gaussian Process Property 1: If a Gaussian process X(t) is applied to a stable linear filter, then the output Y(t) is also Gaussian. Property 2: Consider the set of random variables or samples X(t_1), X(t_2), ..., X(t_n), obtained by observing a random process X(t) at times t_1, t_2, ..., t_n. If the process X(t) is Gaussian, then this set of random variables is jointly Gaussian for any n, with their n-fold joint probability density function being completely determined by specifying the set of means:
μ_X(t_i) = E[X(t_i)], i = 1, 2, ..., n
and the set of autocovariance functions:
C_X(t_k, t_i) = E[(X(t_k) − μ_X(t_k))(X(t_i) − μ_X(t_i))], k, i = 1, 2, ..., n
Consider the composite set of random variables X(t_1), X(t_2), ..., X(t_n), Y(u_1), Y(u_2), ..., Y(u_m). We say that the processes X(t) and Y(t) are jointly Gaussian if this composite set of random variables is jointly Gaussian for any n and m.

72 5.9 Gaussian Process Property 3: If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict sense. Property 4: If the random variables X(t_1), X(t_2), ..., X(t_n) are uncorrelated, that is
E[(X(t_k) − μ_X(t_k))(X(t_i) − μ_X(t_i))] = 0, i ≠ k
then these random variables are statistically independent. The implication of this property is that the joint probability density function of the set of random variables X(t_1), X(t_2), ..., X(t_n) can be expressed as the product of the probability density functions of the individual random variables in the set.

73 5.10 Noise The sources of noise may be external to the system (e.g., atmospheric noise, galactic noise, man-made noise), or internal to the system. The second category includes an important type of noise that arises from spontaneous fluctuations of current or voltage in electrical circuits. This type of noise represents a basic limitation on the transmission or detection of signals in communication systems involving the use of electronic devices. The two most common examples of spontaneous fluctuations in electrical circuits are shot noise and thermal noise.

74 5.10 Noise Shot Noise Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices. For example, in a photodetector circuit a current pulse is generated every time an electron is emitted by the cathode due to incident light from a source of constant intensity. The electrons are naturally emitted at random times denoted by τ_k. If the random emissions of electrons have been going on for a long time, then the total current flowing through the photodetector may be modeled as an infinite sum of current pulses, as shown by
X(t) = Σ_{k=−∞}^{∞} h(t − τ_k)
where h(t − τ_k) is the current pulse generated at time τ_k. The process X(t) is a stationary process, called shot noise.

75 5.10 Noise Shot Noise The number of electrons, N(t), emitted in the time interval (0, t) constitutes a discrete stochastic process, the value of which increases by one each time an electron is emitted (Fig. 5.17). Let the mean value of the number of electrons, ν, emitted between times t and t + t_0 be
E[ν] = λt_0
λ: a constant called the rate of the process. The total number of electrons emitted in the interval (t, t + t_0),
ν = N(t + t_0) − N(t)
follows a Poisson distribution with a mean value equal to λt_0. The probability that k electrons are emitted in the interval (t, t + t_0) is
P[ν = k] = (λt_0)^k e^{−λt_0} / k!, k = 0, 1, 2, ...
Fig. 5.17: Sample function of a Poisson counting process.
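A simulation sketch of the counting process: exponential inter-emission times of rate λ produce Poisson-distributed counts in an interval of length t_0. λ, t_0, and the number of runs are assumed values:

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
lam, t0, runs = 4.0, 1.0, 50_000

counts = np.empty(runs, dtype=int)
for i in range(runs):
    # emission times tau_k: cumulative sums of exponential inter-arrival times
    arrivals = np.cumsum(rng.exponential(1/lam, size=100))
    counts[i] = np.searchsorted(arrivals, t0)   # N(t0): emissions in (0, t0)

for k in range(9):
    theory = (lam * t0)**k * exp(-lam * t0) / factorial(k)
    print(k, (counts == k).mean(), theory)      # empirical vs. Poisson PMF
```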

76 5.10 Noise Thermal Noise Thermal noise is the name given to the electrical noise arising from the random motion of electrons in a conductor. The mean-square value of the thermal noise voltage V_TN, appearing across the terminals of a resistor, measured in a bandwidth of Δf Hertz, is given by:
E[V_TN²] = 4kTRΔf volts²
k: Boltzmann's constant = 1.38 × 10⁻²³ joules per degree Kelvin. T: absolute temperature in degrees Kelvin. R: the resistance in ohms.

77 5.10 Noise White Noise The noise analysis is customarily based on an idealized form of noise called white noise, the power spectral density of which is independent of the operating frequency. White is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation. We express the power spectral density of white noise, with a sample function denoted by w(t), as
S_W(f) = N_0/2, N_0 = kT_e
The dimensions of N_0 are in watts per Hertz, k is Boltzmann's constant and T_e is the equivalent noise temperature of the receiver.

78 5.10 Noise White Noise The equivalent noise temperature of a system is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system. The autocorrelation function is the inverse Fourier transform of the power spectral density:
R_W(τ) = (N_0/2) δ(τ)
Any two different samples of white noise, no matter how closely together in time they are taken, are uncorrelated. If the white noise w(t) is also Gaussian, then the two samples are statistically independent.

79 5.10 Noise Example 5.14 Ideal Low-Pass Filtered White Noise Suppose that a white Gaussian noise w(t) of zero mean and power spectral density N_0/2 is applied to an ideal low-pass filter of bandwidth B and passband amplitude response of one. The power spectral density of the noise n(t) is
S_N(f) = N_0/2 for −B < f < B; 0 for |f| > B
The autocorrelation function of n(t) is
R_N(τ) = ∫_{−B}^{B} (N_0/2) exp(j2πfτ) df = N_0 B sinc(2Bτ)
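A numerical check of this pair: integrating the rectangular PSD reproduces N_0 B sinc(2Bτ); note that np.sinc(x) = sin(πx)/(πx). N_0 and B are assumed values:

```python
import numpy as np

N0, B = 2.0, 5.0
f = np.linspace(-B, B, 20001)
df = f[1] - f[0]

for tau in (0.0, 0.05, 0.1, 0.37):
    # Riemann-sum approximation of the inverse Fourier transform of S_N(f)
    integral = (np.sum((N0 / 2) * np.exp(2j * np.pi * f * tau)) * df).real
    print(tau, integral, N0 * B * np.sinc(2 * B * tau))
```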

80 5.11 Narrowband Noise The receiver of a communication system usually includes some provision for preprocessing the received signal. The preprocessing may take the form of a narrowband filter whose bandwidth is just large enough to pass the modulated component of the received signal essentially undistorted but not so large as to admit excessive noise through the receiver. The noise process appearing at the output of such a filter is called narrowband noise. Fig. 5.24 (a): Power spectral density of narrowband noise. (b): Sample function of narrowband noise, which appears somewhat similar to a sine wave of frequency f_c that undulates slowly in both amplitude and phase.

81 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components Consider a narrowband noise n(t) of bandwidth 2B centered on frequency f_c; it can be represented as
n(t) = n_I(t) cos(2πf_c t) − n_Q(t) sin(2πf_c t)
n_I(t): in-phase component of n(t); n_Q(t): quadrature component of n(t). Both n_I(t) and n_Q(t) are low-pass signals. Fig. 5.25 (a): Extraction of in-phase and quadrature components of a narrowband process. (b): Generation of a narrowband process from its in-phase and quadrature components.

82 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components n_I(t) and n_Q(t) of a narrowband noise n(t) have some important properties:
1) The n_I(t) and n_Q(t) of n(t) have zero mean.
2) If n(t) is Gaussian, then n_I(t) and n_Q(t) are jointly Gaussian.
3) If n(t) is stationary, then n_I(t) and n_Q(t) are jointly stationary.
4) Both n_I(t) and n_Q(t) have the same power spectral density, which is related to the power spectral density S_N(f) of n(t) as
S_{N_I}(f) = S_{N_Q}(f) = S_N(f − f_c) + S_N(f + f_c) for −B ≤ f ≤ B; 0 otherwise
5) n_I(t) and n_Q(t) have the same variance as the narrowband noise n(t).
6) The cross-spectral density of n_I(t) and n_Q(t) is purely imaginary:
S_{N_I N_Q}(f) = −S_{N_Q N_I}(f) = j[S_N(f + f_c) − S_N(f − f_c)] for −B ≤ f ≤ B; 0 otherwise
7) If n(t) is Gaussian and its power spectral density S_N(f) is symmetric about the mid-band frequency f_c, then n_I(t) and n_Q(t) are statistically independent.

83 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components Example 5.17 Ideal Band-Pass Filtered White Noise Consider a white Gaussian noise of zero mean and power spectral density N_0/2, which is passed through an ideal band-pass filter of passband magnitude response equal to one, mid-band frequency f_c, and bandwidth 2B. The power spectral density characteristic of the filtered noise n(t) is shown in Fig. (a). The power spectral density characteristics of n_I(t) and n_Q(t) are shown in Fig. (c).

84 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components Example 5.17 Ideal Band-Pass Filtered White Noise The autocorrelation function of n(t) is the inverse Fourier transform of the power spectral density characteristic:
R_N(τ) = ∫_{−f_c−B}^{−f_c+B} (N_0/2) exp(j2πfτ) df + ∫_{f_c−B}^{f_c+B} (N_0/2) exp(j2πfτ) df
= N_0 B sinc(2Bτ) [exp(−j2πf_c τ) + exp(j2πf_c τ)]
= 2N_0 B sinc(2Bτ) cos(2πf_c τ)
The autocorrelation function of n_I(t) and n_Q(t) is given by:
R_{N_I}(τ) = R_{N_Q}(τ) = 2N_0 B sinc(2Bτ)

85 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components The narrowband noise n(t) can be represented in terms of its envelope and phase components:
n(t) = r(t) cos(2πf_c t + ψ(t))
r(t) = [n_I²(t) + n_Q²(t)]^{1/2}
ψ(t) = tan⁻¹[n_Q(t)/n_I(t)]
r(t): envelope of n(t); ψ(t): phase of n(t). Both r(t) and ψ(t) are sample functions of low-pass random processes. The probability distributions of r(t) and ψ(t) may be obtained from those of n_I(t) and n_Q(t).

86 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components Let N_I and N_Q denote the random variables obtained by observing the random processes represented by the sample functions n_I(t) and n_Q(t), respectively. N_I and N_Q are independent Gaussian random variables of zero mean and variance σ². Their joint probability density function is given by:
f_{N_I,N_Q}(n_I, n_Q) = (1/(2πσ²)) exp(−(n_I² + n_Q²)/(2σ²))
Define n_I = r cos ψ and n_Q = r sin ψ. We have dn_I dn_Q = r dr dψ. The joint probability density function of R and Ψ is:
f_{R,Ψ}(r, ψ) = (r/(2πσ²)) exp(−r²/(2σ²))
The phase Ψ is uniformly distributed inside the range 0 to 2π.

87 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components The probability density function of the random variable R is:
f_R(r) = (r/σ²) exp(−r²/(2σ²)) for r ≥ 0; 0 elsewhere (5.150)
A random variable having the probability density function of (5.150) is said to be Rayleigh distributed. The Rayleigh distribution in the normalized form:
f_V(υ) = υ exp(−υ²/2) for υ ≥ 0; 0 elsewhere
Fig. 5.28: Normalized Rayleigh distribution.
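A Python sketch of the Rayleigh result: the envelope of two i.i.d. zero-mean Gaussians is compared against the pdf of Eq. (5.150). σ and the sample size are assumed values:

```python
import numpy as np

rng = np.random.default_rng(6)
sigma, n = 1.0, 500_000
# envelope r = sqrt(nI^2 + nQ^2) of two independent N(0, sigma^2) components
r = np.hypot(rng.normal(0, sigma, n), rng.normal(0, sigma, n))

hist, edges = np.histogram(r, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
pdf = (centers / sigma**2) * np.exp(-centers**2 / (2 * sigma**2))
print("max |histogram - pdf|:", np.max(np.abs(hist - pdf)))   # small
```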

88 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components Example 5.18 Sinusoidal Signal Plus Narrowband Noise A sample function of the sinusoidal signal A cos(2πf_c t) plus narrowband noise n(t) is given by:
x(t) = A cos(2πf_c t) + n(t)
Representing n(t) in terms of its in-phase and quadrature components around the carrier frequency f_c:
x(t) = n'_I(t) cos(2πf_c t) − n_Q(t) sin(2πf_c t), where n'_I(t) = A + n_I(t)
Assume that n(t) is Gaussian with zero mean and variance σ². Both n'_I(t) and n_Q(t) are Gaussian and statistically independent. The mean of n'_I(t) is A and that of n_Q(t) is zero. The variance of both n_I(t) and n_Q(t) is σ².

89 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components The joint probability density function of the random variables N'_I and N_Q, corresponding to n'_I(t) and n_Q(t), is
f_{N'_I,N_Q}(n'_I, n_Q) = (1/(2πσ²)) exp(−[(n'_I − A)² + n_Q²]/(2σ²))
Let r(t) denote the envelope of x(t) and ψ(t) denote its phase:
r(t) = {[n'_I(t)]² + [n_Q(t)]²}^{1/2}
ψ(t) = tan⁻¹[n_Q(t)/n'_I(t)]
The joint probability density function of the random variables R and Ψ is given by
f_{R,Ψ}(r, ψ) = (r/(2πσ²)) exp(−(r² + A² − 2Ar cos ψ)/(2σ²))

90 Representation of Narrowband Noise in Terms of In-phase and Quadrature Components The function f_{R,Ψ}(r, ψ) cannot be expressed as a product f_R(r) f_Ψ(ψ). This is because we now have a term involving the values of both random variables multiplied together as r cos ψ. Rician distribution:
f_R(r) = ∫_0^{2π} f_{R,Ψ}(r, ψ) dψ = (r/(2πσ²)) exp(−(r² + A²)/(2σ²)) ∫_0^{2π} exp((Ar/σ²) cos ψ) dψ = (r/σ²) exp(−(r² + A²)/(2σ²)) I_0(Ar/σ²)
where I_0 is the modified Bessel function of the first kind of zeroth order. The Rician distribution reduces to the Rayleigh distribution for small a, and reduces to an approximately Gaussian distribution when a is large. Fig. 5.29: Normalized Rician distribution.
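A Python sketch of Example 5.18's result: the simulated envelope of a constant A plus two independent Gaussians is compared against the Rician pdf, using scipy.special.i0 for the zeroth-order modified Bessel function. A, σ, and the sample size are assumed values:

```python
import numpy as np
from scipy.special import i0   # modified Bessel function of the first kind, order 0

rng = np.random.default_rng(7)
A, sigma, n = 2.0, 1.0, 500_000
# envelope r = sqrt((A + nI)^2 + nQ^2) of signal plus narrowband Gaussian noise
r = np.hypot(A + rng.normal(0, sigma, n), rng.normal(0, sigma, n))

hist, edges = np.histogram(r, bins=60, density=True)
c = 0.5 * (edges[:-1] + edges[1:])
pdf = (c / sigma**2) * np.exp(-(c**2 + A**2) / (2 * sigma**2)) * i0(A * c / sigma**2)
print("max |histogram - pdf|:", np.max(np.abs(hist - pdf)))   # small
```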


More information

Chapter 2 Random Processes

Chapter 2 Random Processes Chapter 2 Random Processes 21 Introduction We saw in Section 111 on page 10 that many systems are best studied using the concept of random variables where the outcome of a random experiment was associated

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Communication Theory II

Communication Theory II Communication Theory II Lecture 8: Stochastic Processes Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 5 th, 2015 1 o Stochastic processes What is a stochastic process? Types:

More information

Lecture 15. Theory of random processes Part III: Poisson random processes. Harrison H. Barrett University of Arizona

Lecture 15. Theory of random processes Part III: Poisson random processes. Harrison H. Barrett University of Arizona Lecture 15 Theory of random processes Part III: Poisson random processes Harrison H. Barrett University of Arizona 1 OUTLINE Poisson and independence Poisson and rarity; binomial selection Poisson point

More information

Random Process. Random Process. Random Process. Introduction to Random Processes

Random Process. Random Process. Random Process. Introduction to Random Processes Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,

More information

Introduction to Probability and Stocastic Processes - Part I

Introduction to Probability and Stocastic Processes - Part I Introduction to Probability and Stocastic Processes - Part I Lecture 1 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark

More information

2. SPECTRAL ANALYSIS APPLIED TO STOCHASTIC PROCESSES

2. SPECTRAL ANALYSIS APPLIED TO STOCHASTIC PROCESSES 2. SPECTRAL ANALYSIS APPLIED TO STOCHASTIC PROCESSES 2.0 THEOREM OF WIENER- KHINTCHINE An important technique in the study of deterministic signals consists in using harmonic functions to gain the spectral

More information

2. (a) What is gaussian random variable? Develop an equation for guassian distribution

2. (a) What is gaussian random variable? Develop an equation for guassian distribution Code No: R059210401 Set No. 1 II B.Tech I Semester Supplementary Examinations, February 2007 PROBABILITY THEORY AND STOCHASTIC PROCESS ( Common to Electronics & Communication Engineering, Electronics &

More information

Fig 1: Stationary and Non Stationary Time Series

Fig 1: Stationary and Non Stationary Time Series Module 23 Independence and Stationarity Objective: To introduce the concepts of Statistical Independence, Stationarity and its types w.r.to random processes. This module also presents the concept of Ergodicity.

More information

This examination consists of 11 pages. Please check that you have a complete copy. Time: 2.5 hrs INSTRUCTIONS

This examination consists of 11 pages. Please check that you have a complete copy. Time: 2.5 hrs INSTRUCTIONS THE UNIVERSITY OF BRITISH COLUMBIA Department of Electrical and Computer Engineering EECE 564 Detection and Estimation of Signals in Noise Final Examination 6 December 2006 This examination consists of

More information

5 Analog carrier modulation with noise

5 Analog carrier modulation with noise 5 Analog carrier modulation with noise 5. Noisy receiver model Assume that the modulated signal x(t) is passed through an additive White Gaussian noise channel. A noisy receiver model is illustrated in

More information

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept

More information

Probability theory. References:

Probability theory. References: Reasoning Under Uncertainty References: Probability theory Mathematical methods in artificial intelligence, Bender, Chapter 7. Expert systems: Principles and programming, g, Giarratano and Riley, pag.

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE

3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3.0 INTRODUCTION The purpose of this chapter is to introduce estimators shortly. More elaborated courses on System Identification, which are given

More information

Chapter 4 Random process. 4.1 Random process

Chapter 4 Random process. 4.1 Random process Random processes - Chapter 4 Random process 1 Random processes Chapter 4 Random process 4.1 Random process 4.1 Random process Random processes - Chapter 4 Random process 2 Random process Random process,

More information

EE401: Advanced Communication Theory

EE401: Advanced Communication Theory EE401: Advanced Communication Theory Professor A. Manikas Chair of Communications and Array Processing Imperial College London Introductory Concepts Prof. A. Manikas (Imperial College) EE.401: Introductory

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory Chapter 1 Statistical Reasoning Why statistics? Uncertainty of nature (weather, earth movement, etc. ) Uncertainty in observation/sampling/measurement Variability of human operation/error imperfection

More information

Formulas for probability theory and linear models SF2941

Formulas for probability theory and linear models SF2941 Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms

More information

1: PROBABILITY REVIEW

1: PROBABILITY REVIEW 1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following

More information

Analysis and Design of Analog Integrated Circuits Lecture 14. Noise Spectral Analysis for Circuit Elements

Analysis and Design of Analog Integrated Circuits Lecture 14. Noise Spectral Analysis for Circuit Elements Analysis and Design of Analog Integrated Circuits Lecture 14 Noise Spectral Analysis for Circuit Elements Michael H. Perrott March 18, 01 Copyright 01 by Michael H. Perrott All rights reserved. Recall

More information

Power Spectral Density of Digital Modulation Schemes

Power Spectral Density of Digital Modulation Schemes Digital Communication, Continuation Course Power Spectral Density of Digital Modulation Schemes Mikael Olofsson Emil Björnson Department of Electrical Engineering ISY) Linköping University, SE-581 83 Linköping,

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

For a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t,

For a stochastic process {Y t : t = 0, ±1, ±2, ±3, }, the mean function is defined by (2.2.1) ± 2..., γ t, CHAPTER 2 FUNDAMENTAL CONCEPTS This chapter describes the fundamental concepts in the theory of time series models. In particular, we introduce the concepts of stochastic processes, mean and covariance

More information

Deterministic. Deterministic data are those can be described by an explicit mathematical relationship

Deterministic. Deterministic data are those can be described by an explicit mathematical relationship Random data Deterministic Deterministic data are those can be described by an explicit mathematical relationship Deterministic x(t) =X cos r! k m t Non deterministic There is no way to predict an exact

More information

Fourier Analysis Linear transformations and lters. 3. Fourier Analysis. Alex Sheremet. April 11, 2007

Fourier Analysis Linear transformations and lters. 3. Fourier Analysis. Alex Sheremet. April 11, 2007 Stochastic processes review 3. Data Analysis Techniques in Oceanography OCP668 April, 27 Stochastic processes review Denition Fixed ζ = ζ : Function X (t) = X (t, ζ). Fixed t = t: Random Variable X (ζ)

More information

Lecture - 30 Stationary Processes

Lecture - 30 Stationary Processes Probability and Random Variables Prof. M. Chakraborty Department of Electronics and Electrical Communication Engineering Indian Institute of Technology, Kharagpur Lecture - 30 Stationary Processes So,

More information

EE303: Communication Systems

EE303: Communication Systems EE303: Communication Systems Professor A. Manikas Chair of Communications and Array Processing Imperial College London Introductory Concepts Prof. A. Manikas (Imperial College) EE303: Introductory Concepts

More information

13. Power Spectrum. For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if.

13. Power Spectrum. For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if. For a deterministic signal x(t), the spectrum is well defined: If represents its Fourier transform, i.e., if jt X ( ) = xte ( ) dt, (3-) then X ( ) represents its energy spectrum. his follows from Parseval

More information

Chapter 3: Random Variables 1

Chapter 3: Random Variables 1 Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

where r n = dn+1 x(t)

where r n = dn+1 x(t) Random Variables Overview Probability Random variables Transforms of pdfs Moments and cumulants Useful distributions Random vectors Linear transformations of random vectors The multivariate normal distribution

More information

Definition of a Stochastic Process

Definition of a Stochastic Process Definition of a Stochastic Process Balu Santhanam Dept. of E.C.E., University of New Mexico Fax: 505 277 8298 bsanthan@unm.edu August 26, 2018 Balu Santhanam (UNM) August 26, 2018 1 / 20 Overview 1 Stochastic

More information

Chapter 6 - Random Processes

Chapter 6 - Random Processes EE385 Class Notes //04 John Stensby Chapter 6 - Random Processes Recall that a random variable X is a mapping between the sample space S and the extended real line R +. That is, X : S R +. A random process

More information

Chapter Review of of Random Processes

Chapter Review of of Random Processes Chapter.. Review of of Random Proesses Random Variables and Error Funtions Conepts of Random Proesses 3 Wide-sense Stationary Proesses and Transmission over LTI 4 White Gaussian Noise Proesses @G.Gong

More information

Probability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver

Probability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver Stochastic Signals Overview Definitions Second order statistics Stationarity and ergodicity Random signal variability Power spectral density Linear systems with stationary inputs Random signal memory Correlation

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

2.1 Introduction 2.22 The Fourier Transform 2.3 Properties of The Fourier Transform 2.4 The Inverse Relationship between Time and Frequency 2.

2.1 Introduction 2.22 The Fourier Transform 2.3 Properties of The Fourier Transform 2.4 The Inverse Relationship between Time and Frequency 2. Chapter2 Fourier Theory and Communication Signals Wireless Information Transmission System Lab. Institute of Communications Engineering g National Sun Yat-sen University Contents 2.1 Introduction 2.22

More information

PROBABILITY THEORY. Prof. S. J. Soni. Assistant Professor Computer Engg. Department SPCE, Visnagar

PROBABILITY THEORY. Prof. S. J. Soni. Assistant Professor Computer Engg. Department SPCE, Visnagar PROBABILITY THEORY By Prof. S. J. Soni Assistant Professor Computer Engg. Department SPCE, Visnagar Introduction Signals whose values at any instant t are determined by their analytical or graphical description

More information

MATHEMATICAL TOOLS FOR DIGITAL TRANSMISSION ANALYSIS

MATHEMATICAL TOOLS FOR DIGITAL TRANSMISSION ANALYSIS ch03.qxd 1/9/03 09:14 AM Page 35 CHAPTER 3 MATHEMATICAL TOOLS FOR DIGITAL TRANSMISSION ANALYSIS 3.1 INTRODUCTION The study of digital wireless transmission is in large measure the study of (a) the conversion

More information

Statistics for scientists and engineers

Statistics for scientists and engineers Statistics for scientists and engineers February 0, 006 Contents Introduction. Motivation - why study statistics?................................... Examples..................................................3

More information

UCSD ECE153 Handout #40 Prof. Young-Han Kim Thursday, May 29, Homework Set #8 Due: Thursday, June 5, 2011

UCSD ECE153 Handout #40 Prof. Young-Han Kim Thursday, May 29, Homework Set #8 Due: Thursday, June 5, 2011 UCSD ECE53 Handout #40 Prof. Young-Han Kim Thursday, May 9, 04 Homework Set #8 Due: Thursday, June 5, 0. Discrete-time Wiener process. Let Z n, n 0 be a discrete time white Gaussian noise (WGN) process,

More information

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed.

STAT 302 Introduction to Probability Learning Outcomes. Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. STAT 302 Introduction to Probability Learning Outcomes Textbook: A First Course in Probability by Sheldon Ross, 8 th ed. Chapter 1: Combinatorial Analysis Demonstrate the ability to solve combinatorial

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

IV. Covariance Analysis

IV. Covariance Analysis IV. Covariance Analysis Autocovariance Remember that when a stochastic process has time values that are interdependent, then we can characterize that interdependency by computing the autocovariance function.

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 03 SUBJECT NAME : Probability & Random Process SUBJECT CODE : MA 6 MATERIAL NAME : Problem Material MATERIAL CODE : JM08AM008 (Scan the above QR code for the direct download of

More information

Chapter 3: Random Variables 1

Chapter 3: Random Variables 1 Chapter 3: Random Variables 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

Properties of the Autocorrelation Function

Properties of the Autocorrelation Function Properties of the Autocorrelation Function I The autocorrelation function of a (real-valued) random process satisfies the following properties: 1. R X (t, t) 0 2. R X (t, u) =R X (u, t) (symmetry) 3. R

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : Additional Problems MATERIAL CODE : JM08AM004 REGULATION : R03 UPDATED ON : March 05 (Scan the above QR code for the direct

More information