STOCHASTIC PROCESSES and NOISE in ELECTRONIC and OPTICAL SYSTEMS. Prof. Yosef PINHASI
June 8, 2004
Contents

1 Fundamentals of Stochastic Processes
  1.1 The random experiment: set of events; probability
  1.2 Random variables: one-dimensional random variable; two-dimensional random variable; multiple random variables; the Gaussian (normal) distribution
  1.3 Random processes: statistical characteristics; a wide-sense stationary process; time average, power and correlation; spectral characteristics; response of linear systems; modulation; quadrature representation of band-pass process; cross-correlation and cross-spectrum

2 Noise as a Stochastic Process
  White noise
  Band-limited (colored) noise: low-pass noise; band-pass noise; multi-mode noise
  Noise statistics: the Gaussian distribution of noise; the chi-square distribution of noise instantaneous power; linear transformation of Gaussian noise
  Quadrature representation of band-pass noise: the Rayleigh distribution of noise amplitude; the exponential distribution of the noise average power; noise phase
  Carrier and noise: the Rice distribution of the envelope of signal and noise; phase noise; FM noise and frequency stability; spectral lineshape of harmonic oscillator

3 Signal and Noise
  Detection and demodulation
  Measurement of signal in the presence of noise: double side-band modulation; single side-band modulation; amplitude modulation; frequency modulation

4 Shot Noise
  Poisson counting process: Poisson impulses; randomly occurring events
  Examples of shot noise generation: electron current; photon noise in coherent light; photo-current quantum noise; photo-multiplier; spontaneous emission in light sources
  Generalization to non-homogeneous Poisson processes: examples

5 Thermal Noise
  Doubly stochastic Poisson counting process
  Thermal light: Boltzmann distribution; Bose-Einstein statistics; Planck radiation law; black-body radiation
  Johnson noise
  Noise figure

6 Noise in circuits and systems

7 Electromagnetic Field
  7.1 Wave equation
  Classification of waveguide mode solutions: transverse electro-magnetic (TEM) modes; transverse electric TE (or H) modes; transverse magnetic TM (or E) modes
  Normalization of mode amplitudes and some orthogonality relations
  Relations between waveguide modes
  Transmission line equations for the excited waveguide: the excitation equation; excitation of general time-dependent fields
Chapter 1: Fundamentals of Stochastic Processes

1.1 The random experiment

1.1.1 Set of events

A trial is a single performance of an experiment. The result of a trial is an outcome, which cannot be predicted in advance. The sample space S is the collection (set) of all possible outcomes of a given experiment. An event A is defined as a subset of the sample space, A ⊆ S. The algebra of sets consists of the following basic operations between events:

Union:         A ∪ B
Intersection:  A ∩ B
Complement:    Ā = S − A
Subset:        A ⊂ B
Empty set:     ∅ = S̄

Sets are mutually exclusive if they have no common elements; in that case A ∩ B = ∅. Figure 1.1 is a Venn diagram illustrating relations between sets of events in the sample space. Sets also satisfy the following laws:
Figure 1.1: Venn diagram illustrating operations between sets.

The commutative law:
A ∪ B = B ∪ A
A ∩ B = B ∩ A

The distributive law:
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)

The associative law:
A ∪ (B ∪ C) = (A ∪ B) ∪ C = A ∪ B ∪ C
A ∩ (B ∩ C) = (A ∩ B) ∩ C = A ∩ B ∩ C
De Morgan's laws:
(A ∪ B)ᶜ = Ā ∩ B̄
(A ∩ B)ᶜ = Ā ∪ B̄

1.1.2 Probability

The probability P(A) is the chance of an event A occurring in a trial. It is the relative frequency of appearance of the event A in a sufficiently large number of experiments. Probability is a function of the events in the sample space S and satisfies the following:

0 ≤ P(A) ≤ 1
P(S) = 1,  P(∅) = 0
P(Ā) = 1 − P(A)
P(A) ≤ P(B) if A ⊂ B
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

The joint probability is the probability of the intersection of events, P(A ∩ B). Note that when the events are mutually exclusive (A ∩ B = ∅), the probability of their intersection is P(A ∩ B) = P(∅) = 0, resulting in a union probability equal to the sum of the individual probabilities:

P(A ∪ B) = P(A) + P(B)
1.2 Random variables

A random variable X is a function that maps each element of the sample space into a point on the real line.

1.2.1 One-dimensional random variable

Probability distribution

The probability of the event that the random variable X takes values X ≤ x is described by the cumulative distribution function (CDF):

F_X(x) = P(X ≤ x)   (1.1)

The CDF is a positive, non-decreasing function that satisfies the following properties:

F_X(−∞) = 0,  F_X(+∞) = 1
0 ≤ F_X(x) ≤ 1
F_X(x₁) ≤ F_X(x₂) if x₁ < x₂
P(x₁ < X ≤ x₂) = P(X > x₁ ∩ X ≤ x₂) = F_X(x₂) − F_X(x₁)

The probability density function (PDF) is the derivative of the distribution function:

f_X(x) = d/dx F_X(x)   (1.2)

satisfying:

f_X(x) ≥ 0 for all x
∫_{−∞}^{x} f_X(x′) dx′ = F_X(x) = P(X ≤ x)
∫_{−∞}^{+∞} f_X(x) dx = F_X(+∞) = 1
P(x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} f_X(x) dx = F_X(x₂) − F_X(x₁)
Moments

The m-th order moment of the random variable X is the expected value of X^m (statistical averages, denoted by overbars in the original, are written E{·} here):

E{X^m} = ∫_{−∞}^{+∞} x^m f_X(x) dx   (1.3)

When m = 0 it is simply the area of the PDF, equal to F_X(+∞) = 1. The mean value (statistical average) of the random variable is the first-order (m = 1) moment:

μ_X ≡ E{X} = ∫_{−∞}^{+∞} x f_X(x) dx   (1.4)

Moments about the mean value, X − μ_X, are central moments. The variance is the second central moment:

σ_X² ≡ V{X} = E{(X − μ_X)²} = ∫_{−∞}^{+∞} (x − μ_X)² f_X(x) dx   (1.5)

The positive square root of the variance is the standard deviation σ_X. The variance can be found from the first and second moments, according to:

σ_X² = E{(X − μ_X)²} = E{X²} − 2μ_X E{X} + μ_X² = E{X²} − μ_X²   (1.6)

Characteristic function

The characteristic function of a random variable is the statistical average:

Φ_X(jω) ≡ E{e^{jωX}} = ∫_{−∞}^{+∞} f_X(x) e^{jωx} dx   (1.7)

Observe that Φ_X(jω) is a Fourier transform (in terms of ω) of the PDF f_X(x). The m-th moment (1.3) of the random variable X can be found from the derivatives of the characteristic function according to:
E{X^m} = (−j)^m [d^m/dω^m Φ_X(jω)]_{ω=0}   (1.8)

Transformation of a random variable

If Y(X) is a monotonic transformation of a random variable X, the PDF of the random variable Y can be found from:

f_Y(y) = |dx/dy| f_X(x)   (1.9)

When Y(X) is a non-monotonic transformation of the random variable X, the PDF of the random variable Y is:

f_Y(y) = Σ_i f_X(x_i) / |dy/dx|_{x_i}   (1.10)

where the x_i are the roots of y − y(x_i) = 0. The m-th order moment of Y is given by:

E{Y^m} = ∫_{−∞}^{+∞} y^m f_Y(y) dy = ∫_{−∞}^{+∞} [y(x)]^m f_X(x) dx

Linear transformation

Consider the linear transformation Y(X) = aX + b, where a and b are constants. The PDF of the random variable Y is:

f_Y(y) = (1/|a|) f_X((y − b)/a)

with average μ_Y = E{aX + b} = a μ_X + b and variance σ_Y² = a² σ_X².
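The moment relations above lend themselves to a quick numerical check. Below is a minimal sketch (NumPy and the chosen distribution, seed, and sample size are illustrative assumptions, not part of the text) verifying the variance identity σ_X² = E{X²} − E{X}² and the linear-transformation rules μ_Y = aμ_X + b, σ_Y² = a²σ_X² on simulated samples:

```python
import numpy as np

# Minimal sketch: check the variance identity (second moment minus squared
# mean) and the linear-transformation rules on simulated samples.
rng = np.random.default_rng(0)               # fixed seed for reproducibility
x = rng.normal(loc=2.0, scale=3.0, size=100_000)

# variance from the first and second moments
var_from_moments = (x**2).mean() - x.mean()**2

# linear transformation Y = aX + b
a, b = -3.0, 5.0
y = a * x + b

print(abs(var_from_moments - x.var()))       # ~0: the identity is exact
print(abs(y.mean() - (a * x.mean() + b)))    # ~0: mean transforms as a*mu+b
print(abs(y.var() - a**2 * x.var()))         # ~0: variance scales as a^2
```

The identity holds exactly for the sample estimators as well, so the differences printed are pure floating-point round-off.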
1.2.2 Two-dimensional random variable

Joint probability

Consider two random variables X and Y. The probability of the event of their intersection, X ≤ x ∩ Y ≤ y, is defined by the joint probability distribution function:

F_{X,Y}(x, y) = P(X ≤ x ∩ Y ≤ y)   (1.11)

The properties of the joint distribution are:

F_{X,Y}(−∞, −∞) = F_{X,Y}(x, −∞) = F_{X,Y}(−∞, y) = 0
F_{X,Y}(+∞, +∞) = 1
0 ≤ F_{X,Y}(x, y) ≤ 1
F_{X,Y}(x₁, y₁) ≤ F_{X,Y}(x₂, y₂) if x₁ < x₂ and y₁ < y₂
P(x₁ < X ≤ x₂ ∩ y₁ < Y ≤ y₂) = F_{X,Y}(x₂, y₂) + F_{X,Y}(x₁, y₁) − [F_{X,Y}(x₁, y₂) + F_{X,Y}(x₂, y₁)]

The joint probability density function is the derivative of the distribution function:

f_{X,Y}(x, y) = ∂²/∂x∂y F_{X,Y}(x, y)   (1.12)

satisfying:

f_{X,Y}(x, y) ≥ 0 for all (x, y)
∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(x′, y′) dx′ dy′ = F_{X,Y}(x, y) = P(X ≤ x ∩ Y ≤ y)
∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f_{X,Y}(x, y) dx dy = F_{X,Y}(+∞, +∞) = 1
P(x₁ < X ≤ x₂ ∩ y₁ < Y ≤ y₂) = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_{X,Y}(x, y) dx dy = F_{X,Y}(x₂, y₂) + F_{X,Y}(x₁, y₁) − [F_{X,Y}(x₁, y₂) + F_{X,Y}(x₂, y₁)]
Marginal distribution

The marginal distribution functions of the one-dimensional random variables X and Y are the probabilities:

F_X(x) = P(X ≤ x ∩ Y ≤ +∞) = F_{X,Y}(x, +∞) = ∫_{−∞}^{x} ∫_{−∞}^{+∞} f_{X,Y}(x′, y) dx′ dy
F_Y(y) = P(X ≤ +∞ ∩ Y ≤ y) = F_{X,Y}(+∞, y) = ∫_{−∞}^{+∞} ∫_{−∞}^{y} f_{X,Y}(x, y′) dx dy′   (1.13)

The marginal probability density functions:

f_X(x) = d/dx F_X(x) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dy
f_Y(y) = d/dy F_Y(y) = ∫_{−∞}^{+∞} f_{X,Y}(x, y) dx   (1.14)

are the density functions of the individual one-dimensional random variables X and Y.

Joint moments

The joint moment of the two-dimensional random variable is the expected value:

E{X^m Y^n} = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x^m y^n f_{X,Y}(x, y) dx dy   (1.15)

m + n is the order of the joint moment. When m = n = 0 it is simply the volume under the density function, equal to F_{X,Y}(+∞, +∞) = 1. Substituting n = 0 results in the moments E{X^m}, while for m = 0 we get the moments E{Y^n}. The second-order joint moment with m = 1 and n = 1:

R_{XY} ≡ E{XY} = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} xy f_{X,Y}(x, y) dx dy   (1.16)

is the statistical correlation of X and Y. The two random variables are said to be orthogonal if their correlation is R_{XY} = 0.
The second-order joint central moment is called the covariance:

C_{XY} ≡ E{(X − μ_X)(Y − μ_Y)} = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} (x − μ_X)(y − μ_Y) f_{X,Y}(x, y) dx dy   (1.17)

Expanding the expression for the covariance, one finds that:

C_{XY} = E{(X − μ_X)(Y − μ_Y)} = E{XY} − μ_X E{Y} − μ_Y E{X} + μ_X μ_Y = R_{XY} − μ_X μ_Y   (1.18)

The correlation coefficient is the normalized covariance:

ρ ≡ C_{XY}/(σ_X σ_Y) = (R_{XY} − μ_X μ_Y) / (√(E{X²} − μ_X²) √(E{Y²} − μ_Y²))   (1.19)

According to Schwartz's inequality, |C_{XY}| ≤ σ_X σ_Y. Consequently the correlation coefficient is bounded: −1 ≤ ρ ≤ +1. For two orthogonal random variables (R_{XY} = 0), the correlation coefficient is given by ρ = −μ_X μ_Y/(σ_X σ_Y). When ρ = 1 the random variables X and Y are perfectly correlated; if ρ = 0 they are un-correlated; and when ρ = −1 the variables are anti-correlated.

Statistical independence

The two events X ≤ x and Y ≤ y are statistically independent if (and only if) P(X ≤ x ∩ Y ≤ y) = P(X ≤ x)·P(Y ≤ y). Consequently, if the two random variables X and Y are statistically independent, their joint distribution function is a product of the marginal distributions:

F_{X,Y}(x, y) = F_X(x)·F_Y(y)   (1.20)

The corresponding joint density function can then be written as:
f_{X,Y}(x, y) = f_X(x)·f_Y(y)   (1.21)

The statistical correlation of two independent random variables is given by:

R_{XY} = E{XY} = E{X}·E{Y} = μ_X μ_Y   (1.22)

The covariance is then C_{XY} = 0 and the correlation coefficient is ρ = 0. Consequently, statistically independent variables are always un-correlated. However, if two random variables X and Y are un-correlated (i.e. E{XY} = E{X}·E{Y}), they are not necessarily independent.

Joint characteristic function

The joint characteristic function of a two-dimensional random variable is the statistical average:

Φ_{X,Y}(ω_X, ω_Y) ≡ E{e^{j(ω_X x + ω_Y y)}} = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} f_{X,Y}(x, y) e^{j(ω_X x + ω_Y y)} dx dy   (1.23)

Φ_{X,Y}(ω_X, ω_Y) is a two-dimensional Fourier transform [in terms of (ω_X, ω_Y)] of the joint density f_{X,Y}(x, y). The (m + n)-order moment of the random variable can be found from the derivatives of the characteristic function according to:

E{X^m Y^n} = (−j)^{m+n} [∂^{m+n}/∂ω_X^m ∂ω_Y^n Φ_{X,Y}(ω_X, ω_Y)]_{ω_X=0, ω_Y=0}   (1.24)

The marginal characteristic functions are obtained from:

Φ_X(ω_X) = Φ_{X,Y}(ω_X, 0)
Φ_Y(ω_Y) = Φ_{X,Y}(0, ω_Y)   (1.25)
Transformation

If U(X, Y) and V(X, Y) are transformations of the random variables X and Y, the joint density function of the two-dimensional random variable given by U and V can be found from:

f_{U,V}(u, v) = |D(x, y)/D(u, v)| f_{X,Y}(x, y)   (1.26)

where the Jacobian is:

J = D(x, y)/D(u, v) = det [ ∂x/∂u  ∂x/∂v ;  ∂y/∂u  ∂y/∂v ]

Sum of two random variables

Consider the transformation U = X + Y and V = Y (the random variable U is a sum of the random variables X and Y). The Jacobian is then J = 1, resulting in the joint density function:

f_{U,V}(u, v) = f_{X,Y}(u − v, v)

The marginal PDF of the random variable U = X + Y is:

f_U(u) = ∫_{−∞}^{+∞} f_{U,V}(u, v) dv = ∫_{−∞}^{+∞} f_{X,Y}(u − v, v) dv   (1.27)

The mean of a sum of two variables is hereby shown to be equal to the sum of their individual means:
E{X + Y} = E{U} = ∫_{−∞}^{+∞} u f_U(u) du =
= ∫_{−∞}^{+∞} u [∫_{−∞}^{+∞} f_{X,Y}(u − v, v) dv] du = ∫∫_{−∞}^{+∞} (x + y) f_{X,Y}(x, y) dx dy =
= ∫∫_{−∞}^{+∞} x f_{X,Y}(x, y) dx dy + ∫∫_{−∞}^{+∞} y f_{X,Y}(x, y) dx dy = μ_X + μ_Y   (1.28)

Consequently:

μ_U = μ_X + μ_Y
E{U²} = E{X²} + E{Y²} + 2 E{XY}
σ_U² = E{U²} − μ_U² = σ_X² + σ_Y² + 2 C_{XY}

If the random variables X and Y are statistically independent, the PDF of their sum U = X + Y is given by a convolution of their individual density functions:

f_U(u) = ∫_{−∞}^{+∞} f_X(u − v) f_Y(v) dv = f_X(u) ∗ f_Y(u)   (1.29)

1.2.3 Multiple random variables

Joint and marginal distributions

For N random variables represented by the vector X = [X₁, X₂, ..., X_N], the joint distribution function is the probability:

F_X(x₁, x₂, ..., x_N) = P{X₁ ≤ x₁ ∩ X₂ ≤ x₂ ∩ ... ∩ X_N ≤ x_N}   (1.30)

The joint probability density:

f_X(x₁, x₂, ..., x_N) = ∂^N/∂x₁∂x₂...∂x_N F_X(x₁, x₂, ..., x_N)   (1.31)

The k-dimensional marginal density:
f_{X₁,...,X_k}(x₁, ..., x_k) = ∫_{−∞}^{+∞} ... ∫_{−∞}^{+∞} f_X(x₁, x₂, ..., x_N) dx_{k+1} ... dx_N   (1.32)

The (m₁ + m₂ + ... + m_N)-order joint moment is given by:

E{X₁^{m₁} X₂^{m₂} ... X_N^{m_N}} = ∫_{−∞}^{+∞} ... ∫_{−∞}^{+∞} x₁^{m₁} x₂^{m₂} ... x_N^{m_N} f_X(x₁, ..., x_N) dx₁ ... dx_N   (1.33)

The joint characteristic function of N random variables is defined by:

Φ_X(ω_{X₁}, ..., ω_{X_N}) ≡ E{e^{j(ω_{X₁}x₁ + ... + ω_{X_N}x_N)}} = ∫_{−∞}^{+∞} ... ∫_{−∞}^{+∞} f_X(x₁, ..., x_N) e^{j(ω_{X₁}x₁ + ... + ω_{X_N}x_N)} dx₁ ... dx_N   (1.34)

The (m₁ + m₂ + ... + m_N)-order moment of the random variables can be found from the derivatives of the characteristic function according to:

E{X₁^{m₁} ... X_N^{m_N}} = (−j)^{m₁+...+m_N} [∂^{m₁+...+m_N}/∂ω_{X₁}^{m₁}...∂ω_{X_N}^{m_N} Φ_{X₁,...,X_N}(ω_{X₁}, ..., ω_{X_N})]_{ω_{X₁}=0,...,ω_{X_N}=0}   (1.35)

Transformation

If Y(X) is a vector of transformations of the random variables X, the joint density function of the N-dimensional random variable given by Y can be found from:

f_Y(y₁, ..., y_N) = |J| f_X(x₁, ..., x_N)   (1.36)

where the Jacobian is the determinant:
J = det [ ∂x₁/∂y₁ ... ∂x₁/∂y_N ; ... ; ∂x_N/∂y₁ ... ∂x_N/∂y_N ]

Sum of random variables

A normalized summation of N random variables X_i is termed the sample mean:

Y = (1/N) Σ_{i=1}^{N} X_i   (1.37)

Y is a random variable with mean:

μ_Y = E{(1/N) Σ_{i=1}^{N} X_i} = (1/N) Σ_{i=1}^{N} E{X_i} = (1/N) Σ_{i=1}^{N} μ_{X_i}   (1.38)

and second moment:

E{Y²} = E{[(1/N) Σ_{i=1}^{N} X_i]²} = (1/N²) [Σ_{i=1}^{N} E{X_i²} + Σ_{i=1}^{N} Σ_{i′≠i} E{X_i X_{i′}}] = (1/N²) [Σ_{i=1}^{N} E{X_i²} + Σ_{i=1}^{N} Σ_{i′≠i} R_{X_i X_{i′}}]   (1.39)

The variance is found from:

σ_Y² = E{Y²} − μ_Y² = (1/N²) [Σ_{i=1}^{N} (E{X_i²} − μ_{X_i}²) + Σ_{i=1}^{N} Σ_{i′≠i} (R_{X_i X_{i′}} − μ_{X_i} μ_{X_{i′}})] = (1/N²) [Σ_{i=1}^{N} σ_{X_i}² + Σ_{i=1}^{N} Σ_{i′≠i} C_{X_i X_{i′}}]   (1.40)

When the random variables X_i are statistically independent, C_{X_i X_{i′}} = 0 (for i ≠ i′), and the variance of their sample mean is σ_Y² = (1/N²) Σ_{i=1}^{N} σ_{X_i}².
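For independent, identically distributed samples the result above reduces to the familiar σ_Y² = σ_X²/N. A small simulation sketch (NumPy assumed; the seed, N, and trial count are arbitrary illustrative choices):

```python
import numpy as np

# Sketch: the variance of the sample mean of N independent unit-variance
# variables falls as sigma_X^2 / N.
rng = np.random.default_rng(2)
N = 100
realizations = rng.normal(0.0, 1.0, size=(20_000, N))  # 20k records of N samples
sample_means = realizations.mean(axis=1)               # one sample mean each

print(sample_means.var())   # close to sigma_X^2 / N = 1/100
```

Averaging N independent samples thus reduces the fluctuation of the estimate by a factor √N, the basis of noise reduction by averaging discussed in later chapters.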
1.2.4 The Gaussian (normal) distribution

The (uni-variate) Gaussian random variable

When a random variable X is Gaussian distributed with mean μ_X and standard deviation σ_X, its PDF is given by:

f_X(x) = 1/(√(2π) σ_X) · e^{−(x − μ_X)²/(2σ_X²)}   (1.41)

The CDF is given by:

F_X(x) = ∫_{−∞}^{x} f_X(z) dz = 1/√(2π) ∫_{−∞}^{(x−μ_X)/σ_X} e^{−z²/2} dz   (1.42)

The distribution function can be written in terms of the Q-function¹:

F_X(x) = 1 − Q((x − μ_X)/σ_X)   (1.43)

or in terms of the error function²:

F_X(x) = ½ [1 + erf((x − μ_X)/(√2 σ_X))] = 1 − ½ erfc((x − μ_X)/(√2 σ_X))   (1.44)

¹ The Q-function:

Q(x) ≡ 1/√(2π) ∫_{x}^{+∞} e^{−z²/2} dz

² The error function is defined by:

erf(x) ≡ 2/√π ∫_{0}^{x} e^{−z²} dz

and its complementary version is:

erfc(x) ≡ 1 − erf(x) = 2/√π ∫_{x}^{+∞} e^{−z²} dz
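The equivalence of the Q-function and error-function forms of the Gaussian CDF can be sketched with the standard library alone (the function names here are illustrative, not standard APIs):

```python
import math

# Sketch of the equivalent Gaussian CDF forms:
# F_X(x) = 1 - Q(z) = 1/2 [1 + erf(z/sqrt(2))], with z = (x - mu)/sigma.
def q_function(x):
    """Gaussian tail probability, Q(x) = 1/2 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    z = (x - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# consistency of the two forms, and the symmetry value F(mu) = 1/2
print(gaussian_cdf(1.3) - (1.0 - q_function(1.3)))   # ~0
print(gaussian_cdf(0.0))                             # 0.5
```

The Q-function form is the one used throughout the later chapters for error probabilities in the presence of Gaussian noise.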
Figure 1.2: Probability density function (PDF) and cumulative distribution function (CDF) of the Gaussian (normal) distribution.

Figure 1.2 shows the density and distribution functions of a Gaussian random variable with mean μ_X = 0 and standard deviation σ_X = 1. The central moments of a Gaussian random variable can be found from the standard deviation:

E{(X − μ_X)^m} = 1·3·...·(m − 1)·σ_X^m   for m even
E{(X − μ_X)^m} = 0                       for m odd   (1.45)

The ordinary moments may be expressed in terms of the central moments:

E{X^m} = Σ_{i=0}^{m} (m choose i) μ_X^i E{(X − μ_X)^{m−i}}   (1.46)

The characteristic function is given by:
Φ(ω) = e^{jμω − σ²ω²/2}   (1.47)

The bi-variate Gaussian distribution

When two random variables X and Y are jointly Gaussian distributed, their joint density function is given by:

f_{X,Y}(x, y) = 1/(2π σ_X σ_Y √(1 − ρ²)) · exp{ −1/(2(1 − ρ²)) [ (x − μ_X)²/σ_X² − 2ρ(x − μ_X)(y − μ_Y)/(σ_X σ_Y) + (y − μ_Y)²/σ_Y² ] }   (1.48)

and called the bi-variate Gaussian density. Here:

μ_X = E{X},  μ_Y = E{Y}
σ_X² = E{(X − μ_X)²},  σ_Y² = E{(Y − μ_Y)²}
ρ = C_{XY}/(σ_X σ_Y),  where C_{XY} = E{(X − μ_X)(Y − μ_Y)}

When the two variables X and Y are un-correlated (ρ = 0), the joint density function can be written as a product of the marginal densities of two uni-variate Gaussian random variables:

f_X(x) = 1/(√(2π) σ_X) e^{−(x−μ_X)²/(2σ_X²)}
f_Y(y) = 1/(√(2π) σ_Y) e^{−(y−μ_Y)²/(2σ_Y²)}   (1.49)

The joint characteristic function of the bi-variate Gaussian random variable is:

Φ(ω_X, ω_Y) = exp{ j(μ_X ω_X + μ_Y ω_Y) − ½(σ_X² ω_X² + 2ρ σ_X σ_Y ω_X ω_Y + σ_Y² ω_Y²) }   (1.50)
The multi-variate Gaussian distribution

N random variables X = [X₁, X₂, ..., X_N] are jointly Gaussian distributed if their joint density function is given by:

f_X(x₁, ..., x_N) = 1/((2π)^{N/2} √det(Λ_X)) · exp{ −½ (x − μ_X)ᵗ Λ_X^{−1} (x − μ_X) }   (1.51)

Here:

μ_X = [μ_{X₁}, μ_{X₂}, ..., μ_{X_N}]ᵗ

is the mean vector and

Λ_X = [ σ_{X₁}²      C_{X₁X₂}  ...  C_{X₁X_N} ;
        C_{X₂X₁}    σ_{X₂}²    ...  C_{X₂X_N} ;
        ...                                   ;
        C_{X_N X₁}  C_{X_N X₂} ...  σ_{X_N}²  ]

is the covariance matrix. Note that for a bi-variate Gaussian distribution of X and Y, the covariance matrix is:

Λ = [ σ_X²      ρσ_Xσ_Y ;
      ρσ_Xσ_Y   σ_Y²    ]

where ρ is the correlation coefficient (1.19).
The central limit theorem

The probability distribution of a sum of a large number of statistically independent random variables (which are not necessarily identically distributed) approaches a Gaussian distribution.
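The theorem can be sketched numerically (NumPy assumed; the uniform summands, n = 50, and the trial count are illustrative choices): sums of many uniform variables, once standardized, follow the Gaussian distribution closely.

```python
import numpy as np

# Sketch: sums of 50 independent Uniform(0,1) variables, standardized,
# approach the standard Gaussian (central limit theorem).
rng = np.random.default_rng(3)
n, trials = 50, 200_000
sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)

mu, sigma = n * 0.5, np.sqrt(n / 12.0)   # exact mean and std of the sum
z = (sums - mu) / sigma                  # standardized sums

# empirical P(Z <= 1) vs. the Gaussian value F(1) ~ 0.8413
print((z <= 1.0).mean())
```

Each uniform summand is far from Gaussian, yet 50 of them already give tail probabilities matching the normal CDF to about two decimal places; this is why Gaussian statistics dominate the noise models of the following chapters.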
1.3 Random processes

A random process X(t) is an ensemble of time-dependent sample waveforms x(t) (real functions are considered here), which are realizations of the process. A process is called deterministic if future values of any sample function can be predicted from past values. A random process is said to be stationary if all of its statistical properties do not change with time.

1.3.1 Statistical characteristics

Sampling of a random process at a particular time t results in a random variable with distribution (1.1):

F_{X(t)}(x; t) = P[X(t) ≤ x]   (1.52)

and probability density (1.2):

f_{X(t)}(x; t) = d/dx F_{X(t)}(x; t)   (1.53)

which in general are time-dependent functions. The resulting m-th order moments (1.3) are time-dependent functions:

E{X^m(t)} = ∫_{−∞}^{+∞} x^m f_{X(t)}(x; t) dx   (1.54)

In particular, the mean (1.4):

μ_X(t) = E{X(t)} = ∫_{−∞}^{+∞} x f_{X(t)}(x; t) dx   (1.55)

and the variance (1.5):

σ_X²(t) = E{[X(t) − μ_X(t)]²} = E{X²(t)} − μ_X²(t)   (1.56)
The statistical auto-correlation of a random process is the correlation (1.16) between two samples of the random process at times t and t + τ [two random variables X(t) and X(t + τ)]:

R_XX(t; τ) = E{X(t)X(t + τ)} = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x₁x₂ f_{X(t),X(t+τ)}(x₁, t; x₂, t + τ) dx₁ dx₂   (1.57)

The statistical auto-correlation satisfies the following properties:

|R_XX(t; τ)| ≤ √(R_XX(t; 0)·R_XX(t + τ; 0))
R_XX(t; 0) = E{X²(t)} = μ_X²(t) + σ_X²(t)
R_XX(t + τ; −τ) = R_XX(t; τ)

The auto-covariance (1.17) is defined by:

C_XX(t; τ) = E{[X(t) − μ_X(t)][X(t + τ) − μ_X(t + τ)]} = R_XX(t; τ) − μ_X(t)·μ_X(t + τ)   (1.58)

1.3.2 A wide-sense stationary process

In many practical cases we are required to deal only with the mean value and the auto-correlation function of the random process. When these quantities do not depend on time, i.e. the mean:

μ_X = E{X(t)}   (1.59)

is constant and the auto-correlation:

R_XX(τ) = E{X(t)X(t + τ)}   (1.60)

is a function of the time difference τ only, the process is wide-sense stationary. For such processes the auto-correlation satisfies:
|R_XX(τ)| ≤ R_XX(0)
R_XX(0) = E{X²(t)} = μ_X² + σ_X²
R_XX(−τ) = R_XX(+τ)

The auto-covariance is then:

C_XX(τ) = R_XX(τ) − μ_X²   (1.61)

1.3.3 Time average, power and correlation

Consider a portion of a realization of the random process in the time interval [−T/2, +T/2]³:

x_T(t) ≡ x(t)·rect(t/T) = x(t) for |t| ≤ T/2,  0 for |t| > T/2   (1.62)

Average

Time averaging of each realization x_T(t):

⟨x_T⟩ = (1/T) ∫_{−T/2}^{+T/2} x(t) dt   (1.63)

results in a random variable ⟨x_T⟩ representing the DC value of each realization of the random process in the time interval T. The expected value (ensemble average of all time averages) of this random variable is:

E{⟨x_T⟩} = E{ (1/T) ∫_{−T/2}^{+T/2} x(t) dt } = (1/T) ∫_{−T/2}^{+T/2} E{X(t)} dt   (1.64)

³ The rectangular function:  rect(t/T) ≡ 1 for |t| ≤ T/2,  0 for |t| > T/2
In the case of a wide-sense stationary process the mean value E{X(t)} = μ_X is a constant that does not depend on time. In that case, the expected value of the time averages (DC values of all sample waveforms) is equal to the mean value of the process:

E{⟨x⟩} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} E{X(t)} dt = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} μ_X dt = μ_X   (1.65)

Instantaneous and average power

The instantaneous power of the realization x(t) is given by P_x(t) = x²(t). The time-averaged power of a portion x_T(t) of a realization of the random process results in a random variable:

⟨P_{x_T}⟩ = (1/T) ∫_{−T/2}^{+T/2} x²(t) dt   (1.66)

Its expected value is:

E{⟨P_{x_T}⟩} = (1/T) ∫_{−T/2}^{+T/2} E{X²(t)} dt   (1.67)

For a wide-sense stationary process the second moment is constant, E{X²(t)} = R_XX(0). The expected value of the average power is then equal to the second moment of the random process:

E{⟨P_x⟩} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} E{X²(t)} dt = R_XX(0)   (1.68)

Time auto-correlation

The time auto-correlation of a sample signal x(t) is given by:

R_{x_T}(τ) ≡ ⟨x(t)x(t + τ)⟩ = (1/T) ∫_{−T/2}^{+T/2} x(t)x(t + τ) dt   (1.69)

It is a random variable (for any time difference τ), with average:
E{R_{x_T}(τ)} = E{⟨x(t)x(t + τ)⟩} = (1/T) ∫_{−T/2}^{+T/2} E{X(t)X(t + τ)} dt = (1/T) ∫_{−T/2}^{+T/2} R_XX(t; τ) dt   (1.70)

For a wide-sense stationary process the statistical auto-correlation R_XX(t; τ) = R_XX(τ) does not depend on t. Consequently, we find that:

E{R_x(τ)} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} E{X(t)X(t + τ)} dt = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} R_XX(τ) dt = R_XX(τ)   (1.71)

Ergodicity

If the time average ⟨x⟩ and the time auto-correlation R_x(τ) of a stationary process are random variables with zero variances, they are actually constants, equal to their expected values:

⟨x⟩ = E{⟨x⟩} = μ_X
R_x(τ) = E{R_x(τ)} = R_XX(τ)   (1.72)

A process whose statistical averages can be found from time averages is said to be ergodic.

1.3.4 Spectral characteristics

Fourier transform

The Fourier transform of a portion of a realization of the random process⁴ is:

⁴ Note that the Fourier transform exists only for realizations x(t) that satisfy ∫_{−∞}^{+∞} |x(t)| dt < ∞. Many realizations of the process may not satisfy this sufficient (but not necessary) condition. Taking a portion x_T(t) of a realization assures a finite integral, so x_T(t) has a Fourier transform.
X_T(jf) = ∫_{−∞}^{+∞} x_T(t) e^{−j2πft} dt = ∫_{−T/2}^{+T/2} x(t) e^{−j2πft} dt   (1.73)

The inverse Fourier transform is given by:

x_T(t) = ∫_{−∞}^{+∞} X_T(jf) e^{+j2πft} df   (1.74)

Power spectral density

Substituting the inverse Fourier transform (1.74) into the expression for the average power (1.66) results in:

⟨P_{x_T}⟩ = (1/T) ∫_{−T/2}^{+T/2} x²(t) dt = (1/T) ∫_{−∞}^{+∞} x_T²(t) dt = (1/T) ∫_{−∞}^{+∞} |X_T(f)|² df   (1.75)

This relation is Parseval's theorem for a realization of the random process. Taking the expected value of the average power gives:

E{⟨P_{x_T}⟩} = (1/T) ∫_{−T/2}^{+T/2} E{X²(t)} dt = (1/T) ∫_{−∞}^{+∞} E{|X_T(f)|²} df   (1.76)

The power spectral density (PSD) of a portion of the random process is defined by:

S_{X_T}(f) ≡ (1/T) E{|X_T(f)|²}   (1.77)

In order to find the power spectrum of the entire process we take the limit T → ∞:

S_X(f) = lim_{T→∞} (1/T) E{|X_T(f)|²}   (1.78)

The PSD properties are:

S_X*(f) = S_X(f)   (the PSD is real)
S_X(f) ≥ 0
S_X(−f) = S_X(f)   (for a real random process)
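A discrete-time sketch of the periodogram estimate |X_T(f)|²/T (NumPy assumed; unit sampling interval and a white-noise record are illustrative choices). Parseval's relation guarantees that the periodogram integrates to the average power of the record, mirroring (the continuous-time) Parseval's theorem above:

```python
import numpy as np

# Sketch: per-record PSD estimate |X_T(f)|^2 / T in discrete time.
# By Parseval's relation the mean of the periodogram equals the
# time-averaged power of the record, exactly.
rng = np.random.default_rng(4)
N = 4096
x = rng.normal(0.0, 1.0, N)              # one record of unit-variance noise

X = np.fft.fft(x)
periodogram = np.abs(X)**2 / N           # discrete analogue of |X_T(f)|^2 / T

avg_power = np.mean(x**2)                # time-averaged power of the record
print(periodogram.mean() - avg_power)    # ~0 (Parseval, up to round-off)
```

A single periodogram fluctuates strongly around the true PSD; in practice many records (or segments) are averaged, which is the expectation E{·} in the definition of S_X(f).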
Time-frequency distribution

Fourier transformation of the statistical auto-correlation of a random process results in a time-frequency dependent function [6]⁵:

W_X(t; f) = ∫_{−∞}^{+∞} R_XX(t; τ) e^{−j2πfτ} dτ = ∫_{−∞}^{+∞} E{X(t)X(t + τ)} e^{−j2πfτ} dτ   (1.79)

The time-frequency distribution (1.79) is real, W_X*(t; f) = W_X(t; f), for a real random process. Equation (1.79) can be written in the form:

W_X(t; f) = E{ x(t) ∫_{−∞}^{+∞} x(t + τ) e^{−j2πfτ} dτ } = lim_{T→∞} E{ x(t) e^{+j2πft} ∫_{−T/2}^{+T/2} x(t′) e^{−j2πft′} dt′ } = lim_{T→∞} E{ x(t) X_T(f) e^{+j2πft} }

The time average of the time-frequency distribution is equal to the power spectrum, according to:

⁵ The time-frequency distribution given in (1.79):

W_X(t; f) = ∫_{−∞}^{+∞} E{X(t)X(t + τ)} e^{−j2πfτ} dτ

is known as the Rihaczek distribution [7]. When a symmetric definition of the auto-correlation is used, the Wigner distribution [8] is obtained:

W_X(t; f) = ∫_{−∞}^{+∞} E{X(t − τ/2)X(t + τ/2)} e^{−j2πfτ} dτ
Table 1.1: Stochastic process X(t).

Mean (expected value):        μ_X(t) = E{X(t)}
Auto-correlation:             R_XX(t; τ) = E{X(t)X(t + τ)}
Auto-covariance:              C_XX(t; τ) = R_XX(t; τ) − μ_X(t)·μ_X(t + τ)
Time-frequency distribution:  W_X(t; f) = ∫_{−∞}^{+∞} R_XX(t; τ) e^{−j2πfτ} dτ
Power spectral density:       S_X(f) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} W_X(t; f) dt = lim_{T→∞} (1/T) E{|X_T(f)|²}
Power:                        E{⟨P_x⟩} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} E{X²(t)} dt = lim_{T→∞} (1/T) ∫_{−∞}^{+∞} E{|X_T(f)|²} df

lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} W_X(t; f) dt = lim_{T→∞} E{ X_T(f)·(1/T) ∫_{−T/2}^{+T/2} x(t) e^{+j2πft} dt } = lim_{T→∞} (1/T) E{|X_T(f)|²} = S_X(f)   (1.80)

Integration over the entire frequency domain results in:

∫_{−∞}^{+∞} W_X(t; f) df = E{ x(t) ∫_{−∞}^{+∞} X_T(f) e^{+j2πft} df } = E{X²(t)}

For a wide-sense stationary process, where R_XX(t; τ) = R_XX(τ) = E{R_x(τ)}, the time-frequency distribution is exactly the power spectrum of the process:

W_X(f) = ∫_{−∞}^{+∞} R_XX(τ) e^{−j2πfτ} dτ = ∫_{−∞}^{+∞} E{R_x(τ)} e^{−j2πfτ} dτ = lim_{T→∞} (1/T) E{|X_T(f)|²} = S_X(f)   (1.81)

Equation (1.81) is the Wiener-Khinchine theorem for wide-sense stationary processes. The above relations are summarized in Table 1.1. The relations for a wide-sense stationary process are given in Table 1.2.
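The Wiener-Khinchine theorem has an exact discrete-time counterpart that can be sketched directly (NumPy assumed; the record length and white-noise input are illustrative): the periodogram of a record equals the Fourier transform of its circular sample auto-correlation.

```python
import numpy as np

# Sketch: discrete Wiener-Khinchine check -- the DFT of the circular sample
# auto-correlation of a record equals its periodogram, identically.
rng = np.random.default_rng(8)
N = 1024
x = rng.normal(0.0, 1.0, N)

# circular sample auto-correlation R[tau] = (1/N) sum_t x[t] x[(t+tau) mod N]
R = np.array([np.dot(x, np.roll(x, -tau)) for tau in range(N)]) / N

psd_from_R = np.fft.fft(R).real          # Fourier transform of R[tau]
periodogram = np.abs(np.fft.fft(x))**2 / N

print(np.max(np.abs(psd_from_R - periodogram)))   # ~0
```

The agreement is an algebraic identity of the DFT, which is why either route (correlate-then-transform, or transform-then-square) may be used to estimate a PSD in practice.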
Table 1.2: Wide-sense stationary process X(t).

Mean (expected value):        μ_X = E{X(t)} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} x(t) dt
Auto-correlation:             R_XX(τ) = E{X(t)X(t + τ)} = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} x(t)x(t + τ) dt
Auto-covariance:              C_XX(τ) = R_XX(τ) − μ_X²
Time-frequency distribution:  W_X(f) = ∫_{−∞}^{+∞} R_XX(τ) e^{−j2πfτ} dτ
Power spectral density:       S_X(f) = W_X(f) = lim_{T→∞} (1/T) E{|X_T(f)|²}
Power:                        E{⟨P_x⟩} = R_XX(0) = lim_{T→∞} (1/T) ∫_{−∞}^{+∞} E{|X_T(f)|²} df

The harmonic signal

Consider the sine waveform with random phase:

x(t) = A₀ cos(2πf₀t + θ) = A₀ cos(θ) cos(2πf₀t) − A₀ sin(θ) sin(2πf₀t)   (1.82)

A₀ is the amplitude and f₀ is the frequency. The signal x(t) is a realization of a random process X(t), where the phase θ is a random variable distributed uniformly over [−kπ, +kπ], with 0 ≤ k ≤ 1. The PDF of the phase is given by:

f_θ(θ) = (1/(2πk)) rect(θ/(2πk))

Figure 1.3 shows a few realization waveforms of the sine random process for k = 1. The PDF of the random process can be found using the transformation (1.10):

f_{X(t)}(x) = Σ_i f_θ(θ_i) / (A₀|sin(2πf₀t + θ_i)|) = Σ_i f_θ(θ_i) / √(A₀² − x²) =
Figure 1.3: a) Realizations of the sine random process (k = 1). b) Probability density function. c) Auto-correlation. d) Power spectral density.
= 1/(πk √(A₀² − x²))   (1.83)

over the range of values attained by A₀ cos(2πf₀t + θ) as θ spans [−kπ, +kπ]; for k = 1 this is the arcsine density 1/(π√(A₀² − x²)), |x| < A₀. The mean⁶:

μ_X(t) = E{X(t)} = E{A₀ cos(θ)} cos(2πf₀t) − E{A₀ sin(θ)} sin(2πf₀t) = A₀ sinc(k) cos(2πf₀t)   (1.84)

The auto-correlation:

R_XX(t; τ) = E{X(t)X(t + τ)} = E{ A₀ cos(2πf₀t + θ)·A₀ cos[2πf₀(t + τ) + θ] } =
= (A₀²/2) cos(2πf₀τ) + (A₀²/2) E{cos[2πf₀(2t + τ) + 2θ]} =
= (A₀²/2) cos(2πf₀τ) + (A₀²/2) sinc(2k) cos[2πf₀(2t + τ)]   (1.85)

The time-frequency distribution is:

W_X(t; f) = ∫_{−∞}^{+∞} R_XX(t; τ) e^{−j2πfτ} dτ = (A₀²/4) [1 + sinc(2k) e^{+j4πf₀t}] [δ(f − f₀) + δ(f + f₀)]   (1.86)

and the power spectral density is:

S_X(f) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} W_X(t; f) dt = (A₀²/4) [δ(f − f₀) + δ(f + f₀)]   (1.87)

When k = 1 the random process is wide-sense stationary, with expected DC level μ_X = E{X(t)} = 0 and expected average power R_XX(0) = E{X²(t)} = A₀²/2.

⁶ E{sin(θ)} = (1/(2πk)) ∫_{−kπ}^{+kπ} sin(θ) dθ = 0

E{cos(θ)} = (1/(2πk)) ∫_{−kπ}^{+kπ} cos(θ) dθ = sin(kπ)/(kπ) ≡ sinc(k)
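The k = 1 case can be sketched numerically (NumPy assumed; amplitude, frequency, and ensemble size are illustrative): an ensemble of sine realizations with uniform phase has zero expected DC level and average power A₀²/2.

```python
import numpy as np

# Sketch: the sine process x(t) = A0*cos(2*pi*f0*t + theta), theta uniform
# over [-pi, +pi] (k = 1). Expected DC level is 0, average power A0^2/2.
rng = np.random.default_rng(5)
A0, f0 = 2.0, 10.0
t = np.linspace(0.0, 1.0, 1000, endpoint=False)    # 10 full carrier periods
theta = rng.uniform(-np.pi, np.pi, size=20_000)    # one phase per realization
ensemble = A0 * np.cos(2*np.pi*f0*t[None, :] + theta[:, None])

print(ensemble.mean())             # ~0: expected DC level
print((ensemble**2).mean())        # ~A0^2/2 = 2.0: expected average power
```

Because each realization is averaged over an integer number of periods, both results hold to round-off precision regardless of the drawn phases, illustrating the wide-sense stationary (and ergodic, in mean and power) character of the k = 1 process.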
Figure 1.4: a) Linear system. b) Linear time-invariant system.

1.3.5 Response of linear systems

The response y(t), obtained at the output of a system when a signal x(t) is introduced at its input (see Figure 1.4), is given by the operation:

y(t) = L{x(t)}

L is an operator representing the action of the system on the input signal x(t). A system is linear if the operator satisfies:

y(t) = L{ Σ_i K_i x_i(t) } = Σ_i K_i L{x_i(t)} = Σ_i K_i y_i(t)

where the K_i are constants and y_i(t) = L{x_i(t)}. In that case the output signal can be expressed in terms of the system's impulse response:
y(t) = L{x(t)} = L{ ∫_{−∞}^{+∞} x(t′) δ(t − t′) dt′ } = ∫_{−∞}^{+∞} x(t′) L{δ(t − t′)} dt′ = ∫_{−∞}^{+∞} x(t′) h(t; t′) dt′   (1.88)

h(t; t′) = L{δ(t − t′)} is the time-dependent response at the output of the system when an impulse is applied to its input at time t′. If the impulse response of the system satisfies L{δ(t − t′)} = h(t − t′), the system is said to be a linear time-invariant (LTI) system. In that case the response at the output is given by the convolution:

y(t) = x(t) ∗ h(t) = ∫_{−∞}^{+∞} x(t′) h(t − t′) dt′ = ∫_{−∞}^{+∞} x(t − t′) h(t′) dt′   (1.89)

When dealing with random signals, the input x(t) represents a sample waveform of a random process X(t). In general, the action of the linear system may also have a random nature, resulting in an ensemble of impulse responses h(t). Consequently, the output y(t) is a realization of a random process Y(t), with mean:

μ_Y(t) = E{Y(t)} = E{ ∫_{−∞}^{+∞} x(t − t′) h(t′) dt′ } = ∫_{−∞}^{+∞} E{X(t − t′)}·E{h(t′)} dt′ = μ_X(t) ∗ E{h(t)}   (1.90)

and auto-correlation:

R_YY(t; τ) = E{Y(t)Y(t + τ)} = E{ ∫_{−∞}^{+∞} x(t − t′) h(t′) dt′ · ∫_{−∞}^{+∞} x(t + τ − t″) h(t″) dt″ } =
= ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} E{X(t − t′)X(t + τ − t″)}·E{h(t′)h(t″)} dt′ dt″ =
= ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} R_XX(t − t′; τ − τ′) R_hh(t′; τ′) dt′ dτ′   (1.91)
where R_hh(t′; τ′) = E{h(t′)h(t′ + τ′)} is the statistical auto-correlation of the impulse response. The time-frequency distribution (1.79) is the Fourier transform of the auto-correlation:

W_Y(t; f) = ∫_{−∞}^{+∞} W_X(t − t′; f) [ ∫_{−∞}^{+∞} R_hh(t′; τ′) e^{−j2πfτ′} dτ′ ] dt′ = ∫_{−∞}^{+∞} W_X(t − t′; f) W_h(t′; f) dt′ = W_X(t; f) ∗ W_h(t; f)   (1.92)

The PSD of the output random signal:

S_Y(f) = lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} W_Y(t; f) dt = ∫_{−∞}^{+∞} [ lim_{T→∞} (1/T) ∫_{−T/2}^{+T/2} W_X(t − t′; f) dt ] W_h(t′; f) dt′   (1.93)

If the impulse response h(t) of the system is a deterministic signal:

μ_Y(t) = ∫_{−∞}^{+∞} μ_X(t − t′) h(t′) dt′ = μ_X(t) ∗ h(t)
R_YY(t; τ) = ∫_{−∞}^{+∞} h(t′) [ ∫_{−∞}^{+∞} R_XX(t − t′; τ − τ′) h(t′ + τ′) dτ′ ] dt′
W_Y(t; f) = H(f) ∫_{−∞}^{+∞} W_X(t − t′; f) h(t′) e^{+j2πft′} dt′
S_Y(f) = |H(f)|² S_X(f)

where H(jf) = ∫_{−∞}^{+∞} h(t) e^{−j2πft} dt is the transfer function of the system, given by the Fourier transform of the impulse response. In the case where the input signal to a deterministic system is a wide-sense stationary random process X(t), the resulting random process Y(t) at the output is wide-sense stationary with:

μ_Y = μ_X ∫_{−∞}^{+∞} h(t) dt = H(0)·μ_X
R_YY(τ) = ∫_{−∞}^{+∞} R_XX(τ − τ′) [ ∫_{−∞}^{+∞} h(t′) h(t′ + τ′) dt′ ] dτ′ = R_XX(τ) ∗ h(−τ) ∗ h(τ)
S_Y(f) = |H(f)|² S_X(f)
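The output-power relation implied by S_Y(f) = |H(f)|² S_X(f) can be sketched in discrete time (NumPy assumed; the 3-tap smoothing filter and record length are illustrative): for white noise, whose PSD is flat, the output power is the input power scaled by the energy of the impulse response, Σ h².

```python
import numpy as np

# Sketch: white noise (flat S_X = sigma^2) through a deterministic FIR
# filter. Integrating |H|^2 over frequency gives sum(h^2), so the output
# power should be sigma^2 * sum(h^2).
rng = np.random.default_rng(6)
h = np.array([0.25, 0.5, 0.25])            # simple smoothing filter (assumed)
sigma2 = 1.0
x = rng.normal(0.0, np.sqrt(sigma2), 500_000)
y = np.convolve(x, h, mode='valid')        # filtered realization

predicted = sigma2 * np.sum(h**2)          # = 0.375
print(np.mean(y**2))                       # close to 0.375
```

The low-pass filter passes only part of the flat input spectrum, which is why the output power (0.375 σ²) is well below the input power, a preview of the band-limited noise calculations of Chapter 2.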
The differentiator

A differentiator is an LTI system whose output signal is the derivative of the input:

y(t) = d/dt x(t)   (1.94)

The impulse response is the doublet function:

h(t) = δ′(t) ≡ d/dt δ(t)   (1.95)

and the transfer function of the system is:

H(jf) = j2πf   (1.96)

A random process X(t) at the input results in a random process Y(t) at the output, with mean:

μ_Y(t) = d/dt μ_X(t)   (1.97)

and auto-correlation:

R_YY(t; τ) = (∂²/∂t∂τ − ∂²/∂τ²) R_XX(t; τ)   (1.98)

Fourier transformation of the auto-correlation results in the time-frequency distribution:

W_Y(t; f) = (2πf)² W_X(t; f) + j2πf ∂/∂t W_X(t; f)   (1.99)

The PSD of the output random signal:
Figure 1.5: Double side-band modulation.

S_Y(f) = (2πf)² S_X(f)   (1.100)

1.3.6 Modulation

Consider the product of a signal x(t) and a sine waveform at a frequency f₀ and phase θ (see Figure 1.5). In that case the signal x(t) modulates a carrier wave, resulting in the modulated signal:

y(t) = x(t) cos(2πf₀t + θ)   (1.101)

If the modulating signal x(t) is a realization of a random process X(t), and the phase θ is a random variable distributed uniformly over [−π, +π], then Y(t) is a random process
with mean:

μ_Y(t) = E{x(t) cos(2πf₀t + θ)} = E{X(t)}·E{cos(2πf₀t + θ)} = μ_X·0 = 0   (1.102)

and auto-correlation:

R_YY(t; τ) = E{X(t)X(t + τ)}·E{cos(2πf₀t + θ) cos[2πf₀(t + τ) + θ]} = ½ R_XX(t; τ) cos(2πf₀τ)   (1.103)

The time-frequency distribution is the Fourier transform of the auto-correlation function:

W_Y(t; f) = ½ W_X(t; f) ∗ ½[δ(f − f₀) + δ(f + f₀)] = ¼ [W_X(t; f − f₀) + W_X(t; f + f₀)]   (1.104)

The power spectrum of the modulated signal is then:

S_Y(f) = ¼ [S_X(f − f₀) + S_X(f + f₀)]   (1.105)

1.3.7 Quadrature representation of band-pass process

The modulated waveform given in (1.101) can be presented in the form:

y(t) = i(t) cos(2πf₀t) − q(t) sin(2πf₀t)   (1.106)

where i(t) = x(t) cos(θ) and q(t) = x(t) sin(θ) are realizations of two independent random processes I(t) and Q(t), respectively, satisfying the following relations:
Figure 1.6: a) Quadrature representation of a band-pass process. b) Phasor illustration.

$\overline{I(t)} = \overline{Q(t)} = 0$

$R_{II}(t;\tau) = R_{QQ}(t;\tau) = \frac{1}{2} R_{XX}(t;\tau)$

$R_{II}(t;0) = R_{QQ}(t;0) = \frac{1}{2} R_{XX}(t;0) = R_{YY}(t;0)$

$\overline{I(t)Q(t)} = \overline{I(t)}\cdot\overline{Q(t)} = 0$

$W_I(t;f) = W_Q(t;f) = \frac{1}{2} W_X(t;f)$

$S_I(f) = S_Q(f) = \frac{1}{2} S_X(f)$

The scheme that carries out the modulation given in (1.106) is known as a quadrature (I-Q) modulator, illustrated in figure 1.6.

1.3.8 Cross-correlation and cross-spectrum

Consider two random processes X(t) and Y(t) with means $\overline{X}(t)$ and $\overline{Y}(t)$ and auto-correlations $R_{XX}(t;\tau)$ and $R_{YY}(t;\tau)$ respectively. The statistical cross-correlation of the
two random processes is the correlation between samples of the random processes at times t and t+τ:

$R_{XY}(t;\tau) = \overline{X(t)Y(t+\tau)} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x\,y\, f_{X(t),Y(t+\tau)}(x,t;y,t+\tau)\,dx\,dy$  (1.107)

When $R_{XY}(t;\tau) = 0$, the two processes are orthogonal. If they are statistically independent, their cross-correlation is $R_{XY}(t;\tau) = \overline{X(t)}\cdot\overline{Y(t+\tau)} = \overline{X}(t)\,\overline{Y}(t+\tau)$. If the processes X(t) and Y(t) are at least jointly wide-sense stationary, then $R_{XY}(\tau) = \overline{X(t)Y(t+\tau)}$ is a function of the time difference τ only, satisfying the following:

$|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)\,R_{YY}(0)} \le \frac{1}{2}\left[R_{XX}(0) + R_{YY}(0)\right]$

$R_{XY}(0) = \overline{X(t)Y(t)}$

$R_{XY}(-\tau) = R_{YX}(+\tau)$

The cross-covariance of the two processes is:

$C_{XY}(t;\tau) = \overline{\left[X(t) - \overline{X}(t)\right]\left[Y(t+\tau) - \overline{Y}(t+\tau)\right]} = R_{XY}(t;\tau) - \overline{X}(t)\,\overline{Y}(t+\tau)$  (1.108)

If $C_{XY}(t;\tau) = 0$, the processes are un-correlated. For jointly wide-sense stationary processes the cross-covariance becomes:

$C_{XY}(\tau) = R_{XY}(\tau) - \overline{X}\,\overline{Y}$  (1.109)

The time-frequency cross-distribution is given by:

$W_{XY}(t;f) = \int_{-\infty}^{+\infty} R_{XY}(t;\tau)\, e^{-j2\pi f\tau}\, d\tau = \int_{-\infty}^{+\infty} \overline{X(t)Y(t+\tau)}\, e^{-j2\pi f\tau}\, d\tau$  (1.110)

and the cross-power density spectrum is:
$S_{XY}(f) = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{+T/2} W_{XY}(t;f)\,dt = \lim_{T\to\infty}\frac{1}{T}\,\overline{X_T^*(f)\,Y_T(f)}$  (1.111)

with the following properties:

$S_{XY}(f) = S_{YX}(-f) = S_{YX}^*(f)$

If orthogonal: $S_{XY}(f) = S_{YX}(f) = 0$

If un-correlated: $S_{XY}(f) = S_{YX}(f) = \overline{X}\,\overline{Y}\,\delta(f)$

Two harmonic signals

Consider two sine random processes, 'voltage' V(t) and 'current' I(t), at frequency f_0 with realizations:

$v(t) = V\cos(2\pi f_0 t + \theta_V)$
$i(t) = I\cos(2\pi f_0 t + \theta_I)$  (1.112)

V and I are the (peak) amplitudes of the voltage and current respectively, and θ_V and θ_I are their random phases. The cross-correlation is given by:

$R_{VI}(t;\tau) = \overline{V\cos(2\pi f_0 t + \theta_V)\cdot I\cos\left[2\pi f_0(t+\tau) + \theta_I\right]} = \frac{VI}{2}\overline{\cos(2\pi f_0\tau + \theta_V - \theta_I)} + \frac{VI}{2}\overline{\cos\left[2\pi f_0(2t+\tau) + \theta_V + \theta_I\right]}$  (1.113)

When the second average, taken over a high-frequency term, is small and negligible, the cross-correlation is:

$R_{VI}(\tau) = \frac{VI}{2}\overline{\cos(2\pi f_0\tau + \Delta\theta)} = \frac{VI}{2}\left[\overline{\cos\Delta\theta}\,\cos(2\pi f_0\tau) - \overline{\sin\Delta\theta}\,\sin(2\pi f_0\tau)\right]$  (1.114)
where $\Delta\theta = \theta_V - \theta_I$ is the phase difference between voltage and current, with PDF given by the convolution:

$f_{\Delta\theta}(\theta) = f_{\theta_V}(\theta) * f_{\theta_I}(-\theta)$  (1.115)

The cross-spectrum is:

$S_{VI}(f) = \int_{-\infty}^{+\infty} R_{VI}(\tau)\, e^{-j2\pi f\tau}\, d\tau = \frac{VI}{4}\left[\overline{e^{+j\Delta\theta}}\,\delta(f-f_0) + \overline{e^{-j\Delta\theta}}\,\delta(f+f_0)\right]$  (1.116)

The instantaneous power is a random process given by the product P(t) = V(t)·I(t), with average⁷:

$\overline{P(t)} = R_{VI}(0) = \frac{VI}{2}\,\overline{\cos\Delta\theta}$  (1.117)

⁷ When Δθ is uniformly distributed over $[-k\pi, +k\pi]$, the average is:
$\overline{\cos\Delta\theta} = \frac{1}{2k\pi}\int_{-k\pi}^{+k\pi}\cos\theta\, d\theta = \mathrm{sinc}(k)$
When Δθ is normally distributed with mean 0 and variance σ², the average is:
$\overline{\cos\Delta\theta} = \int_{-\infty}^{+\infty}\cos\theta\,\frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\theta^2/2\sigma^2}\, d\theta = e^{-\sigma^2/2}$
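The two phase averages in the footnote can be spot-checked by Monte Carlo. The sketch below is illustrative only; the values of k and σ are arbitrary assumptions, and sinc follows NumPy's normalized convention, $\mathrm{sinc}(x) = \sin(\pi x)/(\pi x)$:

```python
import numpy as np

rng = np.random.default_rng(1)
# Uniform phase on [-k*pi, +k*pi]:  E[cos(theta)] = sinc(k) = sin(pi k)/(pi k)
k = 0.5
theta_u = rng.uniform(-k * np.pi, k * np.pi, 1_000_000)
print(np.cos(theta_u).mean(), np.sinc(k))       # both near 2/pi ~ 0.6366

# Gaussian phase, zero mean, variance sigma^2:  E[cos(theta)] = exp(-sigma^2/2)
sigma = 0.7
theta_g = rng.normal(0.0, sigma, 1_000_000)
print(np.cos(theta_g).mean(), np.exp(-sigma**2 / 2))
```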
Bibliography

[1] W. B. Davenport, W. L. Root: "An introduction to the theory of random signals and noise", McGraw-Hill (1958)

[2] W. B. Davenport: "Probability and random processes, an introduction for applied scientists and engineers", McGraw-Hill (1970)

[3] A. Papoulis: "Probability, random variables, and stochastic processes", McGraw-Hill (1991)

[4] P. Z. Peebles: "Probability, random variables, and random signal principles", McGraw-Hill (1993)

[5] J. Goodman: "Statistical optics", Wiley (1985)

[6] L. Cohen: "Time-frequency distributions - a review", Proc. IEEE 77, 941-981 (1989)

[7] A. W. Rihaczek: "Signal energy distribution in time and frequency", IEEE Trans. Information Theory IT-14, 369-374 (1968)

[8] E. P. Wigner: "On the quantum correction for thermodynamic equilibrium", Phys. Rev. 40, 749-759 (1932)
Chapter 2

Noise as a Stochastic Process

2.1 White noise

White noise n(t) is a wide-sense stationary process with zero mean $\overline{n(t)} = 0$ and auto-correlation function:

$R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$  (2.1)

where N_0 is a real positive constant. According to the Wiener-Khinchin theorem, the power spectral density (PSD) of the white noise is the Fourier transform of the auto-correlation (2.1), resulting in a uniform density at all frequencies¹:

$S_n(f) = \frac{N_0}{2}$  (2.2)

N_0/2 is the two-sided power density over the positive and negative frequency domain; N_0 is referred to as the single side-band (SSB) noise power density (N_0/2 is the double side-band (DSB) noise power density). The auto-correlation function of white noise and its related power spectrum are illustrated in Fig. 2.1.a and Fig. 2.1.b respectively. Observe that the

¹ The terminology 'white noise' follows 'white light', which contains all frequencies in the visible light spectrum.
total average power of white noise over the entire spectrum is infinite:

$P_n = R_n(0) = \int_{-\infty}^{+\infty}\frac{N_0}{2}\, df = \infty$

2.2 Band-limited (colored) noise

In practical cases, the frequency response of the system, given by its transfer function H(j2πf), is band-limited. The noise equivalent bandwidth is defined by:

$B = \frac{\int_0^{\infty}|H(f)|^2\, df}{|H(f_0)|^2}$  (2.3)

where f_0 is the central frequency of the band-pass response of the system (for a low-pass system f_0 = 0), and B is the bandwidth of an ideal band-pass filter producing the same noise power at its output. The spectral line-shape of the power density of the equivalent band-limited noise at the system's output is uniform within the noise equivalent bandwidth B:

$S_n(f) = \frac{N_0}{2}\,\mathrm{rect}\!\left(\frac{f}{B}\right) * \left[\delta(f-f_0) + \delta(f+f_0)\right]$  (2.4)

(see figure 2.1.b). The auto-correlation function of band-limited white noise is found by inverse Fourier transformation of (2.4) to be:

$R_n(\tau) = \sigma_n^2\,\mathrm{sinc}(B\tau)\cos(2\pi f_0\tau)$  (2.5)

as shown in figure 2.1.c. Here $\sigma_n^2$ is the variance, equal to the total average power of the (zero mean) band-limited noise:

$P_n = R_n(0) = \sigma_n^2 = N_0 B$  (2.6)

In the following we calculate the equivalent bandwidths and correlation functions of noise at the output of typical low-pass and band-pass linear systems.
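Definition (2.3) can be evaluated numerically. The sketch below (with an assumed time constant) integrates the first-order low-pass response $|H(f)|^2 = 1/(1+(2\pi\tau_c f)^2)$, treated in detail in the next subsection, and recovers the expected $B = 1/(4\tau_c)$:

```python
import numpy as np

# Sketch: B = integral_0^inf |H(f)|^2 df / |H(f0)|^2 for a first-order low-pass
# filter with an assumed time constant tau_c; expect B = 1/(4*tau_c) = 250 Hz.
tau_c = 1e-3
f = np.linspace(0.0, 1e6, 2_000_001)            # integrate far past the roll-off
H2 = 1.0 / (1.0 + (2 * np.pi * tau_c * f) ** 2)
df = f[1] - f[0]
B = (H2.sum() - 0.5 * (H2[0] + H2[-1])) * df / H2[0]   # trapezoid rule
print(B, 1.0 / (4 * tau_c))                     # both near 250 Hz
```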
Figure 2.1: a) Auto-correlation function of white noise. b) Power spectral density of white noise and equivalent band-pass noise. c) Auto-correlation of equivalent band-pass noise.
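The Fourier pair (2.4) and (2.5) can be verified directly: integrating a flat band-pass PSD of height N_0/2 over a bandwidth B around ±f_0 reproduces the sinc-times-carrier auto-correlation with total power N_0·B. A numerical sketch, with N_0, B and f_0 chosen arbitrarily:

```python
import numpy as np

# Sketch: R(tau) = 2 * integral over the positive band of (N0/2)*cos(2 pi f tau)
# should equal N0*B*sinc(B*tau)*cos(2*pi*f0*tau) (assumed N0, B, f0).
N0, B, f0 = 2.0, 100.0, 1_000.0
f = np.linspace(f0 - B / 2, f0 + B / 2, 200_001)   # positive-frequency band
df = f[1] - f[0]
for tau in (0.0, 1e-3, 4e-3):
    R = 2 * np.sum(N0 / 2 * np.cos(2 * np.pi * f * tau)) * df
    expected = N0 * B * np.sinc(B * tau) * np.cos(2 * np.pi * f0 * tau)
    print(R, expected)
```

At τ = 0 both sides reduce to the total power $N_0 B$, consistent with (2.6).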
2.2.1 Low-pass noise

Consider the linear first-order R-L and R-C electrical circuits shown in figure 2.2.a. The transfer function of both is:

$H(j2\pi f) = \frac{1}{1 + j2\pi\tau_c f}$  (2.7)

The system is of the low-pass type, where $\tau_c$ is the decay time constant ($\tau_c = RC$ for the R-C network and $\tau_c = L/R$ for the R-L network) of its impulse response, given by the inverse Fourier transform of the transfer function H(j2πf):

$h(t) = \frac{1}{\tau_c}\, e^{-t/\tau_c}\, u(t)$  (2.8)

When white noise (with power spectral density given in (2.2)) is introduced at the input of the first-order low-pass filter, the power spectral density of the noise at the output follows a Lorentzian line-shape:

$S_n(f) = |H(f)|^2\,\frac{N_0}{2} = \frac{N_0/2}{1 + (2\pi\tau_c f)^2}$  (2.9)

Following definition (2.3), the effective noise bandwidth is found to be $B = \frac{1}{4\tau_c}$. The auto-correlation function is the inverse Fourier transformation of the PSD (2.9):

$R_n(\tau) = \sigma_n^2\, e^{-|\tau|/\tau_c}$  (2.10)

where $\sigma_n^2 = N_0 B = \frac{N_0}{4\tau_c}$ is the total average power of the low-pass noise. Figures 2.2.b and 2.2.c show the auto-correlation function (2.10) and power spectrum (2.9) of the band-limited noise obtained at the output of the first-order low-pass filter.

The power transfer function of an nth-order Butterworth low-pass filter is given by:

$|H(f)|^2 = \frac{1}{1 + (2\pi\tau_c f)^{2n}}$

with noise equivalent bandwidth of:
Figure 2.2: a) First-order low-pass filters. b) Auto-correlation function of low-pass noise. c) Line-shape of the power spectrum of low-pass noise.
$B = \frac{1}{4n\,\tau_c\,\sin\!\left(\frac{\pi}{2n}\right)}$
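The output power $\sigma_n^2 = N_0/(4\tau_c)$ of the first-order filter can also be seen in a discrete-time simulation. This is a sketch, not part of the text: the sampling rate, N_0 and τ_c are assumed, the sampled "white" noise has variance $(N_0/2)f_s$, and the one-pole recursion approximates the analog R-C response when the sampling interval is much shorter than τ_c:

```python
import numpy as np

rng = np.random.default_rng(2)
# Sketch: white noise driving a first-order low-pass filter; the output power
# should approach sigma_n^2 = N0/(4*tau_c) = 500 for the assumed values.
N0, tau_c = 2.0, 1e-3
fs = 1e6                                   # sampling rate well above 1/tau_c
dt = 1.0 / fs
x = rng.normal(0.0, np.sqrt(N0 / 2 * fs), 2_000_000)
alpha = np.exp(-dt / tau_c)
y = np.empty_like(x)
acc = 0.0
for i, xn in enumerate(x):                 # y[i] = alpha*y[i-1] + (1-alpha)*x[i]
    acc = alpha * acc + (1.0 - alpha) * xn
    y[i] = acc
print(y.var(), N0 / (4 * tau_c))           # both near 500
```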
2.2.2 Band-pass noise

The second-order resonant R-L-C circuit in figure 2.3.a is a band-pass filter (BPF) with a transfer function of the form:

$H(j2\pi f) = \frac{1}{1 + jQ\left(\frac{f}{f_0} - \frac{f_0}{f}\right)}$  (2.11)

where $f_0 = \frac{1}{2\pi\sqrt{LC}}$ is the resonance frequency and $Q = R\sqrt{C/L}$ is the quality factor. The equivalent resistance $R = \frac{R_S R_L}{R_S + R_L}$ results from the source resistor R_S in parallel with the load R_L³. The impulse response of the circuit is given by:

$h(t) = \frac{1}{2\pi f_n\,\tau_c}\,\frac{d}{dt}\left[e^{-t/\tau_c}\sin(2\pi f_n t)\, u(t)\right]$  (2.12)

where $\tau_c = \frac{Q}{\pi f_0}$ and $f_n = f_0\sqrt{1 - \frac{1}{4Q^2}}$. The power spectral density of the noise at the output, when white noise is introduced at the input, is:

$S_n(f) = |H(f)|^2\,\frac{N_0}{2} = \frac{N_0/2}{1 + Q^2\left(\frac{f}{f_0} - \frac{f_0}{f}\right)^2}$  (2.13)

with effective bandwidth $B = \frac{\pi}{2}\frac{f_0}{Q} = \frac{1}{2\tau_c}$. For sufficiently high Q, the PSD can be approximated using the Lorentzian line shape:

$S_n(f) \simeq \frac{N_0/2}{1 + (2\pi\tau_c f)^2} * \left[\delta(f-f_0) + \delta(f+f_0)\right]$  (2.14)

Inverse Fourier transformation of (2.14) results in the auto-correlation function:

$R_n(\tau) \simeq \sigma_n^2\, e^{-|\tau|/\tau_c}\cos(2\pi f_0\tau)$  (2.15)

where $\sigma_n^2 = N_0 B = \frac{N_0}{2\tau_c}$ is the total average power of the noise. Figures 2.3.b and 2.3.c show the auto-correlation function (2.15) and power spectrum (2.13) of the band-limited noise obtained at the output of the resonant circuit.

³ This is known as the loaded Q-factor. The unloaded quality factor is the one calculated for $R_L \to \infty$.
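The stated bandwidth $B = \frac{\pi}{2}\frac{f_0}{Q}$ can be checked against definition (2.3) by direct numerical integration of (2.13), with $|H(f_0)|^2 = 1$ at resonance. The component values below are assumptions chosen to give Q = 10:

```python
import numpy as np

# Sketch: noise-equivalent bandwidth of |H(f)|^2 = 1/(1+Q^2 (f/f0 - f0/f)^2)
# for assumed R, L, C; expect B = pi*f0/(2*Q) = 1/(4*R*C) = 25 kHz.
R, L, C = 10_000.0, 1e-3, 1e-9
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))
Q = R * np.sqrt(C / L)
f = np.linspace(1.0, 20 * f0, 2_000_001)        # avoid f = 0 (f0/f diverges)
H2 = 1.0 / (1.0 + Q**2 * (f / f0 - f0 / f) ** 2)
df = f[1] - f[0]
B = (H2.sum() - 0.5 * (H2[0] + H2[-1])) * df    # trapezoid rule, |H(f0)|^2 = 1
print(B, np.pi * f0 / (2 * Q))
```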
Figure 2.3: a) Second-order band-pass resonant circuit. b) Auto-correlation function of band-pass noise. c) Line-shape of the power spectrum of band-pass noise.
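The quality of the Lorentzian approximation behind (2.14) can be illustrated numerically: near resonance, $Q(f/f_0 - f_0/f) \approx 2\pi\tau_c(f - f_0)$, so the exact response (2.13) and the shifted Lorentzian agree for large Q. A sketch with assumed values:

```python
import numpy as np

# Sketch: compare the exact band-pass response (2.13) with its Lorentzian
# approximation near resonance for an assumed f0 and a moderately high Q.
f0, Q = 1e6, 50.0
tau_c = Q / (np.pi * f0)
f = f0 + np.linspace(-f0 / Q, f0 / Q, 5)        # offsets within ~one FWHM
exact = 1.0 / (1.0 + Q**2 * (f / f0 - f0 / f) ** 2)
lorentz = 1.0 / (1.0 + (2 * np.pi * tau_c * (f - f0)) ** 2)
print(np.max(np.abs(exact - lorentz)))          # small for Q >> 1
```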
2.2.3 Multi-mode noise

Figure 2.4.a is a schematic illustration of a Fabry-Perot resonator. In the frequency domain, the transfer function describing the ratio between the out-coupled field and the circulating field inside the resonator after M round-trips is given by the summation:

$H(jf) = \tau\sum_{m=0}^{M-1}\rho^m\, e^{-jm k_z l_c} = \tau\,\frac{1 - \rho^M e^{-jM k_z l_c}}{1 - \rho\, e^{-j k_z l_c}}$  (2.16)

where ρ is the combined complex field reflectivity in the round-trip feedback loop, τ is the field transmission coefficient, $k_z$ is the axial wavenumber of the transverse mode excited in the cavity and $l_c$ is the round-trip length of the cavity. Inverse Fourier transformation of H(jf), neglecting dispersive effects, results in the impulse response:

$h(t) \simeq \tau\sum_{m=0}^{M-1}\rho^m\,\delta(t - m t_r)$  (2.17)

where $t_r = \frac{l_c}{v_g}$ is the round-trip time, determined by the group velocity $v_g = 2\pi\frac{df}{dk_z}$ of the circulating field inside the resonator.

An expression for the power transfer function of a Fabry-Perot resonator at steady state is obtained in the limit $M \to \infty$:

$|H(f)|^2 = \frac{T}{\left(1-\sqrt{R}\right)^2 + 4\sqrt{R}\sin^2\!\left(\frac{k_z l_c}{2}\right)}$  (2.18)

where $R = |\rho|^2$ is the total round-trip power reflectivity and $T = |\tau|^2$ is the power transmission coefficient. The power transfer characteristics of the Fabry-Perot resonator are shown in figure 2.4.b. Maximum transmission occurs when $k_z l_c = 2\pi m$ (where m is an integer), which defines the resonant frequencies $f_m$ of the longitudinal modes of the resonator⁴. The transmission peaks are equal to $\frac{T}{(1-\sqrt{R})^2}$. The free-spectral range is the inter-mode frequency separation $FSR = f_{m+1} - f_m = \frac{df}{dk_z}\left[k_z(f_{m+1}) - k_z(f_m)\right] = \frac{v_g}{l_c} = \frac{1}{t_r}$. The full-width half-maximum of

⁴ In free-space propagation $k_z = \frac{2\pi f}{c}$, and the resonant frequencies are $f_m = m\frac{c}{l_c}$ (where $c = 3\times 10^8$ m/s is the speed of light).
the transmission peaks is given by $FWHM = \frac{FSR}{F}$, where $F = \frac{\pi\sqrt[4]{R}}{1-\sqrt{R}}$ is the Finesse of the resonator.

The power spectral density of the noise obtained at the output of the resonator, when white noise is excited inside, can be written:

$S_n(f) = |H(f)|^2\,\frac{N_0}{2} = \frac{N_0}{2}\cdot\frac{T}{\left(1-\sqrt{R}\right)^2 + 4\sqrt{R}\sin^2\!\left(\frac{k_z l_c}{2}\right)}$  (2.19)

For a sufficiently high Finesse, the PSD near each resonant frequency can be approximated by employing the Lorentzian line-shape, and the noise spectrum can be written as:

$S_n(f) \simeq \frac{N_0}{2}\cdot\frac{T}{\left(1-\sqrt{R}\right)^2}\cdot\frac{1}{1+(2\pi\tau_c f)^2} * \sum_{m=-\infty}^{\infty}\left[\delta(f-f_m) + \delta(f+f_m)\right]$  (2.20)

where $\tau_c = \frac{F}{\pi\, FSR}$ is the decay time of the circulating field in the resonator (the cavity decay time of the stored power is $\tau_c/2$). The effective noise bandwidth of each longitudinal mode is $B = \frac{1}{2\tau_c} = \frac{\pi}{2}\frac{FSR}{F}$. Inverse Fourier transformation of (2.20) results in the auto-correlation function:

$R_n(\tau) \simeq \sigma_n^2\, e^{-|\tau|/\tau_c}\sum_{m=-\infty}^{\infty}\cos(2\pi f_m\tau)$  (2.21)

where

$\sigma_n^2 = \frac{T}{\left(1-\sqrt{R}\right)^2}\, N_0 B = \frac{T}{\left(1-\sqrt{R}\right)^2}\cdot\frac{N_0}{2\tau_c}$  (2.22)

is the total average output power of the band-pass noise at each mode.
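The figures of merit above can be put to numbers. The cavity parameters below are arbitrary assumptions (a free-space cavity with a 30 cm round trip, so $v_g = c$), illustrating the chain FSR → F → FWHM → τ_c:

```python
import numpy as np

# Sketch: Fabry-Perot figures of merit for assumed cavity parameters.
R = 0.95                 # round-trip power reflectivity (assumed)
l_c = 0.30               # round-trip length [m] (assumed)
c = 3e8                  # group velocity = c for a free-space cavity

FSR = c / l_c                                   # free spectral range = 1/t_r
F = np.pi * R**0.25 / (1.0 - np.sqrt(R))        # finesse
FWHM = FSR / F
tau_c = F / (np.pi * FSR)                       # field decay time
print(FSR / 1e9, F, FWHM / 1e6, tau_c)          # FSR = 1 GHz, F ~ 122
```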
Figure 2.4: a) Fabry-Perot distributed resonator. b) Line-shape characteristics of the power spectrum of noise at the output of a resonator.
Table 2.1: Characteristics of band-limited noise

                                   LPF              BPF                        Resonator
Resonant (central) frequency f_0:  0                1/(2π√(LC))                f_m
Finesse F:                         -                -                          π R^{1/4}/(1-√R)
Quality factor Q:                  -                R√(C/L)                    f_m F/FSR
Decay time τ_c:                    RC or L/R        Q/(π f_0)                  F/(π FSR)
Full-width half-maximum FWHM:      -                f_0/Q                      FSR/F
Noise equivalent bandwidth B:      1/(4τ_c)         1/(2τ_c) = (π/2)(f_0/Q)    1/(2τ_c) = (π/2)(FSR/F)
Noise power σ_n²:                  N_0/(4τ_c)       N_0/(2τ_c)                 [T/(1-√R)²]·N_0/(2τ_c)

Table 2.1 summarizes some of the parameters characterizing band-limited noise power spectral densities.
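One relation implicit in the table is that for a Lorentzian line the noise-equivalent bandwidth exceeds the 3-dB width: $FWHM = 1/(\pi\tau_c)$ while $B = 1/(2\tau_c)$, so $B = (\pi/2)\,FWHM$. A numerical sketch with an assumed decay time:

```python
import numpy as np

# Sketch: two-sided noise-equivalent bandwidth of a single Lorentzian line,
# compared with 1/(2*tau_c) and (pi/2)*FWHM (tau_c is an assumed value).
tau_c = 2e-6
f = np.linspace(0.0, 2e8, 2_000_001)            # offset from the line center
L = 1.0 / (1.0 + (2 * np.pi * tau_c * f) ** 2)
df = f[1] - f[0]
B = 2 * (L.sum() - 0.5 * (L[0] + L[-1])) * df   # both sides of the line
fwhm = 1.0 / (np.pi * tau_c)
print(B, 0.5 / tau_c, np.pi / 2 * fwhm)         # all near 250 kHz
```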
2.3 Noise statistics

2.3.1 The Gaussian distribution of noise

The probability density function (PDF) of Gaussian noise is:

$f_{n(t)}(n) = \frac{1}{\sqrt{2\pi}\,\sigma_n}\, e^{-\frac{n^2}{2\sigma_n^2}}$  (2.23)

The corresponding cumulative distribution function (CDF) is given by⁵:

$F_{n(t)}(n) = \int_{-\infty}^{n} f_{n(t)}(n')\,dn' = \frac{1}{2}\left[1 + \mathrm{erf}\!\left(\frac{n}{\sqrt{2}\,\sigma_n}\right)\right] = 1 - Q\!\left(\frac{n}{\sigma_n}\right) = 1 - \frac{1}{2}\,\mathrm{erfc}\!\left(\frac{n}{\sqrt{2}\,\sigma_n}\right)$  (2.24)

The PDF and CDF of Gaussian noise with $\sigma_n = 1$ are shown in figure 2.5.a.

2.3.2 The chi-square distribution of noise instantaneous power

The instantaneous power $P_n(t) = n^2(t)$ of the band-pass white noise n(t) is a random process with a central chi-square distribution with one degree of freedom, according to the transformation:

⁵ The Q-function: $Q(x) \equiv \frac{1}{\sqrt{2\pi}}\int_x^{+\infty} e^{-z^2/2}\,dz$. The error function is defined by $\mathrm{erf}(x) \equiv \frac{2}{\sqrt{\pi}}\int_0^x e^{-z^2}\,dz$, and its complementary version is $\mathrm{erfc}(x) \equiv 1 - \mathrm{erf}(x) = \frac{2}{\sqrt{\pi}}\int_x^{+\infty} e^{-z^2}\,dz$.
$f_{P_n(t)}(P_n) = \left|\frac{dn}{dP_n}\right|\, f_{n(t)}(n) = \frac{1}{\sqrt{2\pi P_n}\,\sigma_n}\, e^{-\frac{P_n}{2\sigma_n^2}}$  (2.25)

defined for $P_n \ge 0$. The expected value of the average power is the first moment $\overline{P_n} = \sigma_n^2$ (Eq. (2.6)), and the second moment is $\overline{P_n^2} = 3\sigma_n^4$. In figure 2.5.b, graphs of the probability density function of central and non-central chi-square (with one degree of freedom) distributions are drawn.

2.3.3 Linear transformation of Gaussian noise

Consider the linear transformation:

$y(t) = a\, n(t) + b$  (2.26)

where a is the signal amplification constant and b is the DC bias. The resulting process Y(t) is Gaussian distributed:

$f_{Y(t)}(y) = \frac{1}{\sqrt{2\pi}\,|a|\,\sigma_n}\, e^{-\frac{(y-b)^2}{2(a\sigma_n)^2}}$  (2.27)

with mean $\overline{Y(t)} = b$ and standard deviation $\sigma_Y = |a|\,\sigma_n$. The correlation function is:

$R_{YY}(\tau) = \overline{\left[a\, n(t) + b\right]\left[a\, n(t+\tau) + b\right]} = a^2 R_n(\tau) + b^2$  (2.28)

and the corresponding PSD, found by Fourier transformation of (2.28), is:

$S_Y(f) = a^2 S_n(f) + b^2\,\delta(f)$  (2.29)

The instantaneous power $P_y(t) = y^2(t)$ of the Gaussian process Y(t) is a random process with a non-central chi-square distribution with one degree of freedom (see figure 2.5.b):

$f_{P_y(t)}(P_y) = \frac{1}{2\sqrt{2\pi P_y}\,\sigma_Y}\left[e^{-\frac{(\sqrt{P_y}-b)^2}{2\sigma_Y^2}} + e^{-\frac{(\sqrt{P_y}+b)^2}{2\sigma_Y^2}}\right]$  (2.30)
Figure 2.5: a) Probability density function (PDF) and cumulative distribution function (CDF) of Gaussian noise ($\sigma_n = 1$), and b) its power chi-square distributed PDF.
defined for $P_y \ge 0$ (see figure 2.5.b). The expected value of the average power is the first moment:

$\overline{P_y} = R_{YY}(0) = (a\sigma_n)^2 + b^2$  (2.31)

The second moment is $\overline{P_y^2} = 3(a\sigma_n)^4 + 6(ab\,\sigma_n)^2 + b^4$.
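The power moments quoted in this section lend themselves to a short Monte-Carlo check. This is an illustrative sketch (a, b and σ_n are assumed values): with b = 0 it exercises the central case of (2.25), and with a bias it exercises (2.31) and the quoted second moment:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma_n = 0.5
n = rng.normal(0.0, sigma_n, 5_000_000)

# Central case (2.25): mean sigma_n^2, second moment 3*sigma_n^4
Pn = n**2
print(Pn.mean(), sigma_n**2)
print((Pn**2).mean(), 3 * sigma_n**4)

# Non-central case: y = a*n + b, mean (a*sigma_n)^2 + b^2 per (2.31)
a, b = 2.0, 1.0
Py = (a * n + b) ** 2
print(Py.mean(), (a * sigma_n) ** 2 + b**2)
print((Py**2).mean(), 3 * (a * sigma_n) ** 4 + 6 * (a * b * sigma_n) ** 2 + b**4)
```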
Outline of Reliability Theory of Dynamic Loaded Structures (cont.) Calculation of Out-Crossing Frequencies Approximations to the Failure Probability. Poisson Approximation. Upper Bound Solution. Approximation
More informationLecture Notes 7 Stationary Random Processes. Strict-Sense and Wide-Sense Stationarity. Autocorrelation Function of a Stationary Process
Lecture Notes 7 Stationary Random Processes Strict-Sense and Wide-Sense Stationarity Autocorrelation Function of a Stationary Process Power Spectral Density Continuity and Integration of Random Processes
More informationSignals and Spectra - Review
Signals and Spectra - Review SIGNALS DETERMINISTIC No uncertainty w.r.t. the value of a signal at any time Modeled by mathematical epressions RANDOM some degree of uncertainty before the signal occurs
More information3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE
3. ESTIMATION OF SIGNALS USING A LEAST SQUARES TECHNIQUE 3.0 INTRODUCTION The purpose of this chapter is to introduce estimators shortly. More elaborated courses on System Identification, which are given
More informationENSC327 Communications Systems 2: Fourier Representations. Jie Liang School of Engineering Science Simon Fraser University
ENSC327 Communications Systems 2: Fourier Representations Jie Liang School of Engineering Science Simon Fraser University 1 Outline Chap 2.1 2.5: Signal Classifications Fourier Transform Dirac Delta Function
More informationMATTHIAS P ATZOLD, MEMBER, IEEE, ULRICH KILLAT, MEMBER, IEEE, FRANK LAUE, YINGCHUN LI. Technical University of Hamburg-Harburg.
On the Statistical Properties of Deterministic Simulation Models for Mobile Fading Channels MATTHIAS P ATZOLD, MEMBER, IEEE, ULRICH KILLAT, MEMBER, IEEE, FRANK LAUE, YINGCHUN LI Technical University of
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More informationSTOCHASTIC PROBABILITY THEORY PROCESSES. Universities Press. Y Mallikarjuna Reddy EDITION
PROBABILITY THEORY STOCHASTIC PROCESSES FOURTH EDITION Y Mallikarjuna Reddy Department of Electronics and Communication Engineering Vasireddy Venkatadri Institute of Technology, Guntur, A.R < Universities
More informationLecture 15. Theory of random processes Part III: Poisson random processes. Harrison H. Barrett University of Arizona
Lecture 15 Theory of random processes Part III: Poisson random processes Harrison H. Barrett University of Arizona 1 OUTLINE Poisson and independence Poisson and rarity; binomial selection Poisson point
More informationQ. 1 Q. 25 carry one mark each.
GATE 5 SET- ELECTRONICS AND COMMUNICATION ENGINEERING - EC Q. Q. 5 carry one mark each. Q. The bilateral Laplace transform of a function is if a t b f() t = otherwise (A) a b s (B) s e ( a b) s (C) e as
More information5.9 Power Spectral Density Gaussian Process 5.10 Noise 5.11 Narrowband Noise
Chapter 5 Random Variables and Processes Wireless Information Transmission System Lab. Institute of Communications Engineering g National Sun Yat-sen University Table of Contents 5.1 Introduction 5. Probability
More informationMultivariate random variables
Multivariate random variables DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Joint distributions Tool to characterize several
More informationProbability and Statistics
Probability and Statistics 1 Contents some stochastic processes Stationary Stochastic Processes 2 4. Some Stochastic Processes 4.1 Bernoulli process 4.2 Binomial process 4.3 Sine wave process 4.4 Random-telegraph
More information1 Review of di erential calculus
Review of di erential calculus This chapter presents the main elements of di erential calculus needed in probability theory. Often, students taking a course on probability theory have problems with concepts
More informationThis is a Gaussian probability centered around m = 0 (the most probable and mean position is the origin) and the mean square displacement m 2 = n,or
Physics 7b: Statistical Mechanics Brownian Motion Brownian motion is the motion of a particle due to the buffeting by the molecules in a gas or liquid. The particle must be small enough that the effects
More informationFourier Series and Transforms. Revision Lecture
E. (5-6) : / 3 Periodic signals can be written as a sum of sine and cosine waves: u(t) u(t) = a + n= (a ncosπnft+b n sinπnft) T = + T/3 T/ T +.65sin(πFt) -.6sin(πFt) +.6sin(πFt) + -.3cos(πFt) + T/ Fundamental
More information04. Random Variables: Concepts
University of Rhode Island DigitalCommons@URI Nonequilibrium Statistical Physics Physics Course Materials 215 4. Random Variables: Concepts Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative
More informationRandom Process. Random Process. Random Process. Introduction to Random Processes
Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,
More information2A1H Time-Frequency Analysis II
2AH Time-Frequency Analysis II Bugs/queries to david.murray@eng.ox.ac.uk HT 209 For any corrections see the course page DW Murray at www.robots.ox.ac.uk/ dwm/courses/2tf. (a) A signal g(t) with period
More informationRandom Variables and Probability Distributions
CHAPTER Random Variables and Probability Distributions Random Variables Suppose that to each point of a sample space we assign a number. We then have a function defined on the sample space. This function
More information16.584: Random (Stochastic) Processes
1 16.584: Random (Stochastic) Processes X(t): X : RV : Continuous function of the independent variable t (time, space etc.) Random process : Collection of X(t, ζ) : Indexed on another independent variable
More informationENSC327 Communications Systems 19: Random Processes. Jie Liang School of Engineering Science Simon Fraser University
ENSC327 Communications Systems 19: Random Processes Jie Liang School of Engineering Science Simon Fraser University 1 Outline Random processes Stationary random processes Autocorrelation of random processes
More informationEE 438 Essential Definitions and Relations
May 2004 EE 438 Essential Definitions and Relations CT Metrics. Energy E x = x(t) 2 dt 2. Power P x = lim T 2T T / 2 T / 2 x(t) 2 dt 3. root mean squared value x rms = P x 4. Area A x = x(t) dt 5. Average
More information2 Functions of random variables
2 Functions of random variables A basic statistical model for sample data is a collection of random variables X 1,..., X n. The data are summarised in terms of certain sample statistics, calculated as
More information1 Random Variable: Topics
Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?
More informationWhere are the Fringes? (in a real system) Div. of Amplitude - Wedged Plates. Fringe Localisation Double Slit. Fringe Localisation Grating
Where are the Fringes? (in a real system) Fringe Localisation Double Slit spatial modulation transverse fringes? everywhere or well localised? affected by source properties: coherence, extension Plane
More informationApplied Probability and Stochastic Processes
Applied Probability and Stochastic Processes In Engineering and Physical Sciences MICHEL K. OCHI University of Florida A Wiley-Interscience Publication JOHN WILEY & SONS New York - Chichester Brisbane
More informationSTA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/
STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This
More informationStochastic Processes. A stochastic process is a function of two variables:
Stochastic Processes Stochastic: from Greek stochastikos, proceeding by guesswork, literally, skillful in aiming. A stochastic process is simply a collection of random variables labelled by some parameter:
More informationU U B U P x
Smooth Quantum Hydrodynamic Model Simulation of the Resonant Tunneling Diode Carl L. Gardner and Christian Ringhofer y Department of Mathematics Arizona State University Tempe, AZ 8587-184 Abstract Smooth
More informationENSC327 Communications Systems 2: Fourier Representations. School of Engineering Science Simon Fraser University
ENSC37 Communications Systems : Fourier Representations School o Engineering Science Simon Fraser University Outline Chap..5: Signal Classiications Fourier Transorm Dirac Delta Function Unit Impulse Fourier
More informationCommunications and Signal Processing Spring 2017 MSE Exam
Communications and Signal Processing Spring 2017 MSE Exam Please obtain your Test ID from the following table. You must write your Test ID and name on each of the pages of this exam. A page with missing
More informationChapter 2 Random Variables
Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung
More information