Appendix A PROBABILITY AND RANDOM SIGNALS


Deterministic waveforms are waveforms which can be expressed, at least in principle, as an explicit function of time. At any time $t = t_1$, there is no uncertainty about the value of the waveform. Communication signals, however, are at least partially unpredictable, otherwise the object of communication is defeated. Furthermore, the ever-present agitation of the universe at the atomic level constitutes the noise in communication systems. Such noise waveforms are also unpredictable. Unpredictable waveforms such as a signal $s(t)$ or a noise $n(t)$ are called random processes. While random processes are unpredictable, they can very often be estimated. The estimation of the performance of a random process is generally associated with a certain probability of being correct.

A.1 Probability

Relative Frequency Approach: In an experiment repeated $N$ times, if the event $A$ occurs $m$ times, then $P(A)$, the probability of event $A$, is defined as

$$P(A) = \lim_{N\to\infty} \frac{m}{N} \qquad (A.1)$$

This definition is known as the relative frequency definition. Note that as $N \to \infty$, the limit exists. Thus it follows that

$$0 \le P(A) \le 1 \qquad (A.2)$$

Such experiments (trials) are usually done in the mind rather than actually performed. For instance, an unbiased coin is tossed; a head and a tail are equally likely. Thus we reason that if many trials are performed, the probability of having a head is 0.5.
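The relative frequency definition in Eq. (A.1) is easy to check numerically. The following is a minimal sketch (not part of the original text) that approximates the limit with increasingly large $N$ for the coin-tossing example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Relative frequency m/N of the event A = "head" for an unbiased coin,
# approximating the limit in Eq. (A.1) with increasingly large N.
for N in (10, 100, 10_000, 1_000_000):
    tosses = rng.integers(0, 2, size=N)  # 1 = head, 0 = tail
    m = int(tosses.sum())                # number of times A occurs
    print(f"N = {N:>9}: m/N = {m / N:.4f}")
# m/N settles near 0.5 as N grows, as the relative frequency definition asserts.
```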

(The relative frequency approach to probability is not rigorous, since there remain several undefined factors, e.g. how large must $N$ be in order that Eq. (A.1) is true?) Thus we use the axiomatic approach for rigour: the probability of an event $A$ is a number assigned to this event such that

$$P(A) \ge 0$$
$$P(S) = 1, \quad \text{where } S \text{ is made up of all mutually exclusive and exhaustive events}$$
$$P(A + B) = P(A) + P(B) \quad \text{if } A \text{ and } B \text{ are mutually exclusive}$$

A.2 Mutually Exclusive Events

Two possible outcomes of an experiment are mutually exclusive if the occurrence of one precludes the occurrence of the other. If the events $A_1$ and $A_2$ have probabilities $P(A_1)$ and $P(A_2)$ respectively, and if $A_1$ and $A_2$ are mutually exclusive, then

$$P(A_1 + A_2) = P(A_1) + P(A_2) \qquad (A.3)$$

Extending to $n$ mutually exclusive outcomes,

$$P(A_1 + A_2 + \cdots + A_n) = \sum_{i=1}^{n} P(A_i) \qquad (A.4)$$

If $M$ is the total number of outcomes, then

$$\sum_{i=1}^{M} P(A_i) = 1 \qquad (A.5)$$

A.3 Joint Probability, Conditional Probability and Independent Events

If an experiment has two sets of outcomes $A_1, A_2, \ldots$ and $B_1, B_2, \ldots$, then the probability of the joint occurrence of $A_i$ and $B_j$ is written as $P(A_iB_j)$ or simply $P(A_i, B_j)$. If the outcome $B$ is influenced by the outcome $A$, then we have to introduce the concept

of conditional probability, denoted by $P(B_j|A_i)$, i.e. the probability of $B_j$ given that the outcome of $A$ is $A_i$. Suppose the experiment is performed $N$ times, out of which $N_i$ is the number of times $A_i$ occurs with or without $B_j$, $N_j$ the number of times $B_j$ occurs with or without $A_i$, and $N_{ij}$ the number of times $A_i$ and $B_j$ jointly occur. Then

$$P(B_j|A_i) = \frac{N_{ij}}{N_i} = \frac{N_{ij}/N}{N_i/N} = \frac{P(A_i, B_j)}{P(A_i)} \qquad (A.6)$$

Similarly, since $N_{ij} = N_{ji}$, we have

$$P(A_i|B_j) = \frac{N_{ji}}{N_j} = \frac{P(A_i, B_j)}{P(B_j)} \qquad (A.7)$$

So that

$$P(A_i, B_j) = P(B_j|A_i)P(A_i) = P(A_i|B_j)P(B_j) \qquad (A.8)$$

Thus

$$P(A_i|B_j) = \frac{P(A_i)}{P(B_j)}\,P(B_j|A_i) \qquad (A.9)$$

which is known as Bayes' theorem. If the outcome of $B$ is not influenced by the outcome of $A$, then $A$ and $B$ are independent, for which

$$P(A_i|B_j) = P(A_i), \qquad P(B_j|A_i) = P(B_j) \qquad (A.10)$$

and

$$P(A_i, B_j) = P(A_i)P(B_j) \qquad (A.11)$$

Extending to any arbitrary number of independent outcomes,

$$P(A_i, B_j, C_k, \ldots) = P(A_i)P(B_j)P(C_k)\cdots \qquad (A.12)$$
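A small numeric check of Eq. (A.9) may help. The sketch below (not from the original text) uses a hypothetical binary experiment with assumed priors and transition probabilities: $A$ is a transmitted bit, $B$ the received bit.

```python
# Numeric check of Bayes' theorem, Eq. (A.9), for a hypothetical binary
# experiment: A is the transmitted bit, B the received bit.
P_A = {0: 0.6, 1: 0.4}                    # assumed priors P(A_i)
P_B_given_A = {(0, 0): 0.9, (1, 0): 0.1,  # assumed P(B_j | A_i), keyed (B_j, A_i)
               (0, 1): 0.2, (1, 1): 0.8}

# Total probability: P(B_0) = sum_i P(B_0 | A_i) P(A_i)
P_B0 = sum(P_B_given_A[(0, a)] * P_A[a] for a in (0, 1))

# Bayes' theorem: P(A_0 | B_0) = P(A_0) / P(B_0) * P(B_0 | A_0)
P_A0_given_B0 = P_A[0] / P_B0 * P_B_given_A[(0, 0)]
print(P_B0, P_A0_given_B0)  # 0.62 and ~0.871
```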

A.4 Random Variables: Discrete

The set of elements consisting of all possible distinct outcomes of an experiment is called the sample space. The individual elements of the sample space are called sample points, i.e. each sample point corresponds to one distinct outcome of the experiment. It is mathematically attractive to assign a real number to each sample point in the sample space according to some rule. This number is called a random variable. For example, we can assign a $1$ to a head and a $-1$ to a tail in the tossing of a coin. Then the random variable can assume the two discrete values $1$ and $-1$. In general, if there are $n$ outcomes, we assign $n$ discrete values $x_1, x_2, \ldots, x_n$ to these sample points and the random variable can then assume all these $n$ discrete values. (A random variable is actually a function in the conventional sense.) Thus, a random variable $x$ is a real-valued point function $x(u)$ defined over the sample space $S$, where $u \in S$.

The probability assigned to each value of the random variable is denoted by $P_x(x_i)$. Here the subscript $x$ refers to the random variable whereas the argument $x_i$ is the particular value of the random variable. Thus $P_x(x_i)$ is the probability that the random variable $x$ assumes the value $x_i$. Generally, if only one random variable is involved in the discussion, the omission of the subscript causes no confusion, hence the probability is denoted by $P(x_i)$. If there are $n$ mutually exclusive outcomes, then

$$\sum_{i=1}^{n} P_x(x_i) = 1 \qquad (A.13)$$

If there are two sets of outcomes described by two random variables $x$ and $y$, then

$$\sum_i \sum_j P_{xy}(x_i, y_j) = 1 \qquad (A.14)$$

as, for example, in throwing two dice, where $x$ represents the outcomes of the first die and $y$ those of the second. If $x$ and $y$ are two random variables, then the conditional probability of $x = x_i$ given $y = y_j$ is denoted by $P_x(x_i|y = y_j)$. For a given value $y_j$ of $y$, $x$ must assume one of the $n$ values $x_1, x_2, \ldots, x_n$. Hence

$$\sum_{i=1}^{n} P_x(x_i|y = y_j) = 1 \qquad (A.15)$$

Similarly,

$$\sum_{j=1}^{n} P_y(y_j|x = x_i) = 1 \qquad (A.16)$$

Also,

$$P_x(x_i|y = y_j) = \frac{P_{xy}(x_i, y_j)}{P_y(y_j)} \qquad (A.17)$$

and

$$P_y(y_j|x = x_i) = \frac{P_{xy}(x_i, y_j)}{P_x(x_i)} \qquad (A.18)$$

Thus Bayes' rule is

$$P_x(x_i|y = y_j) = \frac{P_y(y_j|x = x_i)\,P_x(x_i)}{P_y(y_j)} \qquad (A.19)$$

A.5 Random Variables: Continuous

If the random variable can assume an uncountably infinite set of values, then the sample space is continuous and the random variable is thus continuous. For example, the temperature $T$ of a room defines a continuous sample space. In this case, a more meaningful measure is the probability of observing the temperature in some small interval $\Delta T$. Hence we inquire about the probability of observing a random variable $x$ below some value $x_1$. This probability, denoted by $F_x(x_1)$, is defined as

$$F_x(x_1) = \text{probability}(x \le x_1) \qquad (A.20)$$

Thus

$$F_x(-\infty) = 0, \qquad F_x(\infty) = 1 \qquad (A.21)$$

$F_x(x)$ is called the cumulative distribution function of $x$. Note that for $x_2 > x_1$,

$$F_x(x_2) \ge F_x(x_1) \qquad (A.22)$$

Probability Density Function

In Eq. (A.22), if $x_2 > x_1$, the outcomes that $x \le x_1$ and that $x_1 < x \le x_2$ are mutually exclusive. Hence

$$\text{probability}(x \le x_2) = \text{probability}(x \le x_1) + \text{probability}(x_1 < x \le x_2)$$

Hence

$$F_x(x + \Delta x) = F_x(x) + \text{probability}(x < x' \le x + \Delta x) \qquad (A.23)$$

If $\Delta x \to 0$, then by Taylor's expansion

$$F_x(x + \Delta x) = F_x(x) + \frac{dF_x(x)}{dx}\,\Delta x + \cdots \qquad (A.24)$$

Comparing Eqs. (A.23) and (A.24),

$$\lim_{\Delta x \to 0} \text{probability}(x < x' \le x + \Delta x) = \frac{dF_x(x)}{dx}\,\Delta x \qquad (A.25)$$

The derivative of $F_x(x)$ with respect to $x$ is denoted by $p_x(x)$, i.e.

$$p_x(x) = \frac{dF_x(x)}{dx} \qquad (A.26)$$

$p_x(x)$ is called the probability density function of the random variable $x$. From Eq. (A.26),

$$F_x(x) = \int_{-\infty}^{x} p_x(x')\,dx' \qquad (A.27)$$

where, of course, $F_x(-\infty) = 0$. Also,

$$\text{probability}(x_1 < x \le x_2) = F_x(x_2) - F_x(x_1) = \int_{-\infty}^{x_2} p_x(x)\,dx - \int_{-\infty}^{x_1} p_x(x)\,dx = \int_{x_1}^{x_2} p_x(x)\,dx \qquad (A.28)$$

Note that

$$\int_{-\infty}^{\infty} p_x(x)\,dx = 1 \qquad (A.29)$$

and

$$p_x(x) \ge 0 \qquad (A.30)$$

For a discrete random variable, the probability density function may be considered as a limiting case of the continuous variable, i.e. the probability density function is concentrated as impulses at some discrete points. Thus if a discrete random variable assumes values $x_1, x_2, \ldots, x_n$ with probabilities $a_1, a_2, \ldots, a_n$, then the probability density function is

$$p_x(x) = \sum_{r=1}^{n} a_r\,\delta(x - x_r) \qquad (A.31)$$

Observe that since $\int_{-\infty}^{\infty} p_x(x)\,dx = 1$, then

$$\sum_{r=1}^{n} a_r = \int_{-\infty}^{\infty} \sum_{r=1}^{n} a_r\,\delta(x - x_r)\,dx = 1 \qquad (A.32)$$

It is possible to have a probability density function that is both continuous and discrete at some points.

Example A.1. A signal may have a probability density function as shown in Fig. A.1(a). If such a signal is passed through a limiter which clips all voltage levels greater than $A$, the new probability density function will be as shown in Fig. A.1(b), where the impulse appearing at $x = A$ has strength

$$K = \int_{A}^{\infty} p_x(x)\,dx$$

Similarly, if the limiter is used to clip both positive as well as negative amplitudes above $|A|$, the new probability density function will appear as in Fig. A.1(c), where the strength of the impulse at $x = -A$ is given by

$$K_1 = \int_{-\infty}^{-A} p_x(x)\,dx$$

Figure A.1: (a) the probability density $p_x(x)$ of the original signal; (b) the density after clipping levels above $A$, with an impulse at $x = A$; (c) the density after clipping both polarities at $\pm A$, with impulses at $x = \pm A$.

Thus, for Fig. A.1(c), the probability of observing any particular voltage amplitude $x$ with $-A < x < A$ is zero, while the probability of observing the voltage amplitude $A$ is $K$ and that of observing the voltage amplitude $-A$ is $K_1$.
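As a rough numerical check of Example A.1 (with an assumed zero-mean, unit-variance Gaussian input density, which the original figure does not specify), the impulse strength $K$ can be computed by integration and compared against the fraction of samples a limiter actually clips:

```python
import numpy as np
from scipy.integrate import quad

# Assumed input density: zero-mean Gaussian with sigma = 1 (hypothetical choice).
sigma, A = 1.0, 1.5
p = lambda x: np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

# Strength of the impulse at x = A: K = integral of p(x) from A to infinity.
K, _ = quad(p, A, np.inf)

# Monte Carlo check: fraction of samples clipped by the limiter.
rng = np.random.default_rng(1)
x = rng.normal(0.0, sigma, size=1_000_000)
print(K, np.mean(x > A))  # both ~0.0668
```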

A.6 Joint Distribution

We can extend the concept of probability density functions to the outcomes of two (or more) random variables which may or may not be independent of each other. Thus, for two random variables $x$ and $y$, the probability that $x_1 \le x \le x_1 + dx$ while $y_1 \le y \le y_1 + dy$ is given by

$$\text{probability}(x_1 \le x \le x_1 + dx,\; y_1 \le y \le y_1 + dy) = p_{xy}(x_1, y_1)\,dx\,dy \qquad (A.33)$$

where $p_{xy}(x, y)$ is the joint probability density function. Extending Eq. (A.33) to a finite interval,

$$\text{probability}(x_1 \le x \le x_2,\; y_1 \le y \le y_2) = \int_{y_1}^{y_2}\!\!\int_{x_1}^{x_2} p_{xy}(x, y)\,dx\,dy \qquad (A.34)$$

The cumulative distribution function is

$$F_{xy}(x, y) = \text{probability}(x' \le x,\; y' \le y) = \int_{-\infty}^{y}\!\!\int_{-\infty}^{x} p_{xy}(x', y')\,dx'\,dy' \qquad (A.35)$$

Obviously,

$$\int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} p_{xy}(x, y)\,dx\,dy = 1 \qquad (A.36)$$

From Eq. (A.35), we have

$$p_{xy}(x, y) = \frac{\partial^2}{\partial x\,\partial y}\,F_{xy}(x, y) \qquad (A.37)$$

The individual probability densities $p_x(x)$ and $p_y(y)$, usually called the marginal densities, can be obtained from the joint density $p_{xy}(x, y)$. To obtain these densities, note that as $\Delta x \to 0$,

$$p_x(x)\,\Delta x = \text{probability}(x < x' \le x + \Delta x,\; -\infty < y \le \infty) = \int_x^{x+\Delta x}\left[\int_{-\infty}^{\infty} p_{xy}(x', y)\,dy\right]dx'$$

regardless of where $y$ lies. Since $\Delta x \to 0$, the integral inside the brackets can be treated as a constant over the range $(x, x + \Delta x)$, so the outer integration contributes only a factor $\Delta x$. Hence

$$p_x(x) = \int_{-\infty}^{\infty} p_{xy}(x, y)\,dy \qquad (A.38)$$

Similarly,

$$p_y(y) = \int_{-\infty}^{\infty} p_{xy}(x, y)\,dx \qquad (A.39)$$

Thus,

$$F_x(x) = \text{probability}(x' \le x,\; -\infty < y \le \infty) = \int_{-\infty}^{x} p_x(x')\,dx' = \int_{-\infty}^{x}\!\!\int_{-\infty}^{\infty} p_{xy}(x', y)\,dy\,dx' \qquad (A.40)$$

and

$$F_y(y) = \int_{-\infty}^{y} p_y(y')\,dy' = \int_{-\infty}^{y}\!\!\int_{-\infty}^{\infty} p_{xy}(x, y')\,dx\,dy' \qquad (A.41)$$

If $x$ and $y$ are independent, then

$$\text{probability}(x_1 \le x \le x_2,\; y_1 \le y \le y_2) = \left[\int_{x_1}^{x_2} p_x(x)\,dx\right]\left[\int_{y_1}^{y_2} p_y(y)\,dy\right] \qquad (A.42)$$

If there is no confusion, the density functions $p_x(x)$, $p_y(y)$ can simply be written as $p(x)$ and $p(y)$.

A.7 Conditional Densities

Extending the idea of conditional probability to continuous variables, we define the conditional probability density $p_x(x|y = y_1)$ as the probability density of $x$ given that $y$ has the value $y_1$. The probability density $p_x(x|y = y_1)$ is proportional to the intersection of the plane $y = y_1$ with the joint probability density function $p_{xy}(x, y)$, as shown in Fig. A.2. Similarly, $p_y(y|x = x_1)$ is proportional to the intersection of the plane $x = x_1$ with $p_{xy}(x, y)$.

Figure A.2: The joint density $p_{xy}(x, y)$ and its intersection with the plane $y = y_1$.

From the definition,

$$\text{prob}(x \le x_1 \mid y < y' \le y + \Delta y) = \frac{\text{prob}(x \le x_1,\; y < y' \le y + \Delta y)}{\text{prob}(y < y' \le y + \Delta y)} = \frac{\int_{-\infty}^{x_1}\!\int_{y}^{y+\Delta y} p_{xy}(x, y')\,dy'\,dx}{\int_{y}^{y+\Delta y} p_y(y')\,dy'}$$

As $\Delta y \to 0$, we have

$$\lim_{\Delta y \to 0} \text{prob}(x \le x_1 \mid y < y' \le y + \Delta y) = \frac{\Delta y \int_{-\infty}^{x_1} p_{xy}(x, y)\,dx}{\Delta y\,p_y(y)} = \frac{\int_{-\infty}^{x_1} p_{xy}(x, y)\,dx}{p_y(y)}$$

In this limit the left-hand side becomes $\text{prob}(x \le x_1 \mid y = y) = F_x(x_1 \mid y = y)$. Hence

$$F_x(x_1 \mid y = y) = \int_{-\infty}^{x_1} \frac{p(x, y)}{p(y)}\,dx \qquad (A.43)$$

Now,

$$p_x(x \mid y = y) = \frac{d}{dx}\,F_x(x \mid y = y) \qquad (A.44)$$

Thus

$$p_x(x \mid y = y) = \frac{p(x, y)}{p(y)} \qquad (A.45)$$

Similarly,

$$p_y(y \mid x = x) = \frac{p(x, y)}{p(x)} \qquad (A.46)$$

From Eq. (A.46), $p(x, y) = p_y(y \mid x = x)\,p(x)$. Substituting this in Eq. (A.45),

$$p_x(x \mid y = y) = \frac{p_y(y \mid x = x)\,p(x)}{p(y)}, \qquad p_y(y \mid x = x) = \frac{p_x(x \mid y = y)\,p(y)}{p(x)} \qquad (A.47)$$

Eqs. (A.47) are Bayes' rule for continuous random variables. Note that from Eqs. (A.45) and (A.46) we also obtain

$$p_x(x \mid y = y) = \frac{p(x, y)}{\int_{-\infty}^{\infty} p(x, y)\,dx}, \qquad p_y(y \mid x = x) = \frac{p(x, y)}{\int_{-\infty}^{\infty} p(x, y)\,dy} \qquad (A.48)$$

Note also that $\int_{-\infty}^{\infty} p_x(x \mid y = y)\,dx = 1$ and $\int_{-\infty}^{\infty} p_y(y \mid x = x)\,dy = 1$, i.e. observing $x$ in the interval $(-\infty, \infty)$ given that $y = y$ is a certainty.

The continuous random variables $x$ and $y$ are said to be independent if

$$p_x(x \mid y = y) = p(x) \qquad (A.49)$$
$$p_y(y \mid x = x) = p(y) \qquad (A.50)$$

Hence

$$p_{xy}(x, y) = p(x)\,p(y) \qquad (A.51)$$

A.8 Statistical Average (Mean)

If a random variable $x$ can assume $n$ values $x_1, \ldots, x_n$, and if the experiment is repeated $N$ times ($N \to \infty$), while $m_1, \ldots, m_n$ are the numbers of trials favorable to outcomes $x_1, \ldots, x_n$ respectively, then the mean value $\bar{x}$ of $x$ is

$$\bar{x} = \frac{1}{N}(m_1 x_1 + \cdots + m_n x_n) = \frac{m_1}{N}x_1 + \cdots + \frac{m_n}{N}x_n$$

As $N \to \infty$, we have

$$\bar{x} = \sum_{i=1}^{n} x_i\,P_x(x_i) \qquad (A.52)$$

where $P_x(x_i)$ is the probability of $x$ assuming the value $x_i$. The mean value is also called the expected value, denoted by $E[x]$. Thus

$$\bar{x} = E[x] = \sum_i x_i\,P_x(x_i), \quad \text{and if } x \text{ is continuous,} \quad \bar{x} = E[x] = \int_{-\infty}^{\infty} x\,p(x)\,dx \qquad (A.53)$$

Often, it is desired to find the mean value of a certain function of a random variable. For example, suppose we have a noise whose amplitude is represented by a random variable $x$. In practice we are more interested in the mean square value of the signal, i.e. we desire $E[x^2]$. In general, if $y = g(x)$, by definition

$$E[y] = \int_{-\infty}^{\infty} y\,p_y(y)\,dy \qquad (A.54)$$

But the probability of $y$ falling between $y$ and $y + \Delta y$ is $p_y(y)\,dy$, and this is caused by $x$ falling between $x$ and $x + \Delta x$, the probability of which is $p_x(x)\,dx$. Thus $p_y(y)\,dy = p_x(x)\,dx$, and

$$E[y] = \int_{-\infty}^{\infty} g(x)\,p_x(x)\,dx \qquad (A.55)$$

If $z$ is a function of $x$ and $y$ such that $z = \varphi(x, y)$, then

$$E[z] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} \varphi(x, y)\,p(x, y)\,dx\,dy \qquad (A.56)$$
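Eq. (A.55) can be exercised numerically. The sketch below (an illustration, not from the text) takes the hypothetical choice $g(x) = x^2$ with a Gaussian density of assumed mean $m_x = 2$ and $\sigma = 3$:

```python
import numpy as np
from scipy.integrate import quad

# E[g(x)] = integral of g(x) p_x(x) dx, Eq. (A.55), with the hypothetical
# choice g(x) = x^2 and a Gaussian density with m_x = 2, sigma = 3.
m, sigma = 2.0, 3.0
p = lambda x: np.exp(-(x - m)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

mean_square, _ = quad(lambda x: x**2 * p(x), -np.inf, np.inf)
print(mean_square)  # 13.0 = sigma^2 + m^2

# Monte Carlo check from samples of x directly:
rng = np.random.default_rng(6)
print(np.mean(rng.normal(m, sigma, size=1_000_000)**2))  # ~13.0
```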

Example A.2. If $z = xy$ and if $x$ and $y$ are independent, then $\bar{z} = \bar{x}\,\bar{y}$.

Example A.3. If $z = x + y$, with $x$ and $y$ independent, then $\overline{z^2} = \overline{x^2} + \overline{y^2} + 2\bar{x}\,\bar{y}$.

A.9 Moments

The $n$th moment of a random variable $x$ is defined as

$$E[x^n] = \int_{-\infty}^{\infty} x^n\,p(x)\,dx \qquad (A.57)$$

The $n$th central moment of $x$ is its $n$th moment about its mean $m_x$, thus

$$E[(x - m_x)^n] = \int_{-\infty}^{\infty} (x - m_x)^n\,p(x)\,dx \qquad (A.58)$$

The variance $\sigma_x^2$ is the second central moment of $x$, i.e.

$$\sigma_x^2 = E[(x - m_x)^2] = E[x^2 - 2m_x x + m_x^2] = E[x^2] - 2m_x E[x] + m_x^2 = E[x^2] - 2m_x^2 + m_x^2 = \overline{x^2} - m_x^2 \qquad (A.59)$$

If $x$ and $y$ are independent and if $z = x + y$, then

$$\sigma_z^2 = \overline{(x+y)^2} - \left(\overline{x+y}\right)^2 = \overline{x^2 + 2xy + y^2} - (\bar{x} + \bar{y})^2 = \left(\overline{x^2} - \bar{x}^2\right) + \left(\overline{y^2} - \bar{y}^2\right) = \sigma_x^2 + \sigma_y^2 \qquad (A.60)$$

where $\overline{xy} = \bar{x}\,\bar{y}$ since $x$ and $y$ are independent.

A.10 Some Useful Probability Distributions

A.10.1 The Gaussian Probability Density

Also called the normal probability density function, it is very important because many naturally occurring experiments are characterized by random variables with a Gaussian density. The Gaussian density is defined as

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left[-\frac{(x - m_x)^2}{2\sigma^2}\right] \qquad (A.61)$$

Example A.4. Show that for a Gaussian density,

$$\int_{-\infty}^{\infty} p(x)\,dx = 1, \qquad \bar{x} = m_x, \qquad \overline{x^2} = \sigma^2 + m_x^2 \qquad (A.62)$$

A.10.2 The Rayleigh Probability Density

Consider a complex phasor $re^{j\theta}$ which may be written as

$$re^{j\theta} = n_c + jn_s \qquad (A.63)$$

where

$$n_c = \sum_k a_k\cos\theta_k, \qquad n_s = \sum_k a_k\sin\theta_k \qquad (A.64)$$

If the $\theta_k$ are all independent and uniformly distributed random variables, and if the $a_k$ are random variables of similar magnitudes, then the central limit theorem ensures that $n_c$ and $n_s$ are Gaussian random variables, i.e.

$$p_{n_c}(n_c) = \frac{1}{\sqrt{2\pi\sigma_{n_c}^2}}\,e^{-n_c^2/2\sigma_{n_c}^2}, \qquad p_{n_s}(n_s) = \frac{1}{\sqrt{2\pi\sigma_{n_s}^2}}\,e^{-n_s^2/2\sigma_{n_s}^2} \qquad (A.65)$$

where

$$\sigma_{n_c}^2 = \sigma_{n_s}^2 = \sum_k \frac{\sigma_{a_k}^2}{2} \equiv \sigma^2$$

Here, $\sigma_{a_k}^2$ is the variance of the random variable $a_k$, and $n_c$ and $n_s$ are independent. The joint density is given by

$$p_{n_c n_s}(n_c, n_s) = p_{n_c}(n_c)\,p_{n_s}(n_s) = \frac{1}{2\pi\sigma^2}\,e^{-(n_c^2 + n_s^2)/2\sigma^2} = \frac{1}{2\pi\sigma^2}\,e^{-r^2/2\sigma^2} \qquad (A.66)$$

where $r^2 = n_c^2 + n_s^2$. Transforming differential areas such that $p_{n_c n_s}(n_c, n_s)\,dn_c\,dn_s = p_{r\theta}(r, \theta)\,dr\,d\theta$ with $dn_c\,dn_s = r\,dr\,d\theta$, we have

$$p_{n_c n_s}(n_c, n_s)\,dn_c\,dn_s = \frac{1}{2\pi\sigma^2}\,e^{-r^2/2\sigma^2}\,r\,dr\,d\theta = p_{r\theta}(r, \theta)\,dr\,d\theta$$

Thus

$$p_{r\theta}(r, \theta) = \left(\frac{r}{\sigma^2}\,e^{-r^2/2\sigma^2}\right)\left(\frac{1}{2\pi}\right) = p_r(r)\,p_\theta(\theta) \qquad (A.67)$$

The last step in (A.67) is arrived at because we note that the expression for $p_{r\theta}(r, \theta)$ is free of $\theta$, indicating that $r$ and $\theta$ are independent and that the probability density function $p_\theta(\theta)$ must be a constant. Therefore, we conclude that

$$p_r(r) = \frac{r}{\sigma^2}\exp\left[-\frac{r^2}{2\sigma^2}\right], \quad r \ge 0; \qquad p_r(r) = 0 \text{ elsewhere} \qquad (A.68)$$

and

$$p_\theta(\theta) = \frac{1}{2\pi} \qquad (A.69)$$

$p_r(r)$ is called the Rayleigh distribution. For a random variable $r$ having a Rayleigh density, the mean value of $r$ is

$$E[r] = \frac{1}{\sigma^2}\int_0^\infty r^2\,e^{-r^2/2\sigma^2}\,dr = \sigma\sqrt{\frac{\pi}{2}} \qquad (A.70)$$

The mean square value of $r$ is

$$E[r^2] = \frac{1}{\sigma^2}\int_0^\infty r^3\,e^{-r^2/2\sigma^2}\,dr = 2\sigma^2 \qquad (A.71)$$

The variance $\sigma_r^2$ is

$$\sigma_r^2 = E[r^2] - (E[r])^2 = \left(2 - \frac{\pi}{2}\right)\sigma^2 \qquad (A.72)$$
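The Rayleigh moments just derived are easy to confirm by simulation. A brief sketch (not part of the original text), generating the envelope of two independent zero-mean Gaussians with an assumed $\sigma = 1$:

```python
import numpy as np

# The envelope r = sqrt(n_c^2 + n_s^2) of two independent zero-mean
# Gaussians is Rayleigh distributed, Eq. (A.68).
rng = np.random.default_rng(2)
sigma = 1.0
n_c = rng.normal(0.0, sigma, size=1_000_000)
n_s = rng.normal(0.0, sigma, size=1_000_000)
r = np.hypot(n_c, n_s)

# Compare sample moments with Eqs. (A.70)-(A.72):
print(r.mean(), sigma * np.sqrt(np.pi / 2))     # E[r]   = sigma*sqrt(pi/2)
print((r**2).mean(), 2 * sigma**2)              # E[r^2] = 2 sigma^2
print(r.var(), (2 - np.pi / 2) * sigma**2)      # var    = (2 - pi/2) sigma^2
```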

A.10.3 The Rician Probability Density

The Rayleigh distribution is derived from the fact that no one of the $n_c$ and $n_s$ components in (A.64) predominates. In some cases, it is found that one particular signal does dominate. This dominating sinusoidal wave is called the specular component. In this case, we assume the received signal has a specular component $A\cos\omega_c t$ added to the previous sum of random terms. Hence

$$s(t) = (n_c + A)\cos\omega_c t - n_s\sin\omega_c t \qquad (A.73)$$

where $\sigma_{n_c}^2 = \sigma_{n_s}^2 = \sigma^2$.

Considering the term $(n_c + A)$ alone, we note that the sum represents a Gaussian variable with $A$ being the average value and $\sigma^2$ the variance; letting $n_c' = n_c + A$, we have

$$p_{n_c'}(n_c') = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(n_c' - A)^2/2\sigma^2} \qquad (A.74)$$

Now,

$$r^2 = n_c'^2 + n_s^2 = (n_c + A)^2 + n_s^2, \qquad \theta = \tan^{-1}\frac{n_s}{n_c'} = \tan^{-1}\frac{n_s}{n_c + A} \qquad (A.75)$$

With $n_c' = r\cos\theta$ and $n_s = r\sin\theta$, we have

$$p_{r\theta}(r, \theta)\,dr\,d\theta = p_{n_c' n_s}(n_c', n_s)\,dn_c'\,dn_s = \frac{1}{2\pi\sigma^2}\,e^{-\{(n_c' - A)^2 + n_s^2\}/2\sigma^2}\,dn_c'\,dn_s = \frac{1}{2\pi\sigma^2}\,e^{-A^2/2\sigma^2}\,r\,e^{-(r^2 - 2rA\cos\theta)/2\sigma^2}\,dr\,d\theta \qquad (A.76)$$

Now, $p_{r\theta}(r, \theta)$ cannot be written as a product $p_r(r)\,p_\theta(\theta)$, since a term in the equation contains both variables multiplied together as $rA\cos\theta$. This indicates that $r$ and $\theta$ are dependent variables. $p_r(r)$ can be found by integrating over all values of $\theta$, i.e.

$$p_r(r) = \frac{1}{2\pi\sigma^2}\,e^{-A^2/2\sigma^2}\,r\,e^{-r^2/2\sigma^2}\int_0^{2\pi} e^{rA\cos\theta/\sigma^2}\,d\theta \qquad (A.77)$$

The integral cannot be evaluated in terms of elementary functions. The integral

$$I_0(z) \triangleq \frac{1}{2\pi}\int_0^{2\pi} e^{z\cos\theta}\,d\theta \qquad (A.78)$$

is called the modified Bessel function of the first kind and zero order. Thus

$$p_r(r) = \frac{r}{\sigma^2}\,e^{-(r^2 + A^2)/2\sigma^2}\,I_0\!\left(\frac{rA}{\sigma^2}\right) \qquad (A.79)$$

and is called the Rician distribution. The Rician distribution is plotted in Fig. A.3.

Figure A.3: The Rician probability density.

A.11 The Error Function

For a Gaussian probability density with $m_x = 0$, the cumulative distribution is given by

$$\text{prob}(x \le x_1) = F(x_1) = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{x_1} e^{-x^2/2\sigma^2}\,dx \qquad (A.80)$$

This integral cannot be easily evaluated, but it is directly related to the error function, which is thoroughly tabulated. The error function of $u$ is defined as

$$\operatorname{erf}(u) \triangleq \frac{2}{\sqrt{\pi}}\int_0^u e^{-x^2}\,dx \qquad (A.81)$$

Note that $\operatorname{erf}(0) = 0$ and $\operatorname{erf}(\infty) = 1$. The complementary error function $\operatorname{erfc}(u)$ is defined as

$$\operatorname{erfc}(u) = 1 - \operatorname{erf}(u) = \frac{2}{\sqrt{\pi}}\int_u^\infty e^{-x^2}\,dx \qquad (A.82)$$

To express $F(x_1)$ in Eq. (A.80) in terms of $\operatorname{erf}(u)$ and $\operatorname{erfc}(u)$, we have

$$F(x_1) = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{x_1} e^{-x^2/2\sigma^2}\,dx = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{\infty} e^{-x^2/2\sigma^2}\,dx - \frac{1}{\sqrt{2\pi\sigma^2}}\int_{x_1}^{\infty} e^{-x^2/2\sigma^2}\,dx$$
$$= 1 - \frac{1}{\sqrt{\pi}}\int_{x_1/\sigma\sqrt{2}}^{\infty} e^{-u^2}\,du \qquad \text{where } u = \frac{x}{\sigma\sqrt{2}}$$
$$= 1 - \frac{1}{2}\operatorname{erfc}\!\left(\frac{x_1}{\sigma\sqrt{2}}\right) = \frac{1}{2}\left\{1 + \operatorname{erf}\!\left(\frac{x_1}{\sigma\sqrt{2}}\right)\right\} \qquad (A.83)$$

Since tabulated values are only obtained for positive $u$ in $\operatorname{erfc}(u)$, for $x_1 \le 0$ we use

$$F(x_1) = 1 - F(-x_1) = \frac{1}{\sqrt{2\pi\sigma^2}}\int_{-x_1}^{\infty} e^{-x^2/2\sigma^2}\,dx = \frac{1}{\sqrt{\pi}}\int_{-x_1/\sigma\sqrt{2}}^{\infty} e^{-\zeta^2}\,d\zeta = \frac{1}{2}\operatorname{erfc}\!\left(\frac{-x_1}{\sigma\sqrt{2}}\right) \qquad (A.84)$$
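In practice, Eq. (A.83) maps directly onto library routines rather than tables. A brief sketch (not from the text), using SciPy's erf/erfc and an assumed $\sigma = 2$:

```python
import numpy as np
from scipy.special import erf, erfc

# Eq. (A.83): the zero-mean Gaussian CDF expressed through erf/erfc,
# with an assumed sigma = 2.
sigma = 2.0
x1 = np.array([-3.0, 0.0, 1.0, 3.0])

F = 0.5 * (1 + erf(x1 / (sigma * np.sqrt(2))))       # Eq. (A.83)
F_alt = 1 - 0.5 * erfc(x1 / (sigma * np.sqrt(2)))    # equivalent form
print(F)                      # e.g. F(0) = 0.5
print(np.allclose(F, F_alt))  # True
```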

Example A.5. Consider the transmission of a signal which can assume two values, $0$ and $E$. Due to the noise present during transmission, the signal received, $v$, is no longer $0$ or $E$ but rather

$$v = n \quad \text{when } 0 \text{ is transmitted}$$
$$v = n + E \quad \text{when } E \text{ is transmitted}$$

where $n$ is a random variable representing the noise. We assume the noise has a Gaussian distribution and zero mean. Thus when $0$ is transmitted, the received signal $v$ has a Gaussian distribution centered at $0$ (Fig. A.4(a)). When $E$ is transmitted, the received signal has a Gaussian distribution centered at $E$ (Fig. A.4(b)). A threshold level $a$ has to be set such that if the received signal $v$ is below $a$, we decide that the transmitted signal is $0$; if the received signal $v$ is above $a$, then we decide the transmitted signal is $E$. Thus if $0$ is transmitted and the received signal $v > a$, an error occurs. The probability of this is given by (Fig. A.4(a))

$$\text{prob}(v > a) = \int_a^\infty \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-v^2/2\sigma^2}\,dv = \frac{1}{2}\operatorname{erfc}\!\left(\frac{a}{\sigma\sqrt{2}}\right)$$

Figure A.4: (a) the density of the received signal when $0$ is transmitted; (b) the density when $E$ is transmitted.

If $E$ is transmitted and $v < a$, then an error occurs, the probability of which is (Fig. A.4(b))

$$\text{prob}(v < a) = \int_{-\infty}^{a} \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-(v - E)^2/2\sigma^2}\,dv$$

Letting $u = (E - v)/\sigma\sqrt{2}$,

$$\text{prob}(v < a) = \int_{(E-a)/\sigma\sqrt{2}}^{\infty} \frac{1}{\sqrt{\pi}}\,e^{-u^2}\,du = \frac{1}{2}\operatorname{erfc}\!\left(\frac{E - a}{\sigma\sqrt{2}}\right)$$

The combined probability of error is shown in Fig. A.5; the shaded area constitutes the total probability of error if the decision threshold is chosen to be $a$.

Figure A.5: The two conditional densities of $v$; the shaded area is the total probability of error for threshold $a$.

Obviously, the combined probability of error is a minimum when the decision threshold is chosen to be at the intersection of the two curves, i.e. $a = E/2$.
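A quick numerical sketch of Example A.5 (not from the text, and assuming $0$ and $E$ are transmitted with equal probability, which the text uses implicitly by minimizing the sum of the two shaded areas) confirms that the total error probability is minimized at $a = E/2$:

```python
import numpy as np
from scipy.special import erfc

# Total error probability versus threshold a, assuming equal a priori
# probabilities for the two transmitted levels (hypothetical E and sigma).
E, sigma = 4.0, 1.0
a = np.linspace(0.0, E, 401)

Pe = 0.5 * (0.5 * erfc(a / (sigma * np.sqrt(2)))           # 0 sent, v > a
            + 0.5 * erfc((E - a) / (sigma * np.sqrt(2))))  # E sent, v < a
print(a[np.argmin(Pe)])  # ~2.0 = E/2, the intersection of the two densities
```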

A.12 The Transform of Probability Density Functions

From Eq. (A.54), the mean of a function $g(x)$ of the random variable $x$ is

$$E[g(x)] = \int_{-\infty}^{\infty} g(x)\,p_x(x)\,dx$$

Let $g(x) = e^{j\zeta x}$; thus

$$E[e^{j\zeta x}] = \int_{-\infty}^{\infty} p_x(x)\,e^{j\zeta x}\,dx = P_x(\zeta) \qquad (A.85)$$

Thus, the expected value of $e^{j\zeta x}$ is the Fourier transform of the probability density function. Other transforms of the probability density function are also used:

(a) The characteristic function

$$E[e^{j\zeta x}] = \int_{-\infty}^{\infty} p(x)\,e^{j\zeta x}\,dx = 2\pi\,\mathcal{F}^{-1}[p(x)]$$

(b) The moment generating function

$$E[e^{sx}] = \int_{-\infty}^{\infty} p(x)\,e^{sx}\,dx = 2\pi j\,\mathcal{L}^{-1}[p(x)]$$

Note that

$$p_x(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} P_x(\zeta)\,e^{-j\zeta x}\,d\zeta$$

A.13 Probability Density Function of a Sum of Two Independent Random Variables

Let $x$ and $y$ be two statistically independent random variables, and let

$$z = x + y$$

Let $p_x(x)$, $p_y(y)$, $p_z(z)$ be the probability density functions of $x$, $y$ and $z$ respectively, and $P_x(\zeta)$, $P_y(\zeta)$, $P_z(\zeta)$ the respective Fourier transforms. By definition,

$$P_z(\zeta) = E[e^{j\zeta z}]$$

Thus, by Eq. (A.56),

$$P_z(\zeta) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} p_{xy}(x, y)\,e^{j\zeta(x+y)}\,dx\,dy$$

But $x$ and $y$ are independent, thus the joint probability density function is $p_{xy}(x, y) = p(x)\,p(y)$. Hence, substituting this joint density function into $P_z(\zeta)$, we have

$$P_z(\zeta) = \int_{-\infty}^{\infty} p(x)\,e^{j\zeta x}\,dx \int_{-\infty}^{\infty} p(y)\,e^{j\zeta y}\,dy = P_x(\zeta)\,P_y(\zeta) \qquad (A.86)$$

Using the convolution theorem,

$$p_z(z) = p_x(x) * p_y(y) = \int_{-\infty}^{\infty} p_x(\lambda)\,p_y(z - \lambda)\,d\lambda \qquad (A.87)$$

Extending to $n$ independent random variables having probability density functions $p(x_1), p(x_2), \ldots, p(x_n)$, if $z = x_1 + x_2 + \cdots + x_n$, then

$$p_z(z) = p(x_1) * p(x_2) * \cdots * p(x_n) \qquad (A.88)$$
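Eq. (A.87) can be approximated with a discrete convolution. The sketch below (not from the text) takes the hypothetical case of two unit rectangular densities on $[0, 1)$, whose convolution is the triangular density on $[0, 2]$:

```python
import numpy as np

# Discrete approximation of p_z = p_x * p_y, Eq. (A.87), for two
# hypothetical unit-height rectangular densities on [0, 1).
dx = 0.001
x = np.arange(0.0, 1.0, dx)
p_x = np.ones_like(x)  # uniform density on [0, 1)
p_y = np.ones_like(x)

p_z = np.convolve(p_x, p_y) * dx  # Riemann-sum approximation of the integral
z = np.arange(len(p_z)) * dx
print(p_z.sum() * dx)                   # ~1.0: p_z integrates to one
print(p_z[np.argmin(np.abs(z - 1.0))])  # ~1.0: triangle peak at z = 1
```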

Example A.6. Two independent random variables $x$ and $y$ both have Gaussian probability density functions given by

$$p_x(x) = \frac{1}{\sigma_x\sqrt{2\pi}}\,e^{-(x - m_x)^2/2\sigma_x^2}, \qquad p_y(y) = \frac{1}{\sigma_y\sqrt{2\pi}}\,e^{-(y - m_y)^2/2\sigma_y^2}$$

If $z = x + y$, then $P_z(\zeta) = P_x(\zeta)\,P_y(\zeta)$. But

$$P_x(\zeta) = \frac{1}{\sigma_x\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-(x - m_x)^2/2\sigma_x^2}\,e^{j\zeta x}\,dx = e^{-\sigma_x^2\zeta^2/2}\,e^{j\zeta m_x} \qquad (A.89)$$

and similarly

$$P_y(\zeta) = e^{-\sigma_y^2\zeta^2/2}\,e^{j\zeta m_y}$$

so that

$$P_z(\zeta) = e^{-(\sigma_x^2 + \sigma_y^2)\zeta^2/2}\,e^{j\zeta(m_x + m_y)}$$

Hence

$$p_z(z) = \mathcal{F}[P_z(\zeta)] = \frac{1}{\sqrt{2\pi(\sigma_x^2 + \sigma_y^2)}}\,e^{-[z - (m_x + m_y)]^2/2(\sigma_x^2 + \sigma_y^2)} = \frac{1}{\sigma_z\sqrt{2\pi}}\,e^{-(z - m_z)^2/2\sigma_z^2} \qquad (A.90)$$

where $\sigma_z^2 = \sigma_x^2 + \sigma_y^2$ and $m_z = m_x + m_y$. From Eq. (A.90), we see that the probability density function of a sum of two independent Gaussian random variables is also Gaussian. This result can be extended to any number of normally distributed random variables.

A.14 The Central Limit Theorem

The theorem states that if $p(x_1), p(x_2), \ldots, p(x_n)$ are the probability density functions of $n$ independent random variables, then the probability density function of the sum of these $n$ random variables tends to be Gaussian as $n \to \infty$. The mean and variance of this Gaussian density are respectively the sum of the means and the sum of the variances of the independent variables. The proof of the theorem will be omitted; instead it is demonstrated with a square pulse density function $p(x)$ in Fig. A.6.

Figure A.6: Repeated self-convolution of a rectangular density $p(x)$: the resulting densities rapidly approach a Gaussian shape.
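The same behavior is easy to reproduce by simulation. A small demonstration (not from the text, with a hypothetical choice of $n = 12$ uniform variables so that the sum has unit variance):

```python
import numpy as np
from scipy.special import erfc

# In the spirit of Fig. A.6: the sum of n independent uniform variables
# tends toward a Gaussian (hypothetical n = 12 here).
rng = np.random.default_rng(3)
n, N = 12, 200_000
z = rng.uniform(-0.5, 0.5, size=(N, n)).sum(axis=1)

# Per the theorem: mean = sum of means = 0, variance = sum of variances = n/12 = 1.
print(z.mean(), z.var())  # ~0.0 and ~1.0

# A tail probability compared with the Gaussian prediction erfc(2/sqrt(2))/2:
print(np.mean(z > 2.0), 0.5 * erfc(2.0 / np.sqrt(2.0)))  # both ~0.023
```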

A.15 Correlation Between Random Variables

The covariance $r_{xy}$ of two random variables $x$ and $y$ is defined as

$$r_{xy} \triangleq E[(x - m_x)(y - m_y)] \qquad (A.91)$$

If $x$ and $y$ are independent random variables, then

$$r_{xy} = E[(x - m_x)(y - m_y)] = \int\!\!\int (x - m_x)(y - m_y)\,p(x, y)\,dx\,dy = \int (x - m_x)\,p(x)\,dx \int (y - m_y)\,p(y)\,dy = (m_x - m_x)(m_y - m_y) = 0 \qquad (A.92)$$

On the other hand, if $x$ and $y$ are dependent, say if $x$ increases, $y$ increases, and if $x$ decreases, $y$ decreases, then we expect $E[(x - m_x)(y - m_y)] > 0$. Similarly, if $x$ and $y$ are related such that when $x$ increases, $y$ decreases, and when $x$ decreases, $y$ increases, then we anticipate $E[(x - m_x)(y - m_y)] < 0$. If the maximum possible relation between $x$ and $y$ is assumed, i.e. $x = y$, then with $m_x = m_y = 0$ we have

$$E[xy] = E[x^2] = E[y^2] = \sigma_x^2 = \sigma_y^2 = \sigma_x\sigma_y \qquad (A.93)$$

Or, if $x = -y$,

$$E[xy] = -E[x^2] = -E[y^2] = -\sigma_x^2 = -\sigma_y^2 = -\sigma_x\sigma_y \qquad (A.94)$$

We define the correlation coefficient $\rho$ between $x$ and $y$ as

$$\rho \triangleq \frac{r_{xy}}{\sigma_x\sigma_y} = \frac{E[(x - m_x)(y - m_y)]}{\sigma_x\sigma_y} \qquad (A.95)$$

Since $\max r_{xy} = \sigma_x\sigma_y$, we have

$$-1 \le \rho \le 1 \qquad (A.96)$$

If $x$ and $y$ are independent, then $\rho = 0$. Conversely, if $\rho = 0$, $x$ and $y$ are said to be uncorrelated. But $\rho = 0$ does not necessarily mean that $x$ and $y$ are independent. Thus independence implies uncorrelatedness ($\rho = 0$), but not conversely.

Example A.7. $x$ is a random variable with $p(x) = \frac{1}{2}$, $-1 \le x \le 1$. Let $y = x^2$. Thus $x$ and $y$ are not independent; however, $x$ and $y$ are uncorrelated:

$$E[x] = \int_{-1}^{1} \frac{x}{2}\,dx = 0, \qquad E[y] = E[x^2] = \int_{-1}^{1} \frac{x^2}{2}\,dx = \frac{1}{3}$$

The covariance is

$$r_{xy} = E[(x - m_x)(y - m_y)] = E\!\left[x\left(x^2 - \frac{1}{3}\right)\right] = \int_{-1}^{1}\left(x^3 - \frac{x}{3}\right)\frac{1}{2}\,dx = 0$$

so $x$ and $y$ are uncorrelated.
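A quick simulation of Example A.7 (not part of the text) makes the distinction concrete: the variables are deterministically related, yet their covariance vanishes.

```python
import numpy as np

# Example A.7: x uniform on [-1, 1], y = x^2. The variables are clearly
# dependent, yet their covariance (A.91) is zero.
rng = np.random.default_rng(4)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x**2

r_xy = np.mean((x - x.mean()) * (y - y.mean()))  # covariance estimate
print(r_xy)      # ~0: uncorrelated
print(y.mean())  # ~1/3, matching E[y] above
```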

For two random variables $x$ and $y$ both having Gaussian probability density functions, if $x$ and $y$ are uncorrelated, i.e. if $\rho = 0$, then they are also independent.

A.16 Random Processes

To obtain the probabilities of various possible outcomes of an experiment, we can either repeat the experiment many times, or we can perform a large collection of the experiments simultaneously. Such a collection of experiments is called an ensemble, and the individual outcomes are called sample functions. For example, consider a noise waveform $n(t)$. We may measure repeatedly the output of a single noise source, or we may make simultaneous measurements of the output of an ensemble of statistically identical noise sources. Suppose we want to measure the mean square of the noise source: the average obtained from measurements on a single sample function at successive times is called a time average, represented by $\overline{n^2(t)}$. The average obtained from measurements on an ensemble of noise sources is called an ensemble average, represented by $E[n^2(t)]_{t=t_1}$, where $t_1$ is the instant at which the measurements are performed.

In general, the time average and the ensemble average will not be the same. For example, if the statistical characteristics of the sample functions change with time, then the ensemble average will be different at different instants. When the statistical characteristics of the sample functions do not change with time, the random process is called stationary. Even if the random process is stationary, the ensemble average may not be the same as the time average, for while the individual sample functions are stationary, they may differ in statistical characteristics from each other. If a random process is such that the ensemble average is identical to the time average, the process is called ergodic. An ergodic process is stationary, but a stationary process is not necessarily ergodic. Throughout this course, we deal only with ergodic random processes, i.e.

$$\overline{n^2(t)} = E[n^2(t)]_{t=t_1}$$

A.17 Auto-Correlation of a Random Process

The autocorrelation of an ergodic random process $n(t)$ is defined as

$$R_n(\tau) = \lim_{T\to\infty} \frac{1}{T}\int_{-T/2}^{T/2} n(t)\,n(t + \tau)\,dt \qquad (A.97)$$

Since the auto-correlation of a deterministic signal is the inverse transform of the power spectral density, the power spectral density of a random process is defined similarly as in the case of a deterministic process, such that

$$S_n(\omega) = \mathcal{F}[R_n(\tau)] = \int_{-\infty}^{\infty} R_n(\tau)\,e^{-j\omega\tau}\,d\tau \qquad (A.98)$$

In order to define the power spectral density of a deterministic function $f(t)$ which extends from $-\infty$ to $\infty$, we select a section of this waveform which extends from $-T/2$ to $T/2$. This waveform $f_T(t) = f(t)$ within this range; outside this range $f_T(t) = 0$. Now $f_T(t) \leftrightarrow F_T(\omega)$, and over the interval $T$ the normalized power density is $|F_T(\omega)|^2/T$. As $T \to \infty$, $f_T(t) \to f(t)$; thus the power spectral density of a deterministic waveform is given by

$$S_f(\omega) = \lim_{T\to\infty} \frac{1}{T}\,|F_T(\omega)|^2 \qquad (A.99)$$

Correspondingly, the power spectral density of a sample function of a random process would be defined as $S_n(\omega) = \lim_{T\to\infty} |N_T(\omega)|^2/T$. But since $n(t)$ is a random process, it is meaningful to define only the ensemble average of the power density spectrum, i.e.

$$S_n(\omega) = E\left[\lim_{T\to\infty} \frac{1}{T}\,|N_T(\omega)|^2\right] = \lim_{T\to\infty} \frac{1}{T}\,E\left[|N_T(\omega)|^2\right] \qquad (A.100)$$

where $E[\,\cdot\,]$ represents the ensemble average and $N_T(\omega)$ is the Fourier transform of a truncated section of a sample function of the random process $n(t)$.

The autocorrelation function $R_n(\tau)$ in Eq. (A.97) is a time average. For an ergodic process, we can change this into an ensemble average; thus

$$R_n(\tau) = E[n(t)\,n(t + \tau)] \qquad (A.101)$$

Now, $n(t)$ is a random variable at $t$, and $n(t + \tau)$ is another random variable at $t + \tau$; $R_n(\tau)$ is the covariance of two different random variables. Suppose for some $\tau$ we find $R_n(\tau) = 0$; then the random variables $n(t)$ and $n(t + \tau)$ are uncorrelated, and if these random processes have Gaussian distributions, they are also independent. In fact, as far as actual noise is concerned, the covariance $R_n(\tau)$ is zero except when $\tau = 0$, i.e. $n(t)$ and $n(t + \tau)$ are uncorrelated unless $\tau = 0$. This is because noise in communication systems is generally regarded as white, i.e. its spectral density function is a constant throughout the whole spectrum. Now, since $R_n(\tau) \leftrightarrow S_n(\omega)$, if $S_n(\omega) = k$, then

$$R_n(\tau) = k\,\delta(\tau)$$

i.e. $R_n(\tau) = 0$ for $\tau \ne 0$. Thus $n(t)$ and $n(t + \tau)$ are uncorrelated, and if they have Gaussian probability density functions, they are independent.
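To make the time-average definition (A.97) concrete, here is a small sketch (not from the text) that estimates the autocorrelation of discrete samples of approximately white Gaussian noise; the estimate is near zero for every nonzero lag:

```python
import numpy as np

# Time-average estimate of R_n(tau), Eq. (A.97), for discrete samples of
# (approximately) white Gaussian noise.
rng = np.random.default_rng(5)
n = rng.normal(0.0, 1.0, size=100_000)

def R_hat(x, lag):
    """Estimate R(lag) = <x(t) x(t + lag)> by a time average."""
    m = len(x) - lag
    return float(np.mean(x[:m] * x[lag:]))

for lag in (0, 1, 2, 10):
    print(lag, R_hat(n, lag))  # ~1.0 at lag 0, ~0.0 at all other lags
```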

Problems

1. Six dice are thrown simultaneously. What is the probability that at least one die shows a 3?

2. A card is picked at random from each of four 52-card decks. (a) What is the probability of selecting at least one 6 of spades? (b) What is the probability of selecting at least one card larger than an 8?

3. A card is drawn from a 52-card deck, and without replacing the first card a second card is drawn. The first and second cards are not replaced and a third card is drawn. (a) If the first card is a heart, what is the probability of the second card being a heart? (b) If the first and second cards are hearts, what is the probability that the third card is the king of clubs?

4. A binary data system uses two symbols, 0 and 1, transmitted with probabilities $P_0$ and $P_1$. Owing to transmission errors, a 0 may be changed to a 1 at the receiver with probability $p_0$, and similarly a 1 to a 0 with probability $p_1$. Obtain expressions for the following: (i) the approximate number of errors in a sequence of $n$ digits; (ii) the probability that a received digit is a 0; (iii) the probability that an error has occurred given that a 0 is received.

5. (a) An important probability density function is the Rayleigh density

$$p(x) = \begin{cases} x\,e^{-x^2/2} & x \ge 0 \\ 0 & x < 0 \end{cases}$$

(i) Prove that $p(x)$ satisfies $p(x) \ge 0$ for all $x$, and $\int_{-\infty}^{\infty} p(x)\,dx = 1$. (ii) Find the distribution function $F(x)$.
(b) Refer to the Rayleigh density function given in (a). Find the probability $P(x_1 \le x \le x_2)$, where $x_2 = x_1 + 1$, so that $P(x_1 \le x \le x_2)$ is a maximum. Hint: find $P(x_1 \le x \le x_2)$; replace $x_2$ by $x_1 + 1$, and maximize $P$ with respect to $x_1$.
(c) Refer to the Rayleigh density function given in (a). Find (i) $P(0.5 < x \le 2)$; (ii) $P(0.5 \le x < 2)$.

6. The joint probability density of the random variables $x$ and $y$ is $p(x, y) = ke^{-(x+y)}$ in the range $0 \le x \le \infty$, $0 \le y \le \infty$, and $p(x, y) = 0$ otherwise. (a) Find the value of the constant $k$.

(b) Find the probability density $p(x)$, the probability density of $x$ independently of $y$. (c) Find the probability $P(0 \le x \le 2;\; 2 \le y \le 3)$. (d) Are the random variables dependent or independent?

7. $x$ is a random variable having a Gaussian density, with $E(x) = 0$ and $\sigma_x^2 = 1$. $v$ is a random variable having the values $1$ or $-1$, each with probability $\frac{1}{2}$. (a) Find the joint density $p(x, v)$. (b) Show that $p(v) = \int_{-\infty}^{\infty} p(x, v)\,dx$.

8. The joint probability density of the random variables $x$ and $y$ is $p(x, y) = ke^{-x(y+1)}$ in the range $0 \le x \le \infty$, $0 \le y \le \infty$, and $p(x, y) = 0$ otherwise. (a) Find $p(x)$ and $p(y)$, the probability density of $x$ independently of $y$ and of $y$ independently of $x$. (b) Are the random variables dependent or independent?

9. Compare the most probable value of $x$ [where $p(x)$ is a maximum] and the average value of $x$ when (a) $p_1(x) = \frac{1}{\sqrt{2\pi}}\,e^{-(x-m)^2/2}$ for all $x$; (b) $p_2(x) = x\,e^{-x^2/2}$ for $x \ge 0$ and $0$ elsewhere.

10. Calculate the variance of the random variables having densities: (a) the Gaussian density $p_1(x) = \frac{1}{\sqrt{2\pi}}\,e^{-(x-m)^2/2}$, for all $x$; (b) the Rayleigh density $p_2(x) = x\,e^{-x^2/2}$, $x \ge 0$; (c) the uniform density $p_3(x) = 1/a$, $-a/2 \le x \le a/2$.

11. (a) A voltage $v$ is a function of time $t$ and is given by $v(t) = x\cos\omega t + y\sin\omega t$, in which $\omega$ is a constant angular frequency and $x$ and $y$ are independent Gaussian variables each with zero mean and variance $\sigma^2$. Show that $v(t)$ may be written $v(t) = r\cos(\omega t + \theta)$, in which $r$ is a random variable with a Rayleigh probability density and $\theta$ is a random variable with uniform density. (b) If $\sigma^2 = 1$, what is the probability that $r \ge 1$?

12. The random signal $v$ in Problem 11(a) is transmitted through a channel in which the channel noise power is a constant $\eta$. The signal-to-noise ratio is $\gamma = r^2/\eta$, and the average signal-to-noise ratio is thus $\Gamma = E[r^2]/\eta$, where, from Problem 11, $r$ is the magnitude of the random signal and has a Rayleigh distribution. (a) Show that the signal-to-noise ratio $\gamma$ has the (exponential) distribution

$$p(\gamma) = \frac{1}{\Gamma}\,e^{-\gamma/\Gamma}$$

(b) A certain detector for the above random signal has a characteristic such that the probability of error depends on the signal-to-noise ratio: for a given $\gamma$, the probability of error is

$$P(e|\gamma) = \frac{1}{2}\,e^{-\gamma}$$

Show that the total probability of error in the detector is given by

$$P(\varepsilon) = \frac{1}{2(\Gamma + 1)}$$

13. The independent random variables $x$ and $y$ are added to form $z$. If

$$p_x(x) = x\,e^{-x^2/2}, \quad x \ge 0 \qquad (A.102)$$

and

$$p_y(y) = \frac{1}{2}\,e^{-|y|}, \quad |y| < \infty \qquad (A.103)$$

find $p_z(z)$.

14. The function of time $z(t) = x_1\cos\omega_0 t - x_2\sin\omega_0 t$ is a random process. If $x_1$ and $x_2$ are independent Gaussian random variables each with zero mean and variance $\sigma^2$, find (a) $E[z]$, $E[z^2]$ and $\sigma_z^2$; (b) $p_z(z)$; (c) $R_z(\tau)$.

15. A random process $n(t)$ has a power spectral density $S_n(\omega) = \eta/2$ for $-\infty \le \omega \le \infty$. The random process is passed through a low-pass filter which has a transfer function $H(\omega) = 2$ for $-\omega_m \le \omega \le \omega_m$ and $H(\omega) = 0$ otherwise. Find the power spectral density of the waveform at the output of the filter.

16. White noise $n(t)$ with $S_n(\omega) = \eta/2$ is passed through a low-pass RC network with a 3-dB frequency $\omega_c$. (a) Find the autocorrelation $R(\tau)$ of the output noise of the network. (b) Sketch $\rho(\tau) = R(\tau)/R(0)$. (c) Find $\omega_c\tau$ such that $\rho(\tau) \le 0.1$.


More information

Square Root Raised Cosine Filter

Square Root Raised Cosine Filter Wireless Information Transmission System Lab. Square Root Raised Cosine Filter Institute of Communications Engineering National Sun Yat-sen University Introduction We consider the problem of signal design

More information

Probability Theory Review

Probability Theory Review Cogsci 118A: Natural Computation I Lecture 2 (01/07/10) Lecturer: Angela Yu Probability Theory Review Scribe: Joseph Schilz Lecture Summary 1. Set theory: terms and operators In this section, we provide

More information

Let X and Y denote two random variables. The joint distribution of these random

Let X and Y denote two random variables. The joint distribution of these random EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.

More information

Short course A vademecum of statistical pattern recognition techniques with applications to image and video analysis. Agenda

Short course A vademecum of statistical pattern recognition techniques with applications to image and video analysis. Agenda Short course A vademecum of statistical pattern recognition techniques with applications to image and video analysis Lecture Recalls of probability theory Massimo Piccardi University of Technology, Sydney,

More information

Probability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables.

Probability, CLT, CLT counterexamples, Bayes. The PDF file of this lecture contains a full reference document on probability and random variables. Lecture 5 A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2015 http://www.astro.cornell.edu/~cordes/a6523 Probability, CLT, CLT counterexamples, Bayes The PDF file of

More information

Fourier Analysis Fourier Series C H A P T E R 1 1

Fourier Analysis Fourier Series C H A P T E R 1 1 C H A P T E R Fourier Analysis 474 This chapter on Fourier analysis covers three broad areas: Fourier series in Secs...4, more general orthonormal series called Sturm iouville epansions in Secs..5 and.6

More information

ECE6604 PERSONAL & MOBILE COMMUNICATIONS. Week 3. Flat Fading Channels Envelope Distribution Autocorrelation of a Random Process

ECE6604 PERSONAL & MOBILE COMMUNICATIONS. Week 3. Flat Fading Channels Envelope Distribution Autocorrelation of a Random Process 1 ECE6604 PERSONAL & MOBILE COMMUNICATIONS Week 3 Flat Fading Channels Envelope Distribution Autocorrelation of a Random Process 2 Multipath-Fading Mechanism local scatterers mobile subscriber base station

More information

ECE Homework Set 3

ECE Homework Set 3 ECE 450 1 Homework Set 3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3

More information

Communication Theory II

Communication Theory II Communication Theory II Lecture 8: Stochastic Processes Ahmed Elnakib, PhD Assistant Professor, Mansoura University, Egypt March 5 th, 2015 1 o Stochastic processes What is a stochastic process? Types:

More information

5. Conditional Distributions

5. Conditional Distributions 1 of 12 7/16/2009 5:36 AM Virtual Laboratories > 3. Distributions > 1 2 3 4 5 6 7 8 5. Conditional Distributions Basic Theory As usual, we start with a random experiment with probability measure P on an

More information

ECE Lecture #10 Overview

ECE Lecture #10 Overview ECE 450 - Lecture #0 Overview Introduction to Random Vectors CDF, PDF Mean Vector, Covariance Matrix Jointly Gaussian RV s: vector form of pdf Introduction to Random (or Stochastic) Processes Definitions

More information

3. Review of Probability and Statistics

3. Review of Probability and Statistics 3. Review of Probability and Statistics ECE 830, Spring 2014 Probabilistic models will be used throughout the course to represent noise, errors, and uncertainty in signal processing problems. This lecture

More information

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com

Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com 1 School of Oriental and African Studies September 2015 Department of Economics Preliminary Statistics Lecture 2: Probability Theory (Outline) prelimsoas.webs.com Gujarati D. Basic Econometrics, Appendix

More information

EE 302 Division 1. Homework 6 Solutions.

EE 302 Division 1. Homework 6 Solutions. EE 3 Division. Homework 6 Solutions. Problem. A random variable X has probability density { C f X () e λ,,, otherwise, where λ is a positive real number. Find (a) The constant C. Solution. Because of the

More information

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5 Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and

More information

CHAPTER - 16 PROBABILITY Random Experiment : If an experiment has more than one possible out come and it is not possible to predict the outcome in advance then experiment is called random experiment. Sample

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

ECE531: Principles of Detection and Estimation Course Introduction

ECE531: Principles of Detection and Estimation Course Introduction ECE531: Principles of Detection and Estimation Course Introduction D. Richard Brown III WPI 22-January-2009 WPI D. Richard Brown III 22-January-2009 1 / 37 Lecture 1 Major Topics 1. Web page. 2. Syllabus

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions 1999 Prentice-Hall, Inc. Chap. 4-1 Chapter Topics Basic Probability Concepts: Sample

More information