ECE 450 - Lecture #0 Overview
- Introduction to Random Vectors
- CDF, PDF
- Mean Vector, Covariance Matrix
- Jointly Gaussian RVs: vector form of the pdf
- Introduction to Random (or Stochastic) Processes
- Definitions & Vocabulary
- Examples
Review: Correlation of RVs X & Y: E(XY)
Cov(X, Y) = E[ (X - μ_X)(Y - μ_Y) ] = E(XY) - μ_X μ_Y  (mean of product minus product of means)
Correlation Coefficient: ρ_XY = cov(X, Y)/(σ_X σ_Y)
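The two forms of the covariance above can be checked numerically. A minimal sketch (the correlated pair Y = 0.5X + noise is just an illustrative choice, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)      # hypothetical correlated pair of RVs

# Definition: mean of the centered product
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# Equivalent form: E(XY) - E(X)E(Y)
cov_alt = np.mean(x * y) - x.mean() * y.mean()

rho = cov_def / (x.std() * y.std())         # correlation coefficient
```

The two estimates agree to floating-point precision, and ρ lands near its true value of 0.5/√1.25 ≈ 0.447.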
Random Vectors (of dimension n)
Consider n random variables, all defined on the same experiment; then X = (X_1, X_2, ..., X_n)^T is a random (column) vector of dimension n. For each experimental outcome, RVs X_1, ..., X_n all take on particular values, say x_1, ..., x_n.
Example 1: Toss a coin 5 times, and try to catch them (noting heads or tails) before they hit the floor. Define RVs:
X_1 = # of heads; X_2 = # of tails;
X_3 = # of coins that I catch in my hands; X_4 = # of coins that hit the floor, or that I can't catch.
Example 1, continued
The random vector X = (X_1, X_2, X_3, X_4)^T takes different values, depending on the experimental outcome. For example, if I catch 4 of them, one hits the floor, and there are (say) 3 H's and 2 T's, then X = (3, 2, 4, 1)^T.
For this example, since X_2 = 5 - X_1 and X_4 = 5 - X_3, the total number of possible different vectors is 6 × 6 = 36.
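The coin-toss random vector of Example 1 can be simulated directly; a short sketch, where the per-coin catch probability p_catch is a hypothetical value (the slides leave catching ability unspecified):

```python
import random

def toss_and_catch(n_tosses=5, p_catch=0.8):
    """One run of the coin-toss experiment: returns X = (X1, X2, X3, X4).
    p_catch is a hypothetical per-coin catching probability."""
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))       # X1 = # heads
    tails = n_tosses - heads                                          # X2 = # tails
    caught = sum(random.random() < p_catch for _ in range(n_tosses))  # X3 = # caught
    missed = n_tosses - caught                                        # X4 = # missed
    return (heads, tails, caught, missed)

x = toss_and_catch()
# X1 + X2 = 5 and X3 + X4 = 5 always, so only 6 * 6 = 36 distinct vectors occur.
```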
Random Vectors, continued
Example 2: Measure the voltage at a particular point in a circuit at n different points in time: X = (X_1, X_2, ..., X_n)^T, where X_i is the voltage measured at the i-th point in time.
Note: Lecture #8 (bivariate probability distributions) was a study of 2 random variables, X and Y, based on a common experiment. If we rename these RVs (X_1, X_2), we see that this was a special case of a random vector, with n = 2 (2-dimensional).
Cumulative Distribution Function (CDF) for a Random Vector
F_X(x_1, x_2, ..., x_n) = Pr(X_1 ≤ x_1, ..., X_n ≤ x_n)
Example: From Example 1, re: coin toss, 5 times:
F_X(1, 2, 2, 1) = Pr(X_1 ≤ 1, X_2 ≤ 2, X_3 ≤ 2, X_4 ≤ 1)
= Pr( (0 or 1 H) and (0, 1, or 2 T) and (0, 1, or 2 caught) and (0 or 1 missed) ) = 0
Why?? (With 5 tosses, X_1 + X_2 = 5 always, but X_1 ≤ 1 and X_2 ≤ 2 would force X_1 + X_2 ≤ 3 — impossible.)
Example, continued
F_X(4, 2, 5, 0) = ?
F_X(4, 2, 5, 0) = Pr(X_1 ≤ 4, X_2 ≤ 2, X_3 ≤ 5, X_4 ≤ 0)
= Pr( (0, 1, 2, 3, or 4 H) and (0, 1, or 2 T) and (0, 1, 2, 3, 4, or 5 caught) and (0 missed) )
= Pr{ (3H & 2T or 4H & 1T) and 5 catches }
= Pr(3H & 2T or 4H & 1T) · Pr(5 catches)
= [ C(5,3) + C(5,4) ] (1/2)^5 · Pr(5 catches) = (15/32) · Pr(5 catches)
(The second factor depends upon catching ability.)
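The coin part of this CDF value can be computed exactly; a small sketch, where the catch factor uses a hypothetical per-coin probability p = 0.8 and assumes catches are independent of the tosses:

```python
from math import comb

# Pr(3H & 2T or 4H & 1T) for 5 independent fair tosses
p_coins = (comb(5, 3) + comb(5, 4)) / 2**5      # (10 + 5)/32 = 15/32

# F_X(4, 2, 5, 0) = p_coins * Pr(5 catches); Pr(5 catches) = p**5 under the
# hypothetical assumption of independent catches with per-coin probability p
p = 0.8
F = p_coins * p**5
```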
Random Vectors in EE (3 Common Cases)
Case 1: Start with a random process (experimental outcomes mapped to waveforms). Take samples (say f_s = 10 samples per sec. in the example below) from the resulting waveforms. The samples form a random vector (10 vector components). Note that random waveforms processed by computers are always sampled to obtain random vectors.
[Figure: a sample waveform plotted for t = 0 to 0.9 s, with the sample points marked.]
Random Vectors in EE - Common Cases
Case 2: In communications engineering, most digital communications systems have signal waveforms represented by 2-dimensional vectors, e.g.:
a. 16-APSK
b. 8-PSK
(t ) Case. Spatial Vectors (Image Processing) Note: For some applications, the parameter t will be spatial, not temporal. Example (above): Consider a line of pixels (picture elements) in a TV screen image. (t ) may be the brightness of the st pixel, (t ) the brightness of the nd pixel, etc. Example (below, from Klein Project Blog, http://blog.kleinproj-ect.org/), with color code: 0 for black, for white pixels Recall that a matrix can be changed to a vector, reading out column by column. 0
CDF Properties & PDF Definition/Properties
F_X(-∞, -∞, ..., -∞) = Pr(X_1 ≤ -∞, ..., X_n ≤ -∞) = 0
F_X(∞, ∞, ..., ∞) = Pr(X_1 ≤ ∞, ..., X_n ≤ ∞) = 1
Joint Probability Density Function:
f_X(x_1, x_2, ..., x_n) = ∂^n F_X(x_1, x_2, ..., x_n) / (∂x_1 ∂x_2 ... ∂x_n)
Finding probability:
Pr(X ∈ A) = Pr{ (X_1, X_2, ..., X_n) ∈ A } = ∫...∫_A f_X(x_1, ..., x_n) dx_1 ... dx_n  (n-fold integral)
Properties of the Joint PDF
Unit volume: ∫...∫ f_X(x_1, ..., x_n) dx_1 ... dx_n = 1  (over all of R^n)
Non-negative: f_X(x_1, x_2, ..., x_n) ≥ 0
Marginal PDFs: f_X1(x_1) = ∫...∫ f_X(x_1, ..., x_n) dx_2 ... dx_n
Finding the CDF: F_X(x_1, x_2, ..., x_n) = ∫_{-∞}^{x_1} ... ∫_{-∞}^{x_n} f_X(u_1, ..., u_n) du_1 ... du_n
Example 3
Consider the random vector X with pdf
f_X(x_1, x_2, x_3) = 6 exp(-x_1 - 2x_2 - 3x_3) = 6 e^{-x_1} e^{-2x_2} e^{-3x_3},  x_i > 0  (0 else)
Find the CDF, and find Pr(X_1 > 1).
F_X(x_1, x_2, x_3) = ∫_0^{x_1} ∫_0^{x_2} ∫_0^{x_3} 6 e^{-u_1} e^{-2u_2} e^{-3u_3} du_3 du_2 du_1
Example 3, continued
F_X(x_1, x_2, x_3) = 6 ∫_0^{x_1} e^{-u_1} du_1 · ∫_0^{x_2} e^{-2u_2} du_2 · ∫_0^{x_3} e^{-3u_3} du_3
= 6 · (1 - e^{-x_1}) · (1/2)(1 - e^{-2x_2}) · (1/3)(1 - e^{-3x_3})
Example 3, continued
F_X(x_1, x_2, x_3) = (1 - e^{-x_1})(1 - e^{-2x_2})(1 - e^{-3x_3}),  x_i > 0  (0 else)
Note: RVs X_1, X_2, and X_3 are independent, and F_X1(x_1) = 1 - e^{-x_1}.
Thus, Pr(X_1 > 1) = 1 - Pr(X_1 ≤ 1) = 1 - F_X1(1) = e^{-1} = 0.3679
Also: f_X1(x_1) = e^{-x_1}, f_X2(x_2) = 2e^{-2x_2}, f_X3(x_3) = 3e^{-3x_3}, x_i > 0 (0 else); exponential RVs, based on the form of the CDFs.
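The value Pr(X_1 > 1) = e^{-1} can be sanity-checked by Monte Carlo, since the marginal of X_1 is exponential with mean 1; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
# X1 ~ exponential with mean 1 (rate 1), per the marginal f_X1(x1) = e^{-x1}
x1 = rng.exponential(scale=1.0, size=N)

est = np.mean(x1 > 1.0)      # Monte Carlo estimate of Pr(X1 > 1)
exact = np.exp(-1.0)         # = 0.3679 from the CDF
```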
Average or Mean Vector
For the random vector X = (X_1, X_2, ..., X_n)^T, define the mean vector E(X) = ( E(X_1), E(X_2), ..., E(X_n) )^T.
Example 3: E(X) = ( E(X_1), E(X_2), E(X_3) )^T = (1, 1/2, 1/3)^T (based on the known mean of exponential RVs).
As for the 1-dimensional case, if Z = g(X), then
E(Z) = E( g(X) ) = ∫...∫ g(x_1, ..., x_n) f_X(x_1, ..., x_n) dx_1 ... dx_n
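The mean vector of Example 3 can be estimated from samples; a small sketch, where g(x) = x_1 + x_2 + x_3 is just an illustrative choice of g:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
# Independent exponential components with means 1, 1/2, 1/3 (Example 3)
X = np.column_stack([rng.exponential(1.0, N),
                     rng.exponential(1/2, N),
                     rng.exponential(1/3, N)])

mean_vec = X.mean(axis=0)          # estimate of E(X) = (1, 1/2, 1/3)^T

# E(Z) for Z = g(X); the illustrative g(x) = x1 + x2 + x3
EZ = np.mean(X.sum(axis=1))        # should be near 1 + 1/2 + 1/3 = 11/6
```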
The Covariance Matrix
Say X = (X_1, X_2, ..., X_n)^T is a random (column) vector, with 2 of its random variables being X_k and X_m. Recall that the covariance of the 2 RVs is given by (mean of the product minus the product of the means):
cov(X_k, X_m) = E(X_k X_m) - E(X_k) E(X_m)
Also note cov(X_k, X_k) = var(X_k).
Now define the n x n covariance matrix C with elements C_ij:
C_ij = cov(X_i, X_j),  1 ≤ i, j ≤ n
Note: Since C_ij = C_ji, the covariance matrix C is symmetric.
Example 3, continued
Recall X_1, X_2, & X_3 independent ⇒ uncorrelated ⇒ cov = 0 (for any 2 of them)
Also: E(X) = ( E(X_1), E(X_2), E(X_3) )^T = (1, 1/2, 1/3)^T
var(X_1) = 1, var(X_2) = 1/4, var(X_3) = 1/9 (known for exponential RVs)
Covariance Matrix:
C = [ 1    0    0
      0   1/4   0
      0    0   1/9 ]
Information: cov(X_1, X_1) = C_11 = var(X_1) = 1; cov(X_1, X_2) = C_12 = 0
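The diagonal covariance matrix above can be recovered from samples of the Example 3 vector; a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
# Independent exponential components with means 1, 1/2, 1/3 (Example 3)
X = np.column_stack([rng.exponential(1.0, N),
                     rng.exponential(1/2, N),
                     rng.exponential(1/3, N)])

# Sample covariance matrix; should approach diag(1, 1/4, 1/9),
# with off-diagonal entries near 0 because the components are independent
C = np.cov(X, rowvar=False)
```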
The Correlation Matrix (Compare to Covariance Matrix)
As before: X = (X_1, X_2, ..., X_n)^T is a random (column) vector, with 2 of its random variables being X_k and X_m. Recall that the correlation of the RVs X_k and X_m is given by E(X_k X_m) (def.).
Now define the n x n correlation matrix R with elements R_ij:
R_ij = E(X_i X_j),  1 ≤ i, j ≤ n
Notes:
1. Since R_ij = R_ji, the correlation matrix is symmetric.
2. R_ii = E(X_i^2), the 2nd moment of RV X_i (on the diagonal of the R matrix).
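The correlation matrix relates to the covariance matrix by R = C + μμ^T, which can be checked on the Example 3 vector; a small sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000
X = np.column_stack([rng.exponential(1.0, N),
                     rng.exponential(1/2, N),
                     rng.exponential(1/3, N)])

R = (X.T @ X) / N                 # sample correlation matrix, R_ij ≈ E(Xi Xj)
mu = X.mean(axis=0)
C = np.cov(X, rowvar=False)

# For any random vector, R = C + mu mu^T; the diagonal entries are 2nd moments
# (for X1 here, E(X1^2) = var(X1) + E(X1)^2 = 1 + 1 = 2)
```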
Multivariate Gaussian Distributions
(Compare to Lect. #9, pp. 5-6, for the 2-dimensional case.)
The n RVs X_1, X_2, ..., X_n are jointly Gaussian (or normal) if their pdf is of the form
f_X(x_1, x_2, ..., x_n) = 1 / [ (2π)^{n/2} det(C)^{1/2} ] · exp[ -(1/2) (x - E(X))^T C^{-1} (x - E(X)) ]
where x is a column vector and C is the covariance matrix of the random vector X.
Note: For the case n = 2,
C = [ σ_1^2      ρσ_1σ_2
      ρσ_1σ_2    σ_2^2   ]
Example 4
Write the pdf for jointly Gaussian, 0-mean RVs X_1 and X_2 in expanded form, if the covariance matrix is
C = [ 5  1
      1  2 ]
Note det(C) = 9, n/2 = 1, and C^{-1} = (1/9) [ 2  -1
                                               -1   5 ]
Exponential argument:
-(1/2) x^T C^{-1} x = -(1/18)(2x_1^2 - 2x_1x_2 + 5x_2^2)
Example 4, continued
Exponential argument: -(1/18)(2x_1^2 - 2x_1x_2 + 5x_2^2) = -(1/9)(x_1^2 - x_1x_2 + (5/2)x_2^2)
Scaling Coefficient: 1 / [ (2π)^{n/2} det(C)^{1/2} ] = 1/(2π · 3) = 1/(6π)
f_X(x_1, x_2) = (1/(6π)) exp[ -(1/9)(x_1^2 - x_1x_2 + (5/2)x_2^2) ]
Example 4: Alternate Approach
C = [ 5  1
      1  2 ]  ⇒  σ_1^2 = 5, σ_2^2 = 2, ρσ_1σ_2 = 1 ⇒ ρ = 1/sqrt(10)
Bivariate Gaussian form:
f_X(x_1, x_2) = 1/[2πσ_1σ_2 sqrt(1-ρ^2)] · exp{ -1/[2(1-ρ^2)] [ x_1^2/σ_1^2 - 2ρx_1x_2/(σ_1σ_2) + x_2^2/σ_2^2 ] }
Here 1 - ρ^2 = 9/10 and σ_1σ_2 = sqrt(10), so the scale factor is 1/(2π·sqrt(10)·3/sqrt(10)) = 1/(6π), and the exponent is -(5/9)[ x_1^2/5 - x_1x_2/5 + x_2^2/2 ].
Example 4, continued
Putting the bracketed terms over a common denominator of 10:
x_1^2/5 - x_1x_2/5 + x_2^2/2 = (2x_1^2 - 2x_1x_2 + 5x_2^2)/10
so the exponent is -(5/9)·(2x_1^2 - 2x_1x_2 + 5x_2^2)/10 = -(1/18)(2x_1^2 - 2x_1x_2 + 5x_2^2),
and f_X(x_1, x_2) = (1/(6π)) exp[ -(1/9)(x_1^2 - x_1x_2 + (5/2)x_2^2) ], matching the direct (matrix) approach.
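The matrix form and the expanded form of the Example 4 pdf can be compared numerically; a minimal sketch (the test point x = (0.7, -1.2) is arbitrary):

```python
import numpy as np

C = np.array([[5.0, 1.0],
              [1.0, 2.0]])
Cinv = np.linalg.inv(C)                 # = (1/9) [[2, -1], [-1, 5]]
detC = np.linalg.det(C)                 # = 9

def pdf_matrix(x):
    """Vector/matrix form of the zero-mean jointly Gaussian pdf (n = 2)."""
    return np.exp(-0.5 * x @ Cinv @ x) / (2 * np.pi * np.sqrt(detC))

def pdf_expanded(x1, x2):
    """Expanded form: (1/(6 pi)) exp(-(x1^2 - x1 x2 + (5/2) x2^2)/9)."""
    return np.exp(-(x1**2 - x1*x2 + 2.5*x2**2) / 9.0) / (6.0 * np.pi)

x = np.array([0.7, -1.2])
# the two forms agree at any point
```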
Random Processes: Vocabulary
Say we perform experiment E, with possible outcomes z, in sample space S. For each experimental outcome z, assign some waveform X(t): (a ≤ t ≤ b, or 0 ≤ t < ∞, or -∞ < t < ∞).
The mapping or function that takes S to {X(t)} is a random process.
The set {X(t)} of all possible waveforms is called the ensemble of sample functions.
Each possible waveform is called a sample function.
Random Processes, continued
Note: sometimes we write X(t) = X(t; z) to emphasize that the process is random, depending on the outcome z.
Example: tune a radio between two stations; turn it on at time t = 0, and record the noise process, n(t), a voltage.
[Figure: one sample function, n(t; z_1), plotted vs. t.]
Random Processes, continued
[Figure: two more sample functions, n(t; z_2) and n(t; z_3), plotted vs. t.]
(This process has infinitely many sample functions in the ensemble.)
Random Processes: The Ensemble of Waveforms, continued
[Figure: the sample functions n(t; z_1), n(t; z_2), n(t; z_3), ... plotted together as an ensemble.]
Random Processes (RPs), continued
Consider X(t; z) in general, i.e., a general RP.
Now fix time: say t = t_1. Then instead of an ensemble of waveforms, we have a set of real #s: the values of the waveforms at t = t_1, assigned to each experimental outcome. Thus, X(t_1, z) defines a random variable. (fixed time)
Now fix the experimental outcome: say z = z_1, leaving t unfixed. Then we have X(t, z_1), a particular waveform or sample function. Thus, X(t, z_1) defines a sample function. (fixed outcome)
Fixing Time in a Random Process
[Figure: the ensemble n(t; z_1), n(t; z_2), n(t; z_3), ..., with a vertical line at a fixed time t_1 picking out one value from each waveform.]
Fixing the Experimental Outcome in a Random Process
[Figure: the same ensemble, with a single waveform, n(t; z_1), singled out.]
More RP Vocabulary
Summary: Random Process: fix t → RV; fix z → sample function.
If we sample the random process X(t) at a finite # of times, t_1, t_2, ..., t_n (fixing many time points), we get a random vector, or a set of random variables, with joint pdf
f_X(x_1, x_2, ..., x_n) = f_X(x_1, t_1; x_2, t_2; ...; x_n, t_n)
The collection of all such joint pdfs, for all n (1 ≤ n < ∞) and for all sample times t_1, ..., t_n, completely describes the RP in a statistical way. (Fortunately: not usually necessary!)
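The fix-t / fix-z distinction can be made concrete with a toy process; a sketch under stated assumptions (the process here, a random-phase sinusoid plus small Gaussian noise, is purely illustrative, since the slides' radio-noise process is not specified):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 1.0, 0.01)

def sample_function(rng, t):
    """Draw one sample function of a hypothetical noise-like process on grid t."""
    phase = rng.uniform(0.0, 2.0 * np.pi)        # randomness tied to the outcome z
    return np.sin(2*np.pi*5.0*t + phase) + 0.1 * rng.standard_normal(t.size)

# Fix the outcome z: one waveform (a sample function)
x_of_t = sample_function(rng, t)

# Fix a time t1 = t[10]: the values X(t1; z) over many outcomes form a RV
vals_at_t1 = np.array([sample_function(rng, t)[10] for _ in range(1000)])

# Fix several times t1, ..., tn: each outcome then yields a random vector
idx = [5, 20, 40]
vec = sample_function(rng, t)[idx]
```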
The Big Picture: From RVs to RPs
Random Variables map: Exp. Outcome z → Real # X(z) = x
Random Vectors map: Exp. Outcome z → Vector X(z) = x = [x_1(z), ..., x_n(z)]  (n real #s)
Random Sequences map: Exp. Outcome z → Sequence X_n(z) = x_n = x_1(z), x_2(z), ...  (real #, real #, ...)
Random Processes map: Exp. Outcome z → Function X(t; z) = x(t)  (real function of time)
The Big Picture, continued
Random Variable: Bag of real #'s
Random Vector: Bag of vectors
Random Sequence: Bag of sequences
Random Process: Bag of functions = the ensemble
Sample Explanation, for Random Processes: Reach into the bag; draw out one of the functions in the bag; which function you get depends on which experimental outcome, z, occurs.