ECE 450 Lecture #10 Overview


ECE 450 - Lecture #10 Overview: Introduction to Random Vectors; CDF, PDF; Mean Vector, Covariance Matrix; Jointly Gaussian RV's: vector form of the pdf; Introduction to Random (or Stochastic) Processes; Definitions & Vocabulary; Examples.

Review - Correlation of RV's X & Y: correlation = E(XY); Cov(X, Y) = E[(X − μX)(Y − μY)] = E(XY) − μX μY (mean of the product minus the product of the means); Correlation Coefficient ρXY = Cov(X, Y)/(σX σY).
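The covariance shortcut above can be sanity-checked numerically; a minimal sketch (the synthetic data, seed, and variable names are illustrative, not from the lecture):

```python
import numpy as np

# Check Cov(X, Y) = E(XY) - E(X)E(Y) and the correlation coefficient
# on synthetic data: Y = 0.5*X + noise, so Cov(X, Y) should be near 0.5.
rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)
cov_direct = np.mean((x - x.mean()) * (y - y.mean()))
cov_shortcut = np.mean(x * y) - x.mean() * y.mean()
rho = cov_direct / (x.std() * y.std())        # correlation coefficient
print(np.isclose(cov_direct, cov_shortcut))   # True
```

Here the true correlation coefficient is 0.5/sqrt(1.25) ≈ 0.447, which the sample estimate approaches.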

Random Vectors (of dimension n): Consider n random variables X1, ..., Xn, all defined on the same experiment; then X = (X1, X2, ..., Xn)^T is a random (column) vector of dimension n. For each experimental outcome, RV's X1, ..., Xn all take on particular values, say x1, ..., xn. Example 1: Toss a coin 5 times, and try to catch the coins (noting heads or tails) before they hit the floor. Define RV's: X1 = # of heads; X2 = # of tails; X3 = # of coins that I catch in my hands; X4 = # of coins that hit the floor, or that I can't catch.

Example 1, continued: The R. vector X = (X1, X2, X3, X4)^T takes different values, depending on the experimental outcome. For example, if I catch 4 of the coins and one hits the floor, with (say) 3 H's and 2 T's, then X = (3, 2, 4, 1)^T. For this example, the total number of possible different vectors is 6 × 6 = 36 (six choices for X1, which fixes X2 = 5 − X1, and six for X3, which fixes X4 = 5 − X3).
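The number of distinct vectors can be confirmed by brute-force enumeration (a quick sketch, not part of the original slides; heads and catches are independent, each ranging over 0..5):

```python
from itertools import product

# Enumerate all possible values of X = (X1, X2, X3, X4) for the
# 5-coin toss-and-catch experiment: X1 = # heads, X2 = # tails,
# X3 = # caught, X4 = # missed, with X1 + X2 = 5 and X3 + X4 = 5.
vectors = {(h, 5 - h, c, 5 - c) for h, c in product(range(6), range(6))}
print(len(vectors))   # 36 distinct vectors: 6 choices of X1 times 6 of X3
```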

Random Vectors, continued. Example 2: Measure the voltage at a particular point in a circuit at n different points in time: X = (X1, X2, ..., Xn)^T, where Xi is the voltage measured at the i-th point in time. Note: Lecture #8 (bivariate probability distributions) was a study of 2 random variables, X and Y, based on a common experiment; if we rename these RV's (X1, X2), we see that this was a special case of a random vector, with n = 2 (2-dimensional).

Cumulative Distribution Function (CDF) for a R. Vector: F_X(x1, x2, ..., xn) = Pr(X1 ≤ x1, ..., Xn ≤ xn). Example: From Example 1, re: coin toss, 5 times: F_X(1, 2, 2, 1) = Pr(X1 ≤ 1, X2 ≤ 2, X3 ≤ 2, X4 ≤ 1) = Pr( (0 or 1 H) and (0, 1, or 2 T) and (0, 1, or 2 caught) and (0 or 1 missed) ) = 0. Why?? (Heads and tails must total 5, but X1 ≤ 1 and X2 ≤ 2 allow at most 3.)

Example, continued: F_X(4, 2, 5, 0) = ? F_X(4, 2, 5, 0) = Pr(X1 ≤ 4, X2 ≤ 2, X3 ≤ 5, X4 ≤ 0) = Pr( (0, 1, 2, 3, or 4 H) and (0, 1, or 2 T) and (0, 1, 2, 3, 4, or 5 caught) and (0 missed) ) = Pr{ (3H & 2T or 4H & 1T) and 5 catches } = Pr(3H & 2T or 4H & 1T) · Pr(5 catches) = [ (C(5,3) + C(5,4)) / 2^5 ] · Pr(5 catches) = (15/32) · Pr(5 catches) = ? (depends upon catching ability)
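Assuming a fair coin (as the slide's binomial count implies), the head/tail factor works out to (C(5,3) + C(5,4))/2^5 = 15/32; a one-line check:

```python
from math import comb

# Pr(3H & 2T or 4H & 1T) for 5 tosses of a fair coin (assumed fair).
p_heads = (comb(5, 3) + comb(5, 4)) / 2**5   # = (10 + 5)/32 = 15/32
print(p_heads)   # 0.46875
# F_X(4, 2, 5, 0) = p_heads * Pr(5 catches); the catch probability is
# left symbolic, since it depends on catching ability.
```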

Random Vectors in EE (3 Common Cases). Case 1: Start with a random process (experimental outcomes mapped to waveforms). Take samples (say fs = 10 samples per sec. in the example below) from the resulting waveforms. The samples form a random vector (10 vector components in the example). Note that random waveforms processed by computers are always sampled to obtain random vectors. [Figure: a sampled random waveform, t from 0 to 0.9 sec.]

Random Vectors in EE - Common Cases, continued. Case 2: In communications engineering, most digital communications systems have signal waveforms represented by 2-dimensional vectors, e.g.: a. 16-APSK, b. 8-PSK. [Figure: signal constellations.]

(t ) Case. Spatial Vectors (Image Processing) Note: For some applications, the parameter t will be spatial, not temporal. Example (above): Consider a line of pixels (picture elements) in a TV screen image. (t ) may be the brightness of the st pixel, (t ) the brightness of the nd pixel, etc. Example (below, from Klein Project Blog, http://blog.kleinproj-ect.org/), with color code: 0 for black, for white pixels Recall that a matrix can be changed to a vector, reading out column by column. 0

CDF Properties & PDF Definition/Properties:
F_X(−∞, −∞, ..., −∞) = Pr(X1 ≤ −∞, ..., Xn ≤ −∞) = 0
F_X(∞, ∞, ..., ∞) = Pr(X1 ≤ ∞, ..., Xn ≤ ∞) = 1
Joint Probability Density Function: f_X(x1, x2, ..., xn) = ∂^n F_X(x1, x2, ..., xn) / (∂x1 ∂x2 ... ∂xn)
Finding probability: Pr(X ∈ A) = Pr{ (X1, X2, ..., Xn) ∈ A } = ∫...∫_A f_X(x1, ..., xn) dx1 ... dxn (n-fold integral)

Properties of the Joint PDF:
Unit volume: ∫...∫ f_X(x1, ..., xn) dx1 ... dxn = 1
Non-negative: f_X(x1, x2, ..., xn) ≥ 0
Marginal PDF's: f_X1(x1) = ∫...∫ f_X(x1, ..., xn) dx2 ... dxn
Finding the CDF: F_X(x1, x2, ..., xn) = ∫_(−∞)^(x1) ... ∫_(−∞)^(xn) f_X(u1, u2, ..., un) du1 ... dun

Example 3: Consider R. Vector X with pdf f_X(x1, x2, x3) = 6 exp[−x1 − 2x2 − 3x3], xi > 0 (0 else). Find the CDF, and find Pr(X1 > 1). F_X(x1, x2, x3) = ∫_0^(x1) ∫_0^(x2) ∫_0^(x3) f_X(u1, u2, u3) du3 du2 du1 = 6 ∫_0^(x1) ∫_0^(x2) ∫_0^(x3) e^(−u1) e^(−2u2) e^(−3u3) du3 du2 du1

Example 3, continued: Since the integrand factors, F_X(x1, x2, x3) = 6 · ∫_0^(x1) e^(−u1) du1 · ∫_0^(x2) e^(−2u2) du2 · ∫_0^(x3) e^(−3u3) du3 = 6 · (1 − e^(−x1)) · (1/2)(1 − e^(−2x2)) · (1/3)(1 − e^(−3x3))

Example 3, continued: F_X(x1, x2, x3) = (1 − e^(−x1)) (1 − e^(−2x2)) (1 − e^(−3x3)), xi > 0 (0 else). Note RV's X1, X2, and X3 are independent, and F_X1(x1) = 1 − e^(−x1). Thus, Pr(X1 > 1) = 1 − Pr(X1 ≤ 1) = 1 − F_X1(1) = e^(−1) ≈ 0.3679. Also: f_X1(x1) = e^(−x1), f_X2(x2) = 2e^(−2x2), f_X3(x3) = 3e^(−3x3) (xi > 0, 0 else); exponential RV's, based on the form of the CDF's.
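The result Pr(X1 > 1) = e^(−1) can be checked by Monte Carlo, sampling X1 from its exponential marginal (a sketch; the sample size and seed are arbitrary):

```python
import math
import random

# Monte Carlo check of the worked example: X1 is exponential with rate 1
# (its marginal from the joint pdf 6*exp(-x1 - 2*x2 - 3*x3)).
random.seed(0)
N = 200_000
hits = sum(random.expovariate(1.0) > 1 for _ in range(N)) / N
exact = math.exp(-1)       # Pr(X1 > 1) = 1 - F_X1(1) = e^-1
print(round(exact, 4))     # 0.3679
assert abs(hits - exact) < 0.01
```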

Average or Mean Vector: For R. Vector X = (X1, X2, ..., Xn)^T, define the mean vector E(X) = ( E(X1), E(X2), ..., E(Xn) )^T. Example 3: E(X) = ( E(X1), E(X2), E(X3) )^T = (1, 1/2, 1/3)^T (based on the known mean of exponential RV's, 1/λ). As for the 1-dimensional case, if Z = g(X), then E(Z) = E( g(X) ) = ∫...∫ g(x1, ..., xn) f_X(x1, ..., xn) dx1 ... dxn

The Covariance Matrix: Say X = (X1, X2, ..., Xn)^T is a R. (column) vector, with 2 of its R. variables being Xk and Xm. Recall that the covariance of the RV's is given by (mean of the product minus product of the means): cov(Xk, Xm) = E(Xk Xm) − μk μm. Also note cov(Xk, Xk) = var(Xk). Now define the n × n covariance matrix C_X with elements Cij: Cij = cov(Xi, Xj), 1 ≤ i, j ≤ n. Note: Since Cij = Cji, the covariance matrix C_X is symmetric.

Example 3, continued: Recall X1, X2, & X3 independent ⇒ uncorrelated ⇒ cov = 0 (for any i ≠ j). Also: E(X) = ( E(X1), E(X2), E(X3) )^T = (1, 1/2, 1/3)^T, and var(X1) = 1, var(X2) = 1/4, var(X3) = 1/9 (known for exponential RV's: var = 1/λ^2). Covariance Matrix:
C_X =
[ 1    0    0  ]
[ 0   1/4   0  ]
[ 0    0   1/9 ]
Information: cov(X1, X1) = C11 = var(X1) = 1; cov(X1, X2) = C12 = 0.
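A sample-based sketch of the same covariance matrix, drawing independent exponential RV's with rates 1, 2, 3 (the seed and sample size are arbitrary, not from the lecture):

```python
import numpy as np

# Sample-based check of C_X = diag(1, 1/4, 1/9) for independent
# exponentials with rates 1, 2, 3 (rate lambda => scale 1/lambda,
# variance 1/lambda^2); off-diagonal entries should be near zero.
rng = np.random.default_rng(0)
n = 400_000
X = np.column_stack([rng.exponential(1 / lam, n) for lam in (1, 2, 3)])
C = np.cov(X, rowvar=False)   # 3x3 sample covariance matrix
print(np.round(C, 2))
assert np.allclose(C, np.diag([1, 1/4, 1/9]), atol=0.02)
```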

The Correlation Matrix (Compare to the Covariance Matrix): As before, X = (X1, X2, ..., Xn)^T is a R. (column) vector, with 2 of its R. variables being Xk and Xm. Recall that the correlation of the RV's Xk and Xm is given by E(Xk Xm) (def.). Now define the n × n correlation matrix R_X with elements Rij: Rij = E(Xi Xj), 1 ≤ i, j ≤ n. Notes: 1. Since Rij = Rji, the correlation matrix is symmetric. 2. Rii = E(Xi^2), the 2nd moment of RV Xi (on the diagonal of the R matrix).
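For the exponential example, the correlation matrix follows from Rij = Cij + μi μj, i.e. R = C + μμ^T; a small numerical sketch (variable names illustrative):

```python
import numpy as np

# Correlation matrix of the exponential example via R = C + mu mu^T:
# R_ij = E(X_i X_j) = cov(X_i, X_j) + mu_i mu_j.
mu = np.array([1, 1/2, 1/3])   # means of the exponential RVs
C = np.diag([1, 1/4, 1/9])     # their (diagonal) covariance matrix
R = C + np.outer(mu, mu)
# Diagonal entries are second moments, e.g. E(X1^2) = var + mean^2 = 2.
print(R[0, 0])   # 2.0
```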

Multivariate Gaussian Distributions (Compare to Lect. 9, Pt. 2, pp. 5-6 for the 2-dimensional case.) The n RV's X1, X2, ..., Xn are jointly Gaussian (or normal) if their pdf is of the form: f_X(x1, x2, ..., xn) = [ 1 / ( (2π)^(n/2) det(C_X)^(1/2) ) ] exp[ −(1/2) (x − E(X))^T C_X^(−1) (x − E(X)) ], where x is a column vector and C_X is the covariance matrix of R. Vector X. Note: For the case n = 2,
C_X =
[ σ1^2    ρσ1σ2 ]
[ ρσ1σ2   σ2^2  ]

Example 4: Write the pdf for jointly Gaussian, 0-mean RV's X1 and X2 in expanded form, if the covariance matrix is
C_X =
[ 5  3 ]
[ 3  2 ]
Note det(C_X) = 10 − 9 = 1, n/2 = 1, and C_X^(−1) = [ 2  −3 ; −3  5 ]. Exponential argument: −(1/2) x^T C_X^(−1) x = −(1/2) [x1 x2] [ 2  −3 ; −3  5 ] [x1 ; x2] = −(1/2)(2x1^2 − 6x1x2 + 5x2^2)

Example 4, continued: Exponential argument = −(x1^2 − 3x1x2 + (5/2)x2^2). Scaling coefficient: 1 / ( (2π)^(n/2) det(C_X)^(1/2) ) = 1/(2π). Thus f_X(x1, x2) = (1/(2π)) exp( −(x1^2 − 3x1x2 + (5/2)x2^2) )

Example 4: Alternate Approach. From C_X: σ1^2 = 5, σ2^2 = 2, ρσ1σ2 = 3 ⇒ ρ = 3/sqrt(10), so 1 − ρ^2 = 1/10. f_X(x1, x2) = [ 1 / (2π σ1 σ2 sqrt(1 − ρ^2)) ] exp{ −[ 1 / (2(1 − ρ^2)) ] [ x1^2/σ1^2 − 2ρ x1x2/(σ1σ2) + x2^2/σ2^2 ] } = (1/(2π)) exp{ −5 [ x1^2/5 − (3/5) x1x2 + x2^2/2 ] }

Example 4, continued: Putting the bracketed terms over a common denominator of 10: x1^2/5 = 2x1^2/10, (3/5)x1x2 = 6x1x2/10, x2^2/2 = 5x2^2/10, so the exponent is −5 (2x1^2 − 6x1x2 + 5x2^2)/10 = −(1/2)(2x1^2 − 6x1x2 + 5x2^2) = −(x1^2 − 3x1x2 + (5/2)x2^2), matching the result from the matrix form.
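The agreement between the matrix form and the expanded bivariate form can also be verified numerically; a sketch assuming the example's covariance matrix is C = [[5, 3], [3, 2]] (consistent with det C = 1, σ1^2 = 5, and ρ = 3/sqrt(10) in the slides):

```python
import numpy as np

# Compare the matrix form of the zero-mean jointly Gaussian pdf with the
# expanded form exp(-(x1^2 - 3 x1 x2 + 2.5 x2^2)) / (2 pi).
C = np.array([[5.0, 3.0], [3.0, 2.0]])
Cinv = np.linalg.inv(C)    # = [[2, -3], [-3, 5]] since det C = 1

def pdf_matrix(x):
    coef = 1.0 / (2 * np.pi * np.sqrt(np.linalg.det(C)))
    return coef * np.exp(-0.5 * x @ Cinv @ x)

def pdf_expanded(x1, x2):
    return np.exp(-(x1**2 - 3 * x1 * x2 + 2.5 * x2**2)) / (2 * np.pi)

for x in np.array([[0.0, 0.0], [1.0, -0.5], [0.3, 2.0]]):
    assert np.isclose(pdf_matrix(x), pdf_expanded(*x))
print("forms agree")
```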

Random Processes: Vocabulary. Say we perform experiment E, with possible outcomes z, in sample space S. For each experimental outcome z, assign some waveform, x(t): (a ≤ t ≤ b) or 0 ≤ t < ∞ or −∞ < t < ∞. The mapping or function that takes S to {x(t)} is a random process. The set {x(t)} of all possible waveforms is called the ensemble of sample functions. Each possible waveform is called a sample function.

Random Processes, continued. Note: sometimes we write X(t) = X(t; z) to emphasize that the process is random, depending on the outcome, z. Example: tune a radio between two stations; turn it on at time t = 0, and record the noise process, n(t), a voltage. One sample function, n(t; z1): [Figure: noise waveform n(t; z1) vs. t.]

Random Processes, continued. Another sample function, n(t; z2): [Figure: noise waveform n(t; z2) vs. t.] Another sample function, n(t; z3): [Figure: noise waveform n(t; z3) vs. t.] ... (This process has infinitely many sample functions in the ensemble.)

Random Processes - The Ensemble of Waveforms, continued: [Figure: the sample functions n(t; z1), n(t; z2), n(t; z3), ... plotted on a common time axis.]

Random Processes (RP's), continued. Consider X(t; z) in general, i.e., a general RP. Now fix time: say t = t1. Then instead of an ensemble of waveforms, we have a set of real #'s: the values of the waveforms at t = t1, assigned to each experimental outcome. Thus, X(t1, z) defines a random variable. (fixed time) Now fix the experimental outcome: say z = z1, leaving t unfixed. Then we have X(t, z1), a particular waveform or sample function. Thus, X(t, z1) defines a sample function. (fixed outcome)

Fixing Time in a Random Process: [Figure: the ensemble n(t; z1), n(t; z2), n(t; z3), ..., with a vertical line marking the fixed time t = t1.]

Fixing the Experimental Outcome in a Random Process: [Figure: the same ensemble, with one sample function singled out.]

More RP Vocabulary. Summary: Random Process: fix t → RV; fix z → sample function. If we sample the random process X(t) at a finite # of times, t1, t2, ..., tn (fixing many time points), we get a random vector, or a set of random variables, with joint pdf f_X(x1, x2, ..., xn) = f_X(x1, t1; x2, t2; ...; xn, tn). The collection of all such joint pdfs, for all n (1 ≤ n < ∞) and for all sample times t1, ..., tn, completely describes the RP in a statistical way. (Fortunately: not usually necessary!)
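The fix-time / fix-outcome dichotomy can be illustrated with a small array sketch (the Gaussian noise ensemble, sizes, and seed are illustrative, not from the lecture):

```python
import numpy as np

# An ensemble of noise sample functions stored as a matrix: each row is
# one sample function x(t; z).  Fixing a column (a time t_k) gives a
# random variable; fixing a row (an outcome z) gives a sample function.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 11)                  # sample times t1, ..., t11
ensemble = rng.normal(size=(5, t.size))    # 5 outcomes z1, ..., z5

rv_at_t4 = ensemble[:, 3]   # fix time -> one value per outcome (a RV)
sample_fn = ensemble[2]     # fix z    -> one waveform over all t
print(rv_at_t4.shape, sample_fn.shape)     # (5,) (11,)
```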

The Big Picture: From RV's to RP's.
Random Variables map: Exp. Outcome z → Real #, X(z) = x
Random Vectors map: Exp. Outcome z → Vector, X(z) = x = [x1(z), ..., xn(z)] (n real #'s)
Random Sequences map: Exp. Outcome z → Sequence, xn(z) = xn = x1(z), x2(z), ... (real #, real #, ...)
Random Processes map: Exp. Outcome z → Function, x(t; z) = x(t) (a real function of time)

The Big Picture, continued.
Random Variable: bag of real #'s
Random Vector: bag of vectors
Random Sequence: bag of sequences
Random Process: bag of functions = the ensemble
Sample explanation, for Random Processes: Reach into the bag; draw out one of the functions in the bag; which function you get depends on which experimental outcome, z, occurs.