Lecture 2. Statistical Optics, Spring Quarter. Characteristic Functions. Transformation of RVs. Sums of RVs



Characteristic Functions

The characteristic function is the Fourier transform of the pdf (note that Goodman and Papen use different notation):

$$C_x(\omega) = \langle e^{i\omega x} \rangle = \int_{-\infty}^{\infty} f_x(x)\, e^{i\omega x}\, dx = \sum_{n=0}^{\infty} \frac{(i\omega)^n}{n!}\, \langle x^n \rangle. \qquad (1)$$

If the random variable is discrete with probability function $p_k(k)$, then the characteristic function is

$$C_k(\omega) = \sum_{k=-\infty}^{\infty} e^{i\omega k}\, p_k(k). \qquad (2)$$

Relationship between $C_x(\omega)$ and the moments of the pdf

The $n$th moment of the probability distribution, if it exists, may be determined by differentiation of the characteristic function:

$$\langle x^n \rangle = \frac{1}{i^n}\, \frac{d^n}{d\omega^n} C_x(\omega)\Big|_{\omega=0}. \qquad (3)$$

The characteristic function is a weighted sum of the moments $\langle x^n \rangle$ of the distribution; differentiating and setting $\omega = 0$ generates the moments.
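As a quick check of (3), the short sketch below (my addition, not part of the lecture; the sympy dependency is an assumption) symbolically differentiates the gaussian characteristic function from the next slide and recovers the first three moments.

```python
# Sketch: recover moments of N(mu, sigma^2) from its characteristic function
# by repeated differentiation at omega = 0, per Eq. (3).
import sympy as sp

w = sp.symbols('omega', real=True)
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)
C = sp.exp(sp.I*w*mu - sigma**2*w**2/2)        # gaussian characteristic function

for n in (1, 2, 3):
    moment = sp.simplify(sp.diff(C, w, n).subs(w, 0) / sp.I**n)
    print(n, moment)
# prints: 1 -> mu,  2 -> mu**2 + sigma**2,  3 -> mu**3 + 3*mu*sigma**2
```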

Example: Gaussian RV

The characteristic function of a gaussian random variable is a gaussian in $\omega$ with the variance reciprocally scaled:

$$\frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left[-\frac{(u - \langle u \rangle)^2}{2\sigma^2}\right] \;\longleftrightarrow\; \exp\!\left[i\omega \langle u \rangle - \frac{\sigma^2 \omega^2}{2}\right]$$

Transformation of RVs

Consider the transformation $z = f(u)$, where $f$ is invertible so that $u = f^{-1}(z)$ exists. Given $p_u(u)$, what is $p_z(z)$?

Transformation of RVs (cont.)

Differential area must be preserved, or $p_u(u)\,du = p_z(z)\,dz$. Substituting $u = f^{-1}(z)$,

$$p_z(z) = p_u\!\left[f^{-1}(z)\right] \left|\frac{du}{dz}\right|.$$

Example

Let $z = e^u$, with $p_u(u) = e^{-u}$ over $[0, \infty)$ (exponential distribution). Then $u = \ln z$, $dz = e^u\,du$, or $du/dz = e^{-u} = z^{-1}$. The range of $z$ is $[1, \infty)$. Substituting,

$$p_z(z) = e^{-\ln z}\, z^{-1} = \frac{1}{z^2} \qquad \text{for } 1 < z < \infty.$$
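As a sanity check (my addition, not from the slides), the transformed density can be compared against a Monte Carlo histogram; the numpy-based sketch below assumes unit-rate exponential samples.

```python
# Sketch: Monte Carlo check that z = exp(u), u ~ Exponential(1), has pdf 1/z^2 on [1, inf).
import numpy as np

rng = np.random.default_rng(0)
u = rng.exponential(scale=1.0, size=1_000_000)
z = np.exp(u)

edges = np.linspace(1.0, 5.0, 9)               # a few bins over [1, 5]
hist, _ = np.histogram(z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.round(hist, 3))                       # empirical density per bin
print(np.round(1.0 / centers**2, 3))           # analytic pdf at the bin centers
```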

Sums of Random Variables

Let a new RV be given as the sum of two others, $z = u + v$. Given the joint distribution $p_{uv}(u, v)$, what is $p_z(z)$?

Start with the joint CDF $F_{uv}(u, v)$ and the line $z = u + v$. $F_z(z)$ is the probability of the region where $u + v < z$, or

$$F_z(z) = \int_{-\infty}^{\infty} dv \int_{-\infty}^{z-v} p_{uv}(u, v)\, du.$$

Differentiate both sides:

$$\frac{d}{dz} F_z(z) = \frac{d}{dz}\left[\int_{-\infty}^{\infty} dv \int_{-\infty}^{z-v} p_{uv}(u, v)\, du\right].$$

Use the result from calculus

$$\frac{d}{dz} \int_{-\infty}^{g(z)} p_u(u)\, du = p_u[g(z)]\, \frac{dg}{dz}$$

with $g(z) = z - v$. Then $dg/dz = 1$ and

$$\frac{d}{dz} \int_{-\infty}^{z-v} p_{uv}(u, v)\, du = p_{uv}(z - v, v).$$

Sums of RVs (cont. 2)

Therefore

$$p_z(z) = \int_{-\infty}^{\infty} p_{uv}(z - v, v)\, dv.$$

If $u$ and $v$ are independent, then $p_{uv}(z - v, v) = p_u(z - v)\, p_v(v)$ and

$$p_z(z) = \int_{-\infty}^{\infty} p_u(z - v)\, p_v(v)\, dv.$$

The pdf of the sum of independent RVs is the convolution of the separate pdfs. In terms of characteristic functions, $C_z(\omega) = C_u(\omega)\, C_v(\omega)$.
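The convolution result is easy to verify numerically. The sketch below (my addition; the choice of two unit-rate exponentials is an assumption) convolves the two pdfs on a grid and compares against the known density $z e^{-z}$ of their sum.

```python
# Sketch: pdf of the sum of two independent Exponential(1) RVs via numerical convolution.
import numpy as np

dz = 0.01
z = np.arange(0.0, 20.0, dz)
p_u = np.exp(-z)                               # pdf of u on the grid
p_v = np.exp(-z)                               # pdf of v on the grid

p_z = np.convolve(p_u, p_v)[:len(z)] * dz      # discrete approximation of the convolution
analytic = z * np.exp(-z)                      # the sum is Gamma(2, 1)
print(np.max(np.abs(p_z - analytic)))          # small discretization error
```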

The Gaussian Distribution

A gaussian random variable has a gaussian probability distribution defined by

$$f_x(x) \doteq \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - \langle x \rangle)^2 / 2\sigma^2}. \qquad (4)$$

It is easy to verify that the expected value of $x$ is $\langle x \rangle$ and the variance is $\sigma^2$. A unique property of gaussian random variables is that any weighted superposition of multiple gaussian random variables, whether they are independent or dependent, is also a gaussian random variable.

Area Under the Gaussian

The probability that a unit-variance gaussian random variable exceeds a value $z$, $P\{x > z\}$, can be expressed in terms of the complementary error function, denoted erfc and defined through

$$\frac{1}{\sqrt{2\pi}} \int_{z}^{\infty} e^{-x^2/2}\, dx = \frac{1}{2}\, \mathrm{erfc}\!\left(\frac{z}{\sqrt{2}}\right), \qquad (5)$$

where $\mathrm{erfc}(z) = 1 - \mathrm{erf}(z)$, with $\mathrm{erf}(z)$ being the error function defined as

$$\mathrm{erf}(z) = \frac{2}{\sqrt{\pi}} \int_{0}^{z} e^{-s^2}\, ds.$$

For large arguments, the erfc function can be approximated by

$$\mathrm{erfc}(x) \approx \frac{1}{x\sqrt{\pi}}\, e^{-x^2}. \qquad (6)$$
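For reference, the identity (5) and the asymptotic form (6) can be checked with scipy; the sketch below is my addition with arbitrary test values.

```python
# Sketch: compare the gaussian tail probability with Eq. (5) and approximation (6).
import numpy as np
from scipy.special import erfc
from scipy.stats import norm

for z in (0.5, 1.0, 2.0, 3.0):
    tail = norm.sf(z)                          # P{x > z}, unit-variance gaussian
    via_erfc = 0.5 * erfc(z / np.sqrt(2))      # Eq. (5)
    x = z / np.sqrt(2)
    approx = 0.5 * np.exp(-x**2) / (x * np.sqrt(np.pi))   # Eq. (6), valid for large z
    print(f"z={z}: tail={tail:.5f}  erfc={via_erfc:.5f}  approx={approx:.5f}")
```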

Joint Gaussian Probability Distribution

The two-dimensional joint gaussian probability distribution for two independent random variables with the same variance is given by

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-[(x - \langle x \rangle)^2 + (y - \langle y \rangle)^2]/2\sigma^2}. \qquad (7)$$

If the two gaussian random variables each have zero mean and are correlated so that

$$\rho_{xy} \doteq \langle xy \rangle / \sigma^2, \qquad (8)$$

then the joint probability distribution may be written as

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2\sqrt{1 - \rho_{xy}^2}} \exp\!\left(-\frac{x^2 - 2\rho_{xy} xy + y^2}{2\sigma^2(1 - \rho_{xy}^2)}\right). \qquad (9)$$

If $\rho_{xy} = 0$, then the probability distribution is separable and factors into $f_{xy}(x, y) = f_x(x) f_y(y)$. Therefore, uncorrelated gaussian random variables are independent.

Circularly Symmetric Gaussian Random Variables

A two-dimensional gaussian distribution is often used to model complex-baseband noise. The corresponding random variable is called a complex gaussian random variable. If the two noise components are independent and have equal variance, then the two-dimensional gaussian distribution is circularly symmetric, and the corresponding random variable is called a circularly-symmetric gaussian random variable.

It is possible to have a joint two-dimensional probability distribution that has marginal gaussian distributions but is not jointly gaussian. Knowing that each marginal distribution is gaussian is not sufficient to infer that the joint distribution is gaussian.

Multivariate Gaussian Distribution

The probability distribution for the multivariate gaussian distribution is given by

$$f_{\mathbf{x}}(\mathbf{x}) = \frac{1}{(2\pi)^{N/2}\sqrt{\det \mathbf{C}}}\, e^{-\frac{1}{2}(\mathbf{x} - \langle \mathbf{x} \rangle)^T \mathbf{C}^{-1} (\mathbf{x} - \langle \mathbf{x} \rangle)}, \qquad (10)$$

where $\mathbf{C}$ is the real autocovariance matrix. The matrix is defined as the expectation of the outer product of the column vector with itself after removing the mean:

$$\mathbf{C} = \big\langle (\mathbf{x} - \langle \mathbf{x} \rangle)(\mathbf{x} - \langle \mathbf{x} \rangle)^T \big\rangle. \qquad (11)$$

The on-diagonal matrix element $C_{ii}$ is the variance of the gaussian random variable $x_i$. The off-diagonal matrix element $C_{ij}$ is the covariance of the two gaussian random variables $x_i$ and $x_j$.
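The sketch below (my addition; the mean vector and covariance values are arbitrary) evaluates (10) directly and compares it with scipy's multivariate normal pdf.

```python
# Sketch: evaluate the multivariate gaussian pdf of Eq. (10) and cross-check with scipy.
import numpy as np
from scipy.stats import multivariate_normal

mean = np.array([1.0, -0.5])
C = np.array([[2.0, 0.6],
              [0.6, 1.0]])                     # real autocovariance matrix, Eq. (11)
x = np.array([0.3, 0.2])

d = x - mean
N = len(mean)
pdf_eq10 = np.exp(-0.5 * d @ np.linalg.solve(C, d)) / ((2*np.pi)**(N/2) * np.sqrt(np.linalg.det(C)))
print(pdf_eq10, multivariate_normal(mean, C).pdf(x))   # the two values agree
```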

Example

As an example, let a set of $M$ uncorrelated gaussian random variables have an autocovariance matrix given by $\mathbf{C} = \sigma^2 \mathbf{I}_M$, where $\mathbf{I}_M$ is an $M$ by $M$ identity matrix. Using the matrix identity $\det(\sigma^2 \mathbf{I}_M) = \sigma^{2M}$, the joint probability distribution is

$$f_{\mathbf{x}}(\mathbf{x}) = \frac{1}{(2\pi\sigma^2)^{M/2}}\, e^{-\|\mathbf{x} - \langle \mathbf{x} \rangle\|^2 / 2\sigma^2}. \qquad (12)$$

Circularly-Symmetric Complex Gaussian Random Variables

A multivariate joint gaussian distribution can also be defined in which each component of the column vector is a complex gaussian random variable $z_i = \mathrm{Re}[z_i] + i\, \mathrm{Im}[z_i]$. The complex autocovariance matrix of this vector is defined as

$$\mathbf{W} \doteq \big\langle (\mathbf{z} - \langle \mathbf{z} \rangle)(\mathbf{z} - \langle \mathbf{z} \rangle)^{\dagger} \big\rangle. \qquad (13)$$

If the complex gaussian random variables are jointly gaussian and jointly circularly symmetric, then the real autocovariance matrix $\mathbf{C}$ given in (11) can be expressed in terms of the complex autocovariance matrix $\mathbf{W}$ given in (13) as

$$\mathbf{C} = \frac{1}{2} \begin{bmatrix} \mathrm{Re}\,\mathbf{W} & -\mathrm{Im}\,\mathbf{W} \\ \mathrm{Im}\,\mathbf{W} & \mathrm{Re}\,\mathbf{W} \end{bmatrix}. \qquad (14)$$

(Asked as a problem.)

pdf for a Vector of Circularly-Symmetric Gaussian RVs

The joint probability distribution for a circularly-symmetric multivariate gaussian distribution, expressed in terms of the complex autocovariance matrix, is

$$f_{\mathbf{z}}(\mathbf{z}) = \frac{1}{\pi^N \det \mathbf{W}}\, e^{-(\mathbf{z} - \langle \mathbf{z} \rangle)^{\dagger} \mathbf{W}^{-1} (\mathbf{z} - \langle \mathbf{z} \rangle)}. \qquad (15)$$

Using the properties of determinants, the leading term $(\pi^N \det \mathbf{W})^{-1}$ may be written as $\det(\pi \mathbf{W})^{-1}$. If $\mathbf{W} = 2\sigma^2 \mathbf{I}_M$, then, following the same steps used to derive (12), we have

$$f_{\mathbf{z}}(\mathbf{z}) = \frac{1}{(2\pi\sigma^2)^M}\, e^{-\|\mathbf{z} - \langle \mathbf{z} \rangle\|^2 / 2\sigma^2}. \qquad (16)$$
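As a numerical illustration (my addition; the variance value is arbitrary), the sketch below draws circularly-symmetric complex gaussian samples with $\mathbf{W} = 2\sigma^2 \mathbf{I}$ and estimates the complex autocovariance (13).

```python
# Sketch: circularly-symmetric complex gaussian samples with W = 2*sigma^2*I.
import numpy as np

rng = np.random.default_rng(1)
sigma, M, n = 1.5, 2, 200_000
# Independent real and imaginary parts, each with variance sigma^2,
# so <|z_i|^2> = 2*sigma^2.
z = rng.normal(0, sigma, (n, M)) + 1j * rng.normal(0, sigma, (n, M))

W_est = (z.T @ z.conj()) / n                   # estimate of <z z^dagger> (zero-mean case)
print(np.round(W_est, 2))                      # approximately 2*sigma^2 * I = 4.5 * I here
```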

Decorrelation Example

A new set of decorrelated gaussian random variables may be defined by using a coordinate transformation matrix $\mathbf{T}$ that diagonalizes the autocovariance matrix $\mathbf{C}$. The uncorrelated gaussian random variables in the new coordinate system are then independent, but may have changed mean values and variances.

Consider a complex gaussian random variable defined by (9) with $\rho_{xy} \neq 0$. The joint probability distribution $f_{xy}(x, y)$ for these correlated gaussian random variables is shown below.

[Figure: (a) the joint pdf $f_{xy}(x, y)$ in the original $(x, y)$ coordinates; (b) the same pdf in the rotated $(x', y')$ coordinates.]

Example (cont.)

To determine the transformation that produces decorrelated random variables, define a new set of random variables $x'$ and $y'$ such that

$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \mathbf{A} \begin{bmatrix} x \\ y \end{bmatrix},$$

where $\mathbf{A}$ is the matrix that diagonalizes the autocovariance matrix $\mathbf{C}$, given as

$$\mathbf{C} = \begin{bmatrix} \sigma_x^2 & \rho_{xy}\sigma_x\sigma_y \\ \rho_{xy}\sigma_x\sigma_y & \sigma_y^2 \end{bmatrix}.$$

The matrix $\mathbf{A}$ is formed from the eigenvectors of $\mathbf{C}$. If $\sigma_x^2 = \sigma_y^2 = \sigma^2$, then $x' = \frac{1}{\sqrt{2}}(x + y)$ and $y' = \frac{1}{\sqrt{2}}(x - y)$. The eigenvalues of $\mathbf{C}$ are the variances of the decorrelated gaussian random variables and are given by $\sigma_{x'}^2 = \sigma^2(1 + \rho_{xy})$ and $\sigma_{y'}^2 = \sigma^2(1 - \rho_{xy})$.

Decorrelated Gaussian Distribution

The decorrelated joint gaussian distribution is

$$f_{x'y'}(x', y') = \left[\frac{1}{\sqrt{2\pi\sigma^2(1 + \rho_{xy})}}\, e^{-x'^2/2\sigma^2(1 + \rho_{xy})}\right]\left[\frac{1}{\sqrt{2\pi\sigma^2(1 - \rho_{xy})}}\, e^{-y'^2/2\sigma^2(1 - \rho_{xy})}\right],$$

where the functional form is written to show that the probability distribution in the new coordinate system is separable as $f_{x'y'}(x', y') = f_{x'}(x')\, f_{y'}(y')$. Therefore $x'$ and $y'$ are independent gaussian random variables.
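The decorrelation can be reproduced numerically. The sketch below (my addition; the σ and ρ values are arbitrary) rotates correlated gaussian samples with the eigenvectors of $\mathbf{C}$ and checks the resulting variances.

```python
# Sketch: decorrelate samples of (x, y) with the eigenvectors of C and check the variances.
import numpy as np

rng = np.random.default_rng(2)
sigma, rho, n = 1.0, 0.7, 500_000
C = sigma**2 * np.array([[1.0, rho],
                         [rho, 1.0]])

xy = rng.multivariate_normal([0.0, 0.0], C, size=n)    # correlated samples
eigvals, A = np.linalg.eigh(C)                         # columns of A are eigenvectors of C
xpyp = xy @ A                                          # rotated samples (x', y')

print(np.round(np.cov(xpyp.T), 3))   # ~diag(sigma^2*(1 - rho), sigma^2*(1 + rho)), off-diagonal ~0
print(np.round(eigvals, 3))          # the same variances, from the eigenvalues
```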

Correlated Gaussian Random Variables

Two jointly gaussian random variables, each with zero mean, have the joint pdf

$$p_{uv}(u, v) = \frac{1}{2\pi\sigma^2\sqrt{1 - \rho^2}} \exp\!\left[-\frac{u^2 + v^2 - 2uv\rho}{2\sigma^2(1 - \rho^2)}\right], \qquad \rho = \frac{\langle uv \rangle}{\sigma^2}.$$

Uncorrelated jointly gaussian RVs are independent (one of the rare times that zero correlation implies independence). Sums of statistically independent gaussian RVs are also gaussian, with the total variance being the sum of the individual variances. Sums of dependent gaussian RVs are also gaussian, but the variance of the sum is not the sum of the individual variances. If $\sigma$ is the same for each distribution, the total variance varies from $2\sigma^2$ when the two are independent ($\rho = 0$) to $4\sigma^2$ when the two are completely correlated ($\rho = 1$).
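The variance statement can be checked directly; the sketch below (my addition, with arbitrary σ and ρ) confirms that the variance of the sum is $2\sigma^2(1 + \rho)$.

```python
# Sketch: variance of u + v for zero-mean jointly gaussian u, v with correlation rho.
import numpy as np

rng = np.random.default_rng(3)
sigma, rho, n = 2.0, 0.5, 1_000_000
C = sigma**2 * np.array([[1.0, rho],
                         [rho, 1.0]])
u, v = rng.multivariate_normal([0.0, 0.0], C, size=n).T

print(np.var(u + v), 2 * sigma**2 * (1 + rho))         # both about 12.0 for these values
```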

Central Limit Theorem

Let the sum of $n$ independent RVs be defined by

$$z = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \frac{U_i - \langle u_i \rangle}{\sigma_i},$$

where it is assumed that $z$ has unit variance and zero mean. As $n \to \infty$, the pdf of $z$ approaches a gaussian distribution regardless of the underlying pdfs, or

$$\lim_{n \to \infty} p_z(z) = \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}.$$

Central Limit Theorem (cont.)

The result is only true in the limit. Finite sums may not approach a gaussian, especially in the tails. The contribution of each individual distribution must be small.
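A small Monte Carlo experiment (my addition; uniform summands are an arbitrary choice) shows both the convergence of the bulk and the slower convergence in the tails.

```python
# Sketch: normalized sum of n iid Uniform(0,1) RVs versus the gaussian limit.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n, trials = 30, 200_000
u = rng.uniform(0.0, 1.0, (trials, n))
mu, sig = 0.5, np.sqrt(1.0 / 12.0)                     # mean and std of Uniform(0,1)
z = (u - mu).sum(axis=1) / (np.sqrt(n) * sig)          # zero mean, unit variance

print(np.mean(np.abs(z) < 1), 1 - 2 * norm.sf(1))      # bulk: close agreement
print(np.mean(z > 3), norm.sf(3))                      # tail: agreement is typically poorer here
```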

Random Phasors

Recall that a single component of the field is real:

$$U(z, t) = A(z, t) \cos[\omega t - kz + \phi(z, t)].$$

At a specific point in space $r$ and a specific time $t$, the amplitude of the field is $u = A \cos[\phi]$, where $A$ and $\phi$ are random variables. Let $u = \mathrm{Re}\{U\}$, where $U = A e^{j\phi}$ is a random complex phasor. Consider it as a vector in the complex plane.

Phasor Sums

Consider the sum of a large number of random phasors added to a constant phasor $A$, as shown below.

[Figure: two complex-plane diagrams with real (in-phase) and imaginary (quadrature) axes: the sum of many independent random vectors added to a constant phasor, and the resulting joint gaussian pdf.]

Sums of Phasors (cont. 1)

Write down the real and imaginary parts of the phasor (constant plus fluctuation):

$$r = \mathrm{Re}\{a\} = A\cos(\phi) + \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \cos\phi_k$$

$$i = \mathrm{Im}\{a\} = A\sin(\phi) + \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \sin\phi_k$$

Both $r$ and $i$ are sums of many independent contributions, and thus, as asserted by the central limit theorem, the pdf of each is gaussian; it can also be shown that $r$ and $i$ are jointly gaussian.

Mean of the Phasor Sum

Now calculate the mean, variance, and correlation of the random phasor sum. The mean value of the distribution is given by

$$\langle r \rangle = \left\langle \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \alpha_k \cos\phi_k \right\rangle = \frac{1}{\sqrt{N}} \sum_{k=1}^{N} \langle \alpha_k \rangle \langle \cos\phi_k \rangle \quad \text{(from independence)}$$

$$= \sqrt{N}\, \langle \alpha \rangle \langle \cos\phi \rangle \quad \text{(from iid)} = 0 \qquad (\phi \text{ uniform over } (-\pi, \pi)).$$

Variance of the Phasor Sum

Start with the definition of variance: $\sigma_r^2 = \langle r^2 \rangle - \langle r \rangle^2$. Since $\langle r \rangle = 0$ from before, $\sigma_r^2 = \langle r^2 \rangle$, and

$$\langle r^2 \rangle = \frac{1}{N} \sum_{n=1}^{N} \sum_{k=1}^{N} \langle \alpha_n \cos\phi_n\, \alpha_k \cos\phi_k \rangle = \frac{1}{N} \sum_{n=1}^{N} \sum_{k=1}^{N} \langle \alpha^2 \rangle \langle \cos\phi_n \cos\phi_k \rangle \quad \text{(independence, identical distributions)}.$$

For $n = k$: $\langle \cos\phi_n \cos\phi_k \rangle = \frac{1}{2}$ (the mean of $\cos^2\phi$ is $\frac{1}{2}$ for $\phi$ uniform).

For $n \neq k$: $\langle \cos\phi_n \cos\phi_k \rangle = 0$ (the terms add out of phase).

Variance and Correlation of the Phasor Sum

The variance then becomes

$$\langle r^2 \rangle = \langle i^2 \rangle = \frac{\langle \alpha^2 \rangle}{2} = \sigma^2.$$

The correlation is

$$\langle r\, i \rangle - \langle r \rangle \langle i \rangle = \frac{1}{N} \sum_{k=1}^{N} \sum_{n=1}^{N} \langle \alpha^2 \rangle \langle \cos\phi_n \sin\phi_k \rangle = 0.$$

Joint Distribution of the Phasor Sum

A sum of a large number of random phasors produces a jointly gaussian process:

$$f_{xy}(x, y) = \frac{1}{2\pi\sigma^2}\, e^{-[(x - \langle x \rangle)^2 + (y - \langle y \rangle)^2]/2\sigma^2}, \qquad (17)$$

with $\sigma^2 = \langle \alpha^2 \rangle / 2$; the variances of the two components are the same. The components $x$ and $y$ are zero mean, jointly gaussian, and independent.
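Finally, a direct simulation (my addition; unit amplitudes $\alpha_k = 1$ are an assumed simplification) reproduces the statistics derived above.

```python
# Sketch: random phasor sum with alpha_k = 1 and phi_k uniform on (-pi, pi).
import numpy as np

rng = np.random.default_rng(5)
N, trials = 256, 20_000
phi = rng.uniform(-np.pi, np.pi, (trials, N))
a = np.exp(1j * phi).sum(axis=1) / np.sqrt(N)          # a = (1/sqrt(N)) * sum of unit phasors

r, i = a.real, a.imag
print(np.mean(r), np.mean(i))                          # both near 0
print(np.var(r), np.var(i))                            # both near <alpha^2>/2 = 0.5
print(np.mean(r * i))                                  # near 0: r and i are uncorrelated
```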