ECE 450 Lecture #9, Part 1


ECE 450 - Lecture #9, Part 1: Pairs of Random Variables, continued

Overview:
- Joint PDFs
- A Discrete Example (PDF for a Pair of Discrete RVs)
- Jointly Gaussian RVs
- Conditional Probability Density Functions (Again)
- Bayes Rule for PDFs
- Mnemonic Table: Conditional Probabilities vs. Conditional PDFs
- Communications Example: Signal + Noise

Pairs of RVs - A Discrete Example (from Ross, A First Course in Probability)

Suppose that 15 percent of the families in a certain community have no children, 20% have 1 child, 35% have 2 children, and 30% have 3 children. Also suppose that each child is equally likely to be a boy or a girl, independent of family size.

Experiment: Choose a family at random from the community.

RVs: Let B be the # of boys in the family, and let G be the # of girls. Find the joint pdf, f_BG(b, g).

Note: both RVs (B & G) are discrete, and each RV can take the values 0, 1, 2, or 3.

Repeating the givens:
- Fraction of families w/ 0 children: .15
- Fraction of families w/ 1 child: .20
- Fraction of families w/ 2 children: .35
- Fraction of families w/ 3 children: .30

Calculating some probabilities (each assumes Pr(G) = Pr(B) = .5 for any one child):

Pr(B = 0, G = 0) = Pr(no children) = .15

Pr(B = 0, G = 1) = Pr(1 girl and total of 1 child)
                 = Pr(1 girl | 1 child) Pr(1 child) = .5 (.20) = .10

Pr(B = 1, G = 1) = Pr(1 boy, 1 girl | 2 children) Pr(2 children)
                 = Pr(GB or BG | 2 children) (.35) = .5 (.35) = .175

Pr(B = 2, G = 1) = Pr(2 boys, 1 girl | 3 children) Pr(3 children)
                 = Pr(BBG or BGB or GBB | 3 children) (.30) = (3/8)(.30) = .1125

etc.

Table of Joint Probabilities, Pr(B = i, G = j)
(Answers calculated on the previous page are in bold in the original slides.)

  i \ j  |  0      1      2      3     | Row sum = Pr(B = i)
  -------+-----------------------------+--------------------
    0    | .15    .10    .0875  .0375  |  .3750
    1    | .10    .175   .1125  0      |  .3875
    2    | .0875  .1125  0      0      |  .2000
    3    | .0375  0      0      0      |  .0375
  -------+-----------------------------+--------------------
  Col. sum = Pr(G = j):
           .3750  .3875  .2000  .0375

Verification of the other probabilities (those not computed above) will be a homework problem.

The joint pdf for B, G would be 16 delta functions in the x-y (or b-g) plane, with areas given by the corresponding probability numbers. The last row and the last column are the marginal probabilities, Pr(G = j) and Pr(B = i).
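As a cross-check, the whole table can be generated from the givens with a few lines of MATLAB (a sketch of my own, not from the lecture; the variable names are arbitrary):

% Build the joint PMF Pr(B = i, G = j) for the Ross family example.
pc = [0.15 0.20 0.35 0.30];           % pc(k+1) = Pr(family has k children)
P = zeros(4, 4);                      % rows: b = 0..3, cols: g = 0..3
for n = 0:3                           % total number of children
    for b = 0:n                       % number of boys among the n children
        g = n - b;                    % the rest are girls
        % binomial(n, 1/2) boy/girl split, weighted by Pr(n children)
        P(b+1, g+1) = nchoosek(n, b) * 0.5^n * pc(n+1);
    end
end
disp(P)                               % joint table, matches the slide
disp(sum(P, 2)')                      % marginals Pr(B = i): .3750 .3875 .2000 .0375
disp(sum(P, 1))                       % marginals Pr(G = j): same, by symmetry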

Jointly Gaussian RVs

RVs X and Y are said to be jointly Gaussian (or jointly normal) if:

f_XY(x, y) = [1 / (2π σ_X σ_Y √(1 - ρ²))] exp{ -1/(2(1 - ρ²)) [ ((x - m_X)/σ_X)² - 2ρ (x - m_X)(y - m_Y)/(σ_X σ_Y) + ((y - m_Y)/σ_Y)² ] }

Parameters:
- Mean of X = m_X; mean of Y = m_Y
- Standard deviation of X = σ_X; standard deviation of Y = σ_Y
- Correlation coefficient (between X and Y) = ρ

Jointly Gaussian RVs

Claim (to be shown for homework): If we let ρ = 0 in the equation for f_XY(x, y) (repeated below), the joint pdf can be factored into the form:

f_XY(x, y) = f_X(x) f_Y(y)

f_XY(x, y) = [1 / (2π σ_X σ_Y √(1 - ρ²))] exp{ -1/(2(1 - ρ²)) [ ((x - m_X)/σ_X)² - 2ρ (x - m_X)(y - m_Y)/(σ_X σ_Y) + ((y - m_Y)/σ_Y)² ] }

This will prove that: if two jointly normal RVs X and Y have correlation coefficient ρ = 0, then X and Y are independent.
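For reference, here is the first step of that factorization (a sketch; the full verification is the homework). Setting ρ = 0 removes the cross term, so the exponential splits into a product:

\begin{aligned}
f_{XY}(x,y)\big|_{\rho=0}
  &= \frac{1}{2\pi\sigma_X\sigma_Y}
     \exp\left\{-\frac{1}{2}\left[\left(\frac{x-m_X}{\sigma_X}\right)^2
     + \left(\frac{y-m_Y}{\sigma_Y}\right)^2\right]\right\} \\
  &= \underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_X}
     e^{-(x-m_X)^2/(2\sigma_X^2)}}_{f_X(x)}
     \cdot
     \underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_Y}
     e^{-(y-m_Y)^2/(2\sigma_Y^2)}}_{f_Y(y)}
\end{aligned}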

Jointly Gaussian RVs (aka Bivariate Gaussians)

[Two surface plots of the bivariate Gaussian pdf:
 Left: m_X = m_Y = 0; σ_X = σ_Y = 1; ρ = 0.
 Right: m_X = m_Y = 0; σ_X = σ_Y = 1; ρ = .9.]

Note: the ρ = 0 case is a bell-shaped surface with circular cross-sections. The bell is "squished" when x & y are highly correlated, giving elliptical cross-sections.

MATLAB Code for the 2-Dimensional Gaussian PDF Function

% function pdf = Gauss_2d(mx, my, varx, vary, r)
% Generates plot of 2-d Gaussian pdf
function pdf = Gauss_2d(mx, my, varx, vary, r)
stdx = sqrt(varx); stdy = sqrt(vary);
maxstd = max(stdx, stdy);
xend = mx + 4*maxstd;  xstart = mx - 4*maxstd;  xstep = (xend - xstart)/100;
yend = my + 4*maxstd;  ystart = my - 4*maxstd;  ystep = (yend - ystart)/100;
A = 1/(2*pi*stdx*stdy*sqrt(1 - r^2));
[x, y] = meshgrid(xstart : xstep : xend, ystart : ystep : yend);
exparg = -1/(2*(1 - r^2));
trm1 = ((x - mx).^2)./varx;
trm2 = 2*r*(x - mx).*(y - my)./(stdx*stdy);
trm3 = ((y - my).^2)./vary;
pdf = A * exp(exparg*(trm1 - trm2 + trm3));
surf(x, y, pdf)    % surface plot of pdf(x,y)
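Called with the parameter sets from the previous slide, the function should reproduce the two surface plots (a usage sketch; the call syntax follows the signature above):

figure; Gauss_2d(0, 0, 1, 1, 0);     % rho = 0: circular cross-sections
figure; Gauss_2d(0, 0, 1, 1, 0.9);   % rho = .9: elliptical cross-sections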

Conditional pdfs - Again

Recall: f(x | M) = (d/dx) F(x | M).

Special case: If M = {a < X ≤ b}, then

f(x | M) = f(x) / Pr(a < X ≤ b),  a < x ≤ b

i.e., the same shape as the original pdf over the region (a, b).

Now the event M will be defined in terms of a 2nd RV, Y.

Example: If M is the event {Y ≤ y}, then

F(x | M) = F(x | Y ≤ y) = Pr(X ≤ x, Y ≤ y) / Pr(M) = F(x, y) / F_Y(y)

Similarly: If M is the event {y_1 < Y ≤ y_2}, then

F(x | M) = F(x | y_1 < Y ≤ y_2) = Pr(X ≤ x, M) / Pr(M) = [F(x, y_2) - F(x, y_1)] / [F_Y(y_2) - F_Y(y_1)]

Conditional pdfs - Again

In the previous cases, the event M had non-zero probability, so Pr(M) made sense in the denominator.

Problem: How do we handle an event M with probability 0, as in M = {Y = y}, when Y is a continuous RV?

F(x | M) = F(x | Y = y) = Pr(X ≤ x, Y = y) / Pr(Y = y) = 0/0 ?

Instead, try:

F(x | Y = y) = lim_{Δy → 0} F(x | y < Y ≤ y + Δy)

Carrying out the limit and differentiating yields (subscripts on the LHS are often omitted):

f(x | y) = f(x, y) / f_Y(y)   and   f(y | x) = f(x, y) / f_X(x)

See Cooper & McGillem, 3rd ed., p. 15 for the proof.
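The limiting definition can be visualized numerically: condition on a thin strip {y_0 < Y ≤ y_0 + Δy} and histogram X. A MATLAB sketch (the parameters are my own choices; it uses the standard fact that for zero-mean, unit-variance jointly Gaussian RVs, X | Y = y ~ N(ρy, 1 - ρ²)):

% Approximate f(x | y0 < Y <= y0 + dy) by Monte Carlo; compare to f(x | y0).
rho = 0.9;  y0 = 1;  dy = 0.05;  M = 1e6;
Ys = randn(M, 1);                               % Y ~ N(0, 1)
Xs = rho*Ys + sqrt(1 - rho^2)*randn(M, 1);      % (X, Y) jointly Gaussian, corr rho
strip = (Ys > y0) & (Ys <= y0 + dy);            % the conditioning event
histogram(Xs(strip), 'Normalization', 'pdf'); hold on
x = linspace(-2, 4, 200);                       % theoretical f(x | y0):
plot(x, exp(-(x - rho*y0).^2 / (2*(1 - rho^2))) / sqrt(2*pi*(1 - rho^2)), 'r')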

Bayes Rule for PDFs

Shortened notation: sometimes we write f(x | Y = y) as f(x|y), and f(y | X = x) as f(y|x).

So from the previous page we would write:

f(x|y) = f(x, y) / f_Y(y)   and   f(y|x) = f(x, y) / f_X(x)

f(x, y) = f(x|y) f_Y(y)     and   f(x, y) = f(y|x) f_X(x)

Equating the RHSs of the two equations above yields:

f(x|y) = f(y|x) f_X(x) / f_Y(y)

Alternate form:

f(y|x) = f(x|y) f_Y(y) / f_X(x)

Mnemonic: Table of Similar Equations

                   Conditional Probabilities            Conditional pdfs
Definition (1)     Pr(A|B) = Pr(A,B) / Pr(B)            f(x|y) = f(x,y) / f_Y(y)
Definition (2)     Pr(B|A) = Pr(A,B) / Pr(A)            f(y|x) = f(x,y) / f_X(x)
Total Prob. (1)    Pr(B) = Σ_i Pr(B|A_i) Pr(A_i)        f_Y(y) = ∫ f(y|x) f_X(x) dx
Total Prob. (2)    Pr(A) = Σ_i Pr(A|B_i) Pr(B_i)        f_X(x) = ∫ f(x|y) f_Y(y) dy
Bayes Rule (1)     Pr(A|B) = Pr(B|A) Pr(A) / Pr(B)      f(x|y) = f(y|x) f_X(x) / f_Y(y)
Bayes Rule (2)     Pr(B|A) = Pr(A|B) Pr(B) / Pr(A)      f(y|x) = f(x|y) f_Y(y) / f_X(x)
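The continuous total-probability entries can be checked numerically. A sketch for the signal-plus-noise model introduced on the next slides, with the illustrative (my own) choices X ~ N(0,1) and independent noise N ~ N(0,1), so that Y = X + N ~ N(0,2) exactly:

% Check f_Y(y) = integral of f(y|x) f_X(x) dx for Y = X + N.
y  = 1.3;                                       % test point
fN = @(n) exp(-n.^2/2)/sqrt(2*pi);              % noise pdf, N(0, 1)
fX = @(x) exp(-x.^2/2)/sqrt(2*pi);              % pdf of X, N(0, 1)
fY_num   = integral(@(x) fN(y - x).*fX(x), -Inf, Inf);  % uses f(y|x) = fN(y - x)
fY_exact = exp(-y^2/4)/sqrt(4*pi);              % N(0, 2) pdf at y
fprintf('numerical: %.6f   exact: %.6f\n', fY_num, fY_exact)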

Mixed Form of Bayes Rule (mixing probabilities of events and pdfs)

Let A be an event and let X be a RV with pdf f_X(x). Then:

Pr(A | X = x) = f_X(x | A) Pr(A) / f_X(x)

and

f_X(x | A) = Pr(A | X = x) f_X(x) / Pr(A)
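A quick numeric instance of the mixed form (hypothetical numbers, not from the lecture): let A be the event that signal x_1 = -5 was sent, with Pr(A) = 1/2, and let X be the received value with Gaussian noise of variance 2:

% Posterior Pr(A | X = x) for A = {x1 = -5 sent}, via the mixed Bayes rule.
x = -0.8;  varN = 2;                            % observed value, noise variance
g  = @(x, m) exp(-(x - m).^2/(2*varN))/sqrt(2*pi*varN);  % N(m, varN) pdf
fx = 0.5*g(x, -5) + 0.5*g(x, +5);               % total probability: f_X(x)
PrA_given_x = 0.5*g(x, -5) / fx;                % mixed form of Bayes rule
fprintf('Pr(A | X = %.1f) = %.4f\n', x, PrA_given_x)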

Signal + Noise - An Example

        N
        |
        v
X ---> (+) ---> Y = X + N

Suppose that signal X is tx'd over an additive noise channel, so that the received signal is Y = X + N. Here Y is called the observable. The specific observed value (say Y = y_1) is considered a good estimate for the desired value, X.

Goal: Find f(x|y). For Bayes Rule, we need f(y|x). But since Y = X + N, and X is given (as implied by f(y|x)), the randomness in Y is all in N, and is modeled by f_N(n), so:

f(y|x) = f_N(n = y - x) = f_N(y - x)    (*)
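The shift property (*) is easy to see by simulation: for a fixed x, the histogram of Y = x + N is just the noise pdf re-centered at x (a sketch with arbitrary choices of x = 3 and unit-variance noise):

% For fixed X = x, Y = x + N has pdf f_N(y - x), centered at x.
x = 3;  n = randn(1e6, 1);                      % noise ~ N(0, 1)
Y = x + n;                                      % received values given X = x
histogram(Y, 'Normalization', 'pdf'); hold on
y = linspace(x - 4, x + 4, 200);
plot(y, exp(-(y - x).^2/2)/sqrt(2*pi), 'r')     % f_N(y - x)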

Signal + Noise - Specific Example: MAP Receivers for Binary Communication

Consider communicating one of two messages (m_0 or m_1) over the additive white noise channel:

m_i ---> [Modulator] ---x_i---> (Σ) ---y_i---> [Rcvr] ---> m̂_i
         (in transmitter)        ^
                                 |
                                 n

Modulator: maps messages or symbols to waveforms.

From communication theory, the maximum a posteriori probability (or MAP) decision rule for the receiver (which yields minimum Pr(error) in additive white noise for equally likely signals), based on the observed value y_0, is:

Choose m̂ = m_1 iff Pr(m_1 | Y = y_0) > Pr(m_0 | Y = y_0)

Signal + Noise - MAP Receivers, continued

m_i ---> [Mod.] ---x_i---> (Σ) ---y_i---> [Rcvr] ---> m̂_i
                            ^
                            |
                            n

The MAP receiver decision rule is based on the observable y:

Choose m̂ = m_1 iff Pr(m_1 | Y = y) > Pr(m_0 | Y = y)

⇔ f(y | m_1) Pr(m_1) / f(y) > f(y | m_0) Pr(m_0) / f(y)    (mixed Bayes rule)

⇔ f_N(y - x_1) Pr(m_1) > f_N(y - x_0) Pr(m_0)              (using (*))

Assume Pr(m_1) = Pr(m_0) = 1/2:

⇔ f_N(y - x_1) > f_N(y - x_0)

MAP Rcvr: Binary Case, Equally Likely Signals

So far: choose m̂ = m_1 iff f_N(y - x_1) > f_N(y - x_0).

Now suppose that the modulator mapping (from messages to signals) is:
- For message m_0: tx signal x_0 = +5 volts
- For message m_1: tx signal x_1 = -5 volts

And suppose that the noise is Gaussian, N(0, σ²) with σ = sqrt(2). The MAP rule becomes: choose m̂ = m_1 iff f_N(y + 5) > f_N(y - 5), i.e.,

[1/(σ√(2π))] exp{-(y + 5)²/(2σ²)} > [1/(σ√(2π))] exp{-(y - 5)²/(2σ²)}

MAP Rcvr: Binary Case, Equally Likely Signals (+5, -5) in AWGN

So far: choose m̂ = m_1 iff f_N(y + 5) > f_N(y - 5). Canceling the common factor and comparing the exponents:

(y + 5)² < (y - 5)²  ⇔  y² + 10y + 25 < y² - 10y + 25  ⇔  20y < 0  ⇔  y < 0

Receiver implementation, with comparator:

Choose m̂ = m_0 if y > 0;  choose m̂ = m_1 if y < 0.
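To close the loop, the comparator receiver can be checked by Monte Carlo simulation (a sketch; the simulation setup is my own, and the theoretical error rate Q(5/σ) follows from the distance between each signal and the threshold):

% Monte Carlo test of the comparator receiver: x = +/-5 V, noise ~ N(0, 2).
Nbits = 1e6;  sigma = sqrt(2);
bits  = rand(Nbits, 1) < 0.5;              % 1 -> m1, 0 -> m0, equally likely
x     = 5 - 10*bits;                       % m0 -> +5 V, m1 -> -5 V
y     = x + sigma*randn(Nbits, 1);         % received observable y = x + n
m1hat = (y < 0);                           % comparator: decide m1 iff y < 0
Pe_sim    = mean(m1hat ~= bits);           % simulated Pr(error)
Pe_theory = 0.5*erfc((5/sigma)/sqrt(2));   % Q(5/sigma), about 2.0e-4
fprintf('simulated Pe = %.2e, theory = %.2e\n', Pe_sim, Pe_theory)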