EE4601 Communication Systems


Week 2: Review of Probability, Important Distributions

Conditional Probability

Consider two events A and B defined on a sample space. The conditional probability P(A|B) is the probability of the event A given that the event B has occurred:

P(A|B) = P(A \cap B) / P(B)

If A and B are independent events, then P(A \cap B) = P(A)P(B). Hence, P(A|B) = P(A), and the occurrence of event B does not change the probability of occurrence of event A.

Disjoint vs. Independent Events

The probability of the union event "A or B" is

P(A \cup B) = P(A) + P(B) - P(A \cap B)

If events A and B are mutually exclusive or disjoint, then

P(A \cup B) = P(A) + P(B)

Note that mutually exclusive and independent events are two entirely different concepts. In fact, independent events A and B with non-zero probabilities P(A) and P(B) cannot be mutually exclusive, because P(A \cap B) = P(A)P(B) > 0. If they were mutually exclusive, then we would need P(A \cap B) = 0. Intuitively, if the events A and B are mutually exclusive, then the occurrence of the event A precludes the occurrence of the event B. Hence, the knowledge that A has occurred definitely affects the probability that B has occurred, so A and B are not independent.

Bayes' Theorem

Let the events A_i, i = 1, ..., n, be mutually exclusive such that \bigcup_{i=1}^{n} A_i = S, where S is the sample space. Let B be some event with non-zero probability. Then

P(A_i|B) = P(A_i, B) / P(B) = P(B|A_i) P(A_i) / \sum_{j=1}^{n} P(B|A_j) P(A_j)

where we use the notation P(A_i, B) = P(A_i \cap B).

For continuous random variables X and Y with probability density functions f(x) and f(y), the conditional density f(x|y) is

f(x|y) = f(x, y) / f(y) = f(y|x) f(x) / \int f(y|x) f(x) dx

Bayes' Theorem - Example

Suppose that a digital source generates 0s and 1s with unequal probabilities Q(0) = q and Q(1) = 1 - q. The bits are transmitted over a binary symmetric channel (BSC), such that P(1|0) = P(0|1) = p and P(0|0) = P(1|1) = 1 - p, where P(j|k) is the probability that j is received given that k is transmitted.

If a 1 is received, what is the probability that a 0 was transmitted?

P(k|j) = P(j|k) Q(k) / [ P(j|0) Q(0) + P(j|1) Q(1) ]

P(0|1) = P(1|0) Q(0) / [ P(1|0) Q(0) + P(1|1) Q(1) ] = pq / [ pq + (1-p)(1-q) ]

What is the probability of bit error?

P_e = P(1|0) Q(0) + P(0|1) Q(1) = pq + p(1-q) = p
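As a quick numerical sanity check of the BSC example above, the short Python sketch below evaluates the posterior P(0|1) and the error probability P_e; the values q = 0.3 and p = 0.1 are illustrative and not from the slides.

```python
# Numerical check of the BSC Bayes example (illustrative values, not from the slides).
q, p = 0.3, 0.1          # Q(0) = q, crossover probability p

Q0, Q1 = q, 1 - q        # source probabilities
P_1_given_0 = p          # P(1|0)
P_1_given_1 = 1 - p      # P(1|1)

# Posterior probability that a 0 was sent given that a 1 was received
post_0_given_1 = (P_1_given_0 * Q0) / (P_1_given_0 * Q0 + P_1_given_1 * Q1)
print(post_0_given_1)                      # p*q / (p*q + (1-p)*(1-q)) ≈ 0.0455

# Probability of bit error: P(1|0)Q(0) + P(0|1)Q(1) = p
P_e = P_1_given_0 * Q0 + p * Q1            # P(0|1) = p for the BSC
print(P_e)                                 # 0.1 = p
```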

Random Variables

Consider the random variable X. The cumulative distribution function (cdf) of X is

F_X(x) = P(X \le x),   0 \le F_X(x) \le 1

The complementary distribution function (cdfc) of X is

F_X^c(x) = P(X > x) = 1 - F_X(x),   0 \le F_X^c(x) \le 1

The probability density function (pdf) of X is

f_X(x) = dF_X(x)/dx,   F_X(x) = \int_{-\infty}^{x} f_X(y) dy
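These relationships are easy to verify numerically. The sketch below uses an exponential random variable purely as an example (scipy is assumed to be available) to confirm that the cdf is the integral of the pdf and that the pdf is the derivative of the cdf.

```python
# Illustrative check of the cdf/pdf relationships for an exponential RV (example choice).
import numpy as np
from scipy import stats
from scipy.integrate import quad

X = stats.expon(scale=2.0)   # example distribution with mean 2
x = 1.5

# cdf from integrating the pdf: F_X(x) = ∫ f_X(y) dy (support starts at 0 here)
F_from_pdf, _ = quad(X.pdf, 0.0, x)
print(F_from_pdf, X.cdf(x))               # both ≈ 0.5276

# cdfc = 1 - cdf
print(X.sf(x), 1 - X.cdf(x))

# pdf from differentiating the cdf (central difference)
h = 1e-6
print((X.cdf(x + h) - X.cdf(x - h)) / (2 * h), X.pdf(x))
```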

Bivariate Random Variables

Consider two random variables X and Y. The joint cdf of X and Y is

F_{XY}(x, y) = P(X \le x, Y \le y),   0 \le F_{XY}(x, y) \le 1

The joint cdfc of X and Y is

F_{XY}^c(x, y) = P(X > x, Y > y) = 1 - F_X(x) - F_Y(y) + F_{XY}(x, y)

The joint pdf of X and Y is

f_{XY}(x, y) = \partial^2 F_{XY}(x, y) / (\partial x \, \partial y),   F_{XY}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{XY}(u, v) dv du

The marginal pdfs of X and Y are

f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) dy,   f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) dx

The conditional pdfs of X and Y are

f_{X|Y}(x|y) = f_{XY}(x, y) / f_Y(y),   f_{Y|X}(y|x) = f_{XY}(x, y) / f_X(x)

Statistical Averages

Consider any random variable X. The mean of X is

\mu_X = E[X] = \int_{-\infty}^{\infty} x f_X(x) dx

The nth moment of X is

E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x) dx

The variance of X is

\sigma_X^2 = E[(X - \mu_X)^2] = E[X^2 - 2X\mu_X + \mu_X^2] = E[X^2] - 2E[X]\mu_X + \mu_X^2 = E[X^2] - \mu_X^2

Consider any function g(X) of the random variable X. Then

E[g^n(X)] = \int_{-\infty}^{\infty} g^n(x) f_X(x) dx
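The identity \sigma_X^2 = E[X^2] - \mu_X^2 is easy to confirm by simulation; the sketch below uses an arbitrary gamma-distributed example (the shape/scale values and sample size are illustrative, and numpy is assumed available).

```python
# Monte Carlo illustration that sigma^2 = E[X^2] - mu^2 (example distribution and sample size).
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=3.0, scale=2.0, size=1_000_000)   # arbitrary example RV

mu = x.mean()                      # sample estimate of E[X]
second_moment = (x**2).mean()      # sample estimate of E[X^2]

print(second_moment - mu**2)       # ≈ 12, the true variance shape*scale^2
print(x.var())                     # same quantity computed directly
```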

Joint Moments

Consider a pair of random variables X and Y. The joint moment of X and Y is

E[X^i Y^j] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^i y^j f_{XY}(x, y) dx dy

The covariance of X and Y is

cov[X, Y] = E[(X - \mu_X)(Y - \mu_Y)] = E[XY - X\mu_Y - Y\mu_X + \mu_X\mu_Y] = E[XY] - \mu_X\mu_Y

The correlation coefficient of X and Y is

\rho = cov[X, Y] / (\sigma_X \sigma_Y)

Two random variables X and Y are uncorrelated iff cov[X, Y] = 0. Note that independent implies uncorrelated; the converse does not hold in general. Two random variables X and Y are orthogonal iff E[XY] = 0.
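The distinction between uncorrelated and independent can be seen numerically. The sketch below (an illustrative example, not from the slides) estimates the covariance and correlation coefficient for the dependent pair Y = X^2 with X standard normal: the covariance is essentially zero even though Y is completely determined by X.

```python
# Uncorrelated does not imply independent: Y = X^2 with X ~ N(0,1) (illustrative example).
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x**2                                         # fully dependent on x

cov_xy = np.mean(x * y) - x.mean() * y.mean()    # E[XY] - mu_X mu_Y
rho = cov_xy / (x.std() * y.std())

print(cov_xy)   # ≈ 0: X and Y are (essentially) uncorrelated
print(rho)      # ≈ 0, yet Y is a deterministic function of X
```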

Characteristic Functions

Consider the random variable X. The characteristic or moment generating function of X is

\Phi_X(v) = E[e^{jvX}] = \int_{-\infty}^{\infty} f_X(x) e^{jvx} dx

Except for the sign of the exponent in the integrand, the characteristic function is just the Fourier transform of the pdf. Taking the derivative of both sides n times and setting v = 0 gives

d^n \Phi_X(v) / dv^n |_{v=0} = (j)^n \int_{-\infty}^{\infty} x^n f_X(x) dx

Recognizing the integral on the R.H.S. as the nth moment, we have

(-j)^n d^n \Phi_X(v) / dv^n |_{v=0} = E[X^n]

The pdf is the inverse Fourier transform (note the change in sign of the exponent):

f_X(x) = (1/2\pi) \int_{-\infty}^{\infty} \Phi_X(v) e^{-jvx} dv
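As a sketch of the moment-generation formula, the following uses sympy to differentiate the known closed-form characteristic function of a Gaussian X ~ N(\mu, \sigma^2) and recover its first two moments; the symbolic setup is an illustration and not part of the lecture notes.

```python
# Moments from the characteristic function of X ~ N(mu, sigma^2) (symbolic illustration).
import sympy as sp

v, mu, sigma = sp.symbols('v mu sigma', real=True)
Phi = sp.exp(sp.I*mu*v - sigma**2 * v**2 / 2)   # characteristic function of N(mu, sigma^2)

def moment(n):
    # E[X^n] = (-j)^n d^n/dv^n Phi_X(v) evaluated at v = 0
    return sp.simplify((-sp.I)**n * sp.diff(Phi, v, n).subs(v, 0))

print(moment(1))   # mu
print(moment(2))   # mu**2 + sigma**2
```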

Joint Characteristic Functions

Consider the random variables X and Y. The joint characteristic function is

\Phi_{XY}(v_1, v_2) = E[e^{jv_1 X + jv_2 Y}] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) e^{jv_1 x + jv_2 y} dx dy

If X and Y are independent, then

\Phi_{XY}(v_1, v_2) = E[e^{jv_1 X}] E[e^{jv_2 Y}] = \int_{-\infty}^{\infty} f_X(x) e^{jv_1 x} dx \int_{-\infty}^{\infty} f_Y(y) e^{jv_2 y} dy = \Phi_X(v_1) \Phi_Y(v_2)

Moments can be generated according to

E[XY] = -\partial^2 \Phi_{XY}(v_1, v_2) / (\partial v_1 \partial v_2) |_{v_1 = v_2 = 0}

with higher-order moments generated in a straightforward extension.
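As a symbolic sanity check of the joint-moment formula, the sketch below assumes the standard closed form of the zero-mean bivariate Gaussian characteristic function, \Phi_{XY}(v_1, v_2) = \exp\{-(\sigma_1^2 v_1^2 + 2c v_1 v_2 + \sigma_2^2 v_2^2)/2\} with c = cov[X, Y]; this particular choice of \Phi_{XY} is illustrative.

```python
# E[XY] from the joint characteristic function of a zero-mean bivariate Gaussian (illustration).
import sympy as sp

v1, v2, s1, s2, c = sp.symbols('v1 v2 sigma1 sigma2 c', real=True)
Phi = sp.exp(-(s1**2*v1**2 + 2*c*v1*v2 + s2**2*v2**2) / 2)   # joint characteristic function

EXY = -sp.diff(Phi, v1, v2).subs({v1: 0, v2: 0})
print(sp.simplify(EXY))   # c, i.e. E[XY] = cov[X,Y] for zero-mean X, Y
```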

Binomial Distribution

Let X be a Bernoulli random variable such that X = 0 with probability 1 - p and X = 1 with probability p. Although X is a discrete random variable with an associated probability distribution function, it is possible to treat X as a continuous random variable with a probability density function (pdf) by using Dirac delta functions. The pdf of X is

p_X(x) = (1 - p)\delta(x) + p\delta(x - 1)

Let Y = \sum_{i=1}^{n} X_i, where the X_i are independent and identically distributed (iid) Bernoulli random variables. Then the random variable Y is an integer from the set {0, 1, ..., n} and Y has the binomial probability distribution function

p_Y(k) = P(Y = k) = \binom{n}{k} p^k (1 - p)^{n-k},   k = 0, 1, ..., n

Using Dirac delta functions, the binomial random variable Y has the pdf

f_Y(y) = \sum_{k=0}^{n} \binom{n}{k} p^k (1 - p)^{n-k} \delta(y - k)
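A small numerical sketch of this result: the empirical distribution of a sum of iid Bernoulli variables matches the closed-form binomial pmf. The parameters n = 10 and p = 0.3 below are illustrative, and numpy/scipy are assumed available.

```python
# Sum of iid Bernoulli RVs has a binomial pmf (illustrative parameters).
import numpy as np
from scipy import stats

n, p = 10, 0.3
rng = np.random.default_rng(2)

# Y = X_1 + ... + X_n with X_i ~ Bernoulli(p)
y = rng.binomial(1, p, size=(1_000_000, n)).sum(axis=1)

k = np.arange(n + 1)
empirical = np.bincount(y, minlength=n + 1) / y.size
print(np.round(empirical, 4))
print(np.round(stats.binom.pmf(k, n, p), 4))   # closed-form binomial pmf, nearly identical
```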

Bernoulli and Binomial RVs [figure slide]

Gaussian Random Variables

A real-valued Gaussian random variable X ~ N(\mu, \sigma^2) has the pdf

f_X(x) = \frac{1}{\sqrt{2\pi}\sigma} e^{-(x - \mu)^2 / (2\sigma^2)}

where \mu = E[X] is the mean and \sigma^2 = E[(X - \mu)^2] is the variance. The random variable X ~ N(0, 1) has a standard normal density. The cumulative distribution function (cdf) of X, F_X(x), is

F_X(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}\sigma} e^{-(y - \mu)^2 / (2\sigma^2)} dy

The complementary distribution function (cdfc), F_X^c(x) = 1 - F_X(x), of a standard normal random variable defines the Gaussian Q function

Q(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-y^2/2} dy

while its cdf defines the Gaussian \Phi function

\Phi(x) = 1 - Q(x)

Gaussian RV [figure slide]

Gaussian Random Variables

If X is a non-standard normal random variable, X ~ N(\mu, \sigma^2), then

F_X(x) = \Phi((x - \mu)/\sigma),   F_X^c(x) = Q((x - \mu)/\sigma)

The error function erf(x) and the complementary error function erfc(x) are defined by

erfc(x) = \frac{2}{\sqrt{\pi}} \int_{x}^{\infty} e^{-y^2} dy,   erf(x) = \frac{2}{\sqrt{\pi}} \int_{0}^{x} e^{-y^2} dy

Note that erfc(x) = 1 - erf(x). The complementary error function and the Q function are related as follows:

erfc(x) = 2Q(\sqrt{2} x),   Q(x) = \frac{1}{2} erfc(x/\sqrt{2})
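A quick numerical check of these relationships (scipy is assumed available; the test point x = 1.2 and the non-standard parameters are arbitrary):

```python
# Check the relations between Phi, Q, erf and erfc at an arbitrary point.
import numpy as np
from scipy import stats
from scipy.special import erf, erfc

x = 1.2
Q = stats.norm.sf(x)                                 # Q(x) = P(X > x) for X ~ N(0,1)

print(Q, 0.5 * erfc(x / np.sqrt(2)))                 # Q(x) = (1/2) erfc(x/sqrt(2))
print(erfc(x), 2 * stats.norm.sf(np.sqrt(2) * x))    # erfc(x) = 2 Q(sqrt(2) x)
print(erfc(x), 1 - erf(x))                           # erfc(x) = 1 - erf(x)
print(stats.norm.cdf(x), 1 - Q)                      # Phi(x) = 1 - Q(x)

# Non-standard case: F_X(x) = Phi((x - mu)/sigma) for X ~ N(mu, sigma^2)
mu, sigma = 2.0, 3.0
print(stats.norm.cdf(x, loc=mu, scale=sigma), stats.norm.cdf((x - mu) / sigma))
```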

Multivariate Gaussian Distribution

Let X_i ~ N(\mu_i, \sigma_i^2), i = 1, ..., n, be correlated real-valued Gaussian random variables having covariances

\mu_{X_i X_j} = E[(X_i - \mu_i)(X_j - \mu_j)] = E[X_i X_j] - \mu_i \mu_j,   1 \le i, j \le n

Let

X = (X_1, X_2, ..., X_n)^T
x = (x_1, x_2, ..., x_n)^T
\mu_X = (\mu_1, \mu_2, ..., \mu_n)^T

\Lambda = \begin{pmatrix} \mu_{X_1 X_1} & \cdots & \mu_{X_1 X_n} \\ \vdots & \ddots & \vdots \\ \mu_{X_n X_1} & \cdots & \mu_{X_n X_n} \end{pmatrix}

where X^T is the transpose of X.

Multivariate Gaussian Distribution

The joint pdf of X defines the multivariate Gaussian distribution

f_X(x) = \frac{1}{(2\pi)^{n/2} |\Lambda|^{1/2}} \exp\left\{ -\frac{1}{2} (x - \mu_X)^T \Lambda^{-1} (x - \mu_X) \right\}

where |\Lambda| is the determinant of \Lambda.
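A minimal numerical sketch evaluating this formula directly and comparing against scipy's multivariate_normal; the mean vector and covariance matrix below are arbitrary illustrative values.

```python
# Evaluate the multivariate Gaussian pdf directly and via scipy (illustrative parameters).
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0, 0.5])
Lam = np.array([[2.0, 0.3, 0.1],
                [0.3, 1.0, 0.2],
                [0.1, 0.2, 1.5]])        # covariance matrix (symmetric, positive definite)
x = np.array([0.5, -1.0, 0.0])
n = len(mu)

d = x - mu
quad_form = d @ np.linalg.inv(Lam) @ d
pdf_manual = np.exp(-quad_form / 2) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Lam)))

print(pdf_manual)
print(multivariate_normal(mean=mu, cov=Lam).pdf(x))   # same value
```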

Bivariate Gaussian Distribution

For the case of two Gaussian random variables with equal variances, \sigma_1^2 = \sigma_2^2 = \sigma^2,

\mu_X = (\mu_1, \mu_2)^T,   \Lambda = \sigma^2 \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}

where \rho = \mu_{12} / (\sigma_1 \sigma_2) = \mu_{12} / \sigma^2. Then |\Lambda| = \sigma^4 (1 - \rho^2) and

\Lambda^{-1} = \frac{\sigma^2}{|\Lambda|} \begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix} = \frac{1}{\sigma^2 (1 - \rho^2)} \begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix}

With \mu_X = (0, 0)^T we have

f_{X_1, X_2}(x_1, x_2) = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \exp\left\{ -\frac{1}{2\sigma^2 (1 - \rho^2)} (x_1, x_2) \begin{pmatrix} 1 & -\rho \\ -\rho & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} \right\}
   = \frac{1}{2\pi\sigma^2 \sqrt{1 - \rho^2}} \exp\left\{ -\frac{x_1^2 - 2\rho x_1 x_2 + x_2^2}{2\sigma^2 (1 - \rho^2)} \right\}
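The zero-mean closed form above can be cross-checked numerically against the general multivariate formula; the values \sigma = 1.5 and \rho = 0.6 below are arbitrary.

```python
# Check the zero-mean bivariate Gaussian closed form against scipy (arbitrary sigma, rho).
import numpy as np
from scipy.stats import multivariate_normal

sigma, rho = 1.5, 0.6
x1, x2 = 0.7, -0.4

pdf_closed = np.exp(-(x1**2 - 2*rho*x1*x2 + x2**2) / (2 * sigma**2 * (1 - rho**2))) \
             / (2 * np.pi * sigma**2 * np.sqrt(1 - rho**2))

Lam = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
print(pdf_closed)
print(multivariate_normal(mean=[0.0, 0.0], cov=Lam).pdf([x1, x2]))   # same value
```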

Bivariate Gaussian Distribution [figure slides: joint pdf for σ_X = σ_Y = 1 with ρ_XY = 0, 0.3, 0.7, and 0.95]

Examples

Suppose that X ~ N(\sqrt{E}, N_o/2). What is the probability that X < 0?

Answer:

P(X < 0) = P(X > 2\sqrt{E}) = Q\left( \frac{2\sqrt{E} - \mu_X}{\sigma_X} \right) = Q\left( \frac{\sqrt{E}}{\sqrt{N_o/2}} \right) = Q\left( \sqrt{\frac{2E}{N_o}} \right)

The first line follows from the fact that the pdf of X is symmetric about its mean \sqrt{E}.
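A numerical check of this result; the values E = 2.0 and N_o = 1.0 are illustrative.

```python
# Numerical check: P(X < 0) = Q(sqrt(2E/No)) for X ~ N(sqrt(E), No/2) (illustrative values).
import numpy as np
from scipy import stats

E, No = 2.0, 1.0
mean, std = np.sqrt(E), np.sqrt(No / 2)

print(stats.norm.cdf(0.0, loc=mean, scale=std))     # P(X < 0) computed directly
print(stats.norm.sf(np.sqrt(2 * E / No)))           # Q(sqrt(2E/No)), same value
```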

Examples

Suppose that X and Y are independent identically distributed Gaussian random variables with mean \sqrt{E} and variance N_o/2. What is the probability of the joint event that X < 0 and Y < 0?

Answer:

P(X < 0, Y < 0) = P(X > 2\sqrt{E}, Y > 2\sqrt{E}) = P(X > 2\sqrt{E}) P(Y > 2\sqrt{E}) = Q^2\left( \sqrt{\frac{2E}{N_o}} \right)

The second line follows from the fact that X and Y are independent.

Examples

Suppose that X and Y are independent identically distributed Gaussian random variables with mean \mu and variance \sigma^2. What are the mean and variance of the random variable XY?

Answer: We could find the pdf of the random variable XY and then compute its moments. However, there is a much easier approach:

\mu_{XY} = \mu_X \mu_Y = \mu^2

\sigma_{XY}^2 = E[(XY - \mu_{XY})^2] = E[(XY)^2] - 2E[XY]\mu_{XY} + \mu_{XY}^2 = E[X^2] E[Y^2] - \mu^4 = (\sigma^2 + \mu^2)^2 - \mu^4 = \sigma^4 + 2\mu^2\sigma^2
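These moments are easy to confirm by simulation; the values \mu = 1.0 and \sigma = 0.5 below are illustrative.

```python
# Monte Carlo check of the mean and variance of XY for iid X, Y ~ N(mu, sigma^2).
import numpy as np

mu, sigma = 1.0, 0.5
rng = np.random.default_rng(3)
x = rng.normal(mu, sigma, size=2_000_000)
y = rng.normal(mu, sigma, size=2_000_000)
z = x * y

print(z.mean(), mu**2)                                # E[XY] = mu^2
print(z.var(), sigma**4 + 2 * mu**2 * sigma**2)       # sigma^4 + 2 mu^2 sigma^2
```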