Continuous Random Variables


1 / 24 Continuous Random Variables
Saravanan Vijayakumaran
sarva@ee.iitb.ac.in
Department of Electrical Engineering
Indian Institute of Technology Bombay
February 27, 2013

2 / 24 Continuous Random Variables

Definition
A random variable X is called continuous if its distribution function can be expressed as

\[ F(x) = \int_{-\infty}^{x} f(u)\,du \quad \text{for all } x \in \mathbb{R} \]

for some integrable function f : ℝ → [0, ∞) called the probability density function of X.

Example (Uniform random variable)
Ω = [a, b], X(ω) = ω

\[ f(x) = \begin{cases} \frac{1}{b-a} & \text{for } a \le x \le b \\ 0 & \text{otherwise} \end{cases} \qquad F(x) = \begin{cases} 0 & x < a \\ \frac{x-a}{b-a} & a \le x \le b \\ 1 & x > b \end{cases} \]
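
As a quick sanity check of the uniform example, the sketch below evaluates f and F with SciPy. This is a minimal sketch, assuming SciPy is installed; the interval [a, b] = [2, 5] is an arbitrary illustrative choice.

```python
from scipy.stats import uniform

a, b = 2.0, 5.0
U = uniform(loc=a, scale=b - a)  # SciPy parametrizes Uniform(loc, loc + scale)

print(U.pdf(3.0))                # f(3) = 1/(b-a) = 1/3
print(U.cdf(3.0))                # F(3) = (3-a)/(b-a) = 1/3
print(U.cdf(1.0), U.cdf(6.0))    # F is 0 below a and 1 above b
```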

3 / 24 Probability Density Function

The numerical value f(x) is not a probability. It can be larger than 1.
f(x) dx can be interpreted as the probability P(x < X ≤ x + dx), since

\[ P(x < X \le x + dx) = F(x + dx) - F(x) \approx f(x)\,dx \]

\[ P(a \le X \le b) = \int_{a}^{b} f(x)\,dx \qquad \int_{-\infty}^{\infty} f(x)\,dx = 1 \]

P(X = x) = 0 for all x ∈ ℝ
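
A short numerical illustration of the first two points, assuming SciPy is available: the Uniform(0, 1/2) density takes the value 2 on its support, yet still integrates to 1.

```python
from scipy.integrate import quad
from scipy.stats import uniform

U = uniform(loc=0.0, scale=0.5)      # Uniform on [0, 1/2], so f(x) = 2 there
print(U.pdf(0.25))                   # 2.0: a density value may exceed 1
total, _ = quad(U.pdf, -1.0, 2.0, points=[0.0, 0.5])
print(total)                         # ~1.0: the density still integrates to 1
```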

4 / 24 Independence

Continuous random variables X and Y are independent if the events {X ≤ x} and {Y ≤ y} are independent for all x and y in ℝ.
If X and Y are independent, then the random variables g(X) and h(Y) are independent.
Let the joint probability distribution function of X and Y be F(x, y) = P(X ≤ x, Y ≤ y). Then X and Y are said to be jointly continuous random variables with joint pdf f_{X,Y}(x, y) if for all x, y in ℝ

\[ F(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\,dv\,du \]

X and Y are independent if and only if

\[ f_{X,Y}(x, y) = f_X(x) f_Y(y) \quad \text{for all } x, y \in \mathbb{R} \]

5 / 24 Expectation

The expectation of a continuous random variable X with density function f is given by

\[ E(X) = \int_{-\infty}^{\infty} x f(x)\,dx \]

whenever this integral is finite.
If X and g(X) are continuous random variables, then

\[ E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x)\,dx \]

If a, b ∈ ℝ, then E(aX + bY) = aE(X) + bE(Y).
If X and Y are independent, E(XY) = E(X)E(Y).
If k is a positive integer, the kth moment m_k of X is defined to be m_k = E(X^k).
The kth central moment σ_k is σ_k = E[(X − m_1)^k].
The second central moment σ_2 = E[(X − m_1)^2] is called the variance.
For a non-negative continuous RV X,

\[ E(X) = \int_{0}^{\infty} [1 - F(x)]\,dx \]

The Cauchy-Schwarz inequality holds for continuous random variables.
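
The tail-integral formula for a non-negative RV can be checked numerically. A minimal sketch, assuming SciPy; the Exponential(1) variable (with E(X) = 1) is an illustrative choice.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import expon

X = expon()  # Exponential with rate 1, so E(X) = 1

tail_integral, _ = quad(lambda x: 1 - X.cdf(x), 0, np.inf)   # int_0^inf [1 - F(x)] dx
direct, _ = quad(lambda x: x * X.pdf(x), 0, np.inf)          # int_0^inf x f(x) dx
print(tail_integral, direct)                                 # both ~1.0
```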

Gaussian Random Variables

7 / 24 Gaussian Random Variable

Definition
A continuous random variable with pdf of the form

\[ p(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right), \quad -\infty < x < \infty, \]

where µ is the mean and σ² is the variance.

[Figure: plot of the standard Gaussian density p(x) against x]
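
The formula above can be compared directly against SciPy's built-in normal density. A minimal sketch; µ = 1 and σ = 2 are arbitrary illustrative values.

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 1.0, 2.0
x = np.linspace(-4, 6, 5)
manual = np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
print(np.allclose(manual, norm(loc=mu, scale=sigma).pdf(x)))  # True
```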

8 / 24 Notation

N(µ, σ²) denotes a Gaussian distribution with mean µ and variance σ².
X ∼ N(µ, σ²) means X is a Gaussian RV with mean µ and variance σ².
X ∼ N(0, 1) is termed a standard Gaussian RV.

9 / 24 Affine Transformations Preserve Gaussianity

Theorem
If X is Gaussian, then aX + b is Gaussian for a, b ∈ ℝ, a ≠ 0.

Remarks
If X ∼ N(µ, σ²), then aX + b ∼ N(aµ + b, a²σ²).
If X ∼ N(µ, σ²), then (X − µ)/σ ∼ N(0, 1).
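
A Monte Carlo sketch of the first remark, assuming NumPy; the constants µ, σ, a, b are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, a, b = 2.0, 3.0, -1.5, 4.0
x = rng.normal(mu, sigma, size=1_000_000)
y = a * x + b                       # affine transformation of a Gaussian sample
print(y.mean(), a * mu + b)         # sample mean ~ a*mu + b
print(y.var(), a**2 * sigma**2)     # sample variance ~ a^2 * sigma^2
```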

10 / 24 CDF and CCDF of Standard Gaussian

Cumulative distribution function of X ∼ N(0, 1):

\[ \Phi(x) = P[X \le x] = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{t^2}{2} \right) dt \]

Complementary cumulative distribution function of X ∼ N(0, 1):

\[ Q(x) = P[X > x] = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{t^2}{2} \right) dt \]

[Figure: standard Gaussian density p(t), with Φ(x) the area to the left of x and Q(x) the area to the right]

11 / 24 Properties of Q(x)

Φ(x) + Q(x) = 1
Q(−x) = Φ(x) = 1 − Q(x)
Q(0) = 1/2, Q(∞) = 0, Q(−∞) = 1

For X ∼ N(µ, σ²):

\[ P[X > \alpha] = Q\left( \frac{\alpha - \mu}{\sigma} \right), \qquad P[X < \alpha] = Q\left( \frac{\mu - \alpha}{\sigma} \right) \]
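
In SciPy, Φ is norm.cdf and Q is norm.sf (the survival function). A minimal sketch checking the properties above; the threshold α and the parameters µ, σ are illustrative.

```python
from scipy.stats import norm

Q, Phi = norm.sf, norm.cdf           # Q(x) = P[N(0,1) > x], Phi(x) = P[N(0,1) <= x]
print(Q(0.0))                        # 0.5
print(Phi(1.0) + Q(1.0))             # 1.0

mu, sigma, alpha = 1.0, 2.0, 2.5
print(norm(mu, sigma).sf(alpha), Q((alpha - mu) / sigma))  # P[X > alpha] two ways
```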

12 / 24 Jointly Gaussian Random Variables

Definition (Jointly Gaussian RVs)
Random variables X_1, X_2, ..., X_n are jointly Gaussian if every non-trivial linear combination is a Gaussian random variable:

\[ a_1 X_1 + \cdots + a_n X_n \text{ is Gaussian for all } (a_1, \ldots, a_n) \in \mathbb{R}^n \setminus \{0\} \]

Example (Not Jointly Gaussian)
X ∼ N(0, 1)

\[ Y = \begin{cases} X, & \text{if } |X| > 1 \\ -X, & \text{if } |X| \le 1 \end{cases} \]

Y ∼ N(0, 1), but X + Y is not Gaussian: X + Y equals 0 whenever |X| ≤ 1, so it has an atom at 0.

13 / 24 Gaussian Random Vector

Definition (Gaussian Random Vector)
A random vector X = (X_1, ..., X_n)^T whose components are jointly Gaussian.

Notation
X ∼ N(m, C) where m = E[X] and C = E[(X − m)(X − m)^T].

Definition (Joint Gaussian Density)
If C is invertible, the joint density is given by

\[ p(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^n \det(\mathbf{C})}} \exp\left( -\frac{1}{2} (\mathbf{x} - \mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x} - \mathbf{m}) \right) \]
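
The joint density formula can be checked against scipy.stats.multivariate_normal. A minimal sketch; the 2-D mean vector m, covariance C, and evaluation point x are illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal

m = np.array([1.0, -1.0])
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])
x = np.array([0.3, 0.7])

d = x - m
manual = np.exp(-0.5 * d @ np.linalg.inv(C) @ d) / np.sqrt(
    (2 * np.pi) ** 2 * np.linalg.det(C))
print(manual, multivariate_normal(mean=m, cov=C).pdf(x))  # equal
```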

14 / 24 Uncorrelated Jointly Gaussian RVs are Independent

If X_1, ..., X_n are jointly Gaussian and pairwise uncorrelated, then they are independent. In this case C is diagonal with entries σ_i², and the joint density factors:

\[ p(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^n \det(\mathbf{C})}} \exp\left( -\frac{1}{2} (\mathbf{x} - \mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x} - \mathbf{m}) \right) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left( -\frac{(x_i - m_i)^2}{2\sigma_i^2} \right) \]

where m_i = E[X_i] and σ_i² = var(X_i).

15 / 24 Uncorrelated Gaussian RVs may not be Independent

Example
X ∼ N(0, 1)
W is equally likely to be +1 or −1
W is independent of X
Y = WX
Y ∼ N(0, 1)
X and Y are uncorrelated, since E(XY) = E(W)E(X²) = 0
X and Y are not independent, since |Y| = |X|
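
A Monte Carlo sketch of this example, assuming NumPy: the sample correlation is near zero, yet a joint probability fails to factor into the product of the marginals.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(size=n)
w = rng.choice([-1.0, 1.0], size=n)   # W = +/-1, independent of X
y = w * x

print(np.mean(x * y))                 # ~0: X and Y are uncorrelated
joint = np.mean((np.abs(x) < 1) & (np.abs(y) < 1))   # P(|X|<1, |Y|<1)
product = np.mean(np.abs(x) < 1) * np.mean(np.abs(y) < 1)
print(joint, product)                 # ~0.68 vs ~0.47: not independent
```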

Conditional Distribution and Density Functions

17 / 24 Conditional Distribution Function

For discrete RVs, the conditional distribution was defined as F_{Y|X}(y|x) = P(Y ≤ y | X = x) for any x such that P(X = x) > 0.
For continuous RVs, P(X = x) = 0 for all x.
But considering an interval around x such that f_X(x) > 0, we have

\[ P(Y \le y \mid x \le X \le x + dx) = \frac{P(Y \le y,\ x \le X \le x + dx)}{P(x \le X \le x + dx)} \approx \frac{\int_{v=-\infty}^{y} f(x, v)\,dx\,dv}{f_X(x)\,dx} = \int_{v=-\infty}^{y} \frac{f(x, v)}{f_X(x)}\,dv \]

Definition
The conditional distribution function of Y given X = x is the function F_{Y|X}(· | x) given by

\[ F_{Y|X}(y \mid x) = \int_{v=-\infty}^{y} \frac{f(x, v)}{f_X(x)}\,dv \]

for any x such that f_X(x) > 0. It is sometimes denoted by P(Y ≤ y | X = x).

18 / 24 Conditional Density Function

Definition
The conditional density function of Y given X = x is given by

\[ f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)} \]

for any x such that f_X(x) > 0.

Example (Bivariate Standard Normal Distribution)
X and Y are continuous random variables with joint density given by

\[ f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\left( -\frac{1}{2(1 - \rho^2)} \left( x^2 - 2\rho x y + y^2 \right) \right) \]

where −1 < ρ < 1. Equivalently, [X Y]^T ∼ N(m, C) where

\[ \mathbf{m} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \qquad \mathbf{C} = \begin{bmatrix} 1 & \rho \\ \rho & 1 \end{bmatrix} \]

What are the marginal densities of X and Y? What is the conditional density f_{Y|X}(y|x)?
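
For reference, the standard answers are that X and Y are each marginally N(0, 1) and that f_{Y|X}(· | x) is the N(ρx, 1 − ρ²) density. The sketch below checks this numerically by forming f(x, y)/f_X(x) from the formulas above; the values of ρ, x, y are illustrative, and SciPy is assumed.

```python
import numpy as np
from scipy.stats import norm

rho, x, y = 0.6, 0.8, -0.3
joint = np.exp(-(x**2 - 2*rho*x*y + y**2) / (2*(1 - rho**2))) / (
    2 * np.pi * np.sqrt(1 - rho**2))
conditional = joint / norm.pdf(x)     # f(x, y) / f_X(x), with f_X the N(0,1) density
print(conditional, norm(rho * x, np.sqrt(1 - rho**2)).pdf(y))  # equal
```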

19 / 24 Conditional Expectation

Definition
The conditional expectation of Y given X = x is given by

\[ E(Y \mid X = x) = \int_{-\infty}^{\infty} y\, f_{Y|X}(y \mid x)\,dy \]

Theorem
The conditional expectation ψ(X) = E(Y|X) satisfies

\[ E[E(Y \mid X)] = E(Y) \]

Example (Bivariate Standard Normal Distribution)
X and Y are continuous random variables with joint density given by

\[ f(x, y) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\left( -\frac{1}{2(1 - \rho^2)} \left( x^2 - 2\rho x y + y^2 \right) \right) \]

where −1 < ρ < 1. What is the conditional expectation of Y given X?
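
A Monte Carlo sketch, assuming NumPy, using the standard answer E(Y | X) = ρX for the bivariate standard normal: the sample mean of Y near X = x₀ should be close to ρx₀, and E[ρX] matches E(Y). The values ρ = 0.6 and x₀ = 1 are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 2_000_000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)  # bivariate standard normal pair

x0 = 1.0
print(y[np.abs(x - x0) < 0.02].mean(), rho * x0)  # E(Y | X ~ x0) vs rho * x0
print((rho * x).mean(), y.mean())                 # tower property: both ~0
```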

Functions of Continuous Random Variables

21 / 24 Functions of a Single Random Variable

If X is a continuous random variable with density function f, what is the distribution function of Y = g(X)?

\[ F_Y(y) = P(g(X) \le y) = P\left( X \in g^{-1}(-\infty, y] \right) = \int_{g^{-1}(-\infty, y]} f(x)\,dx \]

Example (Affine transformation)
Let X be a continuous random variable. What are the distribution and density functions of aX + b for a, b ∈ ℝ?

Example (Squaring a Gaussian RV)
Let X ∼ N(0, 1) and let g(x) = x². What are the distribution and density functions of g(X)?
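
For the squaring example, the resulting distribution is chi-squared with one degree of freedom, which the simulation sketch below checks empirically (SciPy assumed; the evaluation point 1.0 is illustrative).

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
y = rng.normal(size=1_000_000) ** 2        # samples of g(X) = X^2 with X ~ N(0,1)
print(np.mean(y <= 1.0), chi2(df=1).cdf(1.0))  # empirical vs exact CDF at 1
```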

22 / 24 Functions of Two Random Variables

Let X_1 and X_2 have the joint density function f. Let Y_1 = g(X_1, X_2) and Y_2 = h(X_1, X_2). What is the joint density function of Y_1 and Y_2?
Let the transformation T : (x_1, x_2) → (y_1, y_2) be one-to-one. Then the transformation has an inverse x_1 = x_1(y_1, y_2) and x_2 = x_2(y_1, y_2) with Jacobian equal to the determinant

\[ J(y_1, y_2) = \begin{vmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_2}{\partial y_1} \\[6pt] \dfrac{\partial x_1}{\partial y_2} & \dfrac{\partial x_2}{\partial y_2} \end{vmatrix} = \frac{\partial x_1}{\partial y_1} \frac{\partial x_2}{\partial y_2} - \frac{\partial x_1}{\partial y_2} \frac{\partial x_2}{\partial y_1} \]

The joint density of Y_1 and Y_2 is given by

\[ f_{Y_1,Y_2}(y_1, y_2) = \begin{cases} f\big(x_1(y_1, y_2),\, x_2(y_1, y_2)\big)\, |J(y_1, y_2)| & \text{if } (y_1, y_2) \text{ is in the range of } T \\ 0 & \text{otherwise} \end{cases} \]

Example
Let Y_1 = aX_1 + bX_2 and Y_2 = cX_1 + dX_2 with ad − bc ≠ 0. What is the joint density of Y_1 and Y_2? (A worked sketch follows below.)
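
A worked sketch for the linear example: inverting the pair of equations and applying the Jacobian formula above gives

\[
x_1 = \frac{d\,y_1 - b\,y_2}{ad - bc}, \qquad
x_2 = \frac{a\,y_2 - c\,y_1}{ad - bc}, \qquad
J(y_1, y_2) =
\begin{vmatrix}
\dfrac{d}{ad-bc} & \dfrac{-c}{ad-bc} \\[6pt]
\dfrac{-b}{ad-bc} & \dfrac{a}{ad-bc}
\end{vmatrix}
= \frac{1}{ad - bc}
\]

\[
f_{Y_1,Y_2}(y_1, y_2)
= \frac{1}{|ad - bc|}\,
  f\!\left( \frac{d\,y_1 - b\,y_2}{ad - bc},\ \frac{a\,y_2 - c\,y_1}{ad - bc} \right)
\]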

23 / 24 Sum of Continuous Random Variables

Theorem
If X and Y have a joint density function f, then X + Y has density function

\[ f_{X+Y}(z) = \int_{-\infty}^{\infty} f(x, z - x)\,dx \]

If X and Y are independent, then

\[ f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z - x)\,dx = \int_{-\infty}^{\infty} f_X(z - y) f_Y(y)\,dy \]

The density function of the sum is the convolution of the marginal density functions.

Example (Sum of Gaussian RVs)
Let X ∼ N(0, 1) and Y ∼ N(0, 1) be independent. What is the density function of X + Y?
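
For the Gaussian example the answer is N(0, 2), which the sketch below checks by evaluating the convolution integral numerically (SciPy assumed; the point z = 0.7 is illustrative).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

z = 0.7
conv, _ = quad(lambda x: norm.pdf(x) * norm.pdf(z - x), -np.inf, np.inf)
print(conv, norm(0, np.sqrt(2)).pdf(z))  # convolution equals the N(0, 2) density
```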

24 / 24 Questions?