
JOINT DENSITIES - RANDOM VECTORS - REVIEW

Joint densities describe probability distributions of a random vector $X$: an $n$-dimensional vector of random variables, i.e., $X = (X_1, \dots, X_n)$, where all the $X_i$ are random variables.

Discrete random vectors are described by the joint probability density function of the $X_i$, $i \in \{1, \dots, n\}$ (the joint pdf, also called the joint probability mass function, or pmf), denoted by
$$P(X = x) = P(s \in S : X_i(s) = x_i \text{ for all } i) = p_X(x_1, x_2, \dots).$$
Computing probabilities: for any event $A \subseteq \mathbb{N}^n$,
$$P(X \in A) = \sum_{a \in A} P(X = a).$$

Continuous random vectors are described by the joint probability density function of the $X_i$ (the joint pdf), denoted by $f_X(x_1, \dots, x_n)$. The pdf has the following properties:
1. $f_X(x_1, \dots, x_n) \ge 0$ for every $x \in \mathbb{R}^n$.
2. $\int \cdots \int f_X(x_1, \dots, x_n)\, dx_1 \cdots dx_n = 1$.
3. For an event $A \subseteq \mathbb{R}^n$, $P(X \in A) = \int_A f_X(x)\, dx = \int \cdots \int_A f_X(x_1, \dots, x_n)\, dx_1 \cdots dx_n$.

Marginal distributions. Let $X$ be a continuous/discrete random vector having a joint distribution with pdf/pmf $f(x)$. The one-dimensional distributions of the $X_i$ are called marginal distributions. We compute the marginal distributions as follows. If $X$ is a discrete random vector, the marginal pmf of $X_i$ is obtained by summing the joint pmf over all values of the other coordinates:
$$p_{X_i}(k_i) = \sum_{\text{all } x_j,\ j \ne i} p_X(x_1, \dots, x_n), \quad \text{with } x_i = k_i.$$
If $X$ is a continuous random vector (see p. 172), the marginal pdf of $X_i$ is
$$f_{X_i}(x_i) = \underbrace{\int \cdots \int f(x)\, dx_1 \cdots dx_n}_{\text{integrate over all } x_j,\ j \ne i}.$$

Joint cdf of a vector X. The joint cumulative distribution function of $X$ is defined by
$$F_X(x) = P(X_1 \le x_1, \dots, X_n \le x_n).$$

Theorem. Let $F_{X,Y}(u, v)$ be the joint cdf of the vector $(X, Y)$. Then the joint pdf of $(X, Y)$, $f_{X,Y}$, is given by the second mixed partial derivative of the cdf, that is,
$$f_{X,Y}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{X,Y}(x, y).$$
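To make the marginalization step concrete, here is a minimal sketch in Python (an assumed example, not from the text): the joint pmf of a hypothetical discrete vector $(X_1, X_2)$ is stored as a 2-D array, and each marginal pmf is obtained by summing over the other coordinate.

```python
import numpy as np

# Hypothetical joint pmf of (X1, X2): entry [i, j] = P(X1 = i, X2 = j).
p_joint = np.array([[0.10, 0.20, 0.10],
                    [0.05, 0.25, 0.30]])

assert np.isclose(p_joint.sum(), 1.0)   # property 2: total probability is 1

# Marginal pmfs: sum the joint pmf over all values of the other coordinate.
p_X1 = p_joint.sum(axis=1)   # P(X1 = i) = sum_j P(X1 = i, X2 = j)
p_X2 = p_joint.sum(axis=0)   # P(X2 = j) = sum_i P(X1 = i, X2 = j)

# Probability of an event, e.g. A = {X1 = 1, X2 >= 1}:
print(p_X1, p_X2, p_joint[1, 1:].sum())
```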

INDEPENDENT RANDOM VARIABLES

Definition. Two random variables are called independent iff for all intervals $A$ and $B$ on the real line,
$$P(X \in A \text{ and } Y \in B) = P(X \in A)\, P(Y \in B).$$

Theorem. The random variables $X_i$ are independent iff their joint pdf is the product of the marginal pdfs, that is,
$$f_{X_1, X_2, \dots, X_n}(x_1, x_2, \dots, x_n) = f_{X_1}(x_1)\, f_{X_2}(x_2) \cdots f_{X_n}(x_n),$$
where $f_{X_1, \dots, X_n}(x_1, \dots, x_n)$ is the joint pdf of the vector $(X_1, X_2, \dots, X_n)$, and $f_{X_1}(x_1), f_{X_2}(x_2), \dots, f_{X_n}(x_n)$ are the marginal pdfs of the variables $X_1, X_2, \dots, X_n$.

NOTE: Random variables $X$ and $Y$ are independent iff $F_{X,Y}(x, y) = F_X(x)\, F_Y(y)$, where $F_{X,Y}(x, y)$ is the joint cdf of $(X, Y)$, and $F_X(x)$ and $F_Y(y)$ are the marginal cdfs of $X$ and $Y$, respectively.

Random Sample. A random sample of size $n$ from a distribution $f$ is a set $X_1, X_2, \dots, X_n$ of independent and identically distributed (iid) random variables, each with distribution $f$.

CONDITIONAL DISTRIBUTIONS

Let $(X, Y)$ be a random vector with some joint pdf or pmf. Consider the problem of finding the probability that $X = x$ AFTER a value of $Y$ has been observed. To do that we develop the conditional distribution of $X$ given $Y = y$.

Definition. If $(X, Y)$ is a discrete random vector with pmf $p_{X,Y}(x, y)$, and if $P(Y = y) > 0$, then the conditional distribution of $X$ given $Y = y$ is given by the conditional pmf
$$p_{X \mid Y = y}(x) = \frac{p_{X,Y}(x, y)}{p_Y(y)}.$$
Similarly, if $P(X = x) > 0$, then the conditional distribution of $Y$ given $X = x$ is given by the conditional pmf $p_{Y \mid X = x}(y) = p_{X,Y}(x, y) / p_X(x)$.

Definition. If $(X, Y)$ is a continuous random vector with pdf $f_{X,Y}(x, y)$, and if $f_Y(y) > 0$, then the conditional distribution of $X$ given $Y = y$ is given by the conditional pdf
$$f_{X \mid Y = y}(x) = \frac{f_{X,Y}(x, y)}{f_Y(y)}.$$
Similarly, if $f_X(x) > 0$, then the conditional distribution of $Y$ given $X = x$ is given by the conditional pdf $f_{Y \mid X = x}(y) = f_{X,Y}(x, y) / f_X(x)$.
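A minimal sketch (an assumed example) of these definitions for a tabulated joint pmf: the conditional pmf of $X$ given $Y = y$ is a rescaled column of the table, and independence is the statement that the table factors into an outer product of the marginals.

```python
import numpy as np

# Hypothetical joint pmf table: rows index values of X, columns values of Y.
p_joint = np.array([[0.10, 0.20, 0.10],
                    [0.05, 0.25, 0.30]])
p_X = p_joint.sum(axis=1)    # marginal pmf of X
p_Y = p_joint.sum(axis=0)    # marginal pmf of Y

# Conditional pmf of X given Y = y:  p_{X|Y=y}(x) = p_{X,Y}(x, y) / p_Y(y).
y = 2
p_X_given_y = p_joint[:, y] / p_Y[y]
print(p_X_given_y, p_X_given_y.sum())   # a valid pmf: sums to 1

# Independence would require p_{X,Y}(x, y) = p_X(x) p_Y(y) for all (x, y).
print("independent:", np.allclose(p_joint, np.outer(p_X, p_Y)))   # False here
```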

Independence and conditional distributions. If random variables $X$ and $Y$ are independent, then their marginal pdfs/pmfs are the same as their conditional pdfs/pmfs. That is,
$$f_{Y \mid X = x}(y) = f_Y(y) \quad \text{and} \quad f_{X \mid Y = y}(x) = f_X(x),$$
for all $y$ and $x$ where $f_Y(y) > 0$ and $f_X(x) > 0$, respectively.

FUNCTIONS OF RANDOM VARIABLES: PDF OF A SUM, PRODUCT OR QUOTIENT

Let $X$ and $Y$ be independent random variables with pdfs $f_X$ and $f_Y$ (or pmfs $p_X$ and $p_Y$), respectively. Then:

If $X$ and $Y$ are discrete random variables, the pmf of their sum $W = X + Y$ is
$$p_W(w) = \sum_{\text{all } x} p_X(x)\, p_Y(w - x).$$

If $X$ and $Y$ are continuous random variables, the pdf of their sum $W = X + Y$ is the convolution of the individual densities:
$$f_W(w) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(w - x)\, dx.$$

If $X$ and $Y$ are independent continuous random variables, the pdf of their quotient $W = Y/X$ is given by
$$f_W(w) = \int_{-\infty}^{\infty} |x|\, f_X(x)\, f_Y(wx)\, dx.$$
For a proof of the above result, and of the statement below, see pp. 181-182 in the text.

If $X$ and $Y$ are independent continuous random variables, the pdf of their product $W = XY$ is given by
$$f_W(w) = \int_{-\infty}^{\infty} \frac{1}{|y|}\, f_X(w/y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} \frac{1}{|x|}\, f_X(x)\, f_Y(w/x)\, dx.$$
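As a sanity check on the convolution formula for a sum, here is a minimal sketch (an assumed example) that evaluates $f_W(w) = \int f_X(x)\, f_Y(w - x)\, dx$ numerically for independent $X, Y \sim \mathrm{Uniform}(0, 1)$ and compares it with a Monte Carlo estimate; the exact answer is the triangular density $\min(w, 2 - w)$ on $[0, 2]$.

```python
import numpy as np

def f_uniform(x):
    # pdf of Uniform(0, 1)
    return np.where((x >= 0.0) & (x <= 1.0), 1.0, 0.0)

def f_sum(w, n=4000):
    # numerical convolution: integral of f_X(x) * f_Y(w - x) over x
    x = np.linspace(-1.0, 2.0, n)
    dx = x[1] - x[0]
    return np.sum(f_uniform(x) * f_uniform(w - x)) * dx

rng = np.random.default_rng(0)
samples = rng.uniform(size=200_000) + rng.uniform(size=200_000)

w = 0.7
density_conv = f_sum(w)                                    # ~ 0.7 (triangular pdf)
density_mc = np.mean(np.abs(samples - w) < 0.01) / 0.02    # histogram-style estimate
print(density_conv, density_mc)
```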

MORE ON EXPECTATION, VARIANCE, AND COVARIANCE

Expected value of a function. Let $(X, Y)$ be a discrete random vector with pmf $p(x, y)$ or a continuous random vector with pdf $f(x, y)$, and let $g(\cdot, \cdot)$ be a real-valued function of $X$ and $Y$. Then the expected value of the random variable $g(X, Y)$ is given by
$$E(g(X, Y)) = \sum_{\text{all } x} \sum_{\text{all } y} g(x, y)\, p(x, y) \quad \text{in the discrete case, and}$$
$$E(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy \quad \text{in the continuous case,}$$
provided that the sums and the integrals converge absolutely. This generalizes, as expected, to $n$-dimensional random vectors $X$.

Note: Some treat the terms mean and expected value synonymously. I will try to restrict the use of the term mean to refer to the arithmetic mean of a series of numbers.

Expected value of a sum of random variables. Let $X_1, X_2, \dots, X_n$ be any random variables with finite expected values, and let $a_1, a_2, \dots, a_n$ be a set of real numbers. Then
$$E(a_1 X_1 + a_2 X_2 + \cdots + a_n X_n) = a_1 E(X_1) + a_2 E(X_2) + \cdots + a_n E(X_n).$$

Expected value of a product of independent random variables. If $X$ and $Y$ are independent random variables with finite expectations, then $E(XY) = E(X)E(Y)$.

Variance of a sum of independent random variables. Let $X_1, X_2, \dots, X_n$ be independent random variables with finite second moments (i.e., $E(X_i^2) < \infty$), and let $a_1, a_2, \dots, a_n$ be a set of real numbers. Then
$$\mathrm{Var}(a_1 X_1 + a_2 X_2 + \cdots + a_n X_n) = a_1^2\, \mathrm{Var}(X_1) + a_2^2\, \mathrm{Var}(X_2) + \cdots + a_n^2\, \mathrm{Var}(X_n).$$

Covariance. The covariance of random variables $X$ and $Y$ (each with finite variance) is written $\mathrm{Cov}(X, Y)$ or $\sigma(X, Y)$ and defined as
$$\mathrm{Cov}(X, Y) = E\big((X - E(X))(Y - E(Y))\big) = E(XY) - E(X)E(Y).$$
For random vectors $X = (X_1, \dots, X_n)^T$ and $Y = (Y_1, \dots, Y_m)^T$ the covariance matrix is
$$\mathrm{Cov}(X, Y) = E\big((X - E(X))(Y - E(Y))^T\big) = E(XY^T) - E(X)E(Y)^T.$$
The $(i, j)$th element of the covariance matrix is $\mathrm{Cov}(X_i, Y_j)$ (sometimes written $\sigma_{X_i Y_j}$ or simply $\sigma_{ij}$). The $i$th diagonal term of $\mathrm{Cov}(X, X)$ is the variance of $X_i$, $\mathrm{Var}(X_i)$.
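A minimal sketch (an assumed example) checking the two equivalent covariance formulas on simulated data, and computing a covariance matrix whose diagonal entries are the variances, as described above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)   # constructed so that Cov(X, Y) = 0.5

# Cov(X, Y) = E[(X - EX)(Y - EY)] = E[XY] - E[X]E[Y]
cov_centered = np.mean((x - x.mean()) * (y - y.mean()))
cov_moment = np.mean(x * y) - x.mean() * y.mean()
print(cov_centered, cov_moment)    # both approximately 0.5

# Covariance matrix of the vector (X, Y); diagonal terms are Var(X), Var(Y).
print(np.cov(np.stack([x, y]), bias=True))
```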

Additional properties of covariance. For random vectors $X$, $Y$, and $Z$ and a constant $c$:
1. $\sigma(X, c) = \sigma(c, X) = 0$
2. $\sigma(X, Y) = \sigma(Y, X)^T$
3. $\sigma(X + Y, Z) = \sigma(X, Z) + \sigma(Y, Z)$ and $\sigma(X, Y + Z) = \sigma(X, Y) + \sigma(X, Z)$
4. $\sigma(cX, Y) = c\, \sigma(X, Y) = \sigma(X, cY)$

Correlation: If $\mathrm{Cov}(X, Y) = 0$ we say $X$ and $Y$ are uncorrelated (if $\mathrm{Cov}(X, Y) \ne 0$, $X$ and $Y$ are correlated).

Note: Independent random variables are always uncorrelated; however, the reverse isn't generally true! The one general case where uncorrelated implies independent is where the joint distribution of $X$ and $Y$ is multivariate normal (a.k.a. Gaussian).

Note: Correlation is often described using correlation coefficients (e.g., see pg. 576).

MOMENT GENERATING FUNCTIONS

Definition. The moment generating function $M_X(t)$ of a random variable $X$ is given by
$$M_X(t) = E(e^{tX}) = \begin{cases} \displaystyle\sum_{\text{all } k} e^{tk}\, p_X(k) & \text{if } X \text{ is discrete,} \\[1ex] \displaystyle\int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx & \text{if } X \text{ is continuous,} \end{cases}$$
at all real values $t$ for which the expectation exists.

Uses of mgfs. Moment generating functions are very useful in probability and mathematical statistics. They are primarily used for two purposes:
1. Finding moments of random variables, and
2. Identifying distributions of random variables.

Theorem. Let $X$ be a continuous random variable with pdf $f(x)$. Assume that the pdf $f(x)$ is sufficiently smooth for the order of differentiation and integration to be exchanged, and let $M_X(t)$ be the mgf of $X$. Then, provided that the $r$th moment of $X$ exists,
$$M_X^{(r)}(0) = E(X^r).$$
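A minimal sketch (an assumed example) of purpose 1, using sympy: the mgf of the Exponential(1) distribution is $M(t) = 1/(1 - t)$ for $t < 1$, and differentiating it $r$ times at $t = 0$ recovers the moments $E(X^r) = r!$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
M = 1 / (1 - t)   # known mgf of Exponential(1), valid for t < 1

# r-th moment: E[X^r] = M^{(r)}(0)
moments = [sp.diff(M, t, r).subs(t, 0) for r in range(1, 5)]
print(moments)    # [1, 2, 6, 24], i.e. E[X^r] = r!
```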

Theorem: Identifying distributions. Suppose that $X_1$ and $X_2$ are random variables with pdfs/pmfs $f_{X_1}(x)$ and $f_{X_2}(x)$, respectively, and suppose that $M_{X_1}(t) = M_{X_2}(t)$ for all $t$ in a neighborhood of 0. Then $X_1$ and $X_2$ are equal in distribution, that is, $f_{X_1}(x) = f_{X_2}(x)$.

Theorem: Properties of mgfs. This theorem describes the mgf of a linear function and of a sum of independent random variables.
1. Mgf of a linear function. Suppose that $X$ is a random variable with mgf $M_X(t)$, and let $Y = aX + b$, where $a$ and $b$ are real numbers. Then the mgf of $Y$ is given by
$$M_Y(t) = e^{bt} M_X(at).$$
2. Mgf of a sum of independent rvs. Suppose $X_1, X_2, \dots, X_n$ are independent random variables with mgfs $M_{X_1}(t), M_{X_2}(t), \dots, M_{X_n}(t)$, respectively. Then the mgf of their sum is the product of the mgfs, that is,
$$M_{X_1 + X_2 + \cdots + X_n}(t) = M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t).$$
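A minimal sketch (an assumed example) checking both properties by Monte Carlo for independent normal random variables, using the known normal mgf $M(t) = \exp(\mu t + \sigma^2 t^2 / 2)$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.normal(1.0, 2.0, size=n)    # X ~ N(mu=1,  sigma=2)
y = rng.normal(-2.0, 3.0, size=n)   # Y ~ N(mu=-2, sigma=3), independent of X

def mgf_normal(mu, sigma, t):
    return np.exp(mu * t + 0.5 * sigma**2 * t**2)

t = 0.3
# 1. Linear function: mgf of 2X + 1 should equal e^{bt} M_X(at) with a=2, b=1.
print(np.mean(np.exp(t * (2 * x + 1))), np.exp(1 * t) * mgf_normal(1.0, 2.0, 2 * t))

# 2. Sum of independent rvs: mgf of X + Y should equal M_X(t) * M_Y(t).
print(np.mean(np.exp(t * (x + y))), mgf_normal(1.0, 2.0, t) * mgf_normal(-2.0, 3.0, t))
```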