Chapter 5 Random vectors, Joint distributions. Lectures 18-23


In many real-life problems one often encounters multiple random objects simultaneously. For example, one may be interested in the future prices of two different stocks in a stock market. Since the price of one stock can affect the price of the other, it is not advisable to analyze them separately. To model such phenomena, we need to introduce many random variables on a single platform (i.e., a probability space).

First we recall some elementary facts about $d$-dimensional Euclidean space. Let $\mathbb{R}^d = \{x = (x_1, \dots, x_d) : x_i \in \mathbb{R}\}$ with the usual metric
$d(x, y) = \Big( \sum_{i=1}^{d} (x_i - y_i)^2 \Big)^{1/2}.$
A subset $U$ of $\mathbb{R}^d$ is said to be open if for each $x \in U$ there exists an $\varepsilon > 0$ such that $B(x, \varepsilon) \subseteq U$, where $B(x, \varepsilon) = \{y \in \mathbb{R}^d : d(x, y) < \varepsilon\}$. Any open set can be written as a countable union of open sets of the form $(a_1, b_1) \times \cdots \times (a_d, b_d)$, called open rectangles.

Definition 5.1. The $\sigma$-field generated by all open sets in $\mathbb{R}^d$ is called the Borel $\sigma$-field of subsets of $\mathbb{R}^d$ and is denoted by $\mathcal{B}(\mathbb{R}^d)$.

Theorem 5.0.16 Let $\mathcal{C} = \{(-\infty, x_1] \times \cdots \times (-\infty, x_d] : x_1, \dots, x_d \in \mathbb{R}\}$. Then $\sigma(\mathcal{C}) = \mathcal{B}(\mathbb{R}^d)$.

Proof. We prove it for $d = 2$; for $d > 2$ it is similar. Note that
$(-\infty, x_1] \times (-\infty, x_2] = \bigcap_{n=1}^{\infty} \big(-\infty, x_1 + \tfrac{1}{n}\big) \times \big(-\infty, x_2 + \tfrac{1}{n}\big) \in \mathcal{B}(\mathbb{R}^2).$
Hence from the definition of $\sigma(\mathcal{C})$, we have $\sigma(\mathcal{C}) \subseteq \mathcal{B}(\mathbb{R}^2)$. Note that for $a_1 < b_1$, $a_2 < b_2$,
$(a_1, b_1] \times (a_2, b_2] = \big( (-\infty, b_1] \times (-\infty, b_2] \big) \setminus \big[ \big( (-\infty, a_1] \times (-\infty, b_2] \big) \cup \big( (-\infty, b_1] \times (-\infty, a_2] \big) \big] \in \sigma(\mathcal{C}).$

For each $n$ such that $a_i < b_i - \tfrac{1}{n}$ ($i = 1, 2$) we have
$(a_1, b_1) \times (a_2, b_2) = \bigcup_{n} \big( a_1, b_1 - \tfrac{1}{n} \big] \times \big( a_2, b_2 - \tfrac{1}{n} \big].$
Hence all open rectangles are in $\sigma(\mathcal{C})$. Since any open set in $\mathbb{R}^2$ can be rewritten as a countable union of open rectangles, all open sets are in $\sigma(\mathcal{C})$. Therefore from the definition of $\mathcal{B}(\mathbb{R}^2)$, we get $\mathcal{B}(\mathbb{R}^2) \subseteq \sigma(\mathcal{C})$. This completes the proof. (The student is advised to write down the proof for $d > 2$.)

Definition 5.2. Let $(\Omega, \mathcal{F}, P)$ be a probability space. A map $X : \Omega \to \mathbb{R}^d$ is called a random vector if
$X^{-1}(B) \in \mathcal{F} \quad \text{for all } B \in \mathcal{B}(\mathbb{R}^d).$
From now on we set $d = 2$ (for simplicity).

Theorem 5.0.17 $X = (X_1, X_2)$ is a random vector iff $X_1, X_2$ are random variables, where $X_i$ denotes the $i$-th component of $X$.

Proof: Let $X$ be a random vector. For $x \in \mathbb{R}$,
$\{X_1 \le x\} = X^{-1}\big( (-\infty, x] \times \mathbb{R} \big) \in \mathcal{F},$
since $(-\infty, x] \times \mathbb{R} \in \mathcal{B}(\mathbb{R}^2)$. Therefore $X_1$ is a random variable. Similarly, we can show that $X_2$ is a random variable.

Conversely, suppose $X_1, X_2$ are random variables. For $(x_1, x_2) \in \mathbb{R}^2$,
$X^{-1}\big( (-\infty, x_1] \times (-\infty, x_2] \big) = \{X_1 \le x_1\} \cap \{X_2 \le x_2\} \in \mathcal{F}. \quad (5.0.1)$
Set $\mathcal{A} = \{B \in \mathcal{B}(\mathbb{R}^2) : X^{-1}(B) \in \mathcal{F}\}$. By (5.0.1),
$\mathcal{C} \subseteq \mathcal{A}. \quad (5.0.2)$

We claim that $\mathcal{A}$ is a $\sigma$-field. For $B \in \mathcal{A}$, we have
$X^{-1}(B^c) = \big( X^{-1}(B) \big)^c \in \mathcal{F}.$
Hence $B^c \in \mathcal{A}$. Thus $\mathcal{A}$ is closed under complements. Similarly, for $B_1, B_2, \dots \in \mathcal{A}$,
$X^{-1}\Big( \bigcup_{n=1}^{\infty} B_n \Big) = \bigcup_{n=1}^{\infty} X^{-1}(B_n) \in \mathcal{F}.$
Hence $\bigcup_n B_n \in \mathcal{A}$. Thus $\mathcal{A}$ is a $\sigma$-field, and from (5.0.2) we have $\sigma(\mathcal{C}) \subseteq \mathcal{A}$. Therefore from Theorem 5.0.16, we have $\mathcal{B}(\mathbb{R}^2) \subseteq \mathcal{A}$. Hence $X$ is a random vector. This completes the proof.

Theorem 5.0.18 Let $X$ be a random vector on $(\Omega, \mathcal{F}, P)$. On $\mathcal{B}(\mathbb{R}^2)$ define $\mu$ as follows:
$\mu(B) = P\big( X^{-1}(B) \big), \quad B \in \mathcal{B}(\mathbb{R}^2).$
Then $\mu$ is a probability measure on $(\mathbb{R}^2, \mathcal{B}(\mathbb{R}^2))$.

Proof. Since $X^{-1}(\mathbb{R}^2) = \Omega$, we have $\mu(\mathbb{R}^2) = P(\Omega) = 1$. Let $B_1, B_2, \dots$ be pairwise disjoint elements from $\mathcal{B}(\mathbb{R}^2)$. Then $X^{-1}(B_1), X^{-1}(B_2), \dots$ are pairwise disjoint and are in $\mathcal{F}$. Hence
$\mu\Big( \bigcup_{n=1}^{\infty} B_n \Big) = P\Big( \bigcup_{n=1}^{\infty} X^{-1}(B_n) \Big) = \sum_{n=1}^{\infty} P\big( X^{-1}(B_n) \big) = \sum_{n=1}^{\infty} \mu(B_n).$
This completes the proof.

Definition 5.3. The probability measure $\mu$ is called the Law of the random vector $X$ and is denoted by $L(X)$.

Definition 5.4. (joint distribution function)

Let $X = (X_1, X_2)$ be a random vector. Then the function $F : \mathbb{R}^2 \to [0, 1]$ given by
$F(x_1, x_2) = P\{X_1 \le x_1, X_2 \le x_2\}$
is called the joint distribution function of $X$.

Theorem 5.0.19 Let $F$ be the joint distribution function of a random vector $X = (X_1, X_2)$. Then $F$ satisfies the following.
(i) (a) $\lim_{x_1 \to -\infty} F(x_1, x_2) = 0 = \lim_{x_2 \to -\infty} F(x_1, x_2)$
(b) $\lim_{x_1 \to \infty, \, x_2 \to \infty} F(x_1, x_2) = 1$
(ii) $F$ is right continuous in each argument.
(iii) $F$ is nondecreasing in each argument.

The proof of the above theorem is an easy exercise for the student.

Given a random vector $X = (X_1, X_2)$, the distribution function of $X_1$, denoted by $F_{X_1}$, is called the marginal distribution of $X_1$. Similarly the marginal distribution function of $X_2$ is defined. Given the joint distribution function $F$ of $X$, one can recover the corresponding marginal distributions as follows:
$F_{X_1}(x_1) = \lim_{x_2 \to \infty} F(x_1, x_2).$
Similarly,
$F_{X_2}(x_2) = \lim_{x_1 \to \infty} F(x_1, x_2).$
Given only the marginal distribution functions of $X_1$ and $X_2$, in general it is impossible to construct the joint distribution function. Note that the marginal distribution functions don't contain information about the dependence of $X_1$ on $X_2$ and vice versa. One can characterize the independence of $X_1$ and $X_2$ in terms of the joint and marginal distributions as in the following theorem. The proof is beyond the scope of this course.

Theorem 5.0.20 Let $X = (X_1, X_2)$ be a random vector with distribution function $F$. Then $X_1$ and $X_2$ are independent iff
$F(x_1, x_2) = F_{X_1}(x_1)\, F_{X_2}(x_2) \quad \text{for all } (x_1, x_2) \in \mathbb{R}^2.$

Definition 5.5. (joint pmf of discrete random vector) Let $X = (X_1, X_2)$ be a discrete random vector, i.e., $X_1, X_2$ are discrete random variables. Define $f : \mathbb{R}^2 \to [0, 1]$ by
$f(x_1, x_2) = P\{X_1 = x_1, X_2 = x_2\}.$
Then $f$ is called the joint pmf of $X$.
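For discrete random vectors, the ideas above take a concrete computational form: the marginal pmf is obtained by summing the joint pmf over the other coordinate, and the discrete analogue of Theorem 5.0.20 is that independence is equivalent to the joint pmf factorizing into the product of the marginal pmfs. A minimal sketch (the pmf values below are made up for illustration):

```python
from fractions import Fraction

# Hypothetical joint pmf of (X1, X2) on {0,1} x {0,1}, values chosen for illustration.
f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}

# Marginal pmfs: sum the joint pmf over the other coordinate.
f1 = {x1: sum(p for (a, _), p in f.items() if a == x1) for x1 in (0, 1)}
f2 = {x2: sum(p for (_, b), p in f.items() if b == x2) for x2 in (0, 1)}

# Discrete analogue of Theorem 5.0.20:
# X1, X2 independent iff f(x1, x2) = f1(x1) * f2(x2) for all (x1, x2).
independent = all(f[(x1, x2)] == f1[x1] * f2[x2] for x1 in (0, 1) for x2 in (0, 1))
print(f1[0], f2[1], independent)   # 1/2 3/4 True
```

Changing any single entry of the table (while keeping the total mass 1) generally breaks the factorization, illustrating that the marginals alone do not determine the joint pmf.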

Definition 5.6. (joint pdf of continuous random vector) Let $X = (X_1, X_2)$ be a continuous random vector (i.e., $X_1, X_2$ are continuous random variables) with joint distribution function $F$. If there exists a function $f : \mathbb{R}^2 \to [0, \infty)$ such that
$F(x_1, x_2) = \int_{-\infty}^{x_1} \int_{-\infty}^{x_2} f(u, v)\, dv\, du,$
then $f$ is called the joint pdf of $X$.

Theorem 5.0.21 Let $X = (X_1, X_2)$ be a continuous random vector with joint pdf $f$. Then
$P\{X \in B\} = \iint_B f(x_1, x_2)\, dx_1\, dx_2, \quad B \in \mathcal{B}(\mathbb{R}^2).$

Proof. Note that the L.H.S. of the equality corresponds to the law of $X$. Let $\mathcal{F}_0$ denote the set of all finite unions of rectangles in $\mathbb{R}^2$. Then $\mathcal{F}_0$ is a field (exercise for the student). Set
$\mu_1(B) = P\{X \in B\}, \qquad \mu_2(B) = \iint_B f(x_1, x_2)\, dx_1\, dx_2.$
Then $\mu_1, \mu_2$ are probability measures on $\mathcal{B}(\mathbb{R}^2)$ and $\mu_1 = \mu_2$ on $\mathcal{F}_0$. Hence, using the extension theorem, we have $\mu_1 = \mu_2$ on $\mathcal{B}(\mathbb{R}^2)$, i.e.,
$P\{X \in B\} = \iint_B f(x_1, x_2)\, dx_1\, dx_2, \quad B \in \mathcal{B}(\mathbb{R}^2).$

Example 5.0.34 Let $X_1, X_2$ be two random variables with joint pdf given by
$f(x_1, x_2) = \frac{1}{2\pi\sqrt{1 - \rho^2}} \exp\Big( -\frac{x_1^2 - 2\rho x_1 x_2 + x_2^2}{2(1 - \rho^2)} \Big), \quad (x_1, x_2) \in \mathbb{R}^2,$
where $\rho \in (-1, 1)$, $\rho \neq 0$. If $f_1, f_2$ denote the marginal pdfs of $X_1$ and $X_2$ respectively, then
$f_1(x_1) = \int_{-\infty}^{\infty} f(x_1, x_2)\, dx_2 = \frac{1}{\sqrt{2\pi}}\, e^{-x_1^2 / 2}.$

Therefore $X_1 \sim N(0, 1)$. Here $X \sim N(\mu, \sigma^2)$ means $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$. Similarly,
$f_2(x_2) = \int_{-\infty}^{\infty} f(x_1, x_2)\, dx_1 = \frac{1}{\sqrt{2\pi}}\, e^{-x_2^2 / 2}.$
Therefore $X_2 \sim N(0, 1)$. Also note that $X_1$ and $X_2$ are dependent, since $f(x_1, x_2) \neq f_1(x_1)\, f_2(x_2)$; see exercise.

Theorem 5.0.22 Let $X_1, X_2$ be independent random variables with joint pdf $f$. Then the pdf $f_Z$ of $Z = X_1 + X_2$ is given by
$f_Z(z) = (f_1 * f_2)(z),$
where $f_1 * f_2$ denotes the convolution of $f_1$ and $f_2$ and is defined as
$(f_1 * f_2)(z) = \int_{-\infty}^{\infty} f_1(z - x)\, f_2(x)\, dx.$

Proof. Let $F_Z$ denote the distribution function of $Z$. Using independence, i.e., $f(x_1, x_2) = f_1(x_1) f_2(x_2)$, set
$F_Z(z) = P\{X_1 + X_2 \le z\} = \iint_{\{x_1 + x_2 \le z\}} f_1(x_1)\, f_2(x_2)\, dx_1\, dx_2 = \int_{-\infty}^{\infty} f_2(x_2) \Big( \int_{-\infty}^{z - x_2} f_1(x_1)\, dx_1 \Big)\, dx_2.$

Therefore, differentiating with respect to $z$,
$f_Z(z) = \frac{d}{dz} F_Z(z) = \int_{-\infty}^{\infty} f_1(z - x_2)\, f_2(x_2)\, dx_2 = (f_1 * f_2)(z).$
This completes the proof.

Example 5.0.35 Let $X_1, X_2$ be independent exponential random variables with parameters $\lambda_1$ and $\lambda_2$ respectively, $\lambda_1 \neq \lambda_2$. Then $f_1$ is given by
$f_1(x) = \lambda_1 e^{-\lambda_1 x}, \quad x \ge 0,$
and $f_2$ is given similarly. Now for $Z = X_1 + X_2$ and $z < 0$, clearly $f_Z(z) = 0$. For $z \ge 0$,
$f_Z(z) = \int_0^z \lambda_1 e^{-\lambda_1 (z - x)}\, \lambda_2 e^{-\lambda_2 x}\, dx = \frac{\lambda_1 \lambda_2}{\lambda_1 - \lambda_2} \big( e^{-\lambda_2 z} - e^{-\lambda_1 z} \big).$

Conditional Densities. The notion of conditional densities is intended to quantify the dependence of one random variable on the other when the random variables are not independent.

Definition 5.7. Let $X_1, X_2$ be two discrete random variables with joint pmf $f$. Then the conditional density of $X_2$ given $X_1$, denoted by $f_{X_2 | X_1}$, is defined as
$f_{X_2 | X_1}(x_2 \mid x_1) = P\{X_2 = x_2 \mid X_1 = x_1\}, \quad \text{whenever } P\{X_1 = x_1\} > 0.$
Intuitively, $f_{X_2 | X_1}(\cdot \mid x_1)$ means the pmf of $X_2$ given the information about $X_1$. Here information about $X_1$ means knowledge about the occurrence (or non-occurrence) of $\{X_1 = x_1\}$ for each $x_1$. One can rewrite $f_{X_2 | X_1}$ in terms of the pmfs as follows:
$f_{X_2 | X_1}(x_2 \mid x_1) = \frac{f(x_1, x_2)}{f_{X_1}(x_1)}.$
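The closed-form density in Example 5.0.35 can be cross-checked against a direct numerical evaluation of the convolution integral. A minimal sketch, with arbitrarily chosen parameters $\lambda_1 = 1$, $\lambda_2 = 2$ and an arbitrary grid size:

```python
import math

# Numeric check of Example 5.0.35 for lambda1 = 1, lambda2 = 2 (values arbitrary).
lam1, lam2 = 1.0, 2.0

def f_Z_closed(z):
    # Closed form: lam1*lam2/(lam1 - lam2) * (e^{-lam2 z} - e^{-lam1 z}), z >= 0.
    return lam1 * lam2 / (lam1 - lam2) * (math.exp(-lam2 * z) - math.exp(-lam1 * z))

def f_Z_numeric(z, n=100000):
    # Direct convolution integral over (0, z), midpoint Riemann sum.
    h = z / n
    return sum(lam1 * math.exp(-lam1 * (z - x)) * lam2 * math.exp(-lam2 * x) * h
               for x in ((i + 0.5) * h for i in range(n)))

z = 1.5
print(abs(f_Z_closed(z) - f_Z_numeric(z)) < 1e-6)   # True
```

The two agree to within the discretization error of the Riemann sum, as Theorem 5.0.22 predicts.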

Definition 5.8. Let $X_1, X_2$ be continuous random variables with joint pdf $f$. The conditional density of $X_2$ given $X_1 = x_1$ is defined as
$f_{X_2 | X_1}(x_2 \mid x_1) = \frac{f(x_1, x_2)}{f_{X_1}(x_1)}, \quad \text{whenever } f_{X_1}(x_1) > 0.$

Definition 5.9. If $X_1, X_2$ are continuous random variables and $f_{X_2 | X_1}$ denotes the conditional density of $X_2$ given $X_1$, then for $A \in \mathcal{B}(\mathbb{R})$,
$P\{X_2 \in A \mid X_1 = x_1\} = \int_A f_{X_2 | X_1}(x_2 \mid x_1)\, dx_2.$

Example 5.0.36 Let $X_1$ be a uniform random variable over $(0, 1)$ and, given $X_1 = x_1$, let $X_2$ be a uniform random variable over $(0, x_1)$, i.e.,
$f_{X_1}(x_1) = 1, \quad 0 < x_1 < 1.$
Note that the pdf of $X_2$ given $X_1 = x_1$ is uniform over $(0, x_1)$, i.e.,
$f_{X_2 | X_1}(x_2 \mid x_1) = \frac{1}{x_1}, \quad 0 < x_2 < x_1.$
Also,
$f(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2 | X_1}(x_2 \mid x_1) = \frac{1}{x_1}, \quad 0 < x_2 < x_1 < 1.$
Hence
$f_{X_2}(x_2) = \int_{x_2}^{1} \frac{1}{x_1}\, dx_1 = -\ln x_2, \quad 0 < x_2 < 1.$
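Example 5.0.36 can be sanity-checked by simulation: sample $X_1$ uniformly on $(0,1)$, then sample $X_2$ uniformly on $(0, X_1)$, and compare the empirical distribution of $X_2$ with the cdf implied by the derived marginal $f_{X_2}(y) = -\ln y$, namely $F_{X_2}(t) = \int_0^t (-\ln y)\, dy = t - t \ln t$. A minimal Monte Carlo sketch (sample size and seed are arbitrary choices):

```python
import math
import random

# Simulate the two-stage experiment of Example 5.0.36:
# X1 ~ Uniform(0,1), and given X1 = x1, X2 ~ Uniform(0, x1).
random.seed(0)
N = 200_000
samples = []
for _ in range(N):
    x1 = random.random()
    x2 = random.uniform(0.0, x1)   # conditional density 1/x1 on (0, x1)
    samples.append(x2)

# Marginal cdf from f_{X2}(y) = -ln(y) on (0,1): F_{X2}(t) = t - t*ln(t).
t = 0.5
empirical = sum(1 for x2 in samples if x2 <= t) / N
exact = t - t * math.log(t)
print(abs(empirical - exact) < 0.01)   # True
```

The empirical cdf matches $t - t \ln t$ up to Monte Carlo error, confirming the marginal density obtained by integrating out $x_1$.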