Conditional distributions (discrete case)

The basic idea behind conditional distributions is simple: Suppose $(X, Y)$ is a jointly-distributed random vector with a discrete joint distribution. Then we can think of $P\{X = x \mid Y = y\}$, for every fixed $y$, as a distribution of probabilities indexed by the variable $x$.

Example 1. Recall the following joint distribution from your text:

                       possible values for X
                         1      2      3
    possible      3     1/6    1/6     0
    values        2     1/6     0     1/6
    for Y         1      0     1/6    1/6

Now suppose we know that $Y = 3$. Then the conditional distribution of $X$, given that we know $Y = 3$, is given by the following:

$$P\{X = 1 \mid Y = 3\} = \frac{P\{X = 1, Y = 3\}}{P\{Y = 3\}} = \frac{1/6}{2/6} = \frac{1}{2};$$

similarly, $P\{X = 2 \mid Y = 3\} = 1/2$, and $P\{X = x \mid Y = 3\} = 0$ for all other values of $x$. Note that, in this example, $\sum_x P\{X = x \mid Y = 3\} = 1$. Therefore, the conditional distribution of $X$ given that $Y = 3$ is indeed a bona fide collection of probabilities. As such, it has an expectation, variance, etc., as well. For instance,

$$E(X \mid Y = 3) = \sum_x x \, P\{X = x \mid Y = 3\} = 1 \cdot \frac{1}{2} + 2 \cdot \frac{1}{2} = \frac{3}{2}.$$

That is, if $Y = 3$, then our best predictor of $X$ is $3/2$. Whereas $EX = 2$, which means that our best predictor of $X$, in light of no additional information, is $2$.
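To keep the arithmetic honest, here is a minimal Python sketch (my own illustration, not part of the original notes) that recovers the conditional distribution of $X$ given $Y = 3$, with its mean and variance, directly from the joint table above:

```python
from fractions import Fraction as F

# Joint pmf from the table above: joint[(x, y)] = P{X = x, Y = y}.
joint = {
    (1, 3): F(1, 6), (2, 3): F(1, 6), (3, 3): F(0),
    (1, 2): F(1, 6), (2, 2): F(0),    (3, 2): F(1, 6),
    (1, 1): F(0),    (2, 1): F(1, 6), (3, 1): F(1, 6),
}

def conditional_pmf_of_X(y):
    """Return {x: P{X = x | Y = y}} via P{X = x, Y = y} / P{Y = y}."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)   # marginal P{Y = y}
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

cond = conditional_pmf_of_X(3)
mean = sum(x * p for x, p in cond.items())
var = sum(x**2 * p for x, p in cond.items()) - mean**2
print(mean, var)   # prints: 3/2 1/4
```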

You should check that, similarly, $E(X \mid Y = 2) = 2$ and $E(X \mid Y = 1) = 5/2$. Also,

$$E(X^2 \mid Y = 3) = 1^2 \cdot \frac{1}{2} + 2^2 \cdot \frac{1}{2} = \frac{5}{2},$$

whence

$$\mathrm{Var}(X \mid Y = 3) = E(X^2 \mid Y = 3) - \left[E(X \mid Y = 3)\right]^2 = \frac{5}{2} - \frac{9}{4} = \frac{1}{4}.$$

You should compare this with the unconditional variance $\mathrm{Var}\,X = 2/3$ (check the details).

Fix a number $y$. In general, the conditional distribution of $X$ given that $Y = y$ is given by the following [a function of $x$]:

$$P\{X = x \mid Y = y\} = \frac{P\{X = x, Y = y\}}{P\{Y = y\}}, \qquad \text{provided that } P\{Y = y\} > 0.$$

It follows easily from this that: (i) $P\{X = x \mid Y = y\} \ge 0$; and (ii) $\sum_x P\{X = x \mid Y = y\} = 1$. This shows that we really are studying a [here, conditional] probability distribution. As such,

$$E\big(g(X) \mid Y = y\big) = \sum_x g(x) \, P\{X = x \mid Y = y\},$$

as long as the sum converges absolutely [or when $g(x) \ge 0$ for all $x$].

Example 2. Choose and fix $0 < p < 1$ and two integers $n, m \ge 1$. Let $X$ and $Y$ be independent random variables; we suppose that $X$ has a binomial distribution with parameters $n$ and $p$, and $Y$ has a binomial distribution with parameters $m$ and $p$. Because $X + Y$ can be thought of as the total number of successes in $n + m$ tosses of independent $p$-coins, it follows that $X + Y$ has a binomial distribution with parameters $n + m$ and $p$. Our present goal is to find the conditional distribution of $X$, given that $X + Y = z$, for a fixed integer $0 \le z \le n + m$. We have

$$P\{X = x \mid X + Y = z\} = \frac{P\{X = x, X + Y = z\}}{P\{X + Y = z\}} = \frac{P\{X = x\} \, P\{Y = z - x\}}{P\{X + Y = z\}},$$

by independence. The numerator is zero unless $0 \le x \le n$ and $0 \le z - x \le m$. But in that case,

$$P\{X = x \mid X + Y = z\} = \frac{\binom{n}{x} p^x (1-p)^{n-x} \cdot \binom{m}{z-x} p^{z-x} (1-p)^{m-(z-x)}}{\binom{n+m}{z} p^z (1-p)^{n+m-z}} = \frac{\binom{n}{x} \binom{m}{z-x}}{\binom{n+m}{z}}.$$

That is, given that $X + Y = z$, the conditional distribution of $X$ is a hypergeometric distribution. We can now read off the mean and the variance from facts we know about hypergeometrics; namely,

$$E(X \mid X + Y = z) = \frac{zn}{n+m} \qquad \text{and} \qquad \mathrm{Var}(X \mid X + Y = z) = z \cdot \frac{n}{n+m} \cdot \frac{m}{n+m} \cdot \frac{n+m-z}{n+m-1}.$$

(Check the details! Some care is needed: in the earlier examples on hypergeometrics in these notes, the notation was slightly different than it is here; the parameter called $B$ there is now $n$, $N$ there is now $n + m$, etc.)
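As a quick numerical sanity check on the cancellation of $p$, here is a short sketch (my own, with arbitrary test values) comparing the conditional probabilities computed from the two binomial mass functions against the hypergeometric formula:

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial(n, p) probability of exactly k successes."""
    return comb(n, k) * p**k * (1 - p)**(n - k) if 0 <= k <= n else 0.0

n, m, z, p = 5, 7, 4, 0.3                # arbitrary test values
denom = binom_pmf(z, n + m, p)           # P{X + Y = z}; X + Y ~ Binomial(n + m, p)
for x in range(z + 1):
    cond = binom_pmf(x, n, p) * binom_pmf(z - x, m, p) / denom
    hyper = comb(n, x) * comb(m, z - x) / comb(n + m, z)
    assert abs(cond - hyper) < 1e-12     # p has cancelled: hypergeometric, as claimed
```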

Example 3. Suppose $X_1$ and $X_2$ are independent, distributed respectively as Poisson($\lambda_1$) and Poisson($\lambda_2$), where $\lambda_1, \lambda_2 > 0$ are fixed constants. What is the distribution of $X_1$, given that $X_1 + X_2 = n$, for a positive [fixed] integer $n$?

We will need the distribution of $X_1 + X_2$; therefore, let us begin with that. The possible values of $X_1 + X_2$ are $0, 1, 2, \dots$; therefore the convolution formula for discrete distributions implies that for all $n \ge 0$,

$$P\{X_1 + X_2 = n\} = \sum_{k=0}^{n} P\{X_1 = k\} \, P\{X_2 = n-k\} = \sum_{k=0}^{n} \frac{e^{-\lambda_1} \lambda_1^k}{k!} \cdot \frac{e^{-\lambda_2} \lambda_2^{n-k}}{(n-k)!} = \frac{e^{-(\lambda_1+\lambda_2)}}{n!} \sum_{k=0}^{n} \binom{n}{k} \lambda_1^k \lambda_2^{n-k} = \frac{e^{-(\lambda_1+\lambda_2)} (\lambda_1+\lambda_2)^n}{n!} \quad \text{[binomial theorem]},$$

and $P\{X_1 + X_2 = n\} = 0$ for other values of $n$. In other words, $X_1 + X_2$ is distributed as Poisson($\lambda_1 + \lambda_2$).

Aside: This and induction together prove that if $X_1, \dots, X_k$ are independent and distributed respectively as Poisson($\lambda_1$), ..., Poisson($\lambda_k$), then $X_1 + \cdots + X_k$ is distributed according to a Poisson distribution with parameter $\lambda_1 + \cdots + \lambda_k$.

Now, for $0 \le k \le n$,

$$P\{X_1 = k \mid X_1 + X_2 = n\} = \frac{P\{X_1 = k\} \, P\{X_2 = n-k\}}{P\{X_1 + X_2 = n\}} = \frac{\dfrac{e^{-\lambda_1} \lambda_1^k}{k!} \cdot \dfrac{e^{-\lambda_2} \lambda_2^{n-k}}{(n-k)!}}{\dfrac{e^{-(\lambda_1+\lambda_2)} (\lambda_1+\lambda_2)^n}{n!}} = \binom{n}{k} p^k (1-p)^{n-k},$$

where

$$p := \frac{\lambda_1}{\lambda_1 + \lambda_2} \qquad \left[\text{so that } 1 - p = \frac{\lambda_2}{\lambda_1 + \lambda_2}\right].$$

That is, the conditional distribution of $X_1$, given that $X_1 + X_2 = n$, is Binomial($n, \lambda_1/(\lambda_1 + \lambda_2)$).
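The same kind of check works here (again my own sketch, with arbitrary $\lambda$ values): the conditional probabilities built from the two Poisson mass functions should match the Binomial($n, \lambda_1/(\lambda_1+\lambda_2)$) mass function, up to rounding:

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    """Poisson(lam) probability of the value k."""
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2, n = 2.5, 4.0, 6              # arbitrary test values
p = lam1 / (lam1 + lam2)                 # claimed binomial success probability
denom = poisson_pmf(n, lam1 + lam2)      # P{X1 + X2 = n}: a Poisson(lam1 + lam2) mass
for k in range(n + 1):
    cond = poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2) / denom
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(cond - binom) < 1e-12
```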

Continuous conditional distributions

Once we understand conditional distributions in the discrete setting, we can predict how the theory should work in the continuous setting [although it is quite difficult to justify that what is about to be discussed is legitimate]. Suppose $(X, Y)$ is jointly distributed with joint density $f(x, y)$. Then we define the conditional density of $X$ given $Y = y$ [assuming that $f_Y(y) > 0$] as

$$f_{X \mid Y}(x \mid y) := \frac{f(x, y)}{f_Y(y)} \qquad \text{for all } -\infty < x < \infty.$$

This leads to conditional expectations:

$$E\big(g(X) \mid Y = y\big) = \int_{-\infty}^{\infty} g(x) \, f_{X \mid Y}(x \mid y) \, dx,$$

etc. As such, we now have $E(X \mid Y = y)$, $\mathrm{Var}(X \mid Y = y)$, etc., defined in the usual way using [now conditional] densities.

Example 4 (Uniform on a triangle, p. 44 of your text). Suppose $(X, Y)$ is chosen uniformly at random from the triangle $\{(x, y) : 0 \le y \le x \le 1\}$. Find the conditional density of $Y$, given that $X = x$ [for a fixed $0 < x \le 1$].

We know that $f(x, y) = 2$ if $(x, y)$ is in the triangle [whose area is $1/2$], and $f(x, y) = 0$ otherwise. Therefore, for all fixed $0 < x \le 1$,

$$f_X(x) = \int_0^x 2 \, dy = 2x,$$

and $f_X(x) = 0$ for other values of $x$. This yields the following: for all $0 \le y \le x$,

$$f_{Y \mid X}(y \mid x) = \frac{f(x, y)}{f_X(x)} = \frac{2}{2x} = \frac{1}{x},$$

and $f_{Y \mid X}(y \mid x) = 0$ for all other values of $y$. In other words, given that $X = x$, the conditional distribution of $Y$ is Uniform($0, x$). Thus, for instance, $P\{Y > a \mid X = x\} = 0$ if $a \ge x$, and $P\{Y > a \mid X = x\} = (x - a)/x$ for $0 \le a < x$. Alternatively, we can work things out the longer way, by hand:

$$P\{Y > a \mid X = x\} = \int_a^{\infty} f_{Y \mid X}(y \mid x) \, dy.$$

Now the integrand is zero if $y > x$. Therefore, when $a \ge x$ the preceding probability is zero. When $0 \le a < x$, we have

$$P\{Y > a \mid X = x\} = \int_a^x \frac{1}{x} \, dy = \frac{x - a}{x}.$$

As another example within this one, let us compute $E(Y \mid X = x)$:

$$E(Y \mid X = x) = \int_0^x y \cdot \frac{1}{x} \, dy = \frac{x}{2}.$$

[Alternatively, we can read this off from facts that we know about uniforms; for instance, we should be able to tell without computation that $\mathrm{Var}(Y \mid X = x) = x^2/12$. Check this by direct computation also.]
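To close, a small Monte Carlo illustration (my own sketch; the constants are arbitrary). If $U$ and $V$ are independent Uniform($0,1$), then $(\max(U,V), \min(U,V))$ is uniform on the triangle above, so we can sample the triangle, keep the pairs whose first coordinate is close to a fixed $x_0$, and compare the empirical behavior of $Y$ with the Uniform($0, x_0$) predictions:

```python
import random
import statistics

random.seed(0)
x0, eps = 0.8, 0.005                     # condition on X being within eps of x0
ys = []
for _ in range(2_000_000):
    u, v = random.random(), random.random()
    x, y = max(u, v), min(u, v)          # (x, y) is uniform on {0 <= y <= x <= 1}
    if abs(x - x0) < eps:
        ys.append(y)

print(statistics.fmean(ys), x0 / 2)              # ~0.400  vs  E(Y | X = x0) = x0/2
print(statistics.variance(ys), x0**2 / 12)       # ~0.0533 vs  Var(Y | X = x0) = x0^2/12
print(sum(y > 0.5 for y in ys) / len(ys), (x0 - 0.5) / x0)  # vs P{Y > 1/2 | X = x0}
```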