STAT 430/510: Lecture 15
James Piette
June 23, 2010

Updates

HW4 is up on my website. It is due next Monday (June 28th). Starting today, we are back at Section 6.4...

Conditional Distribution: Discrete

Def: The conditional pmf of X given Y = y is defined as
$$p_{X|Y}(x|y) = P(X = x \mid Y = y) = \frac{p(x, y)}{p_Y(y)}$$
for all values of y such that $p_Y(y) > 0$. The conditional cdf of X given that Y = y is defined similarly:
$$F_{X|Y}(x|y) = P(X \le x \mid Y = y) = \sum_{a \le x} p_{X|Y}(a|y)$$
Also, if X and Y are independent, then the conditional pmf of X given Y = y is equal to the unconditional pmf:
$$p_{X|Y}(x|y) = \frac{P(X = x)\,P(Y = y)}{P(Y = y)} = P(X = x)$$
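As a minimal sketch of how this definition plays out numerically (not from the lecture; the joint pmf table below is hypothetical), dividing each column of a joint pmf table by the corresponding marginal of Y produces the conditional pmfs:

```python
# Minimal sketch (not from the lecture): computing p_{X|Y}(x|y) = p(x,y)/p_Y(y)
# from a small, hypothetical joint pmf table.
import numpy as np

# Hypothetical joint pmf p(x, y) for x in {0,1,2}, y in {0,1}; rows = x, cols = y.
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.05, 0.25]])

p_Y = p.sum(axis=0)      # marginal pmf of Y: sum over x
cond = p / p_Y           # column j holds p_{X|Y}(x | y = j)

print(cond[:, 0])        # conditional pmf of X given Y = 0
print(cond[:, 0].sum())  # sanity check: sums to 1.0
```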

Example 1

Question: If X and Y are independent Poisson r.v.'s with respective parameters $\lambda_1$ and $\lambda_2$, then what is the conditional distribution of X given that $X + Y = n$?

Solution: Go through motions similar to those we used when calculating conditional probabilities of events:
$$P(X = k \mid X + Y = n) = \frac{P(X = k, X + Y = n)}{P(X + Y = n)} = \frac{P(X = k, Y = n - k)}{P(X + Y = n)} = \frac{P(X = k)\,P(Y = n - k)}{P(X + Y = n)}$$
Recalling from the previous lecture, we know that X + Y is distributed... Poisson with parameter $\lambda_1 + \lambda_2$.

Example 1 (cont.)

Using that fact, we can proceed:
$$\frac{P(X = k)\,P(Y = n - k)}{P(X + Y = n)} = \frac{e^{-\lambda_1}\lambda_1^k}{k!}\,\frac{e^{-\lambda_2}\lambda_2^{n-k}}{(n-k)!}\left[\frac{e^{-(\lambda_1+\lambda_2)}(\lambda_1+\lambda_2)^n}{n!}\right]^{-1}$$
$$= \frac{n!}{(n-k)!\,k!}\,\frac{\lambda_1^k \lambda_2^{n-k}}{(\lambda_1+\lambda_2)^n} = \binom{n}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^k \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}$$
Recognize this as anything? Binomial with n trials and success probability $\frac{\lambda_1}{\lambda_1+\lambda_2}$.
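A quick Monte Carlo check of this result (a sketch; the values of lam1, lam2, and n are arbitrary choices, not from the lecture): conditioning on the event X + Y = n, the retained values of X should match the Binomial(n, lam1/(lam1+lam2)) pmf.

```python
# Monte Carlo check of Example 1: X | X + Y = n should be
# Binomial(n, lam1 / (lam1 + lam2)) when X, Y are independent Poissons.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam1, lam2, n = 2.0, 3.0, 5          # arbitrary choices for illustration

X = rng.poisson(lam1, size=1_000_000)
Y = rng.poisson(lam2, size=1_000_000)
keep = X[X + Y == n]                 # condition on the event X + Y = n

for k in range(n + 1):
    empirical = np.mean(keep == k)
    theory = stats.binom.pmf(k, n, lam1 / (lam1 + lam2))
    print(k, round(empirical, 4), round(theory, 4))
```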

Formalization

Def: Let X and Y be jointly continuous r.v.'s. Then, for any value of y for which $f_Y(y) > 0$, the conditional pdf of X given Y = y is
$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)}$$
For any set A,
$$P(X \in A \mid Y = y) = \int_A f_{X|Y}(x|y)\,dx$$
By letting $A = (-\infty, a]$, we define the conditional cdf of X given Y = y by
$$F_{X|Y}(a|y) = P(X \le a \mid Y = y) = \int_{-\infty}^{a} f_{X|Y}(x|y)\,dx$$

Example 2

The joint density of X and Y is given by
$$f(x, y) = \begin{cases} \frac{12}{5}\,x(2 - x - y) & 0 < x < 1,\ 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}$$
Question: What is the conditional density of X given that Y = y, where 0 < y < 1?

Example 2 (cont.)

For 0 < x < 1, 0 < y < 1, we have
$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)} = \frac{f(x, y)}{\int_{-\infty}^{\infty} f(x, y)\,dx} = \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\,dx} = \frac{x(2 - x - y)}{\frac{2}{3} - \frac{y}{2}} = \frac{6x(2 - x - y)}{4 - 3y}$$
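As a numerical sanity check (a sketch, with a few arbitrary y values), the conditional density above should integrate to 1 over 0 < x < 1 for every fixed y:

```python
# Numerical check of Example 2: 6x(2 - x - y) / (4 - 3y) should integrate
# to 1 over 0 < x < 1 for each fixed y in (0, 1).
from scipy.integrate import quad

def cond_density(x, y):
    return 6 * x * (2 - x - y) / (4 - 3 * y)

for y in (0.1, 0.5, 0.9):
    total, _ = quad(cond_density, 0, 1, args=(y,))
    print(y, total)   # each total should be ~1.0
```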

Example 3

Suppose that the joint density of X and Y is given by
$$f(x, y) = \begin{cases} \dfrac{e^{-x/y}\,e^{-y}}{y} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}$$
Question: What is $P(X > 1 \mid Y = y)$?

Solution: First, we need to obtain the conditional density of X given that Y = y:
$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)} = \frac{e^{-x/y}\,e^{-y}/y}{e^{-y}\int_0^\infty (1/y)\,e^{-x/y}\,dx} = \frac{1}{y}\,e^{-x/y}$$

Example 3 (cont.)

With this, we can find $P(X > 1 \mid Y = y)$:
$$P(X > 1 \mid Y = y) = \int_1^\infty \frac{1}{y}\,e^{-x/y}\,dx = \left[-e^{-x/y}\right]_1^\infty = e^{-1/y}$$
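A quick numeric confirmation (a sketch, with a few arbitrary y values): integrating the conditional density from 1 to infinity should reproduce $e^{-1/y}$.

```python
# Numerical check of Example 3: integrate (1/y) * exp(-x/y) over x in
# (1, inf) and compare against the closed form exp(-1/y).
import numpy as np
from scipy.integrate import quad

for y in (0.5, 1.0, 2.0):
    tail, _ = quad(lambda x: (1 / y) * np.exp(-x / y), 1, np.inf)
    print(y, tail, np.exp(-1 / y))   # the two numbers should agree
```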

One Discrete and One Continuous

Suppose that X is a continuous r.v. with pdf f and N is a discrete r.v. Then, the conditional distribution of X given that N = n is
$$f_{X|N}(x|n) = \frac{P(N = n \mid X = x)\,f(x)}{P(N = n)}$$

Example 4

Consider n + m trials having a common probability of success. Suppose, however, that this success probability is not fixed in advance, but is chosen from a uniform(0,1) distribution.

Question: What is the conditional distribution of the success probability given that the n + m trials result in n successes?

Solution: Let X denote the probability that a given trial is a success. Then X is distributed uniform(0,1). Given that X = x, the n + m trials are independent with common probability of success x, so N, the number of successes, is a binomial r.v. with parameters (n + m, x).

Example 4 (cont.)

The conditional density of X given that N = n is
$$f_{X|N}(x|n) = \frac{P(N = n \mid X = x)\,f_X(x)}{P(N = n)} = \frac{\binom{n+m}{n}\,x^n (1 - x)^m}{P(N = n)} = c\,x^n (1 - x)^m, \quad 0 < x < 1$$
The conditional density is that of a beta r.v. with parameters (n + 1, m + 1).
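A simulation check of this result (a sketch; n and m are arbitrary choices, not from the lecture): draw X uniformly, draw N binomially given X, keep only the runs with N = n, and compare the surviving X's against the Beta(n+1, m+1) distribution.

```python
# Simulation check of Example 4: conditional on N = n successes, the
# success probability X should follow a Beta(n+1, m+1) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, m = 3, 2                          # arbitrary choices for illustration

X = rng.uniform(size=1_000_000)
N = rng.binomial(n + m, X)
posterior = X[N == n]                # keep runs with exactly n successes

# Compare a few empirical quantiles to the Beta(n+1, m+1) quantiles.
qs = [0.25, 0.5, 0.75]
print(np.quantile(posterior, qs))
print(stats.beta.ppf(qs, n + 1, m + 1))
```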

Motivation

Let $X_1$ and $X_2$ be jointly continuous r.v.'s with joint pdf $f_{X_1,X_2}$. Suppose there are two r.v.'s, $Y_1$ and $Y_2$, such that $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ for some functions $g_1$ and $g_2$. There are two assumptions:

1. The equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions given by $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.

2. The functions $g_1$ and $g_2$ have continuous partial derivatives at all points $(x_1, x_2)$ and are such that the $2 \times 2$ determinant
$$J(x_1, x_2) = \begin{vmatrix} \dfrac{\partial g_1}{\partial x_1} & \dfrac{\partial g_1}{\partial x_2} \\[4pt] \dfrac{\partial g_2}{\partial x_1} & \dfrac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0$$

Formalization

Under the previous two assumptions, the r.v.'s $Y_1$ and $Y_2$ are jointly continuous with joint density given by
$$f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2)\,|J(x_1, x_2)|^{-1}$$
where $x_1 = h_1(y_1, y_2)$, $x_2 = h_2(y_1, y_2)$.

Example 5

Let $X_1$ and $X_2$ be jointly continuous r.v.'s with pdf $f_{X_1X_2}$. Let $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$.

Question: What is the joint pdf of $Y_1$ and $Y_2$ in terms of $f_{X_1X_2}$?

Solution: Let $g_1(x_1, x_2) = x_1 + x_2$ and $g_2(x_1, x_2) = x_1 - x_2$. Then, we need to verify assumption 2:
$$J(x_1, x_2) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2$$

Example 5 (cont.)

And verify assumption 1:
$$x_1 = \frac{y_1 + y_2}{2}, \qquad x_2 = \frac{y_1 - y_2}{2}$$
Thus, the desired density is:
$$f_{Y_1Y_2}(y_1, y_2) = \frac{1}{2}\,f_{X_1X_2}\!\left(\frac{y_1 + y_2}{2}, \frac{y_1 - y_2}{2}\right)$$
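One way to sanity-check this formula in a concrete case (a sketch; the choice of standard normal inputs is an assumption, not from the lecture): if $X_1, X_2$ are independent standard normals, then $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$ are known to be independent N(0, 2) variables, so the transformation formula should reproduce the product of two N(0, 2) densities.

```python
# Checking Example 5 with independent standard normal X1, X2: the
# transformation formula (1/2) f_{X1,X2}((y1+y2)/2, (y1-y2)/2) should
# equal the product of two N(0, 2) densities evaluated at (y1, y2).
import numpy as np
from scipy import stats

def f_X1X2(x1, x2):
    return stats.norm.pdf(x1) * stats.norm.pdf(x2)

def f_Y1Y2(y1, y2):
    return 0.5 * f_X1X2((y1 + y2) / 2, (y1 - y2) / 2)

for y1, y2 in [(0.0, 0.0), (1.0, -0.5), (2.0, 1.0)]:
    lhs = f_Y1Y2(y1, y2)
    rhs = stats.norm.pdf(y1, scale=np.sqrt(2)) * stats.norm.pdf(y2, scale=np.sqrt(2))
    print(lhs, rhs)   # the two columns should agree
```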

Let's cover self-test problems 6.13 and 6.14. We've now covered up to Section 6.5 and started 6.7 (skipping 6.6). Remember, HW4 is posted on my website and is due Monday, June 28th.