STAT 430/510 Probability


STAT 430/510 Probability
Hui Nie
Lecture 16
June 24th, 2009

Review
- Sum of Independent Normal Random Variables
- Sum of Independent Poisson Random Variables
- Sum of Independent Binomial Random Variables
- Conditional Distributions: Discrete Case

Conditional Distributions: Continuous Case
Let X and Y be jointly continuous r.v.'s. Then for any x value for which f_X(x) > 0, the conditional pdf of Y given X = x is

    f_{Y|X}(y|x) = f(x, y) / f_X(x),    -infinity < y < infinity

For any set A,

    P(Y in A | X = x) = ∫_A f_{Y|X}(y|x) dy = ∫_A [f(x, y) / f_X(x)] dy

If X and Y are independent, then

    f_{Y|X}(y|x) = f(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)
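A quick numerical sanity check of this definition, using a hypothetical joint density not from the lecture, f(x, y) = x + y on the unit square: for any fixed x with f_X(x) > 0, the conditional pdf f_{Y|X}(y|x) should integrate to 1 over y.

```python
# Sanity check that a conditional pdf integrates to 1 in y.
# Hypothetical joint density for illustration (not from the lecture):
# f(x, y) = x + y on the unit square, 0 elsewhere.

def f_joint(x, y):
    return x + y if 0 < x < 1 and 0 < y < 1 else 0.0

n = 10_000
h = 1.0 / n
x = 0.3

# Marginal f_X(x) = integral of f(x, y) over y (midpoint Riemann sum).
# Analytically f_X(x) = x + 1/2, so fx should be about 0.8 here.
fx = sum(f_joint(x, (j + 0.5) * h) for j in range(n)) * h

# Conditional pdf f_{Y|X}(y|x) = f(x, y) / f_X(x); integrate it over y.
total = sum(f_joint(x, (j + 0.5) * h) / fx * h for j in range(n))
print(round(fx, 4), round(total, 4))  # 0.8 1.0
```

The midpoint rule is exact for a density that is linear in y, so the check comes out essentially exact; for a general density it would hold up to discretization error.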

Example
The joint density of X and Y is given by

    f(x, y) = (12/5) x (2 - x - y),   0 < x < 1, 0 < y < 1
            = 0,                      otherwise

Compute the conditional density of X given that Y = y, where 0 < y < 1.

    f_{X|Y}(x|y) = f(x, y) / f_Y(y)
                 = x(2 - x - y) / ∫_0^1 x(2 - x - y) dx
                 = 6x(2 - x - y) / (4 - 3y)
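As a sketch of how this worked example can be verified numerically: for each fixed 0 < y < 1, the conditional density 6x(2 - x - y)/(4 - 3y) should integrate to 1 over 0 < x < 1.

```python
# Verify the worked example: f_{X|Y}(x|y) = 6x(2 - x - y)/(4 - 3y)
# should integrate to 1 over 0 < x < 1 for each fixed 0 < y < 1.

def f_cond(x, y):
    return 6 * x * (2 - x - y) / (4 - 3 * y)

def integral_over_x(y, n=100_000):
    # Midpoint Riemann sum over 0 < x < 1.
    h = 1.0 / n
    return sum(f_cond((j + 0.5) * h, y) * h for j in range(n))

results = [integral_over_x(y) for y in (0.1, 0.5, 0.9)]
print([round(r, 4) for r in results])  # [1.0, 1.0, 1.0]
```

The y values tested are arbitrary; any 0 < y < 1 works, since the normalizing constant 4 - 3y was computed exactly as 6 times the integral of x(2 - x - y).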

One Discrete R.V. and One Continuous R.V.
Suppose that X is a continuous random variable having probability density function f and N is a discrete random variable. Then the conditional distribution of X given that N = n is

    f_{X|N}(x|n) = [P(N = n | X = x) / P(N = n)] f(x)

Example Consider n + m trials having a common probability of success. Suppose, however, that this success probability is not fixed in advance but is chosen from a uniform (0,1) population. What is the conditional distribution of the success probability given that the n + m trials result in n successes?

Example: Solution
Let X denote the probability that a given trial is a success, and let N be the number of successes.

    f_{X|N}(x|n) = P(N = n | X = x) f_X(x) / P(N = n)
                 = C(n+m, n) x^n (1 - x)^m / P(N = n)
                 ∝ x^n (1 - x)^m

The conditional density is a beta distribution with parameters (n + 1, m + 1).
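A minimal Monte Carlo sketch of this result (the values n = 3, m = 2 are arbitrary choices for illustration): draw the success probability uniformly, keep it only when exactly n of the n + m trials succeed, and compare the conditional mean with the Beta(n+1, m+1) mean (n+1)/(n+m+2).

```python
import random

# Monte Carlo check of the Beta(n+1, m+1) result.
# n = 3, m = 2 are illustrative values, not from the lecture.
random.seed(0)
n, m = 3, 2
kept = []
for _ in range(200_000):
    p = random.random()                                  # p ~ Uniform(0, 1)
    successes = sum(random.random() < p for _ in range(n + m))
    if successes == n:                                   # condition on N = n
        kept.append(p)

empirical_mean = sum(kept) / len(kept)
beta_mean = (n + 1) / (n + m + 2)   # mean of Beta(n+1, m+1) = 4/7
print(round(beta_mean, 3))          # 0.571; empirical_mean should be close
```

With 200,000 draws roughly a sixth are retained (P(N = n) = 1/(n+m+1) under a uniform prior), which is plenty for the means to agree to about two decimal places.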

Joint Probability Distribution of Functions of R.V.'s
X_1 and X_2 are jointly continuous random variables with joint probability density function f_{X1,X2}. We want to find the joint distribution of Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_1, X_2).

Two Conditions

Condition 1. The equations y_1 = g_1(x_1, x_2) and y_2 = g_2(x_1, x_2) can be uniquely solved for x_1 and x_2 in terms of y_1 and y_2, with solutions given by x_1 = h_1(y_1, y_2) and x_2 = h_2(y_1, y_2).

Condition 2. The functions g_1 and g_2 have continuous partial derivatives at all points (x_1, x_2) and are such that the 2 x 2 determinant

    J(x_1, x_2) = | ∂g_1/∂x_1  ∂g_1/∂x_2 |
                  | ∂g_2/∂x_1  ∂g_2/∂x_2 |  ≠ 0

at all points (x_1, x_2).

Change of R.V.'s
Under the two conditions above, the random variables Y_1 and Y_2 are jointly continuous with joint density function given by

    f_{Y1,Y2}(y_1, y_2) = f_{X1,X2}(x_1, x_2) |J(x_1, x_2)|^{-1}

where x_1 = h_1(y_1, y_2), x_2 = h_2(y_1, y_2).

Example
Let X_1 and X_2 be jointly continuous random variables with probability density function f_{X1,X2}. Let Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2. Find the joint density function of Y_1 and Y_2 in terms of f_{X1,X2}.

Example: Solution
Let g_1(x_1, x_2) = x_1 + x_2 and g_2(x_1, x_2) = x_1 - x_2. Then

    J(x_1, x_2) = | 1   1 |
                  | 1  -1 |  = -2,   so |J(x_1, x_2)| = 2

The equations y_1 = x_1 + x_2 and y_2 = x_1 - x_2 have x_1 = (y_1 + y_2)/2 and x_2 = (y_1 - y_2)/2 as their solutions, so

    f_{Y1,Y2}(y_1, y_2) = (1/2) f_{X1,X2}((y_1 + y_2)/2, (y_1 - y_2)/2)
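A Monte Carlo sketch of this change of variables, assuming for illustration that X_1 and X_2 are independent standard normals (the lecture leaves f_{X1,X2} general): then Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2 should each have variance 2 and covariance 0, as the density formula predicts.

```python
import random

# Illustrative special case: X1, X2 independent standard normals.
# Then Y1 = X1 + X2 and Y2 = X1 - X2 each have variance 2, covariance 0.
random.seed(1)
N = 200_000
y1, y2 = [], []
for _ in range(N):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    y1.append(x1 + x2)
    y2.append(x1 - x2)

def var(v):
    mu = sum(v) / len(v)
    return sum((t - mu) ** 2 for t in v) / len(v)

cov = sum(a * b for a, b in zip(y1, y2)) / N
print(round(var(y1), 1), round(var(y2), 1), round(cov, 1))
```

For jointly normal variables zero covariance implies independence, so in this special case the joint density of (Y_1, Y_2) factors, consistent with plugging the bivariate standard normal into the formula above.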

Generalization to n Dimensions
Two conditions:
- y_1 = g_1(x_1, ..., x_n), ..., y_n = g_n(x_1, ..., x_n) have a unique solution x_1 = h_1(y_1, ..., y_n), ..., x_n = h_n(y_1, ..., y_n)
- J(x_1, ..., x_n) ≠ 0

Conclusion:

    f_{Y1,...,Yn}(y_1, ..., y_n) = f_{X1,...,Xn}(x_1, ..., x_n) |J(x_1, ..., x_n)|^{-1}

where x_i = h_i(y_1, ..., y_n), i = 1, ..., n.

Example
Let X_1, X_2 and X_3 be independent standard normal random variables. If Y_1 = X_1 + X_2 + X_3, Y_2 = X_1 - X_2 and Y_3 = X_1 - X_3, compute the joint density function of Y_1, Y_2, Y_3.

Example: Solution

    J = | 1   1   1 |
        | 1  -1   0 |
        | 1   0  -1 |  = 3

Solving gives X_1 = (Y_1 + Y_2 + Y_3)/3, X_2 = (Y_1 - 2Y_2 + Y_3)/3, X_3 = (Y_1 + Y_2 - 2Y_3)/3, so

    f_{Y1,Y2,Y3}(y_1, y_2, y_3)
        = (1/3)(2π)^{-3/2} exp{-[((y_1+y_2+y_3)/3)^2 + ((y_1-2y_2+y_3)/3)^2 + ((y_1+y_2-2y_3)/3)^2]/2}
        = (1/3)(2π)^{-3/2} exp{-(y_1^2/3 + 2y_2^2/3 + 2y_3^2/3 - 2y_2y_3/3)/2}
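The two mechanical steps of this solution, the Jacobian determinant and the inverse mapping, can be checked with a short script (the test point (x_1, x_2, x_3) is chosen arbitrarily):

```python
# Check the worked example: the Jacobian determinant of
# (y1, y2, y3) = (x1 + x2 + x3, x1 - x2, x1 - x3) and the inverse mapping.

def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

J = det3([[1, 1, 1], [1, -1, 0], [1, 0, -1]])
print(J)  # 3

# Inverse mapping claimed in the solution:
# x1 = (y1 + y2 + y3)/3, x2 = (y1 - 2*y2 + y3)/3, x3 = (y1 + y2 - 2*y3)/3
x1, x2, x3 = 0.7, -0.2, 1.5          # arbitrary test point
y1, y2, y3 = x1 + x2 + x3, x1 - x2, x1 - x3
assert abs((y1 + y2 + y3) / 3 - x1) < 1e-12
assert abs((y1 - 2 * y2 + y3) / 3 - x2) < 1e-12
assert abs((y1 + y2 - 2 * y3) / 3 - x3) < 1e-12
print("inverse mapping checks out")
```

Since the forward map is linear, one test point plus the determinant suffices to confirm both ingredients of the density formula.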