
sheng@mail.ncyu.edu.tw

Content: joint distribution functions; independent random variables; sums of independent random variables; conditional distributions, discrete case; conditional distributions, continuous case; order statistics; joint probability distribution of functions of random variables; exchangeable random variables.

6.1 Joint distribution functions

The joint cumulative probability distribution function of X and Y is defined by

F(a, b) = P{X ≤ a, Y ≤ b}, −∞ < a, b < ∞

The distribution of X can be obtained from the joint distribution of X and Y as

F_X(a) = P{X ≤ a} = F(a, ∞)

The distribution functions F_X and F_Y are sometimes referred to as the marginal distributions of X and Y. For discrete random variables, the joint probability mass function of X and Y is defined by

p(x, y) = P{X = x, Y = y}

Example 1a. Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 blue balls. If we let X and Y denote, respectively, the number of red and white balls chosen, then the joint probability mass function of X and Y, p(i, j) = P{X = i, Y = j}, is given by

Example 1a (continued). Writing C(n, k) for the binomial coefficient "n choose k", the C(12, 3) = 220 equally likely selections give

p(0,0) = C(5,3)/C(12,3) = 10/220
p(0,1) = C(4,1) C(5,2)/C(12,3) = 40/220
p(0,2) = C(4,2) C(5,1)/C(12,3) = 30/220
p(0,3) = C(4,3)/C(12,3) = 4/220
p(1,0) = C(3,1) C(5,2)/C(12,3) = 30/220
p(1,1) = C(3,1) C(4,1) C(5,1)/C(12,3) = 60/220
p(1,2) = C(3,1) C(4,2)/C(12,3) = 18/220
p(2,0) = C(3,2) C(5,1)/C(12,3) = 15/220
p(2,1) = C(3,2) C(4,1)/C(12,3) = 12/220
p(3,0) = C(3,3)/C(12,3) = 1/220

Table 6.1 collects these probabilities, together with their row and column sums:

            j = 0     j = 1     j = 2    j = 3    P{X = i}
i = 0      10/220    40/220    30/220    4/220     84/220
i = 1      30/220    60/220    18/220      0      108/220
i = 2      15/220    12/220       0        0       27/220
i = 3       1/220       0         0        0        1/220
P{Y = j}   56/220   112/220    48/220    4/220        1

Because the individual probability mass functions of X and Y thus appear in the margins of such a table, they are often referred to as the marginal probability mass functions of X and Y, respectively.
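These values are easy to verify by brute force. The sketch below (Python; the variable names are my own, not from the slides) enumerates all C(12, 3) = 220 equally likely draws and tallies the joint pmf:

```python
from itertools import combinations
from collections import Counter

# Urn of Example 1a: 3 red, 4 white, 5 blue balls; 3 drawn without replacement.
balls = ['R'] * 3 + ['W'] * 4 + ['B'] * 5
counts = Counter()
for draw in combinations(range(len(balls)), 3):   # all C(12,3) = 220 draws
    colors = [balls[i] for i in draw]
    counts[(colors.count('R'), colors.count('W'))] += 1

for (i, j), n in sorted(counts.items()):
    print(f"p({i},{j}) = {n}/220")                # reproduces Table 6.1
```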

Joint probability density function. We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, having the property that, for every set C of pairs of real numbers,

P{(X, Y) ∈ C} = ∬_{(x,y)∈C} f(x, y) dx dy    (1.1)

The function f(x, y) is called the joint probability density function of X and Y. If A and B are any sets of real numbers, then, by defining C = {(x, y) : x ∈ A, y ∈ B}, we obtain

P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy    (1.2)

Individual probability density functions. The probability density function of X is

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

and the probability density function of Y is

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

Example 1c. The joint density function of X and Y is given by

f(x, y) = 2e^{−x} e^{−2y} for 0 < x < ∞, 0 < y < ∞, and f(x, y) = 0 otherwise

Compute (a) P{X > 1, Y < 1}, (b) P{X < Y}, and (c) P{X < a}.

Solution 1c.
(a) P{X > 1, Y < 1} = ∫_0^1 ∫_1^∞ 2e^{−x} e^{−2y} dx dy = ∫_0^1 2e^{−2y} (−e^{−x} |_1^∞) dy = e^{−1} ∫_0^1 2e^{−2y} dy = e^{−1}(1 − e^{−2})

(b) P{X < Y} = ∬_{(x,y): x<y} 2e^{−x} e^{−2y} dx dy = ∫_0^∞ ∫_0^y 2e^{−x} e^{−2y} dx dy = ∫_0^∞ 2e^{−2y}(1 − e^{−y}) dy = ∫_0^∞ 2e^{−2y} dy − ∫_0^∞ 2e^{−3y} dy = 1 − 2/3 = 1/3

(c) P{X < a} = ∫_0^a ∫_0^∞ 2e^{−2y} e^{−x} dy dx = ∫_0^a e^{−x} dx = 1 − e^{−a}
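Because this density factors, X and Y can be simulated as independent exponentials with rates 1 and 2. A quick Monte Carlo sketch (standard library only; sample size chosen for illustration) checks answers (a) and (b):

```python
import random

# Example 1c: X ~ Exponential(1), Y ~ Exponential(2), independent,
# so f(x, y) = 2 e^{-x} e^{-2y} for x, y > 0.
N = 1_000_000
samples = [(random.expovariate(1.0), random.expovariate(2.0)) for _ in range(N)]

p_a = sum(x > 1 and y < 1 for x, y in samples) / N  # target: e^{-1}(1 - e^{-2}) ~ 0.318
p_b = sum(x < y for x, y in samples) / N            # target: 1/3
print(p_a, p_b)
```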

n random variables. The joint cumulative probability distribution function F(a_1, a_2, …, a_n) of the n random variables X_1, X_2, …, X_n is defined by

F(a_1, a_2, …, a_n) = P{X_1 ≤ a_1, X_2 ≤ a_2, …, X_n ≤ a_n}

Further, the n random variables are said to be jointly continuous if there exists a function f(x_1, x_2, …, x_n), called the joint probability density function, such that, for any set C in n-space,

P{(X_1, X_2, …, X_n) ∈ C} = ∫∫⋯∫_{(x_1,…,x_n)∈C} f(x_1, …, x_n) dx_1 dx_2 ⋯ dx_n

n random variables (continued). For any n sets of real numbers A_1, A_2, …, A_n,

P{X_1 ∈ A_1, X_2 ∈ A_2, …, X_n ∈ A_n} = ∫_{A_n} ⋯ ∫_{A_2} ∫_{A_1} f(x_1, …, x_n) dx_1 dx_2 ⋯ dx_n

6.2 Independent random variables

The random variables X and Y are said to be independent if, for any two sets of real numbers A and B,

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}    (2.1)

Hence, in terms of the joint distribution function F of X and Y, X and Y are independent if

F(a, b) = F_X(a) F_Y(b) for all a, b

When X and Y are discrete random variables, the condition of independence (2.1) is equivalent to

p(x, y) = p_X(x) p_Y(y) for all x, y

In the jointly continuous case, the condition of independence is equivalent to

f(x, y) = f_X(x) f_Y(y) for all x, y
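For a finite discrete joint pmf, the condition p(x, y) = p_X(x) p_Y(y) can be checked mechanically. A minimal sketch, using the urn pmf of Example 1a (which is not independent) as input:

```python
# Check condition (2.1) for a discrete joint pmf stored as {(x, y): p}.
joint = {(0, 0): 10/220, (0, 1): 40/220, (0, 2): 30/220, (0, 3): 4/220,
         (1, 0): 30/220, (1, 1): 60/220, (1, 2): 18/220,
         (2, 0): 15/220, (2, 1): 12/220, (3, 0): 1/220}

px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p                 # marginal pmf of X
    py[y] = py.get(y, 0) + p                 # marginal pmf of Y

independent = all(abs(joint.get((x, y), 0) - px[x] * py[y]) < 1e-12
                  for x in px for y in py)
print(independent)   # False: e.g. p(3,3) = 0 but p_X(3) p_Y(3) > 0
```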

Example 2c. A man and a woman decide to meet at a certain location. If each of them independently arrives at a time uniformly distributed between 12 noon and 1 P.M., find the probability that the first to arrive has to wait longer than 10 minutes.

Solution 2c. If we let X and Y denote, respectively, the number of minutes past 12 at which the man and the woman arrive, then X and Y are independent random variables, each uniformly distributed over (0, 60). The desired probability, P{X + 10 < Y} + P{Y + 10 < X}, which by symmetry equals 2P{X + 10 < Y}, is obtained as follows:

2P{X + 10 < Y} = 2 ∬_{x+10<y} f(x, y) dx dy = 2 ∬_{x+10<y} f_X(x) f_Y(y) dx dy = 2 ∫_{10}^{60} ∫_0^{y−10} (1/60)^2 dx dy = (2/60^2) ∫_{10}^{60} (y − 10) dy = 25/36
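The event that the first to arrive waits longer than 10 minutes is {|X − Y| > 10}, so the answer 25/36 ≈ 0.694 is easy to confirm by simulation (a sketch, standard library only):

```python
import random

# Example 2c: X, Y independent Uniform(0, 60) minutes past noon.
# First to arrive waits more than 10 minutes iff |X - Y| > 10.
N = 1_000_000
hits = sum(abs(random.uniform(0, 60) - random.uniform(0, 60)) > 10
           for _ in range(N))
print(hits / N, 25 / 36)   # both ~ 0.694
```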

Example 2d (Buffon's needle problem). A table is ruled with equidistant parallel lines a distance D apart. A needle of length L, where L ≤ D, is randomly thrown on the table. What is the probability that the needle will intersect one of the lines (the other possibility being that the needle will be completely contained in the strip between two lines)?

Solution 2d. Let us determine the position of the needle by specifying (1) the distance X from the middle point of the needle to the nearest parallel line and (2) the angle θ between the needle and the projected line of length X. The needle will intersect a line if the hypotenuse of the corresponding right triangle is less than L/2, that is, if

X / cos θ < L/2, or equivalently X < (L/2) cos θ

Solution 2d (continued). As X varies between 0 and D/2 and θ between 0 and π/2, it is reasonable to assume that they are independent, uniformly distributed random variables over these respective ranges. Hence,

P{X < (L/2) cos θ} = ∬_{x < (L/2) cos y} f_X(x) f_θ(y) dx dy = (4/(πD)) ∫_0^{π/2} ∫_0^{(L/2) cos y} dx dy = (4/(πD)) ∫_0^{π/2} (L/2) cos y dy = 2L/(πD)
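The answer 2L/(πD) can also be confirmed by simulating X uniform on (0, D/2) and θ uniform on (0, π/2); with L = 1 and D = 2 (values chosen here for illustration) the probability is 1/π ≈ 0.318. A sketch:

```python
import math
import random

# Buffon's needle (Example 2d): intersection iff X < (L/2) cos(theta),
# with X ~ Uniform(0, D/2) and theta ~ Uniform(0, pi/2) independent.
L, D = 1.0, 2.0
N = 1_000_000
hits = sum(random.uniform(0, D / 2) < (L / 2) * math.cos(random.uniform(0, math.pi / 2))
           for _ in range(N))
print(hits / N, 2 * L / (math.pi * D))   # both ~ 0.318
```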

Proposition 2.1. The continuous (discrete) random variables X and Y are independent if and only if their joint probability density (mass) function can be expressed as

f_{X,Y}(x, y) = h(x) g(y), −∞ < x < ∞, −∞ < y < ∞

Example 2f. If the joint density function of X and Y is

f(x, y) = 6e^{−2x} e^{−3y}, 0 < x < ∞, 0 < y < ∞

and is equal to 0 outside this region, are the random variables independent? What if the joint density function is

f(x, y) = 24xy, 0 < x < 1, 0 < y < 1, 0 < x + y < 1

and is equal to 0 otherwise?

Solution 2f. In the first instance, the joint density function factors, and thus the random variables are independent (with one being exponential with rate 2 and the other exponential with rate 3). In the second instance, because the region in which the joint density is nonzero cannot be expressed in the form x ∈ A, y ∈ B, the joint density does not factor, so the random variables are not independent. This can be seen clearly by letting

I(x, y) = 1 if 0 < x < 1, 0 < y < 1, 0 < x + y < 1, and I(x, y) = 0 otherwise

Solution 2f (continued). Writing

f(x, y) = 24xy I(x, y)

we see that f clearly does not factor into a part depending only on x and another depending only on y.
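A concrete way to see the failure of independence: integrating 24xy over y ∈ (0, 1 − x) gives the marginal f_X(x) = 12x(1 − x)^2 (and f_Y = f_X by symmetry), and the joint density visibly differs from the product of the marginals. A small numerical sketch (function names are mine):

```python
# Example 2f, second density: f(x, y) = 24xy on the triangle 0 < x, y, x + y < 1.
def f(x, y):
    return 24 * x * y if (x > 0 and y > 0 and x + y < 1) else 0.0

def marginal(t):
    return 12 * t * (1 - t) ** 2   # f_X(t) = f_Y(t) = 12 t (1 - t)^2 on (0, 1)

x, y = 0.25, 0.25
print(f(x, y), marginal(x) * marginal(y))   # 1.5 vs ~2.85: unequal, so dependent
```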

Example 2h. Let X, Y, Z be independent and uniformly distributed over (0, 1). Compute P{X ≥ YZ}.

Solution 2h. Since

f_{X,Y,Z}(x, y, z) = f_X(x) f_Y(y) f_Z(z) = 1 for 0 ≤ x, y, z ≤ 1

we have

P{X ≥ YZ} = ∭_{x ≥ yz} f_{X,Y,Z}(x, y, z) dx dy dz = ∫_0^1 ∫_0^1 ∫_{yz}^1 dx dy dz = ∫_0^1 ∫_0^1 (1 − yz) dy dz = ∫_0^1 (1 − z/2) dz = 3/4
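The value 3/4 can be confirmed with a short Monte Carlo sketch (standard library only):

```python
import random

# Example 2h: X, Y, Z independent Uniform(0, 1); P{X >= Y Z} = 3/4.
N = 1_000_000
hits = sum(random.random() >= random.random() * random.random()
           for _ in range(N))
print(hits / N)   # ~ 0.75
```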

Independence is a symmetric relation. The random variables X and Y are independent if their joint density function (or mass function in the discrete case) is the product of their individual density (or mass) functions. Therefore, to say that X is independent of Y is equivalent to saying that Y is independent of X, or simply that X and Y are independent. Consequently, when it is not intuitively clear whether knowing the value of Y changes the probabilities concerning X, it can be helpful to interchange the roles of X and Y and ask instead whether Y is independent of X.

Integrating e^{−4x}. Since (e^x)′ = e^x, we have ∫ e^x dx = e^x + C; that is, ∫ e^u du = e^u + C. Let u = −4x, so that du = −4 dx. Then

∫ e^{−4x} dx = −(1/4) ∫ e^{−4x} (−4) dx = −(1/4) ∫ e^u du = −(1/4) e^u + C = −(1/4) e^{−4x} + C
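A quick numerical check of this antiderivative (a sketch; the evaluation point x = 0.3 is arbitrary) compares a central-difference derivative of F(x) = −(1/4) e^{−4x} with e^{−4x}:

```python
import math

# Central-difference check that F(x) = -(1/4) e^{-4x} differentiates to e^{-4x}.
F = lambda t: -0.25 * math.exp(-4 * t)
x, h = 0.3, 1e-6
print((F(x + h) - F(x - h)) / (2 * h), math.exp(-4 * x))   # both ~ 0.3012
```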