Bivariate Distributions

Bivariate Distributions. EGR 260, R. Van Til, Industrial & Systems Engineering Dept. Copyright 2013 Robert P. Van Til. All rights reserved.

What's It All About? Many random processes produce two or more numerical outcomes. Examples: » The length and diameter of a part made by a CNC lathe. » Your SYS 317 final exam score and the amount of time spent studying. Note that the 2 outcomes may be related to each other.

What's It All About? In this presentation, we will extend our probability results to random processes with 2 RVs, including the development of probability functions, pdfs, expected value formulas, etc. Also, some new concepts will be presented. Note that these results can be readily extended to random processes with 3 or more outcomes.

Definition. Let X and Y be discrete RVs. The joint probability function of X and Y is given by f_XY(x,y) = P[(X = x) ∩ (Y = y)]. The marginal probability functions of X and Y, respectively, are given by f_X(x) = Σ_y f_XY(x,y) and f_Y(y) = Σ_x f_XY(x,y).
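
For concreteness, here is a minimal Python sketch of this idea: a joint probability function stored as a table keyed by (x, y), with the marginals obtained by summing over the other variable. The probabilities below are arbitrary placeholders, not values from the slides.

# Minimal sketch: a discrete joint probability function stored as a dict
# keyed by (x, y); marginals are obtained by summing over the other RV.
# The numbers below are illustrative placeholders only.
from collections import defaultdict

f_xy = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.3}

f_x = defaultdict(float)
f_y = defaultdict(float)
for (x, y), p in f_xy.items():
    f_x[x] += p          # marginal of X: sum over y
    f_y[y] += p          # marginal of Y: sum over x

assert abs(sum(f_xy.values()) - 1.0) < 1e-12   # probabilities sum to 1
print(dict(f_x), dict(f_y))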

Properties. Note the joint probability function computes probability! Hence, it satisfies all the properties of a probability function, such as Σ_x Σ_y f_XY(x,y) = 1 and 0 ≤ f_XY(x,y) ≤ 1 for any (x,y) ∈ S.

Calculating P(x,y). Consider a discrete bivariate distribution with RVs X and Y. Values for the joint probability function can be calculated using f_XY(x,y) = P[(X = x) ∩ (Y = y)] = f_{Y|x}(y|x) f_X(x),   (1) where f_{Y|x}(y|x) is the conditional probability function of Y given X = x and f_X(x) is the marginal probability function of X. Note it's OK to exchange X with Y in (1).

Example. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; and Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Determine the joint probability function and the marginal probability functions.

Aside. Note that for many discrete bivariate distributions, a simple formula for the joint probability function is not available. Hence, the joint probability function is placed in a table. » The table also contains the marginal probability functions.

Example. For this example, the structure for the joint probability function table is

X \ Y     0            1            f_X(x)
0         f_XY(0,0)    f_XY(0,1)    f_X(0)
1         f_XY(1,0)    f_XY(1,1)    f_X(1)
f_Y(y)    f_Y(0)       f_Y(1)       1.0

The marginal probability functions are the row and column sums of the table: f_X(x) = Σ_{i=1}^n f_XY(x, y_i) (sum across a row) and f_Y(y) = Σ_{i=1}^n f_XY(x_i, y) (sum down a column).

Back to the Example Problem. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; and Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Determine the joint probability function and the marginal probability functions.

Example. Using equation (1) and the fact that the parts are drawn without replacement: f_XY(0,0) = f_{Y|x}(0|0) f_X(0) = (48/55)(49/56) ≈ 0.764, f_XY(0,1) = (7/55)(49/56) ≈ 0.111, f_XY(1,0) = (49/55)(7/56) ≈ 0.111, and f_XY(1,1) = (6/55)(7/56) ≈ 0.014.

Example. Hence,

X \ Y     0        1        f_X(x)
0         0.764    0.111    0.875
1         0.111    0.014    0.125
f_Y(y)    0.875    0.125    1.0
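
As a numerical check of the table, a short Python sketch can rebuild it by applying equation (1), f_XY(x,y) = f_{Y|x}(y|x) f_X(x), to the 56-part bin with 7 defectives.

# Sketch: joint probabilities for drawing 2 parts without replacement
# from 56 parts with 7 defective, using f_XY(x,y) = f_{Y|x}(y|x) * f_X(x).
from fractions import Fraction

N, D = 56, 7                       # total parts, defective parts
G = N - D                          # good parts

def f_x(x):                        # marginal of X (first draw)
    return Fraction(G, N) if x == 0 else Fraction(D, N)

def f_y_given_x(y, x):             # second draw, conditioned on the first
    good_left = G - (1 if x == 0 else 0)
    defect_left = D - (1 if x == 1 else 0)
    total_left = N - 1
    return Fraction(good_left, total_left) if y == 0 else Fraction(defect_left, total_left)

for x in (0, 1):
    for y in (0, 1):
        p = f_y_given_x(y, x) * f_x(x)
        print(f"f_XY({x},{y}) = {float(p):.3f}")   # 0.764, 0.111, 0.111, 0.014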

Definition. Let X and Y be continuous RVs. The joint probability density function, f_XY(x,y), is such that P[(a ≤ X ≤ b) ∩ (c ≤ Y ≤ d)] = ∫_a^b ∫_c^d f_XY(x,y) dy dx. The marginal probability density functions are given by f_X(x) = ∫_{-∞}^{∞} f_XY(x,y) dy and f_Y(y) = ∫_{-∞}^{∞} f_XY(x,y) dx.

Properties. The joint pdf is a pdf! Hence, f_XY(x,y) ≥ 0 for all (x,y) ∈ S and ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_XY(x,y) dy dx = 1.

Example. Any process for producing an industrial chemical will yield a product containing impurities. For a randomly selected sample of a particular chemical, let RVs X = {proportion of all impurities in the sample} and Y = {proportion of type 1 impurities among all impurities found in the sample}. After investigating several samples, it's determined that f_XY(x,y) = 2(1 - x) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 elsewhere. Determine the probability that X < 0.5 and 0.4 ≤ Y ≤ 0.7.

Example. P(X < 0.5, 0.4 ≤ Y ≤ 0.7) = ∫_0^0.5 ∫_0.4^0.7 2(1 - x) dy dx = 0.6 ∫_0^0.5 (1 - x) dx = 0.6 (0.5 - 0.125) = 0.225.
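
The double integral can also be checked numerically; here is a simple midpoint Riemann sum in Python (no libraries assumed), which also confirms that the joint pdf integrates to 1 over the unit square.

# Numeric check of P(X < 0.5, 0.4 <= Y <= 0.7) for f_XY(x,y) = 2(1 - x)
# on the unit square, using a simple midpoint Riemann sum.
def f(x, y):
    return 2.0 * (1.0 - x) if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def double_integral(f, ax, bx, ay, by, n=400):
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f(ax + (i + 0.5) * hx, ay + (j + 0.5) * hy)
    return total * hx * hy

print(double_integral(f, 0.0, 1.0, 0.0, 1.0))    # ~1.0 (valid joint pdf)
print(double_integral(f, 0.0, 0.5, 0.4, 0.7))    # ~0.225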

Recall. As noted earlier, the conditional probability function for discrete RVs X and Y is f_{X|y}(x|y) = f_XY(x,y) / f_Y(y) for all f_Y(y) ≠ 0.

Definition. The conditional probability density function for continuous RVs X and Y is given by f_{X|y}(x|y) = f_XY(x,y) / f_Y(y) for f_Y(y) > 0. The conditional pdf is used to calculate conditional probabilities, such as P(a < X < b | c < Y < d) = ∫_a^b ∫_c^d f_{X|y}(x|y) dy dx.

Example. A soft drink machine starts the day with a supply of Y gallons and dispenses X gallons during the day without being resupplied. It is observed that X and Y have a joint pdf of f_XY(x,y) = 0.5 for 0 ≤ x ≤ y, 0 ≤ y ≤ 2, and 0 elsewhere. Determine the probability that less than 0.4 gallons of pop is sold during the day given that the machine started the day with more than 1 gallon of pop.

Example. Hence, we want to determine P(X < 0.4 | Y > 1) = ∫_0^0.4 ∫_1^2 f_{X|y}(x|y) dy dx.

Example. So, the marginal pdf of Y is f_Y(y) = ∫_{-∞}^{y} f_XY(x,y) dx = ∫_0^y 0.5 dx = 0.5y for 0 ≤ y ≤ 2. Thus, P(X < 0.4 | Y > 1) = ∫_0^0.4 ∫_1^2 (0.5 / (0.5y)) dy dx.

Example. Note that if the machine had started the day with Y > 1.5 gallons, then P(X < 0.4 | Y > 1.5) = ∫_0^0.4 ∫_1.5^2 f_{X|y}(x|y) dy dx = ∫_0^0.4 0.194 dx = 0.078. But P(X < 0.4 | Y > 1) computed above is different. » This implies that the probability of dispensing less than 0.4 gallons depends on how much pop the machine started the day with, i.e., X and Y are not independent.
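
For reference, the double integral set up above for the Y > 1 case, with f_{X|y}(x|y) = 0.5/(0.5y) = 1/y on 0 ≤ x ≤ y, can be evaluated numerically with a short Python sketch; under that setup it works out to 0.4·ln 2 ≈ 0.277. This only evaluates the integral as written on the slide, nothing more.

# Numeric evaluation of the double integral set up on the slide:
# integral over x in (0, 0.4) and y in (1, 2) of f_{X|y}(x|y),
# where f_{X|y}(x|y) = f_XY(x,y) / f_Y(y) = 0.5 / (0.5*y) = 1/y for 0 <= x <= y.
import math

def f_x_given_y(x, y):
    return 1.0 / y if 0.0 <= x <= y else 0.0

n = 1000
hx, hy = 0.4 / n, 1.0 / n
total = sum(f_x_given_y((i + 0.5) * hx, 1.0 + (j + 0.5) * hy)
            for i in range(n) for j in range(n)) * hx * hy
print(total, 0.4 * math.log(2))   # both ~0.277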

Definition. The RVs X and Y are statistically independent if f_XY(x,y) = f_X(x) f_Y(y) for all (x,y) ∈ S. The RVs can be discrete or continuous. This is our existing definition of independence written in terms of RVs.

Example. The joint probability function and marginal probability functions from the earlier good-bad parts problem are given below. Are the RVs X and Y statistically independent?

X \ Y     0        1        f_X(x)
0         0.764    0.111    0.875
1         0.111    0.014    0.125
f_Y(y)    0.875    0.125    1.0

Example. Check the product rule: f_X(0) f_Y(0) = (0.875)(0.875) ≈ 0.766, but f_XY(0,0) = 0.764. Since f_XY(x,y) ≠ f_X(x) f_Y(y), the RVs X and Y are not statistically independent.
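
The cell-by-cell comparison can also be done with a few lines of Python using the table values above; since the table entries are rounded to three decimals, a small tolerance is used.

# Independence check: compare f_XY(x,y) with f_X(x) * f_Y(y) for each cell
# of the good/bad parts table (values as given on the slide).
f_xy = {(0, 0): 0.764, (0, 1): 0.111, (1, 0): 0.111, (1, 1): 0.014}
f_x = {0: 0.875, 1: 0.125}
f_y = {0: 0.875, 1: 0.125}

independent = all(abs(f_xy[(x, y)] - f_x[x] * f_y[y]) < 1e-3
                  for x in f_x for y in f_y)
print(independent)   # False: e.g., 0.764 != 0.875 * 0.875 = 0.766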

Example. Does this make sense? Consider the random process. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; and Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Because the parts are drawn without replacement, the first draw changes the mix of parts remaining in the bin, so the second draw depends (slightly) on the first.

Definition. Let g(X,Y) be any real-valued function of RVs X and Y. If RVs X and Y are discrete, then the expected value of g(X,Y) is E[g(X,Y)] = Σ_x Σ_y g(x,y) f_XY(x,y). If RVs X and Y are continuous, then the expected value of g(X,Y) is E[g(X,Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x,y) f_XY(x,y) dy dx.
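
In the discrete case, E[g(X,Y)] is just a probability-weighted sum over the joint table. A short Python sketch using the good/bad parts table and two illustrative choices of g:

# E[g(X,Y)] for a discrete joint distribution: sum of g(x,y) * f_XY(x,y)
# over all cells. Table values are from the good/bad parts example above.
f_xy = {(0, 0): 0.764, (0, 1): 0.111, (1, 0): 0.111, (1, 1): 0.014}

def expect(g):
    return sum(g(x, y) * p for (x, y), p in f_xy.items())

print(expect(lambda x, y: x * y))         # E[XY] = 0.014
print(expect(lambda x, y: (x - y) ** 2))  # E[(X - Y)^2] = 0.222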

Properties of Expected Value. If RVs X and Y are statistically independent, then E(XY) = E(X)E(Y). For any RVs X_1, X_2, ..., X_n (they do not need to be statistically independent), E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n). Also, E(X_1 - X_2) = E(X_1) - E(X_2).
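
A quick numerical illustration of the additivity property, using the good/bad parts table (where X and Y are not independent):

# Additivity of expectation holds even though X and Y here are NOT independent:
# E(X + Y) equals E(X) + E(Y) for the good/bad parts table.
f_xy = {(0, 0): 0.764, (0, 1): 0.111, (1, 0): 0.111, (1, 1): 0.014}
e_x = sum(x * p for (x, y), p in f_xy.items())          # 0.125
e_y = sum(y * p for (x, y), p in f_xy.items())          # 0.125
e_sum = sum((x + y) * p for (x, y), p in f_xy.items())  # 0.25
print(e_x + e_y, e_sum)                                  # both 0.25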

Definition and Property. Let g(X,Y) be any real-valued function of RVs X and Y; then VAR[g(X,Y)] = E[(g(X,Y) - E[g(X,Y)])^2] = E[g^2(X,Y)] - E^2[g(X,Y)]. Valid if X and Y are discrete RVs or continuous RVs.

Definition and Property. If X_1, X_2, ..., X_n are statistically independent, then VAR(X_1 + ... + X_n) = VAR(X_1) + ... + VAR(X_n). Also, if X_1 and X_2 are statistically independent, then VAR(X_1 - X_2) = VAR(X_1) + VAR(X_2).

Special Property. Suppose the normal RVs X ~ N(μ_X, σ_X) and Y ~ N(μ_Y, σ_Y) are statistically independent; then the RV given by Q = X + Y is also a normal RV with Q ~ N(μ_X + μ_Y, √(σ_X^2 + σ_Y^2)). Note this result holds for n statistically independent normal RVs: Q = X_1 + X_2 + ... + X_n ~ N(μ_1 + μ_2 + ... + μ_n, √(σ_1^2 + σ_2^2 + ... + σ_n^2)).
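
A Monte Carlo sketch of this property (NumPy assumed; the means and standard deviations below are chosen arbitrarily for illustration):

# Monte Carlo illustration: X ~ N(2, 3) and Y ~ N(-1, 4) independent, so
# Q = X + Y should have mean 2 + (-1) = 1 and std dev sqrt(3**2 + 4**2) = 5.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(loc=2.0, scale=3.0, size=n)
y = rng.normal(loc=-1.0, scale=4.0, size=n)
q = x + y

print(q.mean(), q.std())    # approximately 1.0 and 5.0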

Note. All of what we have covered in bivariate distributions up to now has been applying existing concepts (e.g., pdfs or expected values) to a bivariate random process. Next, we introduce a new concept!

Definition. The covariance of bivariate RVs X and Y is a measure of how the two RVs vary together. 1. If X tends to be large when Y tends to be large, then X and Y will have a positive covariance. 2. If X tends to be large when Y tends to be small, then X and Y will have a negative covariance. 3. If X and Y are unrelated (i.e., statistically independent), then the covariance of X and Y will be zero.

Illustration of Covariance. Consider a random process that produces RVs X and Y. Collect several samples from this process and make a scatter plot of its data. [Three scatter plots of sample (X, Y) data illustrating the three cases above: positive covariance, negative covariance, and covariance near zero.]

Definition. The covariance between RVs X and Y is given by COV(X,Y) = E[(X - μ_X)(Y - μ_Y)], where μ_X = E(X) and μ_Y = E(Y). The formula is valid for discrete RVs or continuous RVs. Note an alternative formula for covariance is COV(X,Y) = E(XY) - E(X)E(Y).

Definition and Property. Recall if X and Y are statistically independent, then E(XY) = E(X)E(Y), and hence COV(X,Y) = 0. The converse is not necessarily true. Note that for RVs X and Y, VAR(X ± Y) = VAR(X) + VAR(Y) ± 2 COV(X,Y).

A Problem with Covariance. What is considered a large positive, or negative, value for COV(X,Y)? It depends on the units of the RVs X and Y. We can eliminate this problem by considering the correlation coefficient, which is defined by ρ = COV(X,Y) / (σ_X σ_Y). Note that -1 ≤ ρ ≤ 1. Also, ρ has the same sign as COV(X,Y).

Example. Determine the covariance and ρ for RVs X and Y for the previous good-bad parts example, where the joint probability function and marginal probability functions are given by

X \ Y     0        1        f_X(x)
0         0.764    0.111    0.875
1         0.111    0.014    0.125
f_Y(y)    0.875    0.125    1.0

Example. E(X) = E(Y) = (0)(0.875) + (1)(0.125) = 0.125 and E(XY) = (1)(1) f_XY(1,1) = (7/56)(6/55) ≈ 0.0136, so COV(X,Y) = E(XY) - E(X)E(Y) ≈ 0.0136 - 0.0156 ≈ -0.002.

Example. Also, VAR(X) = VAR(Y) = E(X^2) - E^2(X) = 0.125 - 0.0156 ≈ 0.109, so σ_X = σ_Y ≈ 0.331 and ρ = COV(X,Y) / (σ_X σ_Y) ≈ -0.002 / 0.109 ≈ -0.018.
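
The same numbers can be checked in Python using exact fractions for the 56-part bin:

# Covariance and correlation coefficient for the good/bad parts example,
# computed from exact fractions (7 defective out of 56, drawn without replacement).
from fractions import Fraction
from math import sqrt

f_xy = {(0, 0): Fraction(49, 56) * Fraction(48, 55),
        (0, 1): Fraction(49, 56) * Fraction(7, 55),
        (1, 0): Fraction(7, 56) * Fraction(49, 55),
        (1, 1): Fraction(7, 56) * Fraction(6, 55)}

e_x  = sum(x * p for (x, y), p in f_xy.items())
e_y  = sum(y * p for (x, y), p in f_xy.items())
e_xy = sum(x * y * p for (x, y), p in f_xy.items())
var_x = sum(x * x * p for (x, y), p in f_xy.items()) - e_x ** 2
var_y = sum(y * y * p for (x, y), p in f_xy.items()) - e_y ** 2

cov = e_xy - e_x * e_y
rho = float(cov) / sqrt(float(var_x) * float(var_y))
print(float(cov), rho)   # approximately -0.002 and -0.018: near zero, as expected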

Example. Recall the random process. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; and Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Earlier we concluded that X and Y are statistically dependent, but not by too much! Hence, COV(X,Y) and ρ should be near zero.