Functions of two random variables. Conditional pairs


Handout 10: Functions of two random variables. Conditional pairs

"Science is a wonderful thing if one does not have to earn a living at it. One should earn one's living by work of which one is sure one is capable. Only when we do not have to be accountable to anyone can we find joy in scientific endeavor." Albert Einstein

Functions of two random variables

For discrete random variables X, Y, the random variable W = g(X, Y) has the PMF:

P_W(w) = \sum_{(x,y):\, g(x,y)=w} P_{X,Y}(x, y)

For continuous random variables X and Y, the CDF of W = g(X, Y) is:

F_W(w) = P(W \le w) = \iint_{g(x,y) \le w} f_{X,Y}(x, y)\, dx\, dy

In particular, the CDF of W = max(X, Y) is:

F_W(w) = F_{X,Y}(w, w) = \int_{-\infty}^{w} \int_{-\infty}^{w} f_{X,Y}(x, y)\, dx\, dy

The expected value of W = g(X, Y) is:

E(W) = \sum_{x \in S_X} \sum_{y \in S_Y} g(x, y) P_{X,Y}(x, y)

in the discrete case and:

E(W) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f_{X,Y}(x, y)\, dx\, dy

in the continuous case. In general:

E(X + Y) = E(X) + E(Y)

for any random variables X, Y, but:

var(X + Y) = var(X) + var(Y) + 2 E[(X - E(X))(Y - E(Y))]

with var(X + Y) = var(X) + var(Y)

in the case when X and Y are independent.

Conditioning by an event

For any event B, a region of the (X, Y) plane with P(B) > 0, the conditional joint probability mass function of X, Y given B is:

P_{X,Y|B}(x, y) = P_{X,Y}(x, y) / P(B) if (x, y) \in B, and 0 otherwise.

The conditional joint probability density function of X and Y given B is:

f_{X,Y|B}(x, y) = f_{X,Y}(x, y) / P(B) if (x, y) \in B, and 0 otherwise.

The conditional expected value of W = g(X, Y) given B is:

E(W | B) = \sum_{x \in S_X} \sum_{y \in S_Y} g(x, y) P_{X,Y|B}(x, y)

in the discrete case and:

E(W | B) = \iint g(x, y) f_{X,Y|B}(x, y)\, dx\, dy

in the continuous case. For the variance one has:

var(W | B) = E(W^2 | B) - [E(W | B)]^2

Conditioning by a random variable

For any event {Y = y} such that P_Y(y) > 0, the conditional PMF is:

P_{X|Y}(x | y) = P(X = x | Y = y)

One can observe that:

P_{X|Y}(x | y) = P(X = x, Y = y) / P(Y = y) = P_{X,Y}(x, y) / P_Y(y)

thus for a pair of random variables X, Y one has the general rule:

P(X = x, Y = y) = P_{X|Y}(x | y) P_Y(y) = P_{Y|X}(y | x) P_X(x)

For a continuous random variable, the conditional PDF of X given Y = y, where f_Y(y) > 0, is:

f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y)

There are similar formulae for the conditional expected value:

E(X | Y = y) = \sum_x x P_{X|Y}(x | y)

E(g(X, Y) | Y = y) = \sum_x g(x, y) P_{X|Y}(x | y)

in the discrete case, and:

E(X | Y = y) = \int_{-\infty}^{\infty} x f_{X|Y}(x | y)\, dx

E(g(X, Y) | Y = y) = \int_{-\infty}^{\infty} g(x, y) f_{X|Y}(x | y)\, dx

in the continuous case.

Variance of the sum

For two random variables X, Y the following formula holds:

var(X + Y) = var(X) + var(Y) + 2 cov(X, Y)

where:

cov(X, Y) = E(XY) - E(X)E(Y)

Of course, cov(X, X) = var(X).
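The discrete formulas above are mechanical enough to check numerically. The sketch below uses a small hypothetical joint PMF (not from the handout) to compute the PMF of W = X + Y by grouping joint probabilities, a conditional PMF P_{X|Y} and conditional expectation, and to verify the variance-of-the-sum identity.

```python
# Hypothetical joint PMF of (X, Y), for illustration only.
# Keys are (x, y) pairs; the values sum to 1.
p_xy = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

def pmf_of_g(p_xy, g):
    """PMF of W = g(X, Y): group joint probabilities by the value of g."""
    p_w = {}
    for (x, y), p in p_xy.items():
        w = g(x, y)
        p_w[w] = p_w.get(w, 0.0) + p
    return p_w

p_sum = pmf_of_g(p_xy, lambda x, y: x + y)   # PMF of W = X + Y

# Marginals by summing the joint PMF over the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

def cond_pmf_x_given_y(y):
    """P_{X|Y}(x | y) = P_{X,Y}(x, y) / P_Y(y)."""
    return {x: p_xy.get((x, y), 0.0) / p_y[y] for x in p_x}

# E(X | Y = 1) = sum over x of x * P_{X|Y}(x | 1)
e_x_given_1 = sum(x * p for x, p in cond_pmf_x_given_y(1).items())

def E(f):
    """Expected value of f(X, Y) under the joint PMF."""
    return sum(f(x, y) * p for (x, y), p in p_xy.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - ex) ** 2)
var_y = E(lambda x, y: (y - ey) ** 2)
cov = E(lambda x, y: x * y) - ex * ey

# var(X + Y) computed directly matches var(X) + var(Y) + 2 cov(X, Y).
var_sum = E(lambda x, y: (x + y - (ex + ey)) ** 2)
assert abs(var_sum - (var_x + var_y + 2 * cov)) < 1e-9
```

The same grouping idea implements P_W(w) for any g, e.g. `pmf_of_g(p_xy, max)` gives the PMF of max(X, Y).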

Proposed problems

Problem 1. A business trip is equally likely to take 2, 3, or 4 days. After a d-day trip, the change in the traveler's weight, measured as an integer number of pounds, is uniformly distributed between -d and d pounds. For one such trip, denote the number of days by D and the change in weight by W. Find the joint PMF P_{D,W}(d, w).

Problem 2. If the random variables X and Y have joint density function:

f(x, y) = xy/96 for 0 < x < 4, 1 < y < 5, and 0 otherwise,

find the density function of U = X + 2Y.

Problem 3. Let X and Y be two independent random variables, each having the standard normal distribution. What is the probability distribution function of X^2 + Y^2?

Problem 4. A company receives shipments from two factories. Depending on the size of the order, a shipment can be in: 1 box for a small order, 2 boxes for a medium order, or 3 boxes for a large order. The company has two different suppliers: Factory Q is 60 miles from the company, and Factory R is 180 miles from the company. An experiment consists of monitoring a shipment and observing B, the number of boxes, and M, the number of miles the shipment travels. The following probability model describes the experiment:
(a) Find P_{B,M}(b, m), the joint PMF of the number of boxes and the distance. (You may present your answer as a matrix if you like.)
(b) What is E[B], the expected number of boxes?
(c) What is P_{M|B}(m | 2), the conditional PMF of the distance when the order requires two boxes?
(d) Find E[M | B = 2], the expected distance given that the order requires 2 boxes.
(e) Are the random variables B and M independent?
(f) The price of sending each box is one cent per mile the box travels. C cents is the price of one shipment. What is E[C], the expected price of one shipment?
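Parts (b)-(f) of Problem 4 all reduce to the marginal, conditional, and expectation formulas above. The sketch below works through them with a hypothetical joint PMF table P_{B,M}(b, m), since the handout's actual table is not assumed here; the method is what matters.

```python
# Hypothetical joint PMF P_{B,M}(b, m), for illustration only.
# b = number of boxes (1, 2, 3); m = miles (60 for Q, 180 for R).
p_bm = {
    (1, 60): 0.3, (2, 60): 0.2, (3, 60): 0.1,
    (1, 180): 0.2, (2, 180): 0.1, (3, 180): 0.1,
}
assert abs(sum(p_bm.values()) - 1.0) < 1e-9  # a valid PMF sums to 1

# (b) E[B]: marginalize over m, then take the mean.
p_b = {}
for (b, m), p in p_bm.items():
    p_b[b] = p_b.get(b, 0.0) + p
e_b = sum(b * p for b, p in p_b.items())

# (c) P_{M|B}(m | 2) = P_{B,M}(2, m) / P_B(2)
p_m_given_2 = {m: p_bm[(2, m)] / p_b[2] for m in (60, 180)}

# (d) E[M | B = 2]: mean of the conditional PMF from (c).
e_m_given_2 = sum(m * p for m, p in p_m_given_2.items())

# (e) Independence: does P_{B,M}(b, m) = P_B(b) P_M(m) for every pair?
p_m = {}
for (b, m), p in p_bm.items():
    p_m[m] = p_m.get(m, 0.0) + p
independent = all(
    abs(p_bm[(b, m)] - p_b[b] * p_m[m]) < 1e-9 for (b, m) in p_bm
)

# (f) One cent per mile per box means C = B * M, so E[C] = E[BM].
e_c = sum(b * m * p for (b, m), p in p_bm.items())
```

With this hypothetical table, E[B] = 1.7, E[M | B = 2] = 100, B and M come out dependent, and E[C] = 186 cents; with the handout's own table the same code applies unchanged.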

Problem 5. An ice cream company orders supplies by fax. Depending on the size of the order, a fax can be either 1 page for a short order or 2 pages for a long order. The company has three different suppliers: the vanilla supplier is 20 miles away, the chocolate supplier is 100 miles away, and the strawberry supplier is 300 miles away. An experiment consists of monitoring an order and observing N, the number of pages, and D, the distance the order is transmitted. The following probability model describes the experiment:
(a) What is the joint PMF P_{N,D}(n, d) of the number of pages and the distance?
(b) What is E[D], the expected distance of an order?
(c) Find P_{D|N}(d | 2), the conditional PMF of the distance when the order requires 2 pages.
(d) Write E[D | N = 2], the expected distance given that the order requires 2 pages.
(e) Are the random variables D and N independent?
(f) The price of sending a fax page is one cent per mile transmitted. C cents is the price of one fax. What is E[C], the expected price of one fax?