STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 James Piette June 24, 2010

Updates: HW4 is up on my website and is due next Mon. (June 28th). Today we pick back up at Section 6.7 and will begin Ch. 7.

Joint Distribution of Functions of Cont. R.V.'s Let $X_1$ and $X_2$ be jointly cont. r.v.'s with joint pdf $f_{X_1,X_2}$. Suppose there are two r.v.'s, $Y_1$ and $Y_2$, such that $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ for some functions $g_1$ and $g_2$. There are two assumptions:

1. The equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions given by $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.

2. The functions $g_1$ and $g_2$ have cont. partial derivatives at all points $(x_1, x_2)$ and are s.t. the $2 \times 2$ determinant
$$J(x_1, x_2) = \begin{vmatrix} \partial g_1/\partial x_1 & \partial g_1/\partial x_2 \\ \partial g_2/\partial x_1 & \partial g_2/\partial x_2 \end{vmatrix} \neq 0.$$

Joint Distribution of Functions of Cont. R.V.'s (cont.) Under the previous two assumptions, the r.v.'s $Y_1$ and $Y_2$ are jointly cont. with joint density given by $$f_{Y_1 Y_2}(y_1, y_2) = f_{X_1 X_2}(x_1, x_2)\,|J(x_1, x_2)|^{-1}$$ where $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.

Example 1 Let $X_1, X_2, X_3$ be independent standard normal r.v.'s. Question: If $Y_1 = X_1 + X_2 + X_3$, $Y_2 = X_1 - X_2$ and $Y_3 = X_1 - X_3$, then what is the joint density of $Y_1, Y_2, Y_3$? Solution: Need to verify the assumptions. The first assumption... $$X_1 = \frac{Y_1 + Y_2 + Y_3}{3}, \qquad X_2 = \frac{Y_1 - 2Y_2 + Y_3}{3}, \qquad X_3 = \frac{Y_1 + Y_2 - 2Y_3}{3}$$ And the second assumption... $$J(x_1, x_2, x_3) = \begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 0 \\ 1 & 0 & -1 \end{vmatrix} = 3$$

Example 1 (cont.) Remember that... $$f_{X_1 X_2 X_3}(x_1, x_2, x_3) = \frac{1}{(2\pi)^{3/2}}\, e^{-\sum_{i=1}^{3} x_i^2/2}$$ Now, all we need to do is plug the appropriate values into the formulas: $$f_{Y_1 Y_2 Y_3}(y_1, y_2, y_3) = \frac{1}{3}\, f_{X_1 X_2 X_3}\!\left(\frac{y_1 + y_2 + y_3}{3}, \frac{y_1 - 2y_2 + y_3}{3}, \frac{y_1 + y_2 - 2y_3}{3}\right) = \frac{1}{3(2\pi)^{3/2}}\, e^{-Q(y_1, y_2, y_3)/2}$$ where... $$Q(y_1, y_2, y_3) = \left(\frac{y_1 + y_2 + y_3}{3}\right)^2 + \left(\frac{y_1 - 2y_2 + y_3}{3}\right)^2 + \left(\frac{y_1 + y_2 - 2y_3}{3}\right)^2 = \frac{y_1^2}{3} + \frac{2y_2^2}{3} + \frac{2y_3^2}{3} - \frac{2y_2 y_3}{3}$$
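As a quick sanity check (my own illustration, not part of the original slides): the density above is multivariate normal with inverse covariance matrix given by the coefficients of Q, so the covariance matrix should be [[3, 0, 0], [0, 2, 1], [0, 1, 2]]. A minimal Monte Carlo sketch in Python, assuming numpy (variable names and seed are my own):

import numpy as np

# Simulate Example 1 and compare the sample covariance of (Y1, Y2, Y3)
# to the covariance matrix implied by Q, namely [[3, 0, 0], [0, 2, 1], [0, 1, 2]].
rng = np.random.default_rng(0)
x1, x2, x3 = rng.standard_normal((3, 1_000_000))
y = np.vstack([x1 + x2 + x3,   # Y1
               x1 - x2,        # Y2
               x1 - x3])       # Y3
print(np.round(np.cov(y), 2))  # approximately [[3, 0, 0], [0, 2, 1], [0, 1, 2]]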

Recall: Expectation The expected value of a discrete r.v. X with pmf p(x) is given by $$E(X) = \sum_x x\,p(x)$$ The expected value of a continuous r.v. X with pdf f(x) is given by $$E(X) = \int_{-\infty}^{\infty} x f(x)\,dx$$ If $P(a \le X \le b) = 1$, then $a \le E(X) \le b$.

Proposition 7.2.1 If X and Y have a joint pmf p(x, y), then $$E(g(X, Y)) = \sum_y \sum_x g(x, y)\,p(x, y)$$ If X and Y have a joint pdf f(x, y), then $$E(g(X, Y)) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\,dx\,dy$$
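To make the discrete case concrete, here is a small sketch (a toy pmf of my own choosing, not from the lecture) that computes E(g(X, Y)) by summing g(x, y)p(x, y) over the support:

# Hypothetical joint pmf on {0,1} x {0,1}, chosen only for illustration.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
g = lambda x, y: (x + y) ** 2   # any function of (X, Y)

# Proposition 7.2.1, discrete case: E[g(X, Y)] = sum of g(x, y) * p(x, y)
e_g = sum(g(x, y) * p for (x, y), p in pmf.items())
print(e_g)  # 0.2*1 + 0.3*1 + 0.4*4 = 2.1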

Example 2 An accident occurs at point X that is uniformly dist. on a road of length L. At the time of the accident, an ambulance is at location Y that is also uniformly dist. on the road. Question: Assuming X and Y are independent, what is the expected distance between the ambulance and the point of the accident? Solution: In this scenario, we are looking to calculate the expectation of the r.v. $|X - Y|$, because we want the distance between the two.

Example 2 (cont.) First, we need the joint density function of X and Y: $$f(x, y) = \frac{1}{L^2}, \qquad 0 < x < L,\ 0 < y < L$$ From Proposition 7.2.1, $$E(|X - Y|) = \frac{1}{L^2} \int_0^L \int_0^L |x - y|\,dy\,dx$$

Example 2 (cont.) Let's evaluate the inner integral by splitting it up: $$\int_0^L |x - y|\,dy = \int_0^x (x - y)\,dy + \int_x^L (y - x)\,dy = \frac{x^2}{2} + \frac{L^2}{2} - \frac{x^2}{2} - x(L - x) = \frac{L^2}{2} + x^2 - xL$$ Therefore, $$E(|X - Y|) = \frac{1}{L^2} \int_0^L \left(\frac{L^2}{2} + x^2 - xL\right) dx = \frac{L}{3}$$
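A quick numerical check of Example 2 (my own illustration, assuming numpy; L = 10 is an arbitrary choice): simulate X and Y uniform on (0, L) and compare the sample mean of |X − Y| to L/3.

import numpy as np

# Monte Carlo sketch for E(|X - Y|) with X, Y independent Uniform(0, L).
rng = np.random.default_rng(1)
L = 10.0
x = rng.uniform(0, L, size=1_000_000)
y = rng.uniform(0, L, size=1_000_000)
print(np.abs(x - y).mean(), L / 3)  # both approximately 3.33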

Properties Let X and Y be continuous r.v.'s. If $X \le Y$, then $E(X) \le E(Y)$. $E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i)$. $E(aX + bY) = aE(X) + bE(Y)$.

Sample Mean Let $X_1, \ldots, X_n$ be i.i.d. r.v.'s having distribution F and expected value $\mu$. Such a sequence of r.v.'s is said to constitute a sample from the distribution F. The quantity $$\bar{X} = \frac{\sum_{i=1}^n X_i}{n}$$ is called the sample mean.

Sample Mean (cont.) The expectation of $\bar{X}$ is $$E(\bar{X}) = E\left[\sum_{i=1}^n \frac{X_i}{n}\right] = \frac{1}{n}\, E\left[\sum_{i=1}^n X_i\right] = \frac{1}{n} \sum_{i=1}^n E(X_i) = \mu$$ since $E(X_i) = \mu$.
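As a small illustration (mine, not the lecture's): draw many samples from any fixed distribution and the average of the sample means should sit near μ. A sketch assuming an Exponential distribution with mean μ = 2 and sample size n = 25, both arbitrary choices:

import numpy as np

# 10,000 samples of size n = 25 from an Exponential distribution with mean mu = 2.
rng = np.random.default_rng(2)
mu, n = 2.0, 25
samples = rng.exponential(scale=mu, size=(10_000, n))
print(samples.mean(axis=1).mean())  # approximately mu = 2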

Expectation of Binomial R.V. Let X be a binomial r.v. with parameters (n, p). Recalling that a binomial represents the number of successes in n independent trials when each trial has probability p of being a success, we have that $$X = X_1 + X_2 + \ldots + X_n$$ where $$X_i = \begin{cases} 1 & \text{if the } i\text{th trial is a success} \\ 0 & \text{if the } i\text{th trial is a failure} \end{cases}$$ Hence, $X_i$ is a Bernoulli r.v. with expectation $E(X_i) = 0 \cdot (1 - p) + 1 \cdot p = p$. So... $$E(X) = E(X_1) + E(X_2) + \ldots + E(X_n) = np$$
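A one-line check (my own, assuming numpy; n = 20 and p = 0.3 are arbitrary): simulate binomial draws and compare the sample mean to np.

import numpy as np

# Sketch: n = 20 trials, success probability p = 0.3, so E(X) should be np = 6.
rng = np.random.default_rng(3)
n, p = 20, 0.3
print(rng.binomial(n, p, size=1_000_000).mean(), n * p)  # both approximately 6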

Expectation of Negative Binomial R.V. Let X represent the number of trials needed to get a total of r successes, where the probability of each success is p. Then X is a negative binomial r.v. with parameters (r, p). Once again, X can be represented by $$X = X_1 + X_2 + \ldots + X_r$$ where $X_i$ is the number of trials required after the (i − 1)th success until a total of i successes is obtained. Thus, $X_i$ is a geometric r.v. with parameter p, so $E(X_i) = 1/p$ and $$E(X) = E(X_1) + E(X_2) + \ldots + E(X_r) = \frac{r}{p}$$
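Again a quick simulation sketch (mine, assuming numpy; r = 5 and p = 0.25 are arbitrary). Note that numpy's negative_binomial counts failures before the r-th success, so r is added to get total trials:

import numpy as np

# Sketch: r = 5 successes with p = 0.25, so E(X) should be r / p = 20.
rng = np.random.default_rng(4)
r, p = 5, 0.25
trials = rng.negative_binomial(r, p, size=1_000_000) + r
print(trials.mean(), r / p)  # both approximately 20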

Example 3 Suppose there are N different types of coupons, and each time one obtains a coupon it is equally likely to be any one of the N types. Question: What is the expected number of coupons one needs to amass before obtaining a complete set with at least one of each type? Solution: Let X denote the number of coupons collected before a complete set is attained. Then, $$X = X_0 + X_1 + \ldots + X_{N-1}$$ where $X_i$ is the number of additional coupons that need to be obtained, after i distinct types have been collected, in order to obtain another distinct type. How is $X_i$ distributed? Geometric with parameter $\frac{N - i}{N}$, with expectation $$E(X_i) = \frac{N}{N - i}$$

Example 3 (cont.) This implies that $$E(X) = E(X_0) + E(X_1) + \ldots + E(X_{N-1}) = 1 + \frac{N}{N-1} + \frac{N}{N-2} + \ldots + N = N\left[1 + \frac{1}{2} + \ldots + \frac{1}{N-1} + \frac{1}{N}\right]$$
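A simulation sketch of the coupon collector result (my own illustration, assuming numpy); for N = 10 the formula gives N(1 + 1/2 + ... + 1/N) ≈ 29.29:

import numpy as np

# Collect coupons uniformly from N types until all types are seen; compare to N * H_N.
rng = np.random.default_rng(5)
N, runs = 10, 20_000
counts = []
for _ in range(runs):
    seen, draws = set(), 0
    while len(seen) < N:
        seen.add(rng.integers(N))  # draw one coupon type uniformly
        draws += 1
    counts.append(draws)
print(np.mean(counts), N * sum(1 / k for k in range(1, N + 1)))  # both approximately 29.29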

Example 4 Ten hunters are waiting for ducks to fly by. When a flock of ducks flies overhead, the hunters fire at the same time, but each chooses his target at random, independently of all the others. Question: If each hunter independently hits his target with probability p, then what is the expected number of ducks that escape unhurt when a flock of 10 flies overhead? Solution: Let X denote the number of ducks that escape. Then, we can represent X as $$X = X_1 + X_2 + \ldots + X_{10}$$ where $X_i$ is an indicator (remember back?) such that $$X_i = \begin{cases} 1 & \text{if the } i\text{th duck escapes unhurt} \\ 0 & \text{otherwise} \end{cases}$$

Example 4 (cont.) Like with any indicator, its expected value is the probability of the event: each hunter picks the ith duck with probability 1/10 and hits it with probability p, so the duck escapes only if all 10 hunters miss it. Thus $$E(X_i) = P(X_i = 1) = \left(1 - \frac{p}{10}\right)^{10}$$ Therefore, $$E(X) = E(X_1) + E(X_2) + \ldots + E(X_{10}) = 10\left(1 - \frac{p}{10}\right)^{10}$$
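A Monte Carlo sketch of Example 4 (my own, assuming numpy), taking p = 0.6 as an arbitrary value; the formula gives 10(1 − 0.06)^10 ≈ 5.39:

import numpy as np

# 10 hunters each pick one of 10 ducks uniformly and hit with probability p.
rng = np.random.default_rng(6)
p, runs = 0.6, 100_000
escaped = []
for _ in range(runs):
    targets = rng.integers(10, size=10)            # duck chosen by each hunter
    hits = rng.random(10) < p                      # whether each hunter hits
    escaped.append(10 - len(set(targets[hits])))   # ducks not hit by anyone
print(np.mean(escaped), 10 * (1 - p / 10) ** 10)   # both approximately 5.39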

Motivation Expected value describes the average of a r.v. Variance describes the variation of a r.v. Covariance describes the relationship between two r.v.'s (i.e., how they interact).

Formalization Def: The covariance between two r.v.'s X and Y is defined as $$Cov(X, Y) = E[(X - E(X))(Y - E(Y))]$$ An alternative form of covariance is $$Cov(X, Y) = E[XY] - E[X]E[Y]$$ Remembering back, Cov came up when we looked at X and Y that are not independent, in $$Var(X + Y) = Var(X) + Var(Y) + 2\,Cov(X, Y)$$
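The alternative form follows by expanding the product and using linearity of expectation; a short derivation filled in here (not shown on the slide):
$$\begin{aligned}
Cov(X, Y) &= E[(X - E(X))(Y - E(Y))] \\
          &= E[XY - X\,E(Y) - Y\,E(X) + E(X)E(Y)] \\
          &= E[XY] - E(X)E(Y) - E(Y)E(X) + E(X)E(Y) \\
          &= E[XY] - E[X]E[Y].
\end{aligned}$$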

Let s cover self-test problems 6.20. We ve now finished Ch. 6 and sections 7.1-7.2, plus started 7.4. Remember, I ve posted HW4 up on my website and it is due on Mon. June 28th.