Math 416 Lecture 2

DEFINITION. Here are the multivariate versions:

PMF case: p(x, y, z) is the joint Probability Mass Function of X, Y, Z iff P(X = x, Y = y, Z = z) = p(x, y, z).

PDF case: f(x, y, z) is the joint Probability Density Function of X, Y, Z iff for all sets A, B, C, P(X ∈ A, Y ∈ B, Z ∈ C) = ∫_A ∫_B ∫_C f(x, y, z) dz dy dx.

If you know p(x, y, z) or f(x, y, z), then you can calculate the distribution of X (or Y or Z) alone. This distribution of X alone is the marginal distribution of X.

PMF case: P(X = x) = Σ_y Σ_z p(x, y, z), summed over all y, z.

PDF case: f_X(x) = ∫ ∫ f(x, y, z) dy dz.

`Suppose X and Y are random variables such that X has two values, 0 and 1, and Y has two values, 2 and 3. Suppose the joint PMF p(x, y) is p(0, 2) = 2/5, p(0, 3) = p(1, 2) = p(1, 3) = 1/5. Find P(X = 1).

P(X = 1) = P(X = 1, Y = 2) + P(X = 1, Y = 3) = p(1, 2) + p(1, 3) = 1/5 + 1/5 = 2/5.
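The marginal computation above is just "sum out the other variable." A minimal sketch, using the joint PMF from the example (the dictionary encoding is my own choice, not the lecture's):

```python
# Recover a marginal PMF from a joint PMF by summing out the other
# variable, using the example p(0,2) = 2/5, p(0,3) = p(1,2) = p(1,3) = 1/5.
from fractions import Fraction

joint = {
    (0, 2): Fraction(2, 5),
    (0, 3): Fraction(1, 5),
    (1, 2): Fraction(1, 5),
    (1, 3): Fraction(1, 5),
}

def marginal_x(p):
    """P(X = x) = sum over y of p(x, y)."""
    m = {}
    for (x, y), prob in p.items():
        m[x] = m.get(x, Fraction(0)) + prob
    return m

# P(X = 1) comes out to 2/5, matching the hand computation above.
print(marginal_x(joint))
```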

The probability of an event A = the proportion of outcomes in the sample space Ω which are in A. A has probability .5 iff it contains 50% of the outcomes. Suppose an event A of nonzero probability is known to have occurred. Then outcomes not in A are ruled out and A becomes the new sample space. Since A is 100% of this new sample space, A now has probability 1. The conditional probability P(B|A) of another event B is the proportion of outcomes in A which lie in B.

DEFINITION. The conditional probability of B given A is P(B|A) = P(A ∩ B)/P(A) = the percentage of A which lies in B. Solving for P(A ∩ B) gives P(A ∩ B) = P(A) P(B|A). For three events the formula is P(A ∩ B ∩ C) = P(A) P(B|A) P(C|A ∩ B).
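A quick instance of the three-event multiplication rule (the card-drawing setup is my example, not the lecture's): the chance of drawing three aces in a row from a shuffled 52-card deck is P(A) P(B|A) P(C|A ∩ B) = (4/52)(3/51)(2/50).

```python
# Three-event multiplication rule: draw three cards without replacement,
# and let A, B, C be "first/second/third card is an ace".
from fractions import Fraction

p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```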

`A rider just misses a bus. The time of arrival of the next bus has an exponential distribution with rate λ = 3 per hour.

(a) Find the probability the next bus arrives within t hours. F(t) = 1 − e^(−3t).

(b) Find the probability it arrives later than t hours. 1 − F(t) = 1 − (1 − e^(−3t)) = e^(−3t).

(c) If the rider has waited s hours, find the probability the bus will take t hours or more before it arrives. P(T ≥ s + t | T ≥ s) = e^(−3(s+t))/e^(−3s) = e^(−3t), the same as (b): the exponential distribution is memoryless.
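Part (c) can be checked by simulation; a sketch, with s and t chosen arbitrarily:

```python
# Numerical check of the memoryless property P(T >= s+t | T >= s) = e^(-3t)
# for an exponential waiting time with rate 3 per hour.
import math
import random

random.seed(0)
rate, s, t = 3.0, 0.25, 0.5
samples = [random.expovariate(rate) for _ in range(200_000)]

# Condition on having already waited s hours, then estimate the chance
# of waiting at least t more hours.
waited = [x for x in samples if x >= s]
frac = sum(1 for x in waited if x >= s + t) / len(waited)

print(round(frac, 3), round(math.exp(-rate * t), 3))
```

The two printed numbers agree to within simulation error, independent of s.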

DEFINITION. The conditional probability of A given B is P(A|B) = P(A ∩ B)/P(B). Solving for P(A ∩ B) gives P(A ∩ B) = P(B) P(A|B).

Events A and B are independent if knowing one does not affect the probability of the other. Since knowing B doesn't affect the probability of A, P(A) = P(A|B). Thus P(A|B) = P(A ∩ B)/P(B) becomes P(A) = P(A ∩ B)/P(B). Multiplying by P(B) gives P(A ∩ B) = P(A) P(B).

DEFINITION. Events A and B are independent iff P(A ∩ B) = P(A) P(B). Events A, B, C are mutually independent iff P(A ∩ B ∩ C) = P(A) P(B) P(C) and each pair is also independent: P(A ∩ B) = P(A) P(B), P(A ∩ C) = P(A) P(C), P(B ∩ C) = P(B) P(C).
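A concrete instance of P(A ∩ B) = P(A) P(B), checked by enumerating a finite sample space (the two-dice setup is my example, not the lecture's):

```python
# Roll two fair dice: A = "first die is even", B = "second die is even".
# Enumerate the 36 equally likely outcomes and verify independence.
from fractions import Fraction

outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def P(event):
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

pA = P(lambda w: w[0] % 2 == 0)
pB = P(lambda w: w[1] % 2 == 0)
pAB = P(lambda w: w[0] % 2 == 0 and w[1] % 2 == 0)

print(pA, pB, pAB, pAB == pA * pB)  # 1/2 1/2 1/4 True
```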

BAYES THEOREM (law of total probability). Suppose events B1, B2, B3 partition the sample space, i.e., they are disjoint and their union is everything. Then for any event A,

P(A) = P(A|B1) P(B1) + P(A|B2) P(B2) + P(A|B3) P(B3) = Σ_i P(A|B_i) P(B_i).

Proof. P(A) = P(A ∩ B1) + P(A ∩ B2) + P(A ∩ B3) = P(A|B1) P(B1) + P(A|B2) P(B2) + P(A|B3) P(B3). This is easy to visualize with a Venn diagram: A overlaps each of the three cells B1, B2, B3 of the partition. The formulas for conditional probability mass and density functions obey the same laws.
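A small numeric instance of the partition formula, with the P(B_i) and P(A|B_i) values made up for illustration:

```python
# Law of total probability over a three-cell partition:
# P(A) = sum_i P(A|B_i) P(B_i).
pB = [0.2, 0.5, 0.3]          # P(B1), P(B2), P(B3): disjoint, sum to 1
pA_given_B = [0.9, 0.4, 0.1]  # P(A|B_i), chosen arbitrarily

pA = sum(a * b for a, b in zip(pA_given_B, pB))

# The same ingredients invert via Bayes: P(B1|A) = P(A|B1) P(B1) / P(A).
pB1_given_A = pA_given_B[0] * pB[0] / pA

print(round(pA, 4), round(pB1_given_A, 4))
```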

Suppose X and Y are two continuous random variables on the same sample space, say the height and weight respectively of a randomly picked UH student. Let f(x, y) be the joint density function. Hence P[height is from 5 to 6 & weight is from 100 lb to 150 lb] = ∫_5^6 ∫_100^150 f(x, y) dy dx.

Let f_X(x) be the (marginal) density function for X. This is the probability density function for X alone; we don't care what Y is, it can be anything. Thus to get the marginal density we integrate over all possible values of y: f_X(x) = ∫ f(x, y) dy. This is the cross-section at x.

Since f(x, y) is the joint distribution of X and Y, and since ∩ is the set-theoretic translation of "and", the conditional formula P(Y|X) = P(X ∩ Y)/P(X) translates into the following formula for density functions.

DEFINITION. The conditional density of Y given X = x is f(y|x) = f(x, y)/f_X(x). For any set B, P(Y ∈ B | X = x) = ∫_{y ∈ B} f(y|x) dy.
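Both formulas can be approximated by Riemann sums. A sketch, using the toy joint density f(x, y) = x + y on the unit square (an assumed example, not the lecture's height/weight density):

```python
# Approximate the marginal f_X(x) = ∫ f(x,y) dy and check that the
# conditional f(y|x) = f(x,y)/f_X(x) integrates to 1 over y.
def f(x, y):
    return x + y  # a valid density on [0,1] x [0,1]: it integrates to 1

n = 10_000
dy = 1.0 / n
ys = [(i + 0.5) * dy for i in range(n)]  # midpoint grid on [0, 1]

def marginal_x(x):
    return sum(f(x, y) for y in ys) * dy  # analytically, f_X(x) = x + 1/2

def conditional(y, x):
    return f(x, y) / marginal_x(x)

x0 = 0.3
total = sum(conditional(y, x0) for y in ys) * dy
print(round(marginal_x(x0), 4), round(total, 4))  # approx 0.8 and 1.0
```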

More generally, if f(x1, x2, y1, y2) is the joint density of random variables X1, X2, Y1, Y2, then the conditional density of Y1, Y2 given X1 = x1, X2 = x2 is f(y1, y2 | x1, x2) = f(x1, x2, y1, y2)/f_{X1,X2}(x1, x2), where f_{X1,X2}(x1, x2) = ∫ ∫ f(x1, x2, y1, y2) dy1 dy2 is the marginal density of X1, X2.

Recall: events (sets) A, B, C were independent if P(A ∩ B ∩ C) = P(A) P(B) P(C). Suppose X, Y, Z are random variables with joint probability density function f(x, y, z).

DEFINITION. X, Y, Z are mutually independent iff P(X ∈ A, Y ∈ B, Z ∈ C) = P(X ∈ A) P(Y ∈ B) P(Z ∈ C) for all sets A, B, C. Equivalently, X, Y, Z are mutually independent iff f(x, y, z) = f_X(x) f_Y(y) f_Z(z), where f_X(x), f_Y(y), f_Z(z) are their marginal density functions.

`Suppose X, Y have joint density function f(x, y) = 1/6 if x ∈ [0, 2] and y ∈ [0, 3], and f(x, y) = 0 elsewhere. Are X and Y independent? Find f_X(x), f_Y(y). Is f(x, y) = f_X(x) f_Y(y)?
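For the rectangle example the integrals are just constants times interval lengths, so the check can be done exactly. Integrating out y over [0, 3] gives f_X(x) = 1/2 on [0, 2]; integrating out x over [0, 2] gives f_Y(y) = 1/3 on [0, 3]; and (1/2)(1/3) = 1/6 = f(x, y), so X and Y are independent:

```python
# Exact marginal/product check for f(x, y) = 1/6 on [0, 2] x [0, 3].
from fractions import Fraction

f_joint = Fraction(1, 6)
f_X = f_joint * 3   # integrate out y over an interval of length 3
f_Y = f_joint * 2   # integrate out x over an interval of length 2

print(f_X, f_Y, f_X * f_Y == f_joint)  # 1/2 1/3 True
```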

`Are height and weight independent random variables?

`You randomly pick 3 students. Let X, Y, Z be the heights of the first, second and third student. Are X, Y, and Z independent? Do they have the same marginal distribution, i.e., is f_X = f_Y = f_Z? As is usually the case in statistics, they have the normal distribution. The density function is the normal bell curve f(x) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²)), where μ is the average and σ is the standard deviation.
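The bell-curve density as a function, with a sanity check that it integrates to about 1 (the μ and σ values are illustrative, not from the lecture):

```python
# Normal density f(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x-mu)^2 / (2*sigma^2)),
# integrated numerically over +/- 10 standard deviations.
import math

def normal_pdf(x, mu, sigma):
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

mu, sigma = 67.0, 3.0   # e.g. heights in inches (made-up numbers)
dx = 0.01
total = sum(normal_pdf(mu - 30.0 + i * dx, mu, sigma) * dx for i in range(6000))
print(round(total, 4))  # approx 1.0
```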

Suppose f(x, y) is the joint density function of random variables X and Y. Then f(y|x) = f(x, y)/f_X(x) is the conditional density of Y given X = x, f_X(x) = ∫ f(x, y) dy is the marginal density for X, and f_Y(y) = ∫ f(x, y) dx is the marginal density for Y.

CONTINUOUS BAYES THEOREM. The density analog of P(B) = Σ_i P(B|A_i) P(A_i) is f_Y(y) = ∫ f(y|x) f_X(x) dx.

Proof. f_Y(y) = ∫ f(x, y) dx = ∫ [f(x, y)/f_X(x)] f_X(x) dx = ∫ f(y|x) f_X(x) dx.

CONDITIONING/UNCONDITIONING THEOREM. ∫_A ∫_B f(x, y) dy dx = ∫_A (∫_B f(y|x) dy) f_X(x) dx.

Proof. ∫_A ∫_B f(x, y) dy dx = ∫_A ∫_B [f(x, y)/f_X(x)] f_X(x) dy dx = ∫_A (∫_B f(y|x) dy) f_X(x) dx.
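The continuous analog can also be checked by Riemann sums: a sketch, again using the assumed toy joint density f(x, y) = x + y on the unit square, where the marginal works out to f_Y(y) = y + 1/2.

```python
# Check f_Y(y0) = ∫ f(y0|x) f_X(x) dx numerically, where
# f(x, y) = x + y on [0,1] x [0,1] and f_X(x) = x + 1/2.
n = 5_000
d = 1.0 / n
grid = [(i + 0.5) * d for i in range(n)]  # midpoint grid on [0, 1]

def f(x, y):
    return x + y

def f_X(x):
    return x + 0.5  # ∫_0^1 (x + y) dy

y0 = 0.4
direct = sum(f(x, y0) for x in grid) * d                        # ∫ f(x, y0) dx
via_cond = sum((f(x, y0) / f_X(x)) * f_X(x) for x in grid) * d  # ∫ f(y0|x) f_X(x) dx
print(round(direct, 4), round(via_cond, 4))  # both approx 0.9
```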