CHAPTER 5. Jointly Distributed Random Variables


There are situations in which an experiment involves more than one variable, and the researcher is interested in studying the joint behavior of several variables at the same time.

Joint Probability Mass Function for Two Discrete Random Variables:

Let X and Y be discrete random variables. The joint pmf p(x, y) is defined for each pair of numbers (x, y) by

    p(x, y) = P(X = x and Y = y).

The probability P[(X, Y) ∈ A] can then be found by summing over A:

    P[(X, Y) ∈ A] = Σ_{(x, y) ∈ A} p(x, y).

The marginal pmfs of X and Y are

    p_X(x) = Σ_y p(x, y)    and    p_Y(y) = Σ_x p(x, y).

X and Y are independent if, for every pair of values x and y,

    p(x, y) = p_X(x) · p_Y(y).

Example: The joint pmf of X and Y appears in the accompanying tabulation.

                       y
    p(x, y)      0      1      2
          0    .10    .04    .02
    x     1    .08    .20    .06
          2    .06    .14    .30

a. What is P(X = 1 and Y = 1)?
b. Compute P(X ≤ 1 and Y ≤ 1).
c. Give a word description of the event {X ≠ 0 and Y ≠ 0} and compute the probability of this event.
d. Compute the marginal pmf of X and of Y. What is P(X ≤ 1)?
e. Are X and Y independent rv's?
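The arithmetic in the example can be checked with a short Python sketch (the variable names here are my own, not part of the original notes):

```python
# Joint pmf from the example table: p[(x, y)] = P(X = x and Y = y).
p = {
    (0, 0): 0.10, (0, 1): 0.04, (0, 2): 0.02,
    (1, 0): 0.08, (1, 1): 0.20, (1, 2): 0.06,
    (2, 0): 0.06, (2, 1): 0.14, (2, 2): 0.30,
}

# Marginal pmfs: sum the joint pmf over the other variable.
p_X = {x: sum(p[(x, y)] for y in range(3)) for x in range(3)}
p_Y = {y: sum(p[(x, y)] for x in range(3)) for y in range(3)}

# (a) P(X = 1 and Y = 1) is read directly from the table.
a = p[(1, 1)]

# (b) P(X <= 1 and Y <= 1): sum the joint pmf over the qualifying pairs.
b = sum(prob for (x, y), prob in p.items() if x <= 1 and y <= 1)

# (d) P(X <= 1) from the marginal pmf of X.
d = p_X[0] + p_X[1]

# (e) Independence requires p(x, y) = p_X(x) * p_Y(y) for EVERY pair.
independent = all(abs(p[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
                  for x in range(3) for y in range(3))
```

Here a single failing pair is enough to rule out independence; for instance p(0, 0) = .10 while p_X(0)·p_Y(0) = .16 × .24 = .0384.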

Joint Probability Density Function for Two Continuous Random Variables:

Let X and Y be continuous random variables. Then f(x, y) is the joint pdf of X and Y if, for any two-dimensional set A,

    P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.

If A is a rectangle {(x, y) : a ≤ x ≤ b, c ≤ y ≤ d}, then

    P[(X, Y) ∈ A] = P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx.

The marginal pdfs of X and Y are

    f_X(x) = ∫_{-∞}^{∞} f(x, y) dy    for -∞ < x < ∞,
    f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx    for -∞ < y < ∞.

Two continuous random variables X and Y are independent if, for every pair of values x and y,

    f(x, y) = f_X(x) · f_Y(y).

Example: Each front tire on a particular type of vehicle is supposed to be filled to a pressure of 26 psi. Suppose the actual air pressure in each tire is a random variable, X for the right tire and Y for the left tire, with joint pdf

    f(x, y) = K(x² + y²)    for 20 ≤ x ≤ 30, 20 ≤ y ≤ 30,
    f(x, y) = 0             otherwise.

a. What is the value of K?
b. What is the probability that both tires are underfilled?
c. What is the probability that the difference in air pressure between the two tires is at most 2 psi?
d. Determine the distribution of air pressure in the right tire alone.
e. Are X and Y independent rv's?

For two continuous rv's X and Y, the conditional pdf of Y given that X = x is

    f_{Y|X}(y | x) = f(x, y) / f_X(x)    for -∞ < y < ∞.

If X and Y are discrete, the analogous conditional pmf is

    p_{Y|X}(y | x) = p(x, y) / p_X(x).
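Parts (a) and (b) of the tire example can be checked numerically. The sketch below uses a simple midpoint-rule double integral in place of the exact calculus (the helper function and names are my own); the exact answer for (a) is K = 3/380000:

```python
# Numerical check of the tire-pressure example, f(x, y) = K*(x^2 + y^2)
# on 20 <= x, y <= 30.  A midpoint-rule sketch, not exact integration.

def double_integral(g, ax, bx, ay, by, n=400):
    """Midpoint-rule approximation of the integral of g over a rectangle."""
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += g(x, y)
    return total * hx * hy

# (a) K must make the pdf integrate to 1 over the support [20,30] x [20,30].
mass = double_integral(lambda x, y: x**2 + y**2, 20, 30, 20, 30)
K = 1.0 / mass            # exact value: 3/380000

# (b) Both tires underfilled means X < 26 and Y < 26.
p_under = K * double_integral(lambda x, y: x**2 + y**2, 20, 26, 20, 26)
```

By hand, ∫₂₀³⁰∫₂₀³⁰ (x² + y²) dy dx = 380000/3, so K = 3/380000, and the underfill probability works out to 0.3024.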

Expected Values, Covariance, and Correlation

The expected value of a function h(X, Y), denoted by E[h(X, Y)] or µ_{h(X,Y)}, is

    E[h(X, Y)] = Σ_x Σ_y h(x, y) p(x, y)       if X and Y are discrete,
    E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dx dy      if X and Y are continuous.

The covariance between two random variables X and Y is

    Cov(X, Y) = E[(X - µ_X)(Y - µ_Y)]
              = Σ_x Σ_y (x - µ_X)(y - µ_Y) p(x, y)       if X and Y are discrete,
              = ∫∫ (x - µ_X)(y - µ_Y) f(x, y) dx dy      if X and Y are continuous.

A useful shortcut formula is

    Cov(X, Y) = E(XY) - µ_X µ_Y.

The correlation coefficient of two random variables is

    Corr(X, Y) = ρ_{X,Y} = Cov(X, Y) / (σ_X σ_Y),

and it has the following properties:

1. Corr(aX + b, cY + d) = Corr(X, Y) if a and c have the same sign (both positive or both negative).
2. -1 ≤ ρ_{X,Y} ≤ 1.
3. ρ_{X,Y} = 1 or -1 if and only if Y = aX + b for some a ≠ 0.
4. If X and Y are independent, then ρ = 0.

Example: Consider the following joint pmf.

                       y
    p(x, y)      0      5     10     15
          0    .02    .06    .02    .10
    x     5    .04    .15    .20    .10
         10    .01    .15    .14    .01

a. What is E(X + Y)?
b. What is the expected value of the maximum of X and Y?
c. Compute the covariance of X and Y.
d. Compute ρ for X and Y.
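All four parts of this example reduce to computing E[h(X, Y)] for different choices of h, which a small Python sketch makes explicit (the function and variable names are my own):

```python
# Joint pmf from the covariance example.
p = {
    (0, 0): .02, (0, 5): .06, (0, 10): .02, (0, 15): .10,
    (5, 0): .04, (5, 5): .15, (5, 10): .20, (5, 15): .10,
    (10, 0): .01, (10, 5): .15, (10, 10): .14, (10, 15): .01,
}

def expect(h):
    """E[h(X, Y)] = sum of h(x, y) * p(x, y) over the support."""
    return sum(h(x, y) * prob for (x, y), prob in p.items())

mean_x = expect(lambda x, y: x)
mean_y = expect(lambda x, y: y)

# (a) E(X + Y) and (b) E(max(X, Y)) use the same expectation machinery.
e_sum = expect(lambda x, y: x + y)
e_max = expect(lambda x, y: max(x, y))

# (c) Shortcut formula: Cov(X, Y) = E(XY) - E(X)E(Y).
cov = expect(lambda x, y: x * y) - mean_x * mean_y

# (d) Corr(X, Y) = Cov(X, Y) / (sigma_X * sigma_Y).
var_x = expect(lambda x, y: (x - mean_x) ** 2)
var_y = expect(lambda x, y: (y - mean_y) ** 2)
rho = cov / (var_x ** 0.5 * var_y ** 0.5)
```

Note that by linearity E(X + Y) = E(X) + E(Y) = 5.55 + 8.55 = 14.1, whereas E(max(X, Y)) has no such shortcut and must be summed over the table.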

The Distribution of the Sample Mean

A statistic is any quantity calculated from a sample, such as the sample mean X̄. Random variables X₁, X₂, ..., Xₙ form a random sample of size n if

1. The Xᵢ's are independent random variables.
2. Every Xᵢ has the same probability distribution.

If X₁, X₂, ..., Xₙ is a random sample from a distribution with mean µ and variance σ², then

1. E(X̄) = µ_X̄ = µ (so X̄ is unbiased).
2. V(X̄) = σ²_X̄ = σ²/n.

Also, for the sample total T = X₁ + X₂ + ... + Xₙ,

1. E(T) = nµ.
2. V(T) = nσ².

If X₁, X₂, ..., Xₙ is a random sample from a normal distribution with mean µ and variance σ², then for any n the sample mean is itself normally distributed:

    X̄ ~ N(µ, σ²/n),    and also    T ~ N(nµ, nσ²).

The Central Limit Theorem: For a random sample X₁, X₂, ..., Xₙ from a distribution with mean µ and variance σ², the sample mean X̄ has approximately a normal distribution with mean µ and variance σ²/n, provided n is sufficiently large. (The sample total T is likewise approximately normal.) As a rule of thumb, the central limit theorem can be applied when n ≥ 30.

Example: The inside diameter of a randomly selected piston ring is a random variable with mean value 12 cm and standard deviation 0.04 cm.

a. If X̄ is the sample mean for a random sample of n = 16 rings, where is the sampling distribution of X̄ centered, and what is the standard deviation of the X̄ distribution?
b. Answer the questions in part (a) for a sample size of n = 64 rings.
c. For which of the two random samples is X̄ more likely to be within 0.01 cm of 12 cm?
d. Calculate P(11.99 ≤ X̄ ≤ 12.01) when n = 64.
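The piston-ring example can be worked in a few lines: the sampling distribution is centered at µ with standard deviation σ/√n, and part (d) standardizes against the normal cdf. The sketch below builds the cdf from math.erf (names are my own):

```python
import math

def phi(z):
    """Standard normal cdf, written in terms of the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 12.0, 0.04   # ring diameter mean and sd from the example

# (a), (b): the sampling distribution of Xbar is centered at mu with
# standard deviation sigma / sqrt(n).
sd_16 = sigma / math.sqrt(16)   # 0.01  for n = 16
sd_64 = sigma / math.sqrt(64)   # 0.005 for n = 64

# (d) P(11.99 <= Xbar <= 12.01) for n = 64: standardize both endpoints.
p = phi((12.01 - mu) / sd_64) - phi((11.99 - mu) / sd_64)
```

For (c), the larger sample wins: with n = 64 the interval 12 ± 0.01 spans ±2 standard deviations of X̄ (probability ≈ 0.9545), versus only ±1 standard deviation for n = 16.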

The Distribution of a Linear Combination

In general, a₁X₁ + a₂X₂ + ... + aₙXₙ is a linear combination of the random variables X₁, X₂, ..., Xₙ. If the Xᵢ have mean values µ₁, µ₂, ..., µₙ and variances σ₁², σ₂², ..., σₙ², respectively, then

    E(a₁X₁ + a₂X₂ + ... + aₙXₙ) = a₁E(X₁) + a₂E(X₂) + ... + aₙE(Xₙ)
                                = a₁µ₁ + a₂µ₂ + ... + aₙµₙ,

    V(a₁X₁ + a₂X₂ + ... + aₙXₙ) = Σᵢ Σⱼ aᵢaⱼ Cov(Xᵢ, Xⱼ).

If the Xᵢ's are independent, then Cov(Xᵢ, Xⱼ) = 0 for i ≠ j, and the variance reduces to

    V(a₁X₁ + a₂X₂ + ... + aₙXₙ) = a₁²σ₁² + a₂²σ₂² + ... + aₙ²σₙ².

In particular, for the difference of two random variables,

    E(X₁ - X₂) = E(X₁) - E(X₂),
    V(X₁ - X₂) = V(X₁) + V(X₂)    if X₁ and X₂ are independent.

If X₁, X₂, ..., Xₙ are independent and normally distributed, then any linear combination of them also has a normal distribution.

Example: Let X₁, X₂, X₃, X₄, X₅ be the observed numbers of miles per gallon for five cars. Suppose these variables are independent and normally distributed with µ₁ = µ₂ = 20, µ₃ = µ₄ = µ₅ = 21, σ² = 4 for X₁ and X₂, and σ² = 3.5 for the others. Define

    Y = (X₁ + X₂)/2 - (X₃ + X₄ + X₅)/3.

Compute P(0 ≤ Y) and P(-1 ≤ Y ≤ 1).

Suggested Exercises for Chapter 5: 3, 5, 11, 13, 15, 19, 25, 27, 31, 37, 39, 41, 47, 49, 51, 55, 59, 63, 65, 69, 73, 75.
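The mpg example is a direct application of the linear-combination rules: Y is a linear combination of independent normals, so it is normal with mean Σ aᵢµᵢ and variance Σ aᵢ²σᵢ². A Python sketch of the computation (names are my own):

```python
import math

def phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Y = (X1 + X2)/2 - (X3 + X4 + X5)/3 with the stated means and variances.
coeffs = [0.5, 0.5, -1/3, -1/3, -1/3]
means = [20, 20, 21, 21, 21]
variances = [4, 4, 3.5, 3.5, 3.5]

# E(Y) = sum a_i * mu_i;  V(Y) = sum a_i^2 * sigma_i^2 (independence
# makes every covariance term vanish).
mean_y = sum(a * m for a, m in zip(coeffs, means))
var_y = sum(a * a * v for a, v in zip(coeffs, variances))
sd_y = math.sqrt(var_y)

# Y is normal (a linear combination of independent normals): standardize.
p_pos = 1.0 - phi((0 - mean_y) / sd_y)                         # P(0 <= Y)
p_mid = phi((1 - mean_y) / sd_y) - phi((-1 - mean_y) / sd_y)   # P(-1 <= Y <= 1)
```

Here E(Y) = 20 - 21 = -1 and V(Y) = (1/4)(4 + 4) + (1/9)(3.5 · 3) = 2 + 3.5/3 ≈ 3.167, giving P(0 ≤ Y) ≈ 0.287.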