Bivariate distributions

1 Bivariate distributions, 23rd October 2017; lecture based on Hogg, Tanis & Zimmerman: Probability and Statistical Inference (9th ed.)

2 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

3 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

4 up to now, we have taken a single measurement on each item under observation in many practical cases it is possible (and often very desirable) to take measurements of several characteristics of each item for example, we observe university students to obtain information on physical characteristics such as height x and weight y we may want to determine a relation y = u(x) and say something about the variation of the points around the curve

5 DEFINITION Let X and Y be two random variables defined on a discrete space. Let S denote the corresponding two-dimensional space of X and Y, the two random variables of the discrete type. The probability that X = x and Y = y is denoted by f(x,y) = P(X = x, Y = y). The function f(x,y) is called the joint probability mass function of X and Y and has the following properties: (a) 0 ≤ f(x,y) ≤ 1; (b) Σ_{(x,y)∈S} f(x,y) = 1; (c) P[(X,Y) ∈ A] = Σ_{(x,y)∈A} f(x,y), where A is a subset of the space S

6 EXAMPLE Roll a pair of fair dice. For each of the 36 sample points with probability 1/36, let X denote the smaller and Y the larger outcome on the dice. For example, if the outcome is (3,2), then the observed values are X = 2, Y = 3. The event {X = 2, Y = 3} could occur in one of two ways, (3,2) or (2,3), so its probability is 1/36 + 1/36 = 2/36. If the outcome is (2,2), then the observed values are X = 2, Y = 2. Since the event {X = 2, Y = 2} can occur in only one way, P(X = 2, Y = 2) = 1/36. The joint pmf of X and Y is given by the probabilities f(x,y) = 1/36 if 1 ≤ x = y ≤ 6, and f(x,y) = 2/36 if 1 ≤ x < y ≤ 6, where x and y are integers.
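
The joint pmf above can be checked by brute force. The following sketch (added for illustration, not part of the lecture) enumerates all 36 equally likely outcomes and tabulates the joint pmf of X = min and Y = max:

```python
# Enumerate the 36 outcomes of two fair dice and tabulate the joint pmf
# of X = smaller outcome, Y = larger outcome.
from fractions import Fraction
from collections import Counter

counts = Counter()
for d1 in range(1, 7):
    for d2 in range(1, 7):
        counts[(min(d1, d2), max(d1, d2))] += 1

f = {xy: Fraction(c, 36) for xy, c in counts.items()}
print(f[(2, 3)])        # 1/18  (= 2/36, the off-diagonal case x < y)
print(f[(2, 2)])        # 1/36  (the diagonal case x = y)
print(sum(f.values()))  # 1, so f is a valid joint pmf
```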

7 DEFINITION Let X and Y have the joint probability mass function f(x,y) with space S. The probability mass function of X alone, which is called the marginal probability mass function of X, is defined by f_X(x) = Σ_y f(x,y) = P(X = x), x ∈ S_X, where the summation is taken over all possible y values for each given x in the x space S_X. That is, the summation is over all (x,y) in S with a given x value. Similarly, the marginal probability mass function of Y is defined by f_Y(y) = Σ_x f(x,y) = P(Y = y), y ∈ S_Y, where the summation is taken over all possible x values for each given y in the y space S_Y.

8 DEFINITION The random variables X and Y are independent if and only if, for every x ∈ S_X and every y ∈ S_Y, P(X = x, Y = y) = P(X = x) P(Y = y) or, equivalently, f(x,y) = f_X(x) f_Y(y); otherwise, X and Y are said to be dependent.

9 it is possible to define a probability histogram for a joint pmf as we did for a single random variable suppose that X and Y have a joint pmf f(x,y) with space S, where S is a set of pairs of integers at a point (x,y) in S, construct a rectangular column that is centered at (x,y) and has a one-by-one-unit base and height equal to f(x,y) then f(x,y) is equal to the volume of this rectangular column, and the sum of the volumes of the rectangular columns in the probability histogram is equal to 1 example: f(x,y) = xy²/30, x = 1,2,3; y = 1,2
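
A quick check of this example pmf (a sketch added here, not part of the slides): it sums to 1, and the marginals fall out by summing over the other variable.

```python
# Verify that f(x, y) = x*y**2/30 on x in {1,2,3}, y in {1,2} is a valid
# joint pmf, and compute both marginal pmfs.
from fractions import Fraction

S = [(x, y) for x in (1, 2, 3) for y in (1, 2)]
f = {(x, y): Fraction(x * y * y, 30) for (x, y) in S}

print(sum(f.values()))                                  # 1 (volumes sum to 1)
fX = {x: sum(f[(x, y)] for y in (1, 2)) for x in (1, 2, 3)}
fY = {y: sum(f[(x, y)] for x in (1, 2, 3)) for y in (1, 2)}
print(fX)   # {1: 1/6, 2: 1/3, 3: 1/2}  -> f_X(x) = x/6
print(fY)   # {1: 1/5, 2: 4/5}          -> f_Y(y) = y**2/5
```

Note that here f_X(x) f_Y(y) = (x/6)(y²/5) = xy²/30 = f(x,y), so X and Y happen to be independent in this particular example.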

10 NOTE sometimes it is convenient to replace the symbols X and Y representing random variables by X_1 and X_2 let X_1 and X_2 be random variables of the discrete type with the joint pmf f(x_1, x_2) on the space S if u(X_1, X_2) is a function of these two random variables, then E[u(X_1, X_2)] = Σ_{(x_1,x_2)∈S} u(x_1, x_2) f(x_1, x_2), if it exists, is called the mathematical expectation (or expected value) of u(X_1, X_2)

11 the following mathematical expectations, if they exist, have special names: A. if u_i(X_1, X_2) = X_i for i = 1,2, then E[u_i(X_1, X_2)] = E(X_i) = μ_i is called the mean of X_i, for i = 1,2. B. if u_i(X_1, X_2) = (X_i − μ_i)² for i = 1,2, then E[u_i(X_1, X_2)] = E[(X_i − μ_i)²] = σ_i² = Var(X_i) is called the variance of X_i, for i = 1,2. the mean μ_i and the variance σ_i² can be computed from the joint pmf f(x_1, x_2) or from the marginal pmf f_i(x_i), i = 1,2

12 EXTENSION OF THE BINOMIAL DISTRIBUTION TO A TRINOMIAL DISTRIBUTION we have three mutually exclusive and exhaustive ways for an experiment to terminate: perfect, seconds, defective we repeat the experiment n independent times, and the probabilities p_X, p_Y, p_Z = 1 − p_X − p_Y remain the same from trial to trial in the n trials, let X = number of perfect items, Y = number of seconds, Z = n − X − Y = number of defectives if x and y are nonnegative integers such that x + y ≤ n, then the probability of having x perfects, y seconds and n − x − y defectives, in some particular order, is p_X^x p_Y^y (1 − p_X − p_Y)^(n−x−y)

13 EXTENSION OF THE BINOMIAL DISTRIBUTION TO A TRINOMIAL DISTRIBUTION however, if we want P(X = x, Y = y), then we must recognize that {X = x, Y = y} can be achieved in (n choose x, y, n−x−y) = n! / (x! y! (n−x−y)!) different ways the trinomial pmf is f(x,y) = P(X = x, Y = y) = [n! / (x! y! (n−x−y)!)] p_X^x p_Y^y (1 − p_X − p_Y)^(n−x−y), where x and y are nonnegative integers such that x + y ≤ n without summing, we know that X is b(n, p_X) and Y is b(n, p_Y); since this joint pmf does not factor into the product of these marginal pmfs, X and Y are dependent
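
A minimal numeric sketch of the trinomial pmf (added here; the values n = 10, p_X = 0.5, p_Y = 0.3 are chosen arbitrarily, they are not from the text). It confirms that the probabilities sum to 1 over all admissible (x, y):

```python
# Trinomial pmf: P(X=x, Y=y) = n!/(x! y! (n-x-y)!) * pX^x * pY^y * pZ^(n-x-y).
from math import comb

def trinomial_pmf(x, y, n, pX, pY):
    if x < 0 or y < 0 or x + y > n:
        return 0.0
    coef = comb(n, x) * comb(n - x, y)   # equals n! / (x! y! (n-x-y)!)
    return coef * pX**x * pY**y * (1 - pX - pY)**(n - x - y)

n, pX, pY = 10, 0.5, 0.3
total = sum(trinomial_pmf(x, y, n, pX, pY)
            for x in range(n + 1) for y in range(n + 1 - x))
print(total)                          # 1.0 (up to rounding)
print(trinomial_pmf(5, 3, n, pX, pY)) # one particular joint probability
```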

14 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

15 we introduced the mathematical expectation of a function of two random variables X, Y μ_X = E(X), μ_Y = E(Y), σ_X² = E[(X − μ_X)²], σ_Y² = E[(Y − μ_Y)²] A. if u(X,Y) = (X − μ_X)(Y − μ_Y), then E[u(X,Y)] = E[(X − μ_X)(Y − μ_Y)] = σ_XY = Cov(X,Y) is called the covariance of X and Y. B. if the standard deviations σ_X and σ_Y are positive, then ρ = Cov(X,Y) / (σ_X σ_Y) = σ_XY / (σ_X σ_Y) is called the correlation coefficient of X and Y.

16 it is convenient that the mean and the variance of X can be computed from either the joint pmf (or pdf) or the marginal pmf (or pdf) of X: μ_X = E(X) = Σ_x Σ_y x f(x,y) = Σ_x x [Σ_y f(x,y)] = Σ_x x f_X(x) to compute the covariance, however, we need the joint pmf (or pdf): E[(X − μ_X)(Y − μ_Y)] = E[XY − μ_X Y − μ_Y X + μ_X μ_Y] = E(XY) − μ_X E(Y) − μ_Y E(X) + μ_X μ_Y, because E is a linear or distributive operator thus Cov(X,Y) = E(XY) − μ_X μ_Y − μ_Y μ_X + μ_X μ_Y = E(XY) − μ_X μ_Y since ρ = Cov(X,Y) / (σ_X σ_Y), we also have E(XY) = μ_X μ_Y + ρ σ_X σ_Y (the expected value of the product of two random variables is equal to the product μ_X μ_Y plus their covariance ρ σ_X σ_Y)
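
As an illustration (added here, not from the slides), the covariance and correlation coefficient can be computed directly from a joint pmf; the min/max-of-two-dice example of slide 6 gives ρ ≈ 0.48:

```python
# Compute mu_X, mu_Y, Cov(X,Y) and rho from the joint pmf of the dice example.
from fractions import Fraction
from math import sqrt

f = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        xy = (min(d1, d2), max(d1, d2))
        f[xy] = f.get(xy, Fraction(0)) + Fraction(1, 36)

EX  = sum(x * p for (x, y), p in f.items())
EY  = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())
VX  = sum(x * x * p for (x, y), p in f.items()) - EX**2
VY  = sum(y * y * p for (x, y), p in f.items()) - EY**2

cov = EXY - EX * EY                 # Cov(X,Y) = E(XY) - mu_X * mu_Y
rho = float(cov) / sqrt(float(VX * VY))
print(EX, EY, cov, rho)             # 91/36 161/36 1225/1296 0.4794...
```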

17 ρ = Σ_x Σ_y (x − μ_X)(y − μ_Y) f(x,y) / (σ_X σ_Y) interpretation of the sign of the correlation coefficient: if positive probabilities are assigned to pairs (x,y) in which both x and y are either simultaneously above or simultaneously below their respective means, then the corresponding terms in the summation that defines ρ are positive, because both factors x − μ_X and y − μ_Y will be positive or both will be negative if, on the one hand, the points (x,y) which yield large positive products (x − μ_X)(y − μ_Y) contain most of the probability of the distribution, then the correlation coefficient will tend to be positive if, on the other hand, the points (x,y) in which one component is below its mean and the other above its mean have most of the probability, then the coefficient of correlation will tend to be negative, because the products (x − μ_X)(y − μ_Y) having higher probabilities are negative

18 consider the following problem: think of the points (x,y) in the space S and their corresponding probabilities let us consider all possible lines in two-dimensional space, each with finite slope, that pass through the point associated with the means, (μ_X, μ_Y) these lines are of the form y = μ_Y + b(x − μ_X) for each point (x_0,y_0) in S with f(x_0,y_0) > 0, consider the vertical distance from that point to one of these lines since y_0 is the height of the point above the x-axis and μ_Y + b(x_0 − μ_X) is the height of the point on the line that is directly above or below the point (x_0,y_0), the absolute value of the difference of these two heights is the vertical distance from the point (x_0,y_0) to the line y = μ_Y + b(x − μ_X): |y_0 − μ_Y − b(x_0 − μ_X)|

19 let us now square the distance and take the weighted average of all such squares; that is, let us consider the mathematical expectation E{[(Y − μ_Y) − b(X − μ_X)]²} = K(b) the problem is to find the line (i.e., the b) which minimizes this expectation of the square of (Y − μ_Y) − b(X − μ_X) — an application of the principle of least squares; the line is sometimes called the least squares regression line K(b) = E[(Y − μ_Y)² − 2b(X − μ_X)(Y − μ_Y) + b²(X − μ_X)²] = σ_Y² − 2bρ σ_X σ_Y + b²σ_X², because E is a linear operator and E[(X − μ_X)(Y − μ_Y)] = ρ σ_X σ_Y

20 K(b) = E{[(Y − μ_Y) − b(X − μ_X)]²} = σ_Y² − 2bρ σ_X σ_Y + b²σ_X² the derivative K′(b) = −2ρ σ_X σ_Y + 2bσ_X² equals zero at b = ρσ_Y/σ_X, and since K″(b) = 2σ_X² > 0, K attains its minimum at this b consequently, the least squares regression line is y = μ_Y + ρ (σ_Y/σ_X)(x − μ_X) if ρ > 0, the slope is positive; if ρ < 0, the slope is negative
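
A numeric sanity check of this minimization (a sketch with arbitrarily chosen parameter values, not from the text):

```python
# K(b) = sigma_Y^2 - 2*b*rho*sigma_X*sigma_Y + b^2*sigma_X^2 should be
# minimized at b = rho*sigma_Y/sigma_X, with minimum sigma_Y^2*(1 - rho^2).
import numpy as np

sigma_X, sigma_Y, rho = 1.5, 2.0, 0.6

def K(b):
    return sigma_Y**2 - 2 * b * rho * sigma_X * sigma_Y + b**2 * sigma_X**2

bs = np.linspace(-3, 3, 100001)
b_min = bs[np.argmin(K(bs))]
print(b_min, rho * sigma_Y / sigma_X)        # both ~ 0.8
print(K(b_min), sigma_Y**2 * (1 - rho**2))   # both ~ 2.56
```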

21 K(b) = E{[(Y − μ_Y) − b(X − μ_X)]²} = σ_Y² − 2bρ σ_X σ_Y + b²σ_X² the value of the minimum is K(ρσ_Y/σ_X) = σ_Y² − 2(ρσ_Y/σ_X)ρ σ_X σ_Y + (ρσ_Y/σ_X)²σ_X² = σ_Y² − 2ρ²σ_Y² + ρ²σ_Y² = σ_Y²(1 − ρ²) since σ_Y²(1 − ρ²) = K(ρσ_Y/σ_X) ≥ 0, we get ρ² ≤ 1, i.e., −1 ≤ ρ ≤ 1 if ρ = 0, then K(ρσ_Y/σ_X) = σ_Y² if ρ is close to 1 or −1, then K(ρσ_Y/σ_X) is relatively small

22 K(ρσ_Y/σ_X) = σ_Y² − 2ρ²σ_Y² + ρ²σ_Y² = σ_Y²(1 − ρ²) if ρ = 0, then K(ρσ_Y/σ_X) = σ_Y²; if ρ is close to 1 or −1, then K(ρσ_Y/σ_X) is relatively small the vertical deviations of the points with positive probability from the line y = μ_Y + ρ(σ_Y/σ_X)(x − μ_X) are small if ρ is close to 1 or −1, so ρ measures the amount of linearity in the probability distribution

23 suppose that X and Y are independent, so that f(x,y) ≡ f_X(x) f_Y(y), and suppose that we want to find the expected value of the product u(X) v(Y) E[u(X) v(Y)] = Σ_{S_X} Σ_{S_Y} u(x) v(y) f(x,y) = Σ_{S_X} Σ_{S_Y} u(x) v(y) f_X(x) f_Y(y) = [Σ_{S_X} u(x) f_X(x)] [Σ_{S_Y} v(y) f_Y(y)] = E[u(X)] E[v(Y)] hence the correlation coefficient of two independent variables is zero: Cov(X,Y) = E[(X − μ_X)(Y − μ_Y)] = E(X − μ_X) E(Y − μ_Y) = 0 !!! independence implies zero correlation coefficient, but zero correlation coefficient does not necessarily imply independence !!!
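
The standard counterexample for the last warning (added here for illustration): X uniform on {−1, 0, 1} and Y = X² have zero covariance, yet they are clearly dependent.

```python
# X takes -1, 0, 1 with probability 1/3 each; Y = X^2.
from fractions import Fraction

f = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

EX  = sum(x * p for (x, y), p in f.items())       # 0
EY  = sum(y * p for (x, y), p in f.items())       # 2/3
EXY = sum(x * y * p for (x, y), p in f.items())   # 0
print(EXY - EX * EY)                              # 0  -> rho = 0

# But P(X=0, Y=0) = 1/3, while P(X=0)*P(Y=0) = (1/3)*(1/3) = 1/9: dependent.
print(f[(0, 0)], Fraction(1, 3) * Fraction(1, 3))
```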

24 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

25 let X and Y have a joint discrete distribution with pmf f(x,y) on space S the marginal probability mass functions are f_X(x) and f_Y(y) with spaces S_X and S_Y let event A = {X = x} and event B = {Y = y}, (x,y) ∈ S, so that A ∩ B = {X = x, Y = y} because P(A ∩ B) = P(X = x, Y = y) = f(x,y) and P(B) = P(Y = y) = f_Y(y) > 0 (since y ∈ S_Y), the conditional probability of event A given event B is P(A | B) = P(A ∩ B) / P(B) = f(x,y) / f_Y(y)

26 DEFINITION The conditional probability mass function of X, given that Y = y, is defined by g(x|y) = f(x,y) / f_Y(y), provided that f_Y(y) > 0. Similarly, the conditional probability mass function of Y, given that X = x, is defined by h(y|x) = f(x,y) / f_X(x), provided that f_X(x) > 0.

27 h(y|x) ≥ 0, and if we sum h(y|x) over y for that fixed x, we obtain Σ_y h(y|x) = Σ_y f(x,y) / f_X(x) = f_X(x) / f_X(x) = 1 thus h(y|x) satisfies the conditions of a probability mass function, and we can compute conditional probabilities such as P(a < Y < b | X = x) = Σ_{y: a<y<b} h(y|x) and conditional mathematical expectations such as E[u(Y) | X = x] = Σ_y u(y) h(y|x)
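
A sketch (added): computing h(y|x) for the dice example of slide 6, checking that it sums to 1, and evaluating the conditional mean E(Y | X = 3).

```python
# Conditional pmf h(y|x) = f(x,y) / f_X(x) for X = min, Y = max of two dice.
from fractions import Fraction

f = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        xy = (min(d1, d2), max(d1, d2))
        f[xy] = f.get(xy, Fraction(0)) + Fraction(1, 36)

x = 3
fX_x = sum(p for (xx, y), p in f.items() if xx == x)      # f_X(3) = 7/36
h = {y: p / fX_x for (xx, y), p in f.items() if xx == x}
print(sum(h.values()))                   # 1, so h(.|3) is a pmf
print(sum(y * p for y, p in h.items()))  # E(Y | X = 3) = 33/7
```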

28 special conditional mathematical expectations: the conditional mean of Y, given that X = x, defined by μ_{Y|x} = E(Y|x) = Σ_y y h(y|x) the conditional variance of Y, given that X = x, defined by σ²_{Y|x} = E{[Y − E(Y|x)]² | x} = Σ_y [y − E(Y|x)]² h(y|x), which can be computed as σ²_{Y|x} = E(Y²|x) − [E(Y|x)]² the conditional mean μ_{X|y} and the conditional variance σ²_{X|y} are given by similar expressions

29 suppose that the conditional mean is a linear function of x; that is, E(Y|x) = a + bx let us find the constants a and b in terms of the characteristics μ_X, μ_Y, σ_X, σ_Y and ρ we assume that the respective standard deviations σ_X and σ_Y are both positive, so that the correlation coefficient will exist from E(Y|x) = Σ_y y h(y|x) = Σ_y y f(x,y) / f_X(x) = a + bx, for x ∈ S_X, we get Σ_y y f(x,y) = (a + bx) f_X(x), for x ∈ S_X summing over x ∈ S_X gives Σ_x Σ_y y f(x,y) = Σ_x (a + bx) f_X(x), that is, μ_Y = a + bμ_X

30 multiplying Σ_y y f(x,y) = (a + bx) f_X(x) by x and summing over x gives Σ_x Σ_y x y f(x,y) = Σ_x (ax + bx²) f_X(x), that is, E(XY) = aE(X) + bE(X²) or, equivalently, μ_X μ_Y + ρ σ_X σ_Y = aμ_X + b(μ_X² + σ_X²) together with μ_Y = a + bμ_X, the solution of the two equations is a = μ_Y − ρ (σ_Y/σ_X) μ_X and b = ρ σ_Y/σ_X thus, if E(Y|x) is linear, it is given by E(Y|x) = μ_Y + ρ (σ_Y/σ_X)(x − μ_X)
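
The two moment equations can also be solved symbolically; a sketch using sympy (added here, with symbol names chosen for readability):

```python
# Re-derive a and b from mu_Y = a + b*mu_X and
# mu_X*mu_Y + rho*sX*sY = a*mu_X + b*(mu_X^2 + sX^2).
import sympy as sp

a, b, muX, muY, sX, sY, rho = sp.symbols('a b mu_X mu_Y sigma_X sigma_Y rho')
sol = sp.solve(
    [sp.Eq(muY, a + b * muX),
     sp.Eq(muX * muY + rho * sX * sY, a * muX + b * (muX**2 + sX**2))],
    [a, b], dict=True)[0]
print(sp.simplify(sol[b]))   # rho*sigma_Y/sigma_X
print(sp.simplify(sol[a]))   # mu_Y - rho*sigma_Y*mu_X/sigma_X
```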

31 E(Y|x) = μ_Y + ρ (σ_Y/σ_X)(x − μ_X), and, by symmetry, E(X|y) = μ_X + ρ (σ_X/σ_Y)(y − μ_Y) if the conditional mean of Y, given that X = x, is linear, it is exactly the same as the best-fitting line (the least squares regression line) considered in the previous case

32 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

33 the idea of joint distributions of two random variables of the discrete type can be extended to that of two random variables of the continuous type; the definitions are the same except that integrals replace summations The joint probability density function (joint pdf) of two continuous-type random variables is an integrable function f(x,y) with the following properties: (a) f(x,y) ≥ 0, where f(x,y) = 0 when (x,y) is not in the support (space) S of X and Y; (b) ∫∫ f(x,y) dx dy = 1; (c) P[(X,Y) ∈ A] = ∫∫_A f(x,y) dx dy, where {(X,Y) ∈ A} is an event defined in the plane geometrically, P[(X,Y) ∈ A] is the volume of the solid over the region A in the xy-plane and bounded by the surface z = f(x,y)

34 the respective marginal pdfs of continuous-type random variables X and Y are given by f_X(x) = ∫ f(x,y) dy, x ∈ S_X, and f_Y(y) = ∫ f(x,y) dx, y ∈ S_Y in the definitions of mathematical expectations, summation is replaced with integration X and Y are independent if and only if the joint pdf factors into the product of their marginal pdfs: f(x,y) ≡ f_X(x) f_Y(y), x ∈ S_X, y ∈ S_Y
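
A continuous-type sketch (the example pdf f(x,y) = x + y on the unit square is chosen here for illustration; it is not from the slides):

```python
# Marginal pdfs by integrating out the other variable, using sympy.
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x + y                                       # joint pdf on [0,1] x [0,1]

print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))    # 1, so f is a valid pdf
fX = sp.integrate(f, (y, 0, 1))                 # x + 1/2
fY = sp.integrate(f, (x, 0, 1))                 # y + 1/2
print(fX, fY)
print(sp.simplify(fX * fY - f) == 0)            # False: X and Y are dependent
```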

35 let X and Y have a distribution of the continuous type with joint pdf f(x,y) and marginal pdfs f_X(x) and f_Y(y) the conditional pdf, mean and variance of Y, given that X = x, are h(y|x) = f(x,y) / f_X(x), provided that f_X(x) > 0, E(Y|x) = ∫ y h(y|x) dy, and Var(Y|x) = E{[Y − E(Y|x)]² | x} = ∫ [y − E(Y|x)]² h(y|x) dy = E(Y²|x) − [E(Y|x)]²

36 if E(Y|x) is linear, then E(Y|x) = μ_Y + ρ (σ_Y/σ_X)(x − μ_X); if E(X|y) is linear, then E(X|y) = μ_X + ρ (σ_X/σ_Y)(y − μ_Y)

37 Bivariate Distributions of the Discrete Type The Correlation Coefficient Conditional Distributions Bivariate Distributions of the Continuous Type The Bivariate Normal Distribution

38 let X and Y be random variables with joint pdf f(x,y) of the continuous type and marginal pdfs f_X(x) and f_Y(y) suppose that we have an application in which we can make the following three assumptions about the conditional distribution of Y, given X = x: (a) it is normal for each real x; (b) its mean, E(Y|x), is a linear function of x; (c) its variance is constant, that is, it does not depend upon the given value of x assumption (b) implies E(Y|x) = μ_Y + ρ (σ_Y/σ_X)(x − μ_X) assumption (c) concerns σ²_{Y|x} = ∫ [y − μ_Y − ρ (σ_Y/σ_X)(x − μ_X)]² h(y|x) dy (multiply by f_X(x) and integrate on x)

39 σ²_{Y|x} = ∫ [y − μ_Y − ρ (σ_Y/σ_X)(x − μ_X)]² h(y|x) dy (multiply by f_X(x) and integrate on x) since σ²_{Y|x} is constant, the left-hand side is unchanged: σ²_{Y|x} = ∫∫ [y − μ_Y − ρ (σ_Y/σ_X)(x − μ_X)]² h(y|x) f_X(x) dy dx with h(y|x) f_X(x) = f(x,y), this reads σ²_{Y|x} = E{[(Y − μ_Y) − ρ (σ_Y/σ_X)(X − μ_X)]²}

40 σ²_{Y|x} = E{[(Y − μ_Y) − ρ (σ_Y/σ_X)(X − μ_X)]²} using the fact that the expectation E is a linear operator and recalling that E[(X − μ_X)(Y − μ_Y)] = ρ σ_X σ_Y, we have σ²_{Y|x} = σ_Y² − 2ρ (σ_Y/σ_X) ρ σ_X σ_Y + ρ² (σ_Y²/σ_X²) σ_X² = σ_Y² − 2ρ²σ_Y² + ρ²σ_Y² = σ_Y²(1 − ρ²) these facts about the conditional mean and variance, together with assumption (a), require that the conditional pdf of Y, given X = x, be h(y|x) = [1 / (σ_Y √(2π) √(1 − ρ²))] exp{ −[y − μ_Y − ρ(σ_Y/σ_X)(x − μ_X)]² / [2σ_Y²(1 − ρ²)] }, −∞ < y < ∞, for every real x up to this point, nothing has been said about the distribution of X other than that it has mean μ_X and positive variance σ_X²

41 suppose we assume that the distribution of X is also normal; that is, the marginal pdf of X is f_X(x) = [1 / (σ_X √(2π))] exp{ −(x − μ_X)² / (2σ_X²) }, −∞ < x < ∞ then the joint pdf of X and Y is given by the product f(x,y) = h(y|x) f_X(x) = [1 / (2π σ_X σ_Y √(1 − ρ²))] exp{ −q(x,y)/2 }, where q(x,y) = [1/(1 − ρ²)] [ ((x − μ_X)/σ_X)² − 2ρ ((x − μ_X)/σ_X)((y − μ_Y)/σ_Y) + ((y − μ_Y)/σ_Y)² ] a joint pdf of this form is called a bivariate normal pdf
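
A numeric check of this formula (added here; the parameter values are chosen arbitrarily) against scipy's multivariate normal:

```python
# Evaluate the bivariate normal pdf from slide 41 and compare it with
# scipy.stats.multivariate_normal at one point.
import numpy as np
from scipy.stats import multivariate_normal

muX, muY, sX, sY, rho = 1.0, -2.0, 1.5, 0.5, 0.7

def f(x, y):
    zx, zy = (x - muX) / sX, (y - muY) / sY
    q = (zx**2 - 2 * rho * zx * zy + zy**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * sX * sY * np.sqrt(1 - rho**2))

cov = [[sX**2, rho * sX * sY], [rho * sX * sY, sY**2]]
rv = multivariate_normal(mean=[muX, muY], cov=cov)
print(f(1.3, -1.6), rv.pdf([1.3, -1.6]))   # the two values should agree
```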

43 THEOREM If X and Y have a bivariate normal distribution with correlation coefficient ρ, then X and Y are independent if and only if ρ = 0. when ρ = 0, q(x,y) reduces to ((x − μ_X)/σ_X)² + ((y − μ_Y)/σ_Y)², so the joint pdf of X and Y equals f_X(x) f_Y(y); moreover, h(y|x) is then a normal pdf with mean μ_Y and variance σ_Y² f(x,y) = h(y|x) f_X(x) = [1 / (2π σ_X σ_Y √(1 − ρ²))] exp{ −q(x,y)/2 }, where q(x,y) = [1/(1 − ρ²)] [ ((x − μ_X)/σ_X)² − 2ρ ((x − μ_X)/σ_X)((y − μ_Y)/σ_Y) + ((y − μ_Y)/σ_Y)² ]
