Bivariate Normal Distribution

1.10. TWO-DIMENSIONAL RANDOM VARIABLES

Bivariate Normal Distribution

Figure: Bivariate Normal pdf

Here we use matrix notation. A bivariate rv is treated as a random vector
$$X = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix}.$$
The expectation of a bivariate random vector is written as
$$\mu = EX = E\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}$$
and its variance-covariance matrix is
$$V = \begin{pmatrix} \operatorname{var}(X_1) & \operatorname{cov}(X_1,X_2) \\ \operatorname{cov}(X_2,X_1) & \operatorname{var}(X_2) \end{pmatrix} = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}.$$
Then the joint pdf of a normal bivariate rv $X$ is given by
$$f_X(x) = \frac{1}{2\pi\sqrt{\det V}}\exp\left\{-\frac{1}{2}(x-\mu)^T V^{-1}(x-\mu)\right\},\tag{1.8}$$
where $x = (x_1,x_2)^T$. The determinant of $V$ is
$$\det V = \det\begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix} = (1-\rho^2)\sigma_1^2\sigma_2^2.$$
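The matrix form of the bivariate normal density above is straightforward to evaluate numerically. A minimal pure-Python sketch (not part of the notes; the parameter values and the grid used for the sanity check are chosen arbitrarily):

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, rho):
    """Bivariate normal pdf via the matrix formula:
    (2*pi*sqrt(det V))^(-1) * exp(-(x-mu)^T V^{-1} (x-mu) / 2)."""
    det_v = (1 - rho**2) * s1**2 * s2**2      # det V = (1-rho^2) s1^2 s2^2
    # entries of the inverse of the 2x2 covariance matrix V
    i11 = s2**2 / det_v
    i22 = s1**2 / det_v
    i12 = -rho * s1 * s2 / det_v
    d1, d2 = x1 - mu1, x2 - mu2
    quad = d1 * d1 * i11 + 2 * d1 * d2 * i12 + d2 * d2 * i22  # (x-mu)^T V^{-1} (x-mu)
    return math.exp(-quad / 2) / (2 * math.pi * math.sqrt(det_v))

# sanity check: the pdf should integrate to approximately 1 over a wide grid
h = 0.05
total = sum(bvn_pdf(-6 + i * h, -6 + j * h, 0.0, 0.0, 1.0, 1.5, 0.6) * h * h
            for i in range(240) for j in range(240))
```

The quadratic form is written out entry by entry rather than with a matrix library, to keep the correspondence with the closed-form inverse of a 2x2 matrix visible.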

CHAPTER 1. ELEMENTS OF PROBABILITY DISTRIBUTION THEORY

Hence, the inverse of $V$ is
$$V^{-1} = \frac{1}{\det V}\begin{pmatrix} \sigma_2^2 & -\rho\sigma_1\sigma_2 \\ -\rho\sigma_1\sigma_2 & \sigma_1^2 \end{pmatrix} = \frac{1}{1-\rho^2}\begin{pmatrix} \frac{1}{\sigma_1^2} & \frac{-\rho}{\sigma_1\sigma_2} \\ \frac{-\rho}{\sigma_1\sigma_2} & \frac{1}{\sigma_2^2} \end{pmatrix}.$$
Then the exponent in formula (1.8) can be written as
$$(x-\mu)^T V^{-1}(x-\mu) = \frac{1}{1-\rho^2}\,(x_1-\mu_1,\; x_2-\mu_2)\begin{pmatrix} \frac{1}{\sigma_1^2} & \frac{-\rho}{\sigma_1\sigma_2} \\ \frac{-\rho}{\sigma_1\sigma_2} & \frac{1}{\sigma_2^2} \end{pmatrix}\begin{pmatrix} x_1-\mu_1 \\ x_2-\mu_2 \end{pmatrix}$$
$$= \frac{1}{1-\rho^2}\left[\frac{(x_1-\mu_1)^2}{\sigma_1^2} - \frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2} + \frac{(x_2-\mu_2)^2}{\sigma_2^2}\right].$$
So, the joint pdf of the two-dimensional normal rv $X$ is
$$f_X(x) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\exp\left\{-\frac{1}{2(1-\rho^2)}\left[\frac{(x_1-\mu_1)^2}{\sigma_1^2} - \frac{2\rho(x_1-\mu_1)(x_2-\mu_2)}{\sigma_1\sigma_2} + \frac{(x_2-\mu_2)^2}{\sigma_2^2}\right]\right\}.$$
Note that when $\rho = 0$ it simplifies to
$$f_X(x) = \frac{1}{2\pi\sigma_1\sigma_2}\exp\left\{-\frac{1}{2}\left[\frac{(x_1-\mu_1)^2}{\sigma_1^2} + \frac{(x_2-\mu_2)^2}{\sigma_2^2}\right]\right\},$$
which can be written as a product of the marginal distributions of $X_1$ and $X_2$. Hence, if $X = (X_1,X_2)^T$ has a bivariate normal distribution and $\rho = 0$, then the variables $X_1$ and $X_2$ are independent.

1.10.8 Bivariate Transformations

Theorem 1.7. Let $X$ and $Y$ be jointly continuous random variables with joint pdf $f_{X,Y}(x,y)$ which has support on $S \subseteq \mathbb{R}^2$. Consider random variables $U = g(X,Y)$ and $V = h(X,Y)$, where $g(\cdot,\cdot)$ and $h(\cdot,\cdot)$ form a one-to-one mapping from $S$ to $D$, with inverses $x = g^{-1}(u,v)$ and $y = h^{-1}(u,v)$ which have continuous partial derivatives. Then, the joint pdf of $(U,V)$ is
$$f_{U,V}(u,v) = f_{X,Y}\big(g^{-1}(u,v),\, h^{-1}(u,v)\big)\,|J|,$$

where the Jacobian $J$ of the transformation is
$$J = \det\begin{pmatrix} \frac{\partial g^{-1}(u,v)}{\partial u} & \frac{\partial g^{-1}(u,v)}{\partial v} \\[4pt] \frac{\partial h^{-1}(u,v)}{\partial u} & \frac{\partial h^{-1}(u,v)}{\partial v} \end{pmatrix} \quad\text{for all } (u,v) \in D.$$

Example 1.31. Let $X, Y$ be independent rvs with $X \sim \operatorname{Exp}(\lambda)$ and $Y \sim \operatorname{Exp}(\lambda)$. Then, the joint pdf of $(X,Y)$ is
$$f_{X,Y}(x,y) = \lambda e^{-\lambda x}\,\lambda e^{-\lambda y} = \lambda^2 e^{-\lambda(x+y)}$$
on support $S = \{(x,y): x > 0,\, y > 0\}$.

We will find the joint pdf of $(U,V)$, where $U = g(X,Y) = X+Y$ and $V = h(X,Y) = X/Y$. This transformation and the support for $(X,Y)$ give the support for $(U,V)$: this is $\{(u,v): u > 0,\, v > 0\}$. The inverse functions are
$$x = g^{-1}(u,v) = \frac{uv}{1+v} \quad\text{and}\quad y = h^{-1}(u,v) = \frac{u}{1+v}.$$
The Jacobian of the transformation is equal to
$$J = \det\begin{pmatrix} \frac{v}{1+v} & \frac{u}{(1+v)^2} \\[4pt] \frac{1}{1+v} & -\frac{u}{(1+v)^2} \end{pmatrix} = -\frac{u}{(1+v)^2}, \quad\text{so } |J| = \frac{u}{(1+v)^2}.$$
Hence, by Theorem 1.7 we can write
$$f_{U,V}(u,v) = f_{X,Y}\big(g^{-1}(u,v),\, h^{-1}(u,v)\big)\,|J| = \lambda^2\exp\left\{-\lambda\left(\frac{uv}{1+v} + \frac{u}{1+v}\right)\right\}\frac{u}{(1+v)^2} = \lambda^2 u e^{-\lambda u}\,\frac{1}{(1+v)^2},$$
for $u, v > 0$. These transformed variables are independent, since the joint pdf factorizes into a function of $u$ alone and a function of $v$ alone.

In a simpler situation, where $g(x)$ is a function of $x$ only and $h(y)$ is a function of $y$ only, it is easy to see the following very useful result.

Theorem 1.8. Let $X$ and $Y$ be independent rvs and let $g(x)$ be a function of $x$ only and $h(y)$ be a function of $y$ only. Then the functions $U = g(X)$ and $V = h(Y)$ are independent.

Proof. (Continuous case.) For any $u \in \mathbb{R}$ and $v \in \mathbb{R}$, define $A_u = \{x: g(x) \le u\}$ and $A_v = \{y: h(y) \le v\}$. Then, we can obtain the joint cdf of $(U,V)$ as follows:
$$F_{U,V}(u,v) = P(U \le u,\, V \le v) = P(X \in A_u,\, Y \in A_v) = P(X \in A_u)\,P(Y \in A_v),$$
as $X$ and $Y$ are independent. The mixed partial derivative with respect to $u$ and $v$ will give us the joint pdf of $(U,V)$. That is,
$$f_{U,V}(u,v) = \frac{\partial^2}{\partial u\,\partial v}F_{U,V}(u,v) = \frac{d}{du}P(X \in A_u)\;\frac{d}{dv}P(Y \in A_v),$$
as the first factor depends on $u$ only and the second factor on $v$ only. Hence, the rvs $U = g(X)$ and $V = h(Y)$ are independent. $\square$

Exercise 1.19. Let $(X,Y)$ be a two-dimensional random variable with joint pdf
$$f_{X,Y}(x,y) = \begin{cases} 8xy, & \text{for } 0 \le x < y \le 1; \\ 0, & \text{otherwise}. \end{cases}$$
Let $U = X/Y$ and $V = Y$.
(a) Are the variables $X$ and $Y$ independent? Explain.
(b) Calculate the covariance of $X$ and $Y$.
(c) Obtain the joint pdf of $(U,V)$.
(d) Are the variables $U$ and $V$ independent? Explain.
(e) What is the covariance of $U$ and $V$?

Exercise 1.20. Let $X$ and $Y$ be independent random variables such that $X \sim \operatorname{Exp}(\lambda)$ and $Y \sim \operatorname{Exp}(\lambda)$.

(a) Find the joint probability density function of $(U,V)$, where
$$U = \frac{X}{X+Y} \quad\text{and}\quad V = X+Y.$$
(b) Are the variables $U$ and $V$ independent? Explain.
(c) Show that $U$ is uniformly distributed on $(0,1)$.
(d) What is the distribution of $V$?
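Transformation results of this kind can be sanity-checked by Monte Carlo. A sketch for the exponential example above ($U = X+Y$, $V = X/Y$); under the derived joint pdf, $U \sim \operatorname{Gamma}(2,\lambda)$, so $EU = 2/\lambda$, and $U$, $V$ are independent. The value $\lambda = 2$, the seed, and the thresholds are arbitrary choices for this check:

```python
import random

random.seed(0)
lam, n = 2.0, 200_000
u_gt = v_gt = both = 0
u_sum = 0.0
for _ in range(n):
    x = random.expovariate(lam)   # X ~ Exp(lam)
    y = random.expovariate(lam)   # Y ~ Exp(lam), independent of X
    u, v = x + y, x / y
    u_sum += u
    u_gt += u > 1.0
    v_gt += v > 1.0
    both += (u > 1.0) and (v > 1.0)

mean_u = u_sum / n                       # E[U] = 2/lam = 1 here
p_u, p_v, p_uv = u_gt / n, v_gt / n, both / n
# independence check: P(U>1, V>1) should be close to P(U>1) * P(V>1);
# also P(V>1) = P(X>Y) = 1/2 by symmetry
```

Note that $V = X/Y$ has pdf $1/(1+v)^2$ with no finite mean, so independence is checked through joint tail probabilities rather than through a sample correlation.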

1.11 Random Sample and Sampling Distributions

Example 1.32. In a study on the relation of the level of cholesterol in blood to incidents of heart attack, 28 heart-attack patients had their cholesterol level measured two days and four days after the attack. Also, cholesterol levels were recorded for a control group of 30 people who did not have a heart attack. The data are available on the course web-page. Various questions may be asked here. For example:
1. What is the population's mean cholesterol level on the second day after heart attack?
2. Is the difference in the mean cholesterol level on day 2 and on day 4 after the attack statistically significant?
3. Is high cholesterol level a significant risk factor for a heart attack?

Each numerical value in this example can be treated as a realization of a random variable. For example, the value $x_1 = 270$ for patient one, measured two days after the heart attack, is a realization of a rv $X_1$ representing all possible values of the cholesterol level of patient one. Here we have 28 patients, hence we have 28 random variables $X_1, \ldots, X_{28}$. These patients are only a small part (a sample) of all people (a population) having a heart attack at this time and area of living. Here we come to the definition of a random sample.

Definition. The random variables $X_1, \ldots, X_n$ are called a random sample of size $n$ from a population if $X_1, \ldots, X_n$ are mutually independent, each having the same probability distribution. We say that such random variables are iid, that is, independently, identically distributed. The joint pdf (pmf in the discrete case) can be written as a product of the marginal pdfs, i.e.,
$$f_X(x_1, \ldots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i),$$
where $X = (X_1, \ldots, X_n)$ is the jointly continuous $n$-dimensional random variable.
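The product form of the joint pdf is used directly in computation: the log-density of an iid sample is the sum of the marginal log-densities, which avoids underflow for larger samples. A small sketch assuming a normal population (the data values and parameters below are illustrative only, not from the cholesterol study):

```python
import math

def norm_logpdf(x, mu, sigma):
    """Log of the N(mu, sigma^2) density at x."""
    z = (x - mu) / sigma
    return -0.5 * z * z - math.log(sigma * math.sqrt(2 * math.pi))

def sample_logpdf(xs, mu, sigma):
    """log f_X(x_1,...,x_n) = sum_i log f_{X_i}(x_i) for an iid sample."""
    return sum(norm_logpdf(x, mu, sigma) for x in xs)

data = [268.0, 270.0, 255.0]        # illustrative values only
mu, sigma = 265.0, 10.0             # assumed population parameters
joint = math.exp(sample_logpdf(data, mu, sigma))
# same quantity computed as a direct product of marginal densities
direct = math.prod(math.exp(norm_logpdf(x, mu, sigma)) for x in data)
```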

Note: In Example 1.32 it can be assumed that these rvs are mutually independent (the cholesterol level of one patient does not depend on the cholesterol level of another patient). If we can also assume that the distribution of the cholesterol level of each patient is the same, then the variables $X_1, \ldots, X_{28}$ constitute a random sample.

To make sensible inference from the observed values (a realization of a random sample) we often calculate some summaries of the data, such as the average or an estimate of the sample variance. In general, such summaries are functions of the random sample and we write $Y = T(X_1, \ldots, X_n)$. Note that $Y$ itself is then a random variable. The distribution of $Y$ carries some information about the population and it allows us to make inference regarding some parameters of the population; for example, about the expected cholesterol level and its variability in the population of people who suffer a heart attack.

The distributions of many functions $T(X_1, \ldots, X_n)$ can be derived from the distributions of the variables $X_i$. Such distributions are called sampling distributions and the function $T$ is called a statistic. The functions
$$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \quad\text{and}\quad S^2 = \frac{1}{n-1}\sum_{i=1}^{n}\big(X_i - \bar{X}\big)^2$$
are the most common statistics used in data analysis.

1.11.1 $\chi^2$, $t$ and $F$ Distributions

These three distributions can be derived from distributions of iid random variables. They are commonly used in statistical hypothesis testing and in interval estimation of unknown population parameters.

$\chi^2$ distribution

We have introduced the $\chi^2_\nu$ distribution as a special case of the gamma distribution, that is, $\operatorname{Gamma}\big(\frac{\nu}{2}, \frac{1}{2}\big)$. We write
$$Y \sim \chi^2_\nu,$$

if the rv $Y$ has the chi-squared distribution with $\nu$ degrees of freedom; $\nu$ is the parameter of the distribution function. The pdf of $Y \sim \chi^2_\nu$ is
$$f_Y(y) = \frac{y^{\nu/2-1}e^{-y/2}}{\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}} \quad\text{for } y > 0,\ \nu = 1, 2, \ldots,$$
and the mgf of $Y$ is
$$M_Y(t) = \left(\frac{1}{1-2t}\right)^{\nu/2}, \quad t < \frac{1}{2}.$$
It is easy to show, using the derivatives of the mgf evaluated at $t = 0$, that $EY = \nu$ and $\operatorname{var} Y = 2\nu$. In the following example we will see that $\chi^2_2$ is the distribution of a function of two iid standard normal rvs.

Example 1.33. In an earlier example we have seen that the square of a standard normal rv has the $\chi^2_1$ distribution. Now, assume that $Z_1 \sim N(0,1)$ and $Z_2 \sim N(0,1)$ independently. What is the distribution of $Y = Z_1^2 + Z_2^2$? Denote $Y_1 = Z_1^2$ and $Y_2 = Z_2^2$. To answer this question we can use the properties of the mgf as follows:
$$M_Y(t) = E\big[e^{tY}\big] = E\big[e^{t(Y_1+Y_2)}\big] = E\big[e^{tY_1}e^{tY_2}\big] = M_{Y_1}(t)\,M_{Y_2}(t),$$
as, by Theorem 1.8, $Y_1$ and $Y_2$ are independent. Also, $Y_1 \sim \chi^2_1$ and $Y_2 \sim \chi^2_1$, each with mgf equal to
$$M_{Y_i}(t) = (1-2t)^{-1/2}, \quad i = 1, 2.$$
Hence,
$$M_Y(t) = M_{Y_1}(t)\,M_{Y_2}(t) = (1-2t)^{-1/2}(1-2t)^{-1/2} = (1-2t)^{-1}.$$
This is the mgf of $\chi^2_2$. Hence, by the uniqueness of the mgf we can conclude that $Y = Z_1^2 + Z_2^2 \sim \chi^2_2$.

Note: This result can be easily extended to $n$ independent standard normal rvs. That is, if $Z_i \sim N(0,1)$ for $i = 1, \ldots, n$ independently, then
$$Y = \sum_{i=1}^{n} Z_i^2 \sim \chi^2_n.$$
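This extension is easy to check by simulation: a $\chi^2_n$ variable has mean $n$ and variance $2n$, matching the moments derived from the mgf above. A sketch with $n = 5$ (seed and sample sizes are arbitrary choices):

```python
import random

random.seed(1)
nu, reps = 5, 100_000
# each draw is a sum of nu squared independent N(0,1) variables
ys = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu)) for _ in range(reps)]
mean_y = sum(ys) / reps                            # should be close to nu
var_y = sum((y - mean_y) ** 2 for y in ys) / reps  # should be close to 2*nu
```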

From this result we can draw a useful conclusion.

Corollary. A sum of independent random variables $T = \sum_{i=1}^{k} Y_i$, where each component has a chi-squared distribution, i.e., $Y_i \sim \chi^2_{\nu_i}$, is a random variable having a chi-squared distribution with $\nu = \sum_{i=1}^{k}\nu_i$ degrees of freedom.

Note that $T$ in the above corollary can be written as a sum of $\nu$ squared iid standard normal rvs, hence it must have a chi-squared distribution.

Example 1.34. Let $X_1, \ldots, X_n$ be iid random variables, such that
$$X_i \sim N(\mu, \sigma^2), \quad i = 1, \ldots, n.$$
Then
$$Z_i = \frac{X_i - \mu}{\sigma} \sim N(0,1)$$
and so
$$\sum_{i=1}^{n} Z_i^2 = \sum_{i=1}^{n}\frac{(X_i - \mu)^2}{\sigma^2} \sim \chi^2_n.$$
This is a useful result, but it depends on two parameters whose values are usually unknown when we analyze data. Here we will see what happens when we replace $\mu$ with $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$. We can write
$$\sum_{i=1}^{n}(X_i - \mu)^2 = \sum_{i=1}^{n}\big[(X_i - \bar{X}) + (\bar{X} - \mu)\big]^2 = \sum_{i=1}^{n}(X_i - \bar{X})^2 + n(\bar{X} - \mu)^2 + 2(\bar{X} - \mu)\underbrace{\sum_{i=1}^{n}(X_i - \bar{X})}_{=0} = \sum_{i=1}^{n}(X_i - \bar{X})^2 + n(\bar{X} - \mu)^2.$$
Hence, dividing this by $\sigma^2$ we get
$$\sum_{i=1}^{n}\frac{(X_i - \mu)^2}{\sigma^2} = \frac{(n-1)S^2}{\sigma^2} + \frac{(\bar{X} - \mu)^2}{\sigma^2/n},\tag{1.9}$$
where $S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$. We know that $\bar{X} \sim N\big(\mu, \frac{\sigma^2}{n}\big)$, so
$$\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1).$$

Hence,
$$\frac{(\bar{X} - \mu)^2}{\sigma^2/n} \sim \chi^2_1.$$
Also,
$$\sum_{i=1}^{n}\frac{(X_i - \mu)^2}{\sigma^2} \sim \chi^2_n.$$
Furthermore, it can be shown that $\bar{X}$ and $S^2$ are independent. Now, equation (1.9) is a relation of the form $W = U + V$, where $W \sim \chi^2_n$ and $V \sim \chi^2_1$. Since here $U$ and $V$ are independent, we have $M_W(t) = M_U(t)\,M_V(t)$. That is,
$$M_U(t) = \frac{M_W(t)}{M_V(t)} = \frac{(1-2t)^{-n/2}}{(1-2t)^{-1/2}} = (1-2t)^{-(n-1)/2}.$$
The last expression is the mgf of a random variable with a $\chi^2_{n-1}$ distribution. That is,
$$\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}.$$
Equivalently,
$$\sum_{i=1}^{n}\frac{(X_i - \bar{X})^2}{\sigma^2} \sim \chi^2_{n-1}.$$

Student $t$ distribution

A derivation of the $t$-distribution was published in 1908 by William Sealy Gosset while he worked at the Guinness Brewery in Dublin. Due to proprietary issues, the paper was written under the pseudonym "Student". The distribution is used in hypothesis testing, and the test functions having a $t$ distribution are often called $t$-tests. We write
$$Y \sim t_\nu,$$

if the rv $Y$ has the $t$ distribution with $\nu$ degrees of freedom; $\nu$ is the parameter of the distribution function. The pdf is given by
$$f_Y(y) = \frac{\Gamma\big(\frac{\nu+1}{2}\big)}{\sqrt{\nu\pi}\,\Gamma\big(\frac{\nu}{2}\big)}\left(1 + \frac{y^2}{\nu}\right)^{-\frac{\nu+1}{2}}, \quad y \in \mathbb{R},\ \nu = 1, 2, 3, \ldots$$

The following theorem is widely applied in statistics to build hypothesis tests.

Theorem 1.9. Let $Z$ and $X$ be independent random variables such that $Z \sim N(0,1)$ and $X \sim \chi^2_\nu$. The random variable
$$Y = \frac{Z}{\sqrt{X/\nu}}$$
has Student $t$ distribution with $\nu$ degrees of freedom.

Proof. Here we will apply Theorem 1.7. We will find the joint distribution of $(Y,W)$, where $Y = \frac{Z}{\sqrt{X/\nu}}$ and $W = X$, and then we will find the marginal density function of $Y$, as required. The densities of standard normal and chi-squared rvs, respectively, are
$$f_Z(z) = \frac{1}{\sqrt{2\pi}}e^{-z^2/2}, \quad z \in \mathbb{R},$$
and
$$f_X(x) = \frac{x^{\nu/2-1}e^{-x/2}}{\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}}, \quad x \in \mathbb{R}_+.$$
$Z$ and $X$ are independent, so the joint pdf of $(Z,X)$ is equal to the product of the marginal pdfs $f_Z(z)$ and $f_X(x)$. That is,
$$f_{Z,X}(z,x) = \frac{1}{\sqrt{2\pi}}e^{-z^2/2}\,\frac{x^{\nu/2-1}e^{-x/2}}{\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}}.$$
Here we have the transformation
$$y = \frac{z}{\sqrt{x/\nu}} \quad\text{and}\quad w = x,$$
which gives the inverses
$$z = y\sqrt{w/\nu} \quad\text{and}\quad x = w.$$

Then, the Jacobian of the transformation is
$$J = \det\begin{pmatrix} \sqrt{w/\nu} & \frac{y}{2\sqrt{w\nu}} \\ 0 & 1 \end{pmatrix} = \sqrt{w/\nu}.$$
Hence, the joint pdf of $(Y,W)$ is
$$f_{Y,W}(y,w) = \frac{1}{\sqrt{2\pi}}e^{-\frac{y^2 w}{2\nu}}\,\frac{w^{\nu/2-1}e^{-w/2}}{\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}}\sqrt{\frac{w}{\nu}} = \frac{w^{\frac{\nu+1}{2}-1}}{\sqrt{2\pi\nu}\,\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}}\exp\left\{-\frac{w}{2}\left(1 + \frac{y^2}{\nu}\right)\right\}.$$
The range of $(Z,X)$ is $\{(z,x): -\infty < z < \infty,\ 0 < x < \infty\}$ and so the range of $(Y,W)$ is $\{(y,w): -\infty < y < \infty,\ 0 < w < \infty\}$. To obtain the marginal pdf of $Y$ we need to integrate the joint pdf of $(Y,W)$ over the range of $W$. This is an easy task when we notice the similarity of this function to the pdf of $\operatorname{Gamma}(\alpha,\lambda)$ and use the fact that a pdf integrates to 1 over the whole range. The pdf of a gamma random variable $V$ is
$$f_V(v) = \frac{v^{\alpha-1}\lambda^{\alpha}e^{-\lambda v}}{\Gamma(\alpha)}.$$
Denote
$$\alpha = \frac{\nu+1}{2}, \quad \lambda = \frac{1}{2}\left(1 + \frac{y^2}{\nu}\right).$$
Then, multiplying and dividing the joint pdf of $(Y,W)$ by $\lambda^{\alpha}$ and by $\Gamma(\alpha)$, we get
$$f_{Y,W}(y,w) = \frac{w^{\alpha-1}\lambda^{\alpha}e^{-\lambda w}}{\Gamma(\alpha)}\,\frac{\Gamma(\alpha)}{\sqrt{2\pi\nu}\,\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}\lambda^{\alpha}}.$$
Then, the marginal pdf of $Y$ is
$$f_Y(y) = \int_0^{\infty} f_{Y,W}(y,w)\,dw = \frac{\Gamma(\alpha)}{\sqrt{2\pi\nu}\,\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}\lambda^{\alpha}}\int_0^{\infty}\frac{w^{\alpha-1}\lambda^{\alpha}e^{-\lambda w}}{\Gamma(\alpha)}\,dw,$$

as the second factor is treated as a constant when we integrate with respect to $w$. The integrand has the form of a gamma pdf, hence it integrates to 1. This gives the pdf of $Y$ equal to
$$f_Y(y) = \frac{\Gamma(\alpha)}{\sqrt{2\pi\nu}\,\Gamma\big(\frac{\nu}{2}\big)2^{\nu/2}\lambda^{\alpha}} = \frac{\Gamma\big(\frac{\nu+1}{2}\big)}{\sqrt{\nu\pi}\,\Gamma\big(\frac{\nu}{2}\big)}\left(1 + \frac{y^2}{\nu}\right)^{-\frac{\nu+1}{2}},$$
which is the pdf of a random variable having the Student $t$ distribution with $\nu$ degrees of freedom. $\square$

Example 1.35. Let $X_i$, $i = 1, \ldots, n$, be iid normal random variables with mean $\mu$ and variance $\sigma^2$. We often write this fact in the following way:
$$X_i \overset{\text{iid}}{\sim} N(\mu, \sigma^2), \quad i = 1, \ldots, n.$$
Then $\bar{X} \sim N\big(\mu, \frac{\sigma^2}{n}\big)$, so
$$U = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1).$$
Also, as shown in Example 1.34, we have
$$V = \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}.$$
Furthermore, $\bar{X}$ and $S^2$ are independent, hence $U$ and $V$ are also independent by Theorem 1.8. Hence, by Theorem 1.9, we have
$$T = \frac{U}{\sqrt{V/(n-1)}} \sim t_{n-1}.$$
That is,
$$T = \frac{\dfrac{\bar{X} - \mu}{\sigma/\sqrt{n}}}{\sqrt{\dfrac{(n-1)S^2}{\sigma^2}\Big/(n-1)}} = \frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t_{n-1}.$$

Note: The function $T$ in Example 1.35 is used for testing hypotheses about the population mean $\mu$ when the variance of the population is unknown. We will be using this function later on in this course.
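The claim can be checked by simulation: for $n = 5$ the statistic $T$ should follow $t_4$, whose upper 2.5% point is about 2.776 (a standard tabulated value), so $|T|$ should exceed it roughly 5% of the time. A sketch (the population parameters, seed, and replication count are arbitrary choices):

```python
import math
import random

random.seed(3)
n, reps, mu, sigma = 5, 100_000, 10.0, 2.0
exceed = 0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))  # sample sd
    t = (xbar - mu) / (s / math.sqrt(n))    # the T statistic of Example 1.35
    exceed += abs(t) > 2.776                # t_4 upper 2.5% point (from tables)
frac = exceed / reps                        # should be close to 0.05
```

Note that the normal quantile 1.96 would give noticeably more than 5% exceedances here, which is exactly the heavier-tail correction the $t$ distribution provides for small samples.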

$F_{\nu_1,\nu_2}$ Distribution

Another sampling distribution very often used in statistics is the Fisher-Snedecor $F$ distribution. We write
$$Y \sim F_{\nu_1,\nu_2},$$
if the rv $Y$ has the $F$ distribution with $\nu_1$ and $\nu_2$ degrees of freedom. The degrees of freedom are the parameters of the distribution function. The pdf is quite complicated. It is given by
$$f_Y(y) = \frac{\Gamma\big(\frac{\nu_1+\nu_2}{2}\big)}{\Gamma\big(\frac{\nu_1}{2}\big)\Gamma\big(\frac{\nu_2}{2}\big)}\left(\frac{\nu_1}{\nu_2}\right)^{\nu_1/2} y^{\nu_1/2-1}\left(1 + \frac{\nu_1}{\nu_2}y\right)^{-\frac{\nu_1+\nu_2}{2}}, \quad y \in \mathbb{R}_+.$$
This distribution is related to the chi-squared distribution in the following way.

Theorem 1.10. Let $X$ and $Y$ be independent random variables such that
$$X \sim \chi^2_{\nu_1} \quad\text{and}\quad Y \sim \chi^2_{\nu_2}.$$
Then the random variable
$$F = \frac{X/\nu_1}{Y/\nu_2}$$
has Fisher's $F$ distribution with $\nu_1$ and $\nu_2$ degrees of freedom.

Proof. This result can be shown in a similar way to the result of Theorem 1.9. Take the transformation $F = \frac{X/\nu_1}{Y/\nu_2}$ and $W = Y$ and use Theorem 1.7. $\square$

Example 1.36. Let $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_m$ be two independent random samples such that
$$X_i \sim N(\mu_1, \sigma_1^2) \quad\text{and}\quad Y_j \sim N(\mu_2, \sigma_2^2), \quad i = 1, \ldots, n;\ j = 1, \ldots, m,$$
and let $S_1^2$ and $S_2^2$ be the sample variances of the random samples $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_m$, respectively. Then (see Example 1.34)
$$\frac{(n-1)S_1^2}{\sigma_1^2} \sim \chi^2_{n-1}.$$
Similarly,
$$\frac{(m-1)S_2^2}{\sigma_2^2} \sim \chi^2_{m-1}.$$

Hence, by Theorem 1.10, the ratio
$$F = \frac{\dfrac{(n-1)S_1^2}{\sigma_1^2}\Big/(n-1)}{\dfrac{(m-1)S_2^2}{\sigma_2^2}\Big/(m-1)} = \frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} \sim F_{n-1,m-1}.$$
This statistic is used in testing hypotheses regarding variances of two independent normal populations.

Exercise 1.21. Let $Y \sim \operatorname{Gamma}(\alpha, \lambda)$, that is, $Y$ has the pdf given by
$$f_Y(y) = \begin{cases} \dfrac{y^{\alpha-1}\lambda^{\alpha}e^{-\lambda y}}{\Gamma(\alpha)}, & \text{for } y > 0; \\ 0, & \text{otherwise}, \end{cases}$$
for $\alpha > 0$ and $\lambda > 0$.
(a) Show that for any $a \in \mathbb{R}$ such that $a + \alpha > 0$, the expectation of $Y^a$ is
$$E(Y^a) = \frac{\Gamma(a+\alpha)}{\lambda^a\,\Gamma(\alpha)}.$$
(b) Obtain $EY$ and $\operatorname{var} Y$. Hint: use the result given in point (a) and the iterative property of the gamma function, i.e., $\Gamma(z) = (z-1)\,\Gamma(z-1)$.
(c) Let $X \sim \chi^2_\nu$. Use (a) to obtain $EX$ and $\operatorname{var} X$.

Exercise 1.22. Let $Y_1, \ldots, Y_5$ be a random sample of size 5 from a standard normal population, that is $Y_i \overset{\text{iid}}{\sim} N(0,1)$, $i = 1, \ldots, 5$, and denote by $\bar{Y}$ the sample mean, that is $\bar{Y} = \frac{1}{5}\sum_{i=1}^{5} Y_i$. Let $Y_6$ be another independent standard normal random variable. What is the distribution of
(a) $W = \sum_{i=1}^{5} Y_i^2$?
(b) $U = \sum_{i=1}^{5}(Y_i - \bar{Y})^2$?
(c) $U + Y_6^2$?
(d) $\sqrt{5}\,Y_6/\sqrt{W}$?
Explain each of your answers.
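The $F$ ratio above can also be checked by Monte Carlo: when $\sigma_1 = \sigma_2$, the ratio $S_1^2/S_2^2$ follows $F_{n-1,m-1}$, whose mean is $(m-1)/(m-3)$ when $m-1 > 2$. A sketch with $n = 9$, $m = 13$, so the mean should be $12/10 = 1.2$ (all parameter choices and the seed are arbitrary):

```python
import random

random.seed(4)
n, m, reps = 9, 13, 50_000
sigma = 1.5      # common standard deviation, so S1^2/S2^2 ~ F_{n-1, m-1}

def sample_var(k, sd):
    """Sample variance S^2 of k iid N(0, sd^2) draws."""
    xs = [random.gauss(0.0, sd) for _ in range(k)]
    xb = sum(xs) / k
    return sum((x - xb) ** 2 for x in xs) / (k - 1)

fs = [sample_var(n, sigma) / sample_var(m, sigma) for _ in range(reps)]
mean_f = sum(fs) / reps       # E[F_{8,12}] = 12 / (12 - 2) = 1.2
```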


More information

SOLUTION FOR HOMEWORK 12, STAT 4351

SOLUTION FOR HOMEWORK 12, STAT 4351 SOLUTION FOR HOMEWORK 2, STAT 435 Welcome to your 2th homework. It looks like this is the last one! As usual, try to find mistakes and get extra points! Now let us look at your problems.. Problem 7.22.

More information

1 Solution to Problem 2.1

1 Solution to Problem 2.1 Solution to Problem 2. I incorrectly worked this exercise instead of 2.2, so I decided to include the solution anyway. a) We have X Y /3, which is a - function. It maps the interval, ) where X lives) onto

More information

Lecture 25: Review. Statistics 104. April 23, Colin Rundel

Lecture 25: Review. Statistics 104. April 23, Colin Rundel Lecture 25: Review Statistics 104 Colin Rundel April 23, 2012 Joint CDF F (x, y) = P [X x, Y y] = P [(X, Y ) lies south-west of the point (x, y)] Y (x,y) X Statistics 104 (Colin Rundel) Lecture 25 April

More information

Stat410 Probability and Statistics II (F16)

Stat410 Probability and Statistics II (F16) Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two

More information

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2},

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2}, ECE32 Spring 25 HW Solutions April 6, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where

More information

Expectation and Variance

Expectation and Variance Expectation and Variance August 22, 2017 STAT 151 Class 3 Slide 1 Outline of Topics 1 Motivation 2 Expectation - discrete 3 Transformations 4 Variance - discrete 5 Continuous variables 6 Covariance STAT

More information

APPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2

APPM/MATH 4/5520 Solutions to Exam I Review Problems. f X 1,X 2. 2e x 1 x 2. = x 2 APPM/MATH 4/5520 Solutions to Exam I Review Problems. (a) f X (x ) f X,X 2 (x,x 2 )dx 2 x 2e x x 2 dx 2 2e 2x x was below x 2, but when marginalizing out x 2, we ran it over all values from 0 to and so

More information

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions

Chapter 5. Random Variables (Continuous Case) 5.1 Basic definitions Chapter 5 andom Variables (Continuous Case) So far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions on

More information

Probability and Statistics Notes

Probability and Statistics Notes Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1

More information

2.3 Analysis of Categorical Data

2.3 Analysis of Categorical Data 90 CHAPTER 2. ESTIMATION AND HYPOTHESIS TESTING 2.3 Analysis of Categorical Data 2.3.1 The Multinomial Probability Distribution A mulinomial random variable is a generalization of the binomial rv. It results

More information

, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40

, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40 Assignment 4 Fall 07. Exercise 3.. on Page 46: If the mgf of a rom variable X is ( 3 + 3 et) 5, find P(X or 3). Since the M(t) of X is ( 3 + 3 et) 5, X has a binomial distribution with n 5, p 3. The probability

More information

The Delta Method and Applications

The Delta Method and Applications Chapter 5 The Delta Method and Applications 5.1 Local linear approximations Suppose that a particular random sequence converges in distribution to a particular constant. The idea of using a first-order

More information

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment:

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment: Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230

More information

Will Murray s Probability, XXXII. Moment-Generating Functions 1. We want to study functions of them:

Will Murray s Probability, XXXII. Moment-Generating Functions 1. We want to study functions of them: Will Murray s Probability, XXXII. Moment-Generating Functions XXXII. Moment-Generating Functions Premise We have several random variables, Y, Y, etc. We want to study functions of them: U (Y,..., Y n ).

More information

Introduction to Probability Theory

Introduction to Probability Theory Introduction to Probability Theory Ping Yu Department of Economics University of Hong Kong Ping Yu (HKU) Probability 1 / 39 Foundations 1 Foundations 2 Random Variables 3 Expectation 4 Multivariate Random

More information

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14 Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional

More information

Continuous Random Variables and Continuous Distributions

Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

0, otherwise. U = Y 1 Y 2 Hint: Use either the method of distribution functions or a bivariate transformation. (b) Find E(U).

0, otherwise. U = Y 1 Y 2 Hint: Use either the method of distribution functions or a bivariate transformation. (b) Find E(U). 1. Suppose Y U(0, 2) so that the probability density function (pdf) of Y is 1 2, 0 < y < 2 (a) Find the pdf of U = Y 4 + 1. Make sure to note the support. (c) Suppose Y 1, Y 2,..., Y n is an iid sample

More information

Math Review Sheet, Fall 2008

Math Review Sheet, Fall 2008 1 Descriptive Statistics Math 3070-5 Review Sheet, Fall 2008 First we need to know about the relationship among Population Samples Objects The distribution of the population can be given in one of the

More information

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/ STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This

More information

SOME SPECIFIC PROBABILITY DISTRIBUTIONS. 1 2πσ. 2 e 1 2 ( x µ

SOME SPECIFIC PROBABILITY DISTRIBUTIONS. 1 2πσ. 2 e 1 2 ( x µ SOME SPECIFIC PROBABILITY DISTRIBUTIONS. Normal random variables.. Probability Density Function. The random variable is said to be normally distributed with mean µ and variance abbreviated by x N[µ, ]

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

Introduction to Normal Distribution

Introduction to Normal Distribution Introduction to Normal Distribution Nathaniel E. Helwig Assistant Professor of Psychology and Statistics University of Minnesota (Twin Cities) Updated 17-Jan-2017 Nathaniel E. Helwig (U of Minnesota) Introduction

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.

t x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3. Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae

More information

Stat 5101 Notes: Brand Name Distributions

Stat 5101 Notes: Brand Name Distributions Stat 5101 Notes: Brand Name Distributions Charles J. Geyer February 14, 2003 1 Discrete Uniform Distribution DiscreteUniform(n). Discrete. Rationale Equally likely outcomes. The interval 1, 2,..., n of

More information

Random Vectors and Multivariate Normal Distributions

Random Vectors and Multivariate Normal Distributions Chapter 3 Random Vectors and Multivariate Normal Distributions 3.1 Random vectors Definition 3.1.1. Random vector. Random vectors are vectors of random 75 variables. For instance, X = X 1 X 2., where each

More information

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as

EEL 5544 Noise in Linear Systems Lecture 30. X (s) = E [ e sx] f X (x)e sx dx. Moments can be found from the Laplace transform as L30-1 EEL 5544 Noise in Linear Systems Lecture 30 OTHER TRANSFORMS For a continuous, nonnegative RV X, the Laplace transform of X is X (s) = E [ e sx] = 0 f X (x)e sx dx. For a nonnegative RV, the Laplace

More information

Lecture 3. Probability - Part 2. Luigi Freda. ALCOR Lab DIAG University of Rome La Sapienza. October 19, 2016

Lecture 3. Probability - Part 2. Luigi Freda. ALCOR Lab DIAG University of Rome La Sapienza. October 19, 2016 Lecture 3 Probability - Part 2 Luigi Freda ALCOR Lab DIAG University of Rome La Sapienza October 19, 2016 Luigi Freda ( La Sapienza University) Lecture 3 October 19, 2016 1 / 46 Outline 1 Common Continuous

More information

Class 8 Review Problems 18.05, Spring 2014

Class 8 Review Problems 18.05, Spring 2014 1 Counting and Probability Class 8 Review Problems 18.05, Spring 2014 1. (a) How many ways can you arrange the letters in the word STATISTICS? (e.g. SSSTTTIIAC counts as one arrangement.) (b) If all arrangements

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions Math/Stats 45, Sec., Fall 4: Introduction to Probability Final Exam: Solutions. In a game, a contestant is shown two identical envelopes containing money. The contestant does not know how much money is

More information

STAT 512 sp 2018 Summary Sheet

STAT 512 sp 2018 Summary Sheet STAT 5 sp 08 Summary Sheet Karl B. Gregory Spring 08. Transformations of a random variable Let X be a rv with support X and let g be a function mapping X to Y with inverse mapping g (A = {x X : g(x A}

More information

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities

PCMI Introduction to Random Matrix Theory Handout # REVIEW OF PROBABILITY THEORY. Chapter 1 - Events and Their Probabilities PCMI 207 - Introduction to Random Matrix Theory Handout #2 06.27.207 REVIEW OF PROBABILITY THEORY Chapter - Events and Their Probabilities.. Events as Sets Definition (σ-field). A collection F of subsets

More information

Statistics, Data Analysis, and Simulation SS 2015

Statistics, Data Analysis, and Simulation SS 2015 Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler

More information

Statistics 1B. Statistics 1B 1 (1 1)

Statistics 1B. Statistics 1B 1 (1 1) 0. Statistics 1B Statistics 1B 1 (1 1) 0. Lecture 1. Introduction and probability review Lecture 1. Introduction and probability review 2 (1 1) 1. Introduction and probability review 1.1. What is Statistics?

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

Some Special Distributions (Hogg Chapter Three)

Some Special Distributions (Hogg Chapter Three) Some Special Distributions Hogg Chapter Three STAT 45-: Mathematical Statistics I Fall Semester 5 Contents Binomial & Related Distributions. The Binomial Distribution............... Some advice on calculating

More information

Exercises with solutions (Set D)

Exercises with solutions (Set D) Exercises with solutions Set D. A fair die is rolled at the same time as a fair coin is tossed. Let A be the number on the upper surface of the die and let B describe the outcome of the coin toss, where

More information

EE4601 Communication Systems

EE4601 Communication Systems EE4601 Communication Systems Week 2 Review of Probability, Important Distributions 0 c 2011, Georgia Institute of Technology (lect2 1) Conditional Probability Consider a sample space that consists of two

More information