The Multivariate Normal Distribution
1 The Multivariate Normal Distribution
STA 302 Fall 2017. See last slide for copyright information.
2 Overview
1. Moment-generating Functions
2. Definition
3. Properties
4. $\chi^2$ and $t$ distributions
3 Joint moment-generating function
Of a $p$-dimensional random vector $x$,
$M_x(t) = E\left(e^{t'x}\right)$
For example,
$M_{(x_1,x_2,x_3)}(t_1,t_2,t_3) = E\left(e^{x_1 t_1 + x_2 t_2 + x_3 t_3}\right)$
Just write $M(t)$ if there is no ambiguity. Section 4.3 of Linear models in statistics has some material on moment-generating functions (optional).
4 Uniqueness (proof omitted)
Joint moment-generating functions correspond uniquely to joint probability distributions. $M(t)$ is a function of $F(x)$:
Step One: $f(x) = \frac{\partial^p}{\partial x_1 \cdots \partial x_p} F(x)$. For example, $f(x_1,x_2) = \frac{\partial^2}{\partial x_1 \partial x_2} \int_{-\infty}^{x_2}\!\int_{-\infty}^{x_1} f(y_1,y_2)\, dy_1\, dy_2$.
Step Two: $M(t) = \int e^{t'x} f(x)\, dx$.
So we could write $M(t) = g\left(F(x)\right)$. Uniqueness says the function $g$ is one-to-one, so that $F(x) = g^{-1}\left(M(t)\right)$.
5 $g^{-1}(M(t)) = F(x)$: a two-variable example
$g^{-1}\left( \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} e^{x_1 t_1 + x_2 t_2} f(x_1,x_2)\, dx_1\, dx_2 \right) = \int_{-\infty}^{x_2}\!\int_{-\infty}^{x_1} f(y_1,y_2)\, dy_1\, dy_2$
6 Theorem
Two random vectors $x_1$ and $x_2$ are independent if and only if the moment-generating function of their joint distribution is the product of their moment-generating functions.
7 Proof
Two random vectors are independent if and only if the moment-generating function of their joint distribution is the product of their moment-generating functions. That independence implies the MGFs factor is left as an exercise. For the converse, suppose the joint MGF is the product of the marginal MGFs:
$M_{x_1,x_2}(t_1,t_2) = M_{x_1}(t_1)\, M_{x_2}(t_2)$
$= \left( \int e^{t_1'x_1} f_{x_1}(x_1)\, dx_1 \right) \left( \int e^{t_2'x_2} f_{x_2}(x_2)\, dx_2 \right)$
$= \int\!\int e^{t_1'x_1} e^{t_2'x_2} f_{x_1}(x_1) f_{x_2}(x_2)\, dx_1\, dx_2$
$= \int\!\int e^{t_1'x_1 + t_2'x_2} f_{x_1}(x_1) f_{x_2}(x_2)\, dx_1\, dx_2$
8 Proof continued
Have $M_{x_1,x_2}(t_1,t_2) = \int\!\int e^{t_1'x_1 + t_2'x_2} f_{x_1}(x_1) f_{x_2}(x_2)\, dx_1\, dx_2$. Using $F(x) = g^{-1}(M(t))$,
$F(x_1,x_2) = g^{-1}\left( \int\!\int e^{t_1'x_1 + t_2'x_2} f_{x_1}(x_1) f_{x_2}(x_2)\, dx_1\, dx_2 \right)$
$= \int_{-\infty}^{x_2}\!\int_{-\infty}^{x_1} f_{x_1}(y_1) f_{x_2}(y_2)\, dy_1\, dy_2$
$= \int_{-\infty}^{x_2} f_{x_2}(y_2) \left( \int_{-\infty}^{x_1} f_{x_1}(y_1)\, dy_1 \right) dy_2$
$= \int_{-\infty}^{x_2} f_{x_2}(y_2)\, F_{x_1}(x_1)\, dy_2$
$= F_{x_1}(x_1) \int_{-\infty}^{x_2} f_{x_2}(y_2)\, dy_2$
$= F_{x_1}(x_1)\, F_{x_2}(x_2)$
So $x_1$ and $x_2$ are independent.
9 A helpful distinction
If $x_1$ and $x_2$ are independent, then $M_{x_1+x_2}(t) = M_{x_1}(t)\, M_{x_2}(t)$.
$x_1$ and $x_2$ are independent if and only if $M_{x_1,x_2}(t_1,t_2) = M_{x_1}(t_1)\, M_{x_2}(t_2)$.
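The first fact can be illustrated numerically using the univariate normal MGF $e^{\mu t + \frac{1}{2}\sigma^2 t^2}$ (derived later in these slides): for independent normals, the MGF of the sum is exactly the product of the MGFs. A minimal sketch, with arbitrary example parameters:

```python
import math

def normal_mgf(t, mu, sigma2):
    # Univariate normal MGF: M(t) = exp(mu*t + sigma2*t^2/2)
    return math.exp(mu * t + 0.5 * sigma2 * t * t)

# Independent X1 ~ N(1, 2) and X2 ~ N(-3, 5); their sum is N(-2, 7).
for t in (-1.0, -0.3, 0.0, 0.5, 1.2):
    lhs = normal_mgf(t, -2, 7)                        # MGF of X1 + X2
    rhs = normal_mgf(t, 1, 2) * normal_mgf(t, -3, 5)  # product of MGFs
    assert abs(lhs - rhs) < 1e-9 * max(1.0, abs(lhs))
print("MGF of the sum matches the product of MGFs")
```

Here the equality is an algebraic identity, since the means and variances of independent normals add under summation.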
10 Theorem: Functions of independent random vectors are independent
Show that $x_1$ and $x_2$ independent implies that $y_1 = g_1(x_1)$ and $y_2 = g_2(x_2)$ are independent.
Let $y = \binom{y_1}{y_2} = \binom{g_1(x_1)}{g_2(x_2)}$ and $t = \binom{t_1}{t_2}$. Then
$M_y(t) = E\left(e^{t'y}\right) = E\left(e^{t_1'y_1 + t_2'y_2}\right) = E\left(e^{t_1'y_1} e^{t_2'y_2}\right) = E\left(e^{t_1'g_1(x_1)} e^{t_2'g_2(x_2)}\right)$
$= \int\!\int e^{t_1'g_1(x_1)} e^{t_2'g_2(x_2)} f_{x_1}(x_1) f_{x_2}(x_2)\, dx_1\, dx_2$
$= \int e^{t_2'g_2(x_2)} f_{x_2}(x_2) \left( \int e^{t_1'g_1(x_1)} f_{x_1}(x_1)\, dx_1 \right) dx_2$
$= \int e^{t_2'g_2(x_2)} f_{x_2}(x_2)\, M_{g_1(x_1)}(t_1)\, dx_2$
$= M_{g_1(x_1)}(t_1)\, M_{g_2(x_2)}(t_2) = M_{y_1}(t_1)\, M_{y_2}(t_2)$
So $y_1$ and $y_2$ are independent.
11 $M_{Ax}(t) = M_x(A't)$
Analogue of $M_{ax}(t) = M_x(at)$.
$M_{Ax}(t) = E\left(e^{t'Ax}\right) = E\left(e^{(A't)'x}\right) = M_x(A't)$
Note that $t$ is the same length as $y = Ax$: the number of rows in $A$.
12 $M_{x+c}(t) = e^{t'c} M_x(t)$
Analogue of $M_{x+c}(t) = e^{ct} M_x(t)$.
$M_{x+c}(t) = E\left(e^{t'(x+c)}\right) = E\left(e^{t'x + t'c}\right) = e^{t'c}\, E\left(e^{t'x}\right) = e^{t'c}\, M_x(t)$
13 Distributions may be defined in terms of moment-generating functions
Build up the multivariate normal from univariate normals. If $y \sim N(\mu, \sigma^2)$, then
$M_y(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$
Moment-generating functions correspond uniquely to probability distributions, so define a normal random variable with expected value $\mu$ and variance $\sigma^2$ as a random variable with moment-generating function $e^{\mu t + \frac{1}{2}\sigma^2 t^2}$. This has one surprising consequence...
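The closed-form MGF can be checked against its definition $M(t) = E(e^{tY})$ by simulation. A Monte Carlo sketch (the parameters, sample size, and tolerance are arbitrary illustrative choices, not part of the slides):

```python
import numpy as np

# Monte Carlo check of the univariate normal MGF (an illustration, not a proof):
# for Y ~ N(mu, sigma^2), the sample mean of e^{tY} should approach
# M(t) = exp(mu*t + sigma^2 * t^2 / 2).
rng = np.random.default_rng(12345)
mu, sigma2, t, n = 0.5, 2.0, 0.3, 1_000_000

y = rng.normal(mu, np.sqrt(sigma2), size=n)
estimate = np.exp(t * y).mean()
exact = np.exp(mu * t + 0.5 * sigma2 * t**2)

assert abs(estimate - exact) / exact < 0.01  # within 1% at this sample size
```

Note that the Monte Carlo error grows quickly with $|t|$, because $e^{tY}$ becomes heavy-tailed; small $t$ keeps the estimate stable.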
14 Degenerate random variables
A degenerate random variable has all the probability concentrated at a single value, say $Pr\{y = y_0\} = 1$. Then
$M_y(t) = E(e^{yt}) = \sum_y e^{yt} p(y) = e^{y_0 t} p(y_0) = e^{y_0 t} \cdot 1 = e^{y_0 t}$
15 If $Pr\{y = y_0\} = 1$, then $M_y(t) = e^{y_0 t}$
This is of the form $e^{\mu t + \frac{1}{2}\sigma^2 t^2}$ with $\mu = y_0$ and $\sigma^2 = 0$. So $y \sim N(y_0, 0)$. That is, degenerate random variables are normal with variance zero. Call them singular normals. This will be surprisingly handy later.
16 Independent standard normals
Let $z_1, \ldots, z_p \overset{i.i.d.}{\sim} N(0,1)$ and $z = (z_1, \ldots, z_p)'$. Then
$E(z) = 0$ and $cov(z) = I_p$
17 Moment-generating function of $z$
Using $M_{z_j}(t_j) = e^{\frac{1}{2} t_j^2}$,
$M_z(t) = \prod_{j=1}^p M_{z_j}(t_j) = \prod_{j=1}^p e^{\frac{1}{2} t_j^2} = e^{\frac{1}{2} \sum_{j=1}^p t_j^2} = e^{\frac{1}{2} t't}$
18 Transform $z$ to get a general multivariate normal
Remember: $A$ non-negative definite means $v'Av \geq 0$ for all $v$.
Let $\Sigma$ be a $p \times p$ symmetric non-negative definite matrix and $\mu \in \mathbb{R}^p$. Let $y = \Sigma^{1/2} z + \mu$. The elements of $y$ are linear combinations of independent standard normals, and linear combinations of normals should be normal. So $y$ has a multivariate distribution, and we'd like to call $y$ a multivariate normal.
19 Moment-generating function of $y = \Sigma^{1/2} z + \mu$
Remember: $M_{Ax}(t) = M_x(A't)$, $M_{x+c}(t) = e^{t'c} M_x(t)$, and $M_z(t) = e^{\frac{1}{2} t't}$.
$M_y(t) = M_{\Sigma^{1/2} z + \mu}(t)$
$= e^{t'\mu}\, M_{\Sigma^{1/2} z}(t)$
$= e^{t'\mu}\, M_z(\Sigma^{1/2\,\prime} t)$
$= e^{t'\mu}\, M_z(\Sigma^{1/2} t)$ (since $\Sigma^{1/2}$ is symmetric)
$= e^{t'\mu}\, e^{\frac{1}{2} (\Sigma^{1/2} t)'(\Sigma^{1/2} t)}$
$= e^{t'\mu}\, e^{\frac{1}{2} t'\Sigma^{1/2}\Sigma^{1/2} t}$
$= e^{t'\mu}\, e^{\frac{1}{2} t'\Sigma t} = e^{t'\mu + \frac{1}{2} t'\Sigma t}$
So define a multivariate normal random variable $y$ as one with moment-generating function $M_y(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}$.
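The construction $y = \Sigma^{1/2} z + \mu$ can be carried out directly. A sketch, assuming the standard eigendecomposition route to the symmetric square root (the particular $\mu$, $\Sigma$, seed, and tolerances are illustrative choices):

```python
import numpy as np

# Sketch of the construction y = Sigma^{1/2} z + mu.
rng = np.random.default_rng(302)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

# Symmetric non-negative definite square root: Sigma^{1/2} = V diag(sqrt(lam)) V'
lam, V = np.linalg.eigh(Sigma)
root = V @ np.diag(np.sqrt(lam)) @ V.T
assert np.allclose(root @ root, Sigma)  # Sigma^{1/2} Sigma^{1/2} = Sigma

# Transform iid standard normals: each column of z is one p-vector.
n = 200_000
z = rng.standard_normal((2, n))
y = root @ z + mu[:, None]

# Sample moments should approach mu and Sigma.
assert np.allclose(y.mean(axis=1), mu, atol=0.02)
assert np.allclose(np.cov(y), Sigma, atol=0.03)
```

The eigendecomposition also handles the singular case: zero eigenvalues simply produce components with zero variance.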
20 Compare univariate and multivariate normal moment-generating functions
Univariate: $M_y(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$
Multivariate: $M_y(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}$
So the univariate normal is a special case of the multivariate normal with $p = 1$.
21 Mean and covariance matrix
For a univariate normal, $E(y) = \mu$ and $Var(y) = \sigma^2$. Recall $y = \Sigma^{1/2} z + \mu$. Then
$E(y) = \mu$
$cov(y) = \Sigma^{1/2}\, cov(z)\, \Sigma^{1/2\,\prime} = \Sigma^{1/2} I\, \Sigma^{1/2} = \Sigma$
We will say $y$ is multivariate normal with expected value $\mu$ and variance-covariance matrix $\Sigma$, and write $y \sim N_p(\mu, \Sigma)$. Note that because $M_y(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}$, $\mu$ and $\Sigma$ completely determine the distribution.
22 Probability density function of $y \sim N_p(\mu, \Sigma)$
Remember, $\Sigma$ is only required to be positive semi-definite. It is easy to write down the density of $z \sim N_p(0, I)$ as a product of standard normals. If $\Sigma$ is strictly positive definite (and not otherwise), the density of $y = \Sigma^{1/2} z + \mu$ can be obtained using the Jacobian Theorem as
$f(y) = \frac{1}{|\Sigma|^{1/2} (2\pi)^{p/2}} \exp\left\{ -\frac{1}{2} (y - \mu)'\Sigma^{-1}(y - \mu) \right\}$
This is usually how the multivariate normal is defined.
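The density formula can be checked against a library implementation. A sketch comparing a direct transcription of the formula to SciPy's `multivariate_normal.pdf` (the parameters and evaluation points are arbitrary):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Evaluate f(y) = |Sigma|^{-1/2} (2 pi)^{-p/2} exp{-(1/2)(y-mu)' Sigma^{-1} (y-mu)}
# directly and compare with SciPy at a few points.
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)
det = np.linalg.det(Sigma)
p = len(mu)

def mvn_pdf(y):
    d = y - mu
    return np.exp(-0.5 * d @ Sigma_inv @ d) / (np.sqrt(det) * (2 * np.pi) ** (p / 2))

for y in (np.array([0.0, 0.0]), np.array([1.0, -2.0]), np.array([2.5, -1.0])):
    assert np.isclose(mvn_pdf(y), multivariate_normal(mu, Sigma).pdf(y))
print("density formula agrees with scipy.stats.multivariate_normal")
```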
23 $\Sigma$ positive definite?
Positive definite means that for any non-zero $p \times 1$ vector $a$, we have $a'\Sigma a > 0$. Since the one-dimensional random variable $w = \sum_{i=1}^p a_i y_i$ may be written as $w = a'y$ and $Var(w) = cov(a'y) = a'\Sigma a$, it is natural to require that $\Sigma$ be positive definite. All it means is that every non-zero linear combination of $y$ values has a positive variance. Often, this is what you want.
24 Singular normal: $\Sigma$ is positive semi-definite
Suppose there is $a \neq 0$ with $a'\Sigma a = 0$. Let $w = a'y$. Then $Var(w) = cov(a'y) = a'\Sigma a = 0$. That is, $w$ has a degenerate distribution (but it's still normal). In this case we describe the distribution of $y$ as a singular multivariate normal. Including the singular case saves a lot of extra work in later proofs. We will insist that a singular multivariate normal is still multivariate normal, even though it has no density.
25 Distribution of $Ay$
Recall $y \sim N_p(\mu, \Sigma)$ means $M_y(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}$.
Let $y \sim N_p(\mu, \Sigma)$, and $w = Ay$, where $A$ is an $r \times p$ matrix.
$M_w(t) = M_{Ay}(t) = M_y(A't) = e^{(A't)'\mu}\, e^{\frac{1}{2} (A't)'\Sigma (A't)} = e^{t'(A\mu)}\, e^{\frac{1}{2} t'(A\Sigma A')t} = e^{t'(A\mu) + \frac{1}{2} t'(A\Sigma A')t}$
Recognize the moment-generating function and conclude $w \sim N_r(A\mu, A\Sigma A')$.
26 Exercise
Use moment-generating functions, of course. Let $y \sim N_p(\mu, \Sigma)$. Show $y + c \sim N_p(\mu + c, \Sigma)$.
27 Zero covariance implies independence for the multivariate normal
Independence always implies zero covariance. For the multivariate normal, zero covariance also implies independence. The multivariate normal is the only continuous distribution with this property.
28 Show zero covariance implies independence
By showing $M_y(t) = M_{y_1}(t_1)\, M_{y_2}(t_2)$. Let $y \sim N(\mu, \Sigma)$, with
$y = \binom{y_1}{y_2}$, $\mu = \binom{\mu_1}{\mu_2}$, $\Sigma = \begin{pmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{pmatrix}$, $t = \binom{t_1}{t_2}$
$M_y(t) = E\left(e^{t'y}\right) = E\left(e^{(t_1'\ t_2')\, y}\right) = \ldots$
29 Continuing the calculation: $M_y(t) = e^{t'\mu + \frac{1}{2} t'\Sigma t}$
With $y = \binom{y_1}{y_2}$, $\mu = \binom{\mu_1}{\mu_2}$, $\Sigma = \begin{pmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{pmatrix}$, $t = \binom{t_1}{t_2}$,
$M_y(t) = E\left(e^{(t_1'\ t_2')\, y}\right)$
$= \exp\left\{ (t_1'\ t_2') \binom{\mu_1}{\mu_2} \right\} \exp\left\{ \frac{1}{2} (t_1'\ t_2') \begin{pmatrix} \Sigma_1 & 0 \\ 0 & \Sigma_2 \end{pmatrix} \binom{t_1}{t_2} \right\}$
$= e^{t_1'\mu_1 + t_2'\mu_2} \exp\left\{ \frac{1}{2} (t_1'\Sigma_1 \ \ t_2'\Sigma_2) \binom{t_1}{t_2} \right\}$
$= e^{t_1'\mu_1 + t_2'\mu_2} \exp\left\{ \frac{1}{2} \left( t_1'\Sigma_1 t_1 + t_2'\Sigma_2 t_2 \right) \right\}$
$= e^{t_1'\mu_1}\, e^{t_2'\mu_2}\, e^{\frac{1}{2} t_1'\Sigma_1 t_1}\, e^{\frac{1}{2} t_2'\Sigma_2 t_2}$
$= e^{t_1'\mu_1 + \frac{1}{2} t_1'\Sigma_1 t_1} \cdot e^{t_2'\mu_2 + \frac{1}{2} t_2'\Sigma_2 t_2}$
$= M_{y_1}(t_1)\, M_{y_2}(t_2)$
So $y_1$ and $y_2$ are independent.
30 An easy example
If you do it the easy way. Let $y_1 \sim N(1,2)$, $y_2 \sim N(2,4)$ and $y_3 \sim N(6,3)$ be independent, with $w_1 = y_1 + y_2$ and $w_2 = y_2 + y_3$. Find the joint distribution of $w_1$ and $w_2$.
$\binom{w_1}{w_2} = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix}$
$w = Ay \sim N(A\mu, A\Sigma A')$
31 $w = Ay \sim N(A\mu, A\Sigma A')$
$y_1 \sim N(1,2)$, $y_2 \sim N(2,4)$ and $y_3 \sim N(6,3)$ are independent, so $\mu = (1, 2, 6)'$ and $\Sigma = diag(2, 4, 3)$.
$A\mu = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \\ 6 \end{pmatrix} = \begin{pmatrix} 3 \\ 8 \end{pmatrix}$
$A\Sigma A' = \begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} 2 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 3 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 6 & 4 \\ 4 & 7 \end{pmatrix}$
So $w \sim N_2\left( \begin{pmatrix} 3 \\ 8 \end{pmatrix}, \begin{pmatrix} 6 & 4 \\ 4 & 7 \end{pmatrix} \right)$.
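The matrix arithmetic in this example is easy to check by machine. A two-line verification with NumPy:

```python
import numpy as np

# The example: w = A y, with A mapping (y1, y2, y3) to (y1 + y2, y2 + y3),
# mu = (1, 2, 6)' and Sigma = diag(2, 4, 3).
A = np.array([[1, 1, 0],
              [0, 1, 1]])
mu = np.array([1, 2, 6])
Sigma = np.diag([2, 4, 3])

print(A @ mu)           # expected value of w: [3 8]
print(A @ Sigma @ A.T)  # covariance of w: [[6 4] [4 7]]
```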
32 Marginal distributions are multivariate normal
$y \sim N_p(\mu, \Sigma)$, so $w = Ay \sim N(A\mu, A\Sigma A')$. Find the distribution of
$\begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{pmatrix} = \binom{y_2}{y_4}$
Bivariate normal. The expected value is easy.
33 Covariance matrix of $Ay$
$cov(Ay) = A\Sigma A' = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} \sigma_1^2 & \sigma_{1,2} & \sigma_{1,3} & \sigma_{1,4} \\ \sigma_{1,2} & \sigma_2^2 & \sigma_{2,3} & \sigma_{2,4} \\ \sigma_{1,3} & \sigma_{2,3} & \sigma_3^2 & \sigma_{3,4} \\ \sigma_{1,4} & \sigma_{2,4} & \sigma_{3,4} & \sigma_4^2 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 0 \\ 0 & 1 \end{pmatrix}$
$= \begin{pmatrix} \sigma_{1,2} & \sigma_2^2 & \sigma_{2,3} & \sigma_{2,4} \\ \sigma_{1,4} & \sigma_{2,4} & \sigma_{3,4} & \sigma_4^2 \end{pmatrix} \begin{pmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} \sigma_2^2 & \sigma_{2,4} \\ \sigma_{2,4} & \sigma_4^2 \end{pmatrix}$
Marginal distributions of a multivariate normal are multivariate normal, with the original means, variances and covariances.
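In code, multiplying by a selector matrix amounts to picking the corresponding rows and columns of $\Sigma$. A small sketch with an arbitrary illustrative covariance matrix:

```python
import numpy as np

# Marginal covariance of (y2, y4): pick rows and columns 2 and 4 of Sigma
# (0-based indices 1 and 3). Sigma here is an arbitrary example.
Sigma = np.array([[4.0, 1.0, 0.5, 0.2],
                  [1.0, 3.0, 0.7, 0.6],
                  [0.5, 0.7, 2.0, 0.3],
                  [0.2, 0.6, 0.3, 1.0]])
idx = [1, 3]
marginal_cov = Sigma[np.ix_(idx, idx)]
print(marginal_cov)  # [[3.  0.6] [0.6 1. ]]

# Equivalent to A Sigma A' with A the selector matrix from the slide:
A = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1]])
assert np.array_equal(A @ Sigma @ A.T, marginal_cov)
```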
34 Summary
- If $c$ is a vector of constants, $x + c \sim N(c + \mu, \Sigma)$.
- If $A$ is a matrix of constants, $Ax \sim N(A\mu, A\Sigma A')$.
- Linear combinations of multivariate normals are multivariate normal.
- All the marginals (of dimension less than $p$) of $x$ are (multivariate) normal, but it is possible in theory to have a collection of univariate normals whose joint distribution is not multivariate normal.
- For the multivariate normal, zero covariance implies independence. The multivariate normal is the only continuous distribution with this property.
35 Showing $(x - \mu)'\Sigma^{-1}(x - \mu) \sim \chi^2(p)$
$\Sigma$ has to be positive definite this time.
$x \sim N_p(\mu, \Sigma)$
$y = x - \mu \sim N_p(0, \Sigma)$
$z = \Sigma^{-\frac{1}{2}} y \sim N_p\left(0, \Sigma^{-\frac{1}{2}} \Sigma \Sigma^{-\frac{1}{2}}\right) = N_p\left(0, \Sigma^{-\frac{1}{2}} \Sigma^{\frac{1}{2}} \Sigma^{\frac{1}{2}} \Sigma^{-\frac{1}{2}}\right) = N_p(0, I)$
So $z$ is a vector of $p$ independent standard normals, and
$y'\Sigma^{-1} y = \left(\Sigma^{-\frac{1}{2}} y\right)'\left(\Sigma^{-\frac{1}{2}} y\right) = z'z = \sum_{j=1}^p z_j^2 \sim \chi^2(p)$
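A Monte Carlo illustration of this result: the simulated quadratic forms should have the mean $p$ and variance $2p$ of a $\chi^2(p)$ distribution. The parameters, sample size, and tolerances below are arbitrary illustrative choices:

```python
import numpy as np

# For x ~ N_p(mu, Sigma), the quadratic form (x - mu)' Sigma^{-1} (x - mu)
# should behave like chi-squared with p degrees of freedom.
rng = np.random.default_rng(40)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.5, 0.2],
                  [0.3, 0.2, 1.0]])
p, n = 3, 200_000

x = rng.multivariate_normal(mu, Sigma, size=n)
d = x - mu
# One quadratic form per row of d: q_i = d_i' Sigma^{-1} d_i
q = np.einsum('ij,jk,ik->i', d, np.linalg.inv(Sigma), d)

assert abs(q.mean() - p) < 0.05    # E(chi^2_p) = p
assert abs(q.var() - 2 * p) < 0.3  # Var(chi^2_p) = 2p
```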
36 $\bar{x}$ and $s^2$ independent
$x_1, \ldots, x_n \overset{i.i.d.}{\sim} N(\mu, \sigma^2)$, so
$x = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix} \sim N\left(\mu \mathbf{1}, \sigma^2 I\right)$
$y = \begin{pmatrix} x_1 - \bar{x} \\ \vdots \\ x_n - \bar{x} \\ \bar{x} \end{pmatrix} = Ax$
Note $A$ is $(n+1) \times n$, so $cov(Ax) = \sigma^2 AA'$ is $(n+1) \times (n+1)$ and singular.
37 The argument
$y = Ax = \begin{pmatrix} x_1 - \bar{x} \\ \vdots \\ x_n - \bar{x} \\ \bar{x} \end{pmatrix} = \binom{y_2}{\bar{x}}$
$y$ is multivariate normal because $x$ is multivariate normal.
$Cov\left(\bar{x}, (x_j - \bar{x})\right) = 0$ (Exercise)
So $\bar{x}$ and $y_2$ are independent. So $\bar{x}$ and $S^2 = g(y_2)$ are independent.
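A quick simulation consistent with this result: across many normal samples, the sample mean and sample variance should be (nearly) uncorrelated. Zero correlation is a necessary consequence of independence, not a proof of it; the parameters and tolerance are illustrative:

```python
import numpy as np

# Simulate many samples of size n and check that xbar and S^2 are
# (nearly) uncorrelated, as independence requires.
rng = np.random.default_rng(2101)
mu, sigma, n, reps = 10.0, 2.0, 5, 100_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)  # sample variance S^2

corr = np.corrcoef(xbar, s2)[0, 1]
assert abs(corr) < 0.02  # consistent with independence
```

For a non-normal population (say, exponential), the same experiment produces a clearly nonzero correlation, which is one way to see that this independence is special to the normal.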
38 Leads to the $t$ distribution
If $z \sim N(0,1)$ and $y \sim \chi^2(\nu)$, and $z$ and $y$ are independent, then we say
$T = \frac{z}{\sqrt{y/\nu}} \sim t(\nu)$
39 Random sample from a normal distribution
Let $x_1, \ldots, x_n \overset{i.i.d.}{\sim} N(\mu, \sigma^2)$. Then
$\frac{\sqrt{n}(\bar{x} - \mu)}{\sigma} \sim N(0,1)$ and $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$
and these quantities are independent, so
$T = \frac{\sqrt{n}(\bar{x} - \mu)/\sigma}{\sqrt{\frac{(n-1)S^2/\sigma^2}{n-1}}} = \frac{\sqrt{n}(\bar{x} - \mu)}{S} \sim t(n-1)$
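The cancellation of $\sigma$ in this ratio is purely algebraic, so it holds for any sample, not just on average. A sketch verifying it numerically on one arbitrary simulated sample:

```python
import numpy as np

# For any sample: z / sqrt(y / nu), with z = sqrt(n)(xbar - mu)/sigma and
# y = (n-1) S^2 / sigma^2, collapses to sqrt(n)(xbar - mu)/S exactly.
rng = np.random.default_rng(39)
mu, sigma, n = 5.0, 3.0, 12
x = rng.normal(mu, sigma, size=n)

xbar = x.mean()
s = x.std(ddof=1)

z = np.sqrt(n) * (xbar - mu) / sigma
y = (n - 1) * s**2 / sigma**2
T_ratio = z / np.sqrt(y / (n - 1))
T_direct = np.sqrt(n) * (xbar - mu) / s

assert np.isclose(T_ratio, T_direct)
```

This is why $T$ can be computed from the data alone: the unknown $\sigma$ cancels out of the ratio.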
40 Copyright Information
This slide show was prepared by Jerry Brunner, Department of Statistical Sciences, University of Toronto. It is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License. Use any part of it as you like and share the result freely. The LaTeX source code is available from the course website: brunner/oldclass/302f17
More informationInteractions and Factorial ANOVA
Interactions and Factorial ANOVA STA442/2101 F 2018 See last slide for copyright information 1 Interactions Interaction between explanatory variables means It depends. Relationship between one explanatory
More information, find P(X = 2 or 3) et) 5. )px (1 p) n x x = 0, 1, 2,..., n. 0 elsewhere = 40
Assignment 4 Fall 07. Exercise 3.. on Page 46: If the mgf of a rom variable X is ( 3 + 3 et) 5, find P(X or 3). Since the M(t) of X is ( 3 + 3 et) 5, X has a binomial distribution with n 5, p 3. The probability
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationPart 6: Multivariate Normal and Linear Models
Part 6: Multivariate Normal and Linear Models 1 Multiple measurements Up until now all of our statistical models have been univariate models models for a single measurement on each member of a sample of
More informationSTA 431s17 Assignment Eight 1
STA 43s7 Assignment Eight The first three questions of this assignment are about how instrumental variables can help with measurement error and omitted variables at the same time; see Lecture slide set
More informationMultivariate Analysis and Likelihood Inference
Multivariate Analysis and Likelihood Inference Outline 1 Joint Distribution of Random Variables 2 Principal Component Analysis (PCA) 3 Multivariate Normal Distribution 4 Likelihood Inference Joint density
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More informationSOLUTION FOR HOMEWORK 12, STAT 4351
SOLUTION FOR HOMEWORK 2, STAT 435 Welcome to your 2th homework. It looks like this is the last one! As usual, try to find mistakes and get extra points! Now let us look at your problems.. Problem 7.22.
More information5. Random Vectors. probabilities. characteristic function. cross correlation, cross covariance. Gaussian random vectors. functions of random vectors
EE401 (Semester 1) 5. Random Vectors Jitkomut Songsiri probabilities characteristic function cross correlation, cross covariance Gaussian random vectors functions of random vectors 5-1 Random vectors we
More informationExpectation, variance and moments
Expectation, variance and moments John Appleby Contents Expectation and variance Examples 3 Moments and the moment generating function 4 4 Examples of moment generating functions 5 5 Concluding remarks
More informationMultivariate Analysis Homework 1
Multivariate Analysis Homework A490970 Yi-Chen Zhang March 6, 08 4.. Consider a bivariate normal population with µ = 0, µ =, σ =, σ =, and ρ = 0.5. a Write out the bivariate normal density. b Write out
More information7 Random samples and sampling distributions
7 Random samples and sampling distributions 7.1 Introduction - random samples We will use the term experiment in a very general way to refer to some process, procedure or natural phenomena that produces
More informationStat 5101 Notes: Algorithms
Stat 5101 Notes: Algorithms Charles J. Geyer January 22, 2016 Contents 1 Calculating an Expectation or a Probability 3 1.1 From a PMF........................... 3 1.2 From a PDF...........................
More informationLecture 15: Multivariate normal distributions
Lecture 15: Multivariate normal distributions Normal distributions with singular covariance matrices Consider an n-dimensional X N(µ,Σ) with a positive definite Σ and a fixed k n matrix A that is not of
More informationMA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems
MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions
More information