CHAPTER 1: DISTRIBUTION THEORY
- Johnathan O'Brien
Slide 2: Basic Concepts
Slide 3: Random Variables (R.V.)
- discrete random variable: probability mass function
- continuous random variable: probability density function
Slide 4: Formal but Scary Definition of Random Variables
- R.V.s are measurable functions from a probability measure space to the real line;
- probability is a non-negative value assigned to sets of a σ-field;
- a probability mass function is the Radon-Nikodym derivative of the measure induced by the random variable w.r.t. counting measure;
- a probability density function is the derivative w.r.t. Lebesgue measure.
Slide 5: Descriptive quantities of a univariate distribution
- cumulative distribution function: $P(X \le x)$
- moments (mean): $E[X^k]$
- quantiles, mode
- centralized moments (variance): $E[(X-\mu)^k]$
- skewness: $E[(X-\mu)^3]/\mathrm{Var}(X)^{3/2}$
- kurtosis: $E[(X-\mu)^4]/\mathrm{Var}(X)^2$
Slide 6: Characteristic function (c.f.)
- $\varphi_X(t) = E[\exp\{itX\}]$
- the c.f. uniquely determines the distribution:
  $F_X(b) - F_X(a) = \lim_{T\to\infty} \frac{1}{2\pi} \int_{-T}^{T} \frac{e^{-ita} - e^{-itb}}{it}\,\varphi_X(t)\,dt$
- inversion for the density: $f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-itx}\varphi_X(t)\,dt$
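The density-inversion formula can be checked numerically. The sketch below (not part of the original slides; the truncation point T and step count are arbitrary choices) recovers the N(0,1) density from its characteristic function $\varphi(t) = e^{-t^2/2}$ via a midpoint Riemann sum:

```python
import cmath
import math

def cf_normal(t):
    # characteristic function of N(0,1): phi(t) = exp(-t^2/2)
    return cmath.exp(-0.5 * t * t)

def density_from_cf(x, phi, T=10.0, steps=20000):
    # midpoint-rule approximation of f(x) = (1/2pi) int e^{-itx} phi(t) dt,
    # truncating the integral to [-T, T]
    h = 2 * T / steps
    total = 0j
    for k in range(steps):
        t = -T + (k + 0.5) * h
        total += cmath.exp(-1j * t * x) * phi(t)
    return (total * h / (2 * math.pi)).real

# recover the N(0,1) density at x = 0: exact value is 1/sqrt(2*pi)
f0 = density_from_cf(0.0, cf_normal)
```

The same call with other characteristic functions (e.g. the Cauchy c.f. on slide 20) recovers those densities as well.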
Slide 7: Descriptive quantities of multivariate R.V.s
- $\mathrm{Cov}(X, Y)$, $\mathrm{Corr}(X, Y)$
- conditional density: $f_{X|Y}(x|y) = f_{X,Y}(x, y)/f_Y(y)$
- conditional mean: $E[X|Y=y] = \int x f_{X|Y}(x|y)\,dx$
- independence: $P(X \le x, Y \le y) = P(X \le x)P(Y \le y)$, $f_{X,Y}(x, y) = f_X(x) f_Y(y)$
Slide 8: Equalities of double expectations
- $E[X] = E[E[X|Y]]$
- $\mathrm{Var}(X) = E[\mathrm{Var}(X|Y)] + \mathrm{Var}(E[X|Y])$
- useful when the distribution of X given Y is simple!
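As a concrete sanity check (added here, not in the slides), both identities can be verified exactly for a small discrete model of my own choosing, Y uniform on {1, 2, 3} and X | Y = y ~ Binomial(y, 1/2), by enumerating the joint pmf with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

# X | Y = y ~ Binomial(y, 1/2), Y uniform on {1, 2, 3}; enumerate the joint pmf
p = Fraction(1, 2)
joint = {}
for y in (1, 2, 3):
    for x in range(y + 1):
        joint[(x, y)] = Fraction(1, 3) * comb(y, x) * p**x * (1 - p)**(y - x)

# direct mean and variance of X from the joint pmf
EX = sum(prob * x for (x, y), prob in joint.items())
EX2 = sum(prob * x * x for (x, y), prob in joint.items())
VarX = EX2 - EX**2

# tower property: E[X] = E[E[X|Y]] = E[Y/2]
EX_tower = sum(Fraction(1, 3) * Fraction(y, 2) for y in (1, 2, 3))
# variance decomposition: Var(X) = E[Var(X|Y)] + Var(E[X|Y]) = E[Y/4] + Var(Y/2)
E_condvar = sum(Fraction(1, 3) * Fraction(y, 4) for y in (1, 2, 3))
EY2_over4 = sum(Fraction(1, 3) * Fraction(y * y, 4) for y in (1, 2, 3))
Var_condmean = EY2_over4 - EX_tower**2
```

Because Binomial conditional moments are simple ($E[X|Y] = Y/2$, $\mathrm{Var}(X|Y) = Y/4$), the two-stage computation is much shorter than the direct enumeration, which is exactly the point of the slide.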
Slide 9: Examples of Discrete Distributions
Slide 10: Binomial distribution
- Binomial(n, p): $P(S_n = k) = \binom{n}{k} p^k (1-p)^{n-k}$, $k = 0, \ldots, n$
- $E[S_n] = np$, $\mathrm{Var}(S_n) = np(1-p)$
- $\varphi_{S_n}(t) = (1 - p + pe^{it})^n$
- $\mathrm{Binomial}(n_1, p) + \mathrm{Binomial}(n_2, p) \sim \mathrm{Binomial}(n_1 + n_2, p)$ (independent summands)
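The additivity property says that convolving the two pmfs reproduces the Binomial(n_1 + n_2, p) pmf. A short sketch of that check (the illustrative values n_1 = 4, n_2 = 6, p = 0.3 are my own choice, not from the slides):

```python
from math import comb

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def convolve(pmf1, pmf2):
    # distribution of the sum of two independent integer-valued r.v.s
    out = {}
    for k1, p1 in pmf1.items():
        for k2, p2 in pmf2.items():
            out[k1 + k2] = out.get(k1 + k2, 0.0) + p1 * p2
    return out

n1, n2, p = 4, 6, 0.3
pmf_a = {k: binom_pmf(n1, p, k) for k in range(n1 + 1)}
pmf_b = {k: binom_pmf(n2, p, k) for k in range(n2 + 1)}
pmf_sum = convolve(pmf_a, pmf_b)   # should equal Binomial(10, 0.3)
```

The same `convolve` helper verifies the analogous closure properties for the negative binomial and Poisson distributions on the next slides.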
Slide 11: Negative Binomial distribution
- $P(W_m = k) = \binom{k-1}{m-1} p^m (1-p)^{k-m}$, $k = m, m+1, \ldots$
- $E[W_m] = m/p$, $\mathrm{Var}(W_m) = m/p^2 - m/p$
- $\mathrm{NegBinomial}(m_1, p) + \mathrm{NegBinomial}(m_2, p) \sim \mathrm{NegBinomial}(m_1 + m_2, p)$ (independent summands)
Slide 12: Hypergeometric distribution
- $P(S_n = k) = \binom{M}{k}\binom{N}{n-k} \big/ \binom{M+N}{n}$, $k = 0, 1, \ldots, n$
- $E[S_n] = Mn/(M+N)$
- $\mathrm{Var}(S_n) = \frac{nMN(M+N-n)}{(M+N)^2(M+N-1)}$
Slide 13: Poisson distribution
- $P(X = k) = \lambda^k e^{-\lambda}/k!$, $k = 0, 1, 2, \ldots$
- $E[X] = \mathrm{Var}(X) = \lambda$, $\varphi_X(t) = \exp\{-\lambda(1 - e^{it})\}$
- $\mathrm{Poisson}(\lambda_1) + \mathrm{Poisson}(\lambda_2) \sim \mathrm{Poisson}(\lambda_1 + \lambda_2)$ (independent summands)
Slide 14: Poisson vs Binomial
- if $X_1 \sim \mathrm{Poisson}(\lambda_1)$ and $X_2 \sim \mathrm{Poisson}(\lambda_2)$ are independent, then $X_1 \mid X_1 + X_2 = n \sim \mathrm{Binomial}\big(n, \frac{\lambda_1}{\lambda_1 + \lambda_2}\big)$
- if $X_{n1}, \ldots, X_{nn}$ are i.i.d. $\mathrm{Bernoulli}(p_n)$ and $np_n \to \lambda$, then
  $P(S_n = k) = \frac{n!}{k!(n-k)!} p_n^k (1-p_n)^{n-k} \to \frac{\lambda^k}{k!} e^{-\lambda}$
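The Poisson limit can be observed numerically: holding λ = np_n fixed while increasing n shrinks the pointwise gap between the Binomial(n, λ/n) and Poisson(λ) pmfs. A sketch (the comparison values n = 10 vs n = 1000 and λ = 2 are arbitrary illustrative choices):

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return lam**k * exp(-lam) / factorial(k)

def max_pmf_gap(n, lam, kmax=60):
    # largest pointwise gap between Binomial(n, lam/n) and Poisson(lam)
    p = lam / n
    return max(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
               for k in range(min(n, kmax) + 1))

gap_small_n = max_pmf_gap(10, 2.0)     # crude approximation
gap_large_n = max_pmf_gap(1000, 2.0)   # much closer
```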
Slide 15: Multinomial Distribution
- $N_l$, $1 \le l \le k$, counts the number of times that $\{Y_1, \ldots, Y_n\}$ fall into $B_l$
- $P(N_1 = n_1, \ldots, N_k = n_k) = \binom{n}{n_1, \ldots, n_k} p_1^{n_1} \cdots p_k^{n_k}$, $n_1 + \cdots + n_k = n$
- the covariance matrix of $(N_1, \ldots, N_k)$:
  $n \begin{pmatrix} p_1(1-p_1) & \cdots & -p_1 p_k \\ \vdots & \ddots & \vdots \\ -p_1 p_k & \cdots & p_k(1-p_k) \end{pmatrix}$
Slide 16: Examples of Continuous Distributions
Slide 17: Uniform distribution
- Uniform(a, b): $f_X(x) = I_{[a,b]}(x)/(b-a)$
- $E[X] = (a+b)/2$ and $\mathrm{Var}(X) = (b-a)^2/12$
Slide 18: Normal distribution
- $N(\mu, \sigma^2)$: $f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\big\{-\frac{(x-\mu)^2}{2\sigma^2}\big\}$
- $E[X] = \mu$ and $\mathrm{Var}(X) = \sigma^2$
- $\varphi_X(t) = \exp\{it\mu - \sigma^2 t^2/2\}$
Slide 19: Gamma distribution
- Gamma(θ, β): $f_X(x) = \frac{1}{\beta^\theta \Gamma(\theta)} x^{\theta-1} \exp\{-x/\beta\}$, $x > 0$
- $E[X] = \theta\beta$ and $\mathrm{Var}(X) = \theta\beta^2$
- θ = 1: equivalent to Exp(β); θ = n/2, β = 2: equivalent to $\chi^2_n$
Slide 20: Cauchy distribution
- Cauchy(a, b): $f_X(x) = \frac{1}{b\pi\{1 + (x-a)^2/b^2\}}$
- $E[|X|] = \infty$, $\varphi_X(t) = \exp\{iat - b|t|\}$
- often used as a counterexample
Slide 21: Algebra of Random Variables
Slide 22: Assumption
- X and Y are independent and Y > 0
- we are interested in the d.f. of X + Y, XY, and X/Y
Slide 23: Summation of X and Y
Derivation:
$F_{X+Y}(z) = E[I(X + Y \le z)] = E_Y[E_X[I(X \le z - Y)|Y]] = E_Y[F_X(z - Y)] = \int F_X(z - y)\,dF_Y(y)$
Convolution formula:
$F_{X+Y}(z) = \int F_X(z - y)\,dF_Y(y) = \int F_Y(z - x)\,dF_X(x) \equiv F_X * F_Y(z)$
$f_{X+Y}(z) = f_X * f_Y(z) = \int f_X(z - y) f_Y(y)\,dy = \int f_Y(z - x) f_X(x)\,dx$
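The density convolution formula can be exercised numerically. The sketch below (an addition, not from the slides) convolves two Uniform(0,1) densities by a midpoint Riemann sum and compares the result with the known triangular density of the sum on [0, 2]:

```python
def f_uniform(x):
    # density of Uniform(0, 1)
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def conv_density(f, g, z, steps=20000):
    # numerical convolution (f * g)(z) = int f(z - y) g(y) dy;
    # integrating y over [0, 1] suffices because g vanishes outside it
    h = 1.0 / steps
    return sum(f(z - (k + 0.5) * h) * g((k + 0.5) * h)
               for k in range(steps)) * h

def triangular(z):
    # exact density of the sum of two independent Uniform(0,1) r.v.s
    return z if 0.0 <= z <= 1.0 else (2.0 - z if 1.0 < z <= 2.0 else 0.0)
```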
Slide 24: Product and quotient of X and Y (Y > 0)
- $F_{XY}(z) = E[E[I(XY \le z)|Y]] = \int F_X(z/y)\,dF_Y(y)$, so $f_{XY}(z) = \int f_X(z/y)\,y^{-1} f_Y(y)\,dy$
- $F_{X/Y}(z) = E[E[I(X/Y \le z)|Y]] = \int F_X(yz)\,dF_Y(y)$, so $f_{X/Y}(z) = \int f_X(yz)\,y f_Y(y)\,dy$
Slide 25: Application of the formulae (independent summands)
- $N(\mu_1, \sigma_1^2) + N(\mu_2, \sigma_2^2) \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$
- $\mathrm{Gamma}(r_1, \theta) + \mathrm{Gamma}(r_2, \theta) \sim \mathrm{Gamma}(r_1 + r_2, \theta)$
- $\mathrm{Poisson}(\lambda_1) + \mathrm{Poisson}(\lambda_2) \sim \mathrm{Poisson}(\lambda_1 + \lambda_2)$
- $\mathrm{NegBinomial}(m_1, p) + \mathrm{NegBinomial}(m_2, p) \sim \mathrm{NegBinomial}(m_1 + m_2, p)$
Slide 26: Summation of R.V.s using the c.f.
Result: if X and Y are independent, then $\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)$.
Example: X and Y normal with $\varphi_X(t) = \exp\{i\mu_1 t - \sigma_1^2 t^2/2\}$ and $\varphi_Y(t) = \exp\{i\mu_2 t - \sigma_2^2 t^2/2\}$ give
$\varphi_{X+Y}(t) = \exp\{i(\mu_1 + \mu_2)t - (\sigma_1^2 + \sigma_2^2)t^2/2\}$.
Slide 27: Further examples of special distributions
Assume $X \sim N(0, 1)$, $Y \sim \chi^2_m$ and $Z \sim \chi^2_n$ are independent; then
- $\frac{X}{\sqrt{Y/m}} \sim$ Student's $t(m)$,
- $\frac{Y/m}{Z/n} \sim$ Snedecor's $F_{m,n}$,
- $\frac{Y}{Y + Z} \sim \mathrm{Beta}(m/2, n/2)$.
Slide 28: Densities of the t, F and Beta distributions
- $f_{t(m)}(x) = \frac{\Gamma((m+1)/2)}{\sqrt{\pi m}\,\Gamma(m/2)} \frac{1}{(1 + x^2/m)^{(m+1)/2}} I_{(-\infty,\infty)}(x)$
- $f_{F_{m,n}}(x) = \frac{\Gamma((m+n)/2)}{\Gamma(m/2)\Gamma(n/2)} (m/n)^{m/2} \frac{x^{m/2-1}}{(1 + mx/n)^{(m+n)/2}} I_{(0,\infty)}(x)$
- $f_{\mathrm{Beta}(a,b)}(x) = \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)} x^{a-1}(1-x)^{b-1} I(x \in (0, 1))$
Slide 29: Exponential vs Beta distributions
- assume $Y_1, \ldots, Y_{n+1}$ are i.i.d. Exp(θ);
- $Z_i = \frac{Y_1 + \cdots + Y_i}{Y_1 + \cdots + Y_{n+1}} \sim \mathrm{Beta}(i, n - i + 1)$;
- $(Z_1, \ldots, Z_n)$ has the same joint distribution as the order statistics $(\xi_{n:1}, \ldots, \xi_{n:n})$ of n Uniform(0,1) r.v.s.
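Both constructions can be simulated side by side; the i-th normalized partial sum and the i-th uniform order statistic should both have the Beta(i, n-i+1) mean i/(n+1). A seeded Monte Carlo sketch (n = 5 and the replication count are arbitrary; by scale invariance of the ratio, the rate of the exponentials does not matter, so rate 1 is used):

```python
import random

random.seed(0)
n, reps = 5, 20000
sum_Z = [0.0] * n
sum_order = [0.0] * n
for _ in range(reps):
    # normalized partial sums of n+1 i.i.d. exponentials
    Y = [random.expovariate(1.0) for _ in range(n + 1)]
    total = sum(Y)
    acc = 0.0
    for i in range(n):
        acc += Y[i]
        sum_Z[i] += acc / total
    # order statistics of n independent Uniform(0,1)
    U = sorted(random.random() for _ in range(n))
    for i in range(n):
        sum_order[i] += U[i]

mean_Z = [s / reps for s in sum_Z]          # should approach i/(n+1)
mean_order = [s / reps for s in sum_order]  # likewise
```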
Slide 30: Transformation of Random Vectors
Slide 31: Theorem 1.3
Suppose that X is a k-dimensional random vector with density function $f_X(x_1, \ldots, x_k)$. Let g be a one-to-one and continuously differentiable map from $R^k$ to $R^k$. Then Y = g(X) is a random vector with density function
$f_X(g^{-1}(y_1, \ldots, y_k))\,|J_{g^{-1}}(y_1, \ldots, y_k)|$,
where $g^{-1}$ is the inverse of g and $J_{g^{-1}}$ is the Jacobian of $g^{-1}$.
Slide 32: Example (Box-Muller)
- let $R^2 \sim \mathrm{Exp}(2)$, $R > 0$, and $\Theta \sim \mathrm{Uniform}(0, 2\pi)$ be independent;
- then $X = R\cos\Theta$ and $Y = R\sin\Theta$ are two independent standard normal random variables;
- this can be applied to simulate normally distributed data.
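A direct implementation of this transformation (the Box-Muller method) is sketched below; since $-2\log U \sim \mathrm{Exp}(2)$ (mean 2) for $U \sim \mathrm{Uniform}(0,1)$, we can take $R = \sqrt{-2\log U}$. The sample size and seed are arbitrary:

```python
import math
import random

def box_muller(n_pairs, rng):
    # R^2 ~ exponential with mean 2, so R = sqrt(-2 log U); Theta ~ Uniform(0, 2*pi)
    out = []
    for _ in range(n_pairs):
        u = 1.0 - rng.random()          # in (0, 1], avoids log(0)
        r = math.sqrt(-2.0 * math.log(u))
        theta = 2.0 * math.pi * rng.random()
        out.append(r * math.cos(theta))
        out.append(r * math.sin(theta))
    return out

rng = random.Random(42)
sample = box_muller(50000, rng)
mean = sum(sample) / len(sample)
var = sum((x - mean) ** 2 for x in sample) / len(sample)
```

The sample mean and variance should be close to 0 and 1, as for standard normal data.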
Slide 33: Multivariate Normal Distribution
Slide 34: Definition
$Y = (Y_1, \ldots, Y_n)'$ is said to have a multivariate normal distribution with mean vector $\mu = (\mu_1, \ldots, \mu_n)'$ and non-degenerate covariance matrix $\Sigma_{n\times n}$ if
$f_Y(y_1, \ldots, y_n) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}} \exp\big\{-\frac{1}{2}(y - \mu)'\Sigma^{-1}(y - \mu)\big\}$
Slide 35: Characteristic function
$\varphi_Y(t) = E[e^{it'Y}] = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}} \int \exp\big\{it'y - \frac{1}{2}(y - \mu)'\Sigma^{-1}(y - \mu)\big\}\,dy$.
Completing the square in y, the exponent equals
$-\frac{1}{2}(y - \mu - \Sigma it)'\Sigma^{-1}(y - \mu - \Sigma it) + it'\mu - \frac{1}{2}t'\Sigma t$,
and the remaining integral is that of a normal density, so
$\varphi_Y(t) = \exp\big\{it'\mu - \frac{1}{2}t'\Sigma t\big\}$.
Slide 36: Conclusions
- the multivariate normal distribution is uniquely determined by μ and Σ
- for the standard multivariate normal distribution, $\varphi_X(t) = \exp\{-t't/2\}$
- the moment generating function of Y is $\exp\{t'\mu + t'\Sigma t/2\}$
Slide 37: Linear transformation of normal r.v.s
Theorem 1.4 If $Y = A_{n\times k}X_{k\times 1}$ where $X \sim N(0, I)$ (standard multivariate normal distribution), then Y's characteristic function is given by $\varphi_Y(t) = \exp\{-t'\Sigma t/2\}$, $t = (t_1, \ldots, t_n)' \in R^n$, where $\Sigma = AA'$ and $\mathrm{rank}(\Sigma) = \mathrm{rank}(A)$. Conversely, if $\varphi_Y(t) = \exp\{-t'\Sigma t/2\}$ with $\Sigma_{n\times n} \ge 0$ of rank k, then $Y = A_{n\times k}X_{k\times 1}$ with $\mathrm{rank}(A) = k$ and $X \sim N(0, I)$.
Slide 38: Proof
$Y = AX$ and $X \sim N(0, I)$:
$\varphi_Y(t) = E[\exp\{it'(AX)\}] = E[\exp\{i(A't)'X\}] = \exp\{-(A't)'(A't)/2\} = \exp\{-t'AA't/2\}$,
so $Y \sim N(0, AA')$. Note that $\mathrm{rank}(AA') = \mathrm{rank}(A)$.
Slide 39
Conversely, suppose $\varphi_Y(t) = \exp\{-t'\Sigma t/2\}$. There exists an orthogonal matrix O such that $\Sigma = O'DO$, $D = \mathrm{diag}(d_1, \ldots, d_k, 0, \ldots, 0)$, $d_1, \ldots, d_k > 0$. Define $Z = OY$:
$\varphi_Z(t) = E[\exp\{it'(OY)\}] = E[\exp\{i(O't)'Y\}] = \exp\{-(O't)'\Sigma(O't)/2\} = \exp\{-d_1 t_1^2/2 - \cdots - d_k t_k^2/2\}$.
So $Z_1, \ldots, Z_k$ are independent $N(0, d_1), \ldots, N(0, d_k)$ and $Z_{k+1}, \ldots, Z_n$ are zero. Let $X_i = Z_i/\sqrt{d_i}$, $i = 1, \ldots, k$, and write $O' = (B_{n\times k}, C_{n\times(n-k)})$. Then
$Y = O'Z = B\,\mathrm{diag}(\sqrt{d_1}, \ldots, \sqrt{d_k})X \equiv AX$.
Clearly, $\mathrm{rank}(A) = k$.
Slide 40: Conditional normal distributions
Theorem 1.5 Suppose that $Y = (Y_1, \ldots, Y_k, Y_{k+1}, \ldots, Y_n)'$ has a multivariate normal distribution with mean $\mu = (\mu^{(1)}, \mu^{(2)})'$ and a non-degenerate covariance matrix
$\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}$.
Then
(i) $(Y_1, \ldots, Y_k)' \sim N_k(\mu^{(1)}, \Sigma_{11})$.
(ii) $(Y_1, \ldots, Y_k)$ and $(Y_{k+1}, \ldots, Y_n)$ are independent if and only if $\Sigma_{12} = \Sigma_{21}' = 0$.
(iii) For any matrix $A_{m\times n}$, AY has a multivariate normal distribution with mean $A\mu$ and covariance $A\Sigma A'$.
Slide 41
(iv) The conditional distribution of $Y^{(1)} = (Y_1, \ldots, Y_k)'$ given $Y^{(2)} = (Y_{k+1}, \ldots, Y_n)'$ is a multivariate normal distribution:
$Y^{(1)}|Y^{(2)} \sim N_k\big(\mu^{(1)} + \Sigma_{12}\Sigma_{22}^{-1}(Y^{(2)} - \mu^{(2)}),\ \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}\big)$.
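Part (iv) can be checked by a seeded Monte Carlo experiment with scalar blocks (the particular means and covariances below are my own illustrative choices): the residual $Z = Y_1 - \mu_1 - \Sigma_{12}\Sigma_{22}^{-1}(Y_2 - \mu_2)$ should be empirically uncorrelated with $Y_2$ and have variance $\Sigma_{11} - \Sigma_{12}^2/\Sigma_{22}$.

```python
import math
import random

rng = random.Random(7)
mu1, mu2 = 1.0, -2.0
s11, s12, s22 = 2.0, 0.6, 1.0   # covariance blocks (scalars here)

# draw (Y1, Y2) with the given mean and covariance via a Cholesky factor
a = math.sqrt(s22)              # Y2 = mu2 + a * X2
b = s12 / a                     # Y1 = mu1 + b * X2 + c * X1
c = math.sqrt(s11 - b * b)

n = 100000
cov_ZY2 = 0.0
var_Z = 0.0
for _ in range(n):
    x1, x2 = rng.gauss(0, 1), rng.gauss(0, 1)
    y2 = mu2 + a * x2
    y1 = mu1 + b * x2 + c * x1
    # residual from the conditional-mean formula in (iv)
    z = y1 - mu1 - (s12 / s22) * (y2 - mu2)
    cov_ZY2 += z * (y2 - mu2)
    var_Z += z * z
cov_ZY2 /= n    # should be near 0
var_Z /= n      # should be near s11 - s12**2 / s22
```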
Slide 42: Proof
Prove (iii) first. $\varphi_Y(t) = \exp\{it'\mu - t'\Sigma t/2\}$, so
$\varphi_{AY}(t) = E[e^{it'AY}] = E[e^{i(A't)'Y}] = \exp\{i(A't)'\mu - (A't)'\Sigma(A't)/2\} = \exp\{it'(A\mu) - t'(A\Sigma A')t/2\}$,
hence $AY \sim N(A\mu, A\Sigma A')$.
Slide 43
Prove (i). $(Y_1, \ldots, Y_k)' = (I_{k\times k}\ 0_{k\times(n-k)})Y$, so by (iii)
$(Y_1, \ldots, Y_k)' \sim N\big((I_{k\times k}\ 0_{k\times(n-k)})\mu,\ (I_{k\times k}\ 0_{k\times(n-k)})\Sigma(I_{k\times k}\ 0_{k\times(n-k)})'\big) = N_k(\mu^{(1)}, \Sigma_{11})$.
Slide 44
Prove (ii). For $t = (t^{(1)}, t^{(2)})'$,
$\varphi_Y(t) = \exp\big[it^{(1)\prime}\mu^{(1)} + it^{(2)\prime}\mu^{(2)} - \frac{1}{2}\big\{t^{(1)\prime}\Sigma_{11}t^{(1)} + 2t^{(1)\prime}\Sigma_{12}t^{(2)} + t^{(2)\prime}\Sigma_{22}t^{(2)}\big\}\big]$.
The terms in $t^{(1)}$ and $t^{(2)}$ are separable iff $\Sigma_{12} = 0$.
(ii) implies that two jointly normal random vectors are independent if and only if their covariance is zero.
Slide 45
Prove (iv). Consider $Z^{(1)} = Y^{(1)} - \mu^{(1)} - \Sigma_{12}\Sigma_{22}^{-1}(Y^{(2)} - \mu^{(2)})$; by (iii), $Z^{(1)} \sim N(0, \Sigma_Z)$ with
$\Sigma_Z = \mathrm{Cov}(Y^{(1)}, Y^{(1)}) - 2\Sigma_{12}\Sigma_{22}^{-1}\mathrm{Cov}(Y^{(2)}, Y^{(1)}) + \Sigma_{12}\Sigma_{22}^{-1}\mathrm{Cov}(Y^{(2)}, Y^{(2)})\Sigma_{22}^{-1}\Sigma_{21} = \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$.
Moreover,
$\mathrm{Cov}(Z^{(1)}, Y^{(2)}) = \mathrm{Cov}(Y^{(1)}, Y^{(2)}) - \Sigma_{12}\Sigma_{22}^{-1}\mathrm{Cov}(Y^{(2)}, Y^{(2)}) = 0$,
so $Z^{(1)}$ is independent of $Y^{(2)}$. The conditional distribution of $Z^{(1)}$ given $Y^{(2)}$ is therefore the same as the unconditional distribution of $Z^{(1)}$:
$Z^{(1)}|Y^{(2)} \sim Z^{(1)} \sim N(0, \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21})$.
Slide 46: Remark on Theorem 1.5
The constructed $Z^{(1)}$ in (iv) has a geometric interpretation via the projection of $(Y^{(1)} - \mu^{(1)})$ on the normalized variable $\Sigma_{22}^{-1/2}(Y^{(2)} - \mu^{(2)})$.
Slide 47: Example (measurement error)
- X and U are independent, $X \sim N(0, \sigma_x^2)$ and $U \sim N(0, \sigma_u^2)$; $Y = X + U$ (measurement error)
- the covariance matrix of (X, Y): $\begin{pmatrix} \sigma_x^2 & \sigma_x^2 \\ \sigma_x^2 & \sigma_x^2 + \sigma_u^2 \end{pmatrix}$
- $X|Y \sim N\big(\frac{\sigma_x^2}{\sigma_x^2 + \sigma_u^2}Y,\ \sigma_x^2 - \frac{\sigma_x^4}{\sigma_x^2 + \sigma_u^2}\big) = N(\lambda Y, \sigma_x^2(1 - \lambda))$
- $\lambda = \sigma_x^2/\sigma_y^2$ (reliability coefficient), where $\sigma_y^2 = \sigma_x^2 + \sigma_u^2$
Slide 48: Quadratic forms of normal random variables
- the sum of squares of n independent N(0,1) variables is $\chi^2_n \equiv \mathrm{Gamma}(n/2, 2)$.
- If $Y \sim N_n(0, \Sigma)$ with $\Sigma > 0$, then $Y'\Sigma^{-1}Y \sim \chi^2_n$.
Proof: write $Y = AX$ where $X \sim N(0, I)$ and $\Sigma = AA'$ (A invertible since $\Sigma > 0$). Then
$Y'\Sigma^{-1}Y = X'A'(AA')^{-1}AX = X'X \sim \chi^2_n$.
Slide 49: Noncentral chi-square distribution
- assume $X \sim N_n(\mu, I)$; then $Y = X'X \sim \chi^2_n(\delta)$ with noncentrality $\delta = \mu'\mu$.
- Y's density: $f_Y(y) = \sum_{k=0}^{\infty} \frac{\exp\{-\delta/2\}(\delta/2)^k}{k!}\,g(y; (2k+n)/2, 1/2)$,
  where $g(\cdot; a, \lambda)$ denotes the Gamma density with shape a and rate λ (i.e., the central $\chi^2_{2k+n}$ density).
Slide 50: Additional notes on normal distributions
- noncentral t-distribution: $N(\delta, 1)\big/\sqrt{\chi^2_n/n}$
- noncentral F-distribution: $\frac{\chi^2_n(\delta)/n}{\chi^2_m/m}$
- how to calculate $E[X'AX]$ where $X \sim N(\mu, \Sigma)$:
  $E[X'AX] = E[\mathrm{tr}(X'AX)] = E[\mathrm{tr}(AXX')] = \mathrm{tr}(AE[XX']) = \mathrm{tr}(A(\mu\mu' + \Sigma)) = \mu'A\mu + \mathrm{tr}(A\Sigma)$
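The trace identity for $E[X'AX]$ can be checked by a seeded Monte Carlo experiment in two dimensions (the particular μ, Σ and A below are arbitrary illustrative choices):

```python
import random

rng = random.Random(1)
mu = [1.0, 2.0]
# Sigma built from a Cholesky factor L, so Sigma = L L'
L = [[1.0, 0.0], [0.5, 1.2]]
Sigma = [[sum(L[i][k] * L[j][k] for k in range(2)) for j in range(2)]
         for i in range(2)]
A = [[2.0, 1.0], [1.0, 3.0]]

# exact value: E[X'AX] = mu'A mu + tr(A Sigma)
muAmu = sum(mu[i] * A[i][j] * mu[j] for i in range(2) for j in range(2))
trASigma = sum(A[i][j] * Sigma[j][i] for i in range(2) for j in range(2))
exact = muAmu + trASigma

# Monte Carlo average of X'AX with X = mu + L Z, Z ~ N(0, I)
n = 200000
acc = 0.0
for _ in range(n):
    z = [rng.gauss(0, 1), rng.gauss(0, 1)]
    x = [mu[i] + sum(L[i][k] * z[k] for k in range(2)) for i in range(2)]
    acc += sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))
mc_estimate = acc / n
```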
Slide 51: Families of Distributions
Slide 52: Location-scale family
- $aX + b$ has density $f_X((x - b)/a)/a$, $a > 0$, $b \in R$
- mean $aE[X] + b$ and variance $a^2\mathrm{Var}(X)$
- examples: normal distribution, uniform distribution, gamma distributions, etc.
Slide 53: Exponential family
- examples: binomial and Poisson distributions for discrete variables; normal, gamma and Beta distributions for continuous variables
- has a general expression of densities
- possesses nice statistical properties
Slide 54: Form of densities
$\{P_\theta\}$ is said to form an s-parameter exponential family if
$p_\theta(x) = \exp\big\{\sum_{k=1}^{s} \eta_k(\theta)T_k(x) - B(\theta)\big\}h(x)$, where
$\exp\{B(\theta)\} = \int \exp\big\{\sum_{k=1}^{s} \eta_k(\theta)T_k(x)\big\}h(x)\,d\mu(x) < \infty$.
If $\eta_k(\theta) = \theta_k$, the above form is called the canonical form of the exponential family.
Slide 55: Examples
- $X_1, \ldots, X_n \sim N(\mu, \sigma^2)$: $\exp\big\{\frac{\mu}{\sigma^2}\sum_{i=1}^n x_i - \frac{1}{2\sigma^2}\sum_{i=1}^n x_i^2 - \frac{n}{2\sigma^2}\mu^2\big\}\frac{1}{(\sqrt{2\pi}\sigma)^n}$
- $X \sim \mathrm{Binomial}(n, p)$: $\exp\big\{x\log\frac{p}{1-p} + n\log(1-p)\big\}\binom{n}{x}$
- $X \sim \mathrm{Poisson}(\lambda)$: $P(X = x) = \exp\{x\log\lambda - \lambda\}/x!$
Slide 56: Moment generating function (MGF)
- MGF of $(T_1, \ldots, T_s)$: $M_T(t_1, \ldots, t_s) = E[\exp\{t_1 T_1 + \cdots + t_s T_s\}]$
- the coefficients in the Taylor expansion of $M_T$ correspond to the moments of $(T_1, \ldots, T_s)$
Slide 57: Calculating the MGF in the exponential family
Theorem 1.6 Suppose the densities of an exponential family can be written in the canonical form $\exp\{\sum_{k=1}^{s}\eta_k T_k(x) - A(\eta)\}h(x)$, where $\eta = (\eta_1, \ldots, \eta_s)$. Then for $t = (t_1, \ldots, t_s)$,
$M_T(t) = \exp\{A(\eta + t) - A(\eta)\}$.
Slide 58: Proof
$\exp\{A(\eta)\} = \int \exp\big\{\sum_{k=1}^{s}\eta_k T_k(x)\big\}h(x)\,d\mu(x)$, so
$M_T(t) = E[\exp\{t_1 T_1 + \cdots + t_s T_s\}] = \int \exp\big\{\sum_{k=1}^{s}(\eta_k + t_k)T_k(x) - A(\eta)\big\}h(x)\,d\mu(x) = \exp\{A(\eta + t) - A(\eta)\}$.
Slide 59: Cumulant generating function (CGF)
- $K_T(t_1, \ldots, t_s) = \log M_T(t_1, \ldots, t_s) = A(\eta + t) - A(\eta)$
- the coefficients in the Taylor expansion are called the cumulants of $(T_1, \ldots, T_s)$
- the first two cumulants are the mean and the variance
Slide 60: Examples revisited
Normal distribution: $\eta = \mu/\sigma^2$ and $A(\eta) = \frac{\mu^2}{2\sigma^2} = \eta^2\sigma^2/2$, so
$M_T(t) = \exp\big\{\frac{\sigma^2}{2}\{(\eta + t)^2 - \eta^2\}\big\} = \exp\{\mu t + t^2\sigma^2/2\}$.
From the Taylor expansion, for $X \sim N(0, \sigma^2)$,
$E[X^{2r+1}] = 0$, $E[X^{2r}] = 1 \cdot 3 \cdots (2r-1)\,\sigma^{2r}$, $r = 1, 2, \ldots$
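The moment formulas can be confirmed by numerically integrating $x^r$ against the $N(0, \sigma^2)$ density (added here as a sanity check; σ = 1.5 and the integration grid are arbitrary choices):

```python
import math

def normal_moment(r, sigma, T=8.0, steps=20000):
    # midpoint-rule integral of x^r times the N(0, sigma^2) density
    # over [-T*sigma, T*sigma]; the truncated tails are negligible
    a, b = -T * sigma, T * sigma
    h = (b - a) / steps
    total = 0.0
    for k in range(steps):
        x = a + (k + 0.5) * h
        total += x**r * math.exp(-x * x / (2 * sigma**2))
    return total * h / math.sqrt(2 * math.pi * sigma**2)

# expected: E[X^4] = 1*3*sigma^4, E[X^6] = 1*3*5*sigma^6, odd moments zero
m4 = normal_moment(4, 1.5)
m6 = normal_moment(6, 1.5)
```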
Slide 61
The gamma distribution has the canonical form $\exp\{-x/b + (a-1)\log x - \log(\Gamma(a)b^a)\}I(x > 0)$ with
$\eta = -1/b$, $T = X$, and $A(\eta) = \log(\Gamma(a)b^a) = -a\log(-\eta) + \log\Gamma(a)$.
Hence
$M_X(t) = \exp\{A(\eta + t) - A(\eta)\} = \exp\big\{a\log\frac{\eta}{\eta + t}\big\} = (1 - bt)^{-a}$,
and $E[X] = ab$, $E[X^2] = ab^2 + (ab)^2, \ldots$
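The closed-form MGF $(1 - bt)^{-a}$ can likewise be confirmed by numerically integrating $e^{tx}$ against the Gamma(a, b) density for a t < 1/b (the parameter values a = 2.5, b = 1.5, t = 0.2 below are arbitrary illustrative choices):

```python
import math

def gamma_mgf_numeric(a, b, t, T=400.0, steps=200000):
    # midpoint-rule integral of e^{tx} x^{a-1} e^{-x/b} / (Gamma(a) b^a)
    # over (0, T]; valid for t < 1/b, where the integrand still decays
    h = T / steps
    total = 0.0
    for k in range(steps):
        x = (k + 0.5) * h
        total += math.exp(t * x) * x**(a - 1) * math.exp(-x / b)
    return total * h / (math.gamma(a) * b**a)

a, b, t = 2.5, 1.5, 0.2
numeric = gamma_mgf_numeric(a, b, t)
closed_form = (1 - b * t) ** (-a)
```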
Slide 62: Reading materials
Lehmann and Casella, Sections 1.4 and 1.5.
6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random
More informationPROBABILITY THEORY LECTURE 3
PROBABILITY THEORY LECTURE 3 Per Sidén Division of Statistics Dept. of Computer and Information Science Linköping University PER SIDÉN (STATISTICS, LIU) PROBABILITY THEORY - L3 1 / 15 OVERVIEW LECTURE
More informationSampling Distributions
Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of
More informationStat 5101 Notes: Brand Name Distributions
Stat 5101 Notes: Brand Name Distributions Charles J. Geyer February 14, 2003 1 Discrete Uniform Distribution DiscreteUniform(n). Discrete. Rationale Equally likely outcomes. The interval 1, 2,..., n of
More informationDepartment of Mathematics
Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2017 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen
More informationLecture notes for Part A Probability
Lecture notes for Part A Probability Notes written by James Martin, updated by Matthias Winkel Oxford, Michaelmas Term 017 winkel@stats.ox.ac.uk Version of 5 September 017 1 Review: probability spaces,
More informationFundamental Tools - Probability Theory II
Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random
More informationMultivariate Distributions
Chapter 2 Multivariate Distributions and Transformations 2.1 Joint, Marginal and Conditional Distributions Often there are n random variables Y 1,..., Y n that are of interest. For example, age, blood
More informationMultiple Random Variables
Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x
More information18.175: Lecture 15 Characteristic functions and central limit theorem
18.175: Lecture 15 Characteristic functions and central limit theorem Scott Sheffield MIT Outline Characteristic functions Outline Characteristic functions Characteristic functions Let X be a random variable.
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationSection 21. Expected Values. Po-Ning Chen, Professor. Institute of Communications Engineering. National Chiao Tung University
Section 21 Expected Values Po-Ning Chen, Professor Institute of Communications Engineering National Chiao Tung University Hsin Chu, Taiwan 30010, R.O.C. Expected value as integral 21-1 Definition (expected
More informationContinuous Random Variables
1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables
More informationStatistics 3657 : Moment Generating Functions
Statistics 3657 : Moment Generating Functions A useful tool for studying sums of independent random variables is generating functions. course we consider moment generating functions. In this Definition
More informationACM 116: Lectures 3 4
1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance
More informationMoment Generating Functions
MATH 382 Moment Generating Functions Dr. Neal, WKU Definition. Let X be a random variable. The moment generating function (mgf) of X is the function M X : R R given by M X (t ) = E[e X t ], defined for
More informationRandom Variables. P(x) = P[X(e)] = P(e). (1)
Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment
More informationChapter 1. Statistical Spaces
Chapter 1 Statistical Spaces Mathematical statistics is a science that studies the statistical regularity of random phenomena, essentially by some observation values of random variable (r.v.) X. Sometimes
More informationPartial Solutions for h4/2014s: Sampling Distributions
27 Partial Solutions for h4/24s: Sampling Distributions ( Let X and X 2 be two independent random variables, each with the same probability distribution given as follows. f(x 2 e x/2, x (a Compute the
More informationChapter 5 continued. Chapter 5 sections
Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions
More information01 Probability Theory and Statistics Review
NAVARCH/EECS 568, ROB 530 - Winter 2018 01 Probability Theory and Statistics Review Maani Ghaffari January 08, 2018 Last Time: Bayes Filters Given: Stream of observations z 1:t and action data u 1:t Sensor/measurement
More informationSampling Distributions
In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of random sample. For example,
More informationMultiple Random Variables
Multiple Random Variables This Version: July 30, 2015 Multiple Random Variables 2 Now we consider models with more than one r.v. These are called multivariate models For instance: height and weight An
More informationST5215: Advanced Statistical Theory
Department of Statistics & Applied Probability Monday, September 26, 2011 Lecture 10: Exponential families and Sufficient statistics Exponential Families Exponential families are important parametric families
More informationDeccan Education Society s FERGUSSON COLLEGE, PUNE (AUTONOMOUS) SYLLABUS UNDER AUTOMONY. SECOND YEAR B.Sc. SEMESTER - III
Deccan Education Society s FERGUSSON COLLEGE, PUNE (AUTONOMOUS) SYLLABUS UNDER AUTOMONY SECOND YEAR B.Sc. SEMESTER - III SYLLABUS FOR S. Y. B. Sc. STATISTICS Academic Year 07-8 S.Y. B.Sc. (Statistics)
More informationLecture 2: Repetition of probability theory and statistics
Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:
More informationDepartment of Mathematics
Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2018 Supplement 2: Review Your Distributions Relevant textbook passages: Pitman [10]: pages 476 487. Larsen
More informationAdvanced topics from statistics
Advanced topics from statistics Anders Ringgaard Kristensen Advanced Herd Management Slide 1 Outline Covariance and correlation Random vectors and multivariate distributions The multinomial distribution
More informationPerhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.
Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage
More informationChapter 7: Special Distributions
This chater first resents some imortant distributions, and then develos the largesamle distribution theory which is crucial in estimation and statistical inference Discrete distributions The Bernoulli
More information