MIT Spring 2016
1 Exponential Families

Dr. Kempthorne. Spring 2016. MIT 18.655.
2 Outline

1 One-Parameter Exponential Families
2 Multiparameter Exponential Families
3 Building Exponential Families
3 Definition

Let $X$ be a random variable/vector with sample space $\mathcal{X} \subset \mathbf{R}^q$ and probability model $P_\theta$. The class of probability models $\mathcal{P} = \{P_\theta, \theta \in \Theta\}$ is a one-parameter exponential family if the density/pmf function $p(x \mid \theta)$ can be written

$$p(x \mid \theta) = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h: \mathcal{X} \to \mathbf{R}$, $T: \mathcal{X} \to \mathbf{R}$, $\eta: \Theta \to \mathbf{R}$, and $B: \Theta \to \mathbf{R}$.

Note: By the Factorization Theorem, $T(X)$ is sufficient for $\theta$, since $p(x \mid \theta) = h(x)\,g(T(x), \theta)$ with $g(T(x), \theta) = \exp\{\eta(\theta)T(x) - B(\theta)\}$. $T(X)$ is the Natural Sufficient Statistic.
4 Examples: Poisson Distribution (1.6.1)

$X \sim \text{Poisson}(\theta)$, where $E[X] = \theta$:

$$p(x \mid \theta) = \frac{\theta^x e^{-\theta}}{x!}, \quad x = 0, 1, \ldots$$
$$= \frac{1}{x!}\exp\{\log(\theta)\,x - \theta\} = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h(x) = 1/x!$, $\eta(\theta) = \log(\theta)$, $T(x) = x$, and $B(\theta) = \theta$.
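As a quick numerical sanity check (not part of the original slides; the helper names are illustrative), the Poisson pmf can be compared directly against its exponential-family factorization $h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$:

```python
import math

def poisson_pmf(x, theta):
    return theta**x * math.exp(-theta) / math.factorial(x)

def exp_family_pmf(x, theta):
    h = 1.0 / math.factorial(x)       # h(x) = 1/x!
    eta = math.log(theta)             # natural parameter eta(theta) = log(theta)
    T = x                             # natural sufficient statistic T(x) = x
    B = theta                         # B(theta) = theta
    return h * math.exp(eta * T - B)

theta = 2.5
max_diff = max(abs(poisson_pmf(x, theta) - exp_family_pmf(x, theta))
               for x in range(20))
```

The two expressions agree to machine precision; the same style of check applies to the Binomial and Normal examples that follow.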
5 Examples: Binomial Distribution (1.6.2)

$X \sim \text{Binomial}(\theta, n)$:

$$p(x \mid \theta) = \binom{n}{x}\theta^x(1-\theta)^{n-x}, \quad x = 0, 1, \ldots, n$$
$$= \binom{n}{x}\exp\left\{\log\left(\frac{\theta}{1-\theta}\right)x + n\log(1-\theta)\right\} = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h(x) = \binom{n}{x}$, $\eta(\theta) = \log\left(\frac{\theta}{1-\theta}\right)$, $T(x) = x$, and $B(\theta) = -n\log(1-\theta)$.
6 Examples: Normal Distribution (known variance)

$X \sim N(\mu, \sigma_0^2)$, with $\sigma_0^2$ known:

$$p(x \mid \theta) = \frac{1}{\sqrt{2\pi\sigma_0^2}}\,e^{-\frac{(x-\mu)^2}{2\sigma_0^2}}
= \left[\frac{1}{\sqrt{2\pi\sigma_0^2}}\exp\left\{-\frac{x^2}{2\sigma_0^2}\right\}\right]\exp\left\{\frac{\mu}{\sigma_0^2}x - \frac{\mu^2}{2\sigma_0^2}\right\} = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h(x) = \frac{1}{\sqrt{2\pi\sigma_0^2}}\exp\left\{-\frac{x^2}{2\sigma_0^2}\right\}$, $\eta(\theta) = \frac{\mu}{\sigma_0^2}$, $T(x) = x$, and $B(\theta) = \frac{\mu^2}{2\sigma_0^2}$.
7 Examples: Normal Distribution (known mean)

$X \sim N(\mu_0, \sigma^2)$, with $\mu_0$ known:

$$p(x \mid \theta) = \frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{(x-\mu_0)^2}{2\sigma^2}}
= \left[\frac{1}{\sqrt{2\pi}}\right]\exp\left\{-\frac{(x-\mu_0)^2}{2\sigma^2} - \frac{1}{2}\log(\sigma^2)\right\} = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h(x) = \frac{1}{\sqrt{2\pi}}$, $\eta(\theta) = -\frac{1}{2\sigma^2}$, $T(x) = (x-\mu_0)^2$, and $B(\theta) = \frac{1}{2}\log(\sigma^2)$.
8 Samples from a One-Parameter Exponential Family Distribution

Consider a sample $X_1, \ldots, X_n$, where the $X_i$ are iid $P$, with $P \in \mathcal{P} = \{P_\theta, \theta \in \Theta\}$ a one-parameter exponential family with density/pmf function $p(x \mid \theta) = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$.

The sample $\mathbf{X} = (X_1, \ldots, X_n)$ is a random vector with density/pmf:

$$p(\mathbf{x} \mid \theta) = \prod_{i=1}^n h(x_i)\exp[\eta(\theta)T(x_i) - B(\theta)]
= \left[\prod_{i=1}^n h(x_i)\right]\exp\left[\eta(\theta)\sum_{i=1}^n T(x_i) - nB(\theta)\right]
= h^*(\mathbf{x})\exp\{\eta^*(\theta)T^*(\mathbf{x}) - B^*(\theta)\}$$

where $h^*(\mathbf{x}) = \prod_{i=1}^n h(x_i)$, $\eta^*(\theta) = \eta(\theta)$, $T^*(\mathbf{x}) = \sum_{i=1}^n T(x_i)$, and $B^*(\theta) = nB(\theta)$.

Note: The Sufficient Statistic $T^*$ is one-dimensional for all $n$.
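A small numerical illustration (ours, not from the slides): for an iid Poisson sample, the product of the individual pmfs should equal the one-parameter exponential-family form built from $h^*$, the summed statistic $T^*$, and $B^* = nB$:

```python
import math

theta = 1.7
eta = math.log(theta)                 # natural parameter
sample = [0, 3, 1, 2, 5]              # made-up data
n = len(sample)

# Direct product of the individual Poisson pmfs
joint_direct = math.prod(theta**x * math.exp(-theta) / math.factorial(x)
                         for x in sample)

# Exponential-family form of the sample:
# h*(x) = prod 1/x_i!,  T*(x) = sum x_i,  B*(theta) = n * theta
h_star = math.prod(1.0 / math.factorial(x) for x in sample)
T_star = sum(sample)
joint_expfam = h_star * math.exp(eta * T_star - n * theta)
```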
9 Samples from a One-Parameter Exponential Family Distribution

Theorem. Let $\{P_\theta\}$ be a one-parameter exponential family of discrete distributions with pmf function

$$p(x \mid \theta) = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}.$$

Then the family of distributions of the statistic $T(X)$ is a one-parameter exponential family of discrete distributions whose frequency functions are

$$P_\theta(T(X) = t) = p(t \mid \theta) = h^*(t)\exp\{\eta(\theta)t - B(\theta)\}$$

where $h^*(t) = \sum_{\{x:\,T(x) = t\}} h(x)$.

Proof: Immediate.
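A concrete check of this theorem (an illustration we added): for an iid Bernoulli$(\theta)$ sample of size $n$, $h(x) = 1$, so $h^*(t)$ just counts the sample points with $T(\mathbf{x}) = \sum x_i = t$, which should come out as $\binom{n}{t}$ and reproduce the Binomial pmf:

```python
import math
from itertools import product

n, theta = 4, 0.35
h_star = {t: 0 for t in range(n + 1)}
for x in product([0, 1], repeat=n):
    h_star[sum(x)] += 1               # h*(t) = #{x : T(x) = t}

eta = math.log(theta / (1 - theta))   # Bernoulli natural parameter
B = -n * math.log(1 - theta)          # B*(theta) = n B(theta) for the sample
pmf_T = {t: h_star[t] * math.exp(eta * t - B) for t in range(n + 1)}
binom = {t: math.comb(n, t) * theta**t * (1 - theta)**(n - t)
         for t in range(n + 1)}
```

Here `pmf_T` matches `binom` term by term, exactly as the theorem predicts.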
10 Canonical Exponential Family

Re-parametrize by setting $\eta = \eta(\theta)$, the Natural Parameter. The density has the form

$$p(x \mid \eta) = h(x)\exp\{\eta T(x) - A(\eta)\}.$$

The function $A(\eta)$ replaces $B(\theta)$ and is defined as the log of the normalization constant:

$$A(\eta) = \log\int_{\mathcal{X}} h(x)\exp\{\eta T(x)\}\,dx \quad \text{if } X \text{ is continuous, or}$$
$$A(\eta) = \log\sum_{x \in \mathcal{X}} h(x)\exp\{\eta T(x)\} \quad \text{if } X \text{ is discrete.}$$

The Natural Parameter Space $\mathcal{E}$ contains $\{\eta: \eta = \eta(\theta), \theta \in \Theta\}$ (later, a theorem gives properties of $\mathcal{E}$). $T(x)$ is the Natural Sufficient Statistic.
11 Canonical Representation of the Poisson Family

$X \sim \text{Poisson}(\theta)$, where $E[X] = \theta$:

$$p(x \mid \theta) = \frac{\theta^x e^{-\theta}}{x!} = \frac{1}{x!}\exp\{\log(\theta)\,x - \theta\} = h(x)\exp\{\eta(\theta)T(x) - B(\theta)\}$$

where $h(x) = 1/x!$, $\eta(\theta) = \log(\theta)$, $T(x) = x$, $B(\theta) = \theta$.

Canonical representation: $\eta = \log(\theta)$, so $A(\eta) = B(\theta) = \theta = e^\eta$.
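The closed form $A(\eta) = e^\eta$ can be verified numerically (our check, not from the slides) by computing the log-normalizer $\log\sum_x h(x)e^{\eta x}$ directly from its truncated series:

```python
import math

eta = 0.8
# A(eta) = log( sum_x e^{eta x} / x! ); 60 terms is far past convergence
partial_sum = sum(math.exp(eta * x) / math.factorial(x) for x in range(60))
A_numeric = math.log(partial_sum)
A_closed = math.exp(eta)              # A(eta) = e^eta for the Poisson family
```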
12 MGFs of Canonical Exponential Family Models

Theorem. Suppose $X$ is distributed according to a canonical exponential family, i.e., the density/pmf function is given by $p(x \mid \eta) = h(x)\exp[\eta T(x) - A(\eta)]$ for $x \in \mathcal{X} \subset \mathbf{R}^q$. If $\eta$ is an interior point of $\mathcal{E}$, the natural parameter space, then:

- The moment generating function of $T(X)$ exists and is given by
  $$M_T(s) = E[e^{sT(X)} \mid \eta] = \exp\{A(s + \eta) - A(\eta)\}$$
  for $s$ in some neighborhood of $0$.
- $E[T(X) \mid \eta] = A'(\eta)$.
- $\text{Var}[T(X) \mid \eta] = A''(\eta)$.

Proof:
$$M_T(s) = E[e^{sT(X)} \mid \eta] = \int h(x)e^{(s+\eta)T(x) - A(\eta)}\,dx
= \left[e^{A(s+\eta) - A(\eta)}\right]\int h(x)e^{(s+\eta)T(x) - A(s+\eta)}\,dx
= \left[e^{A(s+\eta) - A(\eta)}\right]\cdot 1.$$

The remainder follows from properties of MGFs.
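The MGF identity can be tested numerically for the canonical Poisson family, where $A(\eta) = e^\eta$ (an illustration we added; the truncation level is arbitrary):

```python
import math

eta, s = 0.3, 0.2
theta = math.exp(eta)                 # mean parameter

def pmf(x):
    return theta**x * math.exp(-theta) / math.factorial(x)

# Direct expectation E[e^{sX}] via a truncated series
mgf_direct = sum(math.exp(s * x) * pmf(x) for x in range(80))

# Theorem: M_T(s) = exp{A(s + eta) - A(eta)} with A(eta) = e^eta
mgf_theorem = math.exp(math.exp(s + eta) - math.exp(eta))
```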
13 Moments of Canonical Exponential Family Distributions

Poisson distribution: $A(\eta) = B(\theta) = \theta = e^\eta$, so

$$E(X \mid \theta) = A'(\eta) = e^\eta = \theta, \qquad \text{Var}(X \mid \theta) = A''(\eta) = e^\eta = \theta.$$

Binomial distribution:

$$p(x \mid \theta) = \binom{n}{x}\theta^x(1-\theta)^{n-x}
= h(x)\exp\left\{\log\left(\frac{\theta}{1-\theta}\right)x + n\log(1-\theta)\right\}
= h(x)\exp\{\eta x - n\log(e^\eta + 1)\}$$

So $A(\eta) = n\log(e^\eta + 1)$, with $\eta = \log\left(\frac{\theta}{1-\theta}\right)$, and

$$A'(\eta) = n\,\frac{e^\eta}{e^\eta + 1} = n\theta$$
$$A''(\eta) = n\,\frac{e^\eta}{e^\eta + 1} - n\,\frac{e^{2\eta}}{(e^\eta + 1)^2} = n\left[\frac{e^\eta}{e^\eta + 1}\right]\left(1 - \frac{e^\eta}{e^\eta + 1}\right) = n\theta(1-\theta)$$
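These derivative formulas are easy to confirm with finite differences (a sketch we added; step size and parameters are arbitrary):

```python
import math

n, theta = 12, 0.4
eta = math.log(theta / (1 - theta))

def A(e):
    # Binomial log-normalizer A(eta) = n log(1 + e^eta)
    return n * math.log(1 + math.exp(e))

h = 1e-5
A1 = (A(eta + h) - A(eta - h)) / (2 * h)            # should be ~ n*theta = 4.8
A2 = (A(eta + h) - 2 * A(eta) + A(eta - h)) / h**2  # should be ~ n*theta*(1-theta) = 2.88
```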
14 Moments of the Gamma Distribution

$X_1, \ldots, X_n$ i.i.d. Gamma$(p, \lambda)$ distribution with density

$$p(x \mid \lambda, p) = \frac{\lambda^p x^{p-1} e^{-\lambda x}}{\Gamma(p)}, \quad 0 < x < \infty, \quad \text{where } \Gamma(p) = \int_0^\infty \lambda^p x^{p-1} e^{-\lambda x}\,dx.$$

$$p(x \mid \lambda, p) = \left[\frac{x^{p-1}}{\Gamma(p)}\right]\exp\{-\lambda x + p\log(\lambda)\} = h(x)\exp\{\eta T(x) - A(\eta)\}$$

where $\eta = -\lambda$ and $A(\eta) = -p\log(\lambda) = -p\log(-\eta)$. Thus

$$E(X) = A'(\eta) = -p/\eta = p/\lambda, \qquad \text{Var}(X) = A''(\eta) = p/\eta^2 = p/\lambda^2.$$
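A Monte Carlo sanity check of these moment formulas (our illustration; note that the stdlib `random.gammavariate(alpha, beta)` uses a scale parameter, so `beta = 1/lambda` matches the rate parameterization above):

```python
import math
import random

random.seed(0)
p, lam = 3.0, 2.0
draws = [random.gammavariate(p, 1 / lam) for _ in range(200_000)]

mean = sum(draws) / len(draws)                       # ~ p/lam   = 1.5
var = sum((d - mean) ** 2 for d in draws) / len(draws)  # ~ p/lam^2 = 0.75
```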
15 Notes on the Gamma Distribution

- Gamma$(p = n/2, \lambda = 1/2)$ corresponds to the Chi-Squared distribution with $n$ degrees of freedom.
- $p = 1$ corresponds to the Exponential distribution.
- $\Gamma(1/2) = \sqrt{\pi}$.
- $\Gamma(p + 1) = p\,\Gamma(p)$ for $p > 0$; in particular, $\Gamma(n + 1) = n!$ for positive integers $n$.
16 Rayleigh Distribution

Sample $X_1, \ldots, X_n$ iid with density function

$$p(x \mid \theta) = \frac{x}{\theta^2}\exp(-x^2/2\theta^2), \quad x > 0$$
$$= [x]\exp\left\{-\frac{x^2}{2\theta^2} - \log(\theta^2)\right\} = h(x)\exp\{\eta T(x) - A(\eta)\}$$

where $\eta = -\frac{1}{2\theta^2}$, $T(X) = X^2$, and $A(\eta) = \log(\theta^2) = \log\left(-\frac{1}{2\eta}\right) = -\log(-2\eta)$.

By the mgf theorem:

$$E(X^2) = A'(\eta) = -\frac{1}{\eta} = 2\theta^2, \qquad \text{Var}(X^2) = A''(\eta) = \frac{1}{\eta^2} = 4\theta^4.$$

For the $n$-sample $\mathbf{X} = (X_1, \ldots, X_n)$: $T(\mathbf{X}) = \sum_{i=1}^n X_i^2$, with

$$E[T(\mathbf{X})] = -n/\eta = 2n\theta^2, \qquad \text{Var}[T(\mathbf{X})] = n/\eta^2 = 4n\theta^4.$$

Note: $P(X \le x) = 1 - \exp\left\{-\frac{x^2}{2\theta^2}\right\}$ (a failure-time model).
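A simulation check we added (not from the slides): sampling $X$ by inverse-CDF from $P(X \le x) = 1 - e^{-x^2/2\theta^2}$ and confirming $E[X^2] = 2\theta^2$ and $\text{Var}[X^2] = 4\theta^4$:

```python
import math
import random

random.seed(1)
theta = 1.5
# Inverse-CDF sampling: if U ~ Uniform(0,1), then theta*sqrt(-2 ln U) is Rayleigh
draws_sq = [(theta * math.sqrt(-2 * math.log(random.random()))) ** 2
            for _ in range(200_000)]

mean_sq = sum(draws_sq) / len(draws_sq)                       # ~ 2*theta^2 = 4.5
var_sq = sum((d - mean_sq) ** 2 for d in draws_sq) / len(draws_sq)  # ~ 4*theta^4 = 20.25
```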
17 Outline

1 One-Parameter Exponential Families
2 Multiparameter Exponential Families
3 Building Exponential Families
18 Definition

$\{P_\theta, \theta \in \Theta\}$, $\Theta \subset \mathbf{R}^k$, is a k-parameter exponential family if the density/pmf function of $X \sim P_\theta$ is

$$p(x \mid \theta) = h(x)\exp\left[\sum_{j=1}^k \eta_j(\theta)T_j(x) - B(\theta)\right], \quad x \in \mathcal{X} \subset \mathbf{R}^q,$$

where $\eta_1, \ldots, \eta_k$ and $B$ are real-valued functions mapping $\Theta \to \mathbf{R}$, and $T_1, \ldots, T_k$ and $h$ are real-valued functions mapping $\mathbf{R}^q \to \mathbf{R}$.

Note: By the Factorization Theorem (Theorem 1.5.1), $\mathbf{T}(X) = (T_1(X), \ldots, T_k(X))^T$ is sufficient.

For a sample $X_1, \ldots, X_n$ iid $P_\theta$, the sample $\mathbf{X} = (X_1, \ldots, X_n)$ has a distribution in the k-parameter exponential family with natural sufficient statistic

$$\mathbf{T}^{(n)} = \left(\sum_{i=1}^n T_1(X_i), \ldots, \sum_{i=1}^n T_k(X_i)\right).$$
19 Examples

Example: Normal family. $P_\theta = N(\mu, \sigma^2)$, with $\Theta = \mathbf{R} \times \mathbf{R}^+ = \{(\mu, \sigma^2)\}$ and density

$$p(x \mid \theta) = \exp\left\{\frac{\mu}{\sigma^2}x - \frac{1}{2\sigma^2}x^2 - \frac{1}{2}\left(\frac{\mu^2}{\sigma^2} + \log(2\pi\sigma^2)\right)\right\}$$

a $k = 2$ multiparameter exponential family ($\mathcal{X} = \mathbf{R}$, $q = 1$) with

$$\eta_1(\theta) = \frac{\mu}{\sigma^2} \text{ and } T_1(X) = X; \qquad \eta_2(\theta) = -\frac{1}{2\sigma^2} \text{ and } T_2(X) = X^2;$$
$$B(\theta) = \frac{1}{2}\left(\frac{\mu^2}{\sigma^2} + \log(2\pi\sigma^2)\right); \qquad h(x) = 1.$$

Note: For an $n$-sample $\mathbf{X} = (X_1, \ldots, X_n)$, the natural sufficient statistic is $\mathbf{T}(\mathbf{X}) = \left(\sum_{i=1}^n X_i, \sum_{i=1}^n X_i^2\right)$.
20 Canonical k-Parameter Exponential Family

Corresponding to $p(x \mid \theta) = h(x)\exp\left[\sum_{j=1}^k \eta_j(\theta)T_j(x) - B(\theta)\right]$, consider:

- Natural Parameter: $\boldsymbol{\eta} = (\eta_1, \ldots, \eta_k)^T$
- Natural Sufficient Statistic: $\mathbf{T}(X) = (T_1(X), \ldots, T_k(X))^T$
- Density function: $q(x \mid \boldsymbol{\eta}) = h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta} - A(\boldsymbol{\eta})\}$, where
  $$A(\boldsymbol{\eta}) = \log\int h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta}\}\,dx \quad \text{or} \quad A(\boldsymbol{\eta}) = \log\left[\sum_{x \in \mathcal{X}} h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta}\}\right]$$
- Natural Parameter space: $\mathcal{E} = \{\boldsymbol{\eta} \in \mathbf{R}^k: -\infty < A(\boldsymbol{\eta}) < \infty\}$.
21 Canonical Exponential Family Examples (k > 1)

Example: Normal family (continued). $P_\theta = N(\mu, \sigma^2)$, with $\Theta = \mathbf{R} \times \mathbf{R}^+ = \{(\mu, \sigma^2)\}$, density as above, and

$$\eta_1(\theta) = \frac{\mu}{\sigma^2} \text{ and } T_1(X) = X; \qquad \eta_2(\theta) = -\frac{1}{2\sigma^2} \text{ and } T_2(X) = X^2;$$
$$B(\theta) = \frac{1}{2}\left(\frac{\mu^2}{\sigma^2} + \log(2\pi\sigma^2)\right); \qquad h(x) = 1.$$

Canonical exponential density: $q(x \mid \boldsymbol{\eta}) = h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta} - A(\boldsymbol{\eta})\}$ with

$$\mathbf{T}(x) = (x, x^2) = (T_1(x), T_2(x)), \qquad \boldsymbol{\eta} = (\eta_1, \eta_2)^T = \left(\frac{\mu}{\sigma^2}, -\frac{1}{2\sigma^2}\right)^T,$$
$$A(\boldsymbol{\eta}) = \frac{1}{2}\left[-\frac{\eta_1^2}{2\eta_2} + \log\left(\frac{\pi}{-\eta_2}\right)\right], \qquad \mathcal{E} = \mathbf{R} \times (-\infty, 0) = \{(\eta_1, \eta_2): A(\boldsymbol{\eta}) \text{ exists}\}.$$
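The canonical $A(\boldsymbol{\eta})$ above should reproduce $B(\theta)$ exactly when evaluated at $\eta_1 = \mu/\sigma^2$, $\eta_2 = -1/(2\sigma^2)$; here is a quick numerical confirmation (our check, arbitrary parameter values):

```python
import math

mu, sigma2 = 1.3, 0.7
eta1, eta2 = mu / sigma2, -1 / (2 * sigma2)

# A(eta) = (1/2)[ -eta1^2/(2 eta2) + log(pi / (-eta2)) ]
A = 0.5 * (-eta1**2 / (2 * eta2) + math.log(math.pi / (-eta2)))

# B(theta) = (1/2)[ mu^2/sigma^2 + log(2 pi sigma^2) ]
B = 0.5 * (mu**2 / sigma2 + math.log(2 * math.pi * sigma2))
```

The two expressions agree identically, since $-\eta_1^2/(2\eta_2) = \mu^2/\sigma^2$ and $\pi/(-\eta_2) = 2\pi\sigma^2$.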
22 Canonical Exponential Family Examples (k > 1): Multinomial Distribution

$\mathbf{X} = (X_1, X_2, \ldots, X_q) \sim \text{Multinomial}(n, \boldsymbol{\theta} = (\theta_1, \theta_2, \ldots, \theta_q))$:

$$p(\mathbf{x} \mid \boldsymbol{\theta}) = \frac{n!}{x_1!\cdots x_q!}\,\theta_1^{x_1}\cdots\theta_q^{x_q}$$

Notes:
- $q$ is a given positive integer, and $\boldsymbol{\theta} = (\theta_1, \ldots, \theta_q)$ satisfies $\sum_{j=1}^q \theta_j = 1$.
- $n$ is a given positive integer, and $\sum_{i=1}^q X_i = n$.
- What is $\Theta$? What is the dimensionality of $\Theta$? What is the Multinomial distribution when $q = 2$?
23 Example: Multinomial Distribution

$$p(\mathbf{x} \mid \boldsymbol{\theta}) = \frac{n!}{x_1!\cdots x_q!}\,\theta_1^{x_1}\theta_2^{x_2}\cdots\theta_q^{x_q}$$
$$= \frac{n!}{x_1!\cdots x_q!}\exp\left\{\log(\theta_1)x_1 + \cdots + \log(\theta_{q-1})x_{q-1} + \log\left(1 - \sum_{j=1}^{q-1}\theta_j\right)\left[n - \sum_{j=1}^{q-1}x_j\right]\right\}$$
$$= h(\mathbf{x})\exp\left\{\sum_{j=1}^{q-1}\eta_j(\boldsymbol{\theta})T_j(\mathbf{x}) - B(\boldsymbol{\theta})\right\}$$

where:
- $h(\mathbf{x}) = \dfrac{n!}{x_1!\cdots x_q!}$
- $\boldsymbol{\eta}(\boldsymbol{\theta}) = (\eta_1(\boldsymbol{\theta}), \eta_2(\boldsymbol{\theta}), \ldots, \eta_{q-1}(\boldsymbol{\theta}))$ with $\eta_j(\boldsymbol{\theta}) = \log\left(\theta_j \big/ \left(1 - \sum_{l=1}^{q-1}\theta_l\right)\right)$, $j = 1, \ldots, q-1$
- $\mathbf{T}(\mathbf{x}) = (x_1, x_2, \ldots, x_{q-1}) = (T_1(\mathbf{x}), T_2(\mathbf{x}), \ldots, T_{q-1}(\mathbf{x}))$
- $B(\boldsymbol{\theta}) = -n\log\left(1 - \sum_{j=1}^{q-1}\theta_j\right)$

For the canonical exponential density:

$$A(\boldsymbol{\eta}) = n\log\left(1 + \sum_{j=1}^{q-1}e^{\eta_j}\right)$$
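A small round-trip check we added: the map $\eta_j = \log(\theta_j/\theta_q)$ is inverted by $\theta_j = e^{\eta_j}/(1 + \sum_l e^{\eta_l})$ (a softmax-style transform), and $A(\boldsymbol{\eta})$ equals $B(\boldsymbol{\theta})$ at matched parameters:

```python
import math

theta = [0.2, 0.5, 0.3]                  # q = 3; theta_q = 0.3 is the reference cell
q = len(theta)

# Natural parameters eta_j = log(theta_j / theta_q), j = 1..q-1
eta = [math.log(theta[j] / theta[q - 1]) for j in range(q - 1)]

# Inverse map: theta_j = e^{eta_j} / (1 + sum_l e^{eta_l}), theta_q = 1/denom
denom = 1 + sum(math.exp(e) for e in eta)
theta_back = [math.exp(e) / denom for e in eta] + [1 / denom]

n = 10
A = n * math.log(denom)                  # A(eta) = n log(1 + sum e^{eta_j})
B = -n * math.log(theta[q - 1])          # B(theta) = -n log(1 - sum_{j<q} theta_j)
```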
24 Outline

1 One-Parameter Exponential Families
2 Multiparameter Exponential Families
3 Building Exponential Families
25 Definition: Submodels

Consider a k-parameter exponential family $\{q(x \mid \boldsymbol{\eta}): \boldsymbol{\eta} \in \mathcal{E} \subset \mathbf{R}^k\}$. A Submodel is an exponential family defined by

$$p(x \mid \theta) = q(x \mid \boldsymbol{\eta}(\theta))$$

where $\theta \in \Theta \subset \mathbf{R}^{k^*}$, $k^* \le k$, and $\boldsymbol{\eta}: \Theta \to \mathbf{R}^k$.

Note: The submodel is specified by $\Theta$. The natural parameters corresponding to $\Theta$ are a subset of the natural parameter space: $\{\boldsymbol{\eta} \in \mathcal{E}: \boldsymbol{\eta} = \boldsymbol{\eta}(\theta), \theta \in \Theta\}$.

Example: $X$ is a discrete r.v. with $\mathcal{X} = \{1, 2, \ldots, k\}$, and $X_1, X_2, \ldots, X_n$ are iid as $X$. Let $\mathcal{P}$ be the set of distributions for $\mathbf{X} = (X_1, \ldots, X_n)$, where the distribution of the $X_i$ is a member of any fixed collection of discrete distributions on $\mathcal{X}$. Then $\mathcal{P}$ is an exponential family (a subset of the Multinomial distributions).
26 Building Models from Affine Transformations: Case I

Consider $\mathcal{P}$, the class of distributions for a r.v. $X$ which is a canonical family generated by the natural sufficient statistic $\mathbf{T}(X)$, a $(k \times 1)$ vector statistic, and $h(\cdot): \mathcal{X} \to \mathbf{R}$. A distribution in $\mathcal{P}$ has density/pmf function

$$p(x \mid \boldsymbol{\eta}) = h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta} - A(\boldsymbol{\eta})\}$$

where $A(\boldsymbol{\eta}) = \log\left[\sum_{x \in \mathcal{X}} h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta}\}\right]$ or $A(\boldsymbol{\eta}) = \log\left[\int_{\mathcal{X}} h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta}\}\,dx\right]$.

Let $M^*$ be an affine transformation from $\mathbf{R}^k$ to $\mathbf{R}^{k^*}$ defined by $M^*(\mathbf{T}) = M\mathbf{T} + \mathbf{b}$, where $M$ ($k^* \times k$) and $\mathbf{b}$ ($k^* \times 1$) are known constants.
27 Building Models from Affine Transformations (continued)

Consider $\mathcal{P}^*$, the class of distributions for a r.v. $X$ generated by the natural sufficient statistic $M^*(\mathbf{T}(X)) = M\mathbf{T}(X) + \mathbf{b}$. Since the distribution of $X$ has density/pmf function $p(x \mid \boldsymbol{\eta}) = h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta} - A(\boldsymbol{\eta})\}$, we can write

$$p(x \mid \boldsymbol{\eta}^*) = h(x)\exp\{[M^*(\mathbf{T}(x))]^T\boldsymbol{\eta}^* - A^*(\boldsymbol{\eta}^*)\}$$
$$= h(x)\exp\{[M\mathbf{T}(x) + \mathbf{b}]^T\boldsymbol{\eta}^* - A^*(\boldsymbol{\eta}^*)\}$$
$$= h(x)\exp\{\mathbf{T}^T(x)[M^T\boldsymbol{\eta}^*] + \mathbf{b}^T\boldsymbol{\eta}^* - A^*(\boldsymbol{\eta}^*)\}$$
$$= h(x)\exp\{\mathbf{T}^T(x)[M^T\boldsymbol{\eta}^*] - (A^*(\boldsymbol{\eta}^*) - \mathbf{b}^T\boldsymbol{\eta}^*)\}$$

a subfamily of $\mathcal{P}$ corresponding to $\Theta^* = \{\boldsymbol{\eta} \in \mathcal{E}: \boldsymbol{\eta} = M^T\boldsymbol{\eta}^*\}$. The density is constant on level sets of $M^*(\mathbf{T}(x))$.
28 Building Models from Affine Transformations: Case II

Consider $\mathcal{P}$, the class of distributions for a r.v. $X$ which is a canonical family generated by the natural sufficient statistic $\mathbf{T}(X)$, a $(k \times 1)$ vector statistic, and $h(\cdot): \mathcal{X} \to \mathbf{R}$, with density/pmf function

$$p(x \mid \boldsymbol{\eta}) = h(x)\exp\{\mathbf{T}^T(x)\boldsymbol{\eta} - A(\boldsymbol{\eta})\}.$$

For $\Theta \subset \mathbf{R}^{k^*}$, define $\boldsymbol{\eta}(\theta) = B\theta \in \mathcal{E} \subset \mathbf{R}^k$, where $B$ is a constant $k \times k^*$ matrix. The submodel of $\mathcal{P}$ is a submodel of the exponential family generated by $B^T\mathbf{T}(X)$ and $h(\cdot)$.
29 Models from Affine Transformations: Logistic Regression

$Y_1, \ldots, Y_n$ are independent Binomial$(n_i, \lambda_i)$, $i = 1, 2, \ldots, n$.

Case 1: Unrestricted $\lambda_i$: $0 < \lambda_i < 1$, $i = 1, \ldots, n$. This is an n-parameter canonical exponential family with $\mathcal{Y}_i = \{0, 1, \ldots, n_i\}$:

- Natural sufficient statistic: $\mathbf{T}(Y_1, \ldots, Y_n) = \mathbf{Y}$
- $h(\mathbf{y}) = \prod_{i=1}^n \binom{n_i}{y_i}\mathbf{1}(\{0 \le y_i \le n_i\})$
- $\eta_i = \log\left(\dfrac{\lambda_i}{1 - \lambda_i}\right)$
- $A(\boldsymbol{\eta}) = \sum_{i=1}^n n_i\log(1 + e^{\eta_i})$
- $p(\mathbf{y} \mid \boldsymbol{\eta}) = h(\mathbf{y})\exp\{\mathbf{y}^T\boldsymbol{\eta} - A(\boldsymbol{\eta})\}$
30 Logistic Regression (continued)

Case 2: For specified levels $x_1 < x_2 < \cdots < x_n$, assume

$$\eta_i(\boldsymbol{\theta}) = \theta_1 + \theta_2 x_i, \quad i = 1, \ldots, n, \quad \text{with } \boldsymbol{\theta} = (\theta_1, \theta_2)^T \in \mathbf{R}^2.$$

Then $\boldsymbol{\eta}(\boldsymbol{\theta}) = B\boldsymbol{\theta}$, where $B$ is the $n \times 2$ matrix

$$B = [\mathbf{1}, \mathbf{x}] = \begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}$$

Set $M = B^T$; this is the 2-parameter canonical exponential family generated by

$$M\mathbf{Y} = \left(\sum_{i=1}^n Y_i,\ \sum_{i=1}^n x_iY_i\right)^T$$

and $h(\mathbf{y})$, with

$$A(\theta_1, \theta_2) = \sum_{i=1}^n n_i\log(1 + \exp(\theta_1 + \theta_2 x_i)).$$
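A worked check we added (all data values are made up): the canonical log-likelihood $\theta_1\sum Y_i + \theta_2\sum x_iY_i - A(\theta_1, \theta_2) + \log h(\mathbf{y})$ should equal the sum of the individual Binomial log-pmfs with $\lambda_i = [1 + e^{-(\theta_1 + \theta_2 x_i)}]^{-1}$:

```python
import math

x = [0.5, 1.0, 1.5, 2.0]              # dose levels (illustrative)
n_i = [5, 5, 5, 5]                    # animals per level
y = [1, 2, 4, 5]                      # deaths per level
theta1, theta2 = -2.0, 1.5

# Canonical form: sufficient statistics and log-normalizer
T1 = sum(y)                                       # sum Y_i
T2 = sum(xi * yi for xi, yi in zip(x, y))         # sum x_i Y_i
A = sum(ni * math.log(1 + math.exp(theta1 + theta2 * xi))
        for ni, xi in zip(n_i, x))
log_h = sum(math.log(math.comb(ni, yi)) for ni, yi in zip(n_i, y))
loglik_canonical = theta1 * T1 + theta2 * T2 - A + log_h

# Direct sum of binomial log-pmfs with logistic success probabilities
def binom_logpmf(yi, ni, lam):
    return (math.log(math.comb(ni, yi)) + yi * math.log(lam)
            + (ni - yi) * math.log(1 - lam))

loglik_direct = sum(
    binom_logpmf(yi, ni, 1 / (1 + math.exp(-(theta1 + theta2 * xi))))
    for yi, ni, xi in zip(y, n_i, x))
```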
31 Logistic Regression (continued): Medical Experiment

- $x_i$ measures toxicity of the drug.
- $n_i$ = number of animals subjected to toxicity level $x_i$.
- $Y_i$ = number of animals dying out of the $n_i$ when exposed to the drug at level $x_i$.

Assumptions:
- Each animal has a random toxicity threshold $X$, and death results iff a drug level at or above $X$ is applied.
- Animals respond independently to drug effects.
- The distribution of $X$ is logistic:
  $$P(X \le x) = [1 + \exp(-(\theta_1 + \theta_2 x))]^{-1}, \qquad \log\left(\frac{P(X \le x)}{1 - P(X \le x)}\right) = \theta_1 + \theta_2 x.$$
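The threshold story can be simulated directly (our sketch; seed and parameters are arbitrary): draw logistic thresholds by inverse-CDF, declare death at dose $x$ whenever the threshold is at most $x$, and compare the death frequency to the logistic CDF:

```python
import math
import random

random.seed(2)
theta1, theta2 = -1.0, 2.0

def logistic_threshold():
    # Inverse-CDF draw from P(X <= x) = 1 / (1 + exp(-(theta1 + theta2 x)))
    u = random.random()
    return (math.log(u / (1 - u)) - theta1) / theta2

dose = 0.8
n_animals = 100_000
deaths = sum(logistic_threshold() <= dose for _ in range(n_animals))

p_hat = deaths / n_animals
p_true = 1 / (1 + math.exp(-(theta1 + theta2 * dose)))  # logistic CDF at the dose
```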
32 Building Exponential Models: Additional Topics

- Curved exponential families, e.g., Gaussian with fixed signal-to-noise ratio
- Location-scale regression
- Supermodels
- Exponential structure preserved under random (iid) sampling
33 MIT OpenCourseWare Mathematical Statistics Spring 2016 For information about citing these materials or our Terms of Use, visit:
A normal distribution and other density functions involving exponential forms play the most important role in probability and statistics. They are related in a certain way, as summarized in a diagram later
More information1.6 Families of Distributions
Your text 1.6. FAMILIES OF DISTRIBUTIONS 15 F(x) 0.20 1.0 0.15 0.8 0.6 Density 0.10 cdf 0.4 0.05 0.2 0.00 a b c 0.0 x Figure 1.1: N(4.5, 2) Distribution Function and Cumulative Distribution Function for
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationDistributions of Functions of Random Variables. 5.1 Functions of One Random Variable
Distributions of Functions of Random Variables 5.1 Functions of One Random Variable 5.2 Transformations of Two Random Variables 5.3 Several Random Variables 5.4 The Moment-Generating Function Technique
More informationPhysics 403 Probability Distributions II: More Properties of PDFs and PMFs
Physics 403 Probability Distributions II: More Properties of PDFs and PMFs Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Last Time: Common Probability Distributions
More informationWhen is MLE appropriate
When is MLE appropriate As a rule of thumb the following to assumptions need to be fulfilled to make MLE the appropriate method for estimation: The model is adequate. That is, we trust that one of the
More informationST5215: Advanced Statistical Theory
Department of Statistics & Applied Probability Wednesday, October 19, 2011 Lecture 17: UMVUE and the first method of derivation Estimable parameters Let ϑ be a parameter in the family P. If there exists
More informationMultivariate Distributions
Chapter 2 Multivariate Distributions and Transformations 2.1 Joint, Marginal and Conditional Distributions Often there are n random variables Y 1,..., Y n that are of interest. For example, age, blood
More informationReview for the previous lecture
Lecture 1 and 13 on BST 631: Statistical Theory I Kui Zhang, 09/8/006 Review for the previous lecture Definition: Several discrete distributions, including discrete uniform, hypergeometric, Bernoulli,
More informationSDS 321: Introduction to Probability and Statistics
SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/
More informationContinuous Distributions
Chapter 3 Continuous Distributions 3.1 Continuous-Type Data In Chapter 2, we discuss random variables whose space S contains a countable number of outcomes (i.e. of discrete type). In Chapter 3, we study
More information4 Moment generating functions
4 Moment generating functions Moment generating functions (mgf) are a very powerful computational tool. They make certain computations much shorter. However, they are only a computational tool. The mgf
More informationChapter 3: Unbiased Estimation Lecture 22: UMVUE and the method of using a sufficient and complete statistic
Chapter 3: Unbiased Estimation Lecture 22: UMVUE and the method of using a sufficient and complete statistic Unbiased estimation Unbiased or asymptotically unbiased estimation plays an important role in
More informationt x 1 e t dt, and simplify the answer when possible (for example, when r is a positive even number). In particular, confirm that EX 4 = 3.
Mathematical Statistics: Homewor problems General guideline. While woring outside the classroom, use any help you want, including people, computer algebra systems, Internet, and solution manuals, but mae
More informationChapter 4. Chapter 4 sections
Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation
More informationDefinition 1.1 (Parametric family of distributions) A parametric distribution is a set of distribution functions, each of which is determined by speci
Definition 1.1 (Parametric family of distributions) A parametric distribution is a set of distribution functions, each of which is determined by specifying one or more values called parameters. The number
More informationMcGill University. Faculty of Science. Department of Mathematics and Statistics. Part A Examination. Statistics: Theory Paper
McGill University Faculty of Science Department of Mathematics and Statistics Part A Examination Statistics: Theory Paper Date: 10th May 2015 Instructions Time: 1pm-5pm Answer only two questions from Section
More informationWrite your Registration Number, Test Centre, Test Code and the Number of this booklet in the appropriate places on the answersheet.
2016 Booklet No. Test Code : PSA Forenoon Questions : 30 Time : 2 hours Write your Registration Number, Test Centre, Test Code and the Number of this booklet in the appropriate places on the answersheet.
More informationMIT Spring 2016
MIT 18.655 Dr. Kempthorne Spring 2016 1 MIT 18.655 Outline 1 2 MIT 18.655 Decision Problem: Basic Components P = {P θ : θ Θ} : parametric model. Θ = {θ}: Parameter space. A{a} : Action space. L(θ, a) :
More informationStatistics 3657 : Moment Generating Functions
Statistics 3657 : Moment Generating Functions A useful tool for studying sums of independent random variables is generating functions. course we consider moment generating functions. In this Definition
More informationProbability and Statistics Notes
Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1
More information