The Prediction of Random Effects in Generalized Linear Mixed Model

The Prediction of Random Effects in Generalized Linear Mixed Model
Wenxue Ge, Department of Mathematics and Statistics, University of Victoria, Canada
Nov 20, 2017, @University of Victoria

Outline
1 Background: GLM and GLMM
2 Best Linear Unbiased Prediction (BLUP)
3 First Lactation Yields
4 Expansions

GLM and GLMM: Exponential Family
A density in the exponential family can be written as
  f(y; θ) = s(y) t(θ) exp[a(y) b(θ)],
or equivalently
  f(y; θ) = exp[a(y) b(θ) + c(θ) + d(y)],
where s(y) = exp[d(y)] and t(θ) = exp[c(θ)].
The form is canonical if a(y) = y; b(θ) is then the natural parameter.
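
As a concrete instance of the canonical form, the Poisson(λ) pmf can be written as exp[y log λ - λ - log y!], so a(y) = y, b(θ) = log λ, c(θ) = -λ and d(y) = -log y!. The following minimal Python sketch (my own illustration, not part of the talk) checks that decomposition numerically.

```python
# Check that the Poisson pmf matches the canonical form
# f(y; theta) = exp[y*b(theta) + c(theta) + d(y)], with
# b = log(lambda), c = -lambda, d(y) = -log(y!).
import numpy as np
from scipy.special import gammaln
from scipy.stats import poisson

lam = 3.5
y = np.arange(0, 10)

canonical = np.exp(y * np.log(lam) - lam - gammaln(y + 1))  # exp[y*b + c + d]
print(np.allclose(canonical, poisson.pmf(y, lam)))          # True
```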

GLM and GLMM: Generalized Linear Model
- Y_1, ..., Y_N are from the exponential family:
  1. each Y_i has the canonical form f(y_i; θ_i) = exp[y_i b_i(θ_i) + c_i(θ_i) + d_i(y_i)];
  2. the Y_i all share the same distributional form.
- Link function: g(µ_i) = x_i^T β, with g monotone and differentiable.
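
For the canonical Poisson GLM with log link, the maximum likelihood fit can be computed by iteratively reweighted least squares. The sketch below is an illustration under assumptions of my own (simulated data, a fixed number of iterations); it is not code from the talk.

```python
# Fit a Poisson GLM with canonical log link by IRLS, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])    # design matrix
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))                   # Poisson response

beta = np.zeros(X.shape[1])
for _ in range(25):                                      # IRLS iterations
    mu = np.exp(X @ beta)                                # inverse link
    W = mu                                               # working weights, diag(mu)
    z = X @ beta + (y - mu) / mu                         # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(beta)   # should be close to beta_true
```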

GLM and GLMM: Generalized Linear Mixed Model
- Random effects: incorporate correlation and allow broader inference.
- General model: conditional on the random effects u, the responses follow a GLM with
    g(E[y_i | u]) = x_i^T β + z_i^T u.
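
The sketch below simulates from one common special case of this model, a Poisson GLMM with a random intercept per group. The group structure and parameter values are illustrative choices of mine, not from the talk.

```python
# Simulate a Poisson GLMM with a random intercept per group:
# log E[y_ij | u_i] = x_ij' beta + u_i, with u_i ~ N(0, sigma_u^2).
import numpy as np

rng = np.random.default_rng(1)
n_groups, n_per = 10, 20
beta = np.array([0.2, 0.5])
sigma_u = 0.7

group = np.repeat(np.arange(n_groups), n_per)
X = np.column_stack([np.ones(n_groups * n_per), rng.normal(size=n_groups * n_per)])
u = rng.normal(scale=sigma_u, size=n_groups)             # shared within a group
eta = X @ beta + u[group]                                # linear predictor
y = rng.poisson(np.exp(eta))                             # counts, correlated within groups
```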

Best Linear Unbiased Prediction, or BLUP
- "Best": minimum MSE, though not over all predictors (only over linear unbiased ones).
- Model: y = Xβ + Zu + e, with E(u) = 0, E(e) = 0 and
    Var[(u, e)] = blockdiag(G, R) σ², i.e. Var(u) = Gσ², Var(e) = Rσ², Cov(u, e) = 0.
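
The derivations that follow work with this linear mixed model. The small numpy sketch below (dimensions and the particular G and R are illustrative choices of mine, not values from the talk) checks by simulation that the implied moments are Var(y) = (ZGZ^T + R)σ² and Cov(y, u) = ZGσ².

```python
# Simulate y = X beta + Z u + e with Var(u) = G*sigma2, Var(e) = R*sigma2, Cov(u, e) = 0,
# and check the implied Var(y) and Cov(y, u) by Monte Carlo.
import numpy as np

rng = np.random.default_rng(2)
n, q, N, sigma2 = 8, 3, 20000, 1.5
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
Z = np.eye(q)[rng.integers(0, q, size=n)]                  # 8 x 3 group-indicator matrix
G, R = 0.5 * np.eye(q), np.eye(n)
beta = np.array([1.0, 0.2])

U = rng.multivariate_normal(np.zeros(q), sigma2 * G, size=N)   # N draws of u
E = rng.multivariate_normal(np.zeros(n), sigma2 * R, size=N)   # N draws of e
Y = U @ Z.T + E + X @ beta                                     # N draws of y

print(np.round(np.cov(Y, rowvar=False) - sigma2 * (Z @ G @ Z.T + R), 1))  # ~ zero matrix
Yc, Uc = Y - Y.mean(0), U - U.mean(0)
print(np.round(Yc.T @ Uc / (N - 1) - sigma2 * Z @ G, 1))                  # ~ zero matrix
```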

Henderson's Justification: assume u ~ N_q(0, Gσ²) and e ~ N_n(0, Rσ²). The joint density of (y, u) is
  f(y, u) = g(y | u) h(u)                                                        (1)
          = (2πσ²)^{-(n+q)/2} det([G, 0; 0, R])^{-1/2}
            exp{ -(1/(2σ²)) [ u^T G^{-1} u + (y - Xβ - Zu)^T R^{-1} (y - Xβ - Zu) ] }.

Bayesian derivation: with u ~ N_q(0, Gσ²) and a uniform (improper) prior on β, the posterior density of (β, u) is proportional to (1).

Maximizing (1) over β and u amounts to minimizing
  u^T G^{-1} u + (y - Xβ - Zu)^T R^{-1} (y - Xβ - Zu).
Setting the derivatives with respect to β and u to zero gives the mixed model equations
  X^T R^{-1} X β̂ + X^T R^{-1} Z û = X^T R^{-1} y                                  (2)
  Z^T R^{-1} X β̂ + (Z^T R^{-1} Z + G^{-1}) û = Z^T R^{-1} y                        (3)
Eliminating û from (3),
  û = (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1} (y - Xβ̂)                             (4)
and substituting into (2) gives
  X^T R^{-1} X β̂ - X^T R^{-1} Z (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1} X β̂
    = X^T R^{-1} y - X^T R^{-1} Z (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1} y.
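
Numerically, (2)-(3) are just one (p+q) x (p+q) linear system in (β̂, û). The sketch below builds and solves it for simulated data; all numbers are illustrative choices of mine, not from the talk.

```python
# Build and solve Henderson's mixed model equations on simulated data.
import numpy as np

rng = np.random.default_rng(3)
n, p, q = 30, 2, 5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Z = np.eye(q)[rng.integers(0, q, size=n)]                  # group-indicator matrix
G, R = 0.5 * np.eye(q), np.eye(n)
u = rng.multivariate_normal(np.zeros(q), G)
y = X @ np.array([1.0, -0.5]) + Z @ u + rng.normal(size=n)

Ri, Gi = np.linalg.inv(R), np.linalg.inv(G)
# Left-hand side of the mixed model equations as one (p+q) x (p+q) block matrix
lhs = np.block([[X.T @ Ri @ X, X.T @ Ri @ Z],
                [Z.T @ Ri @ X, Z.T @ Ri @ Z + Gi]])
rhs = np.concatenate([X.T @ Ri @ y, Z.T @ Ri @ y])
sol = np.linalg.solve(lhs, rhs)
beta_hat, u_hat = sol[:p], sol[p:]
print(beta_hat, u_hat)
```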

Simplifying,
  X^T (R^{-1} - R^{-1} Z (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1}) X β̂
    = X^T (R^{-1} - R^{-1} Z (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1}) y,
and the identity
  (R + ZGZ^T)^{-1} = R^{-1} - R^{-1} Z (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1}
gives
  X^T (R + ZGZ^T)^{-1} X β̂ = X^T (R + ZGZ^T)^{-1} y.                              (5)
Solutions (using û = (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1} (y - Xβ̂)):
  β̂ = (X^T (R + ZGZ^T)^{-1} X)^{-1} X^T (R + ZGZ^T)^{-1} y                         (6)
  û = (Z^T R^{-1} Z + G^{-1})^{-1} { Z^T R^{-1} - Z^T R^{-1} X [X^T (R + ZGZ^T)^{-1} X]^{-1} X^T (R + ZGZ^T)^{-1} } y   (7)
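
The matrix identity and the equivalence between the mixed model equations and the closed forms (4) and (6) are easy to check numerically. The sketch below does so on simulated data (illustrative values of my own, not from the talk).

```python
# Verify (R + Z G Z')^{-1} = R^{-1} - R^{-1} Z (Z'R^{-1}Z + G^{-1})^{-1} Z'R^{-1},
# and compute beta_hat and u_hat from the closed forms (6) and (4).
import numpy as np

rng = np.random.default_rng(4)
n, q = 30, 5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Z = np.eye(q)[rng.integers(0, q, size=n)]
G, R = 0.5 * np.eye(q), np.eye(n)
y = X @ np.array([1.0, -0.5]) + Z @ rng.multivariate_normal(np.zeros(q), G) + rng.normal(size=n)

Ri, Gi = np.linalg.inv(R), np.linalg.inv(G)
V = R + Z @ G @ Z.T
lhs_identity = np.linalg.inv(V)
rhs_identity = Ri - Ri @ Z @ np.linalg.inv(Z.T @ Ri @ Z + Gi) @ Z.T @ Ri
print(np.allclose(lhs_identity, rhs_identity))            # True

Vi = np.linalg.inv(V)
beta_gls = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)                       # equation (6)
u_blup = np.linalg.solve(Z.T @ Ri @ Z + Gi, Z.T @ Ri @ (y - X @ beta_gls))   # equation (4)
print(beta_gls, u_blup)
```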

BLUP(b^T β + c^T u) via Lagrange multipliers
- b and c are given.
- Seek λ such that λ^T y is unbiased for b^T β + c^T u and minimizes
    Var(λ^T y - (b^T β + c^T u)) + 2m^T (X^T λ - X^T b)
  = λ^T Var(y) λ + c^T Var(u) c - 2 λ^T Cov(y, u) c + 2m^T (X^T λ - X^T b),
  where 2m is the vector of Lagrange multipliers enforcing the unbiasedness constraint X^T λ = X^T b.
- Solving yields λ^T y as the BLUP:
    BLUP(b^T β + c^T u) = b^T β̂ + c^T (Z^T R^{-1} Z + G^{-1})^{-1} Z^T R^{-1} (y - Xβ̂).

Harville and Robinson's derivation (more intuitive)
- Target linear combination: b^T β + c^T u, with b and c given.
- Any linear unbiased predictor can be written as b^T β̂ + c^T û + a^T y, where X^T a = 0.

Robinson continued: the cross terms vanish.
  E[y y^T] = X β β^T X^T + Z G Z^T σ² + R σ²
  E[u y^T] = G Z^T σ²
For any a with X^T a = 0,
  E[(β̂ - β) y^T a]
    = [X^T (R + ZGZ^T)^{-1} X]^{-1} X^T (ZGZ^T + R)^{-1} E[y y^T] a - β E[y^T] a
    = β β^T X^T a + [X^T (R + ZGZ^T)^{-1} X]^{-1} X^T σ² a - β β^T X^T a
    = 0,
since X^T a = 0. The same steps give E[(û - u) y^T a] = 0.

"Best" E{[b T (ˆβ β) + c T (û u)a T y] [b T (ˆβ β) + c T (û u)a T y] T } = E{[b T (ˆβ β) + c T (û u)][b T (ˆβ β) + c T (û u)] T } + E[a T yy T a] + E{[b T (ˆβ β) + c T (û u)]y T a} + E{a T y[b T (ˆβ β) + c T (û u)]} = E{[b T (ˆβ β) + c T (û u)][b T (ˆβ β) + c T (û u)] T } + E[a T yy T a]

First Lactation Yields
Model: y_{9x1} = X_{9x3} β_{3x1} + Z_{9x4} u_{4x1} + e_{9x1}
  y: yield
  β: herd (fixed effects)
  u: sire (random effects)

First Lactation Yields: mixed model equations
  X^T R^{-1} X β̂ + X^T R^{-1} Z û = X^T R^{-1} y                                  (2)
  Z^T R^{-1} X β̂ + (Z^T R^{-1} Z + G^{-1}) û = Z^T R^{-1} y                        (3)
with R_{9x9} = I_9 and G_{4x4} = 0.1 I_4.
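
The slide's numeric y, X and Z are not reproduced in this transcript, so the sketch below uses made-up yields and herd/sire assignments purely to show how the equations would be set up and solved; only the dimensions (9 records, 3 herds, 4 sires) and R = I_9, G = 0.1 I_4 come from the talk.

```python
# Set up and solve the mixed model equations for a hypothetical herd/sire data set.
import numpy as np

herd = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])       # hypothetical herd labels
sire = np.array([0, 1, 2, 0, 1, 3, 2, 3, 3])       # hypothetical sire labels
y = np.array([5.0, 4.5, 6.0, 5.5, 5.0, 4.0, 6.5, 5.5, 5.0])  # hypothetical yields

X = np.eye(3)[herd]                                 # 9 x 3 herd incidence matrix
Z = np.eye(4)[sire]                                 # 9 x 4 sire incidence matrix
R, G = np.eye(9), 0.1 * np.eye(4)                   # as on the slide

Ri, Gi = np.linalg.inv(R), np.linalg.inv(G)
lhs = np.block([[X.T @ Ri @ X, X.T @ Ri @ Z],
                [Z.T @ Ri @ X, Z.T @ Ri @ Z + Gi]])
rhs = np.concatenate([X.T @ Ri @ y, Z.T @ Ri @ y])
sol = np.linalg.solve(lhs, rhs)
print("herd effects:", sol[:3])
print("sire BLUPs:  ", sol[3:])
```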

First Lactation Yields: Solutions

Expansions
- Why have random effects?
- Goldberger's derivation: does not require normality.
- Restricted maximum likelihood (REML): estimation of the variance parameters (sketched below).
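
A minimal sketch of the profiled REML criterion for a single variance-component model, l_R(θ) = -½[log|V| + log|X^T V^{-1} X| + (y - Xβ̂)^T V^{-1} (y - Xβ̂)] with V = σ²_e I + σ²_u Z Z^T; the parameterization, the optimizer and the simulated data are my own choices, not the talk's.

```python
# Negative REML criterion for a one-random-effect model, optimized numerically.
import numpy as np
from scipy.optimize import minimize

def neg_reml(log_theta, y, X, Z):
    s2u, s2e = np.exp(log_theta)                       # variance components, kept > 0
    V = s2e * np.eye(len(y)) + s2u * Z @ Z.T
    Vi = np.linalg.inv(V)
    bhat = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)
    r = y - X @ bhat
    _, logdetV = np.linalg.slogdet(V)
    _, logdetXVX = np.linalg.slogdet(X.T @ Vi @ X)
    return 0.5 * (logdetV + logdetXVX + r @ Vi @ r)

# Usage on simulated data (illustrative):
rng = np.random.default_rng(6)
n, q = 60, 6
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Z = np.eye(q)[rng.integers(0, q, size=n)]
y = X @ np.array([1.0, 0.5]) + Z @ rng.normal(scale=0.8, size=q) + rng.normal(size=n)
fit = minimize(neg_reml, x0=np.zeros(2), args=(y, X, Z))
print(np.exp(fit.x))   # estimated (sigma2_u, sigma2_e)
```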
