TWO WAY ANOVA

Next we consider the case when we have two factors, i.e. two categorizations, e.g. lab and manufacturer. If there are I levels in the first factor and J levels in the second factor, then we can think of this situation as one where there are I × J levels of the combined factors.

Notation

Notation-wise, we simply add another subscript to the response; that is, y now has a triple subscript, where y_{i,j,k} represents the measurement on the kth subject that belongs to both the ith group (lab) of the first factor and the jth group (manufacturer) of the second factor, i = 1, ..., I, j = 1, ..., J, and k = 1, ..., n_{i,j}. For simplicity, we will only work with the special case of n_{i,j} = K, i.e. all subgroups have the same number of responses. Then we write the model as follows:

y_{i,j,k} = α + η_{i,j} + E_{i,j,k}

So, when i = 1 and j = 1, y_{1,1,k} = α + η_{1,1} + E_{1,1,k}, and when i = 2 and j = 1, y_{2,1,k} = α + η_{2,1} + E_{2,1,k}.

Again, we need a constraint because our model is over-parameterized. We add the constraint that

Σ_{i,j} η_{i,j} = 0.
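To make the indexing and the constrained fit concrete, here is a minimal Python sketch (not from the notes; the dimensions, parameter values and noise level are invented). For a balanced layout, the least squares estimates under the sum-to-zero constraint are simply the grand mean for α and cell mean minus grand mean for η_{i,j}.

```python
# Sketch only: simulated balanced two-way data, invented numbers.
import numpy as np

rng = np.random.default_rng(0)
I, J, K = 7, 4, 3                  # labs, manufacturers, replicates per cell
alpha = 4.0
eta = rng.normal(size=(I, J))
eta -= eta.mean()                  # impose the constraint sum_{i,j} eta_{i,j} = 0

# y[i, j, k] = alpha + eta[i, j] + E[i, j, k]
y = alpha + eta[:, :, None] + rng.normal(scale=0.1, size=(I, J, K))

# Balanced layout: alpha_hat = grand mean, eta_hat[i, j] = cell mean - grand mean.
alpha_hat = y.mean()
eta_hat = y.mean(axis=2) - alpha_hat
print(alpha_hat, np.abs(eta_hat - eta).max())   # near 4.0, and near 0
```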

A Simpler Sub-model

In our example of the study of the measurement process, we find that with 7 labs and 4 manufacturers, we have 28 levels. If the effect of the lab is the same regardless of which manufacturer the tablets are coming from, and if the effect of the manufacturer is the same regardless of which lab is measuring the tablets, then we could express the model as

y_{i,j,k} = α + β_i + γ_j + E_{i,j,k}

Note that now we have only I + J levels, rather than I × J. This model is called an additive model. It puts structure on the levels. That is, the difference between measurements at Lab 1 and Lab 2 of tablets from Manufacturer A is β_2 - β_1, and this difference is the same for the measurements at Labs 1 and 2 for tablets from Manufacturer B, i.e. there is no interaction between lab and manufacturer.

Degrees of Freedom

To see that the additive model is a sub-model of the full model, we express the full model as follows:

y_{i,j,k} = α + β_i + γ_j + ν_{i,j} + E_{i,j,k}

Now again, we need to put constraints on the parameterization. If we think about it from the geometric perspective, we see that the 1 vector lies in both the space spanned by the lab indicators (the e_i) and the space spanned by the manufacturer indicators (the u_j). So, the 1 vector, I - 1 of the e_i vectors, and J - 1 of the u_j vectors are all that is needed for the additive part of the model. As for the rest, suppose we have vectors v_{i,j} that indicate whether a response belongs in group i, j or not. Note that Σ_j v_{i,j} = e_i, and that Σ_i v_{i,j} = u_j. So we need only (I - 1)(J - 1) of these I × J vectors. All together, that gives us 1 + (I - 1) + (J - 1) + (I - 1)(J - 1) = IJ of the 1 + I + J + IJ vectors.
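This dimension count can be checked numerically. The sketch below (illustrative only, with the same arbitrary dimensions as in the earlier snippet) stacks all 1 + I + J + IJ indicator columns and confirms that together they span only an IJ-dimensional space.

```python
# Sketch only: build the indicator columns and check the dimension they span.
import numpy as np

I, J, K = 7, 4, 3
cells = [(i, j) for i in range(I) for j in range(J) for _ in range(K)]
n = len(cells)                               # n = I * J * K rows

ones = np.ones((n, 1))                       # the 1 vector
e = np.zeros((n, I))                         # lab indicators e_i
u = np.zeros((n, J))                         # manufacturer indicators u_j
v = np.zeros((n, I * J))                     # cell indicators v_{i,j}
for r, (i, j) in enumerate(cells):
    e[r, i] = u[r, j] = v[r, i * J + j] = 1.0

X = np.hstack([ones, e, u, v])               # 1 + I + J + I*J = 40 columns
print(X.shape[1], np.linalg.matrix_rank(X))  # prints 40 28, and 28 = I*J
```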

If we are to put all of the parameters in, then we must add constraints. Traditionally these are

Σ_i β_i = 0
Σ_j γ_j = 0
Σ_i ν_{i,j} = 0, for each j
Σ_j ν_{i,j} = 0, for each i

How many constraints do we have?

Sums of Squares

The ANOVA table of the sums of squared deviations helps us assess whether the simple additive model is adequate to describe the variation in the means, and whether there is a lab effect or a manufacturer effect (i.e. whether all of the β_i = 0 or all of the γ_j = 0). The decomposition of the sums of squares is a bit more complex here. First we need to introduce some more notation:

ȳ_{..} = (1/(IJK)) Σ_{i,j,k} y_{i,j,k}

ȳ_{i.} = (1/(JK)) Σ_{j,k} y_{i,j,k}, for i = 1, ..., I

ȳ_{.j} = (1/(IK)) Σ_{i,k} y_{i,j,k}, for j = 1, ..., J

ȳ_{ij} = (1/K) Σ_k y_{i,j,k}
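In code, these four averages are just means of a three-way array over different axes; a small sketch (the array here is random, not the notes' data):

```python
# Sketch only: the four averages for a balanced y of shape (I, J, K).
import numpy as np

I, J, K = 7, 4, 3
y = np.random.default_rng(1).normal(size=(I, J, K))

ybar_grand = y.mean()              # ybar_.. : average of all I*J*K responses
ybar_lab   = y.mean(axis=(1, 2))   # ybar_i. : one average per lab, i = 1, ..., I
ybar_man   = y.mean(axis=(0, 2))   # ybar_.j : one average per manufacturer
ybar_cell  = y.mean(axis=2)        # ybar_ij : the I*J cell means
```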

Now let's look at the sums of squares:

Σ_{i,j,k} (y_{i,j,k} - ȳ_{..})²

To begin, let's add and subtract the IJ means ȳ_{ij}:

Σ_{i,j,k} (y_{i,j,k} - ȳ_{..})² = Σ_{i,j,k} (y_{i,j,k} - ȳ_{ij})² + Σ_{i,j,k} (ȳ_{ij} - ȳ_{..})²

Show that the cross product term is 0.

We call the first sum on the right-hand side of the equation the error sum of squares, or SS_E. We want to further decompose the second term,

Σ_{i,j,k} (ȳ_{ij} - ȳ_{..})²

What do we add and subtract, ȳ_{i.} or ȳ_{.j}? Both:

Σ_{i,j,k} (ȳ_{ij} - ȳ_{i.} - ȳ_{.j} + ȳ_{..} + ȳ_{i.} - ȳ_{..} + ȳ_{.j} - ȳ_{..})²
  = K Σ_{i,j} (ȳ_{ij} - ȳ_{i.} - ȳ_{.j} + ȳ_{..})² + JK Σ_i (ȳ_{i.} - ȳ_{..})² + IK Σ_j (ȳ_{.j} - ȳ_{..})²

The three terms on the right-hand side of the equality are called the interaction sum of squares, or SS_LM, the sum of squares due to Lab, or SS_L, and the sum of squares due to Manufacturer, or SS_M.
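As a numerical sanity check (simulated data, not the notes' measurements), the cross products indeed vanish and the four pieces add back up to the total sum of squares:

```python
# Sketch only: verify SS_total = SS_E + SS_LM + SS_L + SS_M on simulated data.
import numpy as np

I, J, K = 7, 4, 3
y = np.random.default_rng(2).normal(size=(I, J, K))

gbar = y.mean()
lbar = y.mean(axis=(1, 2), keepdims=True)      # ybar_i.
mbar = y.mean(axis=(0, 2), keepdims=True)      # ybar_.j
cbar = y.mean(axis=2, keepdims=True)           # ybar_ij

SS_total = ((y - gbar) ** 2).sum()
SS_E     = ((y - cbar) ** 2).sum()
SS_LM    = K * ((cbar - lbar - mbar + gbar) ** 2).sum()
SS_L     = J * K * ((lbar - gbar) ** 2).sum()
SS_M     = I * K * ((mbar - gbar) ** 2).sum()

print(np.isclose(SS_total, SS_E + SS_LM + SS_L + SS_M))   # True
```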

Show that the cross products are all 0.

ANOVA Table

Arrange the sums of squares into an ANOVA table. With I = 7 labs, J = 4 manufacturers, and K responses per lab-manufacturer combination, the degrees of freedom are:

Source          DF                     Sum of Squares    Mean Square    F-statistic
Labs            I - 1 = 6
Manufacturer    J - 1 = 3
Interaction     (I - 1)(J - 1) = 18
Error           IJ(K - 1)
Total           IJK - 1

The first F statistic is used to test whether there is a difference between labs, i.e. whether there is a lab effect. The second F statistic is used to test whether there is a difference between manufacturers. The third is used to test the additive model, i.e. whether there is an interaction between lab and manufacturer.
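For completeness, here is a sketch of how the mean squares and the three F statistics in this table could be computed (simulated data and dimensions again; each F statistic is the effect mean square divided by the error mean square, and the p-values from the F distribution are included for convenience rather than taken from the notes):

```python
# Sketch only: mean squares, F statistics and p-values for the three tests.
import numpy as np
from scipy.stats import f

I, J, K = 7, 4, 3
y = np.random.default_rng(2).normal(size=(I, J, K))

gbar = y.mean()
lbar = y.mean(axis=(1, 2), keepdims=True)
mbar = y.mean(axis=(0, 2), keepdims=True)
cbar = y.mean(axis=2, keepdims=True)

SS = {"Labs":         J * K * ((lbar - gbar) ** 2).sum(),
      "Manufacturer": I * K * ((mbar - gbar) ** 2).sum(),
      "Interaction":  K * ((cbar - lbar - mbar + gbar) ** 2).sum()}
DF = {"Labs": I - 1, "Manufacturer": J - 1, "Interaction": (I - 1) * (J - 1)}

df_E = I * J * (K - 1)
MS_E = ((y - cbar) ** 2).sum() / df_E           # error mean square

for source in SS:
    F_stat = (SS[source] / DF[source]) / MS_E   # effect MS over error MS
    p_value = f.sf(F_stat, DF[source], df_E)
    print(source, round(F_stat, 3), round(p_value, 3))
```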