# INFERENCE FOR CONTRASTS (Chapter 4)


Recall: A contrast is a linear combination of effects with coefficients summing to zero: $\sum_i c_i \tau_i$, where $\sum_i c_i = 0$. (Here $\tau_i$ is the effect of the $i$th treatment and $\bar{Y}_{i\cdot}$ is the sample mean for the $i$th treatment group.)

Specific types of contrasts of interest include:

- Differences in effects
- Differences in means

A special type of difference in means is often of interest in an experiment with a control group: the difference between the control group effect and the mean of the other treatment effects.

Recall: The least squares estimator of the contrast $\sum_i c_i \tau_i$ is $\sum_i c_i \bar{Y}_{i\cdot}$.

It's unbiased:

$$E\Big(\sum_i c_i \bar{Y}_{i\cdot}\Big) = \sum_i c_i E(\bar{Y}_{i\cdot}) = \sum_i c_i (\mu + \tau_i) = \mu \sum_i c_i + \sum_i c_i \tau_i = \sum_i c_i \tau_i,$$

since $\sum_i c_i \mu = \mu \sum_i c_i = 0$.
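As a concrete illustration, the least squares estimate of a contrast can be computed directly from the group means. A minimal sketch in Python, using made-up data (not from any experiment in these notes):

```python
import numpy as np

# Hypothetical data: four treatment groups (made-up values).
groups = [
    np.array([27.1, 26.8, 27.5]),  # treatment 1
    np.array([25.9, 26.2, 26.0]),  # treatment 2
    np.array([31.3, 30.8, 31.1]),  # treatment 3
    np.array([30.2, 30.6, 30.4]),  # treatment 4
]

# Contrast coefficients must sum to zero; here the contrast is tau_1 - tau_2.
c = np.array([1.0, -1.0, 0.0, 0.0])
assert abs(c.sum()) < 1e-12, "not a contrast"

# Least squares estimator: sum_i c_i * ybar_i, with ybar_i the i-th group mean.
ybar = np.array([g.mean() for g in groups])
estimate = c @ ybar
print(estimate)  # approximately 1.1
```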

Recall two model assumptions: $Y_{it} = \mu + \tau_i + \epsilon_{it}$, where the $\epsilon_{it}$ are independent random variables. This implies that the $Y_{it}$'s are independent. Since each $\bar{Y}_{i\cdot}$ is a linear combination of the $Y_{it}$'s for the $i$th treatment group only, it follows that the $\bar{Y}_{i\cdot}$'s are independent. Thus, with $r_i$ observations on treatment $i$,

$$\mathrm{Var}\Big(\sum_i c_i \bar{Y}_{i\cdot}\Big) = \sum_i c_i^2 \,\mathrm{Var}(\bar{Y}_{i\cdot}) = \sum_i c_i^2 \frac{\sigma^2}{r_i} = \sigma^2 \sum_i \frac{c_i^2}{r_i}.$$

Also recall: for each $i$ and $t$, $\epsilon_{it} \sim N(0, \sigma^2)$. These assumptions imply $Y_{it} \sim N(\mu + \tau_i, \sigma^2)$. Since the $Y_{it}$'s are independent, each $\bar{Y}_{i\cdot}$, as a linear combination of independent normal random variables, is also normal. Since the contrast estimator is a linear combination of the independent normal random variables $\bar{Y}_{i\cdot}$, it too must be normal. Summarizing:

$$\sum_i c_i \bar{Y}_{i\cdot} \sim N\Big(\sum_i c_i \tau_i,\ \sigma^2 \sum_i \frac{c_i^2}{r_i}\Big).$$
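The variance formula above can be checked by simulation: generate many samples from the model, compute the contrast estimator each time, and compare the empirical variance to $\sigma^2 \sum_i c_i^2 / r_i$. A sketch, with illustrative parameter values that do not come from these notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from any experiment in the text).
mu = 10.0
tau = np.array([0.0, 1.0, -1.0])   # treatment effects
sigma = 2.0                        # error standard deviation
r = np.array([4, 6, 5])            # group sizes r_i
c = np.array([1.0, -0.5, -0.5])    # a contrast (coefficients sum to 0)

# Simulate the contrast estimator many times under the model
# Y_it = mu + tau_i + eps_it, eps_it ~ N(0, sigma^2).
ests = []
for _ in range(20000):
    ybar = np.array([rng.normal(mu + tau[i], sigma, r[i]).mean()
                     for i in range(3)])
    ests.append(c @ ybar)

theoretical = sigma ** 2 * (c ** 2 / r).sum()  # sigma^2 * sum_i c_i^2 / r_i
empirical = np.var(ests)
```

With 20,000 replications the empirical variance should land within a few percent of the theoretical value.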

Standardizing,

$$\frac{\sum_i c_i \bar{Y}_{i\cdot} - \sum_i c_i \tau_i}{\sigma \sqrt{\sum_i c_i^2 / r_i}} \sim N(0, 1). \qquad (*)$$

Using the estimate mse for $\sigma^2$, we obtain the standard error for the contrast estimator:

$$se\Big(\sum_i c_i \bar{Y}_{i\cdot}\Big) = \sqrt{\mathrm{mse} \sum_i \frac{c_i^2}{r_i}}.$$

Replacing the standard deviation of the contrast estimator by the standard error in (*) gives

$$\frac{\sum_i c_i \bar{Y}_{i\cdot} - \sum_i c_i \tau_i}{\sqrt{\mathrm{mse} \sum_i c_i^2 / r_i}}, \qquad (**)$$

which no longer has a normal distribution because of the substitution of mse for $\sigma^2$. The usual trick works: as mentioned before, $\mathrm{mse}/\sigma^2 = \mathrm{SSE}/[(n - v)\sigma^2] \sim \chi^2(n-v)/(n-v)$, where $v$ is the number of treatments and $n$ the total number of observations. It can be proved that the numerator and denominator of (**) are independent. Thus
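Putting the pieces together, mse, the standard error, and the statistic (**) evaluated under a null value of zero can be computed as follows. A minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data for v = 3 treatment groups (made-up values).
groups = [np.array([5.1, 4.9, 5.3, 5.0]),
          np.array([6.2, 6.0, 6.4, 6.1]),
          np.array([4.8, 5.0, 4.7, 4.9])]
c = np.array([1.0, 0.0, -1.0])  # contrast tau_1 - tau_3

v = len(groups)
n = sum(len(g) for g in groups)
r = np.array([len(g) for g in groups])
ybar = np.array([g.mean() for g in groups])

# mse = SSE / (n - v): pooled within-group sum of squared deviations.
sse = sum(((g - g.mean()) ** 2).sum() for g in groups)
mse = sse / (n - v)

# se(sum_i c_i ybar_i) = sqrt(mse * sum_i c_i^2 / r_i)
se = np.sqrt(mse * (c ** 2 / r).sum())

# t statistic under H0: sum_i c_i tau_i = 0, with n - v degrees of freedom.
t_stat = (c @ ybar) / se
```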

$$\frac{\sum_i c_i \bar{Y}_{i\cdot} - \sum_i c_i \tau_i}{\sqrt{\mathrm{mse} \sum_i c_i^2 / r_i}} \sim t(n - v).$$

We can use this as a test statistic to do inference (confidence intervals and hypothesis tests) for contrasts.

Example: In the battery experiment, treatments 1 and 2 were alkaline batteries, while types 3 and 4 were heavy duty. To compare the alkaline with the heavy duty, we consider the difference of means contrast $D = (1/2)(\tau_1 + \tau_2) - (1/2)(\tau_3 + \tau_4)$. Find a 95% confidence interval for the contrast. State precisely what the resulting confidence interval means. Perform a hypothesis test with null hypothesis: the means for the two types are equal.

Comments:

1. For a two-sided test, we could also do an F-test with test statistic $t^2$.
2. A very similar analysis shows: the standard error for the $i$th treatment mean $\mu + \tau_i$ is $\sqrt{\mathrm{mse}/r_i}$. The test statistic $\big(\bar{Y}_{i\cdot} - (\mu + \tau_i)\big)\big/\sqrt{\mathrm{mse}/r_i}$ has a t-distribution with $n - v$ degrees of freedom. So we can do hypothesis tests and form confidence intervals for a single mean.
3. We haven't done examples of finding confidence intervals or hypothesis tests for effect differences or for treatment means, since in practice in ANOVA one does not usually do just one test or confidence interval, so modified techniques for multiple comparisons are needed.
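The example's confidence interval and test can be carried out numerically. The sketch below uses hypothetical lifetime values (the actual battery data are not reproduced in these notes) and `scipy.stats.t` for the t quantile:

```python
import numpy as np
from scipy import stats

# Hypothetical data; the actual battery-experiment values are NOT
# reproduced here. Treatments 1-2: alkaline; 3-4: heavy duty.
groups = [np.array([611.0, 537.0, 542.0, 593.0]),
          np.array([923.0, 794.0, 827.0, 898.0]),
          np.array([445.0, 490.0, 384.0, 413.0]),
          np.array([476.0, 569.0, 480.0, 460.0])]
c = np.array([0.5, 0.5, -0.5, -0.5])  # D = (tau1 + tau2)/2 - (tau3 + tau4)/2

v, n = len(groups), sum(len(g) for g in groups)
r = np.array([len(g) for g in groups])
ybar = np.array([g.mean() for g in groups])
sse = sum(((g - g.mean()) ** 2).sum() for g in groups)
mse = sse / (n - v)
se = np.sqrt(mse * (c ** 2 / r).sum())

est = c @ ybar                               # point estimate of D
t_crit = stats.t.ppf(0.975, df=n - v)        # two-sided 95% critical value
ci = (est - t_crit * se, est + t_crit * se)  # 95% CI for D

t_stat = est / se                            # H0: D = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - v)
```

The interval means: the method producing `ci` captures the true value of $D$ in 95% of all completely randomized samples of this size; we reject the null of equal type means when `p_value` falls below the chosen significance level.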

## The Problem of Multiple Comparisons

Suppose we want to form confidence intervals for two means or for two effect differences. If we formed a 95% confidence interval for, say, $\tau_1 - \tau_2$, and another 95% confidence interval for $\tau_3 - \tau_4$, we would get two intervals, say $(a, b)$ and $(c, d)$, respectively. These would mean:

1. We have produced $(a, b)$ by a method which, for 95% of all completely randomized samples of the same size with the specified number in each treatment, yields an interval containing $\tau_1 - \tau_2$, and
2. We have produced $(c, d)$ by a method which, for 95% of all completely randomized samples of the same size with the specified number in each treatment, yields an interval containing $\tau_3 - \tau_4$.

But there is absolutely no reason to believe that the 95% of samples in (1) are the same as the 95% of samples in (2). If we let A be the event that the confidence interval for $\tau_1 - \tau_2$ actually contains $\tau_1 - \tau_2$, and let B be the event that the confidence interval for $\tau_3 - \tau_4$ actually contains $\tau_3 - \tau_4$, the best we can say in general is the following:

P(obtaining a sample giving a confidence interval for $\tau_1 - \tau_2$ that actually contains $\tau_1 - \tau_2$ and also giving a confidence interval for $\tau_3 - \tau_4$ that actually contains $\tau_3 - \tau_4$)

$$= P(A \cap B) = 1 - P\big((A \cap B)^C\big) = 1 - P(A^C \cup B^C) = 1 - \big[P(A^C) + P(B^C) - P(A^C \cap B^C)\big]$$

$$= 1 - P(A^C) - P(B^C) + P(A^C \cap B^C) \geq 1 - P(A^C) - P(B^C) = 1 - 0.05 - 0.05 = 0.90.$$

Similarly, if we were forming k 95% confidence intervals, our "confidence" that all of the corresponding true effect differences lie in their corresponding CIs would be guaranteed to be only at least $1 - 0.05k$. Thus, other techniques are needed for such "simultaneous inference" (or "multiple comparisons").
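The bound derived above is the Bonferroni inequality in miniature, and it also suggests the standard fix: shrink the per-interval error rate so the family-wise guarantee is restored. A small sketch:

```python
# Bonferroni-style bound from the derivation above: with k separate
# (1 - alpha) confidence intervals, the joint coverage is at least
# 1 - k * alpha (clamped at zero, since the bound can go negative).
def joint_confidence_lower_bound(k, alpha=0.05):
    """Worst-case simultaneous coverage of k individual (1 - alpha) CIs."""
    return max(0.0, 1.0 - k * alpha)

# To guarantee family-wise level 1 - alpha_family instead, build each
# interval at level 1 - alpha_family / k (the Bonferroni correction).
def per_interval_alpha(alpha_family, k):
    return alpha_family / k

print(joint_confidence_lower_bound(2))  # 0.9, matching the text
print(per_interval_alpha(0.05, 2))      # 0.025, i.e. 97.5% per interval
```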