Regression With a Categorical Independent Variable: Mean Comparisons


Regression With a Categorical Independent Variable: Mean Comparisons. Lecture 16, March 29, 2005. Applied Regression Analysis. Lecture #16 - 3/29/2005, Slide 1 of 43.

Today's Lecture. Comparisons among means. Planned comparisons - contrasts. Post hoc comparisons. Partitioning the sum of squares (again... this time for contrasts). Contrasts in SPSS.

Upcoming Schedule. 3/31 - Categorical independent variables - Ch. 12. 4/5 - Curvilinear regression analysis - Ch. 13. 4/7 - Continuous and categorical independent variables - Ch. 14 (homework handed out). 4/12, 4/14 - No class (AERA/NCME meetings). 4/19, 4/21 - ANCOVA - Ch. 14 and 15. 4/26 - Logistic regression - Ch. 17. 4/28 - SEM with latent variables, introduction to Confirmatory Factor Analysis - Ch. 19 (Final handed out - due 4pm May 10th). 5/3 - Path analysis - Ch. 18. 5/5 - Wrap-up (may be used for more in-depth coverage of previous topics).

Just a Review. Recall (if you can) that prior to spring break we discussed techniques for incorporating categorical independent variables into a regression analysis using the general linear model. Categorical independent variables can be incorporated into a regression analysis via coding techniques: dummy coding and effect coding. Either coding technique yields the same conclusions - estimates of the mean of the dependent variable (Y) at each level of the categorical independent variable.


The overall regression analysis using the coded variables gave us the same familiar partitioning of the sums of squares of the dependent variable: sums of squares due to regression, and sums of squares due to error - variation not explained by the regression. The regression F-test computed from these sums of squares provides information about the overall effect of the categorical independent variable. If the F-test is statistically significant, we can conclude that: the categorical independent variable predicts Y above a chance level, and there is a statistically significant difference between at least one pair of category-level means.

For a categorical independent variable, a statistically significant R² means a rejection of the null hypothesis:

H0: µ1 = µ2 = ... = µg

Note that rejection simply means that at least one of the = signs above is truly a ≠. To determine which means are not equal, one of the multiple comparison procedures must be applied.

Comparison Concerns. The topic of multiple comparisons brings up a wealth of concerns, both from philosophical and statistical points of view. Most concerns center on the sheer number of possible post-hoc comparisons: for g groups there are already C(g, 2) = g(g-1)/2 pairwise comparisons, before counting more complex contrasts. The phrase "capitalization on chance" is frequently used to describe many of these concerns. Even with these concerns, most people still use multiple comparisons to learn more from their analysis. As with most other statistical techniques, knowing the limitations of a technique is often as important as knowing its results.

Example Data Set. Neter (1996, p. 676): The Kenton Food Company wished to test four different package designs for a new breakfast cereal. Twenty stores, with approximately equal sales volumes, were selected as the experimental units. Each store was randomly assigned one of the package designs, with each package design assigned to five stores. The stores were chosen to be comparable in location and sales volume. Other relevant conditions that could affect sales, such as price, amount and location of shelf space, and special promotional efforts, were kept the same for all of the stores in the experiment.

Cereal. (This slide displayed the raw sales data for the twenty stores; the same values appear in the orthogonal coding table later in the lecture.)

Overall Hypothesis Test.

F = (R²/k) / ((1 − R²)/(N − k − 1)) = (0.788/3) / ((1 − 0.788)/(20 − 3 − 1)) = 19.803

From Excel ( =fdist(19.803,3,16) ), p = 0.00001. Using a Type I error rate of 0.05, we reject the null hypothesis and conclude that at least one regression coefficient in this analysis is significantly different from zero. A regression coefficient of zero would mean zero difference between the mean of one category and the grand mean.
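As a check, the overall F-test can be recomputed in a few lines of Python (an illustrative sketch; the lecture itself used Excel). The inputs are the example's values: R² = 0.788, k = 3 coded vectors, N = 20 stores.

```python
# Recompute the overall F-test from R^2 (cereal example values).
R2, k, N = 0.788, 3, 20
df_reg, df_res = k, N - k - 1                  # 3 and 16
F = (R2 / df_reg) / ((1 - R2) / df_res)        # ~19.82 (the slide's 19.803 used an unrounded R^2)
F_crit = 3.24                                  # tabled F(0.05; 3, 16)
reject = F > F_crit
print(round(F, 2), reject)
```

Because the rounded R² is used here, F comes out slightly different from the slide's 19.803; the decision (reject H0 of equal means) is the same.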

Imagine you would like to examine the difference between box type #1 and box type #2. You seek to test the null hypothesis H0: µ1 = µ2. Notice that this null hypothesis can be expressed equivalently as µ1 − µ2 = 0. The right-hand form states that the difference between the two means is zero.

Alternatively, consider a general mathematical way of expressing a contrast (or linear combination) of the entire set of category-level means:

L = C1(Ȳ1) + C2(Ȳ2) + ... + Cg(Ȳg)

This linear combination is called a contrast, and it can be used to construct any comparison of group means. From our example, to test the difference between the means of the first and second box types, the contrast would be:

L = (1)(Ȳ1) + (−1)(Ȳ2) + (0)(Ȳ3) + (0)(Ȳ4) = Ȳ1 − Ȳ2

General Contrasts. Contrasts can be used for comparisons beyond the equivalence of two means. For instance, to contrast the average sales for box types #1 and #2 with the average sales for box type #3, the contrast would be:

L = (1/2)(Ȳ1) + (1/2)(Ȳ2) + (−1)(Ȳ3) + (0)(Ȳ4) = (Ȳ1 + Ȳ2)/2 − Ȳ3

One could re-write this contrast, equivalently, as:

L = (1)(Ȳ1) + (1)(Ȳ2) + (−2)(Ȳ3) + (0)(Ȳ4) = Ȳ1 + Ȳ2 − 2Ȳ3
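A contrast is just a weighted sum of the group means, so it is easy to compute directly. The sketch below (not part of the original lecture) uses the box-type means reported later in the Scheffé example, and shows that rescaling the coefficients rescales L without changing the comparison.

```python
# Contrast values as weighted sums of group means
# (means from the Scheffe example slide, box types 1-4).
means = [14.6, 13.4, 19.4, 27.2]

def contrast(C):
    # L = C1*Ybar1 + C2*Ybar2 + ... + Cg*Ybarg
    return sum(c * m for c, m in zip(C, means))

L_pair = contrast([1, -1, 0, 0])       # type 1 vs type 2
L_avg = contrast([0.5, 0.5, -1, 0])    # average of 1 and 2 vs type 3
L_avg2 = contrast([1, 1, -2, 0])       # same comparison, rescaled by 2
print(L_pair, L_avg, L_avg2)
```

Note that L_avg2 is exactly twice L_avg: the two codings express the same comparison on different scales.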


Types of Contrasts. A classic distinction (one that may be fading at present) is made between two kinds of contrasts. Contrasts planned prior to the overall F-test are called planned or a priori contrasts. Contrasts decided upon after running the overall F-test are called post hoc contrasts. Typically, post hoc contrasts have been viewed as a negative thing. Are they really that bad? Should they be treated any differently?

Post Hoc Contrasts. In our textbook, Pedhazur indicates his belief that post hoc contrasts should be evaluated differently from a priori contrasts. To illustrate his suggestions, I will treat the two separately for the remainder of the lecture. In reality, however, little difference exists between them.

Scheffé's Method. Many statistics can be computed to evaluate the null hypothesis behind the contrasts one can develop. One of the most conservative (in terms of Type I error) is the method developed by Scheffé. This method allows any type of contrast to be built, providing a test statistic for the contrast. One can calculate the cut-off point S at which the magnitude |L| of a contrast becomes statistically significant:

S = √(k F_α;k,N−k−1) × √(MSR Σⱼ (C_j)²/n_j)

The Scheffé test, S = √(k F_α;k,N−k−1) × √(MSR Σⱼ (C_j)²/n_j), where: k is the number of coded vectors in the analysis - the number of category levels minus one. F_α;k,N−k−1 is the value of the F-statistic for a given Type I error rate α, with k and N−k−1 degrees of freedom. MSR is the mean square of the residuals from the overall ANOVA table (the overall regression hypothesis test). C_j is the contrast coefficient for category level j. n_j is the number of observations in category level j.

Scheffé's Example. Recall from our cereal box example the mean number of units sold for each box type:

Box Type   Mean    N
   1       14.6    5
   2       13.4    5
   3       19.4    5
   4       27.2    5

Also recall that the MSR from the overall regression hypothesis test was 9.9. Using this information, we will construct two contrasts:

L1 = (1)(Ȳ1) + (−1)(Ȳ2) + (0)(Ȳ3) + (0)(Ȳ4) = Ȳ1 − Ȳ2
L2 = (1/2)(Ȳ1) + (1/2)(Ȳ2) + (−1)(Ȳ3) + (0)(Ȳ4) = (Ȳ1 + Ȳ2)/2 − Ȳ3

Example Contrast #1. L = (1)(Ȳ1) + (−1)(Ȳ2) + (0)(Ȳ3) + (0)(Ȳ4) = Ȳ1 − Ȳ2.

Null hypothesis: H0: µ1 − µ2 = 0. Alternative hypothesis: HA: µ1 − µ2 ≠ 0. Type I error rate: 0.05; k = 3; N = 20. F(0.05; 3, 16) = 3.24.

L = 14.6 − 13.4 = 1.2

S = √(k F_α;k,N−k−1) × √(MSR Σⱼ (C_j)²/n_j) = √(3 × 3.24) × √(9.9 × [1²/5 + (−1)²/5]) = 6.2
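The Scheffé cut-off for this contrast can be sketched in Python (illustrative; inputs are the slide's values MSR = 9.9, k = 3, tabled F(0.05; 3, 16) = 3.24, and n_j = 5 per group):

```python
# Scheffe cut-off S for contrast #1 of the cereal example.
import math

MSR, k, F_crit = 9.9, 3, 3.24
C, n = [1, -1, 0, 0], [5, 5, 5, 5]
S = math.sqrt(k * F_crit) * math.sqrt(MSR * sum(c**2 / nj for c, nj in zip(C, n)))
L = 14.6 - 13.4
print(round(S, 1), abs(L) < S)   # S = 6.2; |L| = 1.2 falls below it
```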


Example Contrast #1 (continued). With |L| = 1.2 less than S = 6.2, we conclude that the contrast is not significantly different from zero: there is no difference between the mean sales of box type #1 and box type #2.

Example Contrast #2. L = (1/2)(Ȳ1) + (1/2)(Ȳ2) + (−1)(Ȳ3) + (0)(Ȳ4) = (Ȳ1 + Ȳ2)/2 − Ȳ3.

Null hypothesis: H0: (µ1 + µ2)/2 − µ3 = 0. Alternative hypothesis: HA: (µ1 + µ2)/2 − µ3 ≠ 0. Type I error rate: 0.05; k = 3; N = 20. F(0.05; 3, 16) = 3.24.

L = (14.6 + 13.4)/2 − 19.4 = −5.40

S = √(3 × 3.24) × √(9.9 × [(1/2)²/5 + (1/2)²/5 + (−1)²/5]) = √9.72 × √2.97 = 5.37

Example Contrast #2 (continued). With |L| = 5.40 just exceeding S = 5.37, this contrast is (barely) statistically significant: the average of the mean sales for box types #1 and #2 appears to differ from the mean sales for box type #3. The result sits so close to the cut-off that it should be interpreted cautiously.


Alternatively... Instead of constructing S, one can compute an F-statistic for each contrast and compare it with a value given in an F-table:

F = L² / (MSR Σⱼ (C_j)²/n_j)

This F statistic has one degree of freedom for the numerator and N−k−1 degrees of freedom for the denominator. If it exceeds k F_α;k,N−k−1, the null hypothesis is rejected.
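For contrast #1 this per-contrast F can be computed directly (a sketch with the example's inputs); its value matches the O1 row of the ANOVA table shown later:

```python
# Per-contrast F statistic: F = L^2 / (MSR * sum(C_j^2 / n_j)),
# compared against k * F(alpha; k, N-k-1).
MSR, k, F_crit = 9.9, 3, 3.24
C, n = [1, -1, 0, 0], [5, 5, 5, 5]
L = 14.6 - 13.4
F = L**2 / (MSR * sum(c**2 / nj for c, nj in zip(C, n)))
print(round(F, 3), F > k * F_crit)   # 0.364, and False: same decision as the S cut-off
```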

A Priori Contrasts. Pedhazur indicates that a priori contrasts are preferable: "Clearly, such comparisons are preferable as they are focused on tests of hypotheses derived from theory or ones concerned with the relative effectiveness of treatments, programs, practices, and the like." (p. 376). Although thin on support for this claim, he suggests that a priori contrasts be held to a different standard of evidence when deciding whether to reject a null hypothesis. The following slides show common types of a priori contrasts and their hypothesis tests.

Orthogonal Contrasts. Two contrasts are orthogonal when the sum of the products of the coefficients for their respective elements is zero, or:

C1′C2 = 0

Here each C represents a column vector (size g × 1) of the coefficients for a given contrast. A result of this orthogonality is that the correlation between the two comparisons is zero. As you will see, having a zero correlation has implications for partitioning the sum of squares due to these contrasts. The maximum number of orthogonal contrasts that can be built is equal to the number of groups (category levels) minus one.

Orthogonal Example. To demonstrate orthogonal contrasts, consider the two example contrasts we constructed previously:

L1 = (1)(Ȳ1) + (−1)(Ȳ2) + (0)(Ȳ3) + (0)(Ȳ4) = Ȳ1 − Ȳ2
L2 = (1/2)(Ȳ1) + (1/2)(Ȳ2) + (−1)(Ȳ3) + (0)(Ȳ4) = (Ȳ1 + Ȳ2)/2 − Ȳ3

Notice that multiplying the contrast coefficients pairwise and summing gives:

(1 × 1/2) + (−1 × 1/2) + (0 × −1) + (0 × 0) = 0

Therefore, L1 and L2 are orthogonal contrasts. Because there are four category levels, only one more orthogonal contrast can be made:

L3 = (1/3)(Ȳ1) + (1/3)(Ȳ2) + (1/3)(Ȳ3) + (−1)(Ȳ4) = (Ȳ1 + Ȳ2 + Ȳ3)/3 − Ȳ4
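The orthogonality of all three contrasts can be verified at once by checking that every pairwise dot product of the coefficient vectors is zero (a sketch; equal group sizes are assumed, as in the example):

```python
# Orthogonality check: each pair of contrast-coefficient vectors
# should have a zero dot product.
L1 = [1, -1, 0, 0]
L2 = [0.5, 0.5, -1, 0]
L3 = [1/3, 1/3, 1/3, -1]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

print(dot(L1, L2), dot(L1, L3), dot(L2, L3))   # all (numerically) zero
```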

Orthogonal Coding. Instead of wording contrasts as functions of the means of each category level, consider contrasts as yet another type of variable coding technique. One can create a set of new column vectors, with entries representing the coefficients of each contrast. Once these vectors are created, the GLM can be used to estimate the contrasts. The general linear model states that the estimated regression parameters are given by:

b = (X′X)⁻¹X′y


Orthogonal Coding Example. Cereal example data (the code vectors O1, O2, O3 are integer-scaled versions of the three orthogonal contrasts):

 Y   I   O1   O2   O3   Type
11   1    1    1    1    1
17   1    1    1    1    1
16   1    1    1    1    1
14   1    1    1    1    1
15   1    1    1    1    1
12   1   -1    1    1    2
10   1   -1    1    1    2
15   1   -1    1    1    2
19   1   -1    1    1    2
11   1   -1    1    1    2
23   1    0   -2    1    3
20   1    0   -2    1    3
18   1    0   -2    1    3
17   1    0   -2    1    3
19   1    0   -2    1    3
27   1    0    0   -3    4
33   1    0    0   -3    4
22   1    0    0   -3    4
26   1    0    0   -3    4
28   1    0    0   -3    4
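Solving the normal equations b = (X′X)⁻¹X′y with these code vectors can be sketched as follows (illustrative; data copied from the table above). With orthogonal codes, the intercept estimates the grand mean and each slope is a rescaled contrast of group means:

```python
# Estimate b = (X'X)^(-1) X'y for the orthogonal coding of the cereal data.
import numpy as np

y = np.array([11, 17, 16, 14, 15,    # type 1
              12, 10, 15, 19, 11,    # type 2
              23, 20, 18, 17, 19,    # type 3
              27, 33, 22, 26, 28],   # type 4
             dtype=float)
codes = {1: (1, 1, 1), 2: (-1, 1, 1), 3: (0, -2, 1), 4: (0, 0, -3)}
group = [1] * 5 + [2] * 5 + [3] * 5 + [4] * 5
X = np.array([[1, *codes[g]] for g in group], dtype=float)   # columns I, O1, O2, O3
b = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(b, 2))
```

Solving gives b ≈ (18.65, 0.60, −1.80, −2.85): the grand mean, followed by (Ȳ1 − Ȳ2)/2, (Ȳ1 + Ȳ2 − 2Ȳ3)/6, and (Ȳ1 + Ȳ2 + Ȳ3 − 3Ȳ4)/12.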

Partitioning the Sum of Squares. Recall from previous lectures that a regression model with uncorrelated predictor variables allows for additive increases in R² when variables are added to the model. Because our contrasts are orthogonal, the model R² can be decomposed into components due to each contrast:

Model             R²      SS_reg
I, O1            0.005      3.60
I, O2            0.130     97.20
I, O3            0.653    487.35
I, O1, O2, O3    0.788    588.15

The ANOVA table can then be written as:

Source        df      SS        MS        F
Regression     3    588.15    196.05    19.803
  O1           1      3.60      3.60     0.364
  O2           1     97.20     97.20     9.818
  O3           1    487.35    487.35    49.227
Residual      16    158.40      9.90
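The additive decomposition above can be reproduced directly: with orthogonal coding, the sum of squares attributable to each code vector is (Σ Oⱼy)² / Σ Oⱼ². A sketch (not part of the original slides):

```python
# Reproduce the per-contrast sums of squares for the cereal data.
y = [11, 17, 16, 14, 15, 12, 10, 15, 19, 11,
     23, 20, 18, 17, 19, 27, 33, 22, 26, 28]
O1 = [1] * 5 + [-1] * 5 + [0] * 5 + [0] * 5
O2 = [1] * 5 + [1] * 5 + [-2] * 5 + [0] * 5
O3 = [1] * 5 + [1] * 5 + [1] * 5 + [-3] * 5

def ss(O):
    # SS for one orthogonal code vector: (sum O_i * y_i)^2 / sum O_i^2
    return sum(o * v for o, v in zip(O, y)) ** 2 / sum(o * o for o in O)

print(ss(O1), ss(O2), ss(O3))            # 3.6, 97.2, 487.35
print(ss(O1) + ss(O2) + ss(O3))          # sums to the regression SS, 588.15
```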

Nonorthogonal Contrasts. Orthogonal contrasts are not the only type of contrasts that can be planned a priori. Nonorthogonal contrasts can be decided upon and tested in a similar manner. Because nonorthogonal contrasts overlap, their R² contributions are not additive. This nonadditivity can result in the sum of the contrast sums of squares exceeding the sum of squares due to the regression.


Estimation of planned nonorthogonal contrasts proceeds much as for the contrasts described previously: again, L must be computed. The only difference is that one must now control the overall Type I error rate. This is done by dividing the overall α by the number of planned comparisons, a procedure commonly referred to as the Bonferroni procedure. It equates to running multiple t-tests with adjusted levels of significance.
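The adjustment itself is one division; the sketch below assumes three planned comparisons at an overall α of 0.05:

```python
# Bonferroni adjustment: split the overall alpha across the planned comparisons.
alpha, n_comparisons = 0.05, 3
alpha_each = alpha / n_comparisons
print(alpha_each)   # each contrast is then tested at about 0.0167
```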

Contrasts in SPSS. In SPSS, contrasts can be run in several different ways. Under Analyze... General Linear Model... Multivariate, there are two boxes to choose from: Contrasts, and Post Hoc.


Post Hoc in SPSS. (This slide showed a screenshot of the SPSS post hoc dialog.)

Final Thought. Mean comparisons are no different from coded regression with categorical independent variables. Be sure to acknowledge whether a contrast was planned or developed post hoc. Everything today was done with a single categorical independent variable; as we add independent variables, the model complexity increases, but the mathematics stay the same.

Next Time. Midterm returned. Unequal sample sizes. Categorical independent variables. Interactions between categorical independent variables.