Chapter 5 Contrasts for one-way ANOVA

Similar documents
Chapter 7 Factorial ANOVA: Two-way ANOVA

Chapter 6 Planned Contrasts and Post-hoc Tests for one-way ANOVA

Two-Sample Inferential Statistics

10/31/2012. One-Way ANOVA F-test

Sampling Distributions: Central Limit Theorem

Analytical Comparisons Among Treatment Means (Chapter 4) Analysis of Trend (Chapter 5) ERSH 8310 Fall 2009

4:3 LEC - PLANNED COMPARISONS AND REGRESSION ANALYSES

The t-test: A z-score for a sample mean tells us where in the distribution the particular mean lies

8/23/2018. One-Way ANOVA F-test. 1. Situation/hypotheses. 2. Test statistic. 3.Distribution. 4. Assumptions

Hypothesis testing: Steps

Disadvantages of using many pooled t procedures. The sampling distribution of the sample means. The variability between the sample means

Advanced Experimental Design

One-Way ANOVA. Some examples of when ANOVA would be appropriate include:

Orthogonal, Planned and Unplanned Comparisons

Inferences About the Difference Between Two Means

One-factor analysis of variance (ANOVA)

Hypothesis testing: Steps

CHAPTER 17 CHI-SQUARE AND OTHER NONPARAMETRIC TESTS FROM: PAGANO, R. R. (2007)

INTRODUCTION TO ANALYSIS OF VARIANCE

4.1. Introduction: Comparing Means

Keppel, G. & Wickens, T. D. Design and Analysis Chapter 4: Analytical Comparisons Among Treatment Means

What Does the F-Ratio Tell Us?

Review. One-way ANOVA, I. What s coming up. Multiple comparisons

Lecture 7: Hypothesis Testing and ANOVA

Introduction to Business Statistics QM 220 Chapter 12

Analysis of 2x2 Cross-Over Designs using T-Tests

Chapter 23. Inference About Means

Chapter 13 Correlation

One-way between-subjects ANOVA. Comparing three or more independent means

An Analysis of College Algebra Exam Scores December 14, James D Jones Math Section 01

WELCOME! Lecture 13 Thommy Perlinger

ANALYTICAL COMPARISONS AMONG TREATMENT MEANS (CHAPTER 4)

One-Way ANOVA Cohen Chapter 12 EDUC/PSY 6600

Statistics For Economics & Business

Degrees of freedom df=1. Limitations OR in SPSS LIM: Knowing σ and µ is unlikely in large

Chapter 9 Inferences from Two Samples

BIOL Biometry LAB 6 - SINGLE FACTOR ANOVA and MULTIPLE COMPARISON PROCEDURES

Lecture 5: ANOVA and Correlation

COMPARING SEVERAL MEANS: ANOVA

Notes on Maxwell & Delaney

ANOVA: Comparing More Than Two Means

Statistics: revision

The independent-means t-test:

Sampling Distributions

Statistical methods for comparing multiple groups. Lecture 7: ANOVA. ANOVA: Definition. ANOVA: Concepts

T.I.H.E. IT 233 Statistics and Probability: Sem. 1: 2013 ESTIMATION AND HYPOTHESIS TESTING OF TWO POPULATIONS

Review of Multiple Regression

In a one-way ANOVA, the total sums of squares among observations is partitioned into two components: Sums of squares represent:

Chapter 23: Inferences About Means

Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance ECON 509. Dr.

Regression, part II. I. What does it all mean? A) Notice that so far all we ve done is math.

Statistics for IT Managers

COGS 14B: INTRODUCTION TO STATISTICAL ANALYSIS

Analysis of Variance (ANOVA)

While you wait: Enter the following in your calculator. Find the mean and sample variation of each group. Bluman, Chapter 12 1

PSY 216. Assignment 9 Answers. Under what circumstances is a t statistic used instead of a z-score for a hypothesis test

Hypothesis Testing hypothesis testing approach

1 Descriptive statistics. 2 Scores and probability distributions. 3 Hypothesis testing and one-sample t-test. 4 More on t-tests

Multiple Regression. More Hypothesis Testing. More Hypothesis Testing The big question: What we really want to know: What we actually know: We know:

Basic Statistics. 1. Gross error analyst makes a gross mistake (misread balance or entered wrong value into calculation).

HYPOTHESIS TESTING II TESTS ON MEANS. Sorana D. Bolboacă

Parametric versus Nonparametric Statistics-when to use them and which is more powerful? Dr Mahmoud Alhussami

9/28/2013. PSY 511: Advanced Statistics for Psychological and Behavioral Research 1

THE ROYAL STATISTICAL SOCIETY HIGHER CERTIFICATE

Chapter 8 Student Lecture Notes 8-1. Department of Economics. Business Statistics. Chapter 12 Chi-square test of independence & Analysis of Variance

psyc3010 lecture 2 factorial between-ps ANOVA I: omnibus tests

ANOVA - analysis of variance - used to compare the means of several populations.

Table of z values and probabilities for the standard normal distribution. z is the first column plus the top row. Each cell shows P(X z).

DETAILED CONTENTS PART I INTRODUCTION AND DESCRIPTIVE STATISTICS. 1. Introduction to Statistics

Mathematical Notation Math Introduction to Applied Statistics

Frequency table: Var2 (Spreadsheet1) Count Cumulative Percent Cumulative From To. Percent <x<=

Pooled Variance t Test

Student s t-distribution. The t-distribution, t-tests, & Measures of Effect Size

Hypothesis testing I. - In particular, we are talking about statistical hypotheses. [get everyone s finger length!] n =

Chapter 7: Correlation

Multiple Comparison Procedures Cohen Chapter 13. For EDUC/PSY 6600

Statistical Inference: Estimation and Confidence Intervals Hypothesis Testing

Unit 27 One-Way Analysis of Variance

One-Way Analysis of Variance (ANOVA)

This gives us an upper and lower bound that capture our population mean.

Psych 230. Psychological Measurement and Statistics

GLM Repeated-measures designs: One within-subjects factor

REVIEW 8/2/2017 陈芳华东师大英语系

Basics on t-tests Independent Sample t-tests Single-Sample t-tests Summary of t-tests Multiple Tests, Effect Size Proportions. Statistiek I.

POLI 443 Applied Political Research

MATH Notebook 3 Spring 2018

Chapter 24. Comparing Means

Do not copy, post, or distribute. Independent-Samples t Test and Mann- C h a p t e r 13

Introduction to the Analysis of Variance (ANOVA) Computing One-Way Independent Measures (Between Subjects) ANOVAs

Regression: Main Ideas Setting: Quantitative outcome with a quantitative explanatory variable. Example, cont.

Using SPSS for One Way Analysis of Variance

Ch. 16: Correlation and Regression

Topic 4: Orthogonal Contrasts

Data Analysis and Statistical Methods Statistics 651

Questions 3.83, 6.11, 6.12, 6.17, 6.25, 6.29, 6.33, 6.35, 6.50, 6.51, 6.53, 6.55, 6.59, 6.60, 6.65, 6.69, 6.70, 6.77, 6.79, 6.89, 6.

Calculating Fobt for all possible combinations of variances for each sample Calculating the probability of (F) for each different value of Fobt

Analysis of variance

Analysis of variance

The One-Way Independent-Samples ANOVA. (For Between-Subjects Designs)

Ordinary Least Squares Regression Explained: Vartanian

Transcription:

Chapter Contrasts for one-way NOV Page. What is a contrast? -. Types of contrasts -. Significance tests of a single contrast -0. Brand name contrasts -. Relationships between the omnibus F and contrasts - 6. Robust tests for a single contrast -9 7. Effect sizes for a single contrast - 8. n example - dvanced topics in contrast analysis 9. Trend analysis -9 0. Simultaneous significance tests on multiple contrasts -. Contrasts with unequal cell size -6. final example -70-006. Karpinski

Contrasts for one-way NOV. What is a contrast? focused test of means weighted sum of means Contrasts allow you to test your research hypothesis (as opposed to the statistical hypothesis) Example: You want to investigate if a college education improves ST scores. You obtain five groups with n in each group: o High School Seniors o College Seniors Mathematics Majors Chemistry Majors English Majors History Majors o ll participants take the ST and scores are recorded o The omnibus F-test examines the following hypotheses: H : µ µ µ µ µ 0 H : Not all i 's are equal µ o But you want to know: Do college seniors score differently than high school seniors? Do natural science majors score differently than humanities majors? Do math majors score differently than chemistry majors? Do English majors score differently than history majors? HS College Students Students Math Chemistry English History µ µ µ µ µ - 006. Karpinski

Do college seniors score differently than high school seniors? HS College Students Students Math Chemistry English History µ µ µ µ µ H 0 µ µ µ µ : µ H µ µ µ µ : µ Do natural science majors score differently than humanities majors? HS College Students Students Math Chemistry English History µ µ µ µ H 0 µ µ µ µ : H µ µ µ µ : Do math majors score differently than chemistry majors? HS College Students Students Math Chemistry English History µ µ H : µ µ H : µ µ 0 Do English majors score differently than history majors? HS College Students Students Math Chemistry English History µ µ H : µ µ H : µ µ 0-006. Karpinski

In general, a contrast is a set of weights that defines a specific comparison over the cell means a ψ c i µ i c µ c µ c µ... c a µ a j a ˆ ψ c i X i c X c X c X... c a X a j o Where (µ,µ,µ,...,µ a ) are the population means for each group (X, X,X,...,X a ) are the observed means for each group (c,c,c,...,c a )are weights/contrast coefficients with a c i i 0 contrast is a linear combination of cell means o Do college seniors score differently than high school seniors? µ µ µ µ H 0 : µ or H 0 : µ µ µ µ 0 ψ µ c,,,, o Do natural science majors score differently than humanities majors? µ µ µ µ H 0 : or H 0 : µ µ µ 0 ψ µ µ c 0,,,, o Do math majors score differently than chemistry majors? H 0 : µ µ or H 0 : µ 0 ψ µ c ( 0,,,0,0 ) - 006. Karpinski

. Types of Contrasts Pairwise contrasts o Comparisons between two cell means o Contrast is of the form c i and c i for some i and i a(a ) o If you have a groups then there are possible pairwise contrasts o Examples: Do math majors score differently than chemistry majors? ψ µ c ( 0,,,0,0 ) Do English majors score differently than history majors? ψ µ c ( 0,0,0,, ) µ o When there are two groups (a ), then the two independent samples t- test is equivalent to the c (, ) contrast on the two means. Complex contrasts o contrast between more than two means o There are an infinite number of contrasts you can perform for any design o Do college seniors score differently than high school seniors? ψ µ c,,,, o Do natural science majors score differently than humanities majors? ψ µ µ c 0,,,, o So long as the coefficients sum to zero, you can make any comparison: ψ k. 0µ.08µ.98µ.8µ. 7µ c (. 0,.08,.98,.8,.7) But remember you have to be able to interpret the result! - 006. Karpinski

Orthogonal contrasts o Sometimes called non-redundant contrasts o Orthogonality may be best understood through a counter-example o Suppose you want to test three contrasts: Do math majors score differently than high school seniors? ψ µ c (,,0,0,0 ) Do chemistry majors score differently than high school seniors? ψ µ c (,0,,0,0 ) Do math majors score differently than chemistry majors? ψ µ c ( 0,,,0,0 ) µ o But we notice that ψ µ µ ( µ ) ( µ ) ( µ ) ψ ψ If I know ψ and ψ then I can determine the value of ψ ψ, ψ, and ψ are redundant or non-orthogonal contrasts o Orthogonality defined: set of contrasts is orthogonal if they are independent of each other (or if knowing the value of one contrast in no way provides any information about the other contrast) If a set of contrasts are orthogonal then the contrast coefficients are not correlated with each other Two contrasts are orthogonal if the angle between them in a-space is a right angle Two contrasts are orthogonal if for equal n ψ (a,a,a,...,a a ) ψ (b,b,b,...,b a ) a a i b i 0 or a b a b... a a b a 0 j -6 006. Karpinski

Two contrasts are orthogonal if for unequal n ψ (a,a,a,...,a a ) ψ (b,b,b,...,b a ) a j a i b i n i 0 or a b n a b n... a a b a n a 0 o Examples of Orthogonality (assuming equal n) a c i c * i 0* ( ) * Set #: c (,0, ) and c,, j 0 0 c and c are orthogonal Set #: ( 0,, ) c c c and c,, a c i c i ( 0* ) * * j 0 c and c are orthogonal 0 c c Set #: c (,,0 ) and c (,0, ) a 6 c i c 6i ( *) ( *0) 0* j 0 0 ( ) c and c 6 are NOT orthogonal -7 006. Karpinski

o set of contrasts is orthogonal if each contrast is orthogonal to all other contrasts in the set You can check that: c (,,0,0 ) c c c (,,,0) c c (,,, ) c c c o If you have a groups, then there are a- possible orthogonal contrasts We lose one contrast for the grand mean (the unit contrast) Having the contrasts sum to zero assures that they will be orthogonal to the unit contrast If you have more than a- contrasts, then the contrasts are redundant and you can write at least one contrast as a linear combination of the other contrasts Example: For a, we can find only orthogonal contrasts. ny other contrasts are redundant. µ ψ ψ µ µ µ ψ µ µ µ ψ ψ ψ is not orthogonal to ψ ψ is not orthogonal to ψ We can write ψ in terms of ψ and ψ 9 ψ ψ ψ 0 9 ( µ ) µ µ 0 9 9 µ µ µ µ 0 0 0 0 µ µ µ In general, you will not need to show how a contrast may be calculated from a set of orthogonal contrasts. It is sufficient to know that if you have more than a- contrasts, there must be at least one contrast you can write as a linear combination of the other contrasts. -8 006. Karpinski

o There is nothing wrong with testing non-orthogonal contrasts, as long as you are aware that they are redundant. o For example, you may want to examine all possible pairwise contrasts. These contrasts are not orthogonal, but they may be relevant to your research hypothesis. Nice properties of orthogonal contrasts: o We will learn to compute Sums of Squares associated with each contrast (SSC i ) o For a set of a- orthogonal contrasts Each contrast has one degree of freedom SSB SSC SSC SSC... SSC a- In other words, a set of a- orthogonal contrasts partitions the SSB Recall that for the omnibus NOV, the df bet a. The omnibus test combines the results of these a- contrasts and reports them in one lump test ny set of a- orthogonal contrasts will yield the identical result as the omnibus test -9 006. Karpinski

. Significance tests of a single contrast Recall the general form of a t-test: estimate of population parameter t ~ estimated standard error t ~ ψˆ standard error( ψˆ ) To compute a significance test for a single contrast, we need: o n estimate of the value of the contrast o n estimate of the standard error of the contrast (The standard deviation of the sampling distribution of the contrast) o Value of a contrast: a ψ c i µ i c µ c µ c µ... c a µ a j a ˆ ψ c i X i c X c X c X... c a X a j ψˆ is an unbiased estimate of the true population value of ψ o Standard error of a contrast Recall that standard error is the standard deviation of the sampling distribution The standard error for the two independent samples t-test: Std Error s pooled n n The standard error of a contrast has a similar form: Std Error( ˆ ψ ) MSW c i a i n i Where ci is the squared weight for each group ni is the sample size for each group MSW is MSW from the omnibus NOV -0 006. Karpinski

Constructing a significance test estimate of population parameter t ~ estimated standard error t ~ ψˆ standard error( ψˆ ) o Now we can insert the parts of the t-test into the equation: t observed i MSW c X i c i n i with degrees of freedom df w N a o To determine the level of significance, you can: Look up t crit for df N-a and the appropriate α Or preferably, compare p obs with p crit o Note that because the test for a contrast is calculated using the t- distribution, you can use either a one-tailed or two-tailed test of significance. s previously mentioned, you typically want to report the two-tailed test of the contrast. - 006. Karpinski

Example #: a and c (, ) ψ ˆ () X ( ) X X X Std error (ψˆ ) MSW ( ) MSW n n n n t ψˆ StdError( ψˆ ) X X MSW n n o But we know that for two groups, MSW s pooled o Thus, the two independent samples t-test is identical to a c (, ) contrast on the two means Example #: study of the effects of reward on learning in children DV Number of trials to learn a puzzle Level of Reward Frequent Infrequent (66%) (%) 9 7 0 6 8 9 7 6 8 6 0 6 7 Constant (00%) Never (0%) H: Constant reward will produce faster learning than the average of the other conditions H: Frequent reward will produce faster learning than the average of infrequent or no reward H: Infrequent reward will produce faster learning than no reward - 006. Karpinski

trials 0.00 7.0 0.00 8.0 6 0.00.00.00.00.00 reward TRILS 0 8 N 6 Constant REWRD 6 Frequent 6 Infrequent 6 Never o Step : Convert the research hypothesis into a contrast of means H: Constant reward will produce faster learning than the average of the other conditions µ µ µ µ µ µ H 0 : µ H : µ ψ µ c,,, H: Frequent reward will produce faster learning than the average of infrequent or no reward µ µ µ µ H 0 : µ H : µ ψ µ c 0,,, H: Infrequent reward will produce faster learning than no reward 0 : µ µ ψ : µ µ H H µ c 0,0,, ( ) - 006. Karpinski

o Step : Determine if the contrasts of interest are orthogonal c & c & c & c : ( * 0) * * * 0 j c c i i c : ( * 0) * 0 * * 0 j c c i i c : ( 0 * 0) ( * 0) * * 0 j c i ic o Step : Compute values for each contrast c c c c c c ψˆ X X X X () (6) (7).6667 (). X (6) (7) ˆ X X ψ ˆ X X ψ (6) (7) o Step : Conduct omnibus NOV to obtain MSW NOV Source of Variation SS df MS F P-value F crit Between Groups 0..88 0.000.8867 Within Groups 6 6.87 Total 9 9 Note: we do not care about the results of this test. We only want to calculate MSW - 006. Karpinski

o Step : Compute standard error for each contrast Std Error( ˆ ψ ) MSW c i a i n i Std err ( ψ ˆ ).87.87*.667. 06 Std err ( ψ ˆ ).87 0 ().87*.0.078 Std err ( ψ ˆ ).87 0 0 () ( ).87*.0.0 o Step 6: Compute observed t or F statistic for each contrast t observed i MSW c X i c i n i.6667 ψ ˆ : observed. 67.06 t. ψ ˆ : t observed. 0.078 ψ ˆ : t observed 0. 80.0-006. Karpinski

o Step 7: Determine statistical significance.0 Method (The table method): Find t crit for df N-a and α. 0 t crit (6) α.0. Compare t crit to t obs if if t < t then retain H 0 observed observed critical t t then reject H 0 critical We reject the null hypothesis for ψ ˆ and ψ ˆ Constant reward produced faster learning than the average of the other conditions Frequent reward produced faster learning than the average of infrequent or no reward.0 Method : (The exact method): Find p obs for df N-a and α. 0 for each t obs and Then compare p obs to p crit.0 ψ : t observed (6).6, p.0 ψ : t observed (6).0, p <.0 ψ : t observed (6) 0.80, p. o lternatively, you can perform an F-test to evaluate contrasts. We know that t F t observed i MSW c X i c i n i ψˆ F observed c i MSW n i df N-a df (, N-a) You will obtain the exact same results with t-tests or F-tests. -6 006. Karpinski

Confidence intervals for contrasts o In general, the formula for a confidence interval is estimate ± ( t * standard error) critical o For a contrast, the formula for a confidence interval is estimate ± ( t * standard error) critical ci ψ ˆ ± tcritical ( dfw) * MSW ni In the learning example, t crit (6) α.0. ψ :.667 ± (. *.06) ψ :. ± (. *.078) ψ :.0 ± (. *.0) (-.8, -0.) (-7.79, -.) (-.6,.6) Using SPSS to evaluate contrasts o If you use ONEWY, you can directly enter the contrast coefficients to obtain the desired contrast. ONEWY trials BY reward /STT desc /CONTRST, -., -., -. /CONTRST 0,, -., -. /CONTRST 0, 0,, -. -7 006. Karpinski

Descriptives TRILS Constant Frequent Infrequent Never Total 9% Confidence Interval for Mean N Mean Std. Deviation Std. Error Lower Bound Upper Bound.0000.707.6.0.8780.0000..088 8.0880.90 6.0000.707.6.0 6.8780 7.0000.00000.6.70 0.70 0.0000.7888.708..878 Contrast Coefficients Contrast REWRD Constant Frequent Infrequent Never -. -. -. 0 -. -. 0 0 - Contrast Tests TRILS ssume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.60.068 -.60 6.09 -.000.0789 -.0 6.000 -.0000.99 -.80 6. o Multiplying the coefficients by a constant will not change the results of the significance test on that contrast. If you multiply the values of a contrast by any constant (positive or negative), you will obtain the identical test statistic and p-value in your analysis. The value of the contrast, the standard error, and the size of the CIs will shrink or expand by a factor of the constant used, but key features (i.e., p-values and whether or not the CIs overlap) remain the same. -8 006. Karpinski

o Let s examine Hypothesis using three different sets of contrast coefficients: ONEWY trials BY reward /STT desc /CONTRST, -., -., -. /CONTRST, -, -, - /CONTRST -6,,,. Contrast Coefficients Contrast REWRD Constant Frequent Infrequent Never -. -. -. - - - -6 Contrast Tests TRILS ssume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.60.068 -.60 6.09-8.0000.099 -.6 6.08 6.0000 6.0998.6 6.08 You get more precise values if you enter the exact contrast coefficients into SPSS, so try to avoid rounding decimal places. Instead, multiply the coefficients by a constant so that all coefficients are whole numbers. In this case, the tests for contrasts and are exact. The test for contrast is slightly off due to rounding. -9 006. Karpinski

Sums of Squares of a contrast o s previously mentioned, a set of a- orthogonal contrasts will perfectly partition the SSB: SSB SSψ ˆ SSψ ˆ SSψ ˆ o To compute the Sums of Squares of a contrast: ψˆ SSψˆ c i n i SSψ ˆ ( ) ( 8) ( ) ( ) 6. 6.67 SSψ ˆ 0 () (.) 0. 00.8. SSψ ˆ ( ) () 0 0 ( ).0.. SSψ ˆ SSψ ˆ SSψ ˆ 6.67 00.8. 0 SSB 0 NOV Source of Variation SS df MS F P-value F crit Between Groups 0..88 0.000.8867 Within Groups 6 6.87 Total 9 9-0 006. Karpinski

o Once we have calculated the SSC, then we can compute an F-test directly: SSC dfc F (, dfw) SSW dfw SSC MSW ψ ˆ : F observed 6.67.87 6.88 00.8 ψ ˆ : F observed 6. 0.87.0 ψ ˆ : F observed 0. 6.87 o NOV table for contrasts NOV Source of Variation SS df MS F P-value Between Groups 0..88 0.000 ψ ˆ 6.67 6.7 6.887 0.086 ψ ˆ 00.8 00.8 6.0 0.00007 ψ ˆ.0.0 0.6 0.67 Within Groups 6 6.87 Total 9 9 In this NOV table, we show that SSC partitions SSB. But this relationship only holds for sets of orthogonal contrasts In general, you should only construct an NOV table for a set of a- orthogonal contrasts Note: We will shortly see they you can either perform the omnibus test OR tests of orthogonal contrasts, but not both. Nevertheless, this NOV table nicely displays the SS partition. - 006. Karpinski

. Brand name contrasts easily obtained from SPSS Difference contrasts Helmert contrasts Simple contrasts Repeated contrasts Polynomial contrasts (to be covered later) Difference contrasts: Each level of a factor is compared to the mean of the previous levels (These are orthogonal with equal n) c c c c - 0 0 - - 0 - - - Contrast Coefficients (L' Matrix) Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] REWRD Difference Contrast Level vs. Level vs. Level vs. Level Previous Previous.000.000.000 -.000 -.00 -..000 -.00 -..000.000 -..000.000.000 The default display of this matrix is the transpose of the corresponding L matrix. UNINOV trials BY reward /CONTRST (reward)difference /PRINT test(lmatrix) - 006. Karpinski

Helmert contrasts: Each level of a factor is compared to the mean of subsequent levels (These are orthogonal with equal n). o The researcher s original hypotheses for the learning data are Helmert contrasts. c c c c - - - 0 - - 0 0 - Contrast Coefficients (L' Matrix) Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] REWRD Helmert Contrast Level Level Level vs. vs. Later vs. Later Level.000.000.000.000.000.000 -..000.000 -. -.00.000 -. -.00 -.000 The default display of this matrix is the transpose of the corresponding L matrix. UNINOV trials BY reward /CONTRST (reward)helmert /PRINT test(lmatrix) Simple contrasts: Each level of a factor is compared to the last level (These contrasts are not orthogonal). c c c c 0 0-0 0-0 0 - Contrast Coefficients (L' Matrix) Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] REWRD Simple Contrast a Level vs. Level vs. Level vs. Level Level Level 0 0 0 0 0 0 0 0 0 - - - The default display of this matrix is the transpose of the corresponding L matrix. a. Reference category UNINOV trials BY reward /CONTRST (reward)simple /PRINT test(lmatrix). - 006. Karpinski

Repeated contrasts: Each level of a factor is compared to the previous level (These contrasts are not orthogonal). c c c c - 0 0 0-0 0 0 - Contrast Coefficients (L' Matrix) Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] REWRD Repeated Contrast Level vs. Level vs. Level vs. Level Level Level 0 0 0 0 0-0 0-0 0 - The default display of this matrix is the transpose of the corresponding L matrix. UNINOV trials BY reward /CONTRST (reward)repeated /PRINT test(lmatrix).. Relationships between the omnibus F and contrasts (for equal n designs) Relationship #: The omnibus F test is equal to the average t s from all possible pairwise contrasts. o Consequence: If the omnibus F test is significant, then at least one pairwise contrast is significant. - 006. Karpinski

Relationship #: If you take the average F (or t s) from a set of a- orthogonal contrasts, the result will equal the omnibus F! o mini-proof: For ψ ˆ : F SSψˆ dfψˆ SSψˆ.... For ˆ ψ SSW a : F a MSW dfw SS ˆ ψ a df ˆ ψ a SSW dfw SS ˆ ψ a MSW F F F... F a a SS ˆ ψ MSW SS ˆ ψ MSW... SS ˆ ψ a MSW a SS ˆ ψ SS ˆ ψ... SS ˆ MSW a ψ a SSB MSW a SSB a MSW MSB MSW F omnibus o Consequence: If the omnibus F test is significant, then at least one contrast is significant. - 006. Karpinski

o In the learning example: NOV Source of Variation SS df MS F P-value Between Groups 0..88 0.000 ψ ˆ 6.7 6.7 6.80 0.0900 ψ ˆ 00.8 00.8 6.0 0.00007 ψ ˆ.0.0 0.6 0.67 Within Groups 6 6.87 Total 9 9 verage F from the set of orthogonal contrasts: 6.80 6.0.6.6 (Difference is due to rounding error) Is it possible to have a significant contrast, but have a non-significant omnibus F? YES! o Let s consider an example: IV Level Level Level Level 6 7 6 6 8 6-6 006. Karpinski

ONEWY dv BY iv /CONTRST - - -. NOV DV Between Groups Within Groups Total Sum of Squares df Mean Square F Sig..70 7.97.67.0 0.000 6.00 6.70 9 o Omnibus F-test is not significant Contrast Tests DV ssume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) 7.0000.99.88 6.0 o The contrast comparing Group to the average of the other groups is significant Suppose none of the pairwise contrasts are significant. Is it possible to have a significant contrast? YES! o If none of the pairwise contrasts are significant, then the omnibus F test will not be significant. But you may still find a contrast that is significant! IV Level Level Level Level 0 6 6-7 006. Karpinski

ONEWY dv BY iv /CONTRST - 0 0 /CONTRST 0-0 /CONTRST 0 0 - /CONTRST 0-0 /CONTRST 0 0 - /CONTRST 0 0 -. o None of the pairwise contrasts are significant: Contrast Tests DV Contrast 6 Value of Contrast Std. Error t df Sig. (-tailed) -.0000.00000 -.000 6.06 -.0000.00000 -.000 6.06 -.0000.00000 -.000 6..0000.00000.000 6.000.0000.00000.000 6. -.0000.00000 -.000 6. o So we know that the omnibus F-test is not significant: NOV DV Between Groups Within Groups Total Sum of Squares df Mean Square F Sig..70.8.8.8 0.000 6.00.70 9 o But it is still possible to find a significant contrast: ONEWY dv BY iv /CONTRST - -. Contrast Tests DV Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.0000. -. 6.00-8 006. Karpinski

To reiterate: o significant omnibus F-test There will be at least significant contrast o significant contrast DOES NOT IMPLY a significant omnibus F-test o non significant omnibus F-test DOES NOT IMPLY all contrasts will be non-significant 6. Robust tests for a single contrast The assumptions for contrasts are the same as those for NOV o Independent samples o Within each group, participants are independent and randomly selected o Equal population variances in each group o Each group is drawn from a normal population Tests of contrasts are not robust to heterogeneity of variances, even with equal n We can use our standard NOV techniques to test these assumptions. Presumably, by the time you are testing contrasts, you have already identified troublesome aspects about your data. But once you have identified the problems what can you do? o In general, the same fixes for NOV work for contrasts o Transformations can be used for non-normality or heterogeneous variances o sensitivity analysis can be used to investigate the impact of outliers There are two additional tools we did not use for NOV o Use a contrast-specific variance so that we do not assume equality of variances in all groups o Try a pairwise rank-based alternative -9 006. Karpinski

Use a contrast-specific variance o In the standard hypothesis test of a contrast, the denominator uses the MSW, a pooled variance estimate ci X i tobserved ci MSW n o What we would like to do is compute a new standard error of ψˆ that does not rely on MSW o The details are messy but fortunately you do not have to do the dirty work; SPSS automatically prints out tests of contrasts with unequal variances. When a, this test reduces exactly to the Welch s separate variance two-sample t-test o Let s return to the learning example and pretend that we found heterogeneous variance. Thus, to test our original hypotheses in the data (see p -), we need to use the modified test for contrast: ONEWY trials BY reward /CONTRST, -, -, - /CONTRST 0,, -., -. /CONTRST 0, 0,, -. i Contrast Tests TRILS ssume equal variances Does not assume equal variances ˆ H Contrast ψ : t (.).0, p. 00 ψ ˆ H : t ( 7.0).8, p. 00 ψ : t (.) 0.7, p. 0 ˆ H Value of Contrast Std. Error t df Sig. (-tailed) -8.0000.099 -.6 6.08 -.000.0789 -.0 6.000 -.0000.99 -.80 6. -8.0000.978 -.0..00 -.000.99 -.8 7.0.00 -.0000.780 -.7..0 o Remember, this Welch correction only corrects for unequal variances and does not correct or adjust for non-normality. -0 006. Karpinski

Try a non-parametric, rank-based alternative o Pair-wise tests can be conducted using the Mann-Whitney U test (or an NOV on the ranked data). o However, complex comparisons should be avoided! Because ranked data are ordinal data, we should not average (or take any linear combination) across groups. o comparison of Mann-Whitney U pairwise contrasts with NOV by ranks approach ψ : µ µ ψ : µ µ c : (,,0,0) c : (0,,,0 ) Mann-Whitney U pairwise contrasts: NPR TESTS NPR TESTS /M-W trials BY reward( ). /M-W trials BY reward( ). Test Statistics Mann-Whitney U Wilcoxon W Z symp. Sig. (-tailed) Exact Sig. [*(-tailed Sig.)] a. Not corrected for ties. TRILS 9.00.00 -.68..8 a Test Statistics Mann-Whitney U Wilcoxon W Z symp. Sig. (-tailed) Exact Sig. [*(-tailed Sig.)] a. Not corrected for ties. TRILS.000.000 -.6.008.008 a NOV by ranks approach: RNK VRIBLEStrials. ONEWY rtrials BY reward /CONT - 0 0 /CONT 0-0. Contrast Tests RNK of TRILS ssume equal variances Does not assume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.0000.99 -. 6.9-8.80000.99 -.680 6.00 -.0000.07 -.8.97.8-8.80000.7707 -.08.9.00-006. Karpinski

pproach Contrast Mann-Whitney U NOV by Ranks µ µ z -0.68, p. t(6) -0., p.9 µ z -.6, p.008 t(6) -.680, p.00 µ o rank modification of NOV is easy to use but: Not much theoretical work has been done on this type of test. This approach is probably not valid for multi-factor NOV. This approach is likely to be trouble for complex comparisons. Remember that the conclusions you draw are on the ranks, and not on the observed values! 7. Effect sizes for a single contrast For pairwise contrasts, you can use Hedges s g: g X X ˆ σ X X MSW In the general case there are several options o Use omega squared ( ω ) ˆ ω SSψˆ MSW SST MSW ω.0 ω.06 ω. small effect size medium effect size large effect size ω has an easy interpretation: it is the percentage of the variance in the dependent variable (in the population) that is accounted for by the contrast - 006. Karpinski

o Treat the complex comparison as a comparison between two groups and use Hedges s g, but we need the sum of the contrast coefficients to equal : g ˆ ψ MSW where a i ny contrast can be considered to be a comparison between two groups. We can use the mean of those two groups to compute a d. ψ µ ψ µ µ ψ is a comparison between group and the average of groups - H 0 : µ µ µ µ µ g X X X X X MSW ˆ ψ MSW ψ is a comparison between the average of groups and and the average of groups and µ µ µ µ H 0 : g X X X X MSW ˆ ψ MSW Interpretation of this g is the same as the g for two groups, but you must be able to interpret the contrast as a comparison between two groups. For example, polynomial contrasts cannot be considered comparisons between two groups. Thus, g is not appropriate for polynomial contrasts. - 006. Karpinski

o Compute an r measure of effect size: r F contrast F contrast df within t contrast t contrast df within r is interpretable as the (partial) correlation between the group means and the contrast values, controlling for non-contrast variability. 8. n example Rehabilitation Example. We have a sample of male participants between the age of 8 and 0 who have all undergone corrective knee surgery in the past year. We would like to investigate the relationship between prior physical fitness status (below average, average, above average) and the number of days required for successful completion of physical therapy. Prior physical fitness status Below verage verage bove verage 9 0 6 8 9 0 8 0 0 0 9 9 o We would like to test if: bove average participants complete therapy faster than other groups verage participants complete therapy faster than below average participants verage participants complete therapy slower than above average participants - 006. Karpinski

days 0 0.00 0.00 0 0.00 0.00 DYS 0 0.00 N 8 Below verage FITNESS 0 verage 6 bove verage.00.0.00.0.00 fitness o We need to convert the hypotheses to contrast coefficients bove average participants complete therapy faster than other groups µ µ H 0 : µ ψ µ c,, verage participants complete therapy faster than below average participants H 0 : µ µ ψ µ c (,,0 ) µ verage participants complete therapy slower than above average participants H 0 : µ µ ψ µ c ( 0,, ) µ o re these three contrasts an orthogonal set? With groups, we can only have orthogonal contrasts If we had equal sample sizes, then ψ ψ With unequal n we do not have an orthogonal set of contrasts - 006. Karpinski

o Conduct significance tests for these contrasts ONEWY days BY fitness /STT desc /CONTRST -.,-., /CONTRST -,, 0 /CONTRST 0, -,. Descriptives DYS Below verage verage bove verage Total 9% Confidence Interval for Mean N Mean Std. Deviation Std. Error Lower Bound Upper Bound 8 8.0000.77.969.09.79 0.0000.60.09 9.9.78 6.0000.79.8079 9.0 8.660.0000 6.8778.09 9.098.90 NOV DYS Between Groups Within Groups Total Sum of Squares df Mean Square F Sig. 67.000 6.000 6.96.000 6.000 9.80 088.000 Contrast Coefficients Contrast FITNESS Below bove verage verage verage -. -. - 0 0 - Contrast Tests DYS ssume equal variances Does not assume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.0000.00 -..000-6.0000.9 -.8.00-8.0000.988 -.8.00 -.0000.0 -.8 8.98.00-6.0000.86 -.697.97.00-8.0000. -.78 8.696.00-6 006. Karpinski

o Compute a measure of effect size for each contrast ˆ ω SSψˆ MSW SST MSW We need to compute the SS for each contrast SSψˆ ψˆ c i n i SSψ ˆ 8 ( ) 0 () 6.80. 9 SS ˆ ψ ( ) () ( 0 ) 8 ( 6) 0 6 6 60. SSψ ˆ 0 ( 8) ( ) () 0 6 6 0. 667 Now compute omega squared.80 9.8 ψ ˆ : ˆ ω. 7 088 9.8 60 9.8 ψ ˆ : ˆ ω. 7 088 9.8 0 9.8 ψ ˆ : ˆ ω. 99 088 9.8-7 006. Karpinski

OR Compute an r measure of effect size for each contrast: r F contrast F contrast df within t contrast t contrast df within ψ ˆ : r...7 ψ ˆ : r.8.8. ψ ˆ : r.8.8.6 o Report the results ψ : t ()., p <.0, ω. 7 ˆ ˆ ψ : t ().8, p.0, ω. ˆ ψ : t ().8, p <.0, ω. 0 In your results section, you need to say in English (not in statistics or symbols) what each contrast is testing In general, it is not necessary to report the value of the contrast or the contrast coefficients used... contrast revealed that above average individuals recovered faster than all other individuals, t ()., p <.0, ω. 7. Pairwise tests also revealed that average individuals completed therapy faster than below average individuals, t ().8, p.0, ω., and that above average individuals completed therapy faster than average participants, t ().8, p <.0, ω.0. -8 006. Karpinski

9. Polynomial Trend Contrasts Trend contrasts are a specific kind of orthogonal contrasts that may be of interest for certain designs. Tests for trends are used only for comparing quantitative (ordered) independent variables. o IV 0mg, 0mg, 0mg, 0mg of a drug Trend contrasts are used to explore polynomial trends in the data Order of # of Trend Polynomial Bends Shape Linear st 0 Straight Line Quadratic nd U-shaped Cubic rd Wave Quartic th Wave Etc. Linear Quadratic 0 - - - 0 - - - Cubic Quartic 0 - - - 0 0 - -9 006. Karpinski

Memory Example # Study Time Minute Minutes Minutes Minutes 6 8 7 0 9 7 9 0 7 9 8 7 8 9 6 7 9 o It looks like there might be a linear trend in the data o To test for trends, tables of orthogonal trend contrasts have been computed. For a, we can have orthogonal contrasts c c c c Linear - - Quadratic - - Cubic - - o To use these values, the levels of the IV need to be equally spaced and the cell sizes must be equal -0 006. Karpinski

o To compute the trend contrasts, we use the orthogonal trend contrasts and the usual procedure for computing and testing contrasts: For the linear contrast, use c (-,-,, ) ψ linear µ µ µ ˆ ψ linear X X X X () (6) (8) (9) SS( ˆ () ψ linear ) ( ) 6 ( ) () 6 6 () 8.7 6 For the quadratic contrast, use c (, -, -, ) ψ quadratic µ µ µ µ ˆ ψ quadratic X X X X 6 8 9 SS( ˆ ψ quadratic ) ( ) () 6 ( ) 6 6 ( ) () 6.0 For the cubic contrast, use c (-,, -, ) ψ cubic µ µ µ ˆ ψ cubic X X X X (6) (8) 9 SS( ˆ () ψ cubic ) ( ) 6 () 6 ( ) 0.0 () 6 6 Comments about trend contrasts o These contrasts are orthogonal (when ns are equal), so it is possible to have any combination of effects (or lack of effects) o Because the sets of weights are not equally scaled, you cannot compare the strength of effects simply by inspecting the value of the contrast. o Some people place an additional constraint on the contrast weights: a c i i When the sum of the absolute value of the contrast values is not constant across contrasts (as with the trend contrasts), then you CN NOT compare contrast values. You can only compare sums of squares and measures of effect size. - 006. Karpinski

Memory Example # Study Time Minute Minutes Minutes Minutes 6 6 7 8 7 9 0 6 7 9 8 7 8 6 6 7 o Looks like we have a linear effect with some quadratic ψ linear µ µ µ ˆ ψ linear X X X X (6) (6) (7) () 9 SS( ˆ (9) ψ linear ) ( ) 6 ( ) () 6 6 () 08. 6 ψ quadratic µ µ µ µ ˆ ψ quadratic X X X X 6 6 7 SS( ˆ ψ quadratic ) () () 6 ( ) 6 6 ( ) () 6 7.0 ψ cubic µ µ µ ˆ ψ cubic X X X X 6 (6) (7) SS( ˆ ψ cubic ) ( ) () ( ) 6 () 6 6 () 6.70-006. Karpinski

Memory Example # Study Time Minute Minutes Minutes Minutes 6 8 7 0 9 0 0 7 9 7 8 6 7 Looks like we have a quadratic effect ψ linear µ µ µ ˆ ψ linear X X X X () (6) (7) () SS( ˆ () ψ linear ) ( ) 6 ( ) () 6 6 ().8 6 ψ quadratic µ µ µ µ ˆ ψ quadratic X X X X 6 6 7 0 SS( ˆ ψ quadratic ) (0) () 6 ( ) 6 6 ( ) () 6 0 ψ cubic µ µ µ ˆ ψ cubic X X X X (6) (7) SS( ˆ ψ cubic ) ( ) ( ) ( ) 6 () 6 6 () 6.0-006. Karpinski

trials Statistical tests for trend analysis: Reanalyzing the learning example o Our learning example is perfectly suited for a trend analysis (Why?) o When we initially analyzed these data, we selected one set of orthogonal contrasts, but there are many possible sets of orthogonal contrasts, including the trend contrasts 0.00 7.0.00.0 0.00.00.00.00.00 reward o For a, we can test for a linear, a quadratic, and a cubic trend c c c c Linear - - Quadratic - - Cubic - - ψˆ linear X X X X () () (6) (7) 0 SS( ˆ (0) ψ lin ) ( ) ( ) () () ψˆ quadratic X X X X () () (6) (7) 00 SS( ˆ () ψ quad ) () ( ) ( ) () ψˆ cubic X X X X () () (6) (7) 0 ( 0) ( ) () ( ) SS( ˆ ψ cubic ) () - 006. Karpinski

o Rather than determine significance by hand, we can use ONEWY: ONEWY trials BY reward /CONT -, -,, /CONT, -, -, /CONT -,, -,. Contrast Coefficients Contrast REWRD.00.00.00.00 - - - - - - Contrast Tests TRILS ssume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) 0.0000.9700.080 6.000.0000.76068.6 6.7-0.0000.9700 -.0 6.0 We find evidence for significant linear and cubic trends ψ ˆ : t(6).08, p <.0, r.79 linear ψ ˆ : t(6)., p.7, r.7 quadratic ψ ˆ : t(6)., p.0, r. cubic - 006. Karpinski

To complete the NOV table, we need the sums of squares for each contrast SSψˆ ψˆ c i n i SS ˆ ψ linear 00 SS ˆ ψ quad SS ˆ ψ cubic SSψˆ linear SSψˆ quadratic SSψˆ cubic 00 0 SSB NOV Source of Variation SS df MS F P-value Between Groups 0..88 0.000 ψˆ lin 00 00.806 0.000 ψˆ quad.90 0.777 ψˆ cubic 6. 0.087 Within Groups 6 6.87 Total 9 9-6 006. Karpinski

You can also directly ask for polynomial contrasts in SPSS o Method : ONEWY ONEWY trials BY reward /POLYNOMIL. fter polynomial, enter the highest degree polynomial you wish to test. NOV TRILS Between Groups (Combined) Linear Term Contrast Deviation Sum of Squares df Mean Square F Sig. 0.000..8.000 00.000 00.000.806.000 0.000.000.87.0 Within Groups Total Quadratic Term Cubic Term Contrast Deviation Contrast.000.000.90.7.000.000 6..0.000.000 6..0 6.000 6.87 9.000 9 o dvantages of the ONEWY method for polynomial contrasts: It utilizes the easiest oneway NOV command It gives you the sums of squares of the contrast It uses the spacing of the IV in the data (Be careful!) It gives you the deviation test (to be explained later) o Disadvantages of the ONEWY method for polynomial contrasts: You can not see the value of the contrast or the contrast coefficients -7 006. Karpinski

o Method : UNINOV UNINOV trials BY reward /CONTRST (reward)polynomial /PRINT test(lmatrix). Contrast Results (K Matrix) REWRD Polynomial Contrast a Linear Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Dependent Variable TRILS.7 0.7 Quadratic Cubic Std. Error Sig. 9% Confidence Interval for Difference Lower Bound Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval for Difference Lower Bound Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized).880.000.606 6.8.000 0.000.880.7 -.866.866 -.6 0 -.6 Std. Error Sig. 9% Confidence Interval for Difference a. Metric.000,.000,.000,.000 Lower Bound Upper Bound.880.0 -.0 -.70 Test Results Dependent Variable: TRILS Sum of Source Squares df Mean Square F Sig. Contrast 0.000..8.000 Error 6.000 6.87-8 006. Karpinski

Contrast Coefficients (L' Matrix) REWRD Polynomial Contrast a Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] Linear Quadratic Cubic.000.000.000 -.67.00 -. -. -.00.67. -.00 -.67.67.00. The default display of this matrix is the transpose of the corresponding L matrix. a. Metric.000,.000,.000,.000 This is the matrix of trend coefficients used by SPSS to calculate the contrasts. SPSS coefficients SPSS coefficients X 6 c (.67,.,.,.67) c (,,, ) c (.,.,.,.) c (,,,) c (.,.67,.67.) c (,,,) You can check that these coefficients are orthogonal o Suppose the reward intervals were not equally spaced: Constant (00%) Frequent (7%) Level of Reward Infrequent (%) Never (0%) Now we cannot use the tabled contrast values, because they require equal spacing between intervals -9 006. Karpinski

trials trials Equal spacing Unequal spacing 0.00 0.00 7.0 7.0.00.00.0.0 0.00 0.00.00.00.00.00 reward 0.00 0. 0.0 0.7.00 reward UNINOV trials BY reward /CONTRST (reward)polynomial (,.7,., 0) /PRINT test(lmatrix). Contrast Results (K Matrix) REWRD Polynomial Contrast a Linear Quadratic Cubic Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval for Difference Lower Bound Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval for Difference Lower Bound Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval for Difference a. Metric.000,.70,.0,.000 Lower Bound Upper Bound Dependent Variable TRILS -.7 0 -.7.880.000-6.60 -.877.000 0.000.880.7 -.866.866.8 0.8.880.09 -.8.7-0 006. Karpinski

Now only the linear trend is significant SPSS calculates a set of orthogonal trend contrasts, based on the spacing you provide. Here they are: Contrast Coefficients (L' Matrix) REWRD Polynomial Contrast a Parameter Intercept [REWRD.00] [REWRD.00] [REWRD.00] [REWRD.00] Linear Quadratic Cubic.000.000.000.6.00.6.6 -.00 -.6 -.6 -.00.6 -.6.00 -.6 The default display of this matrix is the transpose of the corresponding L matrix. a. Metric.000,.70,.0,.000 o dvantages of the UNINOV method: It is the only way SPSS gives you confidence intervals for a contrast (But remember, the width of CIs depends on the contrast values) It allows you to deal with unequally spaced intervals (It assumes equal spacing unless you tell it otherwise; no matter how you have the data coded!) It will print the contrast values SPSS uses o Disadvantages of the UNINOV method: It does not print out a test statistic or the degrees of freedom of the test!?! Remember, for a one-way design, you can obtain a test of any contrast by using the ONEWY command and entering the values for each contrast. With ONEWY method, you know exactly how the contrast is being computed and analyzed - 006. Karpinski

0. Simultaneous significance tests on multiple orthogonal contrasts Sometimes you may wish to test the significance of several contrasts in a single omnibus test Example # o We would like to compare the effect of four drugs on body temperature. To test these drugs, we randomly assign people to receive one of the four drugs. fter a period of time, we record each participant s body temperature. Drug Drug B Drug C Drug D 9. 9. 9.7 96. 9.8 96. 9.0 9. 9.0 96. 9.9 96. 9. 96. 9.6 9.8 9.6 9.9 9. 9.7 o Our hypothesis was that Drug B would result in a higher body temperature than the other drugs ψ µ µ B µ C D o We also wanted to know if the other drugs (, C, D) differed in their effect on body temperature H : µ µ µ 0 C D H : The three means are not equal Note that this hypothesis is an omnibus hypothesis! Report TEMP DRUG B C D Total Mean N Std. Deviation 9.000.6 96.000.6 9.9000.786 9.7000.67 9.70 0.677-006. Karpinski

o Step : Obtain SSB, SST, and MSW NOV TEMP Between Groups Within Groups Total Sum of Squares df Mean Square F Sig..7. 7.70.00.90 6.8 7.7 9 o Step : Test ψ µ µ µ B C D ψ ˆ (9.) (96.) (9.9) (97.). SS ˆ ψ ( ) () ( ) ( ) (.) 6..60. NOV Source of Variation SS df MS F P-value Between Groups.7. 7.70.00 ψ.60.60.69 0.007 Within Groups.9 6.8 Total 7.7 9 o Step : Test H 0 : µ µ C µ D The trick is to remember that an omnibus NOV test m means is equal to the simultaneous test on any set of (m-) orthogonal contrasts We can then combine these orthogonal contrasts in a single omnibus F-test: F comb (m, dfw) SSC... SSC m m MSW - 006. Karpinski

We need to choose any contrasts so long as we have an orthogonal set of three contrasts (including the contrast associated with the first hypothesis): c : (,,, ) c : (,0,,) c : (,0,,0) This set will work because all three contrasts are orthogonal. Now, let s compute the simultaneous F-test of these two contrasts. ψ ˆ (9.) 0 (9.9) (97.). SS ˆ ˆ ψ ( ) ( ) ( ) 0 ( ) (.) ψ (9.) 0 (9.9) 0(97.) 0. SS ˆ ψ ( ) ( ) () 0 ( 0 ) ( 0.).69.08..09.. F comb (m, dfw) SSC... SSC m m MSW F comb (,6).08 0..8 We reject H 0 and conclude that.7, p.0 µ, are not all equal µ C, and µ D NOV Source of Variation SS df MS F P-value Between Groups.7. 7.70.00 ψ.60.60.69 0.007 µ µ C µ D.6 0.867.78.086 ( ψ,ψ ) Within Groups.9 6.8 Total 7.7 9-006. Karpinski

o Note: We also could have computed the test of µ µ C µ D without directly computing the two orthogonal contrasts: If we knew the combined sums of squares of these two contrasts then we could fill in the remainder of the NOV table. But we do know the combined sums of squares for the remaining two contrasts (so long as all the contrasts are orthogonal)! NOV Source of Variation SS df MS F P-value Between Groups.7. 7.70.00 ψ.60.60.69 0.007 µ µ C µ D Within Groups.9 6.8 Total 7.7 9 SSB SSψ SSψ SSψ SS ψ SSψ SSB SSψ.7.60.6 We can substitute SSψ SSψ into the table and compute the F-test as we did previous (except in this case, we never identified or computed the two additional contrasts to complete the orthogonal set). NOV Source of Variation SS df MS F P-value Between Groups.7. 7.70.00 ψ.60.60.69 0.007 µ µ C µ D.6 0.867.78.086 Within Groups.9 6.8 Total 7.7 9-006. Karpinski

Example # o We want to examine the effect of caffeine on cognitive performance and attention. Participants are randomly assigned to one of dosages of caffeine. In a subsequent proofreading task, we count the number of errors. Dose of Caffeine 0mg 0mg 00mg 0mg 00mg 0 0 0 6 0 - N 0 0 0 0 0 0 mg 0 mg 00 mg 0 mg 00 mg DOSE -6 006. Karpinski

Descriptives ERRORS 0 mg 0 mg 00 mg 0 mg 00 mg Total 9% Confidence Interval for Mean N Mean Std. Deviation Std. Error Lower Bound Upper Bound 0.9000.97.789.06.76 0.000.978.07.808.9 0.000.87.6667.7968.00 0.000.989.909.6.87 0.000.978.07.808.9 0.000.696.67.7689. o We would like to test if there is a linear or a quadratic trend. We are not really interested in any higher order trends o With equally spaced intervals, we can use the coefficients from the orthogonal polynomial table. With five groups, we can test up to four orthogonal polynomials c c c c c Linear - - 0 Quadratic - - - Cubic - 0 - Quartic - 6 - -7 006. Karpinski

o Method : Let SPSS do all the work ONEWY errors BY dose /POLYNOMIL. NOV ERRORS Between Groups (Combined) Linear Term Contrast Deviation Sum of Squares df Mean Square F Sig..600.60.79.00.0.0..09 8.90 6.06 6..00 Within Groups Total Quadratic Term Contrast Deviation.07.07.8.00.98.9..089.900.976 66.00 9 Under Linear Term CONTRST is the test for the linear contrast: F(,)., p.09 DEVITION is the combined test for the quadratic, cubic, and quartic contrasts: F(,) 6.6, p.00 Under Quadratic Term CONTRST is the test for the quadratic contrast: F(,)., p.00 DEVITION is the combined test for the cubic and quartic contrasts F(,)., p.089 Is it safe to report that there are no higher order trends? -8 006. Karpinski

o Method a: Let SPSS do some of the work ONEWY errors BY dose /CONT - - 0 /CONT - - - /CONT - 0 - /CONT - 6 -. Contrast Tests ERRORS ssume equal variances Does not assume equal variances Contrast Value of Contrast Std. Error t df Sig. (-tailed) -.000.98770 -.6.09.000.6866.679.00.000.98770.7.0 -.0000.6 -.8.70 -.000.060 -.976.7.060.000.890.66.679.00.000.9769. 8.7.0 -.0000.7908 -.0 6.966.678 o Method b: Let SPSS do some of the work UNINOV errors BY dose /CONTRST (dose)polynomil /PRINTTEST(LMTRIX). Contrast Coefficients (L' Matrix) DOSE Polynomial Contrast a Parameter Intercept [DOSE.00] [DOSE.00] [DOSE.00] [DOSE.00] [DOSE.00] Linear Quadratic Cubic Order.000.000.000.000 -.6. -.6.0 -.6 -.67.6 -.78.000 -..000.77.6 -.67 -.6 -.78.6..6.0 The default display of this matrix is the transpose of the corresponding L matrix. a. Metric.000,.000,.000,.000,.000-9 006. Karpinski

DOSE Polynomial Contrast a Linear Quadratic Cubic Order Contrast Results (K Matrix) Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval Lower Bound for Difference Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval Lower Bound for Difference Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Std. Error Sig. 9% Confidence Interval Lower Bound for Difference Upper Bound Contrast Estimate Hypothesized Value Difference (Estimate - Hypothesized) Depende nt Variable ERRORS -.66 0 -.66..09 -.9 -.0E-0.9 0.9..00.0.778.696 0.696..0 6.66E-0. -.0 0 -.0 Std. Error Sig. 9% Confidence Interval for Difference a. Metric.000,.000,.000,.000,.000 Lower Bound Upper Bound..70 -.79.0-60 006. Karpinski

o Here are the results ψ ˆ linear : t()., p.09, r.0 ψ ˆ : t().68, p.00, r.8 quadratic ψ ˆ cubic : t()., p.0, r. ψ ˆ : t() 0.8, p.70, r.06 quartic We conclude there are significant linear, quadratic and cubic trends. o Wait a minute... Didn t we just conclude there were no significant trends higher than quadratic!?! o When the omnibus test is not significant, you still may be able to find significant contrasts. (Remember, we demonstrated that a significant contrast does not imply a significant omnibus F-test) Use combined contrast tests with caution! NOV Table Source Sum of Squares df Mean Square F Sig. Between Groups.600.60.79.00 Linear Term.0.0..09 Quadratic Term.07.07.8.00 Cubic Term.80.80.96.0 th-order Term...6.70 Within Groups.900.976 Total 66.00 9 In general, to test m orthogonal contrasts simultaneously SS ˆ ψ... SS ˆ ψ m m F(m,dfw) MSW Where m a -6 006. Karpinski

. Polynomial trends with unequal cell size Our formulas for computing the value of a contrast, the sums of squares of a contrast, and the significance of a contrast can all handle designs with unequal n in each cell a ˆ ψ c i X i c X c X c X... c a X a j SSψˆ ψˆ c i n i t observed (N a) c i X i MSW c i n i or F observed (,N a) ˆ ψ MSW c i n i The problem is in the orthogonality of contrasts with unequal n ψ (a,a,a,...,a a ) ψ (b,b,b,...,b a ) o Two contrasts are orthogonal for unequal n if a j a i b i n i 0 or a b n a b n... a a b a n a 0 o ll of our standard orthogonal contrasts will no longer be orthogonal -6 006. Karpinski

anxiety Example # with unequal n It was of interest to determine the effects of the ingestion of alcohol on anxiety level. Five groups of 0 year-old adults were administered between 0 and ounces of pure alcohol per day over a one-month period. t the end of the experiment, their anxiety scores were measured with a well-known nxiety scale. 0oz. oz. oz. oz. oz. 99 9 8 99 9 0 8 9 0 0 09 87 87 0 98 9 88 0 00 6 9 06 X j 0.7 0.80 00.00 8.60 9.80 n j 7 0 0.00 0 9 NXIETY 00 80 60 0 N.00.00 7.00 0.00.00 0.00 90.00 70.00 0.00.00.00.00.00 LCOHOL alcohol -6 006. Karpinski

o We would like to test for linear, quadratic, cubic and higher-order trends UNINOV anxiety BY alcohol /CONTRST (alcohol)polynomial /PRINTtest(LMTRIX). Contrast Coefficients (L' Matrix) LCOHOL Polynomial Contrast a Parameter Intercept [LCOHOL.00] [LCOHOL.00] [LCOHOL.00] [LCOHOL.00] [LCOHOL.00] Linear Quadratic Cubic Order.000.000.000.000 -.6. -.6.0 -.6 -.67.6 -.78.000 -..000.77.6 -.67 -.6 -.78.6..6.0 The default display of this matrix is the transpose of the corresponding L matrix. a. Metric.000,.000,.000,.000,.000 We could also use ONEWY, but then we could not see the contrast coefficients SPSS generates these contrasts, let s check to see if they are orthogonal a a i b i 0 j n i Linear vs.quadratic : (.6)(.) (.6)(.67) (.6)(.67) (.6)(.) 0 0 The SPSS generated contrasts are not orthogonal when ns are unequal! Let s see what happens when we proceed: ψˆ SSψˆ c i n i -6 006. Karpinski

ˆ.6X.6X 0X.6X.6X.6 0.7 ψ linear.8 ( ).6( 0.80) 0.6( 8.60).6( 9.80) ψ ˆ quad.8 ψ ˆ cubic. 8 ψ ˆ quartic 8. 80 SSψˆ linear (.8) (.6) (.6) ( 0) (.6) (.6) 7 97.8 SSψˆ quadratic (.8) (.6) (.6) ( 0) (.6) (.6) 7 786.9 SSψˆ cubic (.8) (.6) (.6) ( 0) (.6) (.6) 7 8. SSψˆ quartic ( 8.8) (.0) (.78) (.77) (.78) (.0) 7 9.7 NOV NXIETY Between Groups Within Groups Total Sum of Squares df Mean Square F Sig. 9.06 88.766 9.7.000 9.0 9.0 8.6-6 006. Karpinski

dv SSψˆ linear SSψˆ quadratic SSψˆ cubic SSψˆ quartic SSB 9.07 97.8 786.9 8. 9.7 6.0 6.0 9.07 For non-orthogonal contrasts, we can no longer decompose the sums of squares additively One fix is to weight the cell means by their cell size. Using this weighted approach is equivalent to adjusting the contrast coefficients by the cell size. Example # with unequal n Group X j n j 0 0 6.00 8 6 0.00.00 0.00 DV - N 0 0.00.00.00.00 GROUP.00.00.00.00 group -66 006. Karpinski

o Is there a linear trend in the DV? ONEWY dv BY group /POLY. NOV DV Between Groups (Combined) Linear Term Unweighted Weighted Sum of Squares df Mean Square F Sig. 66.68.6 8.996.000.06.06.0.87.00.00.80.0 Within Groups Quadratic Term Cubic Term Unweighted Weighted Unweighted Weighted.6.6.99.000.06.06.6.000.7.7.06.80.7.7.06.80 6.8 66.6 Total 9.06 69 o ccording to the unweighted analysis, there is no linear trend This analysis treats all group means equally The Contrast SS do not partition the SSB o ccording to the weighted analysis, there is a linear trend This analysis gives most of the weight to group and group The Contrast SS do partition the SSB exactly o Which method is better? If your goal is to compare group means, then you should conduct the unweighted analysis. This case holds most of the time! Remember, there is nothing wrong with testing non-orthogonal contrasts But you cannot construct combined contrasts tests If the inequality in the cell sizes reflects a meaningful difference in group sizes and you want to reflect those differences in your analysis, then a weighted means approach may be appropriate. You must have a representative sample Your main goal would NOT be to compare groups If you think a weighted analysis may be appropriate, then you should read more about proper interpretation of this analysis. (see Maxwell & Delaney, 990) -67 006. Karpinski