Analysis of Variance (ANOVA)

Two types of ANOVA test: independent measures and repeated measures.

Comparing 2 means (X̄1 = 20, X̄2 = 30): a t-test will do. How can we compare 3 means (X̄1 = 20, X̄2 = 30, X̄3 = 35)? ANOVA. The ANOVA test compares three or more groups or conditions, and for ANOVA we work with variances instead of comparing means directly.

Why use ANOVA (3 advantages):
(a) ANOVA can test for trends in our data.
(b) ANOVA is preferable to performing many t-tests on the same data, because it avoids inflating the risk of a Type 1 error.

Example: the effect of fertilizer on plant height in cm ("too much of a good thing"). From a population of seeds, randomly select 40 seeds and assign them to four conditions: no fertilizer, 10 gm, 20 gm or 30 gm of fertilizer. The dependent variable is the height of the plant (cm).

Why a single test matters: suppose we have just 3 groups. We would have to compare group 1 with group 2, group 1 with group 3, and group 2 with group 3. Each time we perform a test there is a (small) probability of rejecting a true null hypothesis, and these probabilities add up. So we want a single test, which is ANOVA.
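Advantage (b) can be made concrete with a short calculation. Treating the tests as independent (an approximation, since pairwise comparisons share groups), the chance of at least one false rejection across k tests at a per-test α grows as 1 − (1 − α)^k. A minimal sketch in plain Python:

```python
# Familywise Type 1 error rate for k tests, each run at per-test alpha,
# treating the tests as independent (an approximation for pairwise t-tests).
def familywise_error(k, alpha=0.05):
    return 1 - (1 - alpha) ** k

# With 3 pairwise t-tests at alpha = .05 the overall false-positive
# risk is already about 14%; with 6 tests it exceeds 26%.
for k in (1, 3, 6):
    print(k, round(familywise_error(k), 4))
```

This is why three separate t-tests on three groups are worse than the single ANOVA test.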

(c) ANOVA can be used to compare groups that differ on two, three or more independent variables, and can detect interactions between them. Example: age differences in the effects of alcohol on motor coordination (score in errors, plotted against alcohol dosage in number of drinks).

Independent-measures ANOVA: each subject participates in only one condition of the experiment (which is why it is called independent measures). An independent-measures ANOVA is equivalent to an independent-measures t-test, except that you have more than two groups of subjects.

The logic behind ANOVA, by example. Effects of caffeine on memory: FOUR GROUPS, each given a different amount of caffeine, followed by a memory test (words remembered from a list). Variation in the set of scores comes from TWO sources:
(1) random variation from the subjects themselves (due to individual differences in motivation, aptitude, mood, ability to understand instructions, etc.);
(2) systematic variation produced by the experimental manipulation.

Within-group variation reflects random variation only; between-group variation reflects systematic variation plus random variation. ANOVA compares the amount of systematic variation to the amount of random variation, to produce an F-ratio:

F = systematic variation / random variation ("error")

A large value of F means that a lot of the overall variation in scores is due to the experimental manipulation rather than to random variation between subjects. A small value of F means that the variation produced by the experimental manipulation is small compared to the random variation between subjects.

"Analysis of variance" means exactly that: analysing, or breaking down, variance. We start by breaking down sums of squares (SS), which we first met when we did the t-test:

SS = Σ(X − X̄)²

Nine calculations go into an ANOVA test (reading the calculation diagram from bottom to top, with F as the final calculation): three SS values, three degrees-of-freedom values, two mean squares, and finally F itself. We divide each SS by the appropriate degrees of freedom (usually the number of groups or of scores minus 1) to get a variance.
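The logic above can be checked by simulation: when the null hypothesis is true, the "systematic" variation between groups is itself just random variation, so F should stay near 1 on average. A minimal sketch in plain Python (the data here are randomly generated, not taken from the lecture):

```python
import random
random.seed(1)

def one_way_F(groups):
    """F = MS_between / MS_within for a list of equal-sized groups."""
    k, n = len(groups), len(groups[0])
    N = k * n
    G = sum(sum(g) for g in groups) / N                 # grand mean
    means = [sum(g) / n for g in groups]
    ss_between = sum(n * (m - G) ** 2 for m in means)
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Under H0 all four groups are drawn from the same normal population,
# so the between-group variation is just more random variation and
# F hovers near 1 on average rather than growing large.
Fs = [one_way_F([[random.gauss(0, 1) for _ in range(5)] for _ in range(4)])
      for _ in range(2000)]
print(sum(Fs) / len(Fs))
```

Only a real treatment effect pushes the numerator systematically above the denominator and produces a large F.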

step 1. State the null hypothesis and set the alpha level:
H0: µ1 = µ2 = µ3 = µ4 (no treatment effect), α = .05

steps 2, 3 & 4. Calculate three SS values: (1) total, (2) between treatments, (3) within treatments.

step 2. Total SS, where G is the grand mean:
SS total = Σ(X − G)²
Here SS total = 297 and G = 9.5.

step 3. Between-treatments SS, from the treatment means X̄1 = 4, X̄2 = 9, X̄3 = 12, X̄4 = 13, with n = the number of participants per treatment:
SS between treatments = n[(X̄1 − G)² + (X̄2 − G)² + (X̄3 − G)² + (X̄4 − G)²] = 245

step 4. Within-treatments SS: for each treatment, SSi = Σ(X − X̄i)²; then
SS within treatments = SS1 + SS2 + SS3 + SS4 = 52
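Steps 2 to 4 can be reproduced by hand in a few lines of plain Python. The raw scores below are hypothetical, constructed to reproduce the lecture's group means (4, 9, 12, 13) and totals (SS total = 297, between = 245, within = 52); the original raw data are not shown in the notes.

```python
# Hand computation of the three SS values for a one-way ANOVA.
# Hypothetical scores chosen to match the worked example's totals.
groups = [
    [1, 5, 4, 4, 6],       # mean 4
    [6, 9, 10, 10, 10],    # mean 9
    [9, 13, 12, 12, 14],   # mean 12
    [10, 13, 14, 14, 14],  # mean 13
]
scores = [x for g in groups for x in g]
G = sum(scores) / len(scores)                      # grand mean = 9.5

ss_total = sum((x - G) ** 2 for x in scores)
means = [sum(g) / len(g) for g in groups]
n = len(groups[0])                                 # participants per treatment
ss_between = sum(n * (m - G) ** 2 for m in means)
ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)

print(ss_total, ss_between, ss_within)  # 297.0 245.0 52.0
```

Note that the partition SS total = SS between + SS within holds exactly, which is a useful arithmetic check on any hand calculation.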

step 5. Calculate the degrees of freedom (total, between treatments and within treatments):
df total = number of scores − 1 = 19
df between treatments = number of treatments − 1 = 3
df within treatments = df1 + df2 + df3 + df4 = 4 + 4 + 4 + 4 = 16

As a check: (total SS) = (between-treatments SS) + (within-treatments SS), and (total df) = (between-treatments df) + (within-treatments df).

step 6. Calculate the between-treatments and within-treatments variances, or mean squares (MS), for the ANOVA summary table:
MS = SS / df

step 7. Compute F and assess its significance:
F = MS between treatments / MS within treatments = 81.67 / 3.25 = 25.13

The bigger the F-ratio, the less likely it is to have arisen merely by chance, and the more likely it is that you can reject the null hypothesis. Use the between-treatments and within-treatments df to find the critical value of F; your F is significant if it is equal to or larger than the tabled value. Here we look up the critical value for 3 and 16 df. Columns correspond to between-treatments df and rows to within-treatments df, so go along to column 3 and down to row 16: the critical F is at the intersection. Our obtained F = 25.13 is bigger than 3.24; it is therefore significant at p < .05.

Critical values of F at α = .05 (columns: between-treatments df; rows: within-treatments df):

         1       2       3       4
 1   161.4   199.5   215.7   224.6
 2   18.51   19.00   19.16   19.25
 3   10.13    9.55    9.28    9.12
 4    7.71    6.94    6.59    6.39
 5    6.61    5.79    5.41    5.19
 6    5.99    5.14    4.76    4.53
 7    5.59    4.74    4.35    4.12
 8    5.32    4.46    4.07    3.84
 9    5.12    4.26    3.86    3.63
10    4.96    4.10    3.71    3.48
11    4.84    3.98    3.59    3.36
12    4.75    3.89    3.49    3.26
13    4.67    3.81    3.41    3.18
14    4.60    3.74    3.34    3.11
15    4.54    3.68    3.29    3.06
16    4.49    3.63    3.24    3.01
17    4.45    3.59    3.20    2.96
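Steps 6 and 7 can likewise be reproduced in a few lines. The SS and df values are those from the worked example above, and the critical value is copied from the F table:

```python
# Steps 6 and 7 of the worked example: mean squares and the F-ratio.
ss_between, ss_within = 245.0, 52.0
df_between, df_within = 3, 16

ms_between = ss_between / df_between   # 81.67 (rounded)
ms_within = ss_within / df_within      # 3.25
F = ms_between / ms_within

F_crit = 3.24  # tabled critical value for (3, 16) df at alpha = .05
print(round(F, 2), F >= F_crit)        # 25.13 True
```

Since the obtained F exceeds the critical value, the null hypothesis of equal means is rejected at p < .05.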

Interpreting the results: a significant F-ratio merely tells us that there is a statistically significant difference somewhere among our experimental conditions; it does not say where the difference comes from. In our example, it tells us that caffeine dosage does make a difference to memory performance. BUT the difference may be only between caffeine versus no caffeine, and there might be no difference between a large dose of caffeine versus a small dose.

To pinpoint the source of the difference, we can do (t tests):
(a) planned comparisons: comparisons between (two) groups which you decide to make in advance of collecting the data;
(b) post hoc tests: comparisons between (two) groups which you decide to make after collecting the data. Many different types exist, e.g. Newman-Keuls, Scheffé, Bonferroni.

Assumptions underlying ANOVA: ANOVA is a parametric test (like the t-test). It assumes:
(a) data are interval or ratio measurements;
(b) conditions show homogeneity of variance;
(c) scores in each condition are roughly normally distributed.

Using SPSS for a one-way independent-measures ANOVA: effects of alcohol on time taken on some task, such as whack-a-mole. Three groups (10 individuals in each). Treatments: group 1, two drinks; group 2, one drink; group 3, no alcohol. Dependent variable: time taken to whack 20 moles.
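The Bonferroni approach mentioned under (b) is easy to sketch: with k groups there are k(k − 1)/2 pairwise comparisons, and each follow-up t-test is run at α divided by that number, keeping the familywise error rate at or below α. A minimal sketch in plain Python (the group labels are hypothetical):

```python
from itertools import combinations

# Bonferroni-corrected post hoc comparisons for k = 4 caffeine groups.
# Group labels are hypothetical; only the counting logic matters here.
groups = ["none", "low", "medium", "high"]
pairs = list(combinations(groups, 2))        # every two-group comparison
alpha = 0.05
alpha_per_test = alpha / len(pairs)          # each t-test uses this threshold

print(len(pairs), round(alpha_per_test, 4))  # 6 comparisons at alpha ~ .0083
```

Each pairwise t-test is then declared significant only if its p-value falls below the corrected per-test alpha.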

Running SPSS: Analyze > Compare Means > One-Way ANOVA. Data entry: one row per subject, with one column coding the group and one column holding the dependent variable.

Click Options, then tick the boxes: Descriptive; Homogeneity of variance test; Means plot. Then examine the SPSS output.

Trend tests make sense only when the levels of the independent variable correspond to differing amounts of something, such as caffeine dosage, which can be meaningfully ordered.
Linear trend (no change in direction).
Quadratic trend (one change in direction).
Cubic trend (two changes in direction).
With two groups, you can only test for a linear trend. With three groups, you can test for linear and quadratic trends. With four groups, you can test for linear, quadratic and cubic trends.
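Trend tests are built from orthogonal polynomial contrasts on the ordered group means. A sketch for four equally spaced levels, in plain Python; the weights are the standard tabled orthogonal polynomial coefficients, and the means and group size are those from the caffeine worked example:

```python
# Orthogonal polynomial contrast weights for k = 4 equally spaced levels.
linear    = [-3, -1, 1, 3]   # no change of direction
quadratic = [ 1, -1, -1, 1]  # one change of direction
cubic     = [-1, 3, -3, 1]   # two changes of direction

means = [4, 9, 12, 13]       # treatment means from the worked example
n = 5                        # scores per group

# Contrast value L and its sum of squares: SS = n * L**2 / sum(w**2)
L = sum(w * m for w, m in zip(linear, means))
ss_linear = n * L ** 2 / sum(w ** 2 for w in linear)

# The weight sets are orthogonal: their pairwise dot products are zero.
assert sum(a * b for a, b in zip(linear, quadratic)) == 0

print(L, ss_linear)  # 30 225.0 -- the linear trend carries 225 of SS between = 245
```

Here most of the between-treatments variation (225 of 245) is carried by the linear trend, which is what SPSS's trend test in the One-Way ANOVA Contrasts dialog reports.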

Conclusions: one-way independent-measures ANOVA enables comparisons between three or more groups that represent different levels of one independent variable. It is a parametric test, so the data must be interval or ratio scores, be normally distributed, and show homogeneity of variance. ANOVA avoids the inflated risk of a Type 1 error that comes with running many separate t-tests.