Analysis of variance


Analysis of variance
Tron Anders Moger

Comparing more than two groups
Up to now we have studied situations with
- One observation per subject
  - One group
  - Two groups
- Two or more observations per subject
We will now study situations with one observation per subject, and three or more groups of subjects.
The most important question is, as usual: do the numbers in the groups come from the same population, or from different populations?
- Parametric model (normal distribution): differences in means
- Non-parametric model: differences in medians

ANOVA
If you have three groups, you could plausibly do pairwise comparisons. But what if you have 10 groups? Too many pairwise comparisons: you would get too many false positives!
You would really like to compare a null hypothesis of "all equal" against "some difference".
ANOVA: ANalysis Of VAriance

Testing different types of wheat in a field
Interested in finding out if different types of wheat yield different crops.
Outcome: e.g. wheat in pounds per acre.
(Figure: a field divided into three groups of plots, one for each wheat type: Wheat 1, Wheat 2, Wheat 3.)
One-way ANOVA: testing if mean crop per acre is different for different types of wheat. Find out which wheat type is best!

One-way ANOVA: Example
Assume treatment results from 13 patients visiting one of three doctors are given:
- Doctor A: 24, 26, 31, 27
- Doctor B: 29, 31, 30, 36, 33
- Doctor C: 29, 27, 34, 26
H0: The means are equal for all groups (the treatment results are from the same population of results)
H1: The means are different for at least two groups (they are from different populations)

Comparing the groups
Averages within groups:
- Doctor A: 27
- Doctor B: 31.8
- Doctor C: 29
Total average: 29.46
Variance around the mean matters for the comparison: we must compare the variance within the groups to the variance between the group means.

Variance within and between groups
Sum of squares within groups:
SSW = \sum_{i=1}^{K} \sum_{j=1}^{n_i} (x_{ij} - \bar{x}_i)^2
SSW = (24 - 27)^2 + (26 - 27)^2 + ... + (29 - 31.8)^2 + ... = 94.8
Compare it with the sum of squares between groups:
SSG = \sum_{i=1}^{K} n_i (\bar{x}_i - \bar{x})^2
SSG = 4(27 - 29.46)^2 + 5(31.8 - 29.46)^2 + 4(29 - 29.46)^2 = 52.43
Comparing these, we also need to take into account the number of observations and the sizes of the groups.

Adjusting for group sizes
Divide by the number of degrees of freedom:
MSW = SSW/(n - K)    MSG = SSG/(K - 1)
(n: number of observations, K: number of groups)
Both are estimates of the population variance of the error under H0.
Test statistic: MSG/MSW. Reject H0 if this is large.
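
To make the formulas concrete, here is a minimal numpy sketch (not from the lecture) that reproduces the calculation for the doctor data above:

```python
# Sums of squares, mean squares and the F ratio for the doctor example,
# computed exactly as defined on this slide.
import numpy as np

groups = {
    "A": np.array([24, 26, 31, 27], dtype=float),
    "B": np.array([29, 31, 30, 36, 33], dtype=float),
    "C": np.array([29, 27, 34, 26], dtype=float),
}

n = sum(len(x) for x in groups.values())          # total number of observations
K = len(groups)                                   # number of groups
grand_mean = np.concatenate(list(groups.values())).mean()

# Sum of squares within groups: deviations from each group's own mean
SSW = sum(((x - x.mean()) ** 2).sum() for x in groups.values())
# Sum of squares between groups: group means around the grand mean
SSG = sum(len(x) * (x.mean() - grand_mean) ** 2 for x in groups.values())

MSW = SSW / (n - K)
MSG = SSG / (K - 1)
F = MSG / MSW
print(SSW, SSG, MSW, MSG, F)   # approx. 94.8, 52.43, 9.48, 26.2, 2.77
```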

Test statistic thresholds
If the populations are normal, with the same variance, then we can show that under the null hypothesis SSG and SSW (each divided by the common error variance) are chi-square distributed with K-1 and n-K d.f., so that
F = MSG/MSW ~ F_{K-1, n-K}
the F distribution with K-1 and n-K degrees of freedom.
Reject at significance level α if MSG/MSW > F_{K-1, n-K, α}. Find this value in the table on p. 87.

Continuing the example
MSW = SSW/(n - K) = 94.8/(13 - 3) = 9.48
MSG = SSG/(K - 1) = 52.43/(3 - 1) = 26.2
MSG/MSW = 26.2/9.48 = 2.77
From the table: F_{3-1, 13-3, 0.05} = F_{2, 10, 0.05} = 4.10
Thus we can NOT reject the null hypothesis in our case.
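
As a sketch, the critical value and the p-value can also be obtained from scipy instead of a printed table:

```python
# Critical value F_{K-1, n-K, 0.05} and p-value for the observed statistic.
from scipy import stats

K, n = 3, 13
F_obs = 2.765                               # MSG/MSW from the example above
F_crit = stats.f.ppf(0.95, K - 1, n - K)    # approx. 4.10
p_value = stats.f.sf(F_obs, K - 1, n - K)   # approx. 0.11
print(F_crit, p_value)
```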

ANOVA table

Source of variation | Sum of squares | Deg. of freedom | Mean squares | F ratio
Between groups      | SSG            | K-1             | MSG          | MSG/MSW
Within groups       | SSW            | n-K             | MSW          |
Total               | SST            | n-1             |              |

SST = (24 - 29.46)^2 + (26 - 29.46)^2 + ... + (26 - 29.46)^2
NOTE: SSG + SSW = SST

Formulation of the model:
H0: µ_1 = µ_2 = ... = µ_K
X_{ij} = µ_i + ε_{ij}
Let G_i be the difference between the group mean and the population mean. Then:
G_i = µ_i - µ, or µ_i = µ + G_i
giving X_{ij} = µ + G_i + ε_{ij}
and H0: G_1 = G_2 = ... = G_K = 0
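
A small sketch checking the identity SST = SSG + SSW for the doctor data:

```python
# Total sum of squares for all 13 observations around the grand mean.
import numpy as np

values = np.array([24, 26, 31, 27, 29, 31, 30, 36, 33, 29, 27, 34, 26], dtype=float)
SST = ((values - values.mean()) ** 2).sum()
print(SST)   # approx. 147.2, which equals SSG + SSW = 52.4 + 94.8
```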

One-way ANOVA in SPSS
Analyze -> Compare Means -> One-Way ANOVA
- Move the dependent variable to Dependent List and the group variable to Factor
- Choose Bonferroni in the Post Hoc window to get comparisons of all groups
- Choose Descriptive and Homogeneity of variance test in the Options window

One-way ANOVA in SPSS
ANOVA (dependent variable: Value)
Source         | Sum of Squares | df | Mean Square | F    | Sig.
Between Groups | 52.43          | 2  | 26.21       | 2.77 | .11
Within Groups  | 94.80          | 10 | 9.48        |      |
Total          | 147.23         | 12 |             |      |
Last column: p-value = 0.11, so do not reject H0.
Note that the p-value can also be seen as the smallest value of α at which the null hypothesis is rejected.
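
A sketch of the same one-way ANOVA outside SPSS, assuming scipy is available; it should reproduce the F ratio and p-value in the table above:

```python
# One-way ANOVA for the doctor data via scipy.
from scipy import stats

doctor_a = [24, 26, 31, 27]
doctor_b = [29, 31, 30, 36, 33]
doctor_c = [29, 27, 34, 26]
F, p = stats.f_oneway(doctor_a, doctor_b, doctor_c)
print(F, p)   # approx. F = 2.77, p = 0.11, so H0 is not rejected at the 5% level
```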

Energy expenditure example
Let us say we have measurements of energy expenditure in three independent groups: anorectic, lean and obese.
Want to test H0: Energy expenditure is the same for anorectic, lean and obese.
Data for anorectic: 5.40, 6.23, 5.34, 5.76, 5.99, 6.55, 6.33, 6.21

SPSS output, Descriptives (Energy):
Group     | N  | Mean  | Std. Deviation | Std. Error | 95% CI for Mean | Minimum | Maximum
Lean      | 13 | 8.07  | 1.24           | 0.34       | (7.32, 8.81)    | 6.13    | 10.88
Obese     | 9  | 10.30 | 1.40           | 0.47       | (9.22, 11.37)   | 8.79    | 12.79
Anorectic | 8  | 5.98  | 0.44           | 0.16       | (5.61, 6.34)    | 5.34    | 6.55
Total     | 30 | 8.18  | 1.99           | 0.36       | (7.44, 8.92)    | 5.34    | 12.79

Test for equal variances, H0: all variances are equal
Test of Homogeneity of Variances (Levene statistic, df1 = 2, df2 = 27): Sig. = .078

ANOVA (Energy):
Source         | Sum of Squares | df | Mean Square | F     | Sig.
Between Groups | 79.39          | 2  | 39.69       | 30.29 | .000
Within Groups  | 35.38          | 27 | 1.31        |       |
Total          | 114.77         | 29 |             |       |

Bonferroni multiple comparisons (dependent variable: Energy): the mean differences Lean - Obese = -2.23, Lean - Anorectic = 2.09 and Obese - Anorectic = 4.32 are all significant at the .05 level.
See that there is a difference between groups. See also between which groups the difference is!
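
As a rough sketch of the homogeneity test and the Bonferroni post-hoc idea, here is a Python version reusing the doctor data (the raw lean and obese energy values are not listed on the slides). Note that SPSS bases the post-hoc comparisons on the pooled ANOVA error term; plain two-sample t-tests are used here only to keep the sketch simple.

```python
# Levene's test for equal variances plus Bonferroni-adjusted pairwise comparisons.
from itertools import combinations
from scipy import stats

groups = {
    "A": [24, 26, 31, 27],
    "B": [29, 31, 30, 36, 33],
    "C": [29, 27, 34, 26],
}

# H0: all group variances are equal (the homogeneity-of-variances assumption)
print(stats.levene(*groups.values(), center='mean'))

# Pairwise t-tests with a Bonferroni adjustment of the p-values
pairs = list(combinations(groups, 2))
for g1, g2 in pairs:
    t, p = stats.ttest_ind(groups[g1], groups[g2])
    p_bonf = min(1.0, p * len(pairs))      # Bonferroni: multiply by number of comparisons
    print(g1, "vs", g2, "t =", round(t, 2), "Bonferroni p =", round(p_bonf, 3))
```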

Conclusion:
- There is a significant overall difference in energy expenditure between the three groups (p-value < 0.001)
- There are also significant differences for all two-by-two comparisons of groups

The Kruskal-Wallis test
ANOVA is based on the assumption of normality. There is a non-parametric alternative not relying on this assumption:
- Looking at all observations together, rank them
- Let R_1, R_2, ..., R_K be the sums of the ranks in each group
- If some R_i are much larger than others, it indicates that the numbers in the different groups come from different populations

The Kruskal-Wallis test
The test statistic is
W = \frac{12}{n(n+1)} \sum_{i=1}^{K} \frac{R_i^2}{n_i} - 3(n+1)
Under the null hypothesis, this has an approximate \chi^2_{K-1} distribution. The approximation is OK when each group contains at least 5 observations.

Example: previous data
Doctor A: 24 (rank 1), 26 (rank 2.5), 31 (rank 9.5), 27 (rank 4.5); R_1 = 17.5
Doctor B: 29 (rank 6.5), 31 (rank 9.5), 30 (rank 8), 36 (rank 13), 33 (rank 11); R_2 = 48
Doctor C: 29 (rank 6.5), 27 (rank 4.5), 34 (rank 12), 26 (rank 2.5); R_3 = 25.5
(This is just an example. We really have too few observations for this test!)
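
A sketch of the rank sums and the test statistic W computed as defined above for the doctor data (ignoring the tie correction that SPSS applies):

```python
# Kruskal-Wallis statistic computed from the rank sums.
import numpy as np
from scipy import stats

groups = [
    np.array([24, 26, 31, 27], dtype=float),
    np.array([29, 31, 30, 36, 33], dtype=float),
    np.array([29, 27, 34, 26], dtype=float),
]
all_values = np.concatenate(groups)
n = len(all_values)
ranks = stats.rankdata(all_values)          # average ranks for tied values

sizes = [len(g) for g in groups]
splits = np.cumsum(sizes)[:-1]
rank_sums = [r.sum() for r in np.split(ranks, splits)]
print(rank_sums)                            # approx. [17.5, 48.0, 25.5]

W = 12.0 / (n * (n + 1)) * sum(R**2 / ni for R, ni in zip(rank_sums, sizes)) - 3 * (n + 1)
print(W)                                    # approx. 4.15
```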

Kruskal-Wallis in SPSS
Use Analyze => Nonparametric Tests => K Independent Samples
For our data, we get:

Ranks (Value)
Group    | N  | Mean Rank
Doctor A | 4  | 4.38
Doctor B | 5  | 9.60
Doctor C | 4  | 6.38
Total    | 13 |

Test Statistics (Kruskal-Wallis test, grouping variable: Group): Chi-Square = 4.195, df = 2, Asymp. Sig. = .123

For the energy data: same result as for one-way ANOVA!

Ranks (Energy)
Group     | N  | Mean Rank
Lean      | 13 | 15.62
Obese     | 9  | 24.67
Anorectic | 8  | 5.00
Total     | 30 |

Test Statistics (Kruskal-Wallis test, grouping variable: Group): df = 2, Asymp. Sig. = .000 => Reject H0
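
A sketch of the same test via scipy, which applies the tie correction and should match the chi-square value and p-value in the SPSS output above:

```python
# Kruskal-Wallis test for the doctor data.
from scipy import stats

doctor_a = [24, 26, 31, 27]
doctor_b = [29, 31, 30, 36, 33]
doctor_c = [29, 27, 34, 26]
H, p = stats.kruskal(doctor_a, doctor_b, doctor_c)
print(H, p)   # approx. 4.20 and 0.12: do not reject H0 for the doctor data
```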

When to use what method
In situations where we have one observation per subject, and want to compare two or more groups:
- Use non-parametric tests if you have enough data
  - For two groups: Mann-Whitney U-test (Wilcoxon rank sum)
  - For three or more groups: Kruskal-Wallis
- If data analysis indicates that the assumption of normally distributed independent errors is OK
  - For two groups: t-test (equal or unequal variances assumed)
  - For three or more groups: ANOVA

What if you have more information on the subjects?
When you, in addition to the main observation, have some observations that can be used to pair or block subjects, want to compare groups, and the assumption of normally distributed independent errors is OK:
- For two groups, use the paired-data t-test
- For three or more groups, we can use two-way ANOVA
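
An illustrative helper (not from the lecture) that encodes the decision rules summarised above:

```python
# Choose a test given the number of groups, whether normally distributed
# independent errors are plausible, and whether subjects are paired/blocked.
def choose_test(n_groups: int, normal_errors: bool, paired_or_blocked: bool = False) -> str:
    if paired_or_blocked and normal_errors:
        return "paired-data t-test" if n_groups == 2 else "two-way ANOVA"
    if normal_errors:
        return "t-test" if n_groups == 2 else "one-way ANOVA"
    return "Mann-Whitney U (Wilcoxon rank sum)" if n_groups == 2 else "Kruskal-Wallis"

print(choose_test(3, normal_errors=False))                          # Kruskal-Wallis
print(choose_test(3, normal_errors=True, paired_or_blocked=True))   # two-way ANOVA
```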

Two-way ANOVA: Want to test different fertilizers also
(Figure: the field is now divided both by wheat type (groups: Wheat 1, Wheat 2, Wheat 3) and by fertilizer (blocks: Fertilizer 1, Fertilizer 2, Fertilizer 3).)
- Do different wheat types give different wheat crop per acre?
- Do different fertilizers give different wheat crop per acre?
Two-way ANOVA without interaction!
- Does e.g. fertilizer 1 work better for wheat type 1 than for wheat types 2 and 3? Is there interaction between wheat and fertilizer?
Two-way ANOVA with interaction!

Two-way ANOVA (without interaction)
In two-way ANOVA, data fall into categories in two different ways: each observation can be placed in a table.
Example: Both doctor and type of treatment should influence the outcome.
Sometimes we are interested in studying both categories; sometimes the second category is used only to reduce unexplained variance (like an independent variable in regression!). Then it is called a blocking variable.
Compare means, just as before, but for different groups and blocks.

Data from exercise 7.46
Three types of aptitude tests (K=3) given to prospective management trainees: Profile fit, Mindbender, Psych Out.
Each test type is given to members of each of four groups of subjects (H=4), based on scores in preliminary interviews.
(Table: one test score per combination of subject type (Poor, Fair, Good, Excellent) and test type (Profile fit, Mindbender, Psych Out).)

Sums of squares for two-way ANOVA
Assume K groups, H blocks, and assume one observation x_{ij} for each group i and each block j, so we have n = KH observations (they have to be independent!).
Mean for category i: \bar{x}_i    Mean for block j: \bar{x}_j    Overall mean: \bar{x}
Model: X_{ij} = µ + G_i + B_j + ε_{ij}

Sums of squares for two-way ANOVA
SSG = H \sum_{i=1}^{K} (\bar{x}_i - \bar{x})^2
SSB = K \sum_{j=1}^{H} (\bar{x}_j - \bar{x})^2
SSE = \sum_{i=1}^{K} \sum_{j=1}^{H} (x_{ij} - \bar{x}_i - \bar{x}_j + \bar{x})^2
SST = \sum_{i=1}^{K} \sum_{j=1}^{H} (x_{ij} - \bar{x})^2
SSG + SSB + SSE = SST

ANOVA table for two-way data
Source of variation | Sums of squares | Deg. of freedom | Mean squares           | F ratio
Between groups      | SSG             | K-1             | MSG = SSG/(K-1)        | MSG/MSE
Between blocks      | SSB             | H-1             | MSB = SSB/(H-1)        | MSB/MSE
Error               | SSE             | (K-1)(H-1)      | MSE = SSE/((K-1)(H-1)) |
Total               | SST             | n-1             |                        |

Test for between-groups effect: compare MSG/MSE to F_{K-1, (K-1)(H-1)}
Test for between-blocks effect: compare MSB/MSE to F_{H-1, (K-1)(H-1)}
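
A minimal numpy sketch of these formulas for a K x H table with one observation per cell; the matrix below is illustrative placeholder data, not the exercise 7.46 scores:

```python
# Two-way ANOVA without interaction, computed directly from the formulas above.
import numpy as np
from scipy import stats

x = np.array([[65., 69., 75., 80.],     # group 1 (e.g. one test type) across the H blocks
              [68., 72., 73., 82.],     # group 2
              [62., 70., 77., 79.]])    # group 3
K, H = x.shape
grand = x.mean()
group_means = x.mean(axis=1)            # mean for each group i (rows)
block_means = x.mean(axis=0)            # mean for each block j (columns)

SSG = H * ((group_means - grand) ** 2).sum()
SSB = K * ((block_means - grand) ** 2).sum()
SST = ((x - grand) ** 2).sum()
SSE = SST - SSG - SSB                   # equals the sum of (x_ij - xbar_i - xbar_j + xbar)^2

MSG, MSB, MSE = SSG / (K - 1), SSB / (H - 1), SSE / ((K - 1) * (H - 1))
print("groups:", MSG / MSE, stats.f.sf(MSG / MSE, K - 1, (K - 1) * (H - 1)))
print("blocks:", MSB / MSE, stats.f.sf(MSB / MSE, H - 1, (K - 1) * (H - 1)))
```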

Two-way ANOVA (with interaction)
The setup above assumes that the blocking variable influences outcomes in the same way in all categories (and vice versa).
We can check if there is interaction between the blocking variable and the categories by extending the model with an interaction term.
- Need more observations per block
- Other advantages: more precise estimates

Data from exercise 7.46, cont'd: Each type of test was given three times for each type of subject.
(Table: three test scores per combination of subject type (Poor, Fair, Good, Excellent) and test type (Profile fit, Mindbender, Psych Out).)

Sums of squares for two-way ANOVA (with interaction)
Assume K groups, H blocks, and assume L observations x_{ij1}, x_{ij2}, ..., x_{ijL} for each category i and each block j, so we have n = KHL observations (independent!).
Mean for category i: \bar{x}_i    Mean for block j: \bar{x}_j    Mean for cell ij: \bar{x}_{ij}    Overall mean: \bar{x}
Model: X_{ijl} = µ + G_i + B_j + I_{ij} + ε_{ijl}

Sums of squares for two-way ANOVA (with interaction)
SSG = HL \sum_{i=1}^{K} (\bar{x}_i - \bar{x})^2
SSB = KL \sum_{j=1}^{H} (\bar{x}_j - \bar{x})^2
SSI = L \sum_{i=1}^{K} \sum_{j=1}^{H} (\bar{x}_{ij} - \bar{x}_i - \bar{x}_j + \bar{x})^2
SSE = \sum_{i=1}^{K} \sum_{j=1}^{H} \sum_{l=1}^{L} (x_{ijl} - \bar{x}_{ij})^2
SST = \sum_{i=1}^{K} \sum_{j=1}^{H} \sum_{l=1}^{L} (x_{ijl} - \bar{x})^2
SSG + SSB + SSI + SSE = SST
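
A sketch of these sums of squares for a K x H x L array (L observations per cell); the array below is illustrative placeholder data:

```python
# Sums of squares for the two-way model with interaction.
import numpy as np

rng = np.random.default_rng(0)
K, H, L = 3, 4, 3
x = rng.normal(loc=70, scale=5, size=(K, H, L))   # placeholder scores

grand = x.mean()
group_means = x.mean(axis=(1, 2))        # xbar_i
block_means = x.mean(axis=(0, 2))        # xbar_j
cell_means = x.mean(axis=2)              # xbar_ij

SSG = H * L * ((group_means - grand) ** 2).sum()
SSB = K * L * ((block_means - grand) ** 2).sum()
SSI = L * ((cell_means - group_means[:, None] - block_means[None, :] + grand) ** 2).sum()
SSE = ((x - cell_means[:, :, None]) ** 2).sum()
SST = ((x - grand) ** 2).sum()
print(np.isclose(SSG + SSB + SSI + SSE, SST))   # True: the decomposition holds
```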

ANOVA table for two-way data (with interaction)
Source of variation | Sums of squares | Deg. of freedom | Mean squares           | F ratio
Between groups      | SSG             | K-1             | MSG = SSG/(K-1)        | MSG/MSE
Between blocks      | SSB             | H-1             | MSB = SSB/(H-1)        | MSB/MSE
Interaction         | SSI             | (K-1)(H-1)      | MSI = SSI/((K-1)(H-1)) | MSI/MSE
Error               | SSE             | KH(L-1)         | MSE = SSE/(KH(L-1))    |
Total               | SST             | n-1             |                        |

Test for interaction: compare MSI/MSE with F_{(K-1)(H-1), KH(L-1)}
Test for block effect: compare MSB/MSE with F_{H-1, KH(L-1)}
Test for group effect: compare MSG/MSE with F_{K-1, KH(L-1)}

Two-way ANOVA in SPSS
Analyze -> General Linear Model -> Univariate
- Move the dependent variable (Score) to Dependent Variable
- Move the group variable (test type) and the block variable (subject type) to Fixed Factor(s)
- Under Options, you may check Descriptive Statistics and Homogeneity Tests, and also get two-by-two comparisons by checking Bonferroni under Post Hoc
- Gives you a full model (with interaction)
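
For reference, a sketch of the same full (interaction) model outside SPSS via statsmodels' formula interface. The data frame, the column names Score/Testtype/Subjecttype and the scores themselves are placeholders standing in for the exercise 7.46 data, in long format with one row per observation:

```python
# Two-way ANOVA with interaction via an OLS formula and an ANOVA table.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Placeholder data: 3 test types x 4 subject types x 3 replicates = 36 rows.
rng = np.random.default_rng(1)
tests = ["Profile fit", "Mindbender", "Psych Out"]
subjects = ["Poor", "Fair", "Good", "Excellent"]
df = pd.DataFrame(
    [(t, s, rng.normal(70, 5)) for t in tests for s in subjects for _ in range(3)],
    columns=["Testtype", "Subjecttype", "Score"],
)

model = ols("Score ~ C(Testtype) * C(Subjecttype)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # F tests for test type, subject type, interaction
```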

Some SPSS output:

Levene's Test of Equality of Error Variances (dependent variable: Score) tests the null hypothesis that the error variance of the dependent variable is equal across groups; here it is not significant, so equal variances can be assumed. (Design: Intercept + Subjectty + Testtype + Subjectty * Testtype.)

Tests of Between-Subjects Effects (dependent variable: Score): the Subjectty effect (df = 3), the Testtype effect (df = 2) and the Subjectty * Testtype interaction (df = 6) are all significant; the subject-type and interaction effects have Sig. = .000, and the test-type effect is significant at the 5% level. R Squared = .876.

Estimated means by subject type (dependent variable: Score):
Subjectty | Mean  | Std. Error | 95% Confidence Interval
Poor      | 70.00 | 0.82       | (68.30, 71.70)
Fair      | 71.44 | 0.82       | (69.74, 73.15)
Good      | 73.00 | 0.82       | (71.30, 74.70)
Excellent | 78.67 | 0.82       | (76.97, 80.37)

See that there is a significant block effect, a significant group effect, and a significant interaction effect. This means (in plain words) that the test score is different for the four subject types, different for the three test types, and that the differences between test types depend on what subject type you consider.

Two-by-two comparisons (Bonferroni, dependent variable: Score):
- Test types: only Mindbender vs Psych Out differs significantly (mean difference 3.00, significant at the .05 level); the comparisons involving Profile fit are not significant.
- Subject types: Excellent differs significantly from Poor, Fair and Good (mean differences 8.67, 7.22 and 5.67); the other pairwise differences are not significant at the .05 level.

Notes on ANOVA
- All analysis of variance (ANOVA) methods are based on the assumptions of normally distributed and independent errors
- The same problems can be described using the regression framework. We get exactly the same tests and results! (See the sketch below.)
- There are many extensions beyond those mentioned
- In fact, the book only briefly touches this subject; more material is needed in order to do two-way ANOVA on your own

Next time: How to design a study?
- Different sampling methods
- Research designs
- Sample size considerations
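
A small sketch (not from the lecture) of the remark that ANOVA and regression give the same test: comparing a model with one mean per group to a model with a single common mean yields the same F as the one-way ANOVA of the doctor data.

```python
# One-way ANOVA as a comparison of nested linear models.
import numpy as np
from scipy import stats

groups = [np.array([24., 26., 31., 27.]),
          np.array([29., 31., 30., 36., 33.]),
          np.array([29., 27., 34., 26.])]
y = np.concatenate(groups)
n, K = len(y), len(groups)

sse_full = sum(((g - g.mean()) ** 2).sum() for g in groups)   # residual SS, group-mean model
sse_reduced = ((y - y.mean()) ** 2).sum()                     # residual SS, common-mean model
F = ((sse_reduced - sse_full) / (K - 1)) / (sse_full / (n - K))
print(F, stats.f.sf(F, K - 1, n - K))    # same approx. 2.77 and 0.11 as before
```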
