Chapter 14. One-Way Analysis of Variance for Independent Samples Part 2
Tuesday, December 12, 2000. One-Way ANOVA: Independent Samples: II. Richard Lowry, All rights reserved.

Chapter 14. One-Way Analysis of Variance for Independent Samples, Part 2

For the items covered in Part 2 of this chapter, you will need access to the following summary information from the illustrative analysis performed in Part 1.

M_a = 28.86    M_b = 25.04    M_c = 22.50    M_d = 22.30

Source                       SS       df     MS      F      P
between groups ("effect")    142.2     3     47.4    6.52   <.01
within groups ("error")      116.3    16      7.27
TOTAL                        258.5    19

Post-ANOVA Comparisons: the Tukey HSD Test

A significant F-ratio tells you only that the aggregate difference among the means of the several samples is significantly greater than zero. It does not tell you whether any particular sample mean significantly differs from any particular other. For some research purposes this might be entirely sufficient. Since the investigators in the present example regard their experiment with laboratory rats as only a first step in testing the medication, we can imagine they might be content simply with the global conclusion suggested by the graph of their data: namely, that the curve of "pull" (presumably a reflection of the effect of the medication) slopes downward from A to B to C, then levels off between C and D. There are, however, many situations in which the investigator might wish to
determine specifically whether M_a significantly differs from M_b, or M_b from M_c, and so on. As noted toward the beginning of Chapter 13, this comparison of sample means two at a time cannot be done by way of simple t-tests, because it potentially involves 3 or more comparisons, depending on the number of samples, k, involved in the original analysis. With k=3, there would be 3 potential comparisons: A-B, A-C, B-C. With k=4, as in the present case, there would be 6 potential comparisons: A-B, A-C, A-D, B-C, B-D, C-D. With k=5, there would be 10: A-B, A-C, A-D, A-E, B-C, B-D, B-E, C-D, C-E, D-E. And so forth; in general, k samples allow k(k−1)/2 potential pair-wise comparisons. The performance of any one or several of these pair-wise comparisons requires a procedure that takes the full range of potential comparisons into account. The subject of post-ANOVA comparisons is a rather complex one, and most of it lies beyond the scope of an introductory presentation. I will describe here only one of the available procedures, which I think will serve the beginning student well enough for most practical purposes. It goes under the name of the Tukey HSD test, the "HSD" being an acronym for the forthright phrase "honestly significant difference."

The Tukey test revolves around a measure known as the Studentized range statistic, which we will abbreviate as Q. For any particular pair of means among the k groups, let us designate the larger and smaller as M_L and M_S, respectively. The Studentized range statistic can then be calculated for any particular pair as

Q = (M_L − M_S) / sqrt[MS_wg / n]

where MS_wg is the within-groups MS obtained in the original analysis and n is the number of values of X_i per sample. For the present example, MS_wg = 7.27 and n = 5. If the k samples are of different sizes, the value of n can instead be set equal to the harmonic mean of the sample sizes. For k=3, this would be

n = 3 / [(1/N_a) + (1/N_b) + (1/N_c)]
For k=4,

n = 4 / [(1/N_a) + (1/N_b) + (1/N_c) + (1/N_d)]

And so on for k=5, k=6, etc.

As it happens, you do not really need to worry about calculating Q, because there is a simpler way of applying the Tukey test. However, I will pause to calculate one instance of it, just to give you an idea of what it looks like. For the present example, M_a = 28.86, M_b = 25.04, MS_wg = 7.27, and n = 5. Thus, for the comparison between M_a and M_b the Studentized range statistic would be

Q = (28.86 − 25.04) / sqrt[7.27 / 5] = 3.16

And similarly for any of the other pair-wise comparisons one might wish to make among the means of this particular set of 4 groups. In any particular case, this Studentized range statistic belongs to a sampling distribution defined by two parameters: the first is k, the number of samples in the original analysis; and the second is df_wg, the number of degrees of freedom associated with the denominator of the F-ratio in the original analysis. Within any particular one of these sampling distributions you can define the value of Q required for significance at any particular level. The following three scrollable windows list the critical values of Q at the .05 and .01 levels of significance for three values of k and various values of df_wg. These do not constitute the full table of critical values of Q, though they should be sufficient for most situations that the beginning student is likely to encounter.

Partial Table of Critical Values of the Studentized Range Statistic, Q
[If the desired value of df_wg does not appear in the table, refer to the next smaller value. Example: for df_wg = 33, use the tabled critical values for df_wg = 30.] The values listed in these three tables have been calculated by the author using Microsoft Excel. For a complete table of critical values of Q, see E. Pearson and H. Hartley, Biometrika Tables for Statisticians, 3rd edition. New York: Cambridge University Press. In each table, the first column is df_wg; the second and third columns are the critical values of Q for the .05 and .01 levels of significance, respectively. [Interactive tables omitted from this print file.]

For each of the listings in these tables we will use the designations Q.05 and Q.01 to refer to the critical values for the .05 and .01 levels. For the present example, k=4 and df_wg=16. Scroll down the window for k=4 to the point where df_wg=16, and you will find Q.05 = 4.05 and Q.01 = 5.19. The Tukey HSD test then uses these critical values of Q to determine how large the difference between the means of any two particular groups must be in order to be regarded as significant. The other participants in this determination, MS_wg and n, are the same items you saw in the earlier formula for Q. The following two "HSD" formulas are simply algebraic jugglings of the original formula, in which the value of Q is set to one or the other of the two critical values, Q.05 and Q.01.

For the .05 level:

HSD.05 = Q.05 x sqrt[MS_wg / n] = 4.05 x sqrt[7.27 / 5] = 4.88

And for the .01 level:

HSD.01 = Q.01 x sqrt[MS_wg / n] = 5.19 x sqrt[7.27 / 5] = 6.26

That is: in order to be considered significant at or beyond the .05 level, the difference between any two particular group means (larger − smaller) must be equal to or greater than 4.88; and in order to be considered significant at or beyond the .01 level, it must be equal to or greater than 6.26.
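To make the arithmetic above concrete, here is a minimal Python sketch of the three formulas. The function names (q_statistic, harmonic_n, hsd) are my own labels, not anything from the chapter; the numbers plugged in are the chapter's (MS_wg = 7.27, n = 5 per group, Q.05 = 4.05 and Q.01 = 5.19 for k = 4 and df_wg = 16).

```python
import math

def q_statistic(m_large, m_small, ms_wg, n):
    """Studentized range statistic Q for one pair of group means."""
    return (m_large - m_small) / math.sqrt(ms_wg / n)

def harmonic_n(*sample_sizes):
    """Harmonic mean of the sample sizes, for use when the k samples differ in size."""
    k = len(sample_sizes)
    return k / sum(1.0 / n for n in sample_sizes)

def hsd(q_crit, ms_wg, n):
    """Smallest difference between two group means that reaches significance."""
    return q_crit * math.sqrt(ms_wg / n)

print(round(q_statistic(28.86, 25.04, 7.27, 5), 2))  # Q for the A vs B comparison
print(round(hsd(4.05, 7.27, 5), 2))                  # HSD.05
print(round(hsd(5.19, 7.27, 5), 2))                  # HSD.01
```

Note that carrying full precision through the division gives Q = 3.17 for A vs B; the chapter's 3.16 reflects intermediate rounding. With equal sample sizes the harmonic mean simply reproduces the common n, so harmonic_n only matters for unequal groups.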
The entries in the following table show the differences between each pair of group means in our example.

A vs B:  M_a − M_b = 28.86 − 25.04 = 3.82
A vs C:  M_a − M_c = 28.86 − 22.50 = 6.36 [p<.01]        HSD.05 = 4.88
A vs D:  M_a − M_d = 28.86 − 22.30 = 6.56 [p<.01]        HSD.01 = 6.26
B vs C:  M_b − M_c = 25.04 − 22.50 = 2.54
B vs D:  M_b − M_d = 25.04 − 22.30 = 2.74
C vs D:  M_c − M_d = 22.50 − 22.30 = 0.20

As you can see, two of the comparisons (A vs C and A vs D) are significant beyond the .01 level, while all the others fail to achieve significance even at the basic .05 level. Our investigators would therefore be able to conclude that 2 units and 3 units of the experimental medication each produced significantly lower mean levels of "pull" than was found in the zero-unit control group. They would not be able to conclude that the effect of 2 units or 3 units was significantly greater than the effect of 1 unit, nor that the mean "pull" of the 1-unit group was significantly smaller than that of the zero-unit control group. Please note carefully, however, that failing to find a significant difference between M_a and M_b would not entail that 1 unit of the medication has no effect at all. It merely means that the Tukey HSD test does not detect a significant difference between the two in this particular situation. If the investigators had found approximately the same array of group means with samples of twice the size (10 per group, rather than 5), they would very likely have found all of the pair-wise comparisons to be significant, except for the one between M_c and M_d.
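The pairwise bookkeeping above can be sketched in Python as follows. The group means are the chapter's (with M_d = 22.30 recovered from the printed differences), the HSD thresholds are the ones just computed, and the variable names are my own.

```python
from itertools import combinations

# Group means and Tukey HSD thresholds from the chapter's example.
means = {"A": 28.86, "B": 25.04, "C": 22.50, "D": 22.30}
HSD_05, HSD_01 = 4.88, 6.26

results = {}
for g1, g2 in combinations(means, 2):   # k(k-1)/2 = 6 pairs for k = 4
    diff = abs(means[g1] - means[g2])
    if diff >= HSD_01:
        verdict = "p < .01"
    elif diff >= HSD_05:
        verdict = "p < .05"
    else:
        verdict = "ns"                  # not significant at the .05 level
    results[(g1, g2)] = (round(diff, 2), verdict)
    print(f"{g1} vs {g2}: diff = {diff:.2f}  {verdict}")
```

This reproduces the table's verdicts: only A vs C and A vs D clear the 6.26 threshold, and no other pair even clears 4.88.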
One-Way ANOVA and Correlation

Here yet again is Figure 14.1, which you have now seen several times over. It will be fairly obvious to the naked eye that the two variables, dosage and pull, are correlated in the sense that variations in the one are associated with variations in the other. It will be equally obvious that the relationship is not of the rectilinear (straight-line) sort described in Chapter 3. It is better described by a curved line, hence "curvilinear." Within the context of a one-way analysis of variance for independent samples, a useful measure of the strength of a curvilinear relationship between the independent and dependent variables is given by a quantity known as eta-square ("eta" to rhyme with "beta"), which is simply the ratio of SS_bg to SS_T. For the medication experiment it comes out as

eta² = SS_bg / SS_T = 0.55

The essential meaning of "eta² = 0.55" is this: of all the variability that exists within the dependent variable "pull," 55% is associated with variability in the independent variable "dosage level." A moment's reflection on what we observed in Chapter 3 will remind you that this is also the essential meaning of the coefficient of determination, r². The only intrinsic difference between the two is that r² can measure the strength of a correlation only insofar as it is linear (can be described by a straight line), while eta² provides a measure of the strength of correlation irrespective of whether it is linear or curvilinear. If the relationship is linear, fully describable by a straight line, then the values of r² and eta² will be the same. To the degree that a curved line describes the relationship better than a straight line, eta² will be greater than r². This point is illustrated by the two panels of Figure 14.3, which show the data for the N_T = 20 individual subjects of the experiment laid out in the form of a scatter plot.
Applying the procedures of linear correlation to this set of bivariate data will yield the straight regression line shown in the panel on the left, along with r² = 0.48. The panel on the right shows the same data with a curvilinear line of best fit, corresponding to our calculated value of eta² = 0.55.

Figure 14.3. Linear and Curvilinear Correlation [figure omitted]
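Since eta² is just a ratio of sums of squares, it can be computed directly from raw grouped scores. Here is a small self-contained sketch; the two-group data set at the bottom is made up for illustration and is not the chapter's rat data.

```python
def eta_squared(groups):
    """eta² = SS_between / SS_total for a list of samples (each a list of scores)."""
    scores = [x for g in groups for x in g]
    grand_mean = sum(scores) / len(scores)
    # Total variability of the dependent variable around the grand mean.
    ss_total = sum((x - grand_mean) ** 2 for x in scores)
    # Variability of the group means around the grand mean, weighted by group size.
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    return ss_between / ss_total

# Hypothetical illustration: two groups of three scores each.
demo = [[1, 2, 3], [4, 5, 6]]
print(round(eta_squared(demo), 3))
```

Because SS_between and SS_within add up to SS_total, this ratio is exactly the "proportion of variability associated with the independent variable" described in the text.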
Please note, however, that it is meaningful to speak of eta² as analogous to r² only when the levels of the independent variable are quantitative and linear, as in the present example, where zero units, 1 unit, 2 units, and 3 units of the medication represent points along an equal-interval scale. If the levels of the independent variable are only categorical (several different types of medication, several different types of music, etc.), the meaning of eta² reverts to a version of the more general statement given above: of all the variability that exists within the dependent variable, such-and-such percent is associated with the differences among the levels of the independent variable.

Note that this chapter includes a subchapter on the Kruskal-Wallis Test, which is a non-parametric alternative to the one-way ANOVA for independent samples.

End of Chapter 14.
Agonistic Display in Betta splendens: Data Analysis By Joanna Weremjiwicz, Simeon Yurek, and Dana Krempels Once you have collected data with your ethogram, you are ready to analyze that data to see whether
More informationHYPOTHESIS TESTING II TESTS ON MEANS. Sorana D. Bolboacă
HYPOTHESIS TESTING II TESTS ON MEANS Sorana D. Bolboacă OBJECTIVES Significance value vs p value Parametric vs non parametric tests Tests on means: 1 Dec 14 2 SIGNIFICANCE LEVEL VS. p VALUE Materials and
More informationThis gives us an upper and lower bound that capture our population mean.
Confidence Intervals Critical Values Practice Problems 1 Estimation 1.1 Confidence Intervals Definition 1.1 Margin of error. The margin of error of a distribution is the amount of error we predict when
More informationWELCOME! Lecture 13 Thommy Perlinger
Quantitative Methods II WELCOME! Lecture 13 Thommy Perlinger Parametrical tests (tests for the mean) Nature and number of variables One-way vs. two-way ANOVA One-way ANOVA Y X 1 1 One dependent variable
More informationChapter 15: Nonparametric Statistics Section 15.1: An Overview of Nonparametric Statistics
Section 15.1: An Overview of Nonparametric Statistics Understand Difference between Parametric and Nonparametric Statistical Procedures Parametric statistical procedures inferential procedures that rely
More informationComparing the means of more than two groups
Comparing the means of more than two groups Chapter 15 Analysis of variance (ANOVA) Like a t-test, but can compare more than two groups Asks whether any of two or more means is different from any other.
More informationEDF 7405 Advanced Quantitative Methods in Educational Research MULTR.SAS
EDF 7405 Advanced Quantitative Methods in Educational Research MULTR.SAS The data used in this example describe teacher and student behavior in 8 classrooms. The variables are: Y percentage of interventions
More informationDetermination of Density 1
Introduction Determination of Density 1 Authors: B. D. Lamp, D. L. McCurdy, V. M. Pultz and J. M. McCormick* Last Update: February 1, 2013 Not so long ago a statistical data analysis of any data set larger
More informationRegression, part II. I. What does it all mean? A) Notice that so far all we ve done is math.
Regression, part II I. What does it all mean? A) Notice that so far all we ve done is math. 1) One can calculate the Least Squares Regression Line for anything, regardless of any assumptions. 2) But, if
More informationContents. Acknowledgments. xix
Table of Preface Acknowledgments page xv xix 1 Introduction 1 The Role of the Computer in Data Analysis 1 Statistics: Descriptive and Inferential 2 Variables and Constants 3 The Measurement of Variables
More informationApplied Regression Analysis
Applied Regression Analysis Lecture 2 January 27, 2005 Lecture #2-1/27/2005 Slide 1 of 46 Today s Lecture Simple linear regression. Partitioning the sum of squares. Tests of significance.. Regression diagnostics
More informationH0: Tested by k-grp ANOVA
Analyses of K-Group Designs : Omnibus F, Pairwise Comparisons & Trend Analyses ANOVA for multiple condition designs Pairwise comparisons and RH Testing Alpha inflation & Correction LSD & HSD procedures
More informationH0: Tested by k-grp ANOVA
Pairwise Comparisons ANOVA for multiple condition designs Pairwise comparisons and RH Testing Alpha inflation & Correction LSD & HSD procedures Alpha estimation reconsidered H0: Tested by k-grp ANOVA Regardless
More informationMTH 2032 Semester II
MTH 232 Semester II 2-2 Linear Algebra Reference Notes Dr. Tony Yee Department of Mathematics and Information Technology The Hong Kong Institute of Education December 28, 2 ii Contents Table of Contents
More informationReview. One-way ANOVA, I. What s coming up. Multiple comparisons
Review One-way ANOVA, I 9.07 /15/00 Earlier in this class, we talked about twosample z- and t-tests for the difference between two conditions of an independent variable Does a trial drug work better than
More informationMotion II. Goals and Introduction
Motion II Goals and Introduction As you have probably already seen in lecture or homework, and if you ve performed the experiment Motion I, it is important to develop a strong understanding of how to model
More informationBasic Business Statistics 6 th Edition
Basic Business Statistics 6 th Edition Chapter 12 Simple Linear Regression Learning Objectives In this chapter, you learn: How to use regression analysis to predict the value of a dependent variable based
More informationRelationships between variables. Visualizing Bivariate Distributions: Scatter Plots
SFBS Course Notes Part 7: Correlation Bivariate relationships (p. 1) Linear transformations (p. 3) Pearson r : Measuring a relationship (p. 5) Interpretation of correlations (p. 10) Relationships between
More information14: Correlation. Introduction Scatter Plot The Correlational Coefficient Hypothesis Test Assumptions An Additional Example
14: Correlation Introduction Scatter Plot The Correlational Coefficient Hypothesis Test Assumptions An Additional Example Introduction Correlation quantifies the extent to which two quantitative variables,
More informationANOVA: Analysis of Variation
ANOVA: Analysis of Variation The basic ANOVA situation Two variables: 1 Categorical, 1 Quantitative Main Question: Do the (means of) the quantitative variables depend on which group (given by categorical
More informationSTAT 350. Assignment 4
STAT 350 Assignment 4 1. For the Mileage data in assignment 3 conduct a residual analysis and report your findings. I used the full model for this since my answers to assignment 3 suggested we needed the
More informationIntro to Linear Regression
Intro to Linear Regression Introduction to Regression Regression is a statistical procedure for modeling the relationship among variables to predict the value of a dependent variable from one or more predictor
More information