WELCOME! Lecture 13 Thommy Perlinger


1 Quantitative Methods II WELCOME! Lecture 13 Thommy Perlinger


3 Parametric tests (tests for the mean)

4 Nature and number of variables: one-way vs. two-way ANOVA. One-way ANOVA (Y explained by X1): one dependent variable (metric), one independent/explanatory variable (categorical, e.g. different groups). Two-way ANOVA (Y explained by X1 and X2): one dependent variable (metric), two independent/explanatory variables = factors (categorical).

5 One-way Analysis of Variance (ANOVA). H0: µ1 = µ2 = … = µk (all k population means are equal). Ha: the population means are not all equal. Assumptions: independent random samples; the variable is Normally distributed in each group; Normally distributed residuals; equal variance in all groups (homogeneity). Advantages: robust to slight deviations from Normality of the variable when the groups are of equal size; robust to slight deviations from equality of variances. Disadvantage: not appropriate if the deviations from these assumptions are large (use a nonparametric test instead).

6 The basic principles of ANOVA If we have three fertilisers, and we wish to compare their efficacy, this could be done by a field experiment. The three fertilisers are applied to 10 plots each. The 30 plots are later harvested, with the crop yield being calculated for each plot.

7 The basic principles of ANOVA We now have three groups, with ten values (crop yield) in each group, and we wish to know if there are any differences between these groups. The fertilisers do differ in the amount of yield produced. But there is also a lot of variation between plots given the same fertiliser.

8 The basic principles of ANOVA The variability quantifies the spread of the data points around the mean.

9 Variability sum of squares. To measure the variability, first the mean is calculated, then the deviation of each point from the mean. The deviations are squared and then summed, and this sum is a useful measure of variability: it increases with the scatter of the data points around the mean. This quantity is referred to as a sum of squares (SS), and is central to our analysis.
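
A minimal sketch of this calculation in Python, on hypothetical yield values (not the lecture's data):

```python
import numpy as np

# Hypothetical crop yields for one group of 10 plots
yields = np.array([4.2, 5.1, 3.8, 4.9, 5.3, 4.4, 4.7, 5.0, 4.1, 4.6])

mean = yields.mean()               # step 1: the mean
deviations = yields - mean         # step 2: deviation of each point from the mean
ss = np.sum(deviations ** 2)       # step 3: square and sum -> sum of squares (SS)
print(f"mean = {mean:.2f}, SS = {ss:.3f}")
```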

10 Variance. The SS, however, cannot be used as a comparative measure between groups, since it clearly will be influenced by the number of data points in the group. Instead, this quantity is converted to a variance by dividing by n − 1. The variance is therefore a measure of variability that takes account of the size of the dataset. But why don't we divide by the actual size, n? Because we do not actually have n independent pieces of information about the variance.
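
Continuing the sketch above on the same hypothetical data: dividing the sum of squares by n − 1 gives the sample variance, which matches NumPy's built-in estimator with ddof=1.

```python
import numpy as np

yields = np.array([4.2, 5.1, 3.8, 4.9, 5.3, 4.4, 4.7, 5.0, 4.1, 4.6])
n = len(yields)

ss = np.sum((yields - yields.mean()) ** 2)
variance = ss / (n - 1)                   # divide by n - 1, not n

print(variance, np.var(yields, ddof=1))   # the two values agree
```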

11 Degrees of freedom: the number of independent pieces of information. The first step was to calculate a mean (from the n independent pieces of data collected). The second step is to calculate a variance with reference to that mean. Once n − 1 deviations have been calculated, the final deviation is known, since all deviations must add up to zero by definition. So we have only n − 1 independent pieces of information on the variability about the mean. The number of independent pieces of information contributing to a statistic is referred to as the degrees of freedom.

12 Partitioning the variance. In ANOVA, it is useful to keep the measure of variability as its two components: a sum of squares, and the degrees of freedom associated with that sum of squares. The variance is also partitioned, i.e. divided into two parts: the variance due to the effect we are interested in (e.g. the different fertilisers), and the variance due to any other factor. To illustrate the principle behind partitioning the variability, first consider two extreme datasets.

13 Grand mean If there was almost no variation between the plots due to any of the other factors, and nearly all variation was due to the application of the three fertilisers, then the data would show the following pattern.

14 Grand mean vs. group means The first step would be to calculate a grand mean (as on the previous slide), and there is considerable variation around this mean. The second step is to calculate the three group means that we wish to compare: that is, the means for the plots given fertilisers A, B and C.

15 Group means It can be seen that once these means are fitted, little variation is left around the group means.

16 Amount of variability explained In other words, fitting the group means has removed or explained nearly all the variability in the data. This has happened because the three means are distinct. Now consider the other extreme, in which the three fertilisers are, in fact, identical.

17 Grand mean Once again, the first step is to fit a grand mean and calculate the sum of squares.

18 Group means Second, three group means are fitted, only to find that there is almost as much variability as before.

19 Amount of variability explained Little variability has been explained. This has happened because the three means are relatively close to each other (compared to the scatter of the data).

20 Amount of variability explained The amount of variability that has been explained can be quantified directly by measuring the scatter of the group means around the grand mean. In the first example, the deviations of the group means around the grand mean are considerable. In the second example these deviations are relatively small.

21 Group means Now consider a third example, an intermediate situation. In this situation it is not immediately obvious if the fertilisers have had an influence on yield.

22 Significant amount of variability explained? There is an obvious reduction in variability around the three means (compared to the one mean). But at what point do we decide that the amount of variation explained by fitting the three means is significant? The word significant, in this context, has a technical meaning: when is the variability between the group means greater than what we would expect by chance alone?

23 Three measures of variability. SSB = Sum of Squares between groups: the sum of squared deviations of the group means from the grand mean. A measure of the variation between the different groups (e.g. the variation between plots given different fertilisers). SSW = Sum of Squares within groups: the sum of squared deviations of the data around the separate group means. A measure of the variation within each group (e.g. the variation between the different plots that are given the same fertiliser).

24 Three measures of variability SST = Total Sum of Squares Sum of squared deviations of the data around the grand mean. A measure of the total variability in the dataset. SST = SSB + SSW
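
The identity SST = SSB + SSW can be verified numerically. A sketch with three hypothetical fertiliser groups of ten plots each (simulated values, not the lecture's data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Three hypothetical fertiliser groups, 10 plots each
groups = [rng.normal(loc=m, scale=1.0, size=10) for m in (5.0, 6.0, 6.5)]

all_data = np.concatenate(groups)
grand_mean = all_data.mean()

sst = np.sum((all_data - grand_mean) ** 2)                        # total SS
ssw = sum(np.sum((g - g.mean()) ** 2) for g in groups)            # within-group SS
ssb = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)  # between-group SS

print(f"SST = {sst:.3f}, SSB + SSW = {ssb + ssw:.3f}")            # equal up to rounding
```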

25 Partitioning the variability In the first example, SSW was small (small variation within the groups) and SSB was large (large variation between the groups) Small variation within the groups (SSW) Large variation between the groups (SSB)

26 Partitioning the variability In the second example, SSW was large (large variation within the groups) and SSB was small (small variation between the groups) Large variation within the groups (SSW) Small variation between the groups (SSB)

27 Significant amount of variability explained? So, if the variability between the group means is greater than what we would expect by chance alone, there is a significant difference between the group means. For a valid comparison between the two sources of variability, we of course need to compare the variability adjusted for the degrees of freedom, i.e. the variances.

28 Partitioning the degrees of freedom The first step in any analysis of variance is to calculate SST. This is done with n-1 degrees of freedom. The second step is to calculate the three group means. When the deviations of two of the three treatment means from the grand mean have been calculated, the third is predetermined. Therefore, calculating SSB (the deviation of the group means from the grand mean) has 2 df associated with it, or more formally k-1 df (k = number of groups).

29 Partitioning the degrees of freedom Finally, SSW measures variation around the different group means. Within each of these groups, the deviations sum to zero. For any number of deviations within the group, the last is always predetermined. Thus SSW has n - k df associated with it (k=number of groups).

30 Mean sum of squares. Combining the information from the sums of squares and the degrees of freedom, we get the mean squares. MST = Total Mean Square: the total variance in the dataset.

31 Three measures of variability. MSB = Mean Square between groups: the variance between the different groups (e.g. the variation between plots given different fertilisers, adjusted for the sample size). MSW = Mean Square within groups: the variance within each group (e.g. the variation between the different plots that are given the same fertiliser, adjusted for the sample size).

32 F-ratio. If none of the fertilisers influenced yield, then the variation between plots treated with the same fertiliser would be much the same as the variation between plots given different fertilisers. This can be expressed in terms of mean squares: the within-group mean square would be the same as the between-group mean square, F = MSB / MSW. The F-ratio would be 1 if the group means are the same.
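
A minimal sketch of the full calculation, from sums of squares to the F-ratio and its p-value, on the same kind of simulated three-group data (hypothetical group means, not the lecture's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = [rng.normal(loc=m, scale=1.0, size=10) for m in (5.0, 6.0, 6.5)]

k = len(groups)
n = sum(len(g) for g in groups)
grand_mean = np.concatenate(groups).mean()

ssb = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ssw = sum(np.sum((g - g.mean()) ** 2) for g in groups)

msb = ssb / (k - 1)        # between-group variance
msw = ssw / (n - k)        # within-group variance
f_ratio = msb / msw

p_value = stats.f.sf(f_ratio, k - 1, n - k)   # upper tail of the F-distribution
print(f"F = {f_ratio:.3f}, p = {p_value:.4f}")
```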

33 F-ratio F-ratio >1 means that the between-group variance is larger than the within-group variance, and the group means are quite different. F-ratio <1 means that the within-group variance is larger than the between-group variance, and the relatively large spread within the groups makes it difficult to say that the group means are different. The F-ratio is compared to the F-distribution to calculate an appropriate P-value.

34 F-distribution. The shape of the F-distribution depends on the df (both between-group and within-group). (The slide shows example F-distributions with 2 and 27 df and with 10 and 57 df.)

35 Example: Eyes and ad response. Research from a variety of fields has found significant effects of eye gaze and eye color on emotions and perceptions such as arousal, attractiveness, and honesty. These findings suggest that a model's eyes may play a role in a viewer's response to an ad.

36 Example: Eyes and ad response. In a recent study, 222 randomly chosen students at a certain university were presented one of four portfolios. Each portfolio contained a target ad for a fictional product, Sparkle Toothpaste. The students were asked to view the ad and then respond to questions concerning their attitudes and emotions about the ad and product. The variable of main interest is the viewer's attitude towards the brand, the average of 10 survey questions on a 7-point scale.

37 Example: Eyes and ad response. The only difference in three of the ads was the model's eyes, which were made to be either brown, blue, or green. In the fourth ad, the model is in the same pose but looking downward so the eyes are not visible. (The slide shows a summary table of n, mean and standard deviation for the Blue, Brown, Green and Down groups.)

38 Example: Eyes and ad response. In SPSS: Graphs >> Legacy Dialogs >> Boxplot. Choose Simple.

39 Example: Eyes and ad response. H0: µblue = µbrown = µgreen = µdown (all 4 mean attitudes are equal in the population). Ha: the population mean attitudes are not all equal. Assumptions: independent random samples; the attitude score is Normally distributed in all groups (to be checked by Normality plots and tests); Normally distributed residuals (save the residuals from the analysis and investigate Normality plots and tests); equal variance in all groups?

40 Equal variances in ANOVA. Using formal tests for the equality of variances in several groups is not recommended (they are strongly affected by e.g. deviations from Normality). Since ANOVA is robust to slight deviations from equality of variances, the following rule of thumb can be used: if the largest standard deviation is less than twice the smallest standard deviation, the results from the ANOVA will be approximately correct. OK to use ANOVA if s_largest < 2 × s_smallest.
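
A quick way to encode the rule of thumb (the standard-deviation values below are hypothetical):

```python
def anova_spread_ok(std_devs):
    """Rule of thumb: ANOVA is approximately valid if the largest
    standard deviation is less than twice the smallest."""
    return max(std_devs) < 2 * min(std_devs)

# Hypothetical standard deviations for four groups
print(anova_spread_ok([1.68, 1.84, 1.66, 1.77]))  # True -> ANOVA acceptable
```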

41 Example: Eyes and ad response. (Summary table of n, mean and standard deviation for the Blue, Brown, Green and Down groups.) None of the standard deviations is twice as large as any other, so the rule of thumb for equal variances is satisfied.

42 Example: Eyes and ad response. H0: µblue = µbrown = µgreen = µdown (all 4 mean attitudes are equal in the population). Ha: the population mean attitudes are not all equal. Assumptions: independent random samples; the attitude score is Normally distributed in all groups (checked by Normality plots and tests); Normally distributed residuals (residuals saved from the analysis and checked with Normality plots and tests); equal variance in all groups.

43 Example: Eyes and ad response. Significance level? Wrongly rejecting the null hypothesis would mean that we claim that the mean attitude towards the brand is different depending on the eyes on the ad, when the attitudes in fact are the same (on average). This is not a serious consequence, so the standard 5% level is fine to use.

44 Example: Eyes and ad response. SPSS one-way ANOVA table for Score: Between Groups: Sum of Squares 24.420, df 3, Mean Square 8.140, F 2.894, Sig. 0.036. Within Groups: Sum of Squares 613.…, df 218, Mean Square 2.813. Total: Sum of Squares 637.…, df 221. F-ratio: > 1 if the between-group variance is larger than the within-group variance (which means that the group means are quite different). P-value: tells us if the group means are significantly different. In SPSS: Analyze >> Compare Means >> One-way ANOVA. Choose your variable of interest under Dependent List, and the grouping variable under Factor.
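
The slide shows SPSS output; as a cross-check, a one-way ANOVA can also be run with scipy.stats.f_oneway. The data below are simulated stand-ins for the four attitude groups (group sizes chosen to add up to 222, but the means and spreads are made up), so the F and p values will not reproduce the table above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated attitude scores on a 7-point scale (stand-ins for the four ad groups)
blue  = rng.normal(3.2, 1.7, 56)
brown = rng.normal(3.7, 1.7, 55)
green = rng.normal(3.9, 1.7, 56)
down  = rng.normal(3.1, 1.7, 55)

f_stat, p_value = stats.f_oneway(blue, brown, green, down)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```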

45 Example: Eyes and ad response. SPSS Tests of Between-Subjects Effects, dependent variable Score (Type III sums of squares): Corrected Model: SS 24.420 (a), df 3, MS 8.140, F 2.894, Sig. 0.036. Intercept: SS 2430.…, df 1, Sig. 0.000. group (the between-group effect): SS 24.420, df 3, MS 8.140, F 2.894, Sig. 0.036. Error (within group): SS 613.…, df 218, MS 2.813. Total: SS 3352.…, df 222. Corrected Total: SS 637.…, df 221. a. R Squared = .038 (Adjusted R Squared = .025). R² is the fraction of the overall variance (pooling all the groups) attributable to differences among the group means. The P-value tells us if the group means are significantly different. In SPSS: Analyze >> General Linear Model >> Univariate; choose your explanatory variable under Fixed Factor(s).

46 Example: Eyes and ad response. The P-value (0.036) is compared to the significance level: 0.036 < 0.05, so H0 is rejected (a significant result). Conclusion: the test result indicates that the mean attitude towards the brand is different depending on the eyes on the ad (eye color, or not seeing the eyes). But which eye colors? Again, pairwise tests can be used to find which groups differ.

47 Recap: Pairwise tests can be performed after a multigroup test. If you find a significant difference using a multigroup test, you can perform pairwise tests to find which groups/occasions differ. It is, however, very important not to start with the pairwise testing, due to multiplicity issues. First use e.g. ANOVA to find out if there are any significant differences at all, then perform pairwise tests to find where the differences are located. This way you don't have to adjust the significance level for multiplicity, and can use e.g. 5% in every pairwise comparison.
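
A sketch of this workflow on simulated data: run the overall ANOVA first, and only if it is significant loop over all pairs with two-sample t-tests. Group names follow the eye-colour example, but the values are made up; in practice many analysts would use a dedicated post-hoc procedure such as Tukey's HSD instead.

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
groups = {
    "blue":  rng.normal(3.2, 1.7, 56),
    "brown": rng.normal(3.7, 1.7, 55),
    "green": rng.normal(3.9, 1.7, 56),
    "down":  rng.normal(3.1, 1.7, 55),
}

# Step 1: the overall (multigroup) test
f_stat, p_overall = stats.f_oneway(*groups.values())

# Step 2: pairwise t-tests, but only if the overall test was significant
if p_overall < 0.05:
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        t_stat, p = stats.ttest_ind(a, b)
        print(f"{name_a} vs {name_b}: t = {t_stat:.2f}, p = {p:.3f}")
```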

48 Two-way Analysis of Variance (ANOVA). There are three sets of hypothesis tests in a two-way ANOVA. H01: µ1_F1 = µ2_F1 = … = µk_F1 (all k population means of the first factor are equal). H02: µ1_F2 = µ2_F2 = … = µk_F2 (all k population means of the second factor are equal). H03: there is no interaction between the two factors. The two explanatory variables in a two-way ANOVA are called factors (categorical variables).

49 Two-way Analysis of Variance (ANOVA). Assumptions: independent random samples; the variable is Normally distributed in each group; Normally distributed residuals; equal variance in all groups (homogeneity). Advantages: robust to slight deviations from Normality when the groups are of equal size; robust to slight deviations from equality of variances. Disadvantage: not appropriate if the deviations from these assumptions are large (use a nonparametric test instead).

50 Example: Cardiovascular risk factors A study of cardiovascular (heart disease) risk factors compared runners who averaged at least 15 miles/week with a control group (non-exercising). Both men and women were included in the study. Y = heart rate after 6 min of exercise Factor 1 = exercise group (runners/control) Factor 2 = gender

51 Example: Cardiovascular risk factors. Group means (standard deviations) of heart rate, in beats per minute: women runners 116 (16.0), women controls 148 (16.3); men runners 104 (12.5), men controls 130 (17.1). None of the standard deviations is twice as large as any other. In SPSS: Graphs >> Legacy Dialogs >> Boxplot. Choose Clustered.

52 Example: Cardiovascular risk factors. (Normality plots for each of the four groups: Female/Male by Runners/Control.) Heart rate seems to be Normally distributed in all groups. In SPSS: Data >> Split File, mark Organize output by groups and add the two factor variables. Then Analyze >> Descriptive Statistics >> Explore, add the dependent variable to Dependent List and the two factor variables to Factor List, click Plots and mark Normality plots with tests.

53 Example: Cardiovascular risk factors. The residuals seem to be Normally distributed. In SPSS: Analyze >> General Linear Model >> Univariate, click Save and mark Standardized residuals. Then check the Normality of the saved residuals.
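
A sketch of the same check outside SPSS: in ANOVA the residual is simply the observation minus its own cell mean, and those residuals can be tested for Normality with a Shapiro-Wilk test. The heart-rate data below are simulated around the cell means from the earlier slide (cell size and spreads are hypothetical).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated heart rates for the four group x gender cells (hypothetical cell size 50)
cells = {
    ("runners", "female"): rng.normal(116, 16.0, 50),
    ("runners", "male"):   rng.normal(104, 12.5, 50),
    ("control", "female"): rng.normal(148, 16.3, 50),
    ("control", "male"):   rng.normal(130, 17.1, 50),
}

# Residual = observation minus its own cell mean
residuals = np.concatenate([y - y.mean() for y in cells.values()])

stat, p = stats.shapiro(residuals)     # Shapiro-Wilk test of Normality
print(f"Shapiro-Wilk p = {p:.3f}")     # large p -> no evidence against Normality
```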

54 Example: Cardiovascular risk factors. H01: µrunners = µcontrol. H02: µfemale = µmale. H03: there is no interaction effect on heart rate between exercise (running) and gender. Assumptions: independent random samples; heart rate is Normally distributed in all groups; Normally distributed residuals; equal variance in all groups.

55 Two-way Analysis of Variance (ANOVA). There are two different types of effects measured with a two-way ANOVA: main effects and the interaction effect.

56 Main effect. The main effect describes the effect of the explanatory variables one at a time; the interaction is ignored for this part. This is the part that is similar to the one-way analysis of variance: each of the variances calculated to analyze the main effects is like a between-group variance. So for two variables, there will be two main effects.

57 Interaction effect The interaction effect describes the effect that one factor has on the other factor. For two variables, there will be one interaction effect.

58 Example: Cardiovascular risk factors. SPSS Tests of Between-Subjects Effects, dependent variable Heart rate (Type III sums of squares): group (main effect): df 1, Sig. 0.000. gender (main effect): SS 45030.…, df 1, Sig. 0.000. group * gender (interaction effect): SS 1794.…, df 1, F 7.409, Sig. 0.007. (Corrected Model, Intercept, Error, Total and Corrected Total rows as in the SPSS output.) a. R Squared = .528 (Adjusted R Squared = .526). R² is the fraction of the overall variance (pooling all the groups) attributable to differences among the group means. In SPSS: Analyze >> General Linear Model >> Univariate; choose your explanatory variables under Fixed Factor(s).
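
This is the SPSS route; a rough equivalent in Python uses the statsmodels formula interface on simulated heart-rate data (cell means taken from the earlier slide, spread and cell sizes hypothetical). Type II sums of squares are used here, so the numbers will not match the Type III SPSS table above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 50  # hypothetical number of subjects per cell

cell_means = {("runners", "female"): 116, ("runners", "male"): 104,
              ("control", "female"): 148, ("control", "male"): 130}
rows = [{"group": g, "gender": s, "heart_rate": hr}
        for (g, s), mu in cell_means.items()
        for hr in rng.normal(mu, 15, n)]
df = pd.DataFrame(rows)

# Two-way ANOVA with main effects and the group x gender interaction
model = smf.ols("heart_rate ~ C(group) * C(gender)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # ANOVA table (Type II sums of squares)
```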

59 Example: Cardiovascular risk factors. (Profile plot of the group means.) If the lines are parallel, there is no interaction effect. In SPSS: Analyze >> General Linear Model >> Univariate, click Plots, add one of the factors to Horizontal Axis and the other to Separate Lines, then click Add.
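
The same parallel-lines check can be drawn outside SPSS; statsmodels provides an interaction_plot helper. A sketch on simulated data with the cell means from the earlier slide (spread and cell sizes hypothetical):

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from statsmodels.graphics.factorplots import interaction_plot

rng = np.random.default_rng(3)
cell_means = {("runners", "female"): 116, ("runners", "male"): 104,
              ("control", "female"): 148, ("control", "male"): 130}
rows = [{"group": g, "gender": s, "heart_rate": hr}
        for (g, s), mu in cell_means.items()
        for hr in rng.normal(mu, 15, 50)]
df = pd.DataFrame(rows)

# Non-parallel lines suggest an interaction between exercise group and gender
interaction_plot(x=df["gender"], trace=df["group"], response=df["heart_rate"])
plt.show()
```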

60 Example: Cardiovascular risk factors Conclusion: There is a significant interaction effect between exercise (running) and gender on heart rate (P-value 0.007). Exercise has a larger effect on heart rate for females than for males, on average. Main effects analysis shows that both exercise and gender, separately, also have significant effects on average heart rate (both P-values 0.000). Men have on average lower heart rate than women, and the running group has on average lower heart rate than the control group.

61 ANOVA Recap: Nature and number of variables. Analysis of variance (ANOVA): Y explained by X1, X2, …, Xn, i.e. one dependent variable (metric) and several independent/explanatory variables (categorical). MANOVA: Y1, Y2, …, Yn explained by X1, X2, …, Xn, i.e. several dependent variables (metric) and several independent/explanatory variables (categorical).

62 Recap: Dependence techniques with several dependent variables in a single relationship. If the dependent variables are metric and the predictor variables are metric: canonical correlation (not included in this course). If the dependent variables are metric and the predictor variables are nonmetric: multivariate analysis of variance (MANOVA). If the dependent variables are nonmetric: canonical correlation analysis with dummy variables (not included in this course).
