Analysis of Covariance (ANCOVA) Lecture Notes
Overview:

In experimental methods, a central tenet of establishing significant relationships is the notion of random assignment. Random assignment solves a couple of problems. Statistically, it ensures that, in the main, the resulting probability will be independent of the starting conditions of an experiment. Second, it is a way of establishing parity; that is, it is a method of controlling for what you don't know.

ANCOVA is a method that can be thought of as a cross between ANOVA and regression. It is, in fact, an ANOVA in which the effects of some other variable have been controlled for statistically. There are three reasons why one would choose to use an ANCOVA:

1. Reduction of within-group (error) variance: to increase the sensitivity of a test of main effects and/or interactions by reducing the error term.
2. Elimination of systematic bias: to adjust the means on the DV to what they would be if all subjects scored equally on the CV.
3. Stepdown analysis: to compare scores on a DV after they are adjusted for scores on another DV, which is treated as a CV (MANOVA model).

In the first case, some nuisance variable(s) might be used as the covariate(s). As an example, consider a study of the effectiveness of several teaching methods. The DV (variate) might be a score on an achievement test. Since some of the within-condition variability will be attributable to individual differences in test-taking ability, another measure of test-taking ability might be used as a covariate. If students were randomly assigned, this tactic would increase the sensitivity of the experiment by reducing error variance.

A common example of the second case involves situations where random assignment is not possible, so-called intact groups. For instance, consider the question of whether training for detection of deception is more effective for some careers.
People from three careers are randomly sampled: Psychiatrists, Judges, and Secret Service Agents. Their ability to detect whether or not someone is trying to deceive them is measured, and it is the DV. However, the study as it stands is flawed because some personality variables could have resulted in some types of people being more inclined toward certain professions. So, the ability to detect deception may be a function of personality rather than training; e.g., perhaps Psychiatrists are, on average, the most trusting and Secret Service Agents the least trusting. To remedy this, one could use some appropriate measure of personality as a covariate.

ANCOVA is applied to the same kinds of research questions as ANOVA; however, it is a special case of ANOVA in which one has some other effect one wishes to control for prior to conducting the ANOVA. An ANCOVA yields the following effects:

- Covariate: the effect of the covariate is tested for significance, which is tantamount to testing whether a regression equation is significant.
- Main effects for independent variables are tested for significance.
- Interaction effects for independent variables are tested.

Can you have interactions between IVs and the CV(s)? Yes, but it implies the assumption of homogeneous regression has been violated (see below). As with ANOVA, one can estimate effect sizes, confidence intervals, etc. Much of ANCOVA is redundant with ANOVA. Also, more complicated designs can be used, such as repeated measures, multiple covariates, n-way interactions, etc. Not all software packages may be able to handle the more complicated designs.
While ANCOVA seems like a procedure tailor-made for non-experimental situations and situations where experimental control couldn't be appropriately practiced, it has its problems. It is tempting to think that if the effect of some nuisance variable is partialed out, the resulting observed significant differences among treatment means are somehow more pure, i.e., that any remaining significant differences must be causal in nature. This is a false sense of insight, as causation is a different matter.

Stevens (2002) highlights some problems with the use of ANCOVA in intact groups:

- Groups can still differ in unknown ways.
- One may question whether groups that are equivalent on the covariate ever exist, since ANCOVA adjusts for equivalence on the covariate.
- The assumptions of linearity and homogeneity of regression slopes need to be satisfied.
- Differential growth of subjects: is a difference due to treatment or to differential growth (such as in pre-post designs where a pre measure is used as a covariate)?
- Measurement error, which always exists, can produce spurious results.

The use of a CV results in the loss of one degree of freedom in the denominator. Hence, a good CV, or a few good orthogonal CVs, can yield a much more precise error term that is worth the trade-off. A poor choice of CV will result in the loss of a degree of freedom with no real gain.

Care should be taken that the relationship between the CV and the DV is independent of the relationship between the IV and DV. Otherwise, some of the relationship between the IV and DV will be captured by the CV and not be attributable to the IV. The inclusion of more and more CVs will result in an overall loss of variation, which will cause the adjusted treatment means to move closer together. Hence, one doesn't want to remove all the variation in the DV prior to analyzing the effect of the IV on it. Sample-specific problems, or bias in the CV, can result in over- or under-correction.
Assumptions of ANCOVA:

- Larger sample sizes (because of the regression of the DV on the CV)
- Absence of multicollinearity and singularity
- Normality of sampling distributions (of the means)
- Homogeneity of variance
- Linearity of the relationship between covariate and dependent variable
- Homogeneity of regression: the slope of the regression line is the same for all cells. If, in the population, the relationship between the CV and DV varies as a function of the level of one or more IVs, then this assumption is violated.
- Reliability of covariates: an implicit assumption is that the CV is measured without error. This is often not a tenable assumption. Measures with reliability exceeding .80 are recommended.

Testing the assumption of homogeneity of regression can be done with SPSS MANOVA (or another GLM-type procedure) by conducting a test of a main effect for the covariate, the IV, and the interaction between the covariate and the IV. A nonsignificant F for the interaction implies the assumption has not been violated. This can be accomplished through GLM, using the menus, by specifying the model complete with the covariate, then selecting the Model button and choosing Custom. From there you can build a model with the effect of the IV, the CV, and the interaction between the two.
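The idea behind the homogeneity-of-regression check can be sketched outside SPSS. The minimal illustration below (entirely hypothetical data) computes the within-group slope of the DV on the CV for each cell; roughly equal slopes are consistent with the assumption, though the formal test remains the F test on the IV x CV interaction:

```python
# Slope of the DV-on-CV regression within each group: b = SP_xy / SS_x.
# Similar slopes across groups are consistent with homogeneity of
# regression; the formal test is the F test on the IV x CV interaction.

def slope(x, y):
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    sp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    ss = sum((xi - mx) ** 2 for xi in x)
    return sp / ss

# Hypothetical CV (x) and DV (y) scores for three groups
groups = {
    1: ([90, 95, 100, 105], [50, 54, 58, 62]),
    2: ([88, 96, 102, 110], [45, 49, 52, 56]),
    3: ([92, 98, 104, 108], [60, 63, 66, 68]),
}

for g, (x, y) in groups.items():
    print(g, round(slope(x, y), 3))
```

If the printed slopes diverge sharply, the pooled regression coefficient used by ANCOVA misrepresents at least one group.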
Additionally, the covariates should be evaluated for utility. An F test is associated with each covariate, and a significant F value implies a significant adjustment to the cell means was made. With multiple CVs, the significance test is based on the assumption that each CV entered the equation last, even though they are entered together. Entering several correlated CVs is not a particularly useful strategy. It can be useful to do some preliminary research and analysis to identify which measures will make the best covariates, preferably finding those that are minimally correlated with each other (assuming you are picking more than one) and that best predict the DV.

Alternatives to ANCOVA:

- In pre-post situations, using difference scores (assuming the same metric). Be aware there is a large body of literature on the measurement of change, and simple change scores should not be used without developing some familiarity with contemporary thinking on the measurement of change.
  o For instance, it is known that as the correlation between pre- and post-test approaches the reliability of the test, the reliability of the difference score approaches zero.
- Incorporating pre-scores into a repeated-measures ANOVA design.
- Residualizing the DV and running an ANOVA on the residualized scores. Not extremely popular.
- Blocking: assigning/matching people based on pre-scores, or creating appropriate IV categories of intact groups.
- Utilizing the CV as a factor in the experiment, if it lends itself well to categorization. This side-steps many issues, such as homogeneity of regression.

Comparisons & Trends

Specific comparisons on the between-subjects portion of the design can easily be made using the usual post-hoc analyses or contrast procedures. Repeated-measures comparisons can become tricky, especially in subjects-by-trials designs. For fixed factors, post-hoc comparisons can be made using a Bonferroni-type correction, as well as methods such as Tukey's HSD.
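Returning to the change-score sub-point above: under classical test theory, assuming equal pre- and post-test variances, the reliability of a difference score is r_D = (r_xx + r_yy - 2 r_xy) / (2 - 2 r_xy). A short sketch (hypothetical reliabilities) shows it collapsing toward zero as the pre-post correlation approaches the tests' reliability:

```python
# Reliability of a difference score D = post - pre, under classical
# test theory with equal pre/post variances:
#   r_D = (r_xx + r_yy - 2*r_xy) / (2 - 2*r_xy)

def diff_reliability(r_xx, r_yy, r_xy):
    return (r_xx + r_yy - 2 * r_xy) / (2 - 2 * r_xy)

# With both tests at reliability .80, watch r_D fall as the
# pre-post correlation r_xy climbs toward .80:
for r_xy in (0.2, 0.4, 0.6, 0.8):
    print(r_xy, round(diff_reliability(0.8, 0.8, r_xy), 3))
# At r_xy = .80 the difference score is pure noise (r_D = 0).
```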
For some reason, SPSS doesn't allow a multitude of post-hoc tests here; it allows for no correction (LSD), Bonferroni (using the Options menu), and the Sidak method. A priori comparisons, such as the Dunn-Bonferroni procedure, can also be used. Any good treatment of ANOVA will include at least one chapter on ANCOVA, in which these post-hoc procedures are elaborated upon.

Stevens (2002) presents the Bryant-Paulson Simultaneous Test Procedure for conducting post-hoc tests. It is an extension of the Tukey simultaneous interval procedure and specifically makes the assumption that the covariate(s) is (are) random, which is typically true in Psychology. Formulas are presented on page 365 and vary depending upon whether the study was randomized and whether there were one or more covariates; hence, four equations are presented.

Basic Approach:

ANCOVA can be thought of as covarying out an unwanted influence on the DV, then proceeding with an ANOVA. That is useful for understanding how it works; however, it actually proceeds much as ANOVA does, with the partitioning of variation into between and within effects. These partitions are based on deviations of group means from the grand mean and deviations of individual scores from their group mean. However, two additional partitions are sought.
1. First, a partition of sums of squares for the covariate: differences between CV scores and their grand mean are partitioned into between and within SS.
2. Next, the linear relationship between the DV and CV is partitioned into sums of products associated with covariance between groups and sums of products associated with covariance within groups.

Information from these two partitions is used to adjust the between- and within-group SS. The hope is that the adjustment narrowing the within-group variation will outstrip the downward adjustment to the between-groups variation.

Example:

Appendix A contains a numerical example, which will be briefly discussed here. All computations are contained in the appendix. In this example there are three instructional methods, and achievement is the dependent measure. Intelligence is used as a covariate. Since there was random assignment, ANCOVA was used in order to reduce error variance and make the test more sensitive.

First, running the analysis as a simple ANOVA yields a non-significant F. While there appear to be differences, based on examining the subsequent table of means (look at the unadjusted means), the analysis lacked the power to yield a significant effect.

ANOVA summary table (sources: Between Groups, Within Groups, Total; columns: Sum of Squares, df, Mean Square, F, Sig.; numeric entries omitted).

Table of means (rows: Group 1, Group 2, Group 3, Total; columns: Mean for X, Unadjusted Mean for Y, Adjusted Mean for Y; numeric entries omitted).

Now, let's look at the same analysis run as an ANCOVA. Tests of Between-Subjects Effects, dependent variable: achievement (sources: Corrected Model, Intercept, x, gpid, Error, Total, Corrected Total; columns: Type III Sum of Squares, df, Mean Square, F, Sig.; numeric entries omitted).
a. R Squared = .527 (Adjusted R Squared = .483)
Here we see there is a significant effect for X, the covariate. Also, we now see a significant effect for group differences. Note too that the mean square for between groups (gpid) decreased from 102 to 92.67, but the error term decreased substantially more from its unadjusted value of 42.3.

We can also test for an interaction between the covariate and the between factor, a significant result reflecting a violation of the assumption of equal regression slopes within cells. To do this, we set up a second run in SPSS as follows:

UNIANOVA y BY gpid WITH x
  /METHOD = SSTYPE(3)
  /INTERCEPT = INCLUDE
  /CRITERIA = ALPHA(.05)
  /DESIGN = x, gpid, x BY gpid.
TITLE 'test of assumption using UNIANOVA'.

This yields a Tests of Between-Subjects Effects table for dependent variable y (sources: Corrected Model, Intercept, x, gpid, gpid * x, Error, Total, Corrected Total; numeric entries omitted).
a. R Squared = .570 (Adjusted R Squared = .498)

The nonsignificant interaction (p > .05) means our homogeneity of regression assumption is tenable.

MANCOVA:

In the same way we can generalize from ANOVA to MANOVA, we can generalize from ANCOVA to MANCOVA. The parallels hold fairly well, except that instead of carrying out the significance test on adjusted MSB and MSW, we carry out the significance test on adjusted W and T matrices:

  Lambda* = |W*| / |T*|

The logic of extending from ANCOVA to MANCOVA doesn't change from the logic of extending from ANOVA to MANOVA, except that testing the assumptions of MANCOVA becomes somewhat more difficult, especially if there are multiple covariates involved. Basically it involves testing for interactions among all the covariates and of the covariates with the factors. It can also be extended to N-way MANCOVA.
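The Wilks criterion above is simply a ratio of determinants of the adjusted within-groups and total SSCP matrices. A minimal sketch with hypothetical 2x2 matrices (two DVs):

```python
# Wilks' Lambda = |W*| / |T*|, the ratio of the determinants of the
# adjusted within-groups and total SSCP matrices. Values near 1 mean
# the groups explain little; values near 0 mean they explain a lot.

def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# Hypothetical adjusted SSCP matrices for two DVs
W_star = [[40.0, 10.0],
          [10.0, 30.0]]   # within-groups (error)
B_star = [[25.0, 5.0],
          [5.0, 20.0]]    # between-groups (hypothesis)
T_star = [[W_star[i][j] + B_star[i][j] for j in range(2)] for i in range(2)]

wilks = det2(W_star) / det2(T_star)
print(round(wilks, 4))
```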
Example:

While I generally do not prefer taking an example from the book being used, the one-way MANCOVA example from Stevens (2002) is a bit of a disaster, so it is worth taking up as our example. It comes from Novince (1977). The syntax for the example is listed below:

TITLE 'NOVINCE DATA MANCOVA'.
DATA LIST FREE/GPID AVOID NEGEVAL SOCINT SRINV PREAVOID PRENEG PRESOCI PRESR.
BEGIN DATA.
(data lines omitted)
END DATA.
MANOVA AVOID NEGEVAL SOCINT SRINV PREAVOID PRENEG PRESOCI PRESR BY GPID(1,3)
  /ANALYSIS=AVOID NEGEVAL SOCINT SRINV WITH PREAVOID PRENEG PRESOCI PRESR
  /PRINT=PMEANS
  /DESIGN
  /ANALYSIS=AVOID NEGEVAL SOCINT SRINV
  /DESIGN = PREAVOID + PRENEG + PRESOCI + PRESR, GPID,
    PREAVOID BY GPID + PRENEG BY GPID + PRESOCI BY GPID + PRESR BY GPID
  /ANALYSIS=PREAVOID,PRENEG,PRESOCI,PRESR.
TITLE 'INCLUDING ALL FOUR VARS & COVARIATES'.

The covariate-by-GPID terms in the second DESIGN subcommand represent the test for the homogeneity of regression slopes, and the final ANALYSIS subcommand generates results needed for the Bryant-Paulson post-hoc procedure. This code generates, among other things, multivariate and univariate tests for the significance of the covariates:

EFFECT.. WITHIN CELLS Regression
Multivariate Tests of Significance (S = 4, M = -1/2, N = 10 1/2): Pillais, Hotellings, Wilks, Roys (numeric entries omitted)
Univariate F-tests with (4,26) D.F. for AVOID, NEGEVAL, SOCINT, SRINV (numeric entries omitted)

We see here that there is a significant relationship between the covariates and the DVs. All F's are significant at p < .05.
We can also evaluate the assumption of homogeneity of regression slopes via the following information included in the output:

EFFECT.. PREAVOID BY GPID + PRENEG BY GPID + PRESOCI BY GPID + PRESR BY GPID
Multivariate Tests of Significance (S = 4, M = 1 1/2, N = 6 1/2): Pillais, Hotellings, Wilks, Roys (numeric entries omitted)
Univariate F-tests with (8,18) D.F. for AVOID, NEGEVAL, SOCINT, SRINV (numeric entries omitted)

This table indicates that the assumption of homogeneity of regression slopes is tenable.

Finally, of particular interest is whether there are group differences on the DVs after correcting for the covariates. This can be found in the table below:

EFFECT.. GPID
Multivariate Tests of Significance (S = 2, M = 1/2, N = 10 1/2): Pillais, Hotellings, Wilks, Roys (numeric entries omitted)
Note.. F statistic for WILKS' Lambda is exact.
Univariate F-tests with (2,26) D.F. for AVOID, NEGEVAL, SOCINT, SRINV (numeric entries omitted)

The multivariate test indicates a significant multivariate effect, and the following ANOVAs indicate that there is a significant effect for each DV. Now we need to collect information for the Bryant-Paulson procedure to test for significant differences in the avoidance response between groups 1 & 2, after correcting for the covariates. The adjusted error term from the EFFECT.. WITHIN CELLS Regression table (above) is needed, as well as Hotelling's trace from the following table:
EFFECT.. GPID
Multivariate Tests of Significance (S = 2, M = 1/2, N = 12 1/2): Pillais, Hotellings, Wilks, Roys (numeric entries omitted)
Note.. F statistic for WILKS' Lambda is exact.

The Bryant-Paulson statistic with multiple (random) covariates is

  BP = (Y*_i - Y*_j) / sqrt( MS*_W [1 + tr(B_x W_x^-1)] / n )

Substituting the obtained values (with tr(B_x W_x^-1) = .312), this comparison does not exceed the critical value of 3.76 (using 3 instead of four covariates and rounding down to df_e = 24). The comparison between groups 2 & 3 would exceed this value.

To correct Stevens' (2002) example, run the following syntax and repeat the same post-hoc test:

MANOVA AVOID NEGEVAL PREAVOID PRENEG BY GPID(1,3)
  /ANALYSIS AVOID NEGEVAL WITH PREAVOID PRENEG
  /PRINT=PMEANS
  /DESIGN
  /ANALYSIS AVOID NEGEVAL
  /DESIGN = PREAVOID + PRENEG, GPID, PREAVOID BY GPID + PRENEG BY GPID
  /ANALYSIS=PREAVOID PRENEG.
TITLE 'FOR STEVENS 2002 MANCOVA EXAMPLE 4, P'.

With the same BP formula (here tr(B_x W_x^-1) = .222), the critical value (Table G from Stevens, 2002) is 3.69 (cf. page 366 on use of the table, ignoring the fact that Stevens uses 3 covariates when only 2 were included in the design).

Furthermore, there are specific equations for making contrasts. For instance, in a three-group between-subjects design, if one wanted to compare groups one and two to group three, one could set up a contrast in which the adjusted means are evaluated.
Effect sizes can be calculated in ANCOVA as in ANOVA. In addition to calculating an effect size for a given effect, one can also determine an effect size for the covariate. When calculating an effect size, one needs to use the adjusted sums of squares for the effect.
Appendix A: ANCOVA Numerical Example

Step 1: Set up data table and calculate summary statistics.

In this example, X is a measure of intelligence and Y is a measure of achievement. We are interested in whether there are differences among achievement scores in three different conditions (Groups 1-3). We wish to use intelligence (X) as a covariate. Participants were randomly assigned, so in this instance we are using ANCOVA to increase the sensitivity of the experiment.

The data table (numeric entries omitted) lists, for each of Groups 1-3 and the Total: subject number, X, Y, and the cross-product X*Y, followed by summary rows for n; the sums of X, Y, and X*Y; the means; the sums of squared scores; the sums of squares (SS); and the Pearson r between X and Y.

Summary of statistics: we obtain X and Y scores during the experiment, so we need to complete the following for this table:

- A cross-products column: this is X*Y. For the first subject in Group 1, 98 x 60 = 5,880.
- The n, sums, and means for each column.
- The sum of the squared scores for each column.
- The sums of squares (SS) for each column.
- The correlation between X and Y for each column of X's and Y's.

Step 2: Calculate bracket terms for X, Y & X*Y.

  [1] = (ΣX)²/nk,  (ΣY)²/nk,  (ΣX)(ΣY)/nk
  [2] = Σ(ΣX_j)²/n,  Σ(ΣY_j)²/n,  Σ(ΣX_j)(ΣY_j)/n
  [3] = ΣX²,  ΣY²,  ΣXY

These will now be calculated using the data from the table above (digits lost in transcription are shown as ...).

First, the X's:
  [1] = 456,...   [2] = 456,630.5   [3] = 458,552
Then the Y's:
  [1] = 192,...   [2] = 192,...   [3] = 194,175
And finally, the cross-products (XY):
  [1] = 296,...   [2] = 296,...   [3] = 94,... + ... + ...,690 = 297,660

Step 3: Calculate sums of squares between groups and within groups for X, Y & X*Y.

The between-groups SS (SS_A) is calculated by subtracting term [1] from term [2] (i.e., [2] - [1]), and SS_W is calculated by subtracting [2] from [3] (i.e., [3] - [2]).

  SS_A(x) = [2] - [1]
  SS_W(x) = 458,552 - 456,630.5 = 1,921.5
  SS_A(y) = [2] - [1]
  SS_W(y) = 194,175 - 192,...
  SS_TOT(y) = [3] - [1] = 194,175 - 192,...
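The Step 2 and Step 3 arithmetic is easy to compute directly. A sketch for one variable, using hypothetical balanced data (k = 3 groups of n = 4, not the appendix data):

```python
# Bracket terms for one variable, with k groups of n scores each:
#   [1] = (grand sum)^2 / (n*k)            (correction term)
#   [2] = sum over groups of (group sum)^2 / n
#   [3] = sum of all squared scores
# Then SS_between = [2] - [1] and SS_within = [3] - [2].

def bracket_terms(groups):
    n = len(groups[0])                      # common group size (balanced)
    k = len(groups)
    grand_sum = sum(sum(g) for g in groups)
    b1 = grand_sum ** 2 / (n * k)
    b2 = sum(sum(g) ** 2 / n for g in groups)
    b3 = sum(x ** 2 for g in groups for x in g)
    return b1, b2, b3

# Hypothetical covariate scores for three groups of four
x_groups = [[98, 94, 102, 106], [90, 96, 100, 94], [104, 108, 100, 96]]
b1, b2, b3 = bracket_terms(x_groups)
print("SS_between =", b2 - b1, " SS_within =", b3 - b2)
```

The same function applied to the Y scores gives SS_A(y) and SS_W(y); replacing the squares with cross-products gives the sums of products in the next step.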
  SP_A = 296,... - 296,...
  SP_W = 297,660 - 296,551.5 = 1,108.5

Step 4: Compute adjusted SS_A and SS_W.

The following adjusted sums of squares, which are adjusted by taking out the effect of the covariate, are the ones actually used for the significance test. The formulas look a bit daunting, but they are fairly efficient.

  SS'_A = SS_A(y) - [ (SP_A + SP_W)² / (SS_A(x) + SS_W(x)) - SP_W² / SS_W(x) ]
  SS'_W = SS_W(y) - SP_W² / SS_W(x)

Step 5: Construct the ANOVA summary table and obtain SS_COV.

  Sources: Covariate, Adjusted Between, Adjusted Within, Total; columns: SS, df, MS, F, p(F) (numeric entries omitted)

The total sums of squares is taken from above. Y is the dependent variable; therefore, SS_TOT(y) becomes SS_TOT. The df_TOT, as always, is N - 1. The SS for the covariate is obtained as SS_TOT - SS'_A - SS'_W, and each covariate has one degree of freedom. SS'_A has 2 degrees of freedom (k - 1) and SS'_W has N - k - c degrees of freedom, where c is the number of covariates. Mean squares and F statistics are calculated in the usual way. A significant covariate implies that the covariate accounts for a significant proportion of variation in Y. Finally, an effect size can be estimated as

  η² = SS'_A / (SS'_A + SS'_W)

Step 6: Mean comparisons with adjusted means.

In order to make comparisons between groups, we need to adjust the group means so that the new group means have the effect of the covariate removed. These adjusted means are used in post-hoc comparisons. We begin by calculating the pooled unstandardized regression coefficient; here "pooled" means it is the regression coefficient arrived at by pooling information across our treatment conditions. The formula is simple, and we have already calculated the intermediate terms we need:

  b = SP_W / SS_W(x) = 1,108.5 / 1,921.5 = 0.577
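The Step 4 adjustment and the pooled slope can be sketched directly from these formulas. The values below are illustrative, not the appendix data:

```python
# Adjusted sums of squares for a one-covariate ANCOVA:
#   SS'_A = SS_A(y) - [ (SP_A + SP_W)^2 / (SS_A(x) + SS_W(x)) - SP_W^2 / SS_W(x) ]
#   SS'_W = SS_W(y) - SP_W^2 / SS_W(x)
# plus the pooled regression coefficient b = SP_W / SS_W(x).

def ancova_adjust(ss_ax, ss_wx, ss_ay, ss_wy, sp_a, sp_w):
    ss_a_adj = ss_ay - ((sp_a + sp_w) ** 2 / (ss_ax + ss_wx) - sp_w ** 2 / ss_wx)
    ss_w_adj = ss_wy - sp_w ** 2 / ss_wx
    b = sp_w / ss_wx
    return ss_a_adj, ss_w_adj, b

# Illustrative values (not the appendix data)
ss_a_adj, ss_w_adj, b = ancova_adjust(
    ss_ax=90.0, ss_wx=1800.0,    # covariate SS between / within
    ss_ay=210.0, ss_wy=1400.0,   # DV SS between / within
    sp_a=120.0, sp_w=900.0,      # sums of products between / within
)
eta_sq = ss_a_adj / (ss_a_adj + ss_w_adj)   # effect size from Step 5
print(round(ss_a_adj, 3), round(ss_w_adj, 3), round(b, 3), round(eta_sq, 3))
```

Note that SS'_W shrinks by SP_W²/SS_W(x), which is where the hoped-for gain in sensitivity comes from.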
Next, we need to adjust the means. The equation is simple, and it is convenient to construct a table with the means for X, the unadjusted means for Y, and the adjusted means for Y (rows: Groups 1-3 and Total; numeric entries omitted). The formula for computing adjusted means is as follows:

  Y'_j = Ybar_j - b(Xbar_j - Xbar)

So, for group 1, the adjusted mean is the unadjusted mean minus b times the deviation of the group's X mean from the grand X mean; in the worked example this adjustment term, b(Xbar_1 - Xbar), came to -2.194. Notice that the grand mean (the Total row for X) is not adjusted, just the group means. ANCOVA adjusts the group means so that they are closer to the grand mean. It is hoped that the reduction in error variance (MS_W) more than offsets this adjustment.

Prior to doing any post-hoc tests, we must make a second adjustment, to the error term to be used in the Tukey HSD procedure. This adjustment corrects for group differences on X and is computed as follows:

  MS''_W = MS'_W [ 1 + SS_A(x) / ((k - 1) SS_W(x)) ]

Looking up q from the table in your textbook (using k = 3 and df_W = 30; the actual df_W was 32, but because 32 wasn't in our table, I rounded down to 30, which makes the HSD value slightly more conservative than rounding up) we find 3.49 for an alpha of .05. Substituting MS''_W for MS_W, we calculate HSD as

  HSD = q * sqrt( MS''_W / n ) = 3.49 * sqrt( MS''_W / 12 )

Hence, the only significant post-hoc test is that Group 3 is significantly higher than Group 2. Another approach is the Bryant-Paulson generalization of Tukey's simultaneous interval approach. For our example, the formula is:
  BP = (Y*_i - Y*_j) / sqrt( MS*_W [1 + MS_B(x) / SS_W(x)] / n )

where MS*_W is the adjusted MS within, MS_B(x) is the between-groups MS for the covariate, SS_W(x) is the within-groups SS for X, Y*_i and Y*_j are the adjusted means for the two groups being compared, and n is the common group size (or the harmonic mean of the group sizes in the case of unequal cells). MS_B(x) and SS_W(x) can be obtained by running an ANOVA with the covariate as the DV. Note that there is a different formula for nonrandomized studies.

For our example, substituting SS_W(x) = 1,921.5 gives a bracketed term [1 + MS_B(x)/SS_W(x)] of 1.0273. The critical value requires a special table (it can be found in Stevens, 2002). The Bryant-Paulson test appears to be consistent with Tukey's, finding significant differences only for groups 2 and 3.
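The one-covariate, randomized-design BP statistic can be sketched directly from this formula. The inputs below are illustrative, not the appendix data (apart from SS_W(x) = 1,921.5):

```python
import math

# Bryant-Paulson statistic for a randomized one-covariate design:
#   BP = (Y*_i - Y*_j) / sqrt( MS*_W * [1 + MS_Bx / SS_Wx] / n )
# Compare |BP| against the Bryant-Paulson critical value (a special
# table, e.g. in Stevens, 2002), not the ordinary studentized range.

def bryant_paulson(adj_mean_i, adj_mean_j, ms_w_adj, ms_bx, ss_wx, n):
    multiplier = 1 + ms_bx / ss_wx      # inflation for covariate error
    return (adj_mean_i - adj_mean_j) / math.sqrt(ms_w_adj * multiplier / n)

# Illustrative values (not the appendix data, except SS_Wx)
bp = bryant_paulson(adj_mean_i=58.3, adj_mean_j=53.7,
                    ms_w_adj=30.0, ms_bx=52.0, ss_wx=1921.5, n=12)
print(round(bp, 3))
```

The multiplier term is what distinguishes BP from an ordinary Tukey statistic on the adjusted means: it widens the interval to account for the covariate being random rather than fixed.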
Formula for the t-test: How the t-test Relates to the Distribution of the Data for the Groups Formula for the t-test: Formula for the Standard Error of the Difference Between the Means Formula for the
More informationM A N O V A. Multivariate ANOVA. Data
M A N O V A Multivariate ANOVA V. Čekanavičius, G. Murauskas 1 Data k groups; Each respondent has m measurements; Observations are from the multivariate normal distribution. No outliers. Covariance matrices
More informationMIXED MODELS FOR REPEATED (LONGITUDINAL) DATA PART 2 DAVID C. HOWELL 4/1/2010
MIXED MODELS FOR REPEATED (LONGITUDINAL) DATA PART 2 DAVID C. HOWELL 4/1/2010 Part 1 of this document can be found at http://www.uvm.edu/~dhowell/methods/supplements/mixed Models for Repeated Measures1.pdf
More informationIntroduction to the Analysis of Variance (ANOVA) Computing One-Way Independent Measures (Between Subjects) ANOVAs
Introduction to the Analysis of Variance (ANOVA) Computing One-Way Independent Measures (Between Subjects) ANOVAs The Analysis of Variance (ANOVA) The analysis of variance (ANOVA) is a statistical technique
More informationMULTIVARIATE ANALYSIS OF VARIANCE
MULTIVARIATE ANALYSIS OF VARIANCE RAJENDER PARSAD AND L.M. BHAR Indian Agricultural Statistics Research Institute Library Avenue, New Delhi - 0 0 lmb@iasri.res.in. Introduction In many agricultural experiments,
More information5:1LEC - BETWEEN-S FACTORIAL ANOVA
5:1LEC - BETWEEN-S FACTORIAL ANOVA The single-factor Between-S design described in previous classes is only appropriate when there is just one independent variable or factor in the study. Often, however,
More informationUnivariate analysis. Simple and Multiple Regression. Univariate analysis. Simple Regression How best to summarise the data?
Univariate analysis Example - linear regression equation: y = ax + c Least squares criteria ( yobs ycalc ) = yobs ( ax + c) = minimum Simple and + = xa xc xy xa + nc = y Solve for a and c Univariate analysis
More informationY (Nominal/Categorical) 1. Metric (interval/ratio) data for 2+ IVs, and categorical (nominal) data for a single DV
1 Neuendorf Discriminant Analysis The Model X1 X2 X3 X4 DF2 DF3 DF1 Y (Nominal/Categorical) Assumptions: 1. Metric (interval/ratio) data for 2+ IVs, and categorical (nominal) data for a single DV 2. Linearity--in
More informationREVIEW 8/2/2017 陈芳华东师大英语系
REVIEW Hypothesis testing starts with a null hypothesis and a null distribution. We compare what we have to the null distribution, if the result is too extreme to belong to the null distribution (p
More informationANOVA in SPSS. Hugo Quené. opleiding Taalwetenschap Universiteit Utrecht Trans 10, 3512 JK Utrecht.
ANOVA in SPSS Hugo Quené hugo.quene@let.uu.nl opleiding Taalwetenschap Universiteit Utrecht Trans 10, 3512 JK Utrecht 7 Oct 2005 1 introduction In this example I ll use fictitious data, taken from http://www.ruf.rice.edu/~mickey/psyc339/notes/rmanova.html.
More informationMultiple Comparisons
Multiple Comparisons Error Rates, A Priori Tests, and Post-Hoc Tests Multiple Comparisons: A Rationale Multiple comparison tests function to tease apart differences between the groups within our IV when
More informationMultivariate Analysis of Variance
Chapter 15 Multivariate Analysis of Variance Jolicouer and Mosimann studied the relationship between the size and shape of painted turtles. The table below gives the length, width, and height (all in mm)
More informationYour schedule of coming weeks. One-way ANOVA, II. Review from last time. Review from last time /22/2004. Create ANOVA table
Your schedule of coming weeks One-way ANOVA, II 9.07 //00 Today: One-way ANOVA, part II Next week: Two-way ANOVA, parts I and II. One-way ANOVA HW due Thursday Week of May Teacher out of town all week
More informationApplied Multivariate Statistical Modeling Prof. J. Maiti Department of Industrial Engineering and Management Indian Institute of Technology, Kharagpur
Applied Multivariate Statistical Modeling Prof. J. Maiti Department of Industrial Engineering and Management Indian Institute of Technology, Kharagpur Lecture - 29 Multivariate Linear Regression- Model
More informationpsyc3010 lecture 2 factorial between-ps ANOVA I: omnibus tests
psyc3010 lecture 2 factorial between-ps ANOVA I: omnibus tests last lecture: introduction to factorial designs next lecture: factorial between-ps ANOVA II: (effect sizes and follow-up tests) 1 general
More informationPsy 420 Final Exam Fall 06 Ainsworth. Key Name
Psy 40 Final Exam Fall 06 Ainsworth Key Name Psy 40 Final A researcher is studying the effect of Yoga, Meditation, Anti-Anxiety Drugs and taking Psy 40 and the anxiety levels of the participants. Twenty
More informationKeppel, G. & Wickens, T. D. Design and Analysis Chapter 4: Analytical Comparisons Among Treatment Means
Keppel, G. & Wickens, T. D. Design and Analysis Chapter 4: Analytical Comparisons Among Treatment Means 4.1 The Need for Analytical Comparisons...the between-groups sum of squares averages the differences
More informationRon Heck, Fall Week 8: Introducing Generalized Linear Models: Logistic Regression 1 (Replaces prior revision dated October 20, 2011)
Ron Heck, Fall 2011 1 EDEP 768E: Seminar in Multilevel Modeling rev. January 3, 2012 (see footnote) Week 8: Introducing Generalized Linear Models: Logistic Regression 1 (Replaces prior revision dated October
More informationsphericity, 5-29, 5-32 residuals, 7-1 spread and level, 2-17 t test, 1-13 transformations, 2-15 violations, 1-19
additive tree structure, 10-28 ADDTREE, 10-51, 10-53 EXTREE, 10-31 four point condition, 10-29 ADDTREE, 10-28, 10-51, 10-53 adjusted R 2, 8-7 ALSCAL, 10-49 ANCOVA, 9-1 assumptions, 9-5 example, 9-7 MANOVA
More information8/04/2011. last lecture: correlation and regression next lecture: standard MR & hierarchical MR (MR = multiple regression)
psyc3010 lecture 7 analysis of covariance (ANCOVA) last lecture: correlation and regression next lecture: standard MR & hierarchical MR (MR = multiple regression) 1 announcements quiz 2 correlation and
More informationSTAT 3900/4950 MIDTERM TWO Name: Spring, 2015 (print: first last ) Covered topics: Two-way ANOVA, ANCOVA, SLR, MLR and correlation analysis
STAT 3900/4950 MIDTERM TWO Name: Spring, 205 (print: first last ) Covered topics: Two-way ANOVA, ANCOVA, SLR, MLR and correlation analysis Instructions: You may use your books, notes, and SPSS/SAS. NO
More informationThe One-Way Independent-Samples ANOVA. (For Between-Subjects Designs)
The One-Way Independent-Samples ANOVA (For Between-Subjects Designs) Computations for the ANOVA In computing the terms required for the F-statistic, we won t explicitly compute any sample variances or
More informationLeast Squares Analyses of Variance and Covariance
Least Squares Analyses of Variance and Covariance One-Way ANOVA Read Sections 1 and 2 in Chapter 16 of Howell. Run the program ANOVA1- LS.sas, which can be found on my SAS programs page. The data here
More informationIntroducing Generalized Linear Models: Logistic Regression
Ron Heck, Summer 2012 Seminars 1 Multilevel Regression Models and Their Applications Seminar Introducing Generalized Linear Models: Logistic Regression The generalized linear model (GLM) represents and
More informationAnalyses of Variance. Block 2b
Analyses of Variance Block 2b Types of analyses 1 way ANOVA For more than 2 levels of a factor between subjects ANCOVA For continuous co-varying factor, between subjects ANOVA for factorial design Multiple
More informationContrasts (in general)
10/1/015 6-09/749 Experimental Design for Behavioral and Social Sciences Contrasts (in general) Context: An ANOVA rejects the overall null hypothesis that all k means of some factor are not equal, i.e.,
More informationGLM Repeated-measures designs: One within-subjects factor
GLM Repeated-measures designs: One within-subjects factor Reading: SPSS dvanced Models 9.0: 2. Repeated Measures Homework: Sums of Squares for Within-Subject Effects Download: glm_withn1.sav (Download
More informationCOMPARING SEVERAL MEANS: ANOVA
LAST UPDATED: November 15, 2012 COMPARING SEVERAL MEANS: ANOVA Objectives 2 Basic principles of ANOVA Equations underlying one-way ANOVA Doing a one-way ANOVA in R Following up an ANOVA: Planned contrasts/comparisons
More informationLecture 5: Hypothesis tests for more than one sample
1/23 Lecture 5: Hypothesis tests for more than one sample Måns Thulin Department of Mathematics, Uppsala University thulin@math.uu.se Multivariate Methods 8/4 2011 2/23 Outline Paired comparisons Repeated
More informationUsing SPSS for One Way Analysis of Variance
Using SPSS for One Way Analysis of Variance This tutorial will show you how to use SPSS version 12 to perform a one-way, between- subjects analysis of variance and related post-hoc tests. This tutorial
More information36-309/749 Experimental Design for Behavioral and Social Sciences. Dec 1, 2015 Lecture 11: Mixed Models (HLMs)
36-309/749 Experimental Design for Behavioral and Social Sciences Dec 1, 2015 Lecture 11: Mixed Models (HLMs) Independent Errors Assumption An error is the deviation of an individual observed outcome (DV)
More informationAnalysis of Covariance (ANCOVA)
Analysis of Covariance (ANCOVA) ANOVA can be extended to include one or more continuous variables that predict the outcome (or dependent variable). Continuous variables such as these, that are not part
More informationMultiple t Tests. Introduction to Analysis of Variance. Experiments with More than 2 Conditions
Introduction to Analysis of Variance 1 Experiments with More than 2 Conditions Often the research that psychologists perform has more conditions than just the control and experimental conditions You might
More informationThis module focuses on the logic of ANOVA with special attention given to variance components and the relationship between ANOVA and regression.
WISE ANOVA and Regression Lab Introduction to the WISE Correlation/Regression and ANOVA Applet This module focuses on the logic of ANOVA with special attention given to variance components and the relationship
More informationDescriptive Statistics
*following creates z scores for the ydacl statedp traitdp and rads vars. *specifically adding the /SAVE subcommand to descriptives will create z. *scores for whatever variables are in the command. DESCRIPTIVES
More informationA Re-Introduction to General Linear Models
A Re-Introduction to General Linear Models Today s Class: Big picture overview Why we are using restricted maximum likelihood within MIXED instead of least squares within GLM Linear model interpretation
More informationThe One-Way Repeated-Measures ANOVA. (For Within-Subjects Designs)
The One-Way Repeated-Measures ANOVA (For Within-Subjects Designs) Logic of the Repeated-Measures ANOVA The repeated-measures ANOVA extends the analysis of variance to research situations using repeated-measures
More informationInteractions between Binary & Quantitative Predictors
Interactions between Binary & Quantitative Predictors The purpose of the study was to examine the possible joint effects of the difficulty of the practice task and the amount of practice, upon the performance
More informationMultiple Regression. More Hypothesis Testing. More Hypothesis Testing The big question: What we really want to know: What we actually know: We know:
Multiple Regression Ψ320 Ainsworth More Hypothesis Testing What we really want to know: Is the relationship in the population we have selected between X & Y strong enough that we can use the relationship
More informationIntroduction to Business Statistics QM 220 Chapter 12
Department of Quantitative Methods & Information Systems Introduction to Business Statistics QM 220 Chapter 12 Dr. Mohammad Zainal 12.1 The F distribution We already covered this topic in Ch. 10 QM-220,
More informationAn Old Research Question
ANOVA An Old Research Question The impact of TV on high-school grade Watch or not watch Two groups The impact of TV hours on high-school grade Exactly how much TV watching would make difference Multiple
More informationOne-Way ANOVA Source Table J - 1 SS B / J - 1 MS B /MS W. Pairwise Post-Hoc Comparisons of Means
One-Way ANOVA Source Table ANOVA MODEL: ij = µ* + α j + ε ij H 0 : µ 1 = µ =... = µ j or H 0 : Σα j = 0 Source Sum of Squares df Mean Squares F Between Groups n j ( j - * ) J - 1 SS B / J - 1 MS B /MS
More informationResearch Methodology: Tools
MSc Business Administration Research Methodology: Tools Applied Data Analysis (with SPSS) Lecture 09: Introduction to Analysis of Variance (ANOVA) April 2014 Prof. Dr. Jürg Schwarz Lic. phil. Heidi Bruderer
More informationCh. 16: Correlation and Regression
Ch. 1: Correlation and Regression With the shift to correlational analyses, we change the very nature of the question we are asking of our data. Heretofore, we were asking if a difference was likely to
More informationWISE Regression/Correlation Interactive Lab. Introduction to the WISE Correlation/Regression Applet
WISE Regression/Correlation Interactive Lab Introduction to the WISE Correlation/Regression Applet This tutorial focuses on the logic of regression analysis with special attention given to variance components.
More informationPSY 216. Assignment 12 Answers. Explain why the F-ratio is expected to be near 1.00 when the null hypothesis is true.
PSY 21 Assignment 12 Answers 1. Problem 1 from the text Explain why the F-ratio is expected to be near 1.00 when the null hypothesis is true. When H0 is true, the treatment had no systematic effect. In
More informationWELCOME! Lecture 13 Thommy Perlinger
Quantitative Methods II WELCOME! Lecture 13 Thommy Perlinger Parametrical tests (tests for the mean) Nature and number of variables One-way vs. two-way ANOVA One-way ANOVA Y X 1 1 One dependent variable
More informationRegression With a Categorical Independent Variable: Mean Comparisons
Regression With a Categorical Independent Variable: Mean Lecture 16 March 29, 2005 Applied Regression Analysis Lecture #16-3/29/2005 Slide 1 of 43 Today s Lecture comparisons among means. Today s Lecture
More informationChapter Seven: Multi-Sample Methods 1/52
Chapter Seven: Multi-Sample Methods 1/52 7.1 Introduction 2/52 Introduction The independent samples t test and the independent samples Z test for a difference between proportions are designed to analyze
More informationMultivariate Regression (Chapter 10)
Multivariate Regression (Chapter 10) This week we ll cover multivariate regression and maybe a bit of canonical correlation. Today we ll mostly review univariate multivariate regression. With multivariate
More informationPLS205!! Lab 9!! March 6, Topic 13: Covariance Analysis
PLS205!! Lab 9!! March 6, 2014 Topic 13: Covariance Analysis Covariable as a tool for increasing precision Carrying out a full ANCOVA Testing ANOVA assumptions Happiness! Covariable as a Tool for Increasing
More informationHypothesis testing, part 2. With some material from Howard Seltman, Blase Ur, Bilge Mutlu, Vibha Sazawal
Hypothesis testing, part 2 With some material from Howard Seltman, Blase Ur, Bilge Mutlu, Vibha Sazawal 1 CATEGORICAL IV, NUMERIC DV 2 Independent samples, one IV # Conditions Normal/Parametric Non-parametric
More informationAnalysis of Variance (ANOVA)
Analysis of Variance (ANOVA) Used for comparing or more means an extension of the t test Independent Variable (factor) = categorical (qualita5ve) predictor should have at least levels, but can have many
More informationKeppel, G. & Wickens, T.D. Design and Analysis Chapter 2: Sources of Variability and Sums of Squares
Keppel, G. & Wickens, T.D. Design and Analysis Chapter 2: Sources of Variability and Sums of Squares K&W introduce the notion of a simple experiment with two conditions. Note that the raw data (p. 16)
More informationLecture 6: Single-classification multivariate ANOVA (k-group( MANOVA)
Lecture 6: Single-classification multivariate ANOVA (k-group( MANOVA) Rationale and MANOVA test statistics underlying principles MANOVA assumptions Univariate ANOVA Planned and unplanned Multivariate ANOVA
More informationRegression With a Categorical Independent Variable
Regression With a Independent Variable Lecture 10 November 5, 2008 ERSH 8320 Lecture #10-11/5/2008 Slide 1 of 54 Today s Lecture Today s Lecture Chapter 11: Regression with a single categorical independent
More informationKeppel, G. & Wickens, T. D. Design and Analysis Chapter 12: Detailed Analyses of Main Effects and Simple Effects
Keppel, G. & Wickens, T. D. Design and Analysis Chapter 1: Detailed Analyses of Main Effects and Simple Effects If the interaction is significant, then less attention is paid to the two main effects, and
More informationDraft Proof - Do not copy, post, or distribute. Chapter Learning Objectives REGRESSION AND CORRELATION THE SCATTER DIAGRAM
1 REGRESSION AND CORRELATION As we learned in Chapter 9 ( Bivariate Tables ), the differential access to the Internet is real and persistent. Celeste Campos-Castillo s (015) research confirmed the impact
More informationN J SS W /df W N - 1
One-Way ANOVA Source Table ANOVA MODEL: ij = µ* + α j + ε ij H 0 : µ = µ =... = µ j or H 0 : Σα j = 0 Source Sum of Squares df Mean Squares F J Between Groups nj( j * ) J - SS B /(J ) MS B /MS W = ( N
More informationMultilevel Models in Matrix Form. Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2
Multilevel Models in Matrix Form Lecture 7 July 27, 2011 Advanced Multivariate Statistical Methods ICPSR Summer Session #2 Today s Lecture Linear models from a matrix perspective An example of how to do
More informationReview of Multiple Regression
Ronald H. Heck 1 Let s begin with a little review of multiple regression this week. Linear models [e.g., correlation, t-tests, analysis of variance (ANOVA), multiple regression, path analysis, multivariate
More informationSTA441: Spring Multiple Regression. This slide show is a free open source document. See the last slide for copyright information.
STA441: Spring 2018 Multiple Regression This slide show is a free open source document. See the last slide for copyright information. 1 Least Squares Plane 2 Statistical MODEL There are p-1 explanatory
More informationAn Introduction to Multilevel Models. PSYC 943 (930): Fundamentals of Multivariate Modeling Lecture 25: December 7, 2012
An Introduction to Multilevel Models PSYC 943 (930): Fundamentals of Multivariate Modeling Lecture 25: December 7, 2012 Today s Class Concepts in Longitudinal Modeling Between-Person vs. +Within-Person
More informationCourse Introduction and Overview Descriptive Statistics Conceptualizations of Variance Review of the General Linear Model
Course Introduction and Overview Descriptive Statistics Conceptualizations of Variance Review of the General Linear Model EPSY 905: Multivariate Analysis Lecture 1 20 January 2016 EPSY 905: Lecture 1 -
More informationMORE ON SIMPLE REGRESSION: OVERVIEW
FI=NOT0106 NOTICE. Unless otherwise indicated, all materials on this page and linked pages at the blue.temple.edu address and at the astro.temple.edu address are the sole property of Ralph B. Taylor and
More informationUsing the GLM Procedure in SPSS
Using the GLM Procedure in SPSS Alan Taylor, Department of Psychology Macquarie University 2002-2011 Macquarie University 2002-2011 Contents i Introduction 1 1. General 3 1.1 Factors and Covariates 3
More informationStatistical Techniques II EXST7015 Simple Linear Regression
Statistical Techniques II EXST7015 Simple Linear Regression 03a_SLR 1 Y - the dependent variable 35 30 25 The objective Given points plotted on two coordinates, Y and X, find the best line to fit the data.
More information