General linear models: one- and two-way ANOVA in SPSS, repeated measures ANOVA, multiple linear regression

2-way ANOVA in SPSS: Example 14.1

2-way ANOVA in SPSS: click Add

Repeated measures: the Stroop test [example stimulus: the word BLUE]

The model

Assumptions

If sphericity does not hold:
a) The trick: the Greenhouse-Geisser correction
b) Repeated measures MANOVA
c) More complex models
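(Aside, not from the slides.) A minimal NumPy sketch of the Greenhouse-Geisser idea on made-up data: epsilon is estimated from the covariance matrix of the repeated conditions and is then used to shrink the ANOVA degrees of freedom. All numbers below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(12, 4))        # 12 subjects x 4 repeated conditions
    k = data.shape[1]

    S = np.cov(data, rowvar=False)         # covariance matrix of the conditions
    # Double-centre the covariance matrix
    S_star = S - S.mean(axis=1, keepdims=True) - S.mean(axis=0, keepdims=True) + S.mean()

    # Greenhouse-Geisser epsilon: (trace)^2 / ((k - 1) * sum of squared elements)
    epsilon = np.trace(S_star) ** 2 / ((k - 1) * np.sum(S_star ** 2))
    print(f"GG epsilon = {epsilon:.3f}")   # 1.0 means sphericity holds; smaller is worse
    # The repeated-measures F test is then evaluated with both DFs multiplied by epsilon.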

And now in SPSS

Multiple linear regression
Regression: one variable is considered dependent on the other(s).
Correlation: no variable is considered dependent on the other(s).
Multiple regression: more than one independent variable.
Linear regression: the dependent variable is scalar and linearly dependent on the independent variable(s).
Logistic regression: the dependent variable is categorical (hopefully only two levels) and follows an S-shaped relation.

Remember the simple linear regression? If Y is linearly dependent on X, simple linear regression is used: $Y_j = \alpha + \beta X_j$. Here $\alpha$ is the intercept, the value of Y when X = 0, and $\beta$ is the slope, the rate at which Y increases when X increases.
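(Aside.) A minimal Python/NumPy sketch of the least-squares estimates of $\alpha$ and $\beta$ on made-up data (the lecture itself uses SPSS):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 10, size=30)
    Y = 2.0 + 0.8 * X + rng.normal(scale=1.0, size=30)   # made-up data

    # slope and intercept by least squares
    beta = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
    alpha = Y.mean() - beta * X.mean()
    print(f"intercept = {alpha:.2f}, slope = {beta:.2f}")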

Is the relation linear? [scatter plot]

Multiple linear regression. If Y is linearly dependent on more than one independent variable: $Y_j = \alpha + \beta_1 X_{1j} + \beta_2 X_{2j}$. $\alpha$ is the intercept, the value of Y when $X_1$ and $X_2$ are 0; $\beta_1$ and $\beta_2$ are termed partial regression coefficients, and $\beta_1$ expresses the change of Y for one unit of $X_1$ when $X_2$ is kept constant. [3D scatter plot]

Multiple linear regression: residual error and estimation. As the collected data are not expected to fall exactly in a plane, an error term must be added: $Y_j = \alpha + \beta_1 X_{1j} + \beta_2 X_{2j} + \epsilon_j$. The error terms sum to zero. Estimating the dependent factor and the population parameters: $\hat{Y}_j = a + b_1 X_{1j} + b_2 X_{2j}$. [3D scatter plot]
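(Aside.) A sketch of estimating a, b1 and b2 by ordinary least squares in NumPy on hypothetical data; the design-matrix route below is my illustration, not the lecture's SPSS procedure:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)  # made-up data

    X = np.column_stack([np.ones(n), X1, X2])      # design matrix with intercept column
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares estimates
    a, b1, b2 = coef
    Y_hat = X @ coef                               # estimated (fitted) values
    print(f"a = {a:.2f}, b1 = {b1:.2f}, b2 = {b2:.2f}")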

Multiple linear regression: general equations. In general, a finite number (m) of independent variables may be used to estimate the hyperplane: $Y_j = \alpha + \sum_{i=1}^{m} \beta_i X_{ij} + \epsilon_j$. The number of sample points must be at least two more than the number of variables.
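(Aside.) The same fit for a general m, sketched with an n x (m + 1) design matrix on made-up data; the assertion simply restates the sample-size remark above:

    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 30, 4
    X_vars = rng.uniform(0, 10, size=(n, m))            # n sample points, m predictors
    beta_true = np.array([1.0, 0.5, -0.3, 0.8, 0.0])    # hypothetical alpha, beta_1..beta_m
    X = np.column_stack([np.ones(n), X_vars])
    Y = X @ beta_true + rng.normal(size=n)

    assert n >= m + 2, "need at least two more sample points than variables"
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    print(np.round(coef, 2))                            # estimates of alpha, beta_1..beta_m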

Multiple linear regression: least sum of squares. The principle of the least sum of squares is usually used to perform the fit: $\sum_{j=1}^{n} (Y_j - \hat{Y}_j)^2$ is minimized.
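(Aside.) A small sketch on made-up data showing the quantity being minimized, and that perturbing the fitted coefficients only increases it:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])

    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    SS_residual = np.sum((Y - X @ coef) ** 2)            # the least-squares criterion
    worse = coef + np.array([0.1, 0.0, 0.0])             # nudge the intercept
    print(SS_residual, np.sum((Y - X @ worse) ** 2))     # the second value is larger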

Multiple linear regression: an example

Multiple linear regression: the fitted equation

Multiple linear regression: are any of the coefficients significant? F = regression MS / residual MS
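(Aside.) The overall F test computed by hand in a NumPy/SciPy sketch on made-up data; the degrees of freedom (m and n - m - 1) are the standard ones, the slide only gives the MS ratio:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, m = 40, 2
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])

    Y_hat = X @ np.linalg.lstsq(X, Y, rcond=None)[0]
    SS_total = np.sum((Y - Y.mean()) ** 2)
    SS_residual = np.sum((Y - Y_hat) ** 2)
    SS_regression = SS_total - SS_residual

    F = (SS_regression / m) / (SS_residual / (n - m - 1))   # regression MS / residual MS
    p = stats.f.sf(F, m, n - m - 1)
    print(f"F = {F:.2f}, p = {p:.2g}")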

Multiple linear regression: is it a good fit?
R^2 = regression SS / total SS (equivalently 1 - residual SS / total SS) expresses how much of the variation is described by the model.
When comparing models with different numbers of variables, the adjusted R-square should be used: R_a^2 = 1 - residual MS / total MS.
The multiple correlation coefficient: R = sqrt(R^2).
The standard error of the estimate = sqrt(residual MS).
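(Aside.) The same fit statistics computed from the sums of squares, sketched on made-up data:

    import numpy as np

    rng = np.random.default_rng(2)
    n, m = 40, 2
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])
    Y_hat = X @ np.linalg.lstsq(X, Y, rcond=None)[0]

    SS_total = np.sum((Y - Y.mean()) ** 2)
    SS_residual = np.sum((Y - Y_hat) ** 2)
    R2 = (SS_total - SS_residual) / SS_total                         # regression SS / total SS
    R2_adj = 1 - (SS_residual / (n - m - 1)) / (SS_total / (n - 1))  # 1 - residual MS / total MS
    R = np.sqrt(R2)                                                  # multiple correlation coefficient
    se_est = np.sqrt(SS_residual / (n - m - 1))                      # sqrt(residual MS)
    print(f"R2 = {R2:.3f}, adj R2 = {R2_adj:.3f}, R = {R:.3f}, SE = {se_est:.3f}")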

Multiple linear regression: which of the coefficients are significant? $s_{b_i}$ is the standard error of the regression parameter $b_i$. A t-test tests whether $b_i$ differs from 0: $t = b_i / s_{b_i}$, with the residual DF; p-values can be found in a table.
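(Aside.) A sketch of the per-coefficient t tests on made-up data; the standard errors come from the usual OLS result residual MS * (X'X)^-1, which the slide does not spell out:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, m = 40, 2
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])

    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    df_resid = n - m - 1
    MS_residual = np.sum((Y - X @ coef) ** 2) / df_resid
    s_b = np.sqrt(np.diag(MS_residual * np.linalg.inv(X.T @ X)))   # standard errors s_bi
    t = coef / s_b
    p = 2 * stats.t.sf(np.abs(t), df_resid)                        # two-sided p-values
    for name, ti, pi in zip(["a", "b1", "b2"], t, p):
        print(f"{name}: t = {ti:.2f}, p = {pi:.2g}")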

Multiple linear regression: which of the variables are most important? The standardized regression coefficient $b'_i$ is a normalized version of $b_i$: $b'_i = b_i \sqrt{\sum x_i^2 / \sum y^2}$.
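(Aside.) The standardized coefficients computed with the formula above, on made-up data:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

    sum_y2 = np.sum((Y - Y.mean()) ** 2)
    for name, xi, bi in zip(["b1", "b2"], [X1, X2], coef[1:]):
        b_std = bi * np.sqrt(np.sum((xi - xi.mean()) ** 2) / sum_y2)   # b'_i
        print(f"{name}' = {b_std:.3f}")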

Multiple linear regression: multicollinearity. If two factors are highly correlated, the estimated b's become inaccurate. This goes under the names collinearity, intercorrelation, non-orthogonality and ill-conditioning. Tolerance or variance inflation factors can be computed. Extreme correlation is called singularity, and one of the correlated variables must then be removed.
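(Aside.) A sketch of tolerance and VIF on made-up data in which X2 is deliberately built to correlate with X1. Each predictor is regressed on the remaining predictors; tolerance is 1 - R_i^2 and VIF_i = 1 / (1 - R_i^2):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 40
    X1 = rng.uniform(0, 10, n)
    X2 = 0.9 * X1 + rng.normal(scale=1.0, size=n)    # strongly correlated with X1
    X3 = rng.uniform(0, 5, n)
    predictors = {"X1": X1, "X2": X2, "X3": X3}

    for name, xi in predictors.items():
        others = np.column_stack([np.ones(n)] + [v for k, v in predictors.items() if k != name])
        fitted = others @ np.linalg.lstsq(others, xi, rcond=None)[0]
        r2 = 1 - np.sum((xi - fitted) ** 2) / np.sum((xi - xi.mean()) ** 2)
        print(f"{name}: tolerance = {1 - r2:.3f}, VIF = {1 / (1 - r2):.2f}")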

Multiple linear regression: pairwise correlation coefficients. $r_{xy} = \dfrac{\sum xy}{\sqrt{\sum x^2 \sum y^2}}$, where $x_i = X_i - \bar{X}$ and $y_i = Y_i - \bar{Y}$, so that $\sum x^2 = \sum (X_i - \bar{X})^2$ and $\sum xy = \sum (X_i - \bar{X})(Y_i - \bar{Y})$.
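(Aside.) Pairwise correlations on made-up data, first via NumPy's correlation matrix and then once more by the formula above:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 40
    X1 = rng.uniform(0, 10, n)
    X2 = 0.9 * X1 + rng.normal(scale=1.0, size=n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)

    print(np.round(np.corrcoef(np.vstack([X1, X2, Y])), 3))   # 3 x 3 matrix of r values

    x, y = X1 - X1.mean(), Y - Y.mean()                       # deviations from the means
    r_xy = np.sum(x * y) / np.sqrt(np.sum(x ** 2) * np.sum(y ** 2))
    print(round(r_xy, 3))                                     # matches the corresponding entry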

Multiple linear regression: assumptions. The same as for simple linear regression:
1. The Y's are randomly sampled.
2. The residuals are normally distributed.
3. The residuals have equal variance.
4. The X's are fixed factors (their errors are small).
5. The X's are not perfectly correlated.
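(Aside.) Two quick residual checks for assumptions 2 and 3, sketched on made-up data; the Shapiro-Wilk test is my choice of normality check, not one named in the lecture:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n = 40
    X1, X2 = rng.uniform(0, 10, n), rng.uniform(0, 5, n)
    Y = 1.5 + 0.7 * X1 - 1.2 * X2 + rng.normal(scale=1.0, size=n)
    X = np.column_stack([np.ones(n), X1, X2])

    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    fitted = X @ coef
    resid = Y - fitted

    stat, p = stats.shapiro(resid)                 # assumption 2: normality of residuals
    print(f"Shapiro-Wilk W = {stat:.3f}, p = {p:.3f}")
    # Assumption 3: plot resid against fitted and look for a roughly constant spread.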