Multiple Regression: More Hypothesis Testing


Multiple Regression (Ψ320, Ainsworth)

More Hypothesis Testing
What we really want to know: Is the relationship between X and Y in the population we have selected strong enough that we can use it to make predictions about Y from X?
What we actually know: The extent of the relationship between X and Y, calculated from a single sample drawn from that population.

More Hypothesis Testing
The big question: Is the relationship between X and Y in our sample SO STRONG that it is unlikely the relationship in the population is 0?
What we know: Sampling error will give us measures of the relationship (r and b) that vary from sample to sample. But if the true relationship is not zero, most of our r's and b's will be non-zero.

More Hypothesis Testing
So, hypothesis testing on regression takes the form:
Hypotheses for correlation: H0: ρ = 0, or H0: ρ ≥ 0, or H0: ρ ≤ 0, versus H1: ρ ≠ 0, or H1: ρ < 0, or H1: ρ > 0.
Hypotheses for regression: H0: β = 0, or H0: β ≥ 0, or H0: β ≤ 0, versus H1: β ≠ 0, or H1: β < 0, or H1: β > 0.
Sampling distributions of r or b: under H0 the distribution is centered at 0; under H1 it is centered at some non-zero value.

Multiple Regression
Using several predictors to predict a single dependent variable. Hypothesis testing involves: finding a measure of overall fit, and testing each predictor.
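To make the idea of a sampling distribution concrete, here is a minimal simulation sketch (not part of the original slides; the sample size, number of replications, and effect size are illustrative assumptions). It draws repeated samples and shows how r and b vary around 0 when the population relationship is 0, and around a non-zero value when it is not.

```python
# Hypothetical simulation: sampling distributions of r and b when H0 is true
# (population relationship = 0) versus when it is false. Illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n, n_samples = 100, 2000

def sample_r_and_b(true_slope):
    """Draw one sample of (x, y); return the sample r and least-squares slope b."""
    x = rng.normal(0.0, 1.0, n)
    y = true_slope * x + rng.normal(0.0, 1.0, n)
    r = np.corrcoef(x, y)[0, 1]
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)  # least-squares slope
    return r, b

for true_slope, label in [(0.0, "H0 true (rho = 0)"), (0.3, "H0 false (rho > 0)")]:
    rs, bs = zip(*[sample_r_and_b(true_slope) for _ in range(n_samples)])
    print(f"{label}: mean r = {np.mean(rs):.3f}, mean b = {np.mean(bs):.3f}")
```

Under H0 the simulated r's and b's scatter around 0; under the non-zero slope they scatter around a non-zero value, which is exactly the logic the hypothesis tests exploit.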

An Example
Study by Kliewer et al. (1998) on the effect of violence on internalizing behavior. What is internalizing behavior?
Predictors: degree of witnessing violence, a measure of life stress, and a measure of social support.

Violence and Internalizing
Children 8-12 years old in high-violence areas. Hypothesis: violence and stress lead to internalizing behavior.

Correlations (Pearson):
                      Witnessed Violence   Social Support    Stress    Internalizing
Witnessed Violence          1.000               .080           .048        .20*
Social Support               .080              1.000          -.078       -.173
Stress                       .048              -.078          1.000        .268**
Internalizing                .20*              -.173           .268**     1.000
*  Correlation is significant at the 0.05 level (2-tailed).
** Correlation is significant at the 0.01 level (2-tailed).

Note: both Stress and Witnessing Violence are significantly correlated with Internalizing.
Note that the predictors are largely independent of each other.

Multiple Regression Terms
Multiple Correlation Coefficient: also known as the Multiple Correlation, or R; the correlation of Y with a set of X variables.
Multicollinearity: what happens when your set of X variables are correlated with each other. It complicates the analysis; for this class, assume the Xs are independent.
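For readers who want to produce a correlation table like the one above, here is a hedged Python sketch. The data file name and the column names (ViolWit, SocSupp, Stress, Internal) are assumptions for illustration; the original Kliewer et al. data are not reproduced here.

```python
# Sketch: Pearson correlation matrix for the four example variables.
# File name and column names are assumed, not taken from the slides.
import pandas as pd
from scipy import stats

df = pd.read_csv("kliewer_example.csv")          # assumed file holding the study data
cols = ["ViolWit", "SocSupp", "Stress", "Internal"]

print(df[cols].corr())                            # Pearson correlation matrix

# Two-tailed test for one pair, e.g. Stress with Internalizing
r, p = stats.pearsonr(df["Stress"], df["Internal"])
print(f"r = {r:.3f}, p = {p:.3f}")
```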

Multiple Correlation
Directly analogous to simple r. Always capitalized (e.g., R) and always positive. It is the correlation of Ŷ with the observed Y, where Ŷ is computed from the regression equation; in other words, it is the correlation between the whole set of Xs and Y simultaneously. Often reported as R² instead of R.

[Figure: two Venn diagrams of Y, X1, and X2, one for multicollinear Xs and one for independent Xs, with overlap regions numbered 1-4.] X1, X2, and Y represent the variables. The numbered regions reflect variance overlap as follows:
1. Proportion of Y uniquely predicted by X1
2. Proportion of Y uniquely predicted by X2
3. Proportion of variance shared by X1 and X2
4. Proportion of Y redundantly predicted by X1 and X2

R²
When the Xs are independent, R² is simply the sum of the squared simple correlations of Y with each predictor:
R² = r²(Y, X1) + r²(Y, X2) + ... + r²(Y, XK), summed over the K predictors, and R = √R².
When the Xs are multicollinear, R² ≠ the sum of the r²(Y, Xk), because the overlap among the Xs must be taken into account.
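The shortcut for independent Xs can be checked numerically. The sketch below (simulated data with deliberately near-independent predictors; all names and coefficients are illustrative assumptions) compares R² from a fitted regression with the sum of the squared simple correlations.

```python
# Sketch: with (nearly) independent Xs, regression R^2 is approximately the
# sum of the squared simple correlations r(Y, Xk)^2. Simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))                       # three independent predictors
y = 0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(X)).fit()
sum_r2 = sum(np.corrcoef(X[:, k], y)[0, 1] ** 2 for k in range(3))

print(f"R^2 from regression        : {fit.rsquared:.3f}")
print(f"Sum of squared correlations: {sum_r2:.3f}")  # close, not exact, in a finite sample
```

The two values agree closely because the sample correlations among the Xs are near zero; with multicollinear Xs they would diverge.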

R² for the Example (Assuming Independent Xs)
R² = r²(Y, Witness) + r²(Y, SocSupp) + r²(Y, Stress) = .20² + .173² + .268²
   = .040 + .030 + .072 = .143, and R = √.143 = .378

From SPSS (Model Summary):
R = .368, R Square = .136, Adjusted R Square = .108, Std. Error of the Estimate = 2.2737
a. Predictors: (Constant), STRESS, VIOLWIT, SOCSUPP
b. Dependent Variable: INTERNAL

Regression Coefficients
Multiple slopes (one for each X) and one intercept. With independent Xs, the Bs are calculated in the same way as in simple regression. The intercept is now
a = mean(Y) - (b1*mean(X1) + b2*mean(X2) + b3*mean(X3) + ...)

Slopes and Intercept (Coefficients table; Dependent Variable: INTERNAL):
              B      Std. Error    Beta        t      Sig.
(Constant)   .517      1.288                  .401    .689
VIOLWIT      .038       .018       .202      2.111    .037
SOCSUPP     -.076       .043      -.170     -1.766    .081
STRESS       .272       .106       .245      2.560    .012

Ŷ = b1X1 + b2X2 + b3X3 + a
  = (0.038 * ViolWit) - (0.076 * SocSupp) + (0.272 * Stress) + 0.517
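A coefficients table like the SPSS output above can be reproduced with statsmodels. This is a hedged sketch: the data frame and column names are the same assumptions used earlier, and the printed numbers would only match the slide if the original study data were supplied.

```python
# Sketch: regressing Internal on ViolWit, SocSupp and Stress and reading off
# B, standard errors, t and p, as in the SPSS Coefficients table.
# Data frame and column names are assumed, not taken from the slides.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("kliewer_example.csv")
X = sm.add_constant(df[["ViolWit", "SocSupp", "Stress"]])
fit = sm.OLS(df["Internal"], X).fit()

table = pd.DataFrame({"B": fit.params, "Std. Error": fit.bse,
                      "t": fit.tvalues, "Sig.": fit.pvalues})
print(table.round(3))

# Standardized coefficients (the Beta column): refit on z-scored variables
cols = ["Internal", "ViolWit", "SocSupp", "Stress"]
z = (df[cols] - df[cols].mean()) / df[cols].std()
beta_fit = sm.OLS(z["Internal"], z[["ViolWit", "SocSupp", "Stress"]]).fit()
print(beta_fit.params.round(3))
```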

Interpretation
Note that the slopes for Witness and Stress are positive, but the slope for Social Support is negative. Does this make sense?
If you had two subjects with identical Stress and SocSupp scores, a one unit increase in Witness would produce a 0.038 unit increase in Internal.

Interpretation
The same interpretation holds for the other predictors as well. R² has the same interpretation as r²: 13.6% of the variability in Internal is accounted for by variability in Witness, Stress, and SocSupp.

Predictions
Assume a participant's scores were: Witness = 35, SocSupp = 20, and Stress = 5.
Ŷ = (0.038 * ViolWit) - (0.076 * SocSupp) + (0.272 * Stress) + 0.517
  = .038(35) - .076(20) + .272(5) + 0.517
  = 1.33 - 1.52 + 1.36 + 0.517 = 1.69
We would predict 1.69 as the level of Internal for that participant.
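The same prediction can be verified with a few lines of arithmetic. This tiny sketch simply plugs the participant's scores into the estimated equation; the coefficient values (including the 0.517 intercept) are the reconstructed numbers from the slide, so treat them as approximate.

```python
# Sketch: plugging one participant's scores into the estimated equation.
# Coefficients are the reconstructed values from the slide, used for illustration.
b_violwit, b_socsupp, b_stress, intercept = 0.038, -0.076, 0.272, 0.517
violwit, socsupp, stress = 35, 20, 5

y_hat = b_violwit * violwit + b_socsupp * socsupp + b_stress * stress + intercept
print(round(y_hat, 2))   # about 1.69, the predicted Internalizing score
```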

Residuals
Residuals are calculated in the same way as in simple regression. Error variance and the standard error of estimate are also calculated the same way. Note: R² is used instead of r² in all formulas that use r² and 1 - r².

Partitioning Variance
All of the SS (regression, error, and total) are also calculated the same way as in simple regression. Note: R² is used instead of r² in all formulas that use r² and 1 - r².
Testing the overall prediction makes more sense now and is not as redundant with the tests of the Bs. As before, F = s²regression / s²residual.

Hypothesis: Overall Prediction
The test of overall prediction and R² is given in the Analysis of Variance table, as before.

ANOVA (Dependent Variable: INTERNAL; Predictors: (Constant), STRESS, VIOLWIT, SOCSUPP):
Model         Sum of Squares    df    Mean Square      F      Sig.
Regression        73.320         3       24.440       4.97    .003
Residual         467.090        95        4.917
Total            540.410        98

Note: the Mean Square values are the variances (s²regression and s²residual).
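As a check on the ANOVA table, the overall F ratio can be rebuilt by hand from the sums of squares. This sketch uses the values printed above (the residual mean square is recovered as SS/df).

```python
# Sketch: rebuilding the overall F test from the ANOVA sums of squares.
from scipy import stats

ss_reg, df_reg = 73.320, 3
ss_res, df_res = 467.090, 95

ms_reg = ss_reg / df_reg            # 24.440  (variance due to regression)
ms_res = ss_res / df_res            # about 4.917 (residual variance)
F = ms_reg / ms_res                 # about 4.97
p = stats.f.sf(F, df_reg, df_res)   # about .003

print(f"F({df_reg}, {df_res}) = {F:.2f}, p = {p:.3f}")
```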

Testing Slopes and Intercept
Tests on the regression coefficients are given along with the coefficients (Dependent Variable: INTERNAL):

              B      Std. Error    Beta        t      Sig.
(Constant)   .517      1.288                  .401    .689
VIOLWIT      .038       .018       .202      2.111    .037
SOCSUPP     -.076       .043      -.170     -1.766    .081
STRESS       .272       .106       .245      2.560    .012

The (Constant) row is the intercept; the B column contains the Bs and the t column the t-values. The t tests on two of the slopes (VIOLWIT and STRESS) are significant; the tests of the SOCSUPP slope and the intercept are not.
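Each t in the table is simply B divided by its standard error, compared to a t distribution with the residual degrees of freedom (95 here). The sketch below reproduces the slope tests; the B and SE values are the reconstructed numbers from the table above, so the output is approximate.

```python
# Sketch: t = B / SE for each slope, with two-tailed p-values (df = 95).
# Coefficient and SE values are the reconstructed slide numbers.
from scipy import stats

df_resid = 95
coefs = {"VIOLWIT": (0.038, 0.018), "SOCSUPP": (-0.076, 0.043), "STRESS": (0.272, 0.106)}

for name, (b, se) in coefs.items():
    t = b / se
    p = 2 * stats.t.sf(abs(t), df_resid)
    print(f"{name}: t = {t:.2f}, p = {p:.3f}")
```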