Lecture 20: Multiple linear regression


1 Lecture 20: Multiple linear regression Statistics 101 Mine Çetinkaya-Rundel April 5, 2012

2 Announcements Project proposals due Sunday midnight: Response variable: numeric. Explanatory variables: at least one numeric and at least one categorical variable. Total of at least 3 variables; the more the merrier. You might want to make a few quick plots of your dataset to see if the relationships between your response and explanatory variables are linear (or could be made linear using transformations). If a relationship is not linear, you may want to consider another dataset; otherwise your findings will be inconclusive.

3 Recap Review question One waiter recorded information about each tip he received over a period of a few months. Below is the regression model for predicting tip amount from total bill amount (both in $). Which of the following is correct? (Intercept) total bill (a) For each $1 increase in total bill amount, we would expect the tip to increase on average by 92 cents. (b) The regression model is tip = total bill. (c) For a bill amount of $0 we would expect the tip amount to be 11 cents on average. (d) The explanatory variable is tip amount and the response variable is total bill amount.

4 Outline: 1 Multiple regression: Categorical variables with two levels; Many variables in a model; Adjusted R²

5 Multiple regression Simple linear regression: bivariate, two variables: y and x. Multiple linear regression: multiple variables: y and x1, x2, ...

6 Categorical variables with two levels GPA vs. Greek Relationship between belonging to a Greek organization or an SLG and GPA, based on the class survey: [Boxplots of GPA (scale up to 4.0) for greek = no vs. greek = yes.]

7 Categorical variables with two levels GPA vs. Greek - linear model
gpa_greek = lm(gpa ~ greek, data = survey)
summary(gpa_greek)
Call: lm(formula = gpa ~ greek, data = survey)
Residuals: Min 1Q Median 3Q Max
Coefficients: (Intercept) < 2e-16 *** greekyes **
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: on 203 degrees of freedom (13 observations deleted due to missingness)
Multiple R-squared: , Adjusted R-squared:
F-statistic: on 1 and 203 DF, p-value:

8 Categorical variables with two levels GPA vs. Greek - linear model (cont.) (Intercept) greek:yes The variable greek has two levels: yes and no. no is the reference level, hence it is not shown in the output. Linear model: ĝpa = 3.53 + 0.11 × greek:yes. Intercept: The estimated mean GPA of students who do not belong to a Greek organization is 3.53. This is the value we get if we plug in 0 for the explanatory variable. Slope: The estimated mean GPA of students who belong to a Greek organization is 0.11 higher than those who do not. Then, the estimated mean GPA of students who do belong to a Greek organization is 3.53 + 0.11 = 3.64. This is the value we get if we plug in 1 for the explanatory variable.
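The plug-in-0-or-1 logic for a two-level categorical predictor can be sketched in Python (the course itself works in R). The coefficients below use the slide's estimates: membership adds 0.11 to the mean GPA, and members' fitted mean is 3.64, which implies an intercept of about 3.53.

```python
# Predicting GPA from a two-level categorical predictor (greek: "no"/"yes").
# Coefficients are the slide's rounded estimates, used purely for illustration.
INTERCEPT = 3.53        # estimated mean GPA for the reference level ("no")
SLOPE_GREEK_YES = 0.11  # estimated difference for greek == "yes"

def predict_gpa(greek):
    """Plug in 0 for the reference level, 1 for the other level."""
    indicator = 1 if greek == "yes" else 0
    return INTERCEPT + SLOPE_GREEK_YES * indicator

print(predict_gpa("no"))   # reference level: just the intercept
print(predict_gpa("yes"))  # intercept + slope
```

The indicator does nothing more than switch the slope term on or off, which is why the fitted values are exactly the two group means.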

9 Categorical variables with two levels GPA vs. Greek - inference (Intercept) greek:yes Hypotheses: H0: β1 = 0, HA: β1 ≠ 0. p-value = 0.01. The data provide convincing evidence that the true slope parameter is different than 0; hence, there appears to be a statistically significant relationship between belonging to a Greek organization or an SLG and GPA.

10 Categorical variables with two levels Another approach? Clicker question What other approach could we use to evaluate the relationship between belonging to a Greek organization or an SLG and GPA? Remember: 205 students reported their GPA, 87 belong to a Greek organization or an SLG and 118 do not. (a) chi-squared test of independence (b) chi-squared test of goodness-of-fit (c) test for comparing means of dependent groups (d) test for comparing means of independent groups (e) test for comparing proportions of independent groups
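Comparing means of independent groups and regression on a 0/1 indicator are two views of the same comparison: the least-squares intercept is the reference-group mean and the slope is exactly the difference in group means. A quick numerical check in Python on made-up GPA values (not the class survey data):

```python
# With one binary (0/1) predictor, the least-squares slope equals the
# difference in group means, and the intercept equals the reference-group mean.
# The data below are made up for illustration.
gpa = [3.2, 3.5, 3.8, 3.4, 3.9, 3.7]
greek = [0, 0, 0, 1, 1, 1]  # 0 = no, 1 = yes

n = len(gpa)
mean_x = sum(greek) / n
mean_y = sum(gpa) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(greek, gpa)) / \
        sum((x - mean_x) ** 2 for x in greek)
intercept = mean_y - slope * mean_x

mean_no = sum(y for x, y in zip(greek, gpa) if x == 0) / greek.count(0)
mean_yes = sum(y for x, y in zip(greek, gpa) if x == 1) / greek.count(1)

print(intercept, slope)             # reference-group mean, difference in means
print(mean_no, mean_yes - mean_no)  # the same two numbers
```

This is why a t-test for two independent means and the t-test on the regression slope give the same conclusion.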

11 Categorical variables with two levels Another example... Relationship between how politically active students are (on a scale of 1 to 5) and whether they are pro-life or pro-choice: [Boxplots of politic_active (1-5) for life_choice = pro-choice vs. pro-life.]

12 Categorical variables with two levels Identifying the reference level Clicker question Based on the regression output given below, what is the reference level of the variable life choice? (Intercept) life choice:pro-life (a) pro-choice (b) pro-life (c) neither (d) cannot tell from the information given

13 Categorical variables with two levels Identifying the reference level Clicker question Based on the regression output, which of the below is correct? (Intercept) life choice:pro-life The estimated mean political activity level of students who are (a) pro-choice is 0.06 higher than those who are pro-life (b) pro-choice is 0.06 lower than those who are pro-life (c) pro-life is 0.06 lower than those who are pro-choice (d) pro-life is 0.06 higher than those who are pro-choice

14 Categorical variables with two levels Identifying the reference level Clicker question Does whether or not a student is pro-life or pro-choice have a statistically significant relationship with their political activity level? (a) no (b) yes (Intercept) life choice:pro-life (c) cannot tell from the information given

15 Many variables in a model Weights of books [Table of weight (g), volume (cm³), and cover type for 15 books: 7 hardcover (hc) and 8 paperback (pb), alongside a book diagram labeled with dimensions l, w, h.]

16 Many variables in a model Weights of books (cont.) Clicker question The scatterplot shows the relationship between weights and volumes of books, as well as the regression output: weight = volume, R² = 80%. Which of the following is correct? (a) Weights of 80% of the books can be predicted accurately using this model. (b) Books that are 10 cm³ over average are expected to weigh 7 g over average. (c) The correlation between weight and volume is R = (d) The model underestimates the weight of the book with the highest volume.

17 Many variables in a model If you'd like to replicate the analysis in R...
install.packages("DAAG")
library(DAAG)
data(allbacks)
From: Maindonald, J.H. and Braun, W.J. (2007) Data Analysis and Graphics Using R, 2nd ed.

18 Many variables in a model Modeling weights of books using volume somewhat abbreviated output...
m1 = lm(weight ~ volume, data = allbacks)
summary(m1)
Coefficients: (Intercept) volume e-06
Residual standard error: on 13 degrees of freedom
Multiple R-squared: , Adjusted R-squared:
F-statistic: on 1 and 13 DF, p-value: 6.262e-06

19 Many variables in a model Weights of hardcover and paperback books Can you identify a trend in the relationship between volume and weight of hardcover and paperback books? [Scatterplot of weight (g) vs. volume (cm³), with hardcover and paperback books marked separately.]

20 Many variables in a model Modeling weights of books using volume and cover type
m2 = lm(weight ~ volume + cover, data = allbacks)
summary(m2)
Coefficients: (Intercept) ** volume e-08 *** cover:pb ***
Residual standard error: 78.2 on 12 degrees of freedom
Multiple R-squared: , Adjusted R-squared:
F-statistic: on 2 and 12 DF, p-value: 1.455e-07

21 Many variables in a model Determining the reference level Clicker question Based on the regression output below, which level of cover is the reference level? Note that pb: paperback. (Intercept) volume cover:pb (a) paperback (b) hardcover

22 Many variables in a model Determining the reference level Clicker question Which of the below correctly describes the roles of variables in this regression model? (Intercept) volume cover:pb (a) response: weight, explanatory: volume, paperback cover (b) response: weight, explanatory: volume, hardcover cover (c) response: volume, explanatory: weight, cover type (d) response: weight, explanatory: volume, cover type

23 Many variables in a model Linear model (Intercept) volume cover:pb
weight = 198 + 0.72 volume − 184 cover:pb
1 For hardcover books: plug in 0 for cover: weight = 198 + 0.72 volume − 184 × 0 = 198 + 0.72 volume
2 For paperback books: plug in 1 for cover: weight = 198 + 0.72 volume − 184 × 1 = 14 + 0.72 volume
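The same plug-in arithmetic can be written out in Python (instead of the course's R), using the rounded coefficients interpreted on the next slide (intercept ≈ 198 g, volume slope 0.72 g/cm³, paperback shift ≈ −184 g), purely for illustration:

```python
# Multiple regression prediction with one numeric and one two-level
# categorical predictor. Coefficients are the slides' rounded estimates.
INTERCEPT = 198.0    # hardcover is the reference level
B_VOLUME = 0.72      # grams per cm^3, all else held constant
B_COVER_PB = -184.0  # shift for paperback relative to hardcover

def predict_weight(volume_cm3, cover):
    """cover: 'hc' (reference, indicator 0) or 'pb' (indicator 1)."""
    pb = 1 if cover == "pb" else 0
    return INTERCEPT + B_VOLUME * volume_cm3 + B_COVER_PB * pb

# The two cases from the slide, at a volume of 600 cm^3:
print(predict_weight(600, "hc"))  # hardcover line: 198 + 0.72 * 600
print(predict_weight(600, "pb"))  # paperback line: 14 + 0.72 * 600
```

Geometrically this is two parallel lines with the same slope (0.72) whose heights differ by the cover coefficient (184 g).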

24 Many variables in a model Visualising the linear model [Scatterplot of weight (g) vs. volume (cm³) with separate, parallel fitted lines for hardcover and paperback books.]

25 Many variables in a model Interpretation of the regression coefficients (Intercept) volume cover:pb Slope of volume: All else held constant, for each 1 cm³ increase in volume we would expect weight to increase on average by 0.72 grams. Slope of cover: All else held constant, the model predicts that paperback books weigh 184 grams less than hardcover books. Intercept: Hardcover books with no volume are expected on average to weigh 198 grams. Obviously, the intercept does not make sense in context; it only serves to adjust the height of the line.

26 Many variables in a model Prediction Clicker question Which of the following is the correct calculation for the predicted weight of a paperback book that is 600 cm 3? (Intercept) volume cover:pb (a) * * 1 (b) * * 1 (c) * * 0 (d) * * 600

27 Many variables in a model Another example: Modeling kids' test scores Predicting cognitive test scores of three- and four-year-old children using characteristics of their mothers. Data are from a survey of adult American women and their children - a subsample from the National Longitudinal Survey of Youth. [Table with columns kid_score, mom_hs, mom_iq, mom_work, mom_age; e.g. row 1 has kid_score 65, mom_hs yes.] Gelman, Hill. Data Analysis Using Regression and Multilevel/Hierarchical Models. (2007) Cambridge University Press.

28 Many variables in a model Interpreting the slope What is the correct interpretation of the slope for mom's IQ? (Intercept) mom hs:yes mom iq mom work:yes mom age

29 Many variables in a model Interpreting the intercept What is the correct interpretation of the intercept? (Intercept) mom hs:yes mom iq mom work:yes mom age

30 Many variables in a model Interpreting the slope Clicker question What is the correct interpretation of the slope for mom work? (Intercept) mom hs:yes mom iq mom work:yes mom age All else being equal, kids whose moms worked during the first three years of the kid's life (a) are estimated to score 2.54 points lower (b) are estimated to score 2.54 points higher than those whose moms did not work.

31 Adjusted R² Revisit: Modeling poverty [Scatterplot matrix of poverty, metro_res, white, hs_grad, and female_house for the poverty dataset.]

32 Adjusted R² Predicting poverty using % female householder
pov_slr = lm(poverty ~ female_house, data = poverty)
(Intercept) female house
[Scatterplot of % in poverty vs. % female householder, with fitted line.]
R = 0.53, R² = 0.53² = 0.28

33 Adjusted R² Another look at R² R² can be calculated in three ways:
1 square the correlation coefficient of x and y (how we have been calculating it): > cor(poverty$poverty, poverty$female_house)^2
2 square the correlation coefficient of y and ŷ: > cor(poverty$poverty, pov_slr$fitted.values)^2
3 based on the definition: R² = explained variability in y / total variability in y
Using ANOVA we can calculate the explained variability and total variability in y.
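The three computations can be checked against each other in Python (the slides use R); for simple linear regression all three agree exactly. The toy data below are made up for illustration.

```python
# Verify that, for simple linear regression, these three quantities match:
#   (1) cor(x, y)^2
#   (2) cor(y, yhat)^2
#   (3) explained variability / total variability
from math import sqrt

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 2.9, 3.6, 4.8, 5.1]  # made-up data

def mean(v):
    return sum(v) / len(v)

def cor(a, b):
    ma, mb = mean(a), mean(b)
    num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    return num / sqrt(sum((ai - ma) ** 2 for ai in a) *
                      sum((bi - mb) ** 2 for bi in b))

# Least-squares fit and fitted values
mx, my = mean(x), mean(y)
b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
     sum((xi - mx) ** 2 for xi in x)
b0 = my - b1 * mx
yhat = [b0 + b1 * xi for xi in x]

ss_total = sum((yi - my) ** 2 for yi in y)                    # total variability
ss_error = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))     # unexplained

r2_a = cor(x, y) ** 2
r2_b = cor(y, yhat) ** 2
r2_c = (ss_total - ss_error) / ss_total
print(r2_a, r2_b, r2_c)  # all three agree
```

The third form is the one that generalizes to multiple regression, where there is no single x to correlate with y.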

34 Adjusted R² Sum of squares
anova(pov_slr)
Df Sum Sq Mean Sq F value Pr(>F)
female house
Residuals
Total
Sum of squares of y: SS_Total = Σ(y − ȳ)² = total variability
Sum of squares of residuals: SS_Error = Σ eᵢ² = unexplained variability
Sum of squares of x: SS_Model = SS_Total − SS_Error = explained variability
R² = explained variability / total variability = 0.28

35 Adjusted R² Why bother? Why bother with another approach for calculating R² when we had a perfectly good way to calculate it as the correlation coefficient squared?

36 Adjusted R² Predicting poverty using % female hh + % white Linear model: (Intercept) female house white ANOVA: Df Sum Sq Mean Sq F value Pr(>F) female house white Residuals Total R² = explained variability / total variability = 0.29

37 Adjusted R² Does adding the variable white to the model add valuable information that wasn't provided by female house? [Scatterplot matrix of poverty, metro_res, white, hs_grad, and female_house.]

38 Adjusted R² Collinearity between explanatory variables poverty vs. % female head of household: (Intercept) female house poverty vs. % female head of household and % white: (Intercept) female house white

40 Adjusted R² Collinearity between explanatory variables (cont.) Two predictor variables are said to be collinear when they are correlated, and this collinearity complicates model estimation. Remember: predictors are also called explanatory or independent variables, so ideally they should be independent of each other. We don't like adding predictors that are associated with each other to the model, because oftentimes the addition of such a variable brings nothing new to the table. Instead, we prefer the simplest best model, i.e. the parsimonious model. While it's impossible to prevent collinearity from arising in observational data, experiments are usually designed to control for correlated predictors.

41 R² vs. adjusted R² [Table comparing R² and adjusted R² for Model 1 (SLR) and Model 2 (MLR).] When any variable is added to the model, R² increases. But if the added variable doesn't really provide any new information, or is completely unrelated, adjusted R² does not increase.

42 Adjusted R² Adjusted R²
R²_adj = 1 − (SS_Error / SS_Total) × (n − 1) / (n − p − 1)
where n is the number of cases and p is the number of predictors (explanatory variables) in the model. Because p is never negative, R²_adj will always be smaller than R². R²_adj applies a penalty for the number of predictors included in the model. Therefore, we choose models with higher R²_adj over others.
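The formula on this slide transcribes directly into Python (the course itself uses R); the sums of squares below are made up for illustration, with n = 51 cases and p = 2 predictors as in the poverty model.

```python
# Adjusted R^2 as defined on the slide:
#   R2_adj = 1 - (SS_Error / SS_Total) * (n - 1) / (n - p - 1)
def adjusted_r2(ss_error, ss_total, n, p):
    """n = number of cases, p = number of predictors."""
    return 1 - (ss_error / ss_total) * (n - 1) / (n - p - 1)

def r2(ss_error, ss_total):
    return 1 - ss_error / ss_total

# Illustrative (made-up) sums of squares. The factor (n-1)/(n-p-1) exceeds 1
# whenever p >= 1, so adjusted R^2 is never larger than plain R^2.
print(r2(70.0, 100.0))
print(adjusted_r2(70.0, 100.0, n=51, p=2))
```

Adding a useless predictor raises p (shrinking n − p − 1) without lowering SS_Error much, so R²_adj falls even though R² creeps up.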

43 Calculate adjusted R²
ANOVA: Df Sum Sq Mean Sq F value Pr(>F)
female house
white
Residuals
Total
R²_adj = 1 − (SS_Error / SS_Total) × (n − 1) / (n − p − 1) = 1 − (SS_Error / SS_Total) × 50 / 48 = 0.26


More information

Simple Linear Regression: One Qualitative IV

Simple Linear Regression: One Qualitative IV Simple Linear Regression: One Qualitative IV 1. Purpose As noted before regression is used both to explain and predict variation in DVs, and adding to the equation categorical variables extends regression

More information

Regression and the 2-Sample t

Regression and the 2-Sample t Regression and the 2-Sample t James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) Regression and the 2-Sample t 1 / 44 Regression

More information

Chapter 4: Regression Models

Chapter 4: Regression Models Sales volume of company 1 Textbook: pp. 129-164 Chapter 4: Regression Models Money spent on advertising 2 Learning Objectives After completing this chapter, students will be able to: Identify variables,

More information

Gov 2000: 9. Regression with Two Independent Variables

Gov 2000: 9. Regression with Two Independent Variables Gov 2000: 9. Regression with Two Independent Variables Matthew Blackwell Fall 2016 1 / 62 1. Why Add Variables to a Regression? 2. Adding a Binary Covariate 3. Adding a Continuous Covariate 4. OLS Mechanics

More information

22s:152 Applied Linear Regression

22s:152 Applied Linear Regression 22s:152 Applied Linear Regression Chapter 7: Dummy Variable Regression So far, we ve only considered quantitative variables in our models. We can integrate categorical predictors by constructing artificial

More information

Unit5: Inferenceforcategoricaldata. 4. MT2 Review. Sta Fall Duke University, Department of Statistical Science

Unit5: Inferenceforcategoricaldata. 4. MT2 Review. Sta Fall Duke University, Department of Statistical Science Unit5: Inferenceforcategoricaldata 4. MT2 Review Sta 101 - Fall 2015 Duke University, Department of Statistical Science Dr. Çetinkaya-Rundel Slides posted at http://bit.ly/sta101_f15 Outline 1. Housekeeping

More information

Multiple Linear Regression for the Salary Data

Multiple Linear Regression for the Salary Data Multiple Linear Regression for the Salary Data 5 10 15 20 10000 15000 20000 25000 Experience Salary HS BS BS+ 5 10 15 20 10000 15000 20000 25000 Experience Salary No Yes Problem & Data Overview Primary

More information

Stat 412/512 TWO WAY ANOVA. Charlotte Wickham. stat512.cwick.co.nz. Feb

Stat 412/512 TWO WAY ANOVA. Charlotte Wickham. stat512.cwick.co.nz. Feb Stat 42/52 TWO WAY ANOVA Feb 6 25 Charlotte Wickham stat52.cwick.co.nz Roadmap DONE: Understand what a multiple regression model is. Know how to do inference on single and multiple parameters. Some extra

More information

Announcements. Unit 4: Inference for numerical variables Lecture 4: ANOVA. Data. Statistics 104

Announcements. Unit 4: Inference for numerical variables Lecture 4: ANOVA. Data. Statistics 104 Announcements Announcements Unit 4: Inference for numerical variables Lecture 4: Statistics 104 Go to Sakai s to pick a time for a one-on-one meeting. Mine Çetinkaya-Rundel June 6, 2013 Statistics 104

More information

Introduction and Background to Multilevel Analysis

Introduction and Background to Multilevel Analysis Introduction and Background to Multilevel Analysis Dr. J. Kyle Roberts Southern Methodist University Simmons School of Education and Human Development Department of Teaching and Learning Background and

More information

Lecture 4: Multivariate Regression, Part 2

Lecture 4: Multivariate Regression, Part 2 Lecture 4: Multivariate Regression, Part 2 Gauss-Markov Assumptions 1) Linear in Parameters: Y X X X i 0 1 1 2 2 k k 2) Random Sampling: we have a random sample from the population that follows the above

More information

Garvan Ins)tute Biosta)s)cal Workshop 16/6/2015. Tuan V. Nguyen. Garvan Ins)tute of Medical Research Sydney, Australia

Garvan Ins)tute Biosta)s)cal Workshop 16/6/2015. Tuan V. Nguyen. Garvan Ins)tute of Medical Research Sydney, Australia Garvan Ins)tute Biosta)s)cal Workshop 16/6/2015 Tuan V. Nguyen Tuan V. Nguyen Garvan Ins)tute of Medical Research Sydney, Australia Introduction to linear regression analysis Purposes Ideas of regression

More information

Diagnostics and Transformations Part 2

Diagnostics and Transformations Part 2 Diagnostics and Transformations Part 2 Bivariate Linear Regression James H. Steiger Department of Psychology and Human Development Vanderbilt University Multilevel Regression Modeling, 2009 Diagnostics

More information

R 2 and F -Tests and ANOVA

R 2 and F -Tests and ANOVA R 2 and F -Tests and ANOVA December 6, 2018 1 Partition of Sums of Squares The distance from any point y i in a collection of data, to the mean of the data ȳ, is the deviation, written as y i ȳ. Definition.

More information

1-Way ANOVA MATH 143. Spring Department of Mathematics and Statistics Calvin College

1-Way ANOVA MATH 143. Spring Department of Mathematics and Statistics Calvin College 1-Way ANOVA MATH 143 Department of Mathematics and Statistics Calvin College Spring 2010 The basic ANOVA situation Two variables: 1 Categorical, 1 Quantitative Main Question: Do the (means of) the quantitative

More information

Mrs. Poyner/Mr. Page Chapter 3 page 1

Mrs. Poyner/Mr. Page Chapter 3 page 1 Name: Date: Period: Chapter 2: Take Home TEST Bivariate Data Part 1: Multiple Choice. (2.5 points each) Hand write the letter corresponding to the best answer in space provided on page 6. 1. In a statistics

More information

Correlation and Regression Notes. Categorical / Categorical Relationship (Chi-Squared Independence Test)

Correlation and Regression Notes. Categorical / Categorical Relationship (Chi-Squared Independence Test) Relationship Hypothesis Tests Correlation and Regression Notes Categorical / Categorical Relationship (Chi-Squared Independence Test) Ho: Categorical Variables are independent (show distribution of conditional

More information

Announcements. Lecture 5: Probability. Dangling threads from last week: Mean vs. median. Dangling threads from last week: Sampling bias

Announcements. Lecture 5: Probability. Dangling threads from last week: Mean vs. median. Dangling threads from last week: Sampling bias Recap Announcements Lecture 5: Statistics 101 Mine Çetinkaya-Rundel September 13, 2011 HW1 due TA hours Thursday - Sunday 4pm - 9pm at Old Chem 211A If you added the class last week please make sure to

More information

Gov 2000: 9. Regression with Two Independent Variables

Gov 2000: 9. Regression with Two Independent Variables Gov 2000: 9. Regression with Two Independent Variables Matthew Blackwell Harvard University mblackwell@gov.harvard.edu Where are we? Where are we going? Last week: we learned about how to calculate a simple

More information

Regression Analysis IV... More MLR and Model Building

Regression Analysis IV... More MLR and Model Building Regression Analysis IV... More MLR and Model Building This session finishes up presenting the formal methods of inference based on the MLR model and then begins discussion of "model building" (use of regression

More information

Data Analysis Using R ASC & OIR

Data Analysis Using R ASC & OIR Data Analysis Using R ASC & OIR Overview } What is Statistics and the process of study design } Correlation } Simple Linear Regression } Multiple Linear Regression 2 What is Statistics? Statistics is a

More information

Announcements: You can turn in homework until 6pm, slot on wall across from 2202 Bren. Make sure you use the correct slot! (Stats 8, closest to wall)

Announcements: You can turn in homework until 6pm, slot on wall across from 2202 Bren. Make sure you use the correct slot! (Stats 8, closest to wall) Announcements: You can turn in homework until 6pm, slot on wall across from 2202 Bren. Make sure you use the correct slot! (Stats 8, closest to wall) We will cover Chs. 5 and 6 first, then 3 and 4. Mon,

More information

Correlation & Simple Regression

Correlation & Simple Regression Chapter 11 Correlation & Simple Regression The previous chapter dealt with inference for two categorical variables. In this chapter, we would like to examine the relationship between two quantitative variables.

More information

Conditions for Regression Inference:

Conditions for Regression Inference: AP Statistics Chapter Notes. Inference for Linear Regression We can fit a least-squares line to any data relating two quantitative variables, but the results are useful only if the scatterplot shows a

More information

Unit 6 - Introduction to linear regression

Unit 6 - Introduction to linear regression Unit 6 - Introduction to linear regression Suggested reading: OpenIntro Statistics, Chapter 7 Suggested exercises: Part 1 - Relationship between two numerical variables: 7.7, 7.9, 7.11, 7.13, 7.15, 7.25,

More information

Multiple Regression: Chapter 13. July 24, 2015

Multiple Regression: Chapter 13. July 24, 2015 Multiple Regression: Chapter 13 July 24, 2015 Multiple Regression (MR) Response Variable: Y - only one response variable (quantitative) Several Predictor Variables: X 1, X 2, X 3,..., X p (p = # predictors)

More information

Example: Poisondata. 22s:152 Applied Linear Regression. Chapter 8: ANOVA

Example: Poisondata. 22s:152 Applied Linear Regression. Chapter 8: ANOVA s:5 Applied Linear Regression Chapter 8: ANOVA Two-way ANOVA Used to compare populations means when the populations are classified by two factors (or categorical variables) For example sex and occupation

More information

This document contains 3 sets of practice problems.

This document contains 3 sets of practice problems. P RACTICE PROBLEMS This document contains 3 sets of practice problems. Correlation: 3 problems Regression: 4 problems ANOVA: 8 problems You should print a copy of these practice problems and bring them

More information

Final Exam - Solutions

Final Exam - Solutions Ecn 102 - Analysis of Economic Data University of California - Davis March 19, 2010 Instructor: John Parman Final Exam - Solutions You have until 5:30pm to complete this exam. Please remember to put your

More information

Final Exam. Name: Solution:

Final Exam. Name: Solution: Final Exam. Name: Instructions. Answer all questions on the exam. Open books, open notes, but no electronic devices. The first 13 problems are worth 5 points each. The rest are worth 1 point each. HW1.

More information

Multiple Linear Regression. Chapter 12

Multiple Linear Regression. Chapter 12 13 Multiple Linear Regression Chapter 12 Multiple Regression Analysis Definition The multiple regression model equation is Y = b 0 + b 1 x 1 + b 2 x 2 +... + b p x p + ε where E(ε) = 0 and Var(ε) = s 2.

More information

Statistiek II. John Nerbonne. March 17, Dept of Information Science incl. important reworkings by Harmut Fitz

Statistiek II. John Nerbonne. March 17, Dept of Information Science incl. important reworkings by Harmut Fitz Dept of Information Science j.nerbonne@rug.nl incl. important reworkings by Harmut Fitz March 17, 2015 Review: regression compares result on two distinct tests, e.g., geographic and phonetic distance of

More information

Chapter 4. Regression Models. Learning Objectives

Chapter 4. Regression Models. Learning Objectives Chapter 4 Regression Models To accompany Quantitative Analysis for Management, Eleventh Edition, by Render, Stair, and Hanna Power Point slides created by Brian Peterson Learning Objectives After completing

More information

Confidence Intervals, Testing and ANOVA Summary

Confidence Intervals, Testing and ANOVA Summary Confidence Intervals, Testing and ANOVA Summary 1 One Sample Tests 1.1 One Sample z test: Mean (σ known) Let X 1,, X n a r.s. from N(µ, σ) or n > 30. Let The test statistic is H 0 : µ = µ 0. z = x µ 0

More information

Announcements. Final Review: Units 1-7

Announcements. Final Review: Units 1-7 Announcements Announcements Final : Units 1-7 Statistics 104 Mine Çetinkaya-Rundel June 24, 2013 Final on Wed: cheat sheet (one sheet, front and back) and calculator Must have webcam + audio on at all

More information

Basic Business Statistics, 10/e

Basic Business Statistics, 10/e Chapter 4 4- Basic Business Statistics th Edition Chapter 4 Introduction to Multiple Regression Basic Business Statistics, e 9 Prentice-Hall, Inc. Chap 4- Learning Objectives In this chapter, you learn:

More information

Occupy movement - Duke edition. Lecture 14: Large sample inference for proportions. Exploratory analysis. Another poll on the movement

Occupy movement - Duke edition. Lecture 14: Large sample inference for proportions. Exploratory analysis. Another poll on the movement Occupy movement - Duke edition Lecture 14: Large sample inference for proportions Statistics 101 Mine Çetinkaya-Rundel October 20, 2011 On Tuesday we asked you about how closely you re following the news

More information

Simple, Marginal, and Interaction Effects in General Linear Models

Simple, Marginal, and Interaction Effects in General Linear Models Simple, Marginal, and Interaction Effects in General Linear Models PRE 905: Multivariate Analysis Lecture 3 Today s Class Centering and Coding Predictors Interpreting Parameters in the Model for the Means

More information

Stat 411/511 ESTIMATING THE SLOPE AND INTERCEPT. Charlotte Wickham. stat511.cwick.co.nz. Nov

Stat 411/511 ESTIMATING THE SLOPE AND INTERCEPT. Charlotte Wickham. stat511.cwick.co.nz. Nov Stat 411/511 ESTIMATING THE SLOPE AND INTERCEPT Nov 20 2015 Charlotte Wickham stat511.cwick.co.nz Quiz #4 This weekend, don t forget. Usual format Assumptions Display 7.5 p. 180 The ideal normal, simple

More information

Simple linear regression

Simple linear regression Simple linear regression Business Statistics 41000 Fall 2015 1 Topics 1. conditional distributions, squared error, means and variances 2. linear prediction 3. signal + noise and R 2 goodness of fit 4.

More information

Multiple Regression Introduction to Statistics Using R (Psychology 9041B)

Multiple Regression Introduction to Statistics Using R (Psychology 9041B) Multiple Regression Introduction to Statistics Using R (Psychology 9041B) Paul Gribble Winter, 2016 1 Correlation, Regression & Multiple Regression 1.1 Bivariate correlation The Pearson product-moment

More information

Stat 135, Fall 2006 A. Adhikari HOMEWORK 10 SOLUTIONS

Stat 135, Fall 2006 A. Adhikari HOMEWORK 10 SOLUTIONS Stat 135, Fall 2006 A. Adhikari HOMEWORK 10 SOLUTIONS 1a) The model is cw i = β 0 + β 1 el i + ɛ i, where cw i is the weight of the ith chick, el i the length of the egg from which it hatched, and ɛ i

More information

Tests of Linear Restrictions

Tests of Linear Restrictions Tests of Linear Restrictions 1. Linear Restricted in Regression Models In this tutorial, we consider tests on general linear restrictions on regression coefficients. In other tutorials, we examine some

More information

Inference for Regression Inference about the Regression Model and Using the Regression Line, with Details. Section 10.1, 2, 3

Inference for Regression Inference about the Regression Model and Using the Regression Line, with Details. Section 10.1, 2, 3 Inference for Regression Inference about the Regression Model and Using the Regression Line, with Details Section 10.1, 2, 3 Basic components of regression setup Target of inference: linear dependency

More information

ECNS 561 Multiple Regression Analysis

ECNS 561 Multiple Regression Analysis ECNS 561 Multiple Regression Analysis Model with Two Independent Variables Consider the following model Crime i = β 0 + β 1 Educ i + β 2 [what else would we like to control for?] + ε i Here, we are taking

More information

(ii) Scan your answer sheets INTO ONE FILE only, and submit it in the drop-box.

(ii) Scan your answer sheets INTO ONE FILE only, and submit it in the drop-box. FINAL EXAM ** Two different ways to submit your answer sheet (i) Use MS-Word and place it in a drop-box. (ii) Scan your answer sheets INTO ONE FILE only, and submit it in the drop-box. Deadline: December

More information

Stat 401B Exam 2 Fall 2015

Stat 401B Exam 2 Fall 2015 Stat 401B Exam Fall 015 I have neither given nor received unauthorized assistance on this exam. Name Signed Date Name Printed ATTENTION! Incorrect numerical answers unaccompanied by supporting reasoning

More information

Applied Regression Analysis. Section 2: Multiple Linear Regression

Applied Regression Analysis. Section 2: Multiple Linear Regression Applied Regression Analysis Section 2: Multiple Linear Regression 1 The Multiple Regression Model Many problems involve more than one independent variable or factor which affects the dependent or response

More information

Variance Decomposition in Regression James M. Murray, Ph.D. University of Wisconsin - La Crosse Updated: October 04, 2017

Variance Decomposition in Regression James M. Murray, Ph.D. University of Wisconsin - La Crosse Updated: October 04, 2017 Variance Decomposition in Regression James M. Murray, Ph.D. University of Wisconsin - La Crosse Updated: October 04, 2017 PDF file location: http://www.murraylax.org/rtutorials/regression_anovatable.pdf

More information

Linear Regression. In this lecture we will study a particular type of regression model: the linear regression model

Linear Regression. In this lecture we will study a particular type of regression model: the linear regression model 1 Linear Regression 2 Linear Regression In this lecture we will study a particular type of regression model: the linear regression model We will first consider the case of the model with one predictor

More information

Regression Analysis II

Regression Analysis II Regression Analysis II Measures of Goodness of fit Two measures of Goodness of fit Measure of the absolute fit of the sample points to the sample regression line Standard error of the estimate An index

More information

Multiple Linear Regression CIVL 7012/8012

Multiple Linear Regression CIVL 7012/8012 Multiple Linear Regression CIVL 7012/8012 2 Multiple Regression Analysis (MLR) Allows us to explicitly control for many factors those simultaneously affect the dependent variable This is important for

More information

AP Statistics Unit 6 Note Packet Linear Regression. Scatterplots and Correlation

AP Statistics Unit 6 Note Packet Linear Regression. Scatterplots and Correlation Scatterplots and Correlation Name Hr A scatterplot shows the relationship between two quantitative variables measured on the same individuals. variable (y) measures an outcome of a study variable (x) may

More information

Data Set 8: Laysan Finch Beak Widths

Data Set 8: Laysan Finch Beak Widths Data Set 8: Finch Beak Widths Statistical Setting This handout describes an analysis of covariance (ANCOVA) involving one categorical independent variable (with only two levels) and one quantitative covariate.

More information