Topic 18: Model Selection and Diagnostics
- Branden Booker
1 Topic 18: Model Selection and Diagnostics
2 Variable Selection We want to choose a best model that is a subset of the available explanatory variables Two separate problems 1. How many explanatory variables should we use (i.e., subset size) 2. Given the subset size, which variables should we choose
3 KNNL Example Page 350, Section 9.2 n = 54 patients / cases Y: survival time (liver operation) X's (explanatory variables) are: blood clotting score, prognostic index, enzyme function test, liver function test
4 KNNL Example cont. We start with the usual plots and descriptive statistics Note that time-to-event / survival data are often heavily skewed and typically transformed with a log prior to model fitting
5 Ln Transform of Y Recall that the regression model requires Y | X to be Normally distributed, not Y marginally Better to look at residuals With data like these, the log transform reduces the influence of the long right tail and stabilizes the variance of the residuals
6 Tab delimited Data
data a1;
  infile 'U:\.www\datasets512\CH09TA01.txt' delimiter='09'x;
  input blood prog enz liver age gender alcmod alcheavy surv lsurv;
run;
Dummy variables for alcohol use (alcmod, alcheavy); lsurv = Ln(surv)
7 Data Obs blood prog enz liver age Gender alcmod alcheavy surv lsurv
9 Long right tail
12 Generate scatterplots proc corr plot=matrix; var blood prog enz liver; run; proc corr plot=scatter; var blood prog enz liver; with lsurv; run;
18 Correlation Summary Pearson Correlation Coefficients, N = 54; Prob > |r| under H0: rho = 0; variables: blood, prog, enz, liver, lsurv
19 The Two Problems in Variable Selection 1. To determine an appropriate subset size Might use adjusted R 2, C p, MSE, PRESS, AIC, SBC (BIC) 2. To determine best model of a fixed size Might use R 2
20 Adjusted R 2
R 2 by its construction is guaranteed to increase with p: SSE cannot decrease with an additional X while SSTO stays constant
Adjusted R 2 uses degrees of freedom to account for changes in p:
R2_a = 1 - ((n-1)/(n-p)) * (SSE/SSTO) = 1 - MSE/MSTO
21 Adjusted R 2 Want to find the model that maximizes R2_a Since MSTO remains constant for a given data set, this is equivalent to finding the model that minimizes MSE (details in KNNL)
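A small Python sketch of the adjusted R2 criterion (illustrative only; the slides use SAS, and the SSE/SSTO values below are made up):

```python
def adjusted_r2(sse, ssto, n, p):
    """R2_a = 1 - ((n-1)/(n-p)) * SSE/SSTO = 1 - MSE/MSTO."""
    return 1.0 - (n - 1) / (n - p) * sse / ssto

# Hypothetical values for n = 54 cases and p = 4 coefficients:
r2a = adjusted_r2(sse=3.0, ssto=12.0, n=54, p=4)  # about 0.735
```

Because MSTO is fixed for a given data set, ranking models by R2_a is the same as ranking them by MSE.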
22 C p Criterion The basic idea is to compare subset models with a full model A subset model is good if there is not substantial bias in the predicted values relative to the full model Looks at the ratio of total mean squared error and the true error variance See page for details
23 C p Criterion
C_p = SSE_p / MSE(Full) - (n - 2p)
SSE_p is based on a specific choice of p-1 explanatory variables; MSE(Full) is based on all the variables
For the full set: C_p = (n-p) - (n-2p) = p
24 Use of C p p is the number of regression coefficients including the intercept A model is good according to this criterion if C p p Rule: Pick the smallest model for which C p is smaller than p or pick the model that minimizes C p, provided the C p is not much larger than p
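The C_p computation and the full-model sanity check above can be sketched in Python (illustrative translation with made-up numbers; the slides use SAS):

```python
def mallows_cp(sse_p, mse_full, n, p):
    """C_p = SSE_p / MSE(Full) - (n - 2p), where p counts regression
    coefficients including the intercept."""
    return sse_p / mse_full - (n - 2 * p)

# For the full model, SSE_p = (n - p) * MSE(Full), so C_p reduces to
# (n - p) - (n - 2p) = p, as on the slide:
n, p, mse_full = 54, 5, 0.1
full_cp = mallows_cp((n - p) * mse_full, mse_full, n, p)  # equals p
```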
25 SBC (BIC) and AIC
Criteria based on log(likelihood) plus a penalty for more complexity
AIC: minimize n*log(SSE_p/n) + 2p
SBC: minimize n*log(SSE_p/n) + p*log(n)
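Both criteria are easy to compute from SSE alone; a Python sketch (illustrative, not the SAS implementation):

```python
import math

def aic(sse_p, n, p):
    """AIC_p = n*log(SSE_p/n) + 2p (log-likelihood form, up to a constant)."""
    return n * math.log(sse_p / n) + 2 * p

def sbc(sse_p, n, p):
    """SBC_p (BIC) = n*log(SSE_p/n) + p*log(n)."""
    return n * math.log(sse_p / n) + p * math.log(n)

# SBC penalizes parameters more heavily than AIC whenever log(n) > 2
# (i.e., n >= 8), so it tends to favor smaller models.
```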
26 Other approaches PRESS (prediction SS) For each case i, delete the case and predict Y using the model fitted to the other n-1 cases Look at the SS of observed minus predicted Want to minimize PRESS Appears to require n separate regressions, but it does not
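The reason n regressions are not needed is the deleted-residual identity d_i = e_i / (1 - h_ii), which appears later in these slides. A Python/numpy sketch of PRESS from a single fit (illustrative, not the slides' SAS code):

```python
import numpy as np

def press(X, y):
    """PRESS from one fit: sum of (e_i / (1 - h_ii))^2, using the hat
    matrix H = X (X'X)^{-1} X' with an intercept column added."""
    Xd = np.column_stack([np.ones(len(y)), X])
    H = Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T
    e = y - H @ y
    h = np.diag(H)
    return float(np.sum((e / (1.0 - h)) ** 2))
```

This matches literally refitting the model n times with one case held out each time.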
27 Variable Selection in SAS Additional proc reg model statement options useful in variable selection INCLUDE=n forces the first n explanatory variables into all models BEST=n limits the output to the best n models of each subset size or total START=n limits output to models that include at least n explanatory variables
28 Variable Selection Step-type procedures Forward selection (step up) Backward elimination (step down) Stepwise (forward selection with a backward glance) Very popular, but much better search techniques, such as all-subsets search via BEST=, are now available
29 2. Ordering models of the same subset size Use R 2 or SSE / MSE or F* This approach can lead us to consider several models that give us approximately the same predicted values May need to apply knowledge of the subject matter to make a final selection Not that important if prediction is the key goal
30 Proc Reg Code proc reg data=a1; model lsurv= blood prog enz liver/ selection=rsquare cp aic sbc b best=3; run;
31 Selection Results Number in Model R-Square C(p) AIC SBC
32 Number in Model Selection Results Parameter Estimates Intercept blood prog enz liver
33 Proc Reg Code proc reg data=a1; model lsurv= blood prog enz liver/ selection=cp aic sbc b best=3; run;
34 Selection Results Number in Model C(p) R-Square AIC SBC WARNING: selection=cp just lists the models in order based on lowest C(p), regardless of whether it is good or not
35 How to Choose with C(p) 1. Want small C(p) 2. Want C(p) near p In the original paper, it was suggested to plot C(p) versus p and consider the smallest model that satisfies these criteria Can be somewhat subjective when determining "near"
36 Proc Reg Creates data set with estimates & criteria
proc reg data=a1 outest=b1;
  model lsurv=blood prog enz liver / selection=rsquare cp aic sbc b;
run; quit;
symbol1 v=circle i=none;
symbol2 v=none i=join;
proc gplot data=b1;
  plot _Cp_*_P_ _P_*_P_ / overlay;
run;
37 Start to approach C(p)=p line here
38 Model Validation Since the data were used to generate the parameter estimates, you'd expect the model to predict the fitted Y's well Should check the model's predictive ability on a separate data set if available Various techniques of cross-validation (data split, leave one out) are possible if only one data set is available
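A minimal data-split validation sketch in Python/numpy (illustrative, not from the slides): fit OLS on a random training split and score mean squared prediction error (MSPR) on the holdout.

```python
import numpy as np

def holdout_mspr(X, y, n_train, seed=0):
    """Fit OLS on a random training split of size n_train; return the
    mean squared prediction error on the held-out cases."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    train, test = idx[:n_train], idx[n_train:]
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd[train], y[train], rcond=None)
    resid = y[test] - Xd[test] @ beta
    return float(np.mean(resid ** 2))
```

An MSPR much larger than the training MSE signals that the selected model predicts new data worse than the fit suggests.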
39 Additional Multiple Regression Diagnostics Partial regression plots Studentized deleted residuals Hat matrix diagonals DFFITS, Cook's D, DFBETAS Variance inflation factor Tolerance
40 KNNL Example Page 386, Section 10.1 Y is amount of life insurance X 1 is average annual income X 2 is a risk aversion score n = 18 managers
41 Read in the data set
data a1;
  infile '../data/ch10ta01.txt';
  input income risk insur;
run;
43 Partial regression plots Also called added variable plots or adjusted variable plots One plot for each X i
44 Partial regression plots These plots show the strength of the marginal relationship between Y and X i in the full model (recall partial correlation) They can also detect Nonlinear relationships Heterogeneous variances Outliers
45 Partial regression plots Consider the plot for X 1 Use the other X's to predict Y Use the other X's to predict X 1 Plot the residuals from the first regression vs the residuals from the second regression
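The three-step recipe above can be sketched in Python/numpy (an illustrative translation of what PROC REG's /partial option produces, not its actual code):

```python
import numpy as np

def added_variable_points(X, y, j):
    """Points for the partial regression plot of X_j: residuals of y on
    the other X's (vertical axis) vs residuals of X_j on the other X's."""
    n = X.shape[0]
    Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    H = Z @ np.linalg.inv(Z.T @ Z) @ Z.T
    e_y = y - H @ y              # y adjusted for the other X's
    e_x = X[:, j] - H @ X[:, j]  # X_j adjusted for the other X's
    return e_x, e_y
```

Regressing e_y on e_x recovers exactly the coefficient of X_j from the full model, which is why the plot shows the marginal relationship of Y and X_j given the other predictors.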
46 The partial option with proc reg proc reg data=a1; model insur=income risk /partial; run;
47 Output Analysis of Variance Source DF Sum of Squares Mean Square F Value Pr > F Model <.0001 Error Corrected Total Root MSE R-Square Dependent Mean Adj R-Sq Coeff Var
48 Output Parameter Estimates Parameter Standard Variable DF Estimate Error t Value Pr > t Intercept <.0001 income <.0001 risk
49 Curvilinear relationship
50 Can also see that here
51 Other Residuals
There are several versions of residuals
1. Our usual residuals: e_i = Y_i - Yhat_i
2. Studentized residuals: e*_i = e_i / sqrt(MSE * (1 - h_ii))
Studentized means dividing a residual by its standard error; these are approximately t(n-p) distributed
52 Studentized deleted Residual Delete case i and refit the model Compute the predicted value for case i using this refitted model Compute the studentized residual Don't do this literally, but this is the concept Results in t-distributed residuals
53 Studentized Deleted Residuals
We use the notation (i) to indicate that case i has been deleted from the model fit computations
d_i = Y_i - Yhat_i(i) is the deleted residual
Turns out d_i = e_i / (1 - h_ii)
Also Var(d_i) = Var(e_i) / (1 - h_ii)^2 = MSE_(i) / (1 - h_ii)
t_i = e_i / sqrt(MSE_(i) * (1 - h_ii))
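The same idea gives a one-fit shortcut for t_i via the standard identity MSE_(i) = ((n-p)*MSE - e_i^2/(1-h_ii)) / (n-p-1), which is not written out on the slide; a Python sketch under that assumption:

```python
import math

def rstudent(e_i, h_ii, mse, n, p):
    """Studentized deleted residual t_i = e_i / sqrt(MSE_(i)*(1-h_ii)),
    with MSE_(i) recovered from the full-fit MSE (no refitting needed)."""
    mse_del = ((n - p) * mse - e_i ** 2 / (1.0 - h_ii)) / (n - p - 1)
    return e_i / math.sqrt(mse_del * (1.0 - h_ii))
```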
54 Using Residuals When we examine the residuals, regardless of version, we are looking for Outliers Non-normal error distributions Influential observations
55 The r option and studentized residuals proc reg data=a1; model insur=income risk/r; run;
56 Output Output Statistics Obs income risk Dependent Variable Predicted Value Std Error Mean Predict Residual Std Error Residual Student Residual Cook's D
58 Cook's Distance A measure of the influence of case i on all of the Yhat's (all the cases) It is a standardized version of the sum of squared differences between the predicted values computed with and without case i Compare with F(p, n-p); concern if the distance is above the 50th percentile
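In terms of quantities already defined, Cook's distance for case i has the closed form D_i = e_i^2 * h_ii / (p * MSE * (1 - h_ii)^2); this standard formula is not shown explicitly on the slide. A Python sketch:

```python
def cooks_distance(e_i, h_ii, mse, p):
    """D_i = e_i^2 * h_ii / (p * MSE * (1 - h_ii)^2)."""
    return e_i ** 2 * h_ii / (p * mse * (1.0 - h_ii) ** 2)

# High leverage inflates D_i even for a moderate residual:
d = cooks_distance(e_i=1.0, h_ii=0.5, mse=1.0, p=2)  # D_i = 1.0 here
```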
60 The influence option and studentized deleted residuals proc reg data=a1; model insur=income risk /influence; run;
61 Output Output Statistics Hat Diag Cov DFBETAS Obs income risk Residual RStudent H Ratio DFFITS Intercept income risk
62 Hat matrix diagonals
h_ii is a measure of how much Y_i contributes to the prediction of Yhat_i:
Yhat_i = h_i1*Y_1 + h_i2*Y_2 + ... + h_in*Y_n
h_ii is sometimes called the leverage of the ith observation
It is a measure of the distance between the X values for the ith case and the means of the X values
63 Hat matrix diagonals
0 <= h_ii <= 1 and Σ(h_ii) = p
A large value of h_ii suggests the ith case is distant from the center of all X's
The average value is p/n; values far from this average point to cases that should be examined carefully
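These properties are easy to verify numerically; a numpy sketch computing the leverages (an illustrative translation, not PROC REG output):

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X'X)^{-1} X', intercept included."""
    Xd = np.column_stack([np.ones(X.shape[0]), X])
    return np.diag(Xd @ np.linalg.inv(Xd.T @ Xd) @ Xd.T)

# For simple regression h_ii = 1/n + (x_i - xbar)^2 / Sxx, so the extreme
# x values get the most leverage, and the h_ii sum to p.
```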
65 Output Output Statistics Hat Diag Cov DFBETAS Obs income risk Residual RStudent H Ratio DFFITS Intercept income risk
66 DFFITS A measure of the influence of case i on Yhat_i (a single case) Thus, it is closely related to h_ii It is a standardized version of the difference between Yhat_i computed with and without case i Concern if greater than 1 for small data sets or greater than 2*sqrt(p/n) for large data sets
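DFFITS has a closed form in quantities already defined: DFFITS_i = t_i * sqrt(h_ii / (1 - h_ii)), with t_i the studentized deleted residual. A sketch, reading the slide's large-data cutoff as the usual 2*sqrt(p/n):

```python
import math

def dffits(t_i, h_ii):
    """DFFITS_i = t_i * sqrt(h_ii / (1 - h_ii))."""
    return t_i * math.sqrt(h_ii / (1.0 - h_ii))

def dffits_cutoff(n, p):
    """Rule of thumb for large data sets: flag |DFFITS_i| > 2*sqrt(p/n)."""
    return 2.0 * math.sqrt(p / n)
```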
67 DFBETAS A measure of the influence of case i on each of the regression coefficients It is a standardized version of the difference between the regression coefficient computed with and without case i Concern if a DFBETA is greater than 1 in small data sets or greater than 2/sqrt(n) for large data sets
68 Variance Inflation Factor The VIF is related to the variance of the estimated regression coefficients We calculate it for each explanatory variable One suggested rule is that a value of 10 or more for VIF indicates excessive multicollinearity
69 Tolerance
TOL_k = 1 - R2_k, where R2_k is the squared multiple correlation from regressing X_k on all the other explanatory variables
TOL = 1/VIF
Described in comment on p 410
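The definition translates directly into a computation: regress X_k on the other predictors, form R2_k, and take VIF_k = 1/(1 - R2_k) = SSTO/SSE. A numpy sketch (illustrative, not the SAS /vif computation itself):

```python
import numpy as np

def vif(X, k):
    """VIF_k = 1 / (1 - R^2_k), where R^2_k comes from regressing X_k
    on all the other explanatory variables (with an intercept)."""
    n = X.shape[0]
    Z = np.column_stack([np.ones(n), np.delete(X, k, axis=1)])
    beta, *_ = np.linalg.lstsq(Z, X[:, k], rcond=None)
    sse = np.sum((X[:, k] - Z @ beta) ** 2)
    ssto = np.sum((X[:, k] - X[:, k].mean()) ** 2)
    return float(ssto / sse)  # equals 1 / (1 - R^2_k)
```

Orthogonal predictors give VIF = 1; near-duplicate predictors drive VIF far past the rule-of-thumb threshold of 10.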
70 Output Parameter Estimates Parameter Standard Variance Variable DF Estimate Error t Value Pr > t Tolerance Inflation Intercept < income < risk
71 Full diagnostics proc reg data=a1; model insur=income risk /r partial influence tol; id income risk; plot rstudent.*(income risk); run;
72 Plot statement inside Reg Can generate several plots within Proc Reg Need to know the symbol names; available in Table 1 once you click on the plot command inside the REG syntax r. represents the usual residuals, rstudent. represents studentized deleted residuals, p. represents predicted values
75 Last slide We went over KNNL Chapters 9 and 10 We used program topic18.sas to generate the output
Multiple Linear Regression University of California, San Diego Instructor: Ery Arias-Castro http://math.ucsd.edu/~eariasca/teaching.html 1 / 42 Passenger car mileage Consider the carmpg dataset taken from
More informationLecture 3: Inference in SLR
Lecture 3: Inference in SLR STAT 51 Spring 011 Background Reading KNNL:.1.6 3-1 Topic Overview This topic will cover: Review of hypothesis testing Inference about 1 Inference about 0 Confidence Intervals
More informationMultiple Regression and Model Building (cont d) + GIS Lecture 21 3 May 2006 R. Ryznar
Multiple Regression and Model Building (cont d) + GIS 11.220 Lecture 21 3 May 2006 R. Ryznar Model Summary b 1-[(SSE/n-k+1)/(SST/n-1)] Model 1 Adjusted Std. Error of R R Square R Square the Estimate.991
More informationEXST Regression Techniques Page 1. We can also test the hypothesis H :" œ 0 versus H :"
EXST704 - Regression Techniques Page 1 Using F tests instead of t-tests We can also test the hypothesis H :" œ 0 versus H :" Á 0 with an F test.! " " " F œ MSRegression MSError This test is mathematically
More informationData Mining and Data Warehousing. Henryk Maciejewski. Data Mining Predictive modelling: regression
Data Mining and Data Warehousing Henryk Maciejewski Data Mining Predictive modelling: regression Algorithms for Predictive Modelling Contents Regression Classification Auxiliary topics: Estimation of prediction
More informationLINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises
LINEAR REGRESSION ANALYSIS MODULE XVI Lecture - 44 Exercises Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Exercise 1 The following data has been obtained on
More informationRegression Review. Statistics 149. Spring Copyright c 2006 by Mark E. Irwin
Regression Review Statistics 149 Spring 2006 Copyright c 2006 by Mark E. Irwin Matrix Approach to Regression Linear Model: Y i = β 0 + β 1 X i1 +... + β p X ip + ɛ i ; ɛ i iid N(0, σ 2 ), i = 1,..., n
More informationAny of 27 linear and nonlinear models may be fit. The output parallels that of the Simple Regression procedure.
STATGRAPHICS Rev. 9/13/213 Calibration Models Summary... 1 Data Input... 3 Analysis Summary... 5 Analysis Options... 7 Plot of Fitted Model... 9 Predicted Values... 1 Confidence Intervals... 11 Observed
More informationSTOR 455 STATISTICAL METHODS I
STOR 455 STATISTICAL METHODS I Jan Hannig Mul9variate Regression Y=X β + ε X is a regression matrix, β is a vector of parameters and ε are independent N(0,σ) Es9mated parameters b=(x X) - 1 X Y Predicted
More informationRegression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics
Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics The session is a continuation of a version of Section 11.3 of MMD&S. It concerns
More informationRegression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics
Regression Analysis V... More Model Building: Including Qualitative Predictors, Model Searching, Model "Checking"/Diagnostics The session is a continuation of a version of Section 11.3 of MMD&S. It concerns
More information12.12 MODEL BUILDING, AND THE EFFECTS OF MULTICOLLINEARITY (OPTIONAL)
12.12 Model Building, and the Effects of Multicollinearity (Optional) 1 Although Excel and MegaStat are emphasized in Business Statistics in Practice, Second Canadian Edition, some examples in the additional
More informationSPECIAL TOPICS IN REGRESSION ANALYSIS
1 SPECIAL TOPICS IN REGRESSION ANALYSIS Representing Nominal Scales in Regression Analysis There are several ways in which a set of G qualitative distinctions on some variable of interest can be represented
More informationLinear model selection and regularization
Linear model selection and regularization Problems with linear regression with least square 1. Prediction Accuracy: linear regression has low bias but suffer from high variance, especially when n p. It
More informationSTAT 4385 Topic 06: Model Diagnostics
STAT 4385 Topic 06: Xiaogang Su, Ph.D. Department of Mathematical Science University of Texas at El Paso xsu@utep.edu Spring, 2016 1/ 40 Outline Several Types of Residuals Raw, Standardized, Studentized
More informationNotes 6. Basic Stats Procedures part II
Statistics 5106, Fall 2007 Notes 6 Basic Stats Procedures part II Testing for Correlation between Two Variables You have probably all heard about correlation. When two variables are correlated, they are
More informationSTA 108 Applied Linear Models: Regression Analysis Spring Solution for Homework #6
STA 8 Applied Linear Models: Regression Analysis Spring 011 Solution for Homework #6 6. a) = 11 1 31 41 51 1 3 4 5 11 1 31 41 51 β = β1 β β 3 b) = 1 1 1 1 1 11 1 31 41 51 1 3 4 5 β = β 0 β1 β 6.15 a) Stem-and-leaf
More informationPsychology Seminar Psych 406 Dr. Jeffrey Leitzel
Psychology Seminar Psych 406 Dr. Jeffrey Leitzel Structural Equation Modeling Topic 1: Correlation / Linear Regression Outline/Overview Correlations (r, pr, sr) Linear regression Multiple regression interpreting
More information