Assumptions of the error term, assumptions of the independent variables
1 Petra Petrovics, Renáta Géczi-Papp: Assumptions of the error term, assumptions of the independent variables. 6th seminar
2 Multiple linear regression model: a linear relationship between x1, x2, …, xp and y. Y depends on: the p independent variables x1, x2, …, xp; the error term (ε); β0, β1, …, βp are the regression coefficients. Y = β0 + β1x1 + β2x2 + … + βpxp + ε
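The slides estimate this model in SPSS; as an illustration outside the slides, the same least-squares fit can be sketched in NumPy (simulated data, hypothetical coefficient values):

```python
import numpy as np

# Illustrative sketch (simulated data, not the seminar's turnover example):
# fit Y = b0 + b1*x1 + b2*x2 + eps by ordinary least squares.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(10, 2, n)
x2 = rng.normal(5, 1, n)
eps = rng.normal(0, 1, n)                  # error term, E(eps) = 0
y = 3.0 + 1.5 * x1 - 0.8 * x2 + eps        # true coefficients chosen for the demo

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept column
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                            # estimates close to (3.0, 1.5, -0.8)
```

With enough observations and well-behaved errors, the estimates land close to the coefficients used to generate the data.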
3 Assumptions of the error term: The expected value of the error term equals 0: E(ε|X1, X2, …, Xp) = 0. Constant variance (homoscedasticity): Var(ε) = σ². The error term is uncorrelated across observations. Normally distributed error term.
4 Assumptions of the independent variables: Linear independence. Fixed values, which do not change from sample to sample. There is no measurement error. The independent variables are uncorrelated with the error term.
5 Standard linear regression model: the model obtained when the above assumptions are met. If the sample data do not meet the assumptions, more complex models and estimation procedures are required.
6 SPSS example: y - turnover, x1 - property, x2 - number of employees
7 Assumptions of the error term: The expected value of the error term equals 0: E(ε|X1, X2, …, Xp) = 0. Constant variance (homoscedasticity): Var(ε) = σ². The error term is uncorrelated across observations. Normally distributed error term.
8 1. E(ε) = 0: the positive and negative values offset each other. If the mean differs from 0, the reason may be that we omitted a significant explanatory variable. It is difficult to verify in practice. If the least squares method is valid, this condition is met.
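This can be seen numerically: when the model contains an intercept, the least-squares normal equations force the residuals to sum to zero. A small NumPy sketch with simulated data:

```python
import numpy as np

# Sketch: OLS residuals average to zero whenever an intercept is included.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 50)   # simulated data

X = np.column_stack([np.ones_like(x), x])  # intercept column included
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat
print(residuals.mean())                    # ~0 up to floating-point error
```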
9 Assumptions of the error term: The expected value of the error term equals 0: E(ε|X1, X2, …, Xp) = 0. Constant variance (homoscedasticity): Var(ε) = σ². The error term is uncorrelated across observations. Normally distributed error term.
10 2. Homoscedasticity (Var(ε) = σ²): the variance of the error term is the same for all observations. Testing: plots of residuals versus the independent variables (or the predicted value ŷ, or time); statistical tests, e.g. the Goldfeld-Quandt test (especially when the heteroscedasticity is related to one of the independent variables).
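The Goldfeld-Quandt idea can be sketched by hand: sort the observations by the suspect regressor, leave out the middle ones, and compare the residual sums of squares of the two outer groups with an F ratio. A NumPy sketch with simulated heteroscedastic data (function name and constants are illustrative):

```python
import numpy as np

# Sketch of the Goldfeld-Quandt idea for the simple regression case:
# sort by the suspect regressor, drop the middle observations, and compare
# the residual sums of squares of the two outer groups via an F ratio.
def rss(x, y):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return float(e @ e)

rng = np.random.default_rng(2)
n = 120
x = np.sort(rng.uniform(1, 10, n))
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x)       # error spread grows with x

c = 20                                           # middle observations left out
lo, hi = slice(0, (n - c) // 2), slice((n + c) // 2, n)
df = (n - c) // 2 - 2                            # group size minus 2 coefficients
F = (rss(x[hi], y[hi]) / df) / (rss(x[lo], y[lo]) / df)
print(F)   # well above 1 suggests heteroscedasticity
```

The F ratio is then compared against the F critical value with (df, df) degrees of freedom.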
11 Graphical tests for homoscedasticity. [Plots of the residuals e against xi or ŷ: homoscedastic residuals show a constant spread around zero; heteroscedastic residuals show a spread that changes with xi or ŷ.]
12 SPSS: Analyze / Regression / Linear - Plots. Available variables: dependent variable, standardized predicted value (ZPRED), standardized residual (ZRESID), deleted residual, adjusted predicted value, studentized residual, studentized deleted residual. Plot ZRESID against ZPRED to check homoscedasticity.
13 Output: the variance of the residuals is approximately constant, so homoscedasticity can be accepted.
14 If it is heteroscedastic: take the LOGARITHM of the variables (Transform / Compute Variable).
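Why the logarithm helps, in a sketch: when the noise is multiplicative (its spread grows with the level of the variable), taking logs turns it into additive noise with constant variance. Simulated illustration, not the SPSS data:

```python
import numpy as np

# Sketch: multiplicative noise (spread grows with the level) becomes additive
# constant-variance noise after taking logarithms.
rng = np.random.default_rng(3)
x = rng.uniform(1, 100, 500)
y = x * rng.lognormal(0, 0.3, 500)     # heteroscedastic: spread grows with x

log_y, log_x = np.log(y), np.log(x)    # ln(y) = ln(x) + noise
noise = log_y - log_x                  # ~ N(0, 0.3): no longer depends on x
print(noise.std())
```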
15 Assumptions of the error term: The expected value of the error term equals 0: E(ε|X1, X2, …, Xp) = 0. Constant variance (homoscedasticity): Var(ε) = σ². The error term is uncorrelated across observations. Normally distributed error term.
16 The error term is uncorrelated across observations. In the case of cross-sectional data, the observations meet the assumption of simple random sampling, so we do not have to test this hypothesis. Before making estimations from time series data, however, we need to test the residuals for autocorrelation.
17 Causes of autocorrelation: we did not include every important explanatory variable in the model (we cannot recognise the effect, there are no data, or the time series is short); the model specification is wrong, e.g. the relationship is not linear but we use linear regression; non-random measurement errors.
18 Plots to detect autocorrelation. [Plots of e_t against e_{t-1} and against t: a systematic pattern suggests that an independent variable is missing from the equation, or that we should use another type of function.]
19 The Durbin-Watson test. H0: ρ = 0 (no autocorrelation); H1: ρ ≠ 0 (autocorrelation).
d = Σ_{t=2}^{n} (e_t − e_{t−1})² / Σ_{t=1}^{n} e_t²
Limits: 0 ≤ d ≤ 4; positive autocorrelation: 0 ≤ d < 2; negative autocorrelation: 2 < d ≤ 4.
Regions on the d scale: 0 | d_L | d_U | 2 | 4−d_U | 4−d_L | 4. Below d_L: positive autocorrelation; above 4−d_L: negative autocorrelation; between d_U and 4−d_U: no problem; between d_L and d_U (or between 4−d_U and 4−d_L): the weaker problem of no decision. In that case use more variables or a larger database.
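The statistic d = Σ(e_t − e_{t−1})² / Σe_t² can be computed directly from a residual series; a NumPy sketch with simulated residuals (uncorrelated versus AR(1) with ρ = 0.8):

```python
import numpy as np

# The statistic from the slide: d = sum_{t=2..n}(e_t - e_{t-1})^2 / sum e_t^2.
def durbin_watson(e):
    e = np.asarray(e, dtype=float)
    return float(np.sum(np.diff(e) ** 2) / np.sum(e ** 2))

rng = np.random.default_rng(4)
white = rng.normal(size=500)              # uncorrelated residuals -> d near 2
ar = np.empty(500)                        # positively autocorrelated residuals
ar[0] = white[0]
for t in range(1, 500):
    ar[t] = 0.8 * ar[t - 1] + white[t]    # rho = 0.8 -> d near 2*(1 - rho)

print(durbin_watson(white), durbin_watson(ar))
```

For large n, d ≈ 2(1 − ρ), which is why values near 2 indicate no autocorrelation.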
20 The Durbin-Watson test decision table:
H1: ρ > 0 (positive autocorrelation) - accept H0 if d > d_U; reject H0 if d < d_L; no decision if d_L < d < d_U.
H1: ρ < 0 (negative autocorrelation) - accept H0 if d < 4−d_U; reject H0 if d > 4−d_L; no decision if 4−d_U < d < 4−d_L.
Source: Kerékgyártó-Mundruczó [1999]
21 Durbin-Watson critical values d_L and d_U (5% significance level), tabulated for m = 1, …, 5 explanatory variables and sample sizes from n = 15 to 100 (e.g. n = 15, m = 1: d_L = 1.08, d_U = 1.36; n = 15, m = 2: d_L = 0.95, d_U = 1.54). Source: Statisztikai képletgyűjtemény
22 SPSS Analyze / Regression / Linear - Statistics
23 With d_L = 0.95 and d_U = 1.54, the regions are 0 | 0.95 | 1.54 | 2 | 2.46 | 3.05 | 4. The observed d = 1.381 satisfies d_L < d < d_U: no decision. We need to include more variables or increase the number of observations.
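The decision rule can be written as a small function; d_L and d_U come from the critical-value table for the given n and m (a sketch mirroring the decision table above):

```python
# Sketch of the Durbin-Watson decision rule; d_L and d_U are read from the
# critical-value table for the given sample size n and number of regressors m.
def dw_decision(d, d_l, d_u):
    if d < d_l:
        return "positive autocorrelation"
    if d_l <= d <= d_u or 4 - d_u <= d <= 4 - d_l:
        return "no decision"
    if d > 4 - d_l:
        return "negative autocorrelation"
    return "no autocorrelation"

# The slide's example: d_L = 0.95, d_U = 1.54, observed d = 1.381
print(dw_decision(1.381, 0.95, 1.54))   # no decision
```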
24 Assumptions of the error term: The expected value of the error term equals 0: E(ε|X1, X2, …, Xp) = 0. Constant variance (homoscedasticity): Var(ε) = σ². The error term is uncorrelated across observations. Normally distributed error term.
25 Testing normality: plots (normally distributed errors); quantitative goodness-of-fit tests: chi-square (χ²) test, Kolmogorov-Smirnov test.
26 Graphical testing: a plot of the residuals against normally distributed values (normal probability plot). The assumption is not violated when the figure is nearly linear.
27 Goodness-of-fit test. H0: Pr(ε_j) = P_j (the distribution is normal); H1: there is a j such that Pr(ε_j) ≠ P_j.
χ² = Σ_{i=1}^{r} (f_i − n·p_i)² / (n·p_i)
Accept H0 if χ² < χ²_{1−α}(r − 1 − b), where r is the number of cells and b is the number of estimated parameters.
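The statistic can be computed by hand; a NumPy sketch with simulated residuals, normal cell probabilities from the error function, and illustrative cell boundaries:

```python
import math
import numpy as np

# Sketch of chi2 = sum_i (f_i - n*p_i)^2 / (n*p_i): f_i are observed cell
# frequencies of the standardized residuals, p_i normal cell probabilities.
def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

rng = np.random.default_rng(5)
e = rng.normal(0, 1, 1000)                       # simulated residuals; H0 true
z = (e - e.mean()) / e.std()                     # standardize

edges = np.array([-10.0, -1.5, -0.5, 0.5, 1.5, 10.0])  # illustrative cells
f, _ = np.histogram(z, bins=edges)               # observed frequencies f_i
p = np.diff([normal_cdf(b) for b in edges])      # cell probabilities p_i
chi2 = float(np.sum((f - len(e) * p) ** 2 / (len(e) * p)))
print(chi2)   # compare with the chi-square critical value, df = r - 1 - b
```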
28 SPSS: Analyze / Regression / Linear - Plots. Available variables: dependent variable, standardized predicted value, standardized residual, deleted residual, adjusted predicted value, studentized residual, studentized deleted residual. Check the Histogram option.
29 Output: the bell-shaped normal curve with mean 0 and standard deviation 1 is overlaid on the histogram. The residuals are approximately normal (but not definitely).
30 2nd solution: Analyze / Regression / Linear - Save
31 Nonparametric test: Analyze / Nonparametric Tests / 1-Sample K-S. H0: normal distribution; H1: not normal distribution.
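SPSS reports the Kolmogorov-Smirnov statistic together with its p-value; the statistic itself, D = sup|F_n(x) − F0(x)|, can be sketched by hand with the standard-library normal CDF (simulated data):

```python
import math
import numpy as np

# Sketch of the one-sample Kolmogorov-Smirnov statistic, computed by hand:
# D = sup_x |F_n(x) - F0(x)|, with F0 the standard normal CDF applied to the
# standardized values.
def ks_statistic(e):
    z = np.sort((e - e.mean()) / e.std())
    f0 = np.array([0.5 * (1 + math.erf(v / math.sqrt(2))) for v in z])
    n = len(z)
    emp_hi = np.arange(1, n + 1) / n     # empirical CDF just after each point
    emp_lo = np.arange(0, n) / n         # ... and just before it
    return float(max(np.max(emp_hi - f0), np.max(f0 - emp_lo)))

rng = np.random.default_rng(6)
print(ks_statistic(rng.normal(0, 1, 500)))    # small D: normality plausible
print(ks_statistic(rng.exponential(1, 500)))  # large D: normality rejected
```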
32 Output: if the significance level (p) is smaller than 5% (0.05), we reject the null hypothesis. Here it is higher than 0.05, so we do not reject the normal distribution.
33 3rd solution: Graphs / Histogram - Display normal curve
34 Assumptions of the independent variables: Linear independence. Fixed values, which do not change from sample to sample. There is no measurement error. The independent variables are uncorrelated with the error term.
35 Testing for multicollinearity: regression models X_j = f(X_1, X_2, …, X_{j−1}, X_{j+1}, …, X_p); the multiple coefficient of determination; F-test (F > F_crit); the VIF indicator.
36 VIF (Variance Inflation Factor): VIF_j = 1 / (1 − R_j²). VIF = 1 if R_j² = 0 (the jth independent variable does not correlate with the others); VIF → ∞ if R_j² = 1 (the jth independent variable is an exact linear combination of the other independent variables). 1 < VIF < 2: weak multicollinearity; 2 ≤ VIF ≤ 5: strong, disturbing multicollinearity; VIF > 5: very strong, harmful multicollinearity.
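The VIF computation follows directly from its definition: regress each explanatory variable on the others and form VIF_j = 1/(1 − R_j²). A NumPy sketch with simulated data, where the third variable is deliberately built almost collinear:

```python
import numpy as np

# Sketch of VIF_j = 1 / (1 - R_j^2): regress each explanatory variable on all
# the others and collect the inflation factors.
def vif(X):
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(7)
x1 = rng.normal(size=300)
x2 = rng.normal(size=300)
x3 = x1 + x2 + rng.normal(scale=0.1, size=300)   # near-exact linear combination
print(vif(np.column_stack([x1, x2, x3])))        # large VIFs flag the problem
```

For the independent pair (x1, x2) alone, the factors stay close to 1; including x3 inflates them far past the harmful threshold.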
37 Correction for multicollinearity: find the offending independent variables and exclude them; or combine the strongly correlated independent variables (creating principal components), which will differ from the original variables but retain their information content.
38 SPSS Analyze / Regression / Linear - Statistics
39 Thank You For Your Attention
Ref.: SOS3003 Applied Data Analysis for Social Science, lecture notes (Erling Berge, NTNU, Spring 2010)
More information2.4.3 Estimatingσ Coefficient of Determination 2.4. ASSESSING THE MODEL 23
2.4. ASSESSING THE MODEL 23 2.4.3 Estimatingσ 2 Note that the sums of squares are functions of the conditional random variables Y i = (Y X = x i ). Hence, the sums of squares are random variables as well.
More informationPhd Program in Transportation. Transport Demand Modeling. MODULE 2 Multiple Linear Regression
Phd Program in Transportation Transport Demand Modeling Filipe Moura MODULE 2 Multiple Linear Regression Phd in Transportation / Transport Demand Modelling 1 Outline 1. Learning objectives 2. What is MR
More informationLinear Models, Problems
Linear Models, Problems John Fox McMaster University Draft: Please do not quote without permission Revised January 2003 Copyright c 2002, 2003 by John Fox I. The Normal Linear Model: Structure and Assumptions
More informationInference for Regression Inference about the Regression Model and Using the Regression Line, with Details. Section 10.1, 2, 3
Inference for Regression Inference about the Regression Model and Using the Regression Line, with Details Section 10.1, 2, 3 Basic components of regression setup Target of inference: linear dependency
More informationAgricultural and Applied Economics 637 Applied Econometrics II
Agricultural and Applied Economics 637 Applied Econometrics II Assignment 1 Review of GLS Heteroskedasity and Autocorrelation (Due: Feb. 4, 2011) In this assignment you are asked to develop relatively
More informationContents. 1 Review of Residuals. 2 Detecting Outliers. 3 Influential Observations. 4 Multicollinearity and its Effects
Contents 1 Review of Residuals 2 Detecting Outliers 3 Influential Observations 4 Multicollinearity and its Effects W. Zhou (Colorado State University) STAT 540 July 6th, 2015 1 / 32 Model Diagnostics:
More information1. The OLS Estimator. 1.1 Population model and notation
1. The OLS Estimator OLS stands for Ordinary Least Squares. There are 6 assumptions ordinarily made, and the method of fitting a line through data is by least-squares. OLS is a common estimation methodology
More informationIntroductory Econometrics
Introductory Econometrics Violation of basic assumptions Heteroskedasticity Barbara Pertold-Gebicka CERGE-EI 16 November 010 OLS assumptions 1. Disturbances are random variables drawn from a normal distribution.
More informationPartitioned Covariance Matrices and Partial Correlations. Proposition 1 Let the (p + q) (p + q) covariance matrix C > 0 be partitioned as C = C11 C 12
Partitioned Covariance Matrices and Partial Correlations Proposition 1 Let the (p + q (p + q covariance matrix C > 0 be partitioned as ( C11 C C = 12 C 21 C 22 Then the symmetric matrix C > 0 has the following
More information