- Sherman Chambers
- 5 years ago
1 Review - Interpreting the Regression Coefficients. If we estimate $\hat{y} = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$, it can be shown that $\hat\beta_1 = \sum_{i=1}^n \hat{r}_{i1} y_i \big/ \sum_{i=1}^n \hat{r}_{i1}^2$, where the $\hat{r}_{i1}$ are the residuals obtained when we estimate the regression $\hat{x}_1 = \hat\gamma_0 + \hat\gamma_2 x_2$. The estimated effect of $x_1$ on $y$ equals the (simple regression) estimated effect of the part of $x_1$ that is not explained by $x_2$. Note that the average of the residuals is always 0, hence the expression for the simple linear regression estimator simplifies. This interpretation holds in general (with more variables). Multiple Linear Regression 1
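This partialling-out result (the Frisch-Waugh-Lovell theorem) can be checked numerically. The sketch below uses simulated data; all variable names and numbers are illustrative. It regresses x1 on x2, plugs the residuals into the simple-regression formula, and compares the result with the coefficient from the full multiple regression:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)        # x1 correlated with x2
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Full multiple regression: y on [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# Partialling out: residuals r1 from regressing x1 on [1, x2]
Z = np.column_stack([np.ones(n), x2])
gamma = np.linalg.lstsq(Z, x1, rcond=None)[0]
r1 = x1 - Z @ gamma

# Simple-regression formula using the residuals (their mean is ~0)
beta1_fwl = (r1 @ y) / (r1 @ r1)
print(abs(beta[1] - beta1_fwl) < 1e-8)    # True: the two estimates coincide
```

The agreement is exact (up to floating-point error) in any finite sample, not just on average, which is what makes the interpretation on the slide valid.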
2 Review - Conditions under which exclusion of variables preserves unbiasedness of estimators. Estimate the following regressions: $\tilde{y} = \tilde\beta_0 + \tilde\beta_1 x_1$ and $\hat{y} = \hat\beta_0 + \hat\beta_1 x_1 + \hat\beta_2 x_2$. The estimators are related by $\tilde\beta_1 = \hat\beta_1 + \hat\beta_2 \tilde\delta_1$, where $\tilde\delta_1$ is the slope from regressing $x_2$ on $x_1$. If $\hat\beta_2 = 0$, then $\tilde\beta_1 = \hat\beta_1$ (check the first order conditions). If $x_1$ and $x_2$ are uncorrelated, then $\tilde\beta_1 = \hat\beta_1$. However, in general it will be the case that $\tilde\beta_1 \neq \hat\beta_1$.
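The relation between the short- and long-regression estimates holds as an exact algebraic identity in any sample, which can be checked directly. A minimal sketch on simulated data (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)        # x2 correlated with x1
y = 1.0 + 1.5 * x1 + 0.7 * x2 + rng.normal(size=n)

# Long regression: y on [1, x1, x2]
X = np.column_stack([np.ones(n), x1, x2])
b_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# Short regression: y on [1, x1] (x2 excluded)
S = np.column_stack([np.ones(n), x1])
b_tilde = np.linalg.lstsq(S, y, rcond=None)[0]

# Auxiliary regression: x2 on [1, x1]
d = np.linalg.lstsq(S, x2, rcond=None)[0]

# Identity: tilde_beta1 = hat_beta1 + hat_beta2 * delta1
print(abs(b_tilde[1] - (b_hat[1] + b_hat[2] * d[1])) < 1e-8)   # True
```

Setting either $\hat\beta_2 = 0$ or $\tilde\delta_1 = 0$ (uncorrelated regressors) in the identity gives the two special cases on the slide.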
3 Review - More or Less Variables? In general, and assuming MLR.1 to MLR.4 hold for as many variables as those under consideration: If we do not include a variable and this variable is uncorrelated with the included regressors, then the OLS estimators will be unbiased. Remember, if the other factors (in $u$) are uncorrelated with the regressors, we can still interpret the estimated effects as ceteris paribus effects. If we do not include a variable and this variable is correlated with the included regressors, then the OLS estimators will be biased, except if the coefficient of the excluded variable is 0 in the full model.
4 So, always more variables? Even if they are irrelevant (or almost irrelevant) and therefore do not induce bias in the other estimators? No! Why? Variances of the estimators can become large! One can show, under MLR.1 to MLR.5, that $\mathrm{Var}(\hat\beta_j) = \dfrac{\sigma^2}{SST_j (1 - R_j^2)}$, $j = 1, 2, \dots, k$, where $SST_j = \sum_{i=1}^n (x_{ij} - \bar{x}_j)^2$ is the total sample variation in $x_j$ and $R_j^2$ is the coefficient of determination from regressing $x_j$ on all the other regressors. It tells us how much the other regressors explain $x_j$.
5 Understanding OLS Variances. $\mathrm{Var}(\hat\beta_j) = \dfrac{\sigma^2}{SST_j (1 - R_j^2)}$, $j = 1, 2, \dots, k$. Strong linear relations among the independent variables are harmful: a larger $R_j^2$ implies a larger variance for the estimators (near multicollinearity). If some irrelevant variable is uncorrelated with the remaining regressors, then including it leaves the variances unchanged (not an interesting case). Typically, variables that you think would be useful but turn out to seem irrelevant are highly correlated with variables already included. This is undesirable, as the variances of the estimators become large. So, avoid including these variables, since the estimators for the other coefficients will be unbiased and display a smaller variance. A larger $\sigma^2$ implies a larger variance of the OLS estimators. A larger $SST_j$ implies a smaller variance of the estimators ($SST_j$ increases with sample size, so in large samples we should not be too worried!)
6 Review - The Gauss-Markov Theorem. Under MLR.1 to MLR.5 (the so-called Gauss-Markov assumptions) it can be shown that OLS is BLUE: the Best Linear Unbiased Estimator. Thus, if the 5 assumptions are presumed to hold, use OLS. No other linear and unbiased estimator has a variance smaller than OLS. Variances here are matrices: we are saying that $\mathrm{Var}(\tilde\beta) - \mathrm{Var}(\hat\beta_{OLS})$ is a positive semi-definite matrix (which implies that each individual OLS parameter estimator has a variance no larger than that of any other linear unbiased estimator of that parameter).
7 Inference in the Multiple Linear Regression Model
8 Inference in the multiple linear regression model. Suppose you want to test whether a variable is important in explaining variation in the dependent variable: e.g., is the effect of tenure on wages statistically significant (i.e., different from zero)? Is the effect of height on wages statistically significant? Or suppose you want to test whether a coefficient has a particular value: e.g., is the effect of one additional year of schooling on expected monthly wages equal to 200? We need to take into account the sampling distribution of our estimators. We will check whether, under the maintained hypothesis (or null hypothesis), the observed values of certain test statistics are likely. If they are not, we say we reject the null. Inference 8
9 Inference in the multiple linear regression model. Assumption MLR.6 (Normality): the distribution of the population error $u$ is independent of $x_1, x_2, \dots, x_k$, and $u$ is normally distributed with mean 0 and variance $\sigma^2$. We write: $u \sim \mathrm{Normal}(0, \sigma^2)$. Independence is stronger than MLR.4 (zero conditional mean); it implies MLR.4. Also, normality and independence imply MLR.5, so all the results regarding unbiasedness and variance of the estimators remain valid. Normality is unrealistic in many cases (e.g., wages cannot be negative, but normality of $u$ could deliver negative wages). However, most results would hold in large samples without the normality assumption.
10 Classical Linear Model Assumptions. MLR.1 through MLR.6 are the Classical Linear Model assumptions. With these assumptions, one can prove that the OLS estimators are the minimum variance unbiased estimators: no other unbiased estimator has a variance smaller than OLS.
11 Distribution of OLS estimators. Under MLR.1 through MLR.6 it is straightforward to show that $y \mid x \sim \mathrm{Normal}(\beta_0 + \beta_1 x_1 + \dots + \beta_k x_k,\ \sigma^2)$. Also, since the OLS estimators are a linear function of the error term $u$, then (conditional on the x's): $\hat\beta_j \sim \mathrm{Normal}(\beta_j, \mathrm{Var}(\hat\beta_j))$, so that $\dfrac{\hat\beta_j - \beta_j}{\mathrm{sd}(\hat\beta_j)} \sim \mathrm{Normal}(0, 1)$, where sd stands for standard deviation (square root of the variance, derived in previous classes).
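A small Monte Carlo can illustrate this sampling distribution: drawing many samples with the regressors held fixed, the OLS slope estimates center on the true $\beta_1$ with spread matching $\mathrm{sd}(\hat\beta_1)$. The design below (sample size, number of replications, parameter values) is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps, sigma = 100, 2000, 1.0
x1 = rng.normal(size=n)                        # regressors fixed across samples
X = np.column_stack([np.ones(n), x1])
beta = np.array([1.0, 2.0])                    # true coefficients

# Theoretical sd(beta1_hat) = sigma * sqrt([(X'X)^{-1}]_{11})
sd_theory = sigma * np.sqrt(np.linalg.inv(X.T @ X)[1, 1])

# Re-draw the error term many times and re-estimate the slope each time
draws = np.empty(reps)
for r in range(reps):
    y = X @ beta + rng.normal(scale=sigma, size=n)
    draws[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]

print(draws.mean(), sd_theory, draws.std())
```

The simulated mean is very close to the true slope 2.0 (unbiasedness) and the simulated standard deviation is very close to `sd_theory`, as the normal sampling distribution above implies.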
12 Distribution of OLS estimators. Now, the $\sigma^2$ that appears in the expression for the standard deviation of the estimators must be estimated. Also, conditional on the x's, $(n-k-1)\hat\sigma^2/\sigma^2 \sim \chi^2_{n-k-1}$, which implies: $\dfrac{\hat\beta_j - \beta_j}{\mathrm{se}(\hat\beta_j)} = \dfrac{(\hat\beta_j - \beta_j)/\mathrm{sd}(\hat\beta_j)}{\sqrt{\hat\sigma^2/\sigma^2}} = \dfrac{\mathrm{Normal}(0,1)}{\sqrt{\chi^2_{n-k-1}/(n-k-1)}} \sim t_{n-k-1}$. Therefore, conditional on the x's, we have: $\dfrac{\hat\beta_j - \beta_j}{\mathrm{se}(\hat\beta_j)} \sim t_{n-k-1}$. Degrees of freedom: $n-k-1$ (for large $n$ this is similar to a Normal(0,1)).
13 Performing a test on a coefficient. 1 - Set the null hypothesis (and the alternative), e.g., $H_0: \beta_j = 0$ (coefficient on experience in our wage regression) and $H_1: \beta_j > 0$. 2 - Choose the significance level $\alpha$ (the probability of rejecting the null if the null is actually true), e.g., $\alpha = 0.05$. 3 - Look at the sampling distribution of the test statistic $t$ (a random variable) involving the parameter: $t = \dfrac{\hat\beta_j - \beta_j}{\mathrm{se}(\hat\beta_j)} \sim t_{n-k-1}$. Under the null hypothesis, the test statistic should be small across samples. Reject the null if the observed value of the test statistic is very unlikely (very large).
14 Performing a test on a coefficient. 4 - For one-sided tests where the alternative is favoured if $t_{obs}$ is large and positive (e.g., $H_1: \beta_j > 0$), reject the null if the observed test statistic $t_{obs}$ is larger than $c$, where $c$ is implicitly given by: $\mathrm{Prob}[t > c \mid H_0 \text{ is true}] = \alpha$. For one-sided tests where the alternative is favoured if $t_{obs}$ is large and negative (e.g., $H_1: \beta_j < 0$), reject the null if $t_{obs}$ is smaller than $-c$, where $c$ is implicitly given by: $\mathrm{Prob}[t < -c \mid H_0 \text{ is true}] = \alpha$. For two-sided tests, where the alternative is favoured if $t_{obs}$ is large in absolute value (e.g., $H_1: \beta_j \neq 0$), reject the null if the absolute value of the observed test statistic $|t_{obs}|$ is larger than $c$, where $c$ is implicitly given by: $\mathrm{Prob}[|t| > c \mid H_0 \text{ is true}] = \alpha$.
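These rejection rules can be sketched in code. The estimate, standard error, and sample size below are hypothetical illustrative numbers; the critical values come from the $t_{n-k-1}$ distribution:

```python
from scipy.stats import t

n, k = 1000, 3
df = n - k - 1

# Hypothetical estimate and standard error (illustrative numbers only)
beta_hat, se, beta_null = 0.10, 0.04, 0.0
t_obs = (beta_hat - beta_null) / se        # = 2.5

alpha = 0.05
c_one = t.ppf(1 - alpha, df)        # one-sided critical value (~1.65 for large df)
c_two = t.ppf(1 - alpha / 2, df)    # two-sided critical value (~1.96 for large df)

print(t_obs > c_one)        # True: reject H0 against H1: beta > 0
print(abs(t_obs) > c_two)   # True: reject H0 against H1: beta != 0
```

Note that the one-sided critical value is smaller than the two-sided one at the same $\alpha$, so a one-sided test rejects more easily when the sign of the effect is right.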
15 One-Sided Alternative. $y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_k x_{ik} + u_i$, $H_0: \beta_j = 0$, $H_1: \beta_j > 0$. Test statistic: $t = (\hat\beta_j - \beta_j)/\mathrm{se}(\hat\beta_j) \sim t_{n-k-1}$. [Figure: distribution of the test statistic under the null; for $t_{obs}$ below the critical value $c$ (probability $1-\alpha$) we fail to reject the null, while for $t_{obs} > c$ (upper tail of mass $\alpha$) we reject the null.] Inference 15
16 Two-Sided Alternatives. $y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_k x_{ik} + u_i$, $H_0: \beta_j = 0$, $H_1: \beta_j \neq 0$. Test statistic: $t = (\hat\beta_j - \beta_j)/\mathrm{se}(\hat\beta_j) \sim t_{n-k-1}$. [Figure: distribution of the test statistic under the null; we fail to reject for $-c < t_{obs} < c$ (probability $1-\alpha$) and reject in either tail beyond $-c$ or $c$ (mass $\alpha/2$ each).] Inference 16
17 Example: Hypothesis testing. Dependent variable: log of wages. The t ratios reported are the observed values of the test statistic for testing $H_0: \beta_j = 0$; e.g., 96.75 = (coefficient on education)/(its standard error). [Regression output table not reproduced.] Inference 17
18 Example: Hypothesis testing. Choose $\alpha = 0.05$. Test $H_0: \beta_{educ} = 0$ (coefficient on education in our wage regression) against $H_1: \beta_{educ} \neq 0$. $t_{obs} = 96.75$. Since $|t_{obs}| > 1.96$, we reject the null. We say the coefficient on education is significant at the 5% level. We use the Normal approximation since $n$ is large; the critical values are $c = -1.96$ and $c = 1.96$. Inference 18
19 Example: Hypothesis testing. Choose $\alpha = 0.05$. Test $H_0: \beta_{educ} = 0$ (coefficient on education in our wage regression) against $H_1: \beta_{educ} > 0$ (clearly more reasonable). $t_{obs} = 96.75 > 1.645$, so we reject the null. We use the Normal approximation since $n$ is large; the critical value is $c = 1.645$. Inference 19
20 Example: Hypothesis testing. Choose $\alpha = 0.05$. Test $H_0: \beta_{educ} = 0.07$ (coefficient on education in our wage regression) against $H_1: \beta_{educ} \neq 0.07$. $t_{obs} = 7.772$. Since $|t_{obs}| > 1.96$, we reject the null. We use the Normal approximation since $n$ is large; the critical values are $c = -1.96$ and $c = 1.96$. Inference 20
21 P-Value. Given the observed value of the t statistic, what would be the smallest significance level at which the null $H_0: \beta_j = 0$ would be rejected against the alternative $H_1: \beta_j \neq 0$? This is the P-value. It is given by $\mathrm{Prob}[|t| > |t_{obs}| \mid H_0 \text{ true}]$: a mass of P-value/2 in each tail beyond $-|t_{obs}|$ and $|t_{obs}|$, with probability $1 - \text{P-value}$ in between. If $\alpha >$ P-value, we reject the null! Inference 21
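Computing the two-sided P-value for a hypothetical observed statistic (the numbers below are illustrative, continuing the earlier sketch with $t_{obs} = 2.5$ and $n - k - 1 = 996$):

```python
from scipy.stats import t

df = 996
t_obs = 2.5                                # hypothetical observed t statistic

# Two-sided P-value: mass in both tails beyond |t_obs|
p_two = 2 * (1 - t.cdf(abs(t_obs), df))

print(p_two < 0.05)   # True: reject at the 5% level, since alpha > P-value
```

Equivalently, the test rejects at every significance level $\alpha$ above `p_two` and fails to reject at every level below it, which is exactly the "smallest significance level" definition above.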
22 Confidence intervals. A $(1-\alpha) \cdot 100\%$ confidence interval is defined as: $\hat\beta_j \pm c \cdot \mathrm{se}(\hat\beta_j)$, where $c$ is the $1 - \alpha/2$ percentile of the $t_{n-k-1}$ distribution. If the hypothesized value of a parameter ($b$) is inside the confidence interval, we would not reject the null $\beta_j = b$ against $\beta_j \neq b$ at the significance level $\alpha$. Inference 22
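A sketch of the confidence-interval computation and its duality with the two-sided test, using the same hypothetical estimate and standard error as before (all numbers illustrative):

```python
from scipy.stats import t

df, alpha = 996, 0.05
beta_hat, se = 0.10, 0.04                  # hypothetical estimate and std. error

c = t.ppf(1 - alpha / 2, df)               # 1 - alpha/2 percentile of t_{n-k-1}
lo, hi = beta_hat - c * se, beta_hat + c * se

# Duality: b inside the CI  <=>  H0: beta = b is NOT rejected at level alpha
print(lo < 0.08 < hi)        # True: b = 0.08 would not be rejected
print(not (lo < 0.0 < hi))   # True: b = 0 lies outside, so it is rejected
```

This is why a 95% interval that excludes zero and a two-sided 5%-level rejection of $H_0: \beta_j = 0$ always occur together.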
23 Testing multiple exclusion restrictions. Unrestricted model: $y = \beta_0 + \beta_1 x_1 + \dots + \beta_k x_k + u$. Restricted model: the model with the last $q$ coefficients set to zero, i.e., $H_0: \beta_{k-q+1} = \dots = \beta_k = 0$; $H_1$: Not $H_0$. Under the null: $F = \dfrac{(SSR_r - SSR_{ur})/q}{SSR_{ur}/(n-k-1)} \sim F_{q,\, n-k-1}$, where $r$ stands for restricted, $ur$ for unrestricted, and $q$ is the number of restrictions. Does the SSR increase too much when we move from the unrestricted to the restricted model? If $F_{obs}$ is too large we reject the null. Inference 23
24 Testing multiple exclusion restrictions. $H_1$: Not $H_0$. The R-squared form of the F statistic, obtained by dividing numerator and denominator above by $SST$: $F = \dfrac{(R^2_{ur} - R^2_r)/q}{(1 - R^2_{ur})/(n-k-1)}$. This is different from testing the significance of each coefficient individually!! It is a test of joint significance. Inference 24
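The SSR and R-squared forms of the F statistic are algebraically identical, which a quick numerical check confirms. The model below is simulated and purely illustrative (four regressors, with the restrictions dropping two of them):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=(n, 4))
y = 1.0 + x @ np.array([0.5, 0.0, 0.0, 0.3]) + rng.normal(size=n)

def ssr(Xm, y):
    """Sum of squared residuals from OLS of y on Xm."""
    b = np.linalg.lstsq(Xm, y, rcond=None)[0]
    e = y - Xm @ b
    return e @ e

X_ur = np.column_stack([np.ones(n), x])             # all 4 regressors
X_r = np.column_stack([np.ones(n), x[:, [0, 3]]])   # drop 2 regressors: q = 2
q, k = 2, 4
ssr_ur, ssr_r = ssr(X_ur, y), ssr(X_r, y)

# SSR form
F_ssr = ((ssr_r - ssr_ur) / q) / (ssr_ur / (n - k - 1))

# R-squared form (divide numerator and denominator by SST)
sst = np.sum((y - y.mean()) ** 2)
R2_ur, R2_r = 1 - ssr_ur / sst, 1 - ssr_r / sst
F_r2 = ((R2_ur - R2_r) / q) / ((1 - R2_ur) / (n - k - 1))

print(np.isclose(F_ssr, F_r2))   # True: the two forms coincide
```

Note $SSR_r \geq SSR_{ur}$ always (imposing restrictions can never improve the fit), so the statistic is non-negative by construction.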
25 Testing multiple exclusion restrictions: F test. Reject the null if the observed test statistic $F_{obs}$ is larger than $c$, where $c$ is implicitly given by: $\mathrm{Prob}[F > c \mid H_0 \text{ is true}] = \alpha$. [Figure: density $f(F)$ of the F distribution under the null; fail to reject for $F_{obs} < c$ (probability $1-\alpha$), reject in the upper tail of mass $\alpha$.] Inference 25
26 Testing multiple exclusion restrictions: Example. Dependent variable: log of monthly wages, n = 11064. [Regression output table not reproduced.] Inference 26
27 Testing multiple exclusion restrictions: Example α=0.05 Inference 27
28 Overall Significance of the model. Use $H_0: \beta_1 = \beta_2 = \dots = \beta_k = 0$ (the restricted model contains only the intercept); $H_1$ is: Not $H_0$. Under the null: $F = \dfrac{R^2/k}{(1 - R^2)/(n-k-1)} \sim F_{k,\, n-k-1}$. Testing general linear restrictions: in the practical sessions! Inference 28
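The overall-significance F statistic can be computed directly from the regression R-squared, since the intercept-only restricted model has $R^2_r = 0$. A sketch on simulated data (coefficients and sizes illustrative):

```python
import numpy as np
from scipy.stats import f

rng = np.random.default_rng(4)
n, k = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 0.8, -0.5, 0.0]) + rng.normal(size=n)

# R-squared of the full regression
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b
R2 = 1 - (e @ e) / np.sum((y - y.mean()) ** 2)

# Overall F statistic and its critical value at alpha = 0.05
F_obs = (R2 / k) / ((1 - R2) / (n - k - 1))
c = f.ppf(0.95, k, n - k - 1)

print(F_obs > c)   # True here: the regressors are jointly significant
```

Note that joint significance can hold even when some individual coefficients (like the last one above) are zero; the F test asks whether the regressors explain anything *together*.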
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 6 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 53 Outline of Lecture 6 1 Omitted variable bias (SW 6.1) 2 Multiple
More information5.1 Model Specification and Data 5.2 Estimating the Parameters of the Multiple Regression Model 5.3 Sampling Properties of the Least Squares
5.1 Model Specification and Data 5. Estimating the Parameters of the Multiple Regression Model 5.3 Sampling Properties of the Least Squares Estimator 5.4 Interval Estimation 5.5 Hypothesis Testing for
More informationMotivation for multiple regression
Motivation for multiple regression 1. Simple regression puts all factors other than X in u, and treats them as unobserved. Effectively the simple regression does not account for other factors. 2. The slope
More informationReview of Statistics
Review of Statistics Topics Descriptive Statistics Mean, Variance Probability Union event, joint event Random Variables Discrete and Continuous Distributions, Moments Two Random Variables Covariance and
More informationInferences for Regression
Inferences for Regression An Example: Body Fat and Waist Size Looking at the relationship between % body fat and waist size (in inches). Here is a scatterplot of our data set: Remembering Regression In
More informationAnswers to Problem Set #4
Answers to Problem Set #4 Problems. Suppose that, from a sample of 63 observations, the least squares estimates and the corresponding estimated variance covariance matrix are given by: bβ bβ 2 bβ 3 = 2
More informationProperties of the least squares estimates
Properties of the least squares estimates 2019-01-18 Warmup Let a and b be scalar constants, and X be a scalar random variable. Fill in the blanks E ax + b) = Var ax + b) = Goal Recall that the least squares
More informationECON Introductory Econometrics. Lecture 5: OLS with One Regressor: Hypothesis Tests
ECON4150 - Introductory Econometrics Lecture 5: OLS with One Regressor: Hypothesis Tests Monique de Haan (moniqued@econ.uio.no) Stock and Watson Chapter 5 Lecture outline 2 Testing Hypotheses about one
More informationChapter 8 Heteroskedasticity
Chapter 8 Walter R. Paczkowski Rutgers University Page 1 Chapter Contents 8.1 The Nature of 8. Detecting 8.3 -Consistent Standard Errors 8.4 Generalized Least Squares: Known Form of Variance 8.5 Generalized
More informationLecture 8. Using the CLR Model. Relation between patent applications and R&D spending. Variables
Lecture 8. Using the CLR Model Relation between patent applications and R&D spending Variables PATENTS = No. of patents (in 000) filed RDEP = Expenditure on research&development (in billions of 99 $) The
More informationHeteroscedasticity and Autocorrelation
Heteroscedasticity and Autocorrelation Carlo Favero Favero () Heteroscedasticity and Autocorrelation 1 / 17 Heteroscedasticity, Autocorrelation, and the GLS estimator Let us reconsider the single equation
More informationApplied Quantitative Methods II
Applied Quantitative Methods II Lecture 4: OLS and Statistics revision Klára Kaĺıšková Klára Kaĺıšková AQM II - Lecture 4 VŠE, SS 2016/17 1 / 68 Outline 1 Econometric analysis Properties of an estimator
More informationPeter Hoff Linear and multilinear models April 3, GLS for multivariate regression 5. 3 Covariance estimation for the GLM 8
Contents 1 Linear model 1 2 GLS for multivariate regression 5 3 Covariance estimation for the GLM 8 4 Testing the GLH 11 A reference for some of this material can be found somewhere. 1 Linear model Recall
More informationThe Statistical Property of Ordinary Least Squares
The Statistical Property of Ordinary Least Squares The linear equation, on which we apply the OLS is y t = X t β + u t Then, as we have derived, the OLS estimator is ˆβ = [ X T X] 1 X T y Then, substituting
More informationEconomics 113. Simple Regression Assumptions. Simple Regression Derivation. Changing Units of Measurement. Nonlinear effects
Economics 113 Simple Regression Models Simple Regression Assumptions Simple Regression Derivation Changing Units of Measurement Nonlinear effects OLS and unbiased estimates Variance of the OLS estimates
More informationRegression with a Single Regressor: Hypothesis Tests and Confidence Intervals
Regression with a Single Regressor: Hypothesis Tests and Confidence Intervals (SW Chapter 5) Outline. The standard error of ˆ. Hypothesis tests concerning β 3. Confidence intervals for β 4. Regression
More informationRecent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data
Recent Advances in the Field of Trade Theory and Policy Analysis Using Micro-Level Data July 2012 Bangkok, Thailand Cosimo Beverelli (World Trade Organization) 1 Content a) Classical regression model b)
More information