FENG CHIA UNIVERSITY ECONOMETRICS I: HOMEWORK 4. Prof. Mei-Yuan Chen Spring 2008
1. Partition and rearrange the matrix X as [x_i X_i]; that is, X_i is the matrix X excluding the column x_i. Let u_i denote the residual vector of regressing y on X_i and v_i the residual vector of regressing x_i on X_i. Define the partial correlation coefficient of y and x_i as

    r_i = u_i'v_i / [(u_i'u_i)^{1/2} (v_i'v_i)^{1/2}].

Let R_i² and R² be the coefficients of determination obtained from the regressions of y on X_i and of y on X, respectively.

(a) Apply the matrix inversion formula to show that

    I − P = (I − P_i) − (I − P_i) x_i x_i' (I − P_i) / [x_i'(I − P_i) x_i],

where P = X(X'X)^{-1}X' and P_i = X_i(X_i'X_i)^{-1}X_i'.

(b) Show that (1 − R²)/(1 − R_i²) = 1 − r_i². Use this result to verify R² − R_i² = r_i² (1 − R_i²). What does this result tell you?

(c) Let τ_i denote the t-ratio of β̂_iT, the i-th element of β̂_T obtained from regressing y on X. First show that τ_i² = (T − k) r_i² / (1 − r_i²). Use this result to verify r_i² = τ_i² / (τ_i² + T − k).

(d) Combine the results in (b) and (c) to show R² − R_i² = τ_i² (1 − R²) / (T − k). What does this result tell you?
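The identities in problem 1 can be checked numerically. The sketch below uses simulated data; the variable names and the use of NumPy are my own choices, not part of the assignment:

```python
import numpy as np

rng = np.random.default_rng(0)
T, k, i = 50, 4, 2                       # test the i-th regressor (0-based)
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
y = X @ np.array([1.0, 0.5, -0.3, 0.2]) + rng.normal(size=T)

xi, X_noti = X[:, i], np.delete(X, i, axis=1)

def resid(z, W):                         # residuals from regressing z on W
    return z - W @ np.linalg.lstsq(W, z, rcond=None)[0]

# 1(a): I - P = (I - P_i) - (I - P_i) x_i x_i'(I - P_i) / (x_i'(I - P_i)x_i)
P = X @ np.linalg.inv(X.T @ X) @ X.T
M_i = np.eye(T) - X_noti @ np.linalg.inv(X_noti.T @ X_noti) @ X_noti.T
assert np.allclose(np.eye(T) - P,
                   M_i - np.outer(M_i @ xi, M_i @ xi) / (xi @ M_i @ xi))

u, v = resid(y, X_noti), resid(xi, X_noti)       # u_i and v_i
r2 = (u @ v) ** 2 / ((u @ u) * (v @ v))          # squared partial correlation

def Rsq(W):                              # centered R^2 of regressing y on W
    e = resid(y, W)
    return 1 - e @ e / np.sum((y - y.mean()) ** 2)

R2, R2i = Rsq(X), Rsq(X_noti)
beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta
s2 = e @ e / (T - k)
tau2 = beta[i] ** 2 / (s2 * np.linalg.inv(X.T @ X)[i, i])   # squared t-ratio

assert np.isclose(R2 - R2i, r2 * (1 - R2i))                 # 1(b)
assert np.isclose(tau2, (T - k) * r2 / (1 - r2))            # 1(c)
assert np.isclose(r2, tau2 / (tau2 + T - k))                # 1(c)
assert np.isclose(R2 - R2i, tau2 * (1 - R2) / (T - k))      # 1(d)
print("all four identities verified")
```

The assertions hold exactly (up to rounding) for any data set, since each is an algebraic identity rather than a statistical approximation.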
2. Let X_1, ..., X_T be independent random variables with density function f(x; p) = (1 − p)^{x−1} p. Find the MLE for p.

3. Consider the model y_i = β_0 + β_1 x_i + ε_i, i = 1, ..., T, where the x_i are non-stochastic and the ε_i are independently distributed as N(k x_i, σ_0²). Find the MLEs for β_0 and β_1.

4. Given the model y_t = β_1 + β_2 x_{t2} + ... + β_k x_{tk} + ε_t, consider the standardized regression

    y*_t = β*_2 x*_{t2} + ... + β*_k x*_{tk} + ε*_t,

where the β*_i are known as the beta coefficients,

    y*_t = (y_t − ȳ)/s_y,   x*_{ti} = (x_{ti} − x̄_i)/s_{x_i},   ε*_t = (ε_t − ε̄)/s_y,

with s_y² = (T − 1)^{-1} Σ_t (y_t − ȳ)² and s_{x_i}² = (T − 1)^{-1} Σ_t (x_{ti} − x̄_i)².

(a) What are the relationships between β*_i and β_i? Give an interpretation of the beta coefficients.

(b) Are the t-ratios of the standardized regression different from those of the original regression?

5. Given the model y_t = β_1 x_{t1} + β_2 x_{t2} + β_3 x_{t3} + ε_t with β_1 + β_2 = α and β_1 + β_3 = −α, suppose that all the classical assumptions hold.

(a) As α is unknown, how do you test this constraint in the original model?

(b) How would you estimate α? Is your estimator α̂ the BLUE?

6. Suppose that a linear model with k explanatory variables has been estimated.

(a) Show that σ̂_T² = Centered TSS × (1 − R̄²)/(T − 1). What does this result tell you?
(b) Suppose that we want to test the hypothesis that s coefficients are zero. Show that the F-test can be written as

    φ = [(T − k + s) σ̂_c² − (T − k) σ̂_u²] / (s σ̂_u²),

where σ̂_c² and σ̂_u² are the variance estimates of the constrained and unconstrained models, respectively. Setting a = (T − k)/s, also show that

    σ̂_c² / σ̂_u² = (a + φ)/(a + 1).

(c) Based on the results in (a) and (b), what can you say if φ > 1 or φ < 1?
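Both expressions in 6(b) can be confirmed numerically. A sketch with simulated data (the setup and names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
T, k, s = 60, 5, 2                        # drop the last s regressors
X = np.column_stack([np.ones(T), rng.normal(size=(T, k - 1))])
y = X @ rng.normal(size=k) + rng.normal(size=T)

def ssr(W):                               # sum of squared residuals
    e = y - W @ np.linalg.lstsq(W, y, rcond=None)[0]
    return e @ e

ess_u, ess_c = ssr(X), ssr(X[:, : k - s])
sig2_u = ess_u / (T - k)                  # unconstrained variance estimate
sig2_c = ess_c / (T - k + s)              # constrained variance estimate

phi = ((ess_c - ess_u) / s) / (ess_u / (T - k))          # usual F-statistic
phi2 = ((T - k + s) * sig2_c - (T - k) * sig2_u) / (s * sig2_u)
a = (T - k) / s

assert np.isclose(phi, phi2)
assert np.isclose(sig2_c / sig2_u, (a + phi) / (a + 1))
print("both expressions for the F-statistic agree")
```

The two assertions are exact algebraic identities, so they hold regardless of the simulated sample.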
ECONOMETRICS I Answer Key for Homework 4 Prof. Mei-Yuan Chen Spring 2008

1. (a) Using the matrix inversion formula in Greene (1993, p. 7) and letting X = [X_i x_i], we get (writing c = x_i'(I − P_i)x_i for short)

    (X'X)^{-1} = [ X_i'X_i   X_i'x_i ]^{-1}
                 [ x_i'X_i   x_i'x_i ]

               = [ (X_i'X_i)^{-1}[I + X_i'x_i x_i'X_i (X_i'X_i)^{-1}/c]   −(X_i'X_i)^{-1} X_i'x_i / c ]
                 [ −x_i'X_i (X_i'X_i)^{-1}/c                               1/c                        ].

When X = [x_i X_i],

    (X'X)^{-1} = [ x_i'x_i   x_i'X_i ]^{-1}
                 [ X_i'x_i   X_i'X_i ]

               = [ 1/c                              −x_i'X_i (X_i'X_i)^{-1}/c                             ]
                 [ −(X_i'X_i)^{-1} X_i'x_i / c       (X_i'X_i)^{-1}[I + X_i'x_i x_i'X_i (X_i'X_i)^{-1}/c] ].

It is then easy to verify that

    I − P = I − [x_i X_i](X'X)^{-1}[x_i X_i]' = (I − P_i) − (I − P_i) x_i x_i'(I − P_i)/c.

(b) Note that u_i = (I − P_i)y and v_i = (I − P_i)x_i. Then

    (1 − R²)/(1 − R_i²) = y'(I − P)y / y'(I − P_i)y
                        = {y'(I − P_i)y − [y'(I − P_i)x_i]² / [x_i'(I − P_i)x_i]} / y'(I − P_i)y
                        = 1 − r_i².

This result holds for both centered and non-centered R².

(c) By the Frisch-Waugh-Lovell Theorem, β̂_iT = [x_i'(I − P_i)x_i]^{-1} x_i'(I − P_i)y. By (a),

    var̂(β̂_iT) = σ̂_T² [x_i'(I − P_i)x_i]^{-1} = y'(I − P)y / {(T − k)[x_i'(I − P_i)x_i]}
               = {y'(I − P_i)y − [y'(I − P_i)x_i]² / [x_i'(I − P_i)x_i]} / {(T − k)[x_i'(I − P_i)x_i]}
               = {y'(I − P_i)y · x_i'(I − P_i)x_i − [y'(I − P_i)x_i]²} / {(T − k)[x_i'(I − P_i)x_i]²}.

Therefore, since r_i² = [y'(I − P_i)x_i]² / {y'(I − P_i)y · x_i'(I − P_i)x_i},

    τ_i² = β̂_iT² / var̂(β̂_iT)
         = (T − k) [x_i'(I − P_i)y]² / {y'(I − P_i)y · x_i'(I − P_i)x_i − [x_i'(I − P_i)y]²}
         = (T − k) / (1/r_i² − 1)
         = (T − k) r_i² / (1 − r_i²).

(d) Straightforward.

2. The MLE is p̂ = 1/x̄, where x̄ = Σ_t x_t / T.

3. The MLEs are

    β̂_1 = Σ_i (x_i − x̄)(y_i − ȳ) / Σ_i (x_i − x̄)² − k,    β̂_0 = ȳ − (β̂_1 + k) x̄,

where x̄ = Σ_i x_i / T and ȳ = Σ_i y_i / T. These results should be obvious because the model can be written as y_i = β_0 + (β_1 + k)x_i + ε_i*, where ε_i* is distributed as N(0, σ_0²). Hence we can obtain the MLEs for β_0 and β_1 + k using the standard formulas.

4. As ȳ = β_1 + β_2 x̄_2 + ... + β_k x̄_k + ε̄, we have

    (y_t − ȳ)/s_y = β_2 (s_{x_2}/s_y) (x_{t2} − x̄_2)/s_{x_2} + ... + β_k (s_{x_k}/s_y) (x_{tk} − x̄_k)/s_{x_k} + (ε_t − ε̄)/s_y,

that is, y*_t = β*_2 x*_{t2} + ... + β*_k x*_{tk} + ε*_t with β*_i = β_i s_{x_i}/s_y.
This gives the standardized regression. When x*_{ti} changes by one unit, i.e., when x_{ti} changes by one standard deviation s_{x_i}, y*_t changes by β*_i, i.e., y_t changes by β*_i s_y. This standardization thus permits comparison among regression coefficients. That the t-ratios remain the same can be easily verified using a simple linear regression model.

5. The constraints imply that 2β_1 + β_2 + β_3 = 0, hence a simple t-test of this hypothesis will do. Also observe that

    y_t = β_1 x_{t1} + β_2 x_{t2} + β_3 x_{t3} + ε_t
        = β_1 x_{t1} + (α − β_1) x_{t2} − (α + β_1) x_{t3} + ε_t
        = β_1 (x_{t1} − x_{t2} − x_{t3}) + α (x_{t2} − x_{t3}) + ε_t.

The OLS estimator α̂ remains the BLUE because all the classical assumptions are still valid.

6. (a) The result follows from the fact that

    R̄² = 1 − [e'e/(T − k)] / [(y'y − Tȳ²)/(T − 1)] = 1 − σ̂_T² / [TSS/(T − 1)].

Thus R̄² increases whenever σ̂_T² decreases.

(b)

    φ = [(ESS_c − ESS_u)/s] / [ESS_u/(T − k)]
      = [(e_c'e_c − e_u'e_u)/s] / [e_u'e_u/(T − k)]
      = [(T − k + s) σ̂_c² − (T − k) σ̂_u²] / (s σ̂_u²).

It is straightforward to show that for a = (T − k)/s,

    σ̂_c² / σ̂_u² = (a + φ)/(a + 1).

(c) φ > 1 implies σ̂_c² > σ̂_u². Hence, by (a), R̄_c² < R̄_u²; that is, when φ > 1, dropping these s variables would reduce R̄². Note that whether φ is significant does not matter. Similarly, φ < 1 implies σ̂_c² < σ̂_u², and hence R̄_c² > R̄_u².
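The answer to problem 4 can be illustrated numerically: β*_i = β_i s_{x_i}/s_y, and the slope t-ratios coincide once the same residual degrees of freedom T − k are kept in the standardized regression. A sketch with simulated data (all names are mine):

```python
import numpy as np

rng = np.random.default_rng(2)
T, k = 40, 3
X = np.column_stack([np.ones(T), rng.normal(size=(T, 2))])
y = X @ np.array([2.0, 1.0, -0.5]) + rng.normal(size=T)

b = np.linalg.lstsq(X, y, rcond=None)[0]           # original regression
e = y - X @ b
s2 = e @ e / (T - k)
t = b / np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))

s_y = y.std(ddof=1)
s_x = X[:, 1:].std(axis=0, ddof=1)
ys = (y - y.mean()) / s_y                          # standardized variables
Xs = (X[:, 1:] - X[:, 1:].mean(axis=0)) / s_x

bs = np.linalg.lstsq(Xs, ys, rcond=None)[0]        # beta coefficients
es = ys - Xs @ bs
s2s = es @ es / (T - k)      # keep the original residual degrees of freedom
ts = bs / np.sqrt(s2s * np.diag(np.linalg.inv(Xs.T @ Xs)))

assert np.allclose(bs, b[1:] * s_x / s_y)          # beta_i* = beta_i s_xi/s_y
assert np.allclose(ts, t[1:])                      # slope t-ratios unchanged
print("beta coefficients and t-ratios verified")
```

Note the degrees-of-freedom convention: the standardized regression has no constant term, so its t-ratios match the original ones exactly only when σ̂² is computed with the same divisor T − k.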
FENG CHIA UNIVERSITY ECONOMETRICS I: HOMEWORK 3 Prof. Mei-Yuan Chen Spring 2008

1. Given the model y = Xβ_0 + e, where X is T × k, let β̂_T denote the OLS estimator and R_k² the resulting centered R², where the subscript k signifies a model with k explanatory variables.

(a) Show that

    R_k² = Σ_{i=2}^k β̂_iT Σ_t (x_{ti} − x̄_i) y_t / Σ_t (y_t − ȳ)²,

where β̂_iT is the i-th element of β̂_T, x_{ti} is the t-th element of the i-th explanatory variable, y_t is the t-th element of y, x̄_i = Σ_t x_{ti}/T, and ȳ = Σ_t y_t/T.

(b) Suppose that you delete an explanatory variable from the model (so that the model has k − 1 explanatory variables) and obtain R_{k−1}². Show that R_{k−1}² ≤ R_k².

2. Consider the model y = Xβ_0 + e, where X does not contain the constant term.

(a) Show that

    y'y − Tȳ² = ŷ'ŷ − T(ŷ̄)² + ê'ê − T(ê̄)² − 2T ŷ̄ ê̄,

where ŷ̄ and ê̄ denote the sample means of ŷ and ê.

(b) If we use R² = Centered RSS / Centered TSS, or R² = 1 − ESS / Centered TSS, are they bounded between zero and one? How should one compute R² in a model without a constant term?

3. Suppose that we estimate the model y = Xβ_0 + e and obtain β̂_T and the centered R².

(a) If y* = 1000y and X are used as the dependent and explanatory variables, what is the effect of this change on β̂_T and R²?

(b) If y and X* = 1000X are used as the dependent and explanatory variables, what is the effect of this change on β̂_T and R²?

(c) If y* = 1000y and X* = 1000X are used as the dependent and explanatory variables, what is the effect of this change on β̂_T and R²?

4. Find a condition under which R̄² is negative.
ECONOMETRICS I Answer Key for Homework 3 Prof. Mei-Yuan Chen Spring 2008

1. (a) Let x_t' be the t-th row of X. Then

    R_k² = Σ_t (ŷ_t − ȳ)² / Σ_t (y_t − ȳ)²
         = Σ_t [β̂_T'(x_t − x̄)] ŷ_t / Σ_t (y_t − ȳ)²
         = Σ_t β̂_T'(x_t − x̄)(ŷ_t + ê_t) / Σ_t (y_t − ȳ)²
         = Σ_{i=2}^k β̂_iT Σ_t (x_{ti} − x̄_i) y_t / Σ_t (y_t − ȳ)².

Note that this expression holds when the model has a constant term.

(b) Using the result of (a),

    R_k² = Σ_{i=2}^k β̂_iT^{(k)} Σ_t (x_{ti} − x̄_i) y_t / Σ_t (y_t − ȳ)²,
    R_{k−1}² = Σ_{i=2}^{k−1} β̂_iT^{(k−1)} Σ_t (x_{ti} − x̄_i) y_t / Σ_t (y_t − ȳ)²,

where β̂_T^{(k)} and β̂_T^{(k−1)} are the OLS estimates for the models with k and k − 1 variables, respectively. Note that the first k − 1 elements of β̂_T^{(k)} are in general different from β̂_T^{(k−1)}. Suppose that R_{k−1}² > R_k². Then the estimator β̌_T^{(k)} = [β̂_1^{(k−1)} ... β̂_{k−1}^{(k−1)} 0]' would yield the coefficient of determination R_{k−1}² for the model with k variables. This contradicts the LS principle of maximizing R². Hence R_{k−1}² ≤ R_k².

2. (a) Since y'y = ŷ'ŷ + ê'ê and ȳ = ŷ̄ + ê̄, so that Tȳ² = T(ŷ̄)² + T(ê̄)² + 2T ŷ̄ ê̄, we get

    y'y − Tȳ² = ŷ'ŷ − T(ŷ̄)² + ê'ê − T(ê̄)² − 2T ŷ̄ ê̄.

(b) When a model does not contain the constant term, the centered R² need not be bounded between 0 and 1, and the non-centered R² should be used. Note that

    R² = Centered RSS / Centered TSS > 1,  if ê'ê − T(ê̄)² − 2T ŷ̄ ê̄ < 0;
    R² = 1 − ESS / Centered TSS < 0,  if ŷ'ŷ − T(ŷ̄)² − T(ê̄)² − 2T ŷ̄ ê̄ < 0.

3. (a) β̂_T becomes 1000 β̂_T, and R² is unchanged. (b) β̂_T becomes β̂_T/1000, and R² is unchanged. (c) β̂_T and R² are unchanged.

4. R̄² < 0 when R² < (k − 1)/(T − 1).
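Two of the results above lend themselves to a quick numerical check: the decomposition in 2(a) for a model without a constant term, and the monotonicity R_{k−1}² ≤ R_k² from 1(b). A sketch with simulated data (names are mine):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 30
X = rng.normal(size=(T, 3)) + 1.0          # regressors, no constant column
y = X @ np.array([1.0, 0.5, -0.2]) + rng.normal(size=T)

b = np.linalg.lstsq(X, y, rcond=None)[0]
yhat, e = X @ b, y - X @ b

# 2(a): decomposition of the centered TSS when there is no constant term
lhs = y @ y - T * y.mean() ** 2
rhs = (yhat @ yhat - T * yhat.mean() ** 2 + e @ e - T * e.mean() ** 2
       - 2 * T * yhat.mean() * e.mean())
assert np.isclose(lhs, rhs)

# 1(b): deleting a regressor cannot raise the centered R^2
def centered_R2(W):
    r = y - W @ np.linalg.lstsq(W, y, rcond=None)[0]
    return 1 - r @ r / np.sum((y - y.mean()) ** 2)

Xc = np.column_stack([np.ones(T), X])       # models including a constant
assert centered_R2(Xc[:, :-1]) <= centered_R2(Xc) + 1e-12
print("decomposition and monotonicity verified")
```

Without a constant column the residuals need not sum to zero, which is exactly why the cross terms involving the residual mean survive in the decomposition.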
FENG CHIA UNIVERSITY ECONOMETRICS I: HOMEWORK Prof. Mei-Yuan Chen Spring 2008

1. Consider a location regression model with T observations {x_1, x_2, ..., x_T}:

    x_t = α + e_t,   t = 1, ..., T.

(a) What is X, as defined in the lecture notes, for this model?

(b) What is the OLS estimator α̂_T for α?

(c) What is the variance of the sampling distribution of α̂_T?

(d) What sample variance would you suggest for α̂_T?

2. Show algebraically the following results:

(a) β̂_T = r_yx² Σ_t (y_t − ȳ)² / Σ_t (x_t − x̄)(y_t − ȳ);

(b) β̂_T = r_yx [Σ_t (y_t − ȳ)²/T]^{1/2} / [Σ_t (x_t − x̄)²/T]^{1/2},

where β̂_T is the OLS slope estimate of the linear regression model y_t = α + β x_t + u_t, t = 1, ..., T.

3. A regression model y_t = β x_t + u_t, t = 1, 2, is considered. If u_1 and u_2 are statistically independent with common mean 0 and variance σ_u², find the sampling distributions of the following two estimators of the slope coefficient:

    β̌ = Σ_t y_t / Σ_t x_t,    β̂ = Σ_t y_t x_t / Σ_t x_t².

Show that var(β̌) > var(β̂).

4. Given data on y and x, construct a linear regression model for each equation below and explain how you can estimate the parameters α and β.

(a) y = α + β log x.
(b) y = α x^β.
(c) y = α e^{βx}.
(d) y = x/(αx − β).
(e) y = e^{α+βx} / (1 + e^{α+βx}).

5. Consider the bivariate normal distribution specified by

    f(x, y) = 1/[2π σ_x σ_y (1 − ρ²)^{1/2}] exp[Q(x, y)],

    Q(x, y) = −1/[2(1 − ρ²)] { [(x − μ_x)/σ_x]² − 2ρ [(x − μ_x)/σ_x][(y − μ_y)/σ_y] + [(y − μ_y)/σ_y]² }.

(a) What is the conditional density function of Y given X = x?

(b) What is the conditional mean of Y given X = x?

(c) What is the conditional variance of Y given X = x?
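For problem 3 above, the variance difference has the closed form var(β̌) − var(β̂) = σ_u² (x_1 − x_2)² / [(x_1 + x_2)² (x_1² + x_2²)] ≥ 0, which a short sketch confirms (the numbers are illustrative choices of mine):

```python
# Variance comparison for y_t = beta * x_t + u_t with two observations
x1, x2, sigma2 = 1.0, 3.0, 2.0            # illustrative values, chosen here

var_check = 2 * sigma2 / (x1 + x2) ** 2   # var of beta_check = (y1+y2)/(x1+x2)
var_hat = sigma2 / (x1 ** 2 + x2 ** 2)    # var of the OLS estimator beta_hat
diff = sigma2 * (x1 - x2) ** 2 / ((x1 + x2) ** 2 * (x1 ** 2 + x2 ** 2))

assert abs((var_check - var_hat) - diff) < 1e-12
assert var_check > var_hat                # strict whenever x1 != x2
print("var(beta_check) - var(beta_hat) =", diff)
```

The difference vanishes only when x_1 = x_2, in which case the two estimators coincide.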
ECONOMETRICS I Answer Key for Homework Prof. Mei-Yuan Chen Spring 2008

1. (a) In matrix form the model is

    [x_1 x_2 ... x_T]' = [1 1 ... 1]' α + [e_1 e_2 ... e_T]',

so X is the T × 1 vector of ones.

(b) Q(α) = Σ_t (x_t − α)²/T, and the first-order condition is

    dQ(α)/dα = −(2/T) Σ_t (x_t − α) = 0.

Denote the solution by α̂_T; since it satisfies Σ_t (x_t − α̂_T) = 0, we have α̂_T = Σ_t x_t/T = x̄_T.

(c) var(α̂_T) = var(x̄_T) = σ_X²/T.

(d) s²_{α̂_T} = s²_{x̄_T} = s_X²/T, where s_X² = Σ_t (x_t − x̄_T)²/(T − 1) is the sample estimate of σ_X².

2. (a)

    β̂ = Σ_i (x_i − x̄_n)(y_i − ȳ_n) / Σ_i (x_i − x̄_n)²
       = { [Σ_i (x_i − x̄_n)(y_i − ȳ_n)]² / ([Σ_i (x_i − x̄_n)²][Σ_i (y_i − ȳ_n)²]) } × { Σ_i (y_i − ȳ_n)² / Σ_i (x_i − x̄_n)(y_i − ȳ_n) }
       = r_xy² Σ_i (y_i − ȳ_n)² / Σ_i (x_i − x̄_n)(y_i − ȳ_n).

(b)

    β̂ = Σ_i (x_i − x̄_n)(y_i − ȳ_n) / Σ_i (x_i − x̄_n)²
       = { Σ_i (x_i − x̄_n)(y_i − ȳ_n) / ([Σ_i (x_i − x̄_n)²]^{1/2}[Σ_i (y_i − ȳ_n)²]^{1/2}) } × [Σ_i (y_i − ȳ_n)²]^{1/2} / [Σ_i (x_i − x̄_n)²]^{1/2}
       = r_xy [Σ_i (y_i − ȳ_n)²]^{1/2} / [Σ_i (x_i − x̄_n)²]^{1/2}.
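Both algebraic forms in item 2 can be confirmed on simulated data. A sketch (the sample and names are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 25
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

xd, yd = x - x.mean(), y - y.mean()
beta = (xd @ yd) / (xd @ xd)                        # OLS slope
r = (xd @ yd) / np.sqrt((xd @ xd) * (yd @ yd))      # sample correlation r_xy

assert np.isclose(beta, r ** 2 * (yd @ yd) / (xd @ yd))           # form (a)
assert np.isclose(beta, r * np.sqrt(yd @ yd) / np.sqrt(xd @ xd))  # form (b)
print("both slope identities hold")
```

Form (b) is the familiar β̂ = r_xy s_y/s_x, since the common 1/T (or 1/(T − 1)) factors cancel in the ratio.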
3. Since y_t = β x_t + u_t, t = 1, 2, we have

    var(β̌) = 2σ_u² / (Σ_t x_t)²,    var(β̂) = σ_u² / Σ_t x_t²,

so that

    var(β̌) − var(β̂) = 2σ_u²/(Σ_t x_t)² − σ_u²/Σ_t x_t² = σ_u² [2 Σ_t x_t² − (Σ_t x_t)²] / [(Σ_t x_t)² Σ_t x_t²].

It is easy to see that

    2 Σ_t x_t² − (Σ_t x_t)² = 2(x_1² + x_2²) − (x_1 + x_2)² = x_1² + x_2² − 2x_1x_2 = (x_1 − x_2)² ≥ 0,

which completes the proof.

4. (a) Regress y on 1 and log x.
(b) ln y = ln α + β ln x; regress ln y on 1 and ln x to get ln α̂ and β̂.
(c) ln y = ln α + βx; regress ln y on 1 and x to get ln α̂ and β̂.
(d) 1/y = α − β(1/x); regress 1/y on 1 and 1/x to get α̂ and β̂.
(e) y/(1 − y) = e^{α+βx}; regress ln[y/(1 − y)] on 1 and x to get α̂ and β̂.

5. As

    Q(x, y) = −1/[2(1 − ρ²)] { [(x − μ_x)/σ_x]² − 2ρ [(x − μ_x)/σ_x][(y − μ_y)/σ_y] + [(y − μ_y)/σ_y]² }
            = −1/[2(1 − ρ²)] { [(y − μ_y)/σ_y − ρ(x − μ_x)/σ_x]² + (1 − ρ²)[(x − μ_x)/σ_x]² }
            = −1/[2(1 − ρ²)] [(y − μ_y)/σ_y − ρ(x − μ_x)/σ_x]² − (x − μ_x)²/(2σ_x²),

we therefore have

    f(x, y) = 1/[2π σ_x σ_y (1 − ρ²)^{1/2}] exp[Q(x, y)]
            = {1/[(2π)^{1/2} σ_y (1 − ρ²)^{1/2}]} exp{ −[y − μ_y − ρ(σ_y/σ_x)(x − μ_x)]² / [2(1 − ρ²)σ_y²] }
              × {1/[(2π)^{1/2} σ_x]} exp[ −(x − μ_x)²/(2σ_x²) ]
            = f(y|x) f(x).

It follows that the conditional mean is

    μ_{y|x} = μ_y + ρ (σ_y/σ_x)(x − μ_x),

and the conditional variance is

    var(y|x) = σ_y² (1 − ρ²).
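The conditional moments can also be recovered by simulation: draw a large bivariate normal sample and regress y on x. A sketch (all parameter values are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
mu_x, mu_y, s_x, s_y, rho = 1.0, -2.0, 1.5, 0.8, 0.6   # illustrative values
n = 200_000

x = rng.normal(mu_x, s_x, size=n)
# draw y from its exact conditional distribution given x
y = (mu_y + rho * s_y / s_x * (x - mu_x)
     + rng.normal(0.0, s_y * np.sqrt(1 - rho ** 2), size=n))

# the regression of y on x should recover the conditional-mean slope
slope = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
resid = y - y.mean() - slope * (x - x.mean())
resid_var = np.var(resid, ddof=1)

assert abs(slope - rho * s_y / s_x) < 0.02
assert abs(resid_var - s_y ** 2 * (1 - rho ** 2)) < 0.02
print("conditional mean and variance match the formulas")
```

This is the population counterpart of simple regression: the slope of E(Y|X = x) is ρσ_y/σ_x, and the residual variance is σ_y²(1 − ρ²).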
FENG CHIA UNIVERSITY ECONOMETRICS I: HOMEWORK Prof. Mei-Yuan Chen Spring 2008

1. Suppose that the random variables x and y take only the two values 0 and 1 and have the joint probability function f(x, y) given by a 2 × 2 table of probabilities (the entries did not survive in this copy). Find E(y|x), E(y²|x) and var(y|x) for x = 0 and x = 1.

2. The value of the mean of a random sample of size 0 from a normal population X is x̄_n = 8. Find the 95% confidence interval for the mean of the population on the assumption that the variance σ_X² is known.

3. Let x̄_n be the mean of a random sample of size n from an N(μ, σ²) population. What is the probability that the interval (x̄_n − σ/√n, x̄_n + σ/√n) includes the point μ?

4. The mean of a random sample of size 7 from a normal population is x̄_n = 4.7. Determine the 90% confidence interval for the population mean when the estimated variance of the population is given.

5. Suppose a simple linear regression model is considered for the conditional mean of Y given X = x,

    y_t = α + β x_t + e_t,

for a random sample {(y_t, x_t), t = 1, ..., T}.

(a) What is the functional form of the conditional mean implied by the supposed regression model?

(b) Are the parameters of the conditional mean function assumed to be constant over the whole sample?

(c) What are the OLS estimators for α and β?

(d) What are the variances of the OLS estimators for α and β, and the covariance between them?
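For 5(c)-(d), the familiar closed forms var(β̂) = σ²/Σ_t(x_t − x̄)², var(α̂) = σ² Σ_t x_t²/[T Σ_t(x_t − x̄)²] and cov(α̂, β̂) = −σ² x̄/Σ_t(x_t − x̄)² agree with the matrix expression σ²(X'X)^{-1}. A sketch (the data and σ² are illustrative choices of mine):

```python
import numpy as np

rng = np.random.default_rng(7)
T, sigma2 = 15, 1.3                      # sample size and error variance
x = rng.normal(size=T)
X = np.column_stack([np.ones(T), x])

V = sigma2 * np.linalg.inv(X.T @ X)      # covariance of (alpha_hat, beta_hat)
Sxx = np.sum((x - x.mean()) ** 2)

assert np.isclose(V[1, 1], sigma2 / Sxx)                          # var(beta_hat)
assert np.isclose(V[0, 0], sigma2 * np.sum(x ** 2) / (T * Sxx))   # var(alpha_hat)
assert np.isclose(V[0, 1], -sigma2 * x.mean() / Sxx)              # cov(alpha_hat, beta_hat)
print("closed-form variance formulas match sigma^2 (X'X)^{-1}")
```

The negative covariance when x̄ > 0 reflects the usual trade-off between the intercept and slope estimates.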
More informationECON Program Evaluation, Binary Dependent Variable, Misc.
ECON 351 - Program Evaluation, Binary Dependent Variable, Misc. Maggie Jones () 1 / 17 Readings Chapter 13: Section 13.2 on difference in differences Chapter 7: Section on binary dependent variables Chapter
More informationLecture 6: Geometry of OLS Estimation of Linear Regession
Lecture 6: Geometry of OLS Estimation of Linear Regession Xuexin Wang WISE Oct 2013 1 / 22 Matrix Algebra An n m matrix A is a rectangular array that consists of nm elements arranged in n rows and m columns
More informationThe Simple Regression Model. Part II. The Simple Regression Model
Part II The Simple Regression Model As of Sep 22, 2015 Definition 1 The Simple Regression Model Definition Estimation of the model, OLS OLS Statistics Algebraic properties Goodness-of-Fit, the R-square
More informationChapter 1. Linear Regression with One Predictor Variable
Chapter 1. Linear Regression with One Predictor Variable 1.1 Statistical Relation Between Two Variables To motivate statistical relationships, let us consider a mathematical relation between two mathematical
More informationSimple Linear Regression
Simple Linear Regression Reading: Hoff Chapter 9 November 4, 2009 Problem Data: Observe pairs (Y i,x i ),i = 1,... n Response or dependent variable Y Predictor or independent variable X GOALS: Exploring
More informationCopula Regression RAHUL A. PARSA DRAKE UNIVERSITY & STUART A. KLUGMAN SOCIETY OF ACTUARIES CASUALTY ACTUARIAL SOCIETY MAY 18,2011
Copula Regression RAHUL A. PARSA DRAKE UNIVERSITY & STUART A. KLUGMAN SOCIETY OF ACTUARIES CASUALTY ACTUARIAL SOCIETY MAY 18,2011 Outline Ordinary Least Squares (OLS) Regression Generalized Linear Models
More information1. The Multivariate Classical Linear Regression Model
Business School, Brunel University MSc. EC550/5509 Modelling Financial Decisions and Markets/Introduction to Quantitative Methods Prof. Menelaos Karanasos (Room SS69, Tel. 08956584) Lecture Notes 5. The
More informationRegression #2. Econ 671. Purdue University. Justin L. Tobias (Purdue) Regression #2 1 / 24
Regression #2 Econ 671 Purdue University Justin L. Tobias (Purdue) Regression #2 1 / 24 Estimation In this lecture, we address estimation of the linear regression model. There are many objective functions
More information18.S096 Problem Set 3 Fall 2013 Regression Analysis Due Date: 10/8/2013
18.S096 Problem Set 3 Fall 013 Regression Analysis Due Date: 10/8/013 he Projection( Hat ) Matrix and Case Influence/Leverage Recall the setup for a linear regression model y = Xβ + ɛ where y and ɛ are
More informationECON 5350 Class Notes Functional Form and Structural Change
ECON 5350 Class Notes Functional Form and Structural Change 1 Introduction Although OLS is considered a linear estimator, it does not mean that the relationship between Y and X needs to be linear. In this
More informationL7: Multicollinearity
L7: Multicollinearity Feng Li feng.li@cufe.edu.cn School of Statistics and Mathematics Central University of Finance and Economics Introduction ï Example Whats wrong with it? Assume we have this data Y
More informationSTA 2201/442 Assignment 2
STA 2201/442 Assignment 2 1. This is about how to simulate from a continuous univariate distribution. Let the random variable X have a continuous distribution with density f X (x) and cumulative distribution
More informationWe begin by thinking about population relationships.
Conditional Expectation Function (CEF) We begin by thinking about population relationships. CEF Decomposition Theorem: Given some outcome Y i and some covariates X i there is always a decomposition where
More informationLinear Regression with 1 Regressor. Introduction to Econometrics Spring 2012 Ken Simons
Linear Regression with 1 Regressor Introduction to Econometrics Spring 2012 Ken Simons Linear Regression with 1 Regressor 1. The regression equation 2. Estimating the equation 3. Assumptions required for
More informationEstimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X.
Estimating σ 2 We can do simple prediction of Y and estimation of the mean of Y at any value of X. To perform inferences about our regression line, we must estimate σ 2, the variance of the error term.
More informationIntroduction to Computational Finance and Financial Econometrics Probability Review - Part 2
You can t see this text! Introduction to Computational Finance and Financial Econometrics Probability Review - Part 2 Eric Zivot Spring 2015 Eric Zivot (Copyright 2015) Probability Review - Part 2 1 /
More informationEconometrics of Panel Data
Econometrics of Panel Data Jakub Mućk Meeting # 3 Jakub Mućk Econometrics of Panel Data Meeting # 3 1 / 21 Outline 1 Fixed or Random Hausman Test 2 Between Estimator 3 Coefficient of determination (R 2
More informationModel Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao
Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics Jiti Gao Department of Statistics School of Mathematics and Statistics The University of Western Australia Crawley
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationChapter 10. Simple Linear Regression and Correlation
Chapter 10. Simple Linear Regression and Correlation In the two sample problems discussed in Ch. 9, we were interested in comparing values of parameters for two distributions. Regression analysis is the
More informationDay 4: Shrinkage Estimators
Day 4: Shrinkage Estimators Kenneth Benoit Data Mining and Statistical Learning March 9, 2015 n versus p (aka k) Classical regression framework: n > p. Without this inequality, the OLS coefficients have
More informationInterpreting Regression Results
Interpreting Regression Results Carlo Favero Favero () Interpreting Regression Results 1 / 42 Interpreting Regression Results Interpreting regression results is not a simple exercise. We propose to split
More informationJoint Distributions. (a) Scalar multiplication: k = c d. (b) Product of two matrices: c d. (c) The transpose of a matrix:
Joint Distributions Joint Distributions A bivariate normal distribution generalizes the concept of normal distribution to bivariate random variables It requires a matrix formulation of quadratic forms,
More informationThe Multiple Regression Model Estimation
Lesson 5 The Multiple Regression Model Estimation Pilar González and Susan Orbe Dpt Applied Econometrics III (Econometrics and Statistics) Pilar González and Susan Orbe OCW 2014 Lesson 5 Regression model:
More information