Finansiell Statistik, GN, 15 hp, VT2008 Lecture 15: Multiple Linear Regression & Correlation
Gebrenegus Ghilagaber, PhD, Associate Professor
May 5, 2008
1 Introduction

In the simple linear regression model

    Y_i = α + β X_i + ε_i                                    (1)

the least-squares estimates of the parameters were obtained from the normal equations:

    Σ Y_i = n a + b Σ X_i                                    (2)
    Σ X_i Y_i = a Σ X_i + b Σ X_i²                           (3)
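The two normal equations can be solved directly as a 2 x 2 linear system. A minimal sketch in NumPy, with made-up illustrative data (the X and Y values are not from the lecture):

```python
import numpy as np

# Hypothetical illustrative data (not from the lecture).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.0, 4.0, 5.0, 8.0, 9.0])
n = len(Y)

# Normal equations as a 2x2 system in (a, b):
#   n*a       + (sum X)*b   = sum Y
#   (sum X)*a + (sum X^2)*b = sum X*Y
M   = np.array([[n,       X.sum()],
                [X.sum(), (X**2).sum()]])
rhs = np.array([Y.sum(), (X * Y).sum()])
a, b = np.linalg.solve(M, rhs)
```

As a cross-check, `np.polyfit(X, Y, 1)` fits the same line.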
2 The Model and its assumptions

Suppose we extend the simple linear regression model by including one more explanatory variable. The population multiple regression model then becomes

    Y_i = α + β₁ X_{1i} + β₂ X_{2i} + ε_i                    (4)

and its sample estimate is given by

    Y_i = a + b₁ X_{1i} + b₂ X_{2i} + e_i                    (5)

The sum of squares of the error terms is then given by

    Σ e_i² = Σ (Y_i − a − b₁ X_{1i} − b₂ X_{2i})²            (6)
and the least-squares estimates of the parameters are obtained from the normal equations:

    Σ Y_i = n a + b₁ Σ X_{1i} + b₂ Σ X_{2i}                              (7)
    Σ X_{1i} Y_i = a Σ X_{1i} + b₁ Σ X_{1i}² + b₂ Σ X_{1i} X_{2i}        (8)
    Σ X_{2i} Y_i = a Σ X_{2i} + b₁ Σ X_{1i} X_{2i} + b₂ Σ X_{2i}²        (9)
3 Standard Assumptions for Multiple Regression (with two explanatory variables)

- Normality: the ε_i are normally distributed.
- Zero mean: the ε_i have zero mean, E(ε_i) = 0.
- Constant variance (homoscedasticity): the ε_i have constant variance; together with the first two assumptions, ε_i ~ N(0, σ²).
- Independence: the ε_i are independent, Cov(ε_i, ε_j) = 0 for i ≠ j.
- X_i and ε_i are uncorrelated: the X_i are either fixed or random, but uncorrelated with the ε_i, Cov(ε_i, X_i) = 0.
- No multicollinearity: the explanatory variables X₁ and X₂ are not strongly correlated.
4 Estimating Multiple Regression Parameters

In obtaining the least-squares estimates of the parameters in multiple regression, it is easier to work with the deviations

    y_i = Y_i − Ȳ,   x_{1i} = X_{1i} − X̄₁,   x_{2i} = X_{2i} − X̄₂

instead of Y_i, X_{1i}, and X_{2i}. In such a case,

    Σ y_i = Σ (Y_i − Ȳ) = 0,   Σ x_{1i} = Σ (X_{1i} − X̄₁) = 0,   Σ x_{2i} = Σ (X_{2i} − X̄₂) = 0.
Thus, equations (8) and (9) may be rewritten as

    Σ x_{1i} y_i = b₁ Σ x_{1i}² + b₂ Σ x_{1i} x_{2i}         (10)
    Σ x_{2i} y_i = b₁ Σ x_{1i} x_{2i} + b₂ Σ x_{2i}²         (11)
From equations (10) and (11) we get

    b₁ = [Σ x_{2i}² · Σ x_{1i} y_i − Σ x_{1i} x_{2i} · Σ x_{2i} y_i] / [Σ x_{1i}² · Σ x_{2i}² − (Σ x_{1i} x_{2i})²]

    b₂ = [Σ x_{1i}² · Σ x_{2i} y_i − Σ x_{1i} x_{2i} · Σ x_{1i} y_i] / [Σ x_{1i}² · Σ x_{2i}² − (Σ x_{1i} x_{2i})²]

and (it can be shown that)

    a = Ȳ − b₁ X̄₁ − b₂ X̄₂
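These closed-form expressions translate directly into code. A minimal sketch in NumPy, using made-up data (the Y, X1, X2 values are illustrative, not the lecture's example):

```python
import numpy as np

# Hypothetical illustrative data.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])

# Deviations from the sample means.
y, x1, x2 = Y - Y.mean(), X1 - X1.mean(), X2 - X2.mean()

# Sums appearing in the closed-form solution.
s11, s22, s12 = (x1**2).sum(), (x2**2).sum(), (x1*x2).sum()
s1y, s2y = (x1*y).sum(), (x2*y).sum()

den = s11*s22 - s12**2            # nonzero unless X1, X2 perfectly collinear
b1 = (s22*s1y - s12*s2y) / den
b2 = (s11*s2y - s12*s1y) / den
a  = Y.mean() - b1*X1.mean() - b2*X2.mean()
```

The same coefficients come out of a general least-squares solver applied to the design matrix [1, X1, X2], which is a convenient check.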
5 Decomposing the total variance - the ANOVA Table

As we did in simple linear regression, we can now decompose the total variation into its various sources and create an ANOVA table. Write

    Y_i − Ȳ = (Y_i − Ŷ_i) + (Ŷ_i − Ȳ)

so that

    Σ (Y_i − Ȳ)² = Σ (Y_i − Ŷ_i)² + Σ (Ŷ_i − Ȳ)² + 2 Σ (Y_i − Ŷ_i)(Ŷ_i − Ȳ)
                 = Σ (Y_i − Ŷ_i)² + Σ (Ŷ_i − Ȳ)²

since the cross-product term is zero.
The corresponding ANOVA table will then be given by

    Source of variation | Degrees of freedom | Sum of Squares        | Mean Squares          | F-ratio
    Regression          | k                  | SSR = Σ (Ŷ_i − Ȳ)²    | MSR = SSR / k         | F = MSR / MSE
    Error               | n − k − 1          | SSE = Σ (Y_i − Ŷ_i)²  | MSE = SSE / (n−k−1)   |
    Total               | n − 1              | SST = Σ (Y_i − Ȳ)²    | MST = SST / (n−1)     |

Note that the degrees of freedom and the sums of squares are additive, but not the mean squares:

    k + (n − k − 1) = n − 1,   SSR + SSE = SST,   MSR + MSE ≠ MST.
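The table's quantities, and the additivity identity for the sums of squares, are easy to verify numerically. A sketch in NumPy with illustrative data (the values are my own, not the lecture's):

```python
import numpy as np

# Hypothetical data; k = 2 explanatory variables.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
n, k = len(Y), 2

# Least-squares fit via the design matrix [1, X1, X2].
A = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
Y_hat = A @ b

SST = ((Y - Y.mean())**2).sum()      # total variation
SSR = ((Y_hat - Y.mean())**2).sum()  # explained by the regression
SSE = ((Y - Y_hat)**2).sum()         # left in the residuals

MSR, MSE = SSR / k, SSE / (n - k - 1)
F = MSR / MSE
```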
6 The Residual Standard Error, S_e, & the Coefficient of Multiple Determination, R²

Both S_e and R² may be used to evaluate the goodness-of-fit of our multiple regression model. The residual standard error, S_e, is just the standard deviation of the error terms:

    S_e = √[ Σ (Y_i − Ŷ_i)² / (n − k − 1) ] = √[ SSE / (n − k − 1) ] = √[ SSE / (n − 3) ]  when k = 2.
The coefficient of multiple determination, R², is given by

    R² = SSR / SST = 1 − SSE / SST.

It gives the proportion (percentage) of the total variation in the dependent variable Y that is explained by the explanatory variables X₁ and X₂. The larger the value of R², the better the fit of the model. The adjusted R², R²_adj, which takes due account of the degrees of freedom, is given by

    R²_adj = 1 − [SSE / (n − k − 1)] / [SST / (n − 1)]
           = 1 − (1 − R²) (n − 1) / (n − k − 1).
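The two equivalent forms of the adjusted R² can be checked numerically. A short sketch, again with made-up data:

```python
import numpy as np

# Hypothetical data; k = 2 regressors.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
n, k = len(Y), 2

A = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
resid = Y - A @ b

SSE = (resid**2).sum()
SST = ((Y - Y.mean())**2).sum()

R2     = 1 - SSE / SST                             # proportion explained
R2_adj = 1 - (SSE / (n - k - 1)) / (SST / (n - 1)) # degrees-of-freedom adjusted
```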
Again, note that R²_adj ≤ R², indicating that the unadjusted R² is an overestimate. Both S_e and R² measure the goodness-of-fit of a regression model, but S_e is an absolute measure while R² is a relative measure.
7 Testing for the overall model significance

To test

    H₀: β₁ = β₂ = ... = β_k = 0
    H₁: β_i ≠ 0 for at least one i,

the appropriate test statistic is

    F = MSR / MSE = [Σ (Ŷ_i − Ȳ)² / k] / [Σ (Y_i − Ŷ_i)² / (n − k − 1)],

which is to be compared with F(k, n − k − 1; α).
This is a global test in the sense that if the test is significant (H₀ is rejected), we do not yet know which of the β_i is (are) significantly different from 0. Note also that the test statistic may be related to the coefficient of multiple determination, R², as follows:

    F = [Σ (Ŷ_i − Ȳ)² / k] / [Σ (Y_i − Ŷ_i)² / (n − k − 1)]
      = [R² / k] / [(1 − R²) / (n − k − 1)].
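The identity between the two forms of the F statistic can be verified numerically (illustrative data; both computations should agree exactly up to rounding):

```python
import numpy as np

# Hypothetical data; k = 2 regressors.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
n, k = len(Y), 2

A = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
Y_hat = A @ b

SSR = ((Y_hat - Y.mean())**2).sum()
SSE = ((Y - Y_hat)**2).sum()
SST = ((Y - Y.mean())**2).sum()
R2  = SSR / SST

F_from_ms = (SSR / k) / (SSE / (n - k - 1))       # MSR / MSE
F_from_r2 = (R2 / k) / ((1 - R2) / (n - k - 1))   # same statistic via R^2
```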
8 Tests on individual regression coefficients

To test, say,

    H₀: β_i = 0
    H₁: β_i ≠ 0

for the individual coefficients, we may use the t-statistic

    t = b_i / S(b_i)
and compare the calculated value of t with that of t(n − k − 1; α/2). The standard errors of the individual estimates are given by

    S(b₁) = √[ S_e² Σ x_{2i}² / (Σ x_{1i}² Σ x_{2i}² − (Σ x_{1i} x_{2i})²) ]
    S(b₂) = √[ S_e² Σ x_{1i}² / (Σ x_{1i}² Σ x_{2i}² − (Σ x_{1i} x_{2i})²) ]

where x_{1i} = X_{1i} − X̄₁ and x_{2i} = X_{2i} − X̄₂.
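These standard errors agree with the diagonal of the estimated covariance matrix MSE · (X'X)⁻¹ from the matrix formulation, which gives a useful cross-check. A sketch with illustrative data:

```python
import numpy as np

# Hypothetical data; k = 2 regressors.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
n, k = len(Y), 2

A = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
MSE = ((Y - A @ b)**2).sum() / (n - k - 1)   # S_e^2

# Deviation sums for the textbook formulas.
x1, x2 = X1 - X1.mean(), X2 - X2.mean()
den = (x1**2).sum() * (x2**2).sum() - (x1*x2).sum()**2
S_b1 = np.sqrt(MSE * (x2**2).sum() / den)
S_b2 = np.sqrt(MSE * (x1**2).sum() / den)

t1, t2 = b[1] / S_b1, b[2] / S_b2            # t-statistics for b1 and b2
```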
9 Confidence Interval for the mean response

Once we get the least-squares estimates of the model parameters, the estimated regression model is given by

    Ŷ_i = a + b₁ X_{1i} + b₂ X_{2i}

This model may be used, among other things, to predict values of Y for given values of X₁ and X₂. Thus, for new values X_{1,n+1} and X_{2,n+1}, the predicted value of Y is given by

    Ŷ_{n+1} = a + b₁ X_{1,n+1} + b₂ X_{2,n+1}

Since Ŷ_{n+1} is a statistic (computed from a sample), it is subject to variation. This variation is measured by its standard error, which is given by

    S(Ŷ_{n+1}) = √( S_e² / n ) = √( MSE / n ) = √[ SSE / (n (n − k − 1)) ].
This may then be used to construct a (1 − α)100% confidence interval for the predicted population mean response, E(Y_{n+1} | X_{1,n+1}, X_{2,n+1}), as

    ( Ŷ_{n+1} − t(n − 3; α/2) · S_e / √n ,   Ŷ_{n+1} + t(n − 3; α/2) · S_e / √n )
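A sketch of the interval using the slide's standard error S_e/√n. The data and the new X values are made up, and the critical value t(3; 0.025) ≈ 3.182 is taken from a t table (with n = 6 and k = 2, the degrees of freedom are n − 3 = 3):

```python
import numpy as np

# Hypothetical data; k = 2 regressors, n = 6.
Y  = np.array([5.0, 7.0, 9.0, 11.0, 13.0, 12.0])
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
n, k = len(Y), 2

A = np.column_stack([np.ones(n), X1, X2])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
S_e = np.sqrt(((Y - A @ b)**2).sum() / (n - k - 1))

# Predicted mean response at new (illustrative) values X1 = 3.5, X2 = 3.5.
x_new = np.array([1.0, 3.5, 3.5])
y_pred = x_new @ b

t_crit = 3.182                       # t(n-3; alpha/2) for alpha = 0.05, 3 df
half = t_crit * S_e / np.sqrt(n)
ci = (y_pred - half, y_pred + half)  # 95% CI for the mean response
```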
Example(s)

Consider a small data set laid out with columns for i, Y_i, X_{1i}, and X_{2i}, their sample means, and the worksheet columns y_i, x_{1i}, x_{2i}, x_{1i}², x_{2i}², x_{1i} y_i, x_{2i} y_i, and x_{1i} x_{2i}.

(a) Fit a simple linear regression Ŷ_i = a₁ + b₁ X_{1i} and estimate all relevant quantities (ANOVA, S_e², R², etc.).
(b) Do the same with Ŷ_i = a₂ + b₂ X_{2i}.
(c) Fit a multiple linear regression Ŷ_i = a₃ + b₃ X_{1i} + b₄ X_{2i} (with ANOVA, S_e², R², etc.) and compare the results with those in (a) and (b).
12 Introduction to Matrix Algebra

(This section is extra! It is not part of the course, but it may be helpful to know!)

12.1 Definition & Notation

A matrix is a rectangular array of numbers. If A has n rows and p columns, we say it is of order n x p. For instance, n observations on p
variables give an n x p matrix as follows:

    A = [ a_11  a_12  ...  a_1p
          a_21  a_22  ...  a_2p
          ...
          a_n1  a_n2  ...  a_np ]

A vector is a matrix with only one row or column:

    a = ( a_1  a_2  ...  a_c )
is a row vector, while

    b = [ b_1
          b_2
          ...
          b_r ]

is a column vector.
12.2 Elementary Operations with Matrices

If

    A = [ a_11  ...  a_1p          B = [ b_11  ...  b_1p
          ...                            ...
          a_n1  ...  a_np ]              b_n1  ...  b_np ]

then their sum is given by

    A + B = [ a_11 + b_11  ...  a_1p + b_1p
              ...
              a_n1 + b_n1  ...  a_np + b_np ]
For a constant c,

    cA = [ c a_11  ...  c a_1p
           ...
           c a_n1  ...  c a_np ]

Further, if the number of columns in A is equal to the number of rows in B (p = n), then their product is given by

    AB = [ a_11 b_11 + a_12 b_21 + ... + a_1p b_p1   ...   a_11 b_1p + a_12 b_2p + ... + a_1p b_pp
           ...
           a_n1 b_11 + a_n2 b_21 + ... + a_np b_p1   ...   a_n1 b_1p + a_n2 b_2p + ... + a_np b_pp ]
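A quick NumPy illustration of these operations (the matrices are chosen arbitrarily):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])   # 2 x 3
B = np.array([[1., 0.],
              [0., 1.],
              [2., 2.]])       # 3 x 2

S = A + A        # elementwise sum (matrices must have the same order)
D = 3 * A        # scalar multiple
P = A @ B        # product: (2 x 3)(3 x 2) -> 2 x 2
# Entry (i, j) of P is sum over k of A[i, k] * B[k, j].
```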
12.3 Row Exchanges, Inverse, Transpose

The transpose of an r x c matrix A is denoted by A' and is the c x r matrix formed by interchanging the roles of rows and columns:

    A' = [ a_11  a_21  ...  a_n1
           a_12  a_22  ...  a_n2
           ...
           a_1p  a_2p  ...  a_np ]
The inverse of a matrix A is denoted by A⁻¹ and is such that

    A A⁻¹ = A⁻¹ A = I,

where

    I = [ 1  0  ...  0
          0  1  ...  0
          ...
          0  0  ...  1 ]

is the identity matrix, whose elements are 1s in the main diagonal and 0s elsewhere.
12.4 Square Matrices, Symmetric Matrices, etc.

A matrix is said to be a square matrix if its numbers of rows and columns are equal. A matrix A is said to be a symmetric matrix if A = A' (if it is equal to its transpose).
12.5 Determinants

The determinant of a matrix A is denoted by det(A) or |A| and is defined only for square matrices. For a 2 x 2 matrix

    A = [ a_11  a_12
          a_21  a_22 ]

its determinant is given by

    det(A) = |A| = a_11 a_22 − a_12 a_21,
while for a 3 x 3 matrix

    A = [ a_11  a_12  a_13
          a_21  a_22  a_23
          a_31  a_32  a_33 ]

its determinant is given by

    det(A) = |A| = a_11 a_22 a_33 + a_12 a_23 a_31 + a_13 a_21 a_32
                   − a_31 a_22 a_13 − a_11 a_23 a_32 − a_21 a_12 a_33.

Computation of determinants of larger matrices gets more complicated, but there are special methods.
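Both expansions can be checked against `np.linalg.det` (the example matrices are chosen arbitrarily):

```python
import numpy as np

A2 = np.array([[1., 2.],
               [3., 4.]])
det2 = A2[0, 0]*A2[1, 1] - A2[0, 1]*A2[1, 0]   # 2x2 rule: a11 a22 - a12 a21

a = np.array([[2., 0., 1.],
              [1., 3., 2.],
              [0., 1., 1.]])
# 3x3 expansion with the same six terms as above.
det3 = (a[0,0]*a[1,1]*a[2,2] + a[0,1]*a[1,2]*a[2,0] + a[0,2]*a[1,0]*a[2,1]
        - a[2,0]*a[1,1]*a[0,2] - a[0,0]*a[1,2]*a[2,1] - a[1,0]*a[0,1]*a[2,2])
```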
12.6 Eigenvalues and Eigenvectors

12.7 Positive-Definite Matrices
13 The Matrix Approach to Linear Regression

13.1 Model Formulation

Let

    Y = ( y_1, y_2, ..., y_n )'

be a column vector of n observations of the response variable (dependent variable),
    X = [ 1  x_12  ...  x_1p
          1  x_22  ...  x_2p
          ...
          1  x_n2  ...  x_np ]

an n x (p + 1) matrix of explanatory variables (including a constant column for the intercept),

    β = ( β₀, β₁, ..., β_p )'

a column vector of regression coefficients (one intercept and p slopes), and

    ε = ( ε₁, ε₂, ..., ε_n )'

a column vector of disturbance (error) terms. Then the multiple regression model may be written in matrix form as

    Y = Xβ + ε
13.2 Model Assumptions

Some of the standard assumptions are

    E(ε) = 0 = ( 0, 0, ..., 0 )',
and

    Cov(ε) = E(ε ε') = σ² I,

where I is the n x n identity matrix. Thus,

    E(Y) = E(Xβ + ε) = E(Xβ) + E(ε) = E(Xβ) = Xβ.
13.3 Estimation of Parameters

If

    e = Y − Ŷ = Y − Xb = ( y_1 − ŷ_1, y_2 − ŷ_2, ..., y_n − ŷ_n )'

is the estimated vector of error terms, then the vector of coefficients is estimated by minimizing the sum of squares of these error terms (the least-squares method):

    e'e = (Y − Xb)'(Y − Xb) = Y'Y − 2 b'X'Y + b'X'Xb
This sum of squares is then minimized by differentiating e'e with respect to b, equating to 0, and solving for b:

    ∂(e'e)/∂b = 0  ⟹  −2 X'Y + 2 X'Xb = 0

so that

    X'Xb = X'Y  ⟹  b = (X'X)⁻¹ X'Y,

and the fitted regression model is given by

    Ŷ = Xb = X (X'X)⁻¹ X'Y.

Moreover,

    E(b) = E[ (X'X)⁻¹ X'Y ] = (X'X)⁻¹ X' E(Y) = (X'X)⁻¹ X'Xβ = β,

showing that the least-squares estimate b is an unbiased estimator of the true parameter β.
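The whole derivation collapses to a few lines of linear algebra in code. A sketch with made-up data (in practice `np.linalg.lstsq` or `np.linalg.solve` is preferred to forming the explicit inverse, for numerical stability):

```python
import numpy as np

# Hypothetical data: design matrix with an intercept column of ones.
X = np.array([[1., 1.],
              [1., 2.],
              [1., 3.],
              [1., 4.],
              [1., 5.]])
Y = np.array([2., 4., 5., 8., 9.])

b = np.linalg.inv(X.T @ X) @ (X.T @ Y)   # b = (X'X)^{-1} X'Y
Y_hat = X @ b                            # fitted values
e = Y - Y_hat                            # residuals
```

The residuals are orthogonal to the columns of X (X'e = 0), which is another way of stating the normal equations.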
13.4 Numerical Examples

Let Y₁ and Y₂ be two response vectors observed together with the same design matrix X (a column of ones for the intercept plus one explanatory variable), and consider the models Y₁ = Xβ + ε and Y₂ = Xβ + ε. In each case the estimates are obtained as

    b = (X'X)⁻¹ X'Y.

The computation proceeds in the same steps for both examples: form X'X and invert it to obtain (X'X)⁻¹; form X'Y₁ (respectively X'Y₂); and multiply to get

    b = (X'X)⁻¹ X'Y₁   and   b = (X'X)⁻¹ X'Y₂,

each yielding an intercept b₀ and a slope b₁. The results are intuitively appealing since ...
More informationMultiple Linear Regression
Multiple Linear Regression University of California, San Diego Instructor: Ery Arias-Castro http://math.ucsd.edu/~eariasca/teaching.html 1 / 42 Passenger car mileage Consider the carmpg dataset taken from
More informationApplied Statistics and Econometrics
Applied Statistics and Econometrics Lecture 6 Saul Lach September 2017 Saul Lach () Applied Statistics and Econometrics September 2017 1 / 53 Outline of Lecture 6 1 Omitted variable bias (SW 6.1) 2 Multiple
More informationLinear Regression. y» F; Ey = + x Vary = ¾ 2. ) y = + x + u. Eu = 0 Varu = ¾ 2 Exu = 0:
Linear Regression 1 Single Explanatory Variable Assume (y is not necessarily normal) where Examples: y» F; Ey = + x Vary = ¾ 2 ) y = + x + u Eu = 0 Varu = ¾ 2 Exu = 0: 1. School performance as a function
More informationViolation of OLS assumption- Multicollinearity
Violation of OLS assumption- Multicollinearity What, why and so what? Lars Forsberg Uppsala University, Department of Statistics October 17, 2014 Lars Forsberg (Uppsala University) 1110 - Multi - co -
More informationReview of Linear Algebra
Review of Linear Algebra Definitions An m n (read "m by n") matrix, is a rectangular array of entries, where m is the number of rows and n the number of columns. 2 Definitions (Con t) A is square if m=
More informationEconometrics Lecture 1 Introduction and Review on Statistics
Econometrics Lecture 1 Introduction and Review on Statistics Chau, Tak Wai Shanghai University of Finance and Economics Spring 2014 1 / 69 Introduction This course is about Econometrics. Metrics means
More information11. Bootstrap Methods
11. Bootstrap Methods c A. Colin Cameron & Pravin K. Trivedi 2006 These transparencies were prepared in 20043. They can be used as an adjunct to Chapter 11 of our subsequent book Microeconometrics: Methods
More informationEconometrics of Panel Data
Econometrics of Panel Data Jakub Mućk Meeting # 2 Jakub Mućk Econometrics of Panel Data Meeting # 2 1 / 26 Outline 1 Fixed effects model The Least Squares Dummy Variable Estimator The Fixed Effect (Within
More informationAddition and subtraction: element by element, and dimensions must match.
Matrix Essentials review: ) Matrix: Rectangular array of numbers. ) ranspose: Rows become columns and vice-versa ) single row or column is called a row or column) Vector ) R ddition and subtraction: element
More informationMath 3330: Solution to midterm Exam
Math 3330: Solution to midterm Exam Question 1: (14 marks) Suppose the regression model is y i = β 0 + β 1 x i + ε i, i = 1,, n, where ε i are iid Normal distribution N(0, σ 2 ). a. (2 marks) Compute the
More informationLINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises
LINEAR REGRESSION ANALYSIS MODULE XVI Lecture - 44 Exercises Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Exercise 1 The following data has been obtained on
More informationSTA 4210 Practise set 2a
STA 410 Practise set a For all significance tests, use = 0.05 significance level. S.1. A multiple linear regression model is fit, relating household weekly food expenditures (Y, in $100s) to weekly income
More informationLecture 15 Multiple regression I Chapter 6 Set 2 Least Square Estimation The quadratic form to be minimized is
Lecture 15 Multiple regression I Chapter 6 Set 2 Least Square Estimation The quadratic form to be minimized is Q = (Y i β 0 β 1 X i1 β 2 X i2 β p 1 X i.p 1 ) 2, which in matrix notation is Q = (Y Xβ) (Y
More informationEcon 3790: Statistics Business and Economics. Instructor: Yogesh Uppal
Econ 3790: Statistics Business and Economics Instructor: Yogesh Uppal Email: yuppal@ysu.edu Chapter 14 Covariance and Simple Correlation Coefficient Simple Linear Regression Covariance Covariance between
More informationInferences for Regression
Inferences for Regression An Example: Body Fat and Waist Size Looking at the relationship between % body fat and waist size (in inches). Here is a scatterplot of our data set: Remembering Regression In
More informationSection 9.2: Matrices.. a m1 a m2 a mn
Section 9.2: Matrices Definition: A matrix is a rectangular array of numbers: a 11 a 12 a 1n a 21 a 22 a 2n A =...... a m1 a m2 a mn In general, a ij denotes the (i, j) entry of A. That is, the entry in
More information