Chapter 2 The Simple Linear Regression Model: Specification and Estimation


1 Chapter 2 The Simple Linear Regression Model: Specification and Estimation

2 Chapter Contents
2.1 An Economic Model
2.2 An Econometric Model
2.3 Estimating the Regression Parameters
2.4 Assessing the Least Squares Estimators
2.5 The Gauss-Markov Theorem
2.6 The Probability Distributions of the Least Squares Estimators
2.7 Estimating the Variance of the Error Term

3 2.1 An Economic Model

4 2.1 An Economic Model
As economists we are usually more interested in studying relationships between variables. Economic theory tells us that expenditure on economic goods depends on income. Consequently we call y the dependent variable and x the independent or explanatory variable. In econometrics, we recognize that real-world expenditures are random variables, and we want to use data to learn about the relationship.

5 2.1 An Economic Model
The pdf of y given x is a conditional probability density function, since it is conditional upon an x value. The conditional mean, or expected value, of y is E(y|x). The expected value of a random variable is called its mean value, which is really a contraction of population mean, the center of the probability distribution of the random variable. This is not the same as the sample mean, which is the arithmetic average of numerical values.

6 2.1 An Economic Model
Figure 2.1a Probability distribution of food expenditure y given income x = $1000

7 2.1 An Economic Model
The conditional variance of y is σ², which measures the dispersion of y about its mean μy|x. The parameters μy|x and σ², if they were known, would give us some valuable information about the population we are considering.

8 2.1 An Economic Model
Figure 2.1b Probability distributions of food expenditure y given incomes x = $1000 and x = $2000

9 2.1 An Economic Model
In order to investigate the relationship between expenditure and income we must build an economic model and then a corresponding econometric model that forms the basis for a quantitative or empirical economic analysis. This econometric model is also called a regression model.

10 2.1 An Economic Model
The simple regression function is written as:
Eq. 2.1: E(y|x) = μy|x = β1 + β2 x
where β1 is the intercept and β2 is the slope.

11 2.1 An Economic Model
E(y|x) = β1 + β2 x
It is called simple regression not because it is easy, but because there is only one explanatory variable on the right-hand side of the equation.

12 2.1 An Economic Model
Figure 2.2 The economic model: a linear relationship between average per person food expenditure and income

13 2.1 An Economic Model
The slope of the regression line can be written as:
Eq. 2.2: β2 = ΔE(y|x) / Δx = dE(y|x) / dx
where Δ denotes "change in" and dE(y|x)/dx denotes the derivative of the expected value of y given an x value.

14 2.2 An Econometric Model

15 2.2 An Econometric Model
Figure 2.3 The probability density function for y at two levels of income

16 2.2 An Econometric Model
There are several key assumptions underlying the simple linear regression model. More will be added later.

17 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - I
Assumption 1: The mean value of y, for each value of x, is given by the linear regression function E(y|x) = β1 + β2 x

18 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - I
Assumption 2: For each value of x, the values of y are distributed about their mean value, following probability distributions that all have the same variance, var(y|x) = σ²

19 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - I
Assumption 3: The sample values of y are all uncorrelated and have zero covariance, implying that there is no linear association among them: cov(yi, yj) = 0. This assumption can be made stronger by assuming that the values of y are all statistically independent.

20 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - I
Assumption 4: The variable x is not random, and must take at least two different values.

21 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - I
Assumption 5: (optional) The values of y are normally distributed about their mean for each value of x: y ~ N(β1 + β2 x, σ²)

22 2.2 An Econometric Model
2.2.1 Introducing the Error Term
The random error term is defined as:
Eq. 2.3: e = y - E(y|x) = y - β1 - β2 x
Rearranging gives:
Eq. 2.4: y = β1 + β2 x + e
where y is the dependent variable and x is the independent variable.

23 2.2 An Econometric Model
2.2.1 Introducing the Error Term
The expected value of the error term, given x, is:
E(e|x) = E(y|x) - β1 - β2 x = 0
The mean value of the error term, given x, is zero.

24 2.2 An Econometric Model
2.2.1 Introducing the Error Term
Figure 2.4 Probability density functions for e and y

25 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR1: The value of y, for each value of x, is y = β1 + β2 x + e

26 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR2: The expected value of the random error e is E(e) = 0. This is equivalent to assuming that E(y) = β1 + β2 x

27 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR3: The variance of the random error e is var(e) = σ² = var(y). The random variables y and e have the same variance because they differ only by a constant.

28 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR4: The covariance between any pair of random errors ei and ej is cov(ei, ej) = cov(yi, yj) = 0. The stronger version of this assumption is that the random errors e are statistically independent, in which case the values of the dependent variable y are also statistically independent.

29 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR5: The variable x is not random, and must take at least two different values.

30 2.2 An Econometric Model
ASSUMPTIONS OF THE SIMPLE LINEAR REGRESSION MODEL - II
2.2.1 Introducing the Error Term
Assumption SR6: (optional) The values of e are normally distributed about their mean, e ~ N(0, σ²), if the values of y are normally distributed, and vice versa.

31 2.2 An Econometric Model
2.2.1 Introducing the Error Term
Figure 2.5 The relationship among y, e and the true regression line

32 2.3 Estimating the Regression Parameters

33 2.3 Estimating the Regression Parameters
Table 2.1 Food Expenditure and Income Data

34 2.3 Estimating the Regression Parameters
Figure 2.6 Data for food expenditure example

35 2.3 Estimating the Regression Parameters
2.3.1 The Least Squares Principle
The fitted regression line is:
Eq. 2.5: ŷi = b1 + b2 xi
The least squares residual is:
Eq. 2.6: êi = yi - ŷi = yi - b1 - b2 xi

36 2.3 Estimating the Regression Parameters
2.3.1 The Least Squares Principle
Figure 2.7 The relationship among y, ê and the fitted regression line

37 2.3 Estimating the Regression Parameters
2.3.1 The Least Squares Principle
Suppose we have another fitted line: ŷi* = b1* + b2* xi
The least squares line has the smaller sum of squared residuals: if SSE = Σ êi² and SSE* = Σ êi*², then SSE < SSE*.

38 2.3 Estimating the Regression Parameters
2.3.1 The Least Squares Principle
Least squares estimates for the unknown parameters β1 and β2 are obtained by minimizing the sum of squares function:
S(β1, β2) = Σ (yi - β1 - β2 xi)²

39 2.3 Estimating the Regression Parameters
2.3.1 The Least Squares Principle
THE LEAST SQUARES ESTIMATORS
Eq. 2.7: b2 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)²
Eq. 2.8: b1 = ȳ - b2 x̄
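As a concrete illustration of Eq. 2.7 and Eq. 2.8, here is a minimal Python sketch (not part of the original slides) that computes b1 and b2 on a small hypothetical data set:

```python
# Minimal sketch of the least squares estimators, Eq. 2.7 and Eq. 2.8.
# The (x, y) data below are hypothetical, chosen only to illustrate the formulas.

def least_squares(x, y):
    """Return (b1, b2) for the fitted line y-hat = b1 + b2 * x."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Eq. 2.7: b2 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
    b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
         / sum((xi - x_bar) ** 2 for xi in x)
    # Eq. 2.8: b1 = y_bar - b2 * x_bar
    b1 = y_bar - b2 * x_bar
    return b1, b2

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b1, b2 = least_squares(x, y)
```

With these numbers the fitted line is ŷ = 0.05 + 1.99x; any paired data of at least two distinct x values would work the same way.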

40 2.3 Estimating the Regression Parameters
2.3.2 Estimates for the Food Expenditure Function
b2 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)² = 10.2096
b1 = ȳ - b2 x̄ = 83.42
A convenient way to report the values for b1 and b2 is to write out the estimated or fitted regression line:
ŷi = 83.42 + 10.21 xi

41 2.3 Estimating the Regression Parameters
2.3.2 Estimates for the Food Expenditure Function
Figure 2.8 The fitted regression line

42 2.3 Estimating the Regression Parameters
2.3.3 Interpreting the Estimates
The value b2 = 10.21 is an estimate of β2, the amount by which weekly expenditure on food per household increases when household weekly income increases by $100. Thus, we estimate that if income goes up by $100, expected weekly expenditure on food will increase by approximately $10.21. Strictly speaking, the intercept estimate b1 = 83.42 is an estimate of the weekly food expenditure for a household with zero income.

43 2.3 Estimating the Regression Parameters
2.3.3a Elasticities
Income elasticity is a useful way to characterize the responsiveness of consumer expenditure to changes in income. The elasticity of a variable y with respect to another variable x is:
ε = (percentage change in y) / (percentage change in x) = (Δy/y) / (Δx/x) = (Δy/Δx)(x/y)
In the linear economic model given by Eq. 2.1 we have shown that β2 = ΔE(y)/Δx

44 2.3 Estimating the Regression Parameters
2.3.3a Elasticities
The elasticity of mean expenditure with respect to income is:
Eq. 2.9: ε = (ΔE(y)/E(y)) / (Δx/x) = (ΔE(y)/Δx) (x/E(y)) = β2 x/E(y)
A frequently used alternative is to calculate the elasticity at the point of the means (x̄, ȳ), because it is a representative point on the regression line: ε̂ = b2 (x̄/ȳ), with b2 = 10.21.
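A short Python sketch of Eq. 2.9 evaluated at the point of the means (not from the original slides). Only b2 = 10.21 comes from the estimated model; the sample means below are hypothetical placeholders, since the slides do not reproduce them:

```python
# Elasticity at the point of the means: epsilon-hat = b2 * (x_bar / y_bar).
b2 = 10.21      # slope estimate from the fitted food expenditure model
x_bar = 19.60   # hypothetical mean weekly income (in $100 units)
y_bar = 283.57  # hypothetical mean weekly food expenditure

# Responsiveness of mean expenditure to income, evaluated at (x_bar, y_bar)
elasticity = b2 * x_bar / y_bar
```

An elasticity below one, as here, would indicate that food expenditure rises less than proportionally with income.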

45 2.3 Estimating the Regression Parameters
2.3.3b Prediction
Suppose that we wanted to predict weekly food expenditure for a household with a weekly income of $2,000. This prediction is carried out by substituting x = 20 into our estimated equation to obtain:
ŷi = 83.42 + 10.21 xi = 83.42 + 10.21(20) = 287.61
We predict that a household with a weekly income of $2,000 will spend $287.61 per week on food.
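The prediction step can be sketched in Python (a minimal illustration, not from the original slides), using the rounded estimates reported above:

```python
# Prediction from the fitted line y-hat = b1 + b2 * x, with income measured
# in $100 units so that a $2,000 weekly income corresponds to x = 20.
b1, b2 = 83.42, 10.21  # rounded estimates from the slides
x0 = 20
y_hat = b1 + b2 * x0
```

With the rounded coefficients this gives 287.62; the slides' figure of $287.61 comes from carrying the unrounded estimates through the calculation.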

46 2.3 Estimating the Regression Parameters
2.3.3c Computer Output
Figure 2.9 EViews Regression Output

47 2.3 Estimating the Regression Parameters
2.3.4 Other Economic Models
The simple regression model can be applied to estimate the parameters of many relationships in economics, business, and the social sciences. The applications of regression analysis are fascinating and useful.

48 2.4 Assessing the Least Squares Estimators

49 2.4 Assessing the Least Squares Estimators
We call b1 and b2 the least squares estimators. We can investigate the properties of the estimators b1 and b2, which are called their sampling properties, and deal with the following important questions:
1. If the least squares estimators are random variables, then what are their expected values, variances, covariances, and probability distributions?
2. How do the least squares estimators compare with other procedures that might be used, and how can we compare alternative estimators?

50 2.4 Assessing the Least Squares Estimators
2.4.1 The Estimator b2
The estimator b2 can be rewritten as:
Eq. 2.10: b2 = Σ wi yi
Eq. 2.11: where wi = (xi - x̄) / Σ(xi - x̄)²
It can also be written as:
Eq. 2.12: b2 = β2 + Σ wi ei
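The weighted-sum form in Eq. 2.10 and Eq. 2.11 can be checked numerically; here is a small Python sketch (hypothetical data, not the food expenditure set) confirming it matches the covariance/variance formula of Eq. 2.7:

```python
# Check that b2 = sum(wi * yi), with wi = (xi - x_bar) / sum((xi - x_bar)^2),
# reproduces the direct least squares formula for the slope.
x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 4.0, 5.0, 9.0]
n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)

w = [(xi - x_bar) / sxx for xi in x]                # the weights w_i (Eq. 2.11)
b2_weighted = sum(wi * yi for wi, yi in zip(w, y))  # Eq. 2.10
b2_direct = sum((xi - x_bar) * (yi - y_bar)
                for xi, yi in zip(x, y)) / sxx      # Eq. 2.7
```

A useful side property visible here: the weights sum to zero, which is what makes the algebra behind Eq. 2.12 work.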

51 2.4 Assessing the Least Squares Estimators
2.4.2 The Expected Values of b1 and b2
We will show that if our model assumptions hold, then E(b2) = β2, which means that the estimator is unbiased. We can find the expected value of b2 using the fact that the expected value of a sum is the sum of the expected values:
Eq. 2.13: E(b2) = E(β2 + Σ wi ei) = E(β2 + w1 e1 + w2 e2 + ... + wN eN)
= E(β2) + E(w1 e1) + E(w2 e2) + ... + E(wN eN)
= E(β2) + Σ E(wi ei)
= β2 + Σ wi E(ei) = β2
using E(ei) = 0 and E(wi ei) = wi E(ei)

52 2.4 Assessing the Least Squares Estimators
2.4.2 The Expected Values of b1 and b2
The property of unbiasedness is about the average values of b1 and b2 if many samples of the same size are drawn from the same population. If we took the averages of estimates from many samples, these averages would approach the true parameter values β1 and β2. Unbiasedness does not say that an estimate from any one sample is close to the true parameter value, and thus we cannot say that an estimate is unbiased. We can say that the least squares estimation procedure (or the least squares estimator) is unbiased.
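The repeated-sampling idea above can be sketched with a small Monte Carlo experiment in Python (hypothetical parameter values, not from the slides): draw many samples from a known model, estimate b2 in each, and check that the estimates average near β2.

```python
# Repeated-sampling sketch of unbiasedness: y = beta1 + beta2 * x + e with
# known (hypothetical) parameters; b2 is re-estimated in many samples.
import random

random.seed(42)
beta1, beta2, sigma = 2.0, 0.5, 1.0
x = [float(i) for i in range(1, 21)]   # fixed, non-random regressor (SR5)
x_bar = sum(x) / len(x)
sxx = sum((xi - x_bar) ** 2 for xi in x)

estimates = []
for _ in range(2000):
    y = [beta1 + beta2 * xi + random.gauss(0.0, sigma) for xi in x]
    y_bar = sum(y) / len(y)
    b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
    estimates.append(b2)

mean_b2 = sum(estimates) / len(estimates)  # should be close to beta2
```

Individual estimates vary from sample to sample, but their average settles near the true β2, which is exactly what unbiasedness claims.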

53 2.4 Assessing the Least Squares Estimators
2.4.3 Repeated Sampling
Table 2.2 Estimates from 10 Samples

54 2.4 Assessing the Least Squares Estimators
2.4.3 Repeated Sampling
Figure 2.10 Two possible probability density functions for b2
The variance of b2 is defined as var(b2) = E[b2 - E(b2)]²

55 2.4 Assessing the Least Squares Estimators
2.4.4 The Variances and Covariances of b1 and b2
If the regression model assumptions SR1-SR5 are correct (assumption SR6 is not required), then the variances and covariance of b1 and b2 are:
Eq. 2.14: var(b1) = σ² [Σ xi² / (N Σ(xi - x̄)²)]
Eq. 2.15: var(b2) = σ² / Σ(xi - x̄)²
Eq. 2.16: cov(b1, b2) = σ² [-x̄ / Σ(xi - x̄)²]

56 2.4 Assessing the Least Squares Estimators
2.4.4 The Variances and Covariances of b1 and b2
MAJOR POINTS ABOUT THE VARIANCES AND COVARIANCES OF b1 AND b2
1. The larger the variance term σ², the greater the uncertainty there is in the statistical model, and the larger the variances and covariance of the least squares estimators.
2. The larger the sum of squares Σ(xi - x̄)², the smaller the variances of the least squares estimators and the more precisely we can estimate the unknown parameters.
3. The larger the sample size N, the smaller the variances and covariance of the least squares estimators.
4. The larger the term Σ xi², the larger the variance of the least squares estimator b1.
5. The absolute magnitude of the covariance increases the larger in magnitude is the sample mean x̄, and the covariance has a sign opposite to that of x̄.
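The variance and covariance formulas, and the sign claim in point 5, can be checked numerically; a minimal Python sketch with a known (hypothetical) error variance:

```python
# Eq. 2.14 - Eq. 2.16 evaluated for hypothetical x values and a known sigma^2:
# the variances and covariance of b1 and b2 depend only on sigma^2 and the x's.
sigma2 = 4.0
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
n = len(x)
x_bar = sum(x) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)

var_b1 = sigma2 * sum(xi ** 2 for xi in x) / (n * sxx)  # Eq. 2.14
var_b2 = sigma2 / sxx                                   # Eq. 2.15
cov_b1_b2 = -sigma2 * x_bar / sxx                       # Eq. 2.16
```

Note the identity cov(b1, b2) = -x̄ var(b2), which makes the sign claim in point 5 immediate: with a positive sample mean the covariance is negative.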

57 2.4 Assessing the Least Squares Estimators
2.4.4 The Variances and Covariances of b1 and b2
Figure 2.11 The influence of variation in the explanatory variable x on precision of estimation: (a) low x variation, low precision; (b) high x variation, high precision

58 2.5 The Gauss-Markov Theorem

59 2.5 The Gauss-Markov Theorem
GAUSS-MARKOV THEOREM: Under the assumptions SR1-SR5 of the linear regression model, the estimators b1 and b2 have the smallest variance of all linear and unbiased estimators of β1 and β2. They are the Best Linear Unbiased Estimators (BLUE) of β1 and β2.

60 2.5 The Gauss-Markov Theorem
MAJOR POINTS ABOUT THE GAUSS-MARKOV THEOREM
1. The estimators b1 and b2 are best when compared to similar estimators, those which are linear and unbiased. The theorem does not say that b1 and b2 are the best of all possible estimators.
2. The estimators b1 and b2 are best within their class because they have the minimum variance. When comparing two linear and unbiased estimators, we always want to use the one with the smaller variance, since that estimation rule gives us the higher probability of obtaining an estimate that is close to the true parameter value.
3. In order for the Gauss-Markov Theorem to hold, assumptions SR1-SR5 must be true. If any of these assumptions are not true, then b1 and b2 are not the best linear unbiased estimators of β1 and β2.

61 2.5 The Gauss-Markov Theorem
MAJOR POINTS ABOUT THE GAUSS-MARKOV THEOREM
4. The Gauss-Markov Theorem does not depend on the assumption of normality (assumption SR6).
5. In the simple linear regression model, if we want to use a linear and unbiased estimator, then we have to do no more searching. The estimators b1 and b2 are the ones to use. This explains why we are studying these estimators and why they are so widely used in research, not only in economics but in all social and physical sciences as well.
6. The Gauss-Markov Theorem applies to the least squares estimators. It does not apply to the least squares estimates from a single sample.

62 2.6 The Probability Distributions of the Least Squares Estimators

63 2.6 The Probability Distributions of the Least Squares Estimators
If we make the normality assumption (assumption SR6 about the error term), then the least squares estimators are normally distributed:
Eq. 2.17: b1 ~ N(β1, σ² Σ xi² / (N Σ(xi - x̄)²))
Eq. 2.18: b2 ~ N(β2, σ² / Σ(xi - x̄)²)

64 2.6 The Probability Distributions of the Least Squares Estimators
A CENTRAL LIMIT THEOREM: If assumptions SR1-SR5 hold, and if the sample size N is sufficiently large, then the least squares estimators have a distribution that approximates the normal distributions shown in Eq. 2.17 and Eq. 2.18.

65 2.7 Estimating the Variance of the Error Term

66 2.7 Estimating the Variance of the Error Term
The variance of the random error ei is var(ei) = σ² = E[ei - E(ei)]² = E(ei²), if the assumption E(ei) = 0 is correct.
Since the expectation is an average value, we might consider estimating σ² as the average of the squared errors: σ̂² = Σ ei² / N, where the error terms are ei = yi - β1 - β2 xi

67 2.7 Estimating the Variance of the Error Term
The least squares residuals are obtained by replacing the unknown parameters by their least squares estimates: êi = yi - ŷi = yi - b1 - b2 xi, so that σ̂² = Σ êi² / N.
There is a simple modification that produces an unbiased estimator, and that is:
Eq. 2.19: σ̂² = Σ êi² / (N - 2), so that E(σ̂²) = σ²
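A minimal Python sketch of Eq. 2.19 on hypothetical data (not from the slides): fit the line, form the residuals, and apply the degrees-of-freedom correction N - 2.

```python
# Estimate the error variance from least squares residuals, Eq. 2.19.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.8, 4.3, 5.9, 8.4, 9.6]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b1 = y_bar - b2 * x_bar

# Residuals e-hat_i = yi - b1 - b2 * xi; they sum to zero by construction.
residuals = [yi - (b1 + b2 * xi) for xi, yi in zip(x, y)]

# Dividing by N - 2 rather than N gives the unbiased estimator of sigma^2.
sigma2_hat = sum(e ** 2 for e in residuals) / (n - 2)
```

The divisor N - 2 reflects that two parameters (b1 and b2) were estimated before the residuals could be formed.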

68 2.7 Estimating the Variance of the Error Term
2.7.1 Estimating the Variances and Covariance of the Least Squares Estimators
Replace the unknown error variance σ² in Eq. 2.14 - Eq. 2.16 by σ̂² to obtain the estimated variances and covariance:
Eq. 2.20: var(b1) = σ̂² [Σ xi² / (N Σ(xi - x̄)²)]
Eq. 2.21: var(b2) = σ̂² / Σ(xi - x̄)²
Eq. 2.22: cov(b1, b2) = σ̂² [-x̄ / Σ(xi - x̄)²]

69 2.7 Estimating the Variance of the Error Term
2.7.1 Estimating the Variances and Covariance of the Least Squares Estimators
The square roots of the estimated variances are the standard errors of b1 and b2:
Eq. 2.23: se(b1) = √var(b1)
Eq. 2.24: se(b2) = √var(b2)
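Putting Eq. 2.20 - Eq. 2.24 together, a self-contained Python sketch (hypothetical data, not the food expenditure set) that goes from raw data to standard errors:

```python
# From data to standard errors: estimate b1, b2, then sigma^2-hat, then the
# estimated variances (Eq. 2.20, Eq. 2.21) and their square roots (Eq. 2.23,
# Eq. 2.24).
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.8, 4.3, 5.9, 8.4, 9.6]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
b1 = y_bar - b2 * x_bar
sigma2_hat = sum((yi - b1 - b2 * xi) ** 2 for xi, yi in zip(x, y)) / (n - 2)

var_b1_hat = sigma2_hat * sum(xi ** 2 for xi in x) / (n * sxx)  # Eq. 2.20
var_b2_hat = sigma2_hat / sxx                                   # Eq. 2.21
se_b1 = math.sqrt(var_b1_hat)                                   # Eq. 2.23
se_b2 = math.sqrt(var_b2_hat)                                   # Eq. 2.24
```

These are the numbers that regression software (such as the EViews output in Figure 2.9) reports alongside the coefficient estimates.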

70 2.7 Estimating the Variance of the Error Term
2.7.2 Calculations for the Food Expenditure Data
Table 2.3 Least Squares Residuals
σ̂² = Σ êi² / (N - 2) = Σ êi² / 38

71 2.7 Estimating the Variance of the Error Term
2.7.2 Calculations for the Food Expenditure Data
The estimated variances and covariances for a regression are arrayed in a rectangular array, or matrix, with variances on the diagonal and covariances in the off-diagonal positions:
[ var(b1)       cov(b1, b2) ]
[ cov(b1, b2)   var(b2)     ]
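The matrix arrangement described above can be built directly from the pieces on the previous slides; a Python sketch on hypothetical data (not the food expenditure set):

```python
# Build the 2x2 estimated covariance matrix of (b1, b2): estimated variances
# on the diagonal, the estimated covariance (Eq. 2.22) off the diagonal.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.8, 4.3, 5.9, 8.4, 9.6]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
sxx = sum((xi - x_bar) ** 2 for xi in x)
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
b1 = y_bar - b2 * x_bar
s2 = sum((yi - b1 - b2 * xi) ** 2 for xi, yi in zip(x, y)) / (n - 2)

cov_matrix = [
    [s2 * sum(xi ** 2 for xi in x) / (n * sxx), -s2 * x_bar / sxx],  # b1 row
    [-s2 * x_bar / sxx,                         s2 / sxx],           # b2 row
]
```

As the slide notes, the matrix is symmetric, its diagonal entries are positive, and (with x̄ > 0 here) the off-diagonal covariance is negative.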

72 2.7 Estimating the Variance of the Error Term
2.7.2 Calculations for the Food Expenditure Data
For the food expenditure data the estimated covariance matrix has rows and columns labeled C (the intercept term) and Income.

73 2.7 Estimating the Variance of the Error Term
2.7.3 Interpreting the Standard Errors
The standard errors of b1 and b2 are measures of the sampling variability of the least squares estimates b1 and b2 in repeated samples. The estimators are random variables. As such, they have probability distributions, means, and variances. In particular, if assumption SR6 holds, and the random error terms ei are normally distributed, then:
b2 ~ N(β2, var(b2) = σ² / Σ(xi - x̄)²)

74 2.7 Estimating the Variance of the Error Term
2.7.3 Interpreting the Standard Errors
The estimator variance var(b2), or its square root √var(b2), which we might call the true standard deviation of b2, measures the sampling variation of the estimates b2. The bigger it is, the more variation in the least squares estimates b2 we see from sample to sample. If it is large, then the estimates might change a great deal from sample to sample. If it is small relative to the parameter β2, we know that the least squares estimate will fall near β2 with high probability.

75 2.7 Estimating the Variance of the Error Term
2.7.3 Interpreting the Standard Errors
The question we address with the standard error is: How much variation about their means do the estimates exhibit from sample to sample?

76 2.7 Estimating the Variance of the Error Term
2.7.3 Interpreting the Standard Errors
We estimate σ², and then estimate the standard deviation of b2 using:
se(b2) = √var(b2) = √(σ̂² / Σ(xi - x̄)²)
The standard error of b2 is thus an estimate of what the standard deviation of many estimates b2 would be in a very large number of samples, and is an indicator of the width of the pdf of b2 shown in Figure 2.12.

77 2.7 Estimating the Variance of the Error Term
2.7.3 Interpreting the Standard Errors
Figure 2.12 The probability density function of the least squares estimator b2

78 Key Words

79 Keywords


More information

Multiple Regression Analysis

Multiple Regression Analysis Multiple Regression Analysis y = β 0 + β 1 x 1 + β 2 x 2 +... β k x k + u 2. Inference 0 Assumptions of the Classical Linear Model (CLM)! So far, we know: 1. The mean and variance of the OLS estimators

More information

Section 3.3. How Can We Predict the Outcome of a Variable? Agresti/Franklin Statistics, 1of 18

Section 3.3. How Can We Predict the Outcome of a Variable? Agresti/Franklin Statistics, 1of 18 Section 3.3 How Can We Predict the Outcome of a Variable? Agresti/Franklin Statistics, 1of 18 Regression Line Predicts the value for the response variable, y, as a straight-line function of the value of

More information

Bivariate Relationships Between Variables

Bivariate Relationships Between Variables Bivariate Relationships Between Variables BUS 735: Business Decision Making and Research 1 Goals Specific goals: Detect relationships between variables. Be able to prescribe appropriate statistical methods

More information

Environmental Econometrics

Environmental Econometrics Environmental Econometrics Syngjoo Choi Fall 2008 Environmental Econometrics (GR03) Fall 2008 1 / 37 Syllabus I This is an introductory econometrics course which assumes no prior knowledge on econometrics;

More information

Lecture 34: Properties of the LSE

Lecture 34: Properties of the LSE Lecture 34: Properties of the LSE The following results explain why the LSE is popular. Gauss-Markov Theorem Assume a general linear model previously described: Y = Xβ + E with assumption A2, i.e., Var(E

More information

Statistics for Engineers Lecture 9 Linear Regression

Statistics for Engineers Lecture 9 Linear Regression Statistics for Engineers Lecture 9 Linear Regression Chong Ma Department of Statistics University of South Carolina chongm@email.sc.edu April 17, 2017 Chong Ma (Statistics, USC) STAT 509 Spring 2017 April

More information

Topic 7: HETEROSKEDASTICITY

Topic 7: HETEROSKEDASTICITY Universidad Carlos III de Madrid César Alonso ECONOMETRICS Topic 7: HETEROSKEDASTICITY Contents 1 Introduction 1 1.1 Examples............................. 1 2 The linear regression model with heteroskedasticity

More information

1 Correlation between an independent variable and the error

1 Correlation between an independent variable and the error Chapter 7 outline, Econometrics Instrumental variables and model estimation 1 Correlation between an independent variable and the error Recall that one of the assumptions that we make when proving the

More information

STAT5044: Regression and Anova. Inyoung Kim

STAT5044: Regression and Anova. Inyoung Kim STAT5044: Regression and Anova Inyoung Kim 2 / 47 Outline 1 Regression 2 Simple Linear regression 3 Basic concepts in regression 4 How to estimate unknown parameters 5 Properties of Least Squares Estimators:

More information

Chapter 15 Panel Data Models. Pooling Time-Series and Cross-Section Data

Chapter 15 Panel Data Models. Pooling Time-Series and Cross-Section Data Chapter 5 Panel Data Models Pooling Time-Series and Cross-Section Data Sets of Regression Equations The topic can be introduced wh an example. A data set has 0 years of time series data (from 935 to 954)

More information

The regression model with one fixed regressor cont d

The regression model with one fixed regressor cont d The regression model with one fixed regressor cont d 3150/4150 Lecture 4 Ragnar Nymoen 27 January 2012 The model with transformed variables Regression with transformed variables I References HGL Ch 2.8

More information

Simple Linear Regression for the MPG Data

Simple Linear Regression for the MPG Data Simple Linear Regression for the MPG Data 2000 2500 3000 3500 15 20 25 30 35 40 45 Wgt MPG What do we do with the data? y i = MPG of i th car x i = Weight of i th car i =1,...,n n = Sample Size Exploratory

More information

Econometrics Summary Algebraic and Statistical Preliminaries

Econometrics Summary Algebraic and Statistical Preliminaries Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L

More information

Introduction to Estimation Methods for Time Series models. Lecture 1

Introduction to Estimation Methods for Time Series models. Lecture 1 Introduction to Estimation Methods for Time Series models Lecture 1 Fulvio Corsi SNS Pisa Fulvio Corsi Introduction to Estimation () Methods for Time Series models Lecture 1 SNS Pisa 1 / 19 Estimation

More information

Bias Variance Trade-off

Bias Variance Trade-off Bias Variance Trade-off The mean squared error of an estimator MSE(ˆθ) = E([ˆθ θ] 2 ) Can be re-expressed MSE(ˆθ) = Var(ˆθ) + (B(ˆθ) 2 ) MSE = VAR + BIAS 2 Proof MSE(ˆθ) = E((ˆθ θ) 2 ) = E(([ˆθ E(ˆθ)]

More information

Gov 2000: 9. Regression with Two Independent Variables

Gov 2000: 9. Regression with Two Independent Variables Gov 2000: 9. Regression with Two Independent Variables Matthew Blackwell Fall 2016 1 / 62 1. Why Add Variables to a Regression? 2. Adding a Binary Covariate 3. Adding a Continuous Covariate 4. OLS Mechanics

More information

Inference for Regression Inference about the Regression Model and Using the Regression Line

Inference for Regression Inference about the Regression Model and Using the Regression Line Inference for Regression Inference about the Regression Model and Using the Regression Line PBS Chapter 10.1 and 10.2 2009 W.H. Freeman and Company Objectives (PBS Chapter 10.1 and 10.2) Inference about

More information

A time series plot: a variable Y t on the vertical axis is plotted against time on the horizontal axis

A time series plot: a variable Y t on the vertical axis is plotted against time on the horizontal axis TIME AS A REGRESSOR A time series plot: a variable Y t on the vertical axis is plotted against time on the horizontal axis Many economic variables increase or decrease with time A linear trend relationship

More information

1. Simple Linear Regression

1. Simple Linear Regression 1. Simple Linear Regression Suppose that we are interested in the average height of male undergrads at UF. We put each male student s name (population) in a hat and randomly select 100 (sample). Then their

More information

The general linear regression with k explanatory variables is just an extension of the simple regression as follows

The general linear regression with k explanatory variables is just an extension of the simple regression as follows 3. Multiple Regression Analysis The general linear regression with k explanatory variables is just an extension of the simple regression as follows (1) y i = β 0 + β 1 x i1 + + β k x ik + u i. Because

More information

Graduate Econometrics Lecture 4: Heteroskedasticity

Graduate Econometrics Lecture 4: Heteroskedasticity Graduate Econometrics Lecture 4: Heteroskedasticity Department of Economics University of Gothenburg November 30, 2014 1/43 and Autocorrelation Consequences for OLS Estimator Begin from the linear model

More information

Chapter 2 Multiple Regression I (Part 1)

Chapter 2 Multiple Regression I (Part 1) Chapter 2 Multiple Regression I (Part 1) 1 Regression several predictor variables The response Y depends on several predictor variables X 1,, X p response {}}{ Y predictor variables {}}{ X 1, X 2,, X p

More information

Linear Model Under General Variance

Linear Model Under General Variance Linear Model Under General Variance We have a sample of T random variables y 1, y 2,, y T, satisfying the linear model Y = X β + e, where Y = (y 1,, y T )' is a (T 1) vector of random variables, X = (T

More information

CHAPTER 2: Assumptions and Properties of Ordinary Least Squares, and Inference in the Linear Regression Model

CHAPTER 2: Assumptions and Properties of Ordinary Least Squares, and Inference in the Linear Regression Model CHAPTER 2: Assumptions and Properties of Ordinary Least Squares, and Inference in the Linear Regression Model Prof. Alan Wan 1 / 57 Table of contents 1. Assumptions in the Linear Regression Model 2 / 57

More information

Lecture 9 SLR in Matrix Form

Lecture 9 SLR in Matrix Form Lecture 9 SLR in Matrix Form STAT 51 Spring 011 Background Reading KNNL: Chapter 5 9-1 Topic Overview Matrix Equations for SLR Don t focus so much on the matrix arithmetic as on the form of the equations.

More information

Section 3: Simple Linear Regression

Section 3: Simple Linear Regression Section 3: Simple Linear Regression Carlos M. Carvalho The University of Texas at Austin McCombs School of Business http://faculty.mccombs.utexas.edu/carlos.carvalho/teaching/ 1 Regression: General Introduction

More information

Chapter 1: Linear Regression with One Predictor Variable also known as: Simple Linear Regression Bivariate Linear Regression

Chapter 1: Linear Regression with One Predictor Variable also known as: Simple Linear Regression Bivariate Linear Regression BSTT523: Kutner et al., Chapter 1 1 Chapter 1: Linear Regression with One Predictor Variable also known as: Simple Linear Regression Bivariate Linear Regression Introduction: Functional relation between

More information

Econ 3790: Statistics Business and Economics. Instructor: Yogesh Uppal

Econ 3790: Statistics Business and Economics. Instructor: Yogesh Uppal Econ 3790: Statistics Business and Economics Instructor: Yogesh Uppal Email: yuppal@ysu.edu Chapter 14 Covariance and Simple Correlation Coefficient Simple Linear Regression Covariance Covariance between

More information

Max. Likelihood Estimation. Outline. Econometrics II. Ricardo Mora. Notes. Notes

Max. Likelihood Estimation. Outline. Econometrics II. Ricardo Mora. Notes. Notes Maximum Likelihood Estimation Econometrics II Department of Economics Universidad Carlos III de Madrid Máster Universitario en Desarrollo y Crecimiento Económico Outline 1 3 4 General Approaches to Parameter

More information

Simple and Multiple Linear Regression

Simple and Multiple Linear Regression Sta. 113 Chapter 12 and 13 of Devore March 12, 2010 Table of contents 1 Simple Linear Regression 2 Model Simple Linear Regression A simple linear regression model is given by Y = β 0 + β 1 x + ɛ where

More information

The Simple Regression Model. Simple Regression Model 1

The Simple Regression Model. Simple Regression Model 1 The Simple Regression Model Simple Regression Model 1 Simple regression model: Objectives Given the model: - where y is earnings and x years of education - Or y is sales and x is spending in advertising

More information

3 Multiple Linear Regression

3 Multiple Linear Regression 3 Multiple Linear Regression 3.1 The Model Essentially, all models are wrong, but some are useful. Quote by George E.P. Box. Models are supposed to be exact descriptions of the population, but that is

More information

Ch 2: Simple Linear Regression

Ch 2: Simple Linear Regression Ch 2: Simple Linear Regression 1. Simple Linear Regression Model A simple regression model with a single regressor x is y = β 0 + β 1 x + ɛ, where we assume that the error ɛ is independent random component

More information

Lecture 3: Multiple Regression

Lecture 3: Multiple Regression Lecture 3: Multiple Regression R.G. Pierse 1 The General Linear Model Suppose that we have k explanatory variables Y i = β 1 + β X i + β 3 X 3i + + β k X ki + u i, i = 1,, n (1.1) or Y i = β j X ji + u

More information

Variance. Standard deviation VAR = = value. Unbiased SD = SD = 10/23/2011. Functional Connectivity Correlation and Regression.

Variance. Standard deviation VAR = = value. Unbiased SD = SD = 10/23/2011. Functional Connectivity Correlation and Regression. 10/3/011 Functional Connectivity Correlation and Regression Variance VAR = Standard deviation Standard deviation SD = Unbiased SD = 1 10/3/011 Standard error Confidence interval SE = CI = = t value for

More information

ACE 562 Fall Lecture 2: Probability, Random Variables and Distributions. by Professor Scott H. Irwin

ACE 562 Fall Lecture 2: Probability, Random Variables and Distributions. by Professor Scott H. Irwin ACE 562 Fall 2005 Lecture 2: Probability, Random Variables and Distributions Required Readings: by Professor Scott H. Irwin Griffiths, Hill and Judge. Some Basic Ideas: Statistical Concepts for Economists,

More information

Statistical View of Least Squares

Statistical View of Least Squares Basic Ideas Some Examples Least Squares May 22, 2007 Basic Ideas Simple Linear Regression Basic Ideas Some Examples Least Squares Suppose we have two variables x and y Basic Ideas Simple Linear Regression

More information

Chapter 1. Linear Regression with One Predictor Variable

Chapter 1. Linear Regression with One Predictor Variable Chapter 1. Linear Regression with One Predictor Variable 1.1 Statistical Relation Between Two Variables To motivate statistical relationships, let us consider a mathematical relation between two mathematical

More information

The Simple Linear Regression Model

The Simple Linear Regression Model The Simple Linear Regression Model Lesson 3 Ryan Safner 1 1 Department of Economics Hood College ECON 480 - Econometrics Fall 2017 Ryan Safner (Hood College) ECON 480 - Lesson 3 Fall 2017 1 / 77 Bivariate

More information

Economics 113. Simple Regression Assumptions. Simple Regression Derivation. Changing Units of Measurement. Nonlinear effects

Economics 113. Simple Regression Assumptions. Simple Regression Derivation. Changing Units of Measurement. Nonlinear effects Economics 113 Simple Regression Models Simple Regression Assumptions Simple Regression Derivation Changing Units of Measurement Nonlinear effects OLS and unbiased estimates Variance of the OLS estimates

More information

EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix)

EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) 1 EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) Taisuke Otsu London School of Economics Summer 2018 A.1. Summation operator (Wooldridge, App. A.1) 2 3 Summation operator For

More information

Correlation in Linear Regression

Correlation in Linear Regression Vrije Universiteit Amsterdam Research Paper Correlation in Linear Regression Author: Yura Perugachi-Diaz Student nr.: 2566305 Supervisor: Dr. Bartek Knapik May 29, 2017 Faculty of Sciences Research Paper

More information

Unit 10: Simple Linear Regression and Correlation

Unit 10: Simple Linear Regression and Correlation Unit 10: Simple Linear Regression and Correlation Statistics 571: Statistical Methods Ramón V. León 6/28/2004 Unit 10 - Stat 571 - Ramón V. León 1 Introductory Remarks Regression analysis is a method for

More information

Estimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X.

Estimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X. Estimating σ 2 We can do simple prediction of Y and estimation of the mean of Y at any value of X. To perform inferences about our regression line, we must estimate σ 2, the variance of the error term.

More information

Multivariate Regression Analysis

Multivariate Regression Analysis Matrices and vectors The model from the sample is: Y = Xβ +u with n individuals, l response variable, k regressors Y is a n 1 vector or a n l matrix with the notation Y T = (y 1,y 2,...,y n ) 1 x 11 x

More information

Introduction to Simple Linear Regression

Introduction to Simple Linear Regression Introduction to Simple Linear Regression Yang Feng http://www.stat.columbia.edu/~yangfeng Yang Feng (Columbia University) Introduction to Simple Linear Regression 1 / 68 About me Faculty in the Department

More information

Intermediate Econometrics

Intermediate Econometrics Intermediate Econometrics Heteroskedasticity Text: Wooldridge, 8 July 17, 2011 Heteroskedasticity Assumption of homoskedasticity, Var(u i x i1,..., x ik ) = E(u 2 i x i1,..., x ik ) = σ 2. That is, the

More information

Finding Relationships Among Variables

Finding Relationships Among Variables Finding Relationships Among Variables BUS 230: Business and Economic Research and Communication 1 Goals Specific goals: Re-familiarize ourselves with basic statistics ideas: sampling distributions, hypothesis

More information

SF2930: REGRESION ANALYSIS LECTURE 1 SIMPLE LINEAR REGRESSION.

SF2930: REGRESION ANALYSIS LECTURE 1 SIMPLE LINEAR REGRESSION. SF2930: REGRESION ANALYSIS LECTURE 1 SIMPLE LINEAR REGRESSION. Tatjana Pavlenko 17 January 2018 WHAT IS REGRESSION? INTRODUCTION Regression analysis is a statistical technique for investigating and modeling

More information

Econometrics I Lecture 3: The Simple Linear Regression Model

Econometrics I Lecture 3: The Simple Linear Regression Model Econometrics I Lecture 3: The Simple Linear Regression Model Mohammad Vesal Graduate School of Management and Economics Sharif University of Technology 44716 Fall 1397 1 / 32 Outline Introduction Estimating

More information

REGRESSION ANALYSIS AND INDICATOR VARIABLES

REGRESSION ANALYSIS AND INDICATOR VARIABLES REGRESSION ANALYSIS AND INDICATOR VARIABLES Thesis Submitted in partial fulfillment of the requirements for the award of degree of Masters of Science in Mathematics and Computing Submitted by Sweety Arora

More information

Business Statistics. Lecture 10: Correlation and Linear Regression

Business Statistics. Lecture 10: Correlation and Linear Regression Business Statistics Lecture 10: Correlation and Linear Regression Scatterplot A scatterplot shows the relationship between two quantitative variables measured on the same individuals. It displays the Form

More information

Section 4: Multiple Linear Regression

Section 4: Multiple Linear Regression Section 4: Multiple Linear Regression Carlos M. Carvalho The University of Texas at Austin McCombs School of Business http://faculty.mccombs.utexas.edu/carlos.carvalho/teaching/ 1 The Multiple Regression

More information

Review of Classical Least Squares. James L. Powell Department of Economics University of California, Berkeley

Review of Classical Least Squares. James L. Powell Department of Economics University of California, Berkeley Review of Classical Least Squares James L. Powell Department of Economics University of California, Berkeley The Classical Linear Model The object of least squares regression methods is to model and estimate

More information

An overview of applied econometrics

An overview of applied econometrics An overview of applied econometrics Jo Thori Lind September 4, 2011 1 Introduction This note is intended as a brief overview of what is necessary to read and understand journal articles with empirical

More information

Lecture 16 - Correlation and Regression

Lecture 16 - Correlation and Regression Lecture 16 - Correlation and Regression Statistics 102 Colin Rundel April 1, 2013 Modeling numerical variables Modeling numerical variables So far we have worked with single numerical and categorical variables,

More information

Review of Statistics

Review of Statistics Review of Statistics Topics Descriptive Statistics Mean, Variance Probability Union event, joint event Random Variables Discrete and Continuous Distributions, Moments Two Random Variables Covariance and

More information

Statistics II. Management Degree Management Statistics IIDegree. Statistics II. 2 nd Sem. 2013/2014. Management Degree. Simple Linear Regression

Statistics II. Management Degree Management Statistics IIDegree. Statistics II. 2 nd Sem. 2013/2014. Management Degree. Simple Linear Regression Model 1 2 Ordinary Least Squares 3 4 Non-linearities 5 of the coefficients and their to the model We saw that econometrics studies E (Y x). More generally, we shall study regression analysis. : The regression

More information

LECTURE 5. Introduction to Econometrics. Hypothesis testing

LECTURE 5. Introduction to Econometrics. Hypothesis testing LECTURE 5 Introduction to Econometrics Hypothesis testing October 18, 2016 1 / 26 ON TODAY S LECTURE We are going to discuss how hypotheses about coefficients can be tested in regression models We will

More information

Quantitative Analysis of Financial Markets. Summary of Part II. Key Concepts & Formulas. Christopher Ting. November 11, 2017

Quantitative Analysis of Financial Markets. Summary of Part II. Key Concepts & Formulas. Christopher Ting. November 11, 2017 Summary of Part II Key Concepts & Formulas Christopher Ting November 11, 2017 christopherting@smu.edu.sg http://www.mysmu.edu/faculty/christophert/ Christopher Ting 1 of 16 Why Regression Analysis? Understand

More information