Linear Regression with One Regressor


AIM QA.7.1 Explain how regression analysis in econometrics measures the relationship between dependent and independent variables.

A regression analysis has the goal of measuring how changes in one variable, called the dependent or explained variable, can be explained by changes in one or more other variables, called the independent or explanatory variables. The regression analysis measures the relationship by estimating an equation (e.g., a linear regression model). The parameters of the equation indicate the relationship.

AIM QA.7.2 Interpret a population regression function, regression coefficients, parameters, slope, intercept, and the error term.

The general form of the linear regression model is:

Y_i = β0 + β1 X_i + ε_i

where
- the subscript i runs over observations, i = 1, ..., n;
- Y_i is the dependent variable, the regressand, or simply the left-hand variable;
- X_i is the independent variable, the regressor, or simply the right-hand variable;
- β0 + β1 X_i is the population regression line or population regression function;
- β0 is the intercept of the population regression line (it represents the value of Y if X is zero);
- β1 is the slope of the population regression line (it measures the change in Y for a one-unit change in X); and
- ε_i is the error term or noise component; its expected value is zero.

[Figure: Y (dependent variable) plotted against X (independent variable) with the population regression line Y = β0 + β1 X; β0 is the intercept, β1 the slope, and ε_i the vertical distance from an observation to the line.]
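The population regression function can be illustrated by simulating data from a known line; a minimal sketch in Python, where the coefficients β0 = 2, β1 = 0.5 and the sample size are arbitrary choices for illustration:

```python
import random

random.seed(0)

beta0, beta1 = 2.0, 0.5   # illustrative population intercept and slope
n = 10_000

# Y_i = beta0 + beta1 * X_i + eps_i, with E(eps_i) = 0
xs = [random.uniform(0, 10) for _ in range(n)]
eps = [random.gauss(0, 1) for _ in range(n)]
ys = [beta0 + beta1 * x + e for x, e in zip(xs, eps)]

# In a large sample the error term averages out to roughly zero
print(sum(eps) / n)
```

Each observed Y_i deviates from the population line β0 + β1 X_i only by its error term, which is centered at zero.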

AIM QA.7.3 Interpret a sample regression function, regression coefficients, parameters, slope, intercept, and the error term.

The general form of the sample regression function is:

Y_i = b0 + b1 X_i + e_i

where the sample regression coefficients are b0 and b1, which are the intercept and slope. The fitted value is Ŷ_i = b0 + b1 X_i, and e_i is called the residual:

e_i = Y_i − (b0 + b1 X_i)

Sample statistics (b0 and b1) are computed as estimates of the parameters β0 and β1.

AIM QA.7.4 Describe the key properties of a linear regression.

The assumptions of the classical normal linear regression model are:
1. A linear relation exists between the dependent variable and the independent variable.
2. The independent variable is uncorrelated with the error terms.
3. The expected value of the error term is zero: E(ε_i) = 0.
4. Homoskedasticity: the variance of the error term is the same for all observations. V(ε_i) = σ_ε² for all i.
5. No serial correlation of the error terms: the error term is independent across observations. Corr(ε_i, ε_j) = 0 for i ≠ j.
6. The error term is normally distributed.

AIM QA.7.5 Describe an ordinary least squares (OLS) regression and calculate the intercept and slope of the regression.

AIM QA.7.6 Describe the method and the three key assumptions of OLS for estimation of parameters.

AIM QA.7.7 Summarize the benefits of using OLS estimators.

AIM QA.7.8 Describe the properties of OLS estimators and their sampling distributions, and explain the properties of consistent estimators in general.

AIM QA.7.9 Interpret the explained sum of squares (ESS), the total sum of squares (TSS), the residual sum of squares (RSS), the standard error of the regression (SER), and the regression R².

AIM QA.7.10 Interpret the results of an OLS regression.

The mistake made in predicting the ith observation is

Y_i − Ŷ_i = Y_i − (b0 + b1 X_i)

The sum of these squared prediction mistakes over all n observations is

Σ_{i=1}^{n} (Y_i − b0 − b1 X_i)²

The estimators of the intercept and slope that minimize the sum of squared mistakes are called the ordinary least squares (OLS) estimators of β0 and β1. The OLS estimators of the slope β1 and the intercept β0 are

β̂1 = Σ (X_i − X̄)(Y_i − Ȳ) / Σ (X_i − X̄)² = S_XY / S_XX

or, equivalently,

β̂1 = (Σ X_i Y_i − n X̄ Ȳ) / (Σ X_i² − n X̄²)

β̂0 = Ȳ − β̂1 X̄

The OLS predicted values Ŷ_i and residuals ε̂_i are

Ŷ_i = β̂0 + β̂1 X_i,  i = 1, ..., n
ε̂_i = Y_i − Ŷ_i,  i = 1, ..., n

The estimated intercept (β̂0), slope (β̂1), and residuals (ε̂_i) are computed from a sample of n observations of X_i and Y_i, i = 1, ..., n. These are estimates of the unknown true population intercept (β0), slope (β1), and error term (ε_i).
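These formulas can be computed directly in a few lines; a minimal sketch in pure Python, using made-up data points chosen for illustration:

```python
# OLS slope and intercept via the deviation-from-mean formulas:
#   b1 = sum((Xi - Xbar) * (Yi - Ybar)) / sum((Xi - Xbar)**2)
#   b0 = Ybar - b1 * Xbar
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]   # roughly y = 2x

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
s_xx = sum((x - x_bar) ** 2 for x in xs)

b1 = s_xy / s_xx          # OLS slope estimate
b0 = y_bar - b1 * x_bar   # OLS intercept estimate

y_hat = [b0 + b1 * x for x in xs]                  # predicted values
residuals = [y - yh for y, yh in zip(ys, y_hat)]   # OLS residuals

print(b0, b1)   # intercept and slope
```

A useful check on the algebra: the OLS residuals always sum to zero when the regression includes an intercept, because b0 is chosen so that the fitted line passes through the point (X̄, Ȳ).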

The Least Squares Assumptions:
(1) The error term ε_i has conditional mean zero given X_i: E(ε_i | X_i) = 0;
(2) (X_i, Y_i), i = 1, ..., n, are independent and identically distributed (i.i.d.) draws from their joint distribution;
(3) Large outliers are unlikely: X_i and Y_i have nonzero finite fourth moments.

The explained sum of squares (ESS), also known as the sum of squares due to regression (SSR), is the sum of the squared deviations of the predicted values Ŷ_i from their average:

ESS = Σ (Ŷ_i − Ȳ)²

The total sum of squares (TSS), also known as the sum of squares total (SST), is the sum of the squared deviations of Y_i from its average:

TSS = Σ (Y_i − Ȳ)²

The residual sum of squares (RSS), also known as the sum of squared errors (SSE), is the sum of the squared OLS residuals:

RSS = Σ (Y_i − Ŷ_i)²

Total sum of squares = Explained sum of squares + Residual sum of squares:

Σ (Y_i − Ȳ)² = Σ (Ŷ_i − Ȳ)² + Σ (Y_i − Ŷ_i)²
TSS = ESS + RSS

The R², or coefficient of determination, is the fraction of the sample variance of Y_i explained by (or predicted by) X_i:

R² = ESS / TSS = 1 − RSS / TSS

The standard error of the regression (SER) is an estimator of the standard deviation of the regression error ε_i:

SER = √( RSS / (n − 2) )
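The sums of squares, R², and SER can be verified numerically from a hand-rolled OLS fit; a minimal sketch in Python with made-up data:

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(xs)

# OLS fit via the deviation-from-mean formulas
x_bar, y_bar = sum(xs) / n, sum(ys) / n
b1 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
      / sum((x - x_bar) ** 2 for x in xs))
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in xs]

tss = sum((y - y_bar) ** 2 for y in ys)                # total sum of squares
ess = sum((yh - y_bar) ** 2 for yh in y_hat)           # explained sum of squares
rss = sum((y - yh) ** 2 for y, yh in zip(ys, y_hat))   # residual sum of squares

r2 = ess / tss                   # equivalently 1 - rss / tss
ser = math.sqrt(rss / (n - 2))   # standard error of the regression

print(r2, ser)
```

The decomposition TSS = ESS + RSS holds exactly (up to floating-point rounding) for any OLS fit with an intercept, which is what makes the two expressions for R² equivalent.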