Chapter 1. Linear Regression with One Predictor Variable


1.1 Statistical Relation Between Two Variables

To motivate statistical relationships, let us first consider a mathematical relation between two mathematical variables x and y. This may be represented by a functional relation

y = f(x),    (1)

which says that, given a value of x, there is a unique value of y that can be exactly determined.

For example, the relation between the number of hours (x) a car is driven and the distance (y) travelled may be given by y = cx, where c is the constant speed. There are many examples of such relations in the physical and other sciences; they are known as deterministic or exact relationships. To define a statistical relationship, we replace the mathematical variables by random variables X and Y and add a random error component ɛ representing the deviation from the true relation:

y = f(x) + ɛ    (2)

Here (x, y) represents a typical value of the bivariate random variable (X, Y). Such a relation is also known as a stochastic relation; it models random phenomena where (i) the Y values tend to vary around a smooth function and (ii) there is a random scatter of points around this systematic component. Figure 1.1 presents the plot of heights and weights of 23 students enrolled in last year's offering of STAT 360 (for the data given in Table 1.1).

This graph shows the tendency of the data to vary around a straight line; this tendency of the variation in weights as a function of height is called a linear trend. Since the points do not fall exactly on a straight line, it may be suitable to use a statistical relationship, i.e.

y = β_0 + β_1 x + ɛ

where β_0 and β_1 are unknown constants, x represents height, y represents weight, and ɛ represents a random error. The subject matter of this course is the study of such relationships.
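A relationship of this kind is easy to simulate. The sketch below generates points that scatter around a straight line; the parameter values β_0 = -170, β_1 = 1.4, σ = 5 are made up for illustration and are not estimates from the Table 1.1 data.

```python
import random

random.seed(1)

# Hypothetical (illustrative) parameter values, not from Table 1.1
beta0, beta1, sigma = -170.0, 1.4, 5.0

# Heights (x) in cm; weights (y) scatter around the line beta0 + beta1*x
x = [150 + 2 * i for i in range(23)]
y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]

# The deviations from the systematic component are the random errors
errors = [yi - (beta0 + beta1 * xi) for xi, yi in zip(x, y)]
mean_error = sum(errors) / len(errors)
```

Plotting y against x would show exactly the pattern of Figure 1.1: a linear trend with random scatter around it.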

Figure 1.1: Scatter Plot of the Height-Weight Data of the STAT 360 Class

Table 1.1: Heights and Weights of 23 Students in the STAT 360 Class of 2001 (columns: Student ID, Height (cm), Weight (kg))

1.2 Regression Models

Terminology: Regression. The conditional expectation m(x) = E(Y | X = x) in a bivariate setting is called the regression of Y on X. The term regression was used by Sir Francis Galton in studying the heights of offspring as a function of the heights of their parents, in a paper entitled "Regression towards mediocrity in hereditary stature" (Nature, vol. 15).

In this paper Galton reported on his discovery that the offspring did not resemble their parents in size but tended always to be more mediocre [i.e. more average] than them: to be smaller than the parents if the parents were large, to be larger than the parents if they were very small... Thus the random variable Y may be assumed to vary around its mean m(x) as a function of X, and denoting the random deviation Y − m(x) by ɛ, we can write

Y = m(X) + ɛ    (3)

Note that the probability distribution of ɛ is the conditional probability distribution of Y − m(x) given X = x, so this is essentially the same as Eq. (2). Hence statistical relationships such as these are known as regression models.

Dependent and Independent Variables

The relation y = f(x) implicitly requires studying the changes in y as a function of x, and is sometimes interpreted as a causal relation (i.e. x causes y). This understanding has resulted in calling x the independent variable and y the dependent variable.

Uses of the Regression Relation

The regression model is used for:

Description: simply knowing the nature of the relationship, such as that described by Sir Francis Galton.

Prediction: prediction of Y values (which are random) as a function of some related variable; this is an educated guess. For example, the increase in sales (Y) as a function of advertising expense (X) will be an important quantity for a company to predict. In this context, X is known as the predictor variable and Y as the predictand, or response, variable.

Control: knowledge of the regression relation is used to control Y values. For example, in an industrial process, temperature (X) may be used to control the density (Y) of the finished product. Hence, to produce material of a given average density, the regression relation may be used to determine the proper temperature level.

1.3 Simple Linear Regression Model

Distribution of the Error Unspecified

Let n observations obtained from the bivariate random variable (X, Y) be denoted by (X_i, Y_i), i = 1, 2, ..., n. Then the Simple Linear Regression (SLR) model can be stated as follows:

Y_i = β_0 + β_1 X_i + ɛ_i    (4)

where

Y_i : value of the response (or dependent) variable in the ith trial;
β_0 and β_1 : parameters, known as the regression parameters;
X_i : a known constant, the value of the predictor variable in the ith trial;

ɛ_i : random error term for the ith trial, such that E(ɛ_i) = 0 and Var(ɛ_i) = σ²{ɛ_i} = σ², and such that ɛ_i and ɛ_j for i ≠ j are uncorrelated, so that their covariance is zero, i.e. Cov(ɛ_i, ɛ_j) = σ{ɛ_i, ɛ_j} = 0.

Normal Distribution of the Errors

For theoretical purposes, it is important to assume that the errors are normally distributed; we denote this by ɛ_i i.i.d. N(0, σ²). Note that i.i.d. is short for independent and identically distributed, and zero covariance between two normal random variables implies independence. The model with this extra assumption is known as the normal simple linear regression model.

Some Features of the SLR Model

In the expressions below, expectations are taken as if the X values were fixed; hence these are in fact conditional expectations. This should not create confusion if we assume that the regression relation is used to study the variation in Y for fixed values of X.

(i) Y_i is the sum of a constant and a random variable; hence it is a random variable.
(ii) E(Y_i) = β_0 + β_1 X_i
(iii) Var(Y_i) = σ²{Y_i} = σ²{ɛ_i} = σ²

Hence this model assumes that the mean function is linear in X but the variance function is constant.

(iv) For i ≠ j, the observations Y_i and Y_j are uncorrelated.

The above observations follow from simple rules of expectation and variance.

Meaning of the Regression Parameters

Since E(Y) = β_0 + β_1 X, it is clear that

β_0 = E(Y | X = 0) = intercept of the regression line = mean response when X = 0

and

β_1 = slope of the regression line = change in the average response per unit change in X.

1.4 Estimation of the Regression Function

Method of Least Squares (LS)

When the distribution of the errors is not specified, we need to make the observed errors Y_i − β_0 − β_1 X_i small. The least squares principle provides the best-fitting line to the data by minimizing

Q(β_0, β_1) = Σ_{i=1}^{n} (Y_i − β_0 − β_1 X_i)²    (5)

Note: Other criteria may also be proposed, such as the least absolute deviation (LAD) criterion Σ_{i=1}^{n} |Y_i − β_0 − β_1 X_i|, but LS offers an enormous theoretical simplification, and the resulting estimators have certain desirable properties.

Least Squares Estimators

The analytical solutions for β_0 and β_1, denoted by b_0 and b_1 respectively, are obtained by solving the following simultaneous linear equations, known as the normal equations:

Σ Y_i = n b_0 + b_1 Σ X_i    (6)
Σ X_i Y_i = b_0 Σ X_i + b_1 Σ X_i²    (7)

These can be explicitly solved to give

b_1 = Σ (X_i − X̄)(Y_i − Ȳ) / Σ (X_i − X̄)²    (8)
b_0 = Ȳ − b_1 X̄    (9)
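Formulas (8) and (9) translate directly into code. The following is a minimal sketch on a small made-up data set (the X, Y values below are illustrative, not the Table 1.1 data):

```python
# Least squares estimates from Eqs. (8) and (9), on made-up data.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(X)
xbar = sum(X) / n
ybar = sum(Y) / n

Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(X, Y))  # numerator of (8)
Sxx = sum((xi - xbar) ** 2 for xi in X)                       # denominator of (8)

b1 = Sxy / Sxx         # Eq. (8)
b0 = ybar - b1 * xbar  # Eq. (9)

# The solutions satisfy the normal equations (6) and (7):
lhs6, rhs6 = sum(Y), n * b0 + b1 * sum(X)
lhs7 = sum(xi * yi for xi, yi in zip(X, Y))
rhs7 = b0 * sum(X) + b1 * sum(xi * xi for xi in X)
```

For these illustrative numbers the fitted line works out to approximately Ŷ = 0.14 + 1.96 X.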

Proof. The minimizing equations are

∂Q/∂β_0 = 0    (10)
∂Q/∂β_1 = 0    (11)

It is easy to obtain

∂Q/∂β_0 = −2 Σ (Y_i − β_0 − β_1 X_i)    (12)
∂Q/∂β_1 = −2 Σ X_i (Y_i − β_0 − β_1 X_i)    (13)

Equating these to zero and substituting b_0 and b_1 for β_0 and β_1 respectively, we get

Σ (Y_i − b_0 − b_1 X_i) = 0    (14)
Σ X_i (Y_i − b_0 − b_1 X_i) = 0    (15)

Expanding the summations over the individual terms, we get

Σ Y_i − n b_0 − b_1 Σ X_i = 0    (16)
Σ X_i Y_i − b_0 Σ X_i − b_1 Σ X_i² = 0    (17)

Rearranging the terms gives the normal equations. From the first normal equation, we get

(1/n) Σ Y_i = b_0 + b_1 (1/n) Σ X_i    (18)

or

Ȳ = b_0 + b_1 X̄    (19)

Hence,

b_0 = Ȳ − b_1 X̄.

Substituting this in the second normal equation, we get

Σ X_i Y_i = n X̄ (Ȳ − b_1 X̄) + b_1 Σ X_i²
          = n X̄ Ȳ + b_1 (Σ X_i² − n X̄²)

This gives

b_1 = (Σ X_i Y_i − n X̄ Ȳ) / (Σ X_i² − n X̄²)

Using the facts that

Σ X_i Y_i − n X̄ Ȳ = Σ (X_i − X̄)(Y_i − Ȳ)  and  Σ X_i² − n X̄² = Σ (X_i − X̄)²,

the above expression becomes

b_1 = Σ (X_i − X̄)(Y_i − Ȳ) / Σ (X_i − X̄)²
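The two centering identities used in this last step hold for any data set; here is a quick numerical check with arbitrary made-up numbers:

```python
# Check: sum(Xi*Yi) - n*xbar*ybar == sum((Xi - xbar)*(Yi - ybar))
#  and   sum(Xi**2) - n*xbar**2  == sum((Xi - xbar)**2)
X = [2.0, 4.0, 5.0, 7.0, 9.0]
Y = [1.0, 3.0, 2.0, 6.0, 8.0]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n

num_raw = sum(x * y for x, y in zip(X, Y)) - n * xbar * ybar
num_centered = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))

den_raw = sum(x * x for x in X) - n * xbar ** 2
den_centered = sum((x - xbar) ** 2 for x in X)
```

Both pairs agree (up to floating-point rounding), so either form may be used to compute b_1.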

Example. For the data in Table 1.1, the following computations are obtained:

n = 23; Σ X_i = , Σ Y_i = , Σ X_i Y_i = , Σ X_i² = .

Hence X̄ = /23 = and Ȳ = /23 = . For computing b_1, the numerator is computed as

Σ X_i Y_i − n X̄ Ȳ = Σ X_i Y_i − (Σ X_i)(Σ Y_i)/n

and the denominator as

Σ X_i² − n X̄² = Σ X_i² − (Σ X_i)²/n.

Hence,

b_1 = / = , b_0 = − = .

1.5 Point Estimation of the Mean Response

Let X_h be a typical value of the independent variable at which the mean response E(Y) has to be estimated. Note that this is equivalent to estimating the regression function

E(Y) = β_0 + β_1 X    (20)

at X = X_h. An individual value of Y is known as a response, and E(Y) is known as the mean response. The regression function is linear in the parameters β_0 and β_1; hence its estimate is easily obtained as

Ŷ = β̂_0 + β̂_1 X = b_0 + b_1 X    (21)

For the cases in the study, we call

Ŷ_i = b_0 + b_1 X_i,  i = 1, 2, ..., n,    (22)

the fitted value for the ith case, viewed as the estimate of the mean response for X = X_i.

Example 1.2. For the data in Table 1.1, the estimates b_0 and b_1 were obtained as above, with b_1 = 1.41. Hence the estimated regression function is Ŷ = b_0 + 1.41 X. This estimated regression function is plotted in Figure 1.2. The fitted values are reported in the following table.

Table 1.2: Fitted Values and Residuals for the Height-Weight (2001) Data (columns: Student #, Height (X), Weight (Y), Fitted Value, Residual)

Figure 1.2: Scatter Plot and Fitted Line Plot of the Height-Weight Data of the STAT Class (regression plot of Weight (Y) against Height (X); R-Sq = 55.7%)

The graph shows a good scatter around the fitted line. Suppose that the mean weight of a person of typical height X = 171 cm is desired; the corresponding point estimate is given by Ŷ = b_0 + b_1(171) = 68.23 kg. Table 1.2 also gives the fitted values for all the heights in the data, obtained just by substituting X_i for X in the equation of the fitted line. The table also gives the values of the residuals, which are the differences between the observed values and the fitted values. In general, the ith residual is given by

e_i = Y_i − Ŷ_i    (23)

For the SLR model it can be written as

e_i = Y_i − b_0 − b_1 X_i = (Y_i − Ȳ) − b_1 (X_i − X̄)    (24)

The latter form is useful for theoretical derivations in this course. The residuals are, in some sense, estimates of the errors ɛ_i. They are used to assess the validity of the model as well as to find departures from it.
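Both forms of Eq. (24) produce identical residuals; a minimal sketch with made-up data:

```python
# Residuals computed two ways, per Eq. (24), on made-up data.
X = [1.0, 2.0, 3.0, 4.0]
Y = [1.2, 1.9, 3.2, 3.7]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / sum((x - xbar) ** 2 for x in X)
b0 = ybar - b1 * xbar

# ... via the definition e_i = Y_i - b0 - b1*X_i
e_def = [y - b0 - b1 * x for x, y in zip(X, Y)]
# ... and via the centered form e_i = (Y_i - ybar) - b1*(X_i - xbar)
e_centered = [(y - ybar) - b1 * (x - xbar) for x, y in zip(X, Y)]

max_diff = max(abs(a - b) for a, b in zip(e_def, e_centered))
```

The centered form drops b_0 entirely, which is what makes it convenient for theoretical work.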

1.6 Properties of the Fitted Regression Line

(i) The sum of all the residuals equals zero, i.e.

Σ_{i=1}^{n} e_i = 0    (25)

Note that this implies that the sample mean of the residuals, ē = (1/n) Σ_{i=1}^{n} e_i, is zero. The sample mean being an estimator of the population mean, this is in line with the assumption that E(ɛ) = 0. To prove this, use Eq. (24) and the facts that Σ_{i=1}^{n} (X_i − X̄) = 0 and Σ_{i=1}^{n} (Y_i − Ȳ) = 0.

(ii) The sum of squared residuals Σ (Y_i − Ŷ_i)² is a minimum for the least squares residuals e_i = Y_i − Ŷ_i. Note that this was the requirement in least squares estimation.

(iii) The sum of the observed values Y_i equals the sum of the fitted values Ŷ_i:

Σ_{i=1}^{n} Y_i = Σ_{i=1}^{n} Ŷ_i    (26)

This follows from the first property, as Σ e_i = Σ (Y_i − Ŷ_i) = 0. It implies that the (sample) means of the observed values and of the fitted values are the same, namely Ȳ.

(iv) The sum of the weighted residuals is zero when the residuals are weighted by the corresponding level of the predictor variable:

Σ X_i e_i = 0    (27)

To prove this, we see that

Σ X_i e_i = Σ X_i {(Y_i − Ȳ) − b_1 (X_i − X̄)}
          = Σ X_i (Y_i − Ȳ) − b_1 Σ X_i (X_i − X̄)    (28)

Furthermore,

S_xy = Σ (X_i − X̄)(Y_i − Ȳ) = Σ X_i (Y_i − Ȳ) − X̄ Σ (Y_i − Ȳ) = Σ X_i (Y_i − Ȳ),

since Σ (Y_i − Ȳ) = 0, and similarly S_xx = Σ (X_i − X̄)² = Σ X_i (X_i − X̄). Hence Eq. (28) becomes

Σ X_i e_i = S_xy − b_1 S_xx

Using the formula b_1 = S_xy / S_xx, the above equation becomes

Σ X_i e_i = S_xy − (S_xy / S_xx) S_xx = S_xy − S_xy = 0

(v) The sum of the weighted residuals is zero when the residuals are weighted by the corresponding fitted values:

Σ Ŷ_i e_i = 0    (29)

This easily follows, as Σ Ŷ_i e_i = b_0 Σ e_i + b_1 Σ X_i e_i, together with the facts (proved earlier) that Σ e_i = 0 and Σ X_i e_i = 0.

(vi) The (fitted) regression line always passes through the point (X̄, Ȳ). Substituting X = X̄, we find that

Ŷ = b_0 + b_1 X̄ = Ȳ − b_1 X̄ + b_1 X̄ = Ȳ,

which proves this property.

Notes:

(i) Property (i) follows from the first normal equation, as Σ e_i = Σ (Y_i − b_0 − b_1 X_i) = Σ Y_i − n b_0 − b_1 Σ X_i = 0.

(ii) Property (iv) follows from the second normal equation, as Σ X_i e_i = Σ X_i Y_i − b_0 Σ X_i − b_1 Σ X_i² = 0.
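All of these properties can be checked numerically for any fitted line. A sketch with made-up data (the X, Y values below are illustrative):

```python
# Numerical check of properties (i) and (iii)-(vi) of the fitted line, on made-up data.
X = [1.0, 2.0, 4.0, 5.0, 8.0]
Y = [2.0, 4.0, 5.0, 8.0, 11.0]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / sum((x - xbar) ** 2 for x in X)
b0 = ybar - b1 * xbar

fitted = [b0 + b1 * x for x in X]
e = [y - f for y, f in zip(Y, fitted)]

sum_e = sum(e)                                    # property (i):   0
sum_fit = sum(fitted)                             # property (iii): equals sum(Y)
sum_xe = sum(x * ei for x, ei in zip(X, e))       # property (iv):  0
sum_fe = sum(f * ei for f, ei in zip(fitted, e))  # property (v):   0
at_xbar = b0 + b1 * xbar                          # property (vi):  equals ybar
```

Each quantity is zero (or equal to its claimed counterpart) up to floating-point rounding, regardless of the data used.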

(iii) If the data are transformed as Y → y = Y − Ȳ and X → x = X − X̄, the fitted equation becomes

ŷ = b_1 x    (30)

where ŷ = Ŷ − Ȳ. It is clear that this equation passes through the point (0, 0), which is a consequence of shifting the origin to the point (X̄, Ȳ).

1.7 Estimation of the Error Variance

In general, variation can be estimated by the squared deviations of observations from the mean, or from an estimate of the mean. For example, for observations Y_1, Y_2, ..., Y_n from a normal population N(µ, σ²), the unbiased estimator of σ² is given by

σ̂² = (1/n) Σ_{i=1}^{n} (Y_i − µ)², if µ is known;
s² = (1/(n−1)) Σ_{i=1}^{n} (Y_i − Ȳ)², if µ is unknown.

In other words, the estimate of σ² is a sum of squared deviations divided by its degrees of freedom: n if µ is known, and n − 1 if µ is estimated by Ȳ. In the regression model, the deviation of the observation Y_i from its mean m(X_i) = β_0 + β_1 X_i is approximated by

e_i = Y_i − Ŷ_i = Y_i − b_0 − b_1 X_i,

and the corresponding sum of squares, denoted by SSE for Sum of Squares due to Error, is given by

SSE = Σ_{i=1}^{n} (Y_i − Ŷ_i)² = Σ_{i=1}^{n} e_i²    (31)

The corresponding degrees of freedom is n − 2 (two degrees of freedom are lost in estimating the two parameters β_0 and β_1). This gives rise to the following estimator of σ²:

MSE = SSE / (n − 2) = Σ_{i=1}^{n} e_i² / (n − 2)    (32)

where MSE stands for Mean Square due to Error. It will be proved later that MSE is unbiased for σ².

Example. The estimate of the error variance for the data of Table 1.1 is obtained as follows. The sum of squared errors (residuals) is SSE = , based on n = 23 observations. Hence, MSE = /21 = .
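SSE and MSE from Eqs. (31) and (32) in code; the data are made up for illustration (the Table 1.1 totals are not reproduced here):

```python
# SSE (Eq. 31) and MSE (Eq. 32) for a least squares fit, on made-up data.
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
Y = [1.1, 2.3, 2.8, 4.2, 4.9, 6.3]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / sum((x - xbar) ** 2 for x in X)
b0 = ybar - b1 * xbar

residuals = [y - b0 - b1 * x for x, y in zip(X, Y)]
SSE = sum(r * r for r in residuals)  # Eq. (31)
MSE = SSE / (n - 2)                  # Eq. (32): n - 2 degrees of freedom
```

The divisor n − 2 rather than n reflects the two estimated parameters, exactly as n − 1 replaces n when µ is estimated by Ȳ.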

1.8 Normal Error Regression Model

The information about the parameters is carried by the distribution of the observations Y_1, ..., Y_n. For the normal error regression model, ɛ_1, ..., ɛ_n are independent and normally distributed with zero mean and variance σ². This implies that Y_1, ..., Y_n are also normal and independent, with Y_i ~ N(β_0 + β_1 X_i, σ²). The probability density function of Y_i is given by

f_i(Y_i) = (1/(σ√(2π))) exp{ −(1/(2σ²)) (Y_i − m(X_i))² }
         = (1/(σ√(2π))) exp{ −(1/(2σ²)) (Y_i − β_0 − β_1 X_i)² }

Since Y_1, ..., Y_n are independent, their joint probability density function is given by

f(Y_1, ..., Y_n) = f_1(Y_1) f_2(Y_2) ... f_n(Y_n),

and the likelihood function L(β_0, β_1, σ²) is given by

L(β_0, β_1, σ²) = f(Y_1, ..., Y_n)
  = Π_{i=1}^{n} (1/(σ√(2π))) exp{ −(1/(2σ²)) (Y_i − β_0 − β_1 X_i)² }
  = (1/(σ√(2π)))^n exp{ −(1/(2σ²)) Σ_{i=1}^{n} (Y_i − β_0 − β_1 X_i)² }    (33)

For finding the maximum likelihood estimators of β_0, β_1, σ², the likelihood function has to be maximized. Equivalently, we maximize the log-likelihood function

log_e L = −(n/2) log_e(2π) − (n/2) log_e σ² − (1/(2σ²)) Σ_{i=1}^{n} (Y_i − β_0 − β_1 X_i)²    (34)

Maximum Likelihood Estimators of the Parameters

The maximum likelihood estimators are obtained by solving the following three equations:

∂ log_e L / ∂β_0 = 0,  ∂ log_e L / ∂β_1 = 0,  ∂ log_e L / ∂σ² = 0.

These partial derivatives are given by

∂ log_e L / ∂β_0 = (1/σ²) Σ_{i=1}^{n} (Y_i − β_0 − β_1 X_i)
∂ log_e L / ∂β_1 = (1/σ²) Σ_{i=1}^{n} X_i (Y_i − β_0 − β_1 X_i)
∂ log_e L / ∂σ² = −(n/(2σ²)) + (1/(2σ⁴)) Σ_{i=1}^{n} (Y_i − β_0 − β_1 X_i)²

Replacing β_0, β_1, σ² by β̂_0, β̂_1, σ̂², after a little simplification we obtain

Σ_{i=1}^{n} (Y_i − β̂_0 − β̂_1 X_i) = 0,    (35)
Σ_{i=1}^{n} X_i (Y_i − β̂_0 − β̂_1 X_i) = 0,    (36)
σ̂² = Σ_{i=1}^{n} (Y_i − β̂_0 − β̂_1 X_i)² / n.    (37)

Note that equations (35) and (36) are the two normal equations obtained by the least squares method. Hence the maximum likelihood estimators of β_0 and β_1 are the same as b_0 and b_1, respectively, whereas the MLE of σ² is given by

σ̂² = Σ_{i=1}^{n} (Y_i − b_0 − b_1 X_i)² / n = Σ_{i=1}^{n} e_i² / n    (38)

Note that the MLE of σ² is biased, as

E(σ̂²) = E( ((n − 2)/n) MSE ) = ((n − 2)/n) σ²    (39)
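Since the MLEs of β_0 and β_1 coincide with b_0 and b_1, the only difference from least squares is the divisor in the variance estimate: σ̂² = SSE/n versus MSE = SSE/(n − 2). A sketch with made-up data:

```python
# MLE of sigma^2 (Eq. 38) versus the unbiased MSE, on made-up data.
X = [1.0, 3.0, 4.0, 6.0, 7.0, 9.0, 10.0]
Y = [2.1, 4.0, 5.2, 6.8, 8.1, 10.2, 11.1]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / sum((x - xbar) ** 2 for x in X)
b0 = ybar - b1 * xbar  # MLEs of beta0, beta1 coincide with least squares

SSE = sum((y - b0 - b1 * x) ** 2 for x, y in zip(X, Y))
sigma2_mle = SSE / n     # Eq. (38): biased MLE
MSE = SSE / (n - 2)      # unbiased estimator
ratio = sigma2_mle / MSE # equals (n - 2)/n, the bias factor in Eq. (39)
```

The ratio of the two estimates is always (n − 2)/n, so the MLE systematically underestimates σ², although the bias vanishes as n grows.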

The following output was obtained using MINITAB (available in the Math and Stat department PC lab) with the height-weight data for this class. (The data can be downloaded by following the links from chaubey, in either Excel or text format, and can subsequently be copied and pasted into a MINITAB worksheet.) Use the Stat > Regression > Regression menu to obtain the following output. (MINITAB ignores missing data, denoted by *.)

Regression Analysis: Weight versus Height

The regression equation is
Weight = Height

52 cases used; 6 cases contain missing values

Predictor       Coef    SE Coef    T    P
Constant
Height

S =    R-Sq = 68.7%    R-Sq(adj) = 68.1%

Analysis of Variance

Source            DF    SS    MS    F    P
Regression
Residual Error
Total

Notes:

1. If the missing weights are substituted by the fitted values and the regression is run again, the same results are obtained. To store the fitted values and residuals, use the STORAGE option by checking the appropriate boxes.

2. To obtain a fitted line plot, use the Stat > Regression > Fitted Line Plot menu in MINITAB.

3. Regression output may also be obtained from EXCEL using Tools > Data Analysis > Regression.


More information

Statistics for Managers using Microsoft Excel 6 th Edition

Statistics for Managers using Microsoft Excel 6 th Edition Statistics for Managers using Microsoft Excel 6 th Edition Chapter 13 Simple Linear Regression 13-1 Learning Objectives In this chapter, you learn: How to use regression analysis to predict the value of

More information

Linear Models in Machine Learning

Linear Models in Machine Learning CS540 Intro to AI Linear Models in Machine Learning Lecturer: Xiaojin Zhu jerryzhu@cs.wisc.edu We briefly go over two linear models frequently used in machine learning: linear regression for, well, regression,

More information

Correlation & Simple Regression

Correlation & Simple Regression Chapter 11 Correlation & Simple Regression The previous chapter dealt with inference for two categorical variables. In this chapter, we would like to examine the relationship between two quantitative variables.

More information

STA 302 H1F / 1001 HF Fall 2007 Test 1 October 24, 2007

STA 302 H1F / 1001 HF Fall 2007 Test 1 October 24, 2007 STA 302 H1F / 1001 HF Fall 2007 Test 1 October 24, 2007 LAST NAME: SOLUTIONS FIRST NAME: STUDENT NUMBER: ENROLLED IN: (circle one) STA 302 STA 1001 INSTRUCTIONS: Time: 90 minutes Aids allowed: calculator.

More information

Regression Estimation Least Squares and Maximum Likelihood

Regression Estimation Least Squares and Maximum Likelihood Regression Estimation Least Squares and Maximum Likelihood Dr. Frank Wood Frank Wood, fwood@stat.columbia.edu Linear Regression Models Lecture 3, Slide 1 Least Squares Max(min)imization Function to minimize

More information

Inference for Regression Simple Linear Regression

Inference for Regression Simple Linear Regression Inference for Regression Simple Linear Regression IPS Chapter 10.1 2009 W.H. Freeman and Company Objectives (IPS Chapter 10.1) Simple linear regression p Statistical model for linear regression p Estimating

More information

LAB 5 INSTRUCTIONS LINEAR REGRESSION AND CORRELATION

LAB 5 INSTRUCTIONS LINEAR REGRESSION AND CORRELATION LAB 5 INSTRUCTIONS LINEAR REGRESSION AND CORRELATION In this lab you will learn how to use Excel to display the relationship between two quantitative variables, measure the strength and direction of the

More information

The simple linear regression model discussed in Chapter 13 was written as

The simple linear regression model discussed in Chapter 13 was written as 1519T_c14 03/27/2006 07:28 AM Page 614 Chapter Jose Luis Pelaez Inc/Blend Images/Getty Images, Inc./Getty Images, Inc. 14 Multiple Regression 14.1 Multiple Regression Analysis 14.2 Assumptions of the Multiple

More information

Econometrics I KS. Module 1: Bivariate Linear Regression. Alexander Ahammer. This version: March 12, 2018

Econometrics I KS. Module 1: Bivariate Linear Regression. Alexander Ahammer. This version: March 12, 2018 Econometrics I KS Module 1: Bivariate Linear Regression Alexander Ahammer Department of Economics Johannes Kepler University of Linz This version: March 12, 2018 Alexander Ahammer (JKU) Module 1: Bivariate

More information

Section 4: Multiple Linear Regression

Section 4: Multiple Linear Regression Section 4: Multiple Linear Regression Carlos M. Carvalho The University of Texas at Austin McCombs School of Business http://faculty.mccombs.utexas.edu/carlos.carvalho/teaching/ 1 The Multiple Regression

More information

3. Diagnostics and Remedial Measures

3. Diagnostics and Remedial Measures 3. Diagnostics and Remedial Measures So far, we took data (X i, Y i ) and we assumed where ɛ i iid N(0, σ 2 ), Y i = β 0 + β 1 X i + ɛ i i = 1, 2,..., n, β 0, β 1 and σ 2 are unknown parameters, X i s

More information

Inferences for Regression

Inferences for Regression Inferences for Regression An Example: Body Fat and Waist Size Looking at the relationship between % body fat and waist size (in inches). Here is a scatterplot of our data set: Remembering Regression In

More information

Simple Linear Regression

Simple Linear Regression Simple Linear Regression ST 370 Regression models are used to study the relationship of a response variable and one or more predictors. The response is also called the dependent variable, and the predictors

More information

Fitting a regression model

Fitting a regression model Fitting a regression model We wish to fit a simple linear regression model: y = β 0 + β 1 x + ɛ. Fitting a model means obtaining estimators for the unknown population parameters β 0 and β 1 (and also for

More information

Correlation Analysis

Correlation Analysis Simple Regression Correlation Analysis Correlation analysis is used to measure strength of the association (linear relationship) between two variables Correlation is only concerned with strength of the

More information

Chapter 2. Continued. Proofs For ANOVA Proof of ANOVA Identity. the product term in the above equation can be simplified as n

Chapter 2. Continued. Proofs For ANOVA Proof of ANOVA Identity. the product term in the above equation can be simplified as n Chapter 2. Continued Proofs For ANOVA Proof of ANOVA Identity We are going to prove that Writing SST SSR + SSE. Y i Ȳ (Y i Ŷ i ) + (Ŷ i Ȳ ) Squaring both sides summing over all i 1,...n, we get (Y i Ȳ

More information

Lectures on Simple Linear Regression Stat 431, Summer 2012

Lectures on Simple Linear Regression Stat 431, Summer 2012 Lectures on Simple Linear Regression Stat 43, Summer 0 Hyunseung Kang July 6-8, 0 Last Updated: July 8, 0 :59PM Introduction Previously, we have been investigating various properties of the population

More information

Analysis of Bivariate Data

Analysis of Bivariate Data Analysis of Bivariate Data Data Two Quantitative variables GPA and GAES Interest rates and indices Tax and fund allocation Population size and prison population Bivariate data (x,y) Case corr&reg 2 Independent

More information

Master s Written Examination

Master s Written Examination Master s Written Examination Option: Statistics and Probability Spring 05 Full points may be obtained for correct answers to eight questions Each numbered question (which may have several parts) is worth

More information

Y i = η + ɛ i, i = 1,...,n.

Y i = η + ɛ i, i = 1,...,n. Nonparametric tests If data do not come from a normal population (and if the sample is not large), we cannot use a t-test. One useful approach to creating test statistics is through the use of rank statistics.

More information

Chapter 16. Simple Linear Regression and dcorrelation

Chapter 16. Simple Linear Regression and dcorrelation Chapter 16 Simple Linear Regression and dcorrelation 16.1 Regression Analysis Our problem objective is to analyze the relationship between interval variables; regression analysis is the first tool we will

More information

Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014

Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014 Ph.D. Qualifying Exam Friday Saturday, January 3 4, 2014 Put your solution to each problem on a separate sheet of paper. Problem 1. (5166) Assume that two random samples {x i } and {y i } are independently

More information

Association studies and regression

Association studies and regression Association studies and regression CM226: Machine Learning for Bioinformatics. Fall 2016 Sriram Sankararaman Acknowledgments: Fei Sha, Ameet Talwalkar Association studies and regression 1 / 104 Administration

More information

Section 3: Simple Linear Regression

Section 3: Simple Linear Regression Section 3: Simple Linear Regression Carlos M. Carvalho The University of Texas at Austin McCombs School of Business http://faculty.mccombs.utexas.edu/carlos.carvalho/teaching/ 1 Regression: General Introduction

More information

Homework 2: Simple Linear Regression

Homework 2: Simple Linear Regression STAT 4385 Applied Regression Analysis Homework : Simple Linear Regression (Simple Linear Regression) Thirty (n = 30) College graduates who have recently entered the job market. For each student, the CGPA

More information

Applied Econometrics (QEM)

Applied Econometrics (QEM) Applied Econometrics (QEM) The Simple Linear Regression Model based on Prinicples of Econometrics Jakub Mućk Department of Quantitative Economics Jakub Mućk Applied Econometrics (QEM) Meeting #2 The Simple

More information

Formal Statement of Simple Linear Regression Model

Formal Statement of Simple Linear Regression Model Formal Statement of Simple Linear Regression Model Y i = β 0 + β 1 X i + ɛ i Y i value of the response variable in the i th trial β 0 and β 1 are parameters X i is a known constant, the value of the predictor

More information

Multiple Linear Regression

Multiple Linear Regression Multiple Linear Regression Simple linear regression tries to fit a simple line between two variables Y and X. If X is linearly related to Y this explains some of the variability in Y. In most cases, there

More information

MA 575 Linear Models: Cedric E. Ginestet, Boston University Midterm Review Week 7

MA 575 Linear Models: Cedric E. Ginestet, Boston University Midterm Review Week 7 MA 575 Linear Models: Cedric E. Ginestet, Boston University Midterm Review Week 7 1 Random Vectors Let a 0 and y be n 1 vectors, and let A be an n n matrix. Here, a 0 and A are non-random, whereas y is

More information

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata'

Business Statistics. Tommaso Proietti. Linear Regression. DEF - Università di Roma 'Tor Vergata' Business Statistics Tommaso Proietti DEF - Università di Roma 'Tor Vergata' Linear Regression Specication Let Y be a univariate quantitative response variable. We model Y as follows: Y = f(x) + ε where

More information

STAT2201 Assignment 6

STAT2201 Assignment 6 STAT2201 Assignment 6 Question 1 Regression methods were used to analyze the data from a study investigating the relationship between roadway surface temperature (x) and pavement deflection (y). Summary

More information

Simple Linear Regression for the MPG Data

Simple Linear Regression for the MPG Data Simple Linear Regression for the MPG Data 2000 2500 3000 3500 15 20 25 30 35 40 45 Wgt MPG What do we do with the data? y i = MPG of i th car x i = Weight of i th car i =1,...,n n = Sample Size Exploratory

More information

Statistical Techniques II EXST7015 Simple Linear Regression

Statistical Techniques II EXST7015 Simple Linear Regression Statistical Techniques II EXST7015 Simple Linear Regression 03a_SLR 1 Y - the dependent variable 35 30 25 The objective Given points plotted on two coordinates, Y and X, find the best line to fit the data.

More information

2.4.3 Estimatingσ Coefficient of Determination 2.4. ASSESSING THE MODEL 23

2.4.3 Estimatingσ Coefficient of Determination 2.4. ASSESSING THE MODEL 23 2.4. ASSESSING THE MODEL 23 2.4.3 Estimatingσ 2 Note that the sums of squares are functions of the conditional random variables Y i = (Y X = x i ). Hence, the sums of squares are random variables as well.

More information

Bias Variance Trade-off

Bias Variance Trade-off Bias Variance Trade-off The mean squared error of an estimator MSE(ˆθ) = E([ˆθ θ] 2 ) Can be re-expressed MSE(ˆθ) = Var(ˆθ) + (B(ˆθ) 2 ) MSE = VAR + BIAS 2 Proof MSE(ˆθ) = E((ˆθ θ) 2 ) = E(([ˆθ E(ˆθ)]

More information

Simple linear regression

Simple linear regression Simple linear regression Biometry 755 Spring 2008 Simple linear regression p. 1/40 Overview of regression analysis Evaluate relationship between one or more independent variables (X 1,...,X k ) and a single

More information

BIOS 2083 Linear Models c Abdus S. Wahed

BIOS 2083 Linear Models c Abdus S. Wahed Chapter 5 206 Chapter 6 General Linear Model: Statistical Inference 6.1 Introduction So far we have discussed formulation of linear models (Chapter 1), estimability of parameters in a linear model (Chapter

More information

The Simple Linear Regression Model

The Simple Linear Regression Model The Simple Linear Regression Model Lesson 3 Ryan Safner 1 1 Department of Economics Hood College ECON 480 - Econometrics Fall 2017 Ryan Safner (Hood College) ECON 480 - Lesson 3 Fall 2017 1 / 77 Bivariate

More information

Section 4.6 Simple Linear Regression

Section 4.6 Simple Linear Regression Section 4.6 Simple Linear Regression Objectives ˆ Basic philosophy of SLR and the regression assumptions ˆ Point & interval estimation of the model parameters, and how to make predictions ˆ Point and interval

More information

Categorical Predictor Variables

Categorical Predictor Variables Categorical Predictor Variables We often wish to use categorical (or qualitative) variables as covariates in a regression model. For binary variables (taking on only 2 values, e.g. sex), it is relatively

More information

Lecture 6 Multiple Linear Regression, cont.

Lecture 6 Multiple Linear Regression, cont. Lecture 6 Multiple Linear Regression, cont. BIOST 515 January 22, 2004 BIOST 515, Lecture 6 Testing general linear hypotheses Suppose we are interested in testing linear combinations of the regression

More information

Matrix Approach to Simple Linear Regression: An Overview

Matrix Approach to Simple Linear Regression: An Overview Matrix Approach to Simple Linear Regression: An Overview Aspects of matrices that you should know: Definition of a matrix Addition/subtraction/multiplication of matrices Symmetric/diagonal/identity matrix

More information

Ch 13 & 14 - Regression Analysis

Ch 13 & 14 - Regression Analysis Ch 3 & 4 - Regression Analysis Simple Regression Model I. Multiple Choice:. A simple regression is a regression model that contains a. only one independent variable b. only one dependent variable c. more

More information

Regression Models - Introduction

Regression Models - Introduction Regression Models - Introduction In regression models, two types of variables that are studied: A dependent variable, Y, also called response variable. It is modeled as random. An independent variable,

More information

STAT 511. Lecture : Simple linear regression Devore: Section Prof. Michael Levine. December 3, Levine STAT 511

STAT 511. Lecture : Simple linear regression Devore: Section Prof. Michael Levine. December 3, Levine STAT 511 STAT 511 Lecture : Simple linear regression Devore: Section 12.1-12.4 Prof. Michael Levine December 3, 2018 A simple linear regression investigates the relationship between the two variables that is not

More information

ECON The Simple Regression Model

ECON The Simple Regression Model ECON 351 - The Simple Regression Model Maggie Jones 1 / 41 The Simple Regression Model Our starting point will be the simple regression model where we look at the relationship between two variables In

More information

Multiple Regression Examples

Multiple Regression Examples Multiple Regression Examples Example: Tree data. we have seen that a simple linear regression of usable volume on diameter at chest height is not suitable, but that a quadratic model y = β 0 + β 1 x +

More information

Linear Regression. In this problem sheet, we consider the problem of linear regression with p predictors and one intercept,

Linear Regression. In this problem sheet, we consider the problem of linear regression with p predictors and one intercept, Linear Regression In this problem sheet, we consider the problem of linear regression with p predictors and one intercept, y = Xβ + ɛ, where y t = (y 1,..., y n ) is the column vector of target values,

More information

Simple Linear Regression

Simple Linear Regression Simple Linear Regression September 24, 2008 Reading HH 8, GIll 4 Simple Linear Regression p.1/20 Problem Data: Observe pairs (Y i,x i ),i = 1,...n Response or dependent variable Y Predictor or independent

More information

Øving 8. STAT111 Sondre Hølleland Auditorium π April Oppgaver

Øving 8. STAT111 Sondre Hølleland Auditorium π April Oppgaver Øving 8 STAT111 Sondre Hølleland Auditorium π 4 11. April 2016 Oppgaver Section 12.1: 9, 11 Section 12.2: 13, 23, 24 a. b. Fasit Section 12.1: 9. a)0.095 b) 0.475 c) 0.83 og 1.305 d) 0.4207 og 0.3446 e)0.0036

More information

Ch 3: Multiple Linear Regression

Ch 3: Multiple Linear Regression Ch 3: Multiple Linear Regression 1. Multiple Linear Regression Model Multiple regression model has more than one regressor. For example, we have one response variable and two regressor variables: 1. delivery

More information