Time Series Analysis: Smoothing Time Series, Accounting for Seasonality, and Exploiting "Serial Correlation"


Time Series Analysis

This (not surprisingly) concerns the analysis of data collected over time: weekly values, monthly values, quarterly values, yearly values, etc. Usually the intent is to discern whether there is some pattern in the values collected to date, with the intention of short term forecasting (to use as the basis of business decisions). We will write

    y_t = response of interest at time t

(we usually think of these as equally spaced in clock time). Standard analyses of business time series involve:

1) smoothing/trend assessment
2) assessment of/accounting for seasonality
3) assessment of/exploiting "serial correlation"

These are usually/most effectively done on a scale where the "local" variation in y_t is approximately constant.

Smoothing Time Series

There are various fairly simple smoothing/averaging methods. Two are "ordinary moving averages" and "exponentially weighted moving averages."

Ordinary Moving Averages

For a "span" of k periods,

    ỹ_t = moving average through time t = (y_t + y_{t-1} + ... + y_{t-k+1}) / k

Where seasonal effects are expected, it is standard to use k = number of periods per cycle.

Exponentially Weighted Moving Averages

These weight observations less heavily as one moves back in time from the current period. They are typically computed "recursively" as

    ỹ_t = exponentially weighted moving average at time t = w·y_t + (1 - w)·ỹ_{t-1}

(ỹ_{t-1} is the EWMA from the previous period, and the current EWMA is a compromise between the previous EWMA and the current observation.) One must start this recursion somewhere, and it is common to take ỹ_1 = y_1. Notice that w = 1 does no smoothing, while w = 0 smooths so much that the EWMA never changes (i.e. all the values are equal to the first).

Exercise/Example  Table 13.1 (page 13-5) of the text gives quarterly retail sales for JC Penney (in millions of dollars). "By hand," 1) using k = 4, find the ordinary moving average for period 8, then 2) using (e.g.) w = .3, find the exponentially weighted moving average value for that period. (Finish the table.)
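
A computational companion (not part of the original notes): a minimal Python sketch of both smoothers. The sales array is a hypothetical placeholder; substitute the quarterly figures from Table 13.1.

    import numpy as np

    def moving_average(y, k):
        """Ordinary moving average with span k: the mean of the k most recent values.
        The first k-1 entries are NaN because no full span is available there."""
        y = np.asarray(y, dtype=float)
        out = np.full(len(y), np.nan)
        for t in range(k - 1, len(y)):
            out[t] = y[t - k + 1 : t + 1].mean()
        return out

    def ewma(y, w):
        """EWMA recursion: smooth_t = w*y_t + (1-w)*smooth_{t-1}, started at the first value."""
        y = np.asarray(y, dtype=float)
        out = np.empty(len(y))
        out[0] = y[0]
        for t in range(1, len(y)):
            out[t] = w * y[t] + (1 - w) * out[t - 1]
        return out

    # Hypothetical placeholder values, not the Table 13.1 data.
    sales = [4452.0, 4507.0, 5537.0, 8157.0, 6101.0, 6335.0, 7280.0, 9466.0]
    print(moving_average(sales, k=4))
    print(ewma(sales, w=0.3))

With k = 4, the first span-4 moving average available is the one through period 4; the period 8 value asked for in the exercise is simply the average of y_5 through y_8.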

The table to be finished has columns

    t | y_t | span k = 4 MA | w = .3 EWMA

and the EWMA column is built up recursively, for example

    .3(4507) + .7(4452) = 4469
    .3(5537) + .7(4469) = 4789
    .3(8157) + .7(4789) ≈ 5799

A plot of both the original time series and the k = 4 MA values for the JC Penney data is in Figure 13.13 of the text. Here is a JMP "Overlay Plot" version of this picture and an indication of how you can get JMP to make the MAs.

Figure 1: JC Penney Sales and k = 4 MA Series

Figure 2: JMP "Column Formula" for JC Penney MAs

Computation of EWMAs in JMP doesn't appear to be simple. Here is a plot of 3 EWMA series for the JC Penney sales data that shows the effect of changing w on how much smoothing is done. The most jagged plot is the (red) raw data plot (w = 1.0). The (green) w = .5 EWMA plot is smoother. The (blue) w = .3 EWMA plot is smoother still. The (black) w = .1 plot is smoothest.

Figure 3: EWMAs for JC Penney Sales Data
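
Outside JMP, pandas happens to provide both smoothers directly: rolling(window=k).mean() gives the span-k ordinary moving average, and ewm(alpha=w, adjust=False).mean() implements exactly the recursion w·y_t + (1 - w)·(previous EWMA) started at the first observation. A short sketch on a placeholder series (not the JC Penney data):

    import pandas as pd

    # Placeholder values -- substitute the quarterly JC Penney sales.
    y = pd.Series([4452.0, 4507.0, 5537.0, 8157.0, 6101.0, 6335.0, 7280.0, 9466.0])

    ma4 = y.rolling(window=4).mean()          # span k = 4 ordinary moving average
    ewmas = {w: y.ewm(alpha=w, adjust=False).mean() for w in (1.0, 0.5, 0.3, 0.1)}

    print(ma4.round(0).tolist())
    for w, series in ewmas.items():
        print("w =", w, series.round(0).tolist())   # smaller w gives a smoother series

As in Figure 3, w = 1.0 reproduces the raw series, and the EWMA becomes progressively smoother as w drops toward 0.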

There are other more sophisticated smoothing methods available in statistical software. JMP provides "splines."

JMP Cubic Spline Smoothers

These are available using the "Fit Y by X" procedure in JMP. They have a "stiffness knob" that lets one adjust how much "wiggling" the smoothed curve can do. Here are several splines fit to the JC Penney sales data. The "stiffness knob" is the parameter λ.

Figure 4: Splines Fit to the JC Penney Data

JMP will store the smoothed values obtained from these spline smoothers (just as it will store predicted values from regressions) in the original data table, if one clicks on the appropriate red triangle and chooses that option.

Typically one wants to "smooth" a time series in order to make forecasts/projections into the future. The MA, EWMA, and spline smoothers don't really provide forecasts beyond projecting a current value ỹ_t to the next period, period t+1. A possibility for smoothing that provides forecasts other than a current smoothed value is to fit a simple curve to the series using regression, where the "x" variable is "t" (that is, the data vectors are (1, y_1), (2, y_2), ...). It is particularly easy to fit "low order" polynomials (lines, parabolas, etc.) to such data using JMP. These provide extrapolations beyond the end of the data set.

JMP Fitting of (Low Order) Polynomial Trends to Time Series

These are again conveniently available using the "Fit Y by X" procedure in JMP. (Conceptually, one could also use the multiple regression procedure "Fit Model" after adding columns to the data table for powers of t. But we'll use the more elegant "Fit Y by X" method.) Below is a JMP graphic for linear and quadratic (1st and 2nd order polynomial) fits to the JC Penney time series. NOTICE that the extrapolations to a 25th period from these two polynomials will be quite different! The two fitted equations are of the forms

    ŷ_t = b_0 + b_1·t

and

    ŷ_t = c_0 + c_1·t + c_2·(t - 12.5)^2

where the numerical coefficients can be read from the JMP report in Figure 5.
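
For readers working outside JMP, here is a minimal sketch (on a synthetic stand-in series, not the actual JC Penney data) of fitting linear and quadratic trends by least squares and extrapolating one period past the end of the data:

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(1, 25)                      # periods 1..24
    # Synthetic stand-in: upward trend, a 4th-quarter bump, and noise.
    y = 4000 + 180 * t + 2500 * (t % 4 == 0) + rng.normal(0, 300, size=t.size)

    # Least-squares fits of a line and a parabola in t
    # (numpy returns coefficients from the highest power down).
    line = np.polyfit(t, y, deg=1)
    quad = np.polyfit(t, y, deg=2)

    # Extrapolate both fits one period beyond the data, to period 25.
    print("linear forecast for t = 25:   ", np.polyval(line, 25))
    print("quadratic forecast for t = 25:", np.polyval(quad, 25))

As the notes emphasize, the two extrapolations generally disagree, and the disagreement grows the further one projects past the end of the series.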

4 uses the "Fit Model" procedure, one can save the formula for fitted equation and get JMP to automatically compute forecasts into the future by adding rows to the data table with hypothetical t s in them.) Figure 5: JMP Fit of Linear and Parabolic Trends to the JC Penney Data As one moves from a line, to a parabola, to a cubic, etc., fitted polynomials will be allowed to be more and more wiggly, doing a better and better job of hitting the plotted points, but becoming less and less believable in terms of forecasts. The happiest circumstance is that where a simple straight line/linear trend seems to provide an adequate summary of the main movement of the time series. and ŷ 25 = (25) ( ) 2 = 7851 Exercise/Example Compute "by hand" the linear and quadratic forecasts of y 25 (the sales for the period immediately after the end of the data set) for the JC Penney sales based on the JMP fitted equations. These (quite different) forecasts are ŷ 25 = (25) = Accounting for/adjusting for Seasonality The linear trend fit to the JC Penney data misses the seasonality in the data. Mostly, the straight line in Figure 5 "over-predicts" in the first 3 quarters of each year and "under-predicts" in the 4th quarter of each year. (t = 1, 5, 9, 13, 17, 21 are "first quarter" periods, t =2, 6, 10, 14, 18, 22 are "second quarter" periods, etc.) It is well known that retail sales are typically best in the 4th quarter, where the Christmas season spurs consumer buying. 16

It makes sense in the analysis of business and economic time series to try to adjust smoothed values (and forecasts) in light of seasonal effects. Here we'll consider several ways of doing this.

Simple Arithmetic and "Additive" Adjustment for Seasonal Effects

One simple way of trying to account for seasonality is to look at all periods of a given type (e.g. 1st quarter periods where data are quarterly, or all June figures where data are monthly) and compute an average deviation of the original time series from the smoothed or fitted values in those periods. That average can then be added to smoothed values or forecasts from a smooth curve in order to account for seasonality.

Simple Arithmetic and "Multiplicative" Adjustment for Seasonal Effects

A second simple way of trying to account for seasonality is to look at all periods of a given type (e.g. 1st quarter periods where data are quarterly, or all June figures where data are monthly) and compute an average ratio of the actual values to the smoothed or fitted values in those periods. That average can then be used as a multiplier for smoothed values or forecasts from a smooth curve in order to account for seasonality.

Example  The table below gives simple computation of "additive" and "multiplicative" seasonality factors for the 1st quarter JC Penney sales, based on the linear trend fit to the data and pictured in Figure 5. Its columns are

    Period t | y_t | ŷ_t | y_t - ŷ_t | y_t / ŷ_t

for the first-quarter periods t = 1, 5, 9, 13, 17, 21. Then note that the average y_t - ŷ_t is -3180/6 = -530 and the average y_t/ŷ_t is .9228. So fitted values or forecasts from the line fit to the JC Penney data could be adjusted by either addition of -530 or multiplication by .9228. For example, the forecast for period 25 (the first period after the end of the data in hand, and a first quarter) from the linear fit in Figure 5 alone is 8872. This could be adjusted for the seasonality as either

    ŷ_25 = 8872 + (-530) = 8342

(making use of an "additive" seasonality adjustment) or as

    ŷ_25 = 8872(.9228) = 8187

(making use of a "multiplicative" seasonality adjustment).
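
A minimal sketch of the same seasonal-factor arithmetic, assuming the observed values and the linear-trend fitted values are available as arrays (the series here is a synthetic placeholder, not the JC Penney data):

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(1, 25)
    quarter = ((t - 1) % 4) + 1                        # 1, 2, 3, 4, 1, 2, ...
    # Synthetic placeholder series.
    y = 4000 + 180 * t + 2500 * (quarter == 4) + rng.normal(0, 300, size=t.size)

    b1, b0 = np.polyfit(t, y, deg=1)                   # slope, intercept of the trend line
    yhat = b0 + b1 * t

    q1 = quarter == 1                                  # all 1st-quarter periods
    additive_factor = np.mean(y[q1] - yhat[q1])        # average deviation y_t - yhat_t
    multiplicative_factor = np.mean(y[q1] / yhat[q1])  # average ratio y_t / yhat_t

    trend_forecast_25 = b0 + b1 * 25                   # period 25 is the next 1st quarter
    print(trend_forecast_25 + additive_factor)         # additive adjustment
    print(trend_forecast_25 * multiplicative_factor)   # multiplicative adjustment

With the actual series and the fitted line of Figure 5 in place of the placeholders, this is exactly the -530 and .9228 computation above.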

Exercise  The table below gives the 4th quarter values and fitted values from the line fit to the JC Penney data, in the same format as the table above (columns Period t, y_t, ŷ_t, y_t - ŷ_t, y_t/ŷ_t for t = 4, 8, 12, 16, 20, 24). Finish the calculations, get additive and multiplicative seasonality factors, and use them to make 4th quarter forecasts for the year following the end of the data (this is period t = 28, and the linear fit alone projects sales of ŷ_28 = 9228). Making an additive adjustment,

    ŷ_28 = 9228 + ( ___ ) = ___

and making a multiplicative adjustment,

    ŷ_28 = 9228 · ( ___ ) = ___

The U.S. government reports values of all kinds of economic time series. In many cases, both "raw" and "seasonally adjusted" versions of these are announced. That is, not only does the government announce a value of "housing starts," but it also announces a value of "seasonally adjusted housing starts." If SF is a multiplicative seasonality factor for the particular month under discussion, this means that both

    housing starts

and

    seasonally adjusted housing starts = housing starts / SF

are reported.

Using Dummy Variables in MLR to Account for Seasonality

A more sophisticated and convenient means of creating (additive) seasonality adjustments is to employ dummy variables in a multiple linear regression. That is, if there are k seasons, one can think of making up k - 1 dummy variables x_1, x_2, ..., x_{k-1} where for period t

    x_{j,t} = 1 if period t is from season j, and 0 otherwise

and then using these in a multiple linear regression, fitting (for example)

    y_t ≈ b_0 + b_1·t + a_1·x_{1,t} + a_2·x_{2,t} + ... + a_{k-1}·x_{k-1,t}

The next figure shows the set-up of a JMP data table for the JC Penney data to make use of this idea.

Figure 6: JMP Data Table Prepared for Using MLR to Account for Seasonality
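
The "linear trend plus season dummies" fit can also be sketched with ordinary least squares outside JMP. The series below is a synthetic placeholder for the quarterly sales, and making the 4th quarter the baseline season is a choice of this sketch, not necessarily the coding JMP uses internally:

    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(1, 25)
    quarter = ((t - 1) % 4) + 1
    # Synthetic placeholder series.
    y = 4000 + 180 * t + 2500 * (quarter == 4) + rng.normal(0, 300, size=t.size)

    # Design matrix: intercept, trend t, and k - 1 = 3 quarter dummies (quarter 4 omitted).
    X = np.column_stack(
        [np.ones(t.size), t] + [(quarter == q).astype(float) for q in (1, 2, 3)]
    )
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    b0, b1, a1, a2, a3 = coef

    print(b0 + b1 * 25 + a1)   # forecast for period 25, a 1st quarter
    print(b0 + b1 * 28)        # forecast for period 28, a 4th quarter (the baseline here)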

What this method does is let the "intercept" of a linear trend in y_t change from season to season. A "cartoon" showing how this works for the case where there are k = 4 seasons is below.

Figure 7: Cartoon for Dummy Variables and Seasonality (k = 4 Seasons)

To fit such a "linear trend plus season dummies" model to time series data, one can employ a multiple linear regression program. JMP's "Fit Model" routine incorporates such a program. The JMP "Fit Model" dialogue box and resulting report for the JC Penney data follow.

Figure 8: JMP "Fit Model" Dialogue Box for Using Dummies to Account for Seasonality

The report shows that fitted values for 4th quarter periods t are of the form

    ŷ_t = b_0 + b_1·t

and, for example, fitted values for 1st quarter periods are of the form

    ŷ_t = (b_0 + a_1) + b_1·t

(the numerical coefficients appear in the JMP report in Figure 9). So, for example, 25th period sales (the first quarter immediately after the end of the data set) would be forecast as

    ŷ_25 = (b_0 + a_1) + b_1(25) = 8073

and 28th period sales (4th quarter sales for the year after the data ends) would be forecast as

    ŷ_28 = b_0 + b_1(28)

a figure a bit over 10,000 that can be computed from the coefficients in the report.

Figure 9: JMP Report for Fitting Linear Trend Plus Seasonal Dummies to the JC Penney Data

Using Serial Correlation (in Residuals) To Improve Predictions

Sometimes "trend plus seasonal effect" is all the information carried by a time series. But there are also many cases where yet more information can be extracted from the time series to improve on "trend plus seasonal effect" forecasts. This involves using residuals

    e_t = y_t - ŷ_t

(for ŷ_t the "fitted trend plus seasonal effect" values for the data in hand). If residuals look like random draws from a fixed universe, then there is nothing left in them to exploit. But sometimes they exhibit "serial correlation" that allows us to effectively predict a given residual from previous ones. That is, sometimes the pairs (e_{t-1}, e_t) show some linear relationship that can be exploited. When that can be done, predictions of future residuals can be added to "trend plus seasonal" forecasts for future periods. Figure 10 shows the residuals and "lag 1 residuals" for the linear trend plus seasonal fit to the JC Penney sales data in the data table.

Figure 10: Residuals e_t and Lag 1 Residuals e_{t-1} for the JC Penney Data

Next, there are 3 plots. In the first, e_t is plotted against t, and in the second, e_t is plotted against e_{t-1}. These plots (in Figures 11 and 12) show the same thing in different terms: there is a time pattern in the residuals, so consecutive residuals tend to be big (positive) together and small (negative) together. That is because the fitted model over-predicts early in the data set and late in the data set, and under-predicts in the middle of the data set. That can also be seen if one looks carefully at the third plot, of both y_t versus t and ŷ_t versus t (Figure 13).

Figure 11: Plot of Residuals versus Period for the JC Penney Data

Figure 12: Plot of Residual e_t versus Lag 1 Residual e_{t-1} for the JC Penney Data

The pattern in Figure 12 suggests that one might predict a residual from the immediately preceding residual using some form of regression. Figure 14 shows that simple linear regression of residuals on lag 1 residuals gives a fitted equation expressing ê_t in terms of e_{t-1} (the estimated coefficients appear in the JMP report in Figure 14). Notice that this means that from the last point in the JC Penney data set (period 24) it is possible to predict the residual at period 25, since the residual for period 24 will then be known: one simply plugs e_24 into the fitted equation to get ê_25.

Figure 13: JC Penney Sales and Fitted Sales

Figure 14: JMP Report for SLR of Residual on Lag 1 Residual

In fact, this line of thinking suggests that we can improve on the forecast of y_25 based solely on linear trend plus seasonal (ŷ_25 = 8073) by using ŷ_25 + ê_25. Looking in the data table of Figure 10, we find the residual in the final period of the data set, e_24, and plugging it into the fitted equation gives

    ê_25 = -506

so that what might be an improved forecast for period 25 is

    8073 + (-506) = 7567
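
A minimal sketch of this residual adjustment outside JMP, again on a synthetic stand-in series rather than the JC Penney data: fit trend plus quarter dummies, regress e_t on e_{t-1}, and add the predicted residual to the trend-plus-seasonal forecast for period 25.

    import numpy as np

    rng = np.random.default_rng(3)
    t = np.arange(1, 25)
    quarter = ((t - 1) % 4) + 1
    # Synthetic placeholder series whose noise is serially correlated on purpose.
    noise = np.zeros(t.size)
    for i in range(1, t.size):
        noise[i] = 0.6 * noise[i - 1] + rng.normal(0, 200)
    y = 4000 + 180 * t + 2500 * (quarter == 4) + noise

    # Trend plus quarter dummies (quarter 4 as baseline), as in the earlier sketch.
    X = np.column_stack(
        [np.ones(t.size), t] + [(quarter == q).astype(float) for q in (1, 2, 3)]
    )
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ coef                                   # residuals from the fit

    g1, g0 = np.polyfit(e[:-1], e[1:], deg=1)          # SLR of e_t on e_{t-1}
    e25_hat = g0 + g1 * e[-1]                          # predicted residual for period 25

    x25 = np.array([1.0, 25.0, 1.0, 0.0, 0.0])         # period 25 is a 1st quarter
    print(x25 @ coef + e25_hat)                        # trend + seasonal + residual adjustment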

The basic idea of predicting residuals from previous residuals can be carried even further. One can try predicting a residual on the basis of not only the immediately preceding one, but the immediately preceding two (or more). That is, it is possible to regress e_t on e_{t-1} and e_{t-2} in order to come up with a way of forecasting a next residual (and therefore improving a trend plus seasonal forecast). We will not show any details here (for one thing because the idea doesn't really offer any improvement in the JC Penney example), but the idea should be clear.

Case Study: JMP Airline Passenger Count Data

In the "Sample Data" provided with a JMP installation are some time series data. "Seriesg.jmp" gives 12 years' worth of monthly airline passenger counts taken from the time series book of Box and Jenkins. (The data are from January 1949 through December 1960, and the counts are in thousands of passengers.) This data set can be used to admirably demonstrate the topics discussed here. (Although we have made use of the JC Penney data set for illustrative purposes, it is far smaller than the minimum size that should really be used in a time series analysis. The length 144 airline passenger data set is closer to being of practical size for reliable development of forecasts.) Figure 15 is a plot of the raw passenger counts versus time.

Figure 15: Airline Passenger Counts Time Series

Figure 15 has a feature that is common to many economic time series of any appreciable length. Namely, as time goes on, the "local" or short term variation seems to increase as the general level of the count increases. Besides, it looks like the general trend of count versus time may not be linear, but rather have some upward curvature. It is far easier to fit and forecast series that don't have these features. So what we can do is to try to transform the raw counts, fit and forecast with the transformed series, and then "untransform" to make final interpretations. That is, we will analyze the (base 10) logarithms of passenger counts

    y_t = log_10(passenger count at period t)

Figure 16 is a plot of y_t, and it happily looks "better" than the original series in Figure 15 for purposes of fitting and forecasting.

Figure 16: Logarithms of Passenger Counts

A first step in analysis of the y_t series is perhaps to see how a linear trend does at describing the data. We can use JMP to do SLR and fit a line to the (t, y_t) values and save the predictions. These can then be plotted using "Overlay Plot" along with the original series to get Figure 17.

Figure 17: Linear Trend Fit to y_t Series

Of course, the linear trend ignores the seasonality in the time series. Since these are monthly data, we could define 11 monthly indicator variables. But that would be tedious, and happily the JMP data table (partially pictured in Figure 18) has the month information coded into it in the form of a "nominal" variable "Season." Since "Season" is a "nominal" variable (indicated by the red cap N), if we tell JMP's "Fit Model" routine to use it in a multiple regression, it will automatically use the single nominal variable to create 12 - 1 = 11 dummy variables for all but one of the values of "Season." That is, we may fill in the "Fit Model" dialogue box as in Figure 19 to get fitted values for the "linear trend plus seasonal" model.

Figure 18: Partial JMP Data Table for the Airline Passenger Data
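
The automatic dummy coding JMP does from the nominal "Season" column can be imitated with pandas; the data frame below is a synthetic stand-in for the JMP table, and its column names ("t", "Season", "log_count") are this sketch's own, not JMP's:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    t = np.arange(1, 145)                               # 144 months
    names = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
             "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
    season = pd.Categorical.from_codes((t - 1) % 12, categories=names)
    # Synthetic log counts with an upward trend and a seasonal wave.
    log_count = (2.0 + 0.004 * t + 0.08 * np.sin(2 * np.pi * (t - 1) / 12)
                 + rng.normal(0, 0.02, t.size))
    df = pd.DataFrame({"t": t, "Season": season, "log_count": log_count})

    # One nominal column becomes 12 - 1 = 11 dummy columns (the first level is dropped).
    dummies = pd.get_dummies(df["Season"], drop_first=True).astype(float)
    X = pd.concat([pd.Series(1.0, index=df.index, name="intercept"), df["t"], dummies], axis=1)

    # Linear trend plus 11 month dummies, fit by least squares.
    coef, *_ = np.linalg.lstsq(X.to_numpy(dtype=float), df["log_count"].to_numpy(), rcond=None)
    print(dict(zip(X.columns, np.round(coef, 4))))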

A partial JMP report for the fitting indicated in Figure 19 is shown in Figure 20. A plot of the fitted values for the linear trend plus seasonal model is shown in Figure 21.

Figure 19: JMP "Fit Model" Dialogue Box for Linear Trend Plus Seasonal Fit

Figure 20: Partial JMP Report for Linear Trend Plus Seasonal Fit to y_t

Figure 21: Linear Trend Plus Seasonal Fit to Logarithms of Passenger Counts

Of course, the fit indicated in Figure 21 is better than the one in Figure 17. And the forecasts provided by the regression model can be extended into the "future" (beyond t = 144, which represents the last point in the data set). But there is even more that can be done if one considers the nature of the residuals from the regression fit. Figure 22 shows a plot of the residuals e_t = y_t - ŷ_t versus t, and Figure 23 shows that there is a fair amount of correlation between residuals and lagged residuals. (This is no surprise given the nature of the plot in Figure 22, where "slow" trends in the residuals make ones close together in time similar in value.)

Figure 22: Plot of Residuals versus t for the Log Passenger Counts

Figure 23: Residuals, Lag 1 Residuals, and Lag 2 Residuals for Log Passenger Counts

It is possible (by examining regressions of residuals on lagged residuals) to come to the conclusion that in terms of predicting residuals from earlier residuals it suffices to simply use the single previous one (nothing important is gained by using the two previous ones). And in fact, for this problem, an appropriate prediction equation (coming from SLR of e_t on e_{t-1}) expresses ê_t as a simple linear function of e_{t-1}. This can be used to adjust the fits/predictions from the linear trend plus seasonal model of log counts as

    (adjusted fit)_t = ŷ_t + ê_t

These are plotted along with the original series and the earlier fitted values ŷ_t in Figure 24. There is a small, but clearly discernible, improvement in the quality of the modeling provided by this adjustment for serial correlation in the residuals.
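
The whole airline-data workflow (log transform, linear trend plus month dummies, lag-1 residual adjustment, back-transform) can be sketched compactly; the counts below are synthetic placeholders for the Box and Jenkins series, so the printed forecast is illustrative only:

    import numpy as np

    rng = np.random.default_rng(5)
    t = np.arange(1, 145)                               # 144 months
    month = (t - 1) % 12                                # 0 = January, ..., 11 = December
    # Synthetic placeholder counts with growth and a multiplicative seasonal pattern.
    counts = (100 * 1.01 ** t
              * (1 + 0.2 * np.sin(2 * np.pi * month / 12))
              * rng.lognormal(0.0, 0.02, t.size))

    y = np.log10(counts)                                # work on the log scale

    # Linear trend plus 11 month dummies (January as the baseline month).
    X = np.column_stack(
        [np.ones(t.size), t] + [(month == m).astype(float) for m in range(1, 12)]
    )
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ coef                                    # residuals e_t

    g1, g0 = np.polyfit(e[:-1], e[1:], deg=1)           # SLR of e_t on e_{t-1}
    e145_hat = g0 + g1 * e[-1]                          # predicted residual for period 145

    x145 = np.concatenate([[1.0, 145.0], np.zeros(11)]) # period 145 is a January
    y145_hat = x145 @ coef + e145_hat                   # adjusted forecast of the log count
    print(10 ** y145_hat)                               # back on the passenger-count scale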

Figure 24: Original Values y_t, Fitted Values ŷ_t, and Adjusted Fitted Values ŷ_t + ê_t

Notice then that an adjusted forecast of log passenger count for period t = 145 (the January/Season 1 following the end of the data set) becomes ŷ_145 + ê_145, obtained by evaluating the fitted "trend plus seasonal" equation at t = 145 and adding the predicted residual. This figure is (of course) on a log scale. We may "untransform" this value in order to get a forecast for a passenger count (as opposed to a log passenger count). This is

    10^(ŷ_145 + ê_145) = 454

In fact, it is worthwhile to see a final plot that compares the original series of counts to the whole set of values

    10^(ŷ_t + ê_t)

that function as fitted values on the original (count) scale. This is shown in Figure 25 (including the value for period 145, whose plotted symbol is larger than the others and represents a forecast beyond the end of the original data set).

Figure 25: Plot of Passenger Counts and Final Fitted Values 10^(ŷ_t + ê_t) (Including a Forecast for t = 145)
