Tracking Accuracy: An Essential Step to Improve Your Forecasting Process

Tracking Accuracy: An Essential Step to Improve Your Forecasting Process
Presented by Eric Stellwagen, President & Co-founder, Business Forecast Systems, Inc.
estellwagen@forecastpro.com
Business Forecast Systems, Inc., 68 Leonard Street, Belmont, MA 02478 USA
(617) 484-5050
www.forecastpro.com

On-Demand Webinars & Materials: A recording of today's Webinar will be posted next week on forecastpro.com along with the slide set (in .pdf format). All previously presented Webinars are archived and available for viewing on-demand at forecastpro.com. Attendees will receive an email notifying them when the recording and materials are available.

Eric Stellwagen is President, CEO & Co-founder of Business Forecast Systems, Inc. and co-author of the Forecast Pro product line. He has more than 30 years of dedicated business forecasting experience, served on the board of directors of the International Institute of Forecasters for 12 years, and currently serves on the practitioner advisory board of Foresight: The International Journal of Applied Forecasting.

What We'll Cover: Introductions; Why Track Accuracy?; How Do We Measure Error?; How Do We Track Accuracy?; How Do We Spot Problems?; Summary; Q&A

Why Track Forecast Accuracy? To improve your forecasting process: forecasting should be a continuous improvement process, and improving it requires knowing what's working and what's not. To gain insight into expected performance. To benchmark. To spot problems early.

How do we measure error?

Form of Error Measurement: Tracking forecast accuracy requires measuring forecast error. Error measurements generally take one of three forms: percentage-based measurements, unit-based measurements and relative measurements.

MAPE and MAD: MAPE (Mean Absolute Percent Error) tells you the average error size as a percent. MAD (Mean Absolute Deviation) tells you the average error size in units.

Error Measurements: MAPE. The MAPE (Mean Absolute Percent Error) measures the average size of the error in percentage terms. Step 1: Calculate the absolute size of the error in each forecast period. Step 2: Calculate the size of the error as a percentage of the actual. Step 3: Take the average percent error across periods.

Month | Actual | Forecast | Absolute Error | Absolute % Error
1     | 112.3  | 124.7    | 12.4           | 11.0%
2     | 108.4  | 103.7    | 4.7            | 4.3%
3     | 148.9  | 116.6    | 32.3           | 21.7%
4     | 117.4  | 78.5     | 38.9           | 33.1%
MAPE: 17.6%

Error Measurements: MAD. The MAD (Mean Absolute Deviation) measures the average size of the error in units. Mean = average of; Absolute = magnitude of (it doesn't matter whether the error is positive or negative); Deviation = the error. Step 1: Calculate the absolute size of the error in each forecast period. Step 2: Take the average across periods.

Month | Actual | Forecast | Absolute Error
1     | 112.3  | 124.7    | 12.4
2     | 108.4  | 103.7    | 4.7
3     | 148.9  | 116.6    | 32.3
4     | 117.4  | 78.5     | 38.9
MAD: 22.08
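To make the arithmetic concrete, here is a minimal Python sketch (not Forecast Pro code) that reproduces the MAPE and MAD from the four-period example above; the actuals and forecasts are taken straight from the tables.

```python
# Minimal sketch: MAPE and MAD for the four-period example above.
actuals   = [112.3, 108.4, 148.9, 117.4]
forecasts = [124.7, 103.7, 116.6, 78.5]

abs_errors     = [abs(a - f) for a, f in zip(actuals, forecasts)]   # 12.4, 4.7, 32.3, 38.9
abs_pct_errors = [e / a for e, a in zip(abs_errors, actuals)]       # 11.0%, 4.3%, 21.7%, 33.1%

mad  = sum(abs_errors) / len(abs_errors)          # average error size in units
mape = sum(abs_pct_errors) / len(abs_pct_errors)  # average error size as a percent

print(f"MAD  = {mad:.2f}")   # approximately 22.08
print(f"MAPE = {mape:.1%}")  # approximately 17.6%
```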

Error Measurement Considerations: The MAPE is easy to interpret, even when you don't know a product's demand volume; however, the MAPE is scale sensitive and becomes meaningless for low-volume data or data with zero-demand periods. The MAD is a good statistic to use when analyzing a single product's forecast and you know the demand volume.

Measuring Error Across Products: Aggregating error measurements across products can be problematic. When aggregating MAPEs, low-volume products can dominate the results; when aggregating MADs, high-volume products can dominate the results. When aggregating across products, some corporations establish weighted error measurements to properly reflect the various products' relative importance to the corporation. This is an excellent practice.
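One common weighting scheme is to weight each product's MAPE by its demand volume, so the aggregate reflects where the units (or dollars) actually are. The sketch below illustrates that idea with made-up product names and figures; it is an illustration of volume weighting, not a prescription for any particular scheme.

```python
# Illustrative sketch: a volume-weighted MAPE across products.
# Product names, volumes and MAPEs are hypothetical.
products = {
    # name: (total demand volume, MAPE for that product)
    "A (high volume)": (120_000, 0.12),
    "B (mid volume)":  (30_000,  0.25),
    "C (low volume)":  (1_500,   0.90),
}

total_demand  = sum(volume for volume, _ in products.values())
weighted_mape = sum(volume * mape for volume, mape in products.values()) / total_demand
simple_mape   = sum(mape for _, mape in products.values()) / len(products)

print(f"Simple average MAPE:  {simple_mape:.1%}")   # dominated by the low-volume item
print(f"Volume-weighted MAPE: {weighted_mape:.1%}") # reflects relative importance
```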

How do we track accuracy?

Types of Accuracy Measures Within-sample statistics (a.k.a. goodness-of-fit statistics) tell you how accurately a forecasting method tracks the historical data.

Within-sample Statistics

Within-sample Statistics: Can aid the model-building process. Are NOT a good indicator of expected performance.

Types of Accuracy Measures: Within-sample statistics (a.k.a. goodness-of-fit statistics) tell you how accurately a forecasting method tracks the historical data. Out-of-sample statistics tell you how accurately a forecasting method actually forecasted: hold-out analysis and "wait and see" real-time tracking. Out-of-sample statistics yield a better measure of expected forecast accuracy than within-sample statistics, with real-time tracking providing the most accurate error measurements.

Hold-out Analysis

Hold-out Analysis: Allows you to compare different approaches. Provides insight into expected accuracy. May not be able to simulate your true forecasting process.
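A hold-out analysis can be prototyped in a few lines: withhold the final periods of history, forecast them using only the earlier data, and score the result. The snippet below uses a seasonal naïve method purely as a stand-in for whatever model you are evaluating, and the monthly figures are invented.

```python
# Hold-out analysis sketch: fit on early history, score on a withheld window.
# The data and the "model" (seasonal naive) are stand-ins for illustration.
history = [410, 385, 460, 502, 548, 590, 612, 575, 530, 498, 455, 470,   # year 1
           425, 400, 478, 515, 566, 603, 628, 590, 541, 510, 468, 482]   # year 2

holdout_len = 6
fit_data, holdout = history[:-holdout_len], history[-holdout_len:]

# Seasonal naive: forecast each held-out month with the value 12 months earlier.
forecasts = [fit_data[len(fit_data) - 12 + h] for h in range(holdout_len)]

abs_pct_errors = [abs(a - f) / a for a, f in zip(holdout, forecasts)]
print(f"Hold-out MAPE over the last {holdout_len} months: "
      f"{sum(abs_pct_errors) / holdout_len:.1%}")
```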

Real-time Tracking

Real-Time Tracking: Tracks the actual forecast process. Allows you to compare different forecasts (e.g., statistical vs. adjusted vs. salesperson's, etc.). Can be used to determine the value add (if any) of judgment. Provides the most accurate insight into expected accuracy. Is the strongest of all approaches.

Building a Forecast Archive: We begin with historic data through December 2016 and generate a forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17
Actual          |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211

Building a Forecast Archive: Once January's demand is known, we generate a new forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17
Actual          | 18,468 |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780

Building a Forecast Archive: Once February's demand is known, we generate a new forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17 | Aug-17
Actual          | 18,468 |  9,720 |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |        |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780 |
Origin 2017-Feb |        |        | 13,265 | 12,913 |  6,654 |  9,102 |  6,574 |  8,493

Building a Forecast Archive: Once March's demand is known, we generate a new forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17 | Aug-17 | Sep-17
Actual          | 18,468 |  9,720 | 15,552 |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |        |        |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780 |        |
Origin 2017-Feb |        |        | 13,265 | 12,913 |  6,654 |  9,102 |  6,574 |  8,493 |
Origin 2017-Mar |        |        |        |  9,623 |  4,364 |  6,983 |  4,801 |  6,901 | 14,710

Building a Forecast Archive: Once April's demand is known, we generate a new forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17 | Aug-17 | Sep-17 | Oct-17
Actual          | 18,468 |  9,720 | 15,552 | 10,692 |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |        |        |        |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780 |        |        |
Origin 2017-Feb |        |        | 13,265 | 12,913 |  6,654 |  9,102 |  6,574 |  8,493 |        |
Origin 2017-Mar |        |        |        |  9,623 |  4,364 |  6,983 |  4,801 |  6,901 | 14,710 |
Origin 2017-Apr |        |        |        |        |  4,367 |  6,994 |  4,802 |  6,905 | 14,725 | 17,624

Building a Forecast Archive: Once May's demand is known, we generate a new forecast.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17 | Aug-17 | Sep-17 | Oct-17 | Nov-17
Actual          | 18,468 |  9,720 | 15,552 | 10,692 |  6,804 |        |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |        |        |        |        |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780 |        |        |        |
Origin 2017-Feb |        |        | 13,265 | 12,913 |  6,654 |  9,102 |  6,574 |  8,493 |        |        |
Origin 2017-Mar |        |        |        |  9,623 |  4,364 |  6,983 |  4,801 |  6,901 | 14,710 |        |
Origin 2017-Apr |        |        |        |        |  4,367 |  6,994 |  4,802 |  6,905 | 14,725 | 17,624 |
Origin 2017-May |        |        |        |        |        |  6,873 |  4,800 |  6,858 | 14,554 | 17,527 | 15,184

Building a Forecast Archive: Once June 2017 sales are known, we can compare the archived forecasts for June (highlighted in red on the slide) to what actually happened; this is the basis for a "waterfall" report.

Date            | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17 | Jul-17 | Aug-17 | Sep-17 | Oct-17 | Nov-17
Actual          | 18,468 |  9,720 | 15,552 | 10,692 |  6,804 |  7,776 |        |        |        |        |
Origin 2016-Dec | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211 |        |        |        |        |
Origin 2017-Jan |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726 |  6,780 |        |        |        |
Origin 2017-Feb |        |        | 13,265 | 12,913 |  6,654 |  9,102 |  6,574 |  8,493 |        |        |
Origin 2017-Mar |        |        |        |  9,623 |  4,364 |  6,983 |  4,801 |  6,901 | 14,710 |        |
Origin 2017-Apr |        |        |        |        |  4,367 |  6,994 |  4,802 |  6,905 | 14,725 | 17,624 |
Origin 2017-May |        |        |        |        |        |  6,873 |  4,800 |  6,858 | 14,554 | 17,527 | 15,184
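In code, a forecast archive is simply a table keyed by forecast origin: each time a new period of demand arrives, you record the actual and store the new forecasts without overwriting anything generated earlier. Below is a minimal bookkeeping sketch (not the Forecast Pro data model), using the December, January and February rows from the example above.

```python
# Minimal forecast-archive sketch: store every forecast by its origin,
# alongside the actuals as they become known. Values are from the example above.
archive = {
    # origin: {target month: forecast}
    "2016-Dec": {"Jan-17": 25950, "Feb-17": 11808, "Mar-17": 12429,
                 "Apr-17": 11302, "May-17": 6033, "Jun-17": 8211},
}
actuals = {}

def record_period(archive, actuals, month, actual, origin, new_forecasts):
    """Log the newly observed actual, then archive the forecasts made from this origin."""
    actuals[month] = actual
    archive[origin] = dict(new_forecasts)   # never overwrite earlier origins

record_period(archive, actuals, "Jan-17", 18468, "2017-Jan",
              {"Feb-17": 12697, "Mar-17": 14114, "Apr-17": 13535,
               "May-17": 6837, "Jun-17": 9726, "Jul-17": 6780})
record_period(archive, actuals, "Feb-17", 9720, "2017-Feb",
              {"Mar-17": 13265, "Apr-17": 12913, "May-17": 6654,
               "Jun-17": 9102, "Jul-17": 6574, "Aug-17": 8493})
# archive now holds one row of forecasts per origin; actuals grow as demand is observed.
```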

A Waterfall Report (showing the adjusted forecast)

Date             | Jan-17 | Feb-17 | Mar-17 | Apr-17 | May-17 | Jun-17
Actual           | 18,468 |  9,720 | 15,552 | 10,692 |  6,804 |  7,776
Origin 2016-Dec  | 25,950 | 11,808 | 12,429 | 11,302 |  6,033 |  8,211
Origin 2017-Jan  |        | 12,697 | 14,114 | 13,535 |  6,837 |  9,726
Origin 2017-Feb  |        |        | 13,265 | 12,913 |  6,654 |  9,102
Origin 2017-Mar  |        |        |        |  9,623 |  4,364 |  6,983
Origin 2017-Apr  |        |        |        |        |  4,367 |  6,994
Origin 2017-May  |        |        |        |        |        |  6,873

Series Analysis
Lead time        |      1 |      2 |      3 |      4 |      5 |      6
No. observations |      6 |      6 |      6 |      6 |      6 |      6
Avg. Forecast    | 12,129 | 12,811 | 13,373 | 13,778 | 14,061 | 13,474
Avg. Error       |    627 |  1,309 |  1,871 |  2,276 |  2,559 |  1,972
MAD              |  2,859 |  2,862 |  3,226 |  2,785 |  3,070 |  2,298
Avg. Perc. Error |  -0.1% |   5.3% |  12.7% |  17.1% |  19.6% |  15.7%
MAPE             |  23.9% |  23.6% |  23.5% |  20.4% |  25.0% |  18.5%
CMAPE            |   6.0% |   6.0% |   6.5% |   5.3% |   6.3% |   5.0%
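Given such an archive, the waterfall statistics fall out of a double loop: pair each archived forecast with its actual (once known), bucket the errors by lead time, and average within each bucket. The self-contained sketch below uses integer period indices and only the six months shown above, so its observation counts and MAPEs will not match the slide, which summarizes a fuller archive; it is meant to show the mechanics, not reproduce the report.

```python
# Sketch of waterfall statistics by lead time. Periods are integer-indexed
# (1 = Jan-17 ... 6 = Jun-17); origin 0 means "forecast made at end of Dec-16".
from collections import defaultdict

actuals = {1: 18468, 2: 9720, 3: 15552, 4: 10692, 5: 6804, 6: 7776}
forecasts = {                       # origin -> {target period: forecast}
    0: {1: 25950, 2: 11808, 3: 12429, 4: 11302, 5: 6033, 6: 8211},
    1: {2: 12697, 3: 14114, 4: 13535, 5: 6837, 6: 9726},
    2: {3: 13265, 4: 12913, 5: 6654, 6: 9102},
    3: {4: 9623, 5: 4364, 6: 6983},
    4: {5: 4367, 6: 6994},
    5: {6: 6873},
}

errors_by_lead = defaultdict(list)  # lead time -> list of absolute percent errors
for origin, fcsts in forecasts.items():
    for target, forecast in fcsts.items():
        if target in actuals:
            lead = target - origin  # 1 = one-step-ahead, etc.
            errors_by_lead[lead].append(abs(actuals[target] - forecast) / actuals[target])

for lead in sorted(errors_by_lead):
    errs = errors_by_lead[lead]
    print(f"Lead {lead}: n={len(errs)}, MAPE={sum(errs) / len(errs):.1%}")
```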

Relative Absolute Error: MAD (Mean Absolute Deviation) tells you the average error size in units. MAPE (Mean Absolute Percent Error) tells you the average error size as a percent. RAE (Relative Absolute Error) tells you the error size relative to the error from a Naïve model (same as last period).

Error Measurements: RAE. The RAE (Relative Absolute Error) is the ratio of the absolute error from the current method to the absolute error from a Naïve model: RAE = |Actual - Forecast| / |Actual - Naïve Forecast|. A geometric mean can be used to average RAEs.

Month | Actual | Forecast | Absolute Error | Naïve Forecast | Naïve Absolute Error | RAE
1     | 112.3  | 124.7    | 12.4           | 154.2          | 41.9                 | 0.30
2     | 108.4  | 103.7    | 4.7            | 112.3          | 3.9                  | 1.21
3     | 148.9  | 116.6    | 32.3           | 108.4          | 40.5                 | 0.80
4     | 117.4  | 78.5     | 38.9           | 148.9          | 31.5                 | 1.23
GMRAE: 0.77
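As a worked check on the table above, the following sketch computes each period's RAE and their geometric mean (the GMRAE) from the same numbers; the naïve forecast is simply the prior period's actual.

```python
import math

# RAE and GMRAE for the four-period example above.
actuals         = [112.3, 108.4, 148.9, 117.4]
forecasts       = [124.7, 103.7, 116.6, 78.5]
naive_forecasts = [154.2, 112.3, 108.4, 148.9]   # prior period's actual

raes = [abs(a - f) / abs(a - n)
        for a, f, n in zip(actuals, forecasts, naive_forecasts)]  # 0.30, 1.21, 0.80, 1.23

gmrae = math.exp(sum(math.log(r) for r in raes) / len(raes))      # geometric mean of the RAEs
print(f"GMRAE = {gmrae:.2f}")                                     # approximately 0.77
```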

FVA Stair Step Report: The RAE provides an indication of the value added (or destroyed) by your current forecasting model. This concept can be extended to generate a Forecast Value Added (FVA) report.

Process Step            | Forecast Accuracy | FVA vs. Naïve | FVA vs. Statistical
Naïve Forecast          | 60%               |               |
Statistical Forecast    | 65%               | 5%            |
Demand Planner Override | 62%               | 2%            | -3%

You can report on an individual time series, or for an aggregation of many (or all) time series. If you are doing better than a Naïve forecast, your process is adding value. If you are doing worse than a Naïve forecast, you are simply wasting time and resources. (Slide courtesy of Mike Gilliland, SAS Institute, Inc.)
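The FVA arithmetic itself is just pairwise subtraction of accuracies down the process steps. A small sketch using the figures from the stairstep above:

```python
# FVA stairstep sketch using the accuracies from the table above.
naive_acc, stat_acc, override_acc = 0.60, 0.65, 0.62

rows = [
    ("Naïve Forecast",          naive_acc,    None,                     None),
    ("Statistical Forecast",    stat_acc,     stat_acc - naive_acc,     None),
    ("Demand Planner Override", override_acc, override_acc - naive_acc, override_acc - stat_acc),
]

for name, acc, fva_naive, fva_stat in rows:
    fmt = lambda x: "--" if x is None else f"{x:+.0%}"   # blank out steps with no comparison
    print(f"{name:<24} accuracy={acc:.0%}  "
          f"FVA vs. naive={fmt(fva_naive)}  FVA vs. statistical={fmt(fva_stat)}")
```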

Exception Reports

Exception Reports: Reduce the need for manual review. Allow you to focus on the items where human attention is most needed.
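An exception report can be as simple as a rule applied to the tracked error metrics: flag the items whose recent performance crosses a threshold and suppress everything else. In the sketch below the item names, metrics and thresholds are all illustrative.

```python
# Exception-report sketch: surface only the items that need human attention.
# Item names, metrics and thresholds are illustrative, not prescriptive.
items = {
    "SKU-1001": {"mape": 0.12, "gmrae": 0.70},
    "SKU-1002": {"mape": 0.45, "gmrae": 1.30},   # worse than naive -> flag
    "SKU-1003": {"mape": 0.38, "gmrae": 0.95},   # high MAPE -> flag
}

MAPE_LIMIT, RAE_LIMIT = 0.35, 1.0

exceptions = {name: m for name, m in items.items()
              if m["mape"] > MAPE_LIMIT or m["gmrae"] > RAE_LIMIT}

for name, m in exceptions.items():
    print(f"{name}: MAPE={m['mape']:.0%}, GMRAE={m['gmrae']:.2f} -- review")
```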

Summary

Conclusions: Tracking forecast accuracy allows you to improve your forecasting process, gain insight into expected performance, benchmark, and spot problems quickly. All error measurement statistics have strengths and weaknesses, and care should be taken when selecting which ones to focus on. Out-of-sample performance provides a better measure of expected forecast accuracy than within-sample performance. Exception reports are a useful tool to zero in on forecasts that need human attention.

Tracking Accuracy: Best Practices. Establish a forecast archive and routinely track accuracy. Ideally, track every step in your forecasting process to determine what is adding or destroying value. Establish a feedback loop to allow participants to learn and improve. Monitor for changes in forecast accuracy and take action when necessary. Understand the differences among error measurements and choose appropriate metrics for the task at hand.

Our Next Webinar: How to Boost Your Forecast Accuracy by Modeling the Impact of Promotions and Other Events. October 26, 2017 @ 1:30 pm EDT. Presented by Sarah Darin, Senior Consultant, Business Forecast Systems, Inc. Visit www.forecastpro.com to sign up!

On-Demand Webinars & Materials: A recording of today's Webinar will be posted next week on forecastpro.com along with the slide set (in .pdf format). All previously presented Webinars are archived and available for viewing on-demand on forecastpro.com. Attendees will receive an email notifying them when the recording and materials are available.

User Conference: Forecast Pro User Conference 2017, October 2-4, 2017 in Boston, MA USA. Designed to help you get the most out of Forecast Pro and to improve your forecasting. Structured as an intimate event where you can make valuable connections with other Forecast Pro users (60 people max). Features tutorials, user case studies, product training, panels and networking events. More info at: forecastpro.com/userconference2017/index.html

Forecast Pro Software: Examples from today's Webinar used Forecast Pro. To learn more about Forecast Pro: request a live WebEx demo for your team (submit your request as a question right now), visit www.forecastpro.com, or call us at +1.617.484.5050.

Training and Workshops: We offer forecasting seminars, Webinars and product training workshops. On-site and remote (via WebEx) classes are available. Learn more at www.forecastpro.com. Subscribe to our blog at theforecastpro.com.

Questions?

Thank you for attending!