Four Basic Steps for Creating an Effective Demand Forecasting Process

Presented by Eric Stellwagen, President & Co-founder, Business Forecast Systems, Inc.
estellwagen@forecastpro.com

Business Forecast Systems, Inc.
68 Leonard Street, Belmont, MA 02478 USA
(617) 484-5050
www.forecastpro.com
On-Demand Webinars & Materials
A recording of today's Webinar will be posted next week on forecastpro.com along with the slide set (in .pdf format). All previously presented Webinars are archived and available for viewing on-demand at forecastpro.com. Attendees will receive an email notifying them when the recording and materials are available.
Eric Stellwagen
President, CEO & Co-founder of Business Forecast Systems, Inc. Co-author of the Forecast Pro product line. More than 30 years of dedicated business forecasting experience. Served on the board of directors of the International Institute of Forecasters for 12 years. Currently serving on the practitioner advisory board of Foresight: The International Journal of Applied Forecasting.
What We'll Cover
Overview, Generating Baseline Forecasts, Adding Judgment, Tracking Accuracy, Improving the Process, Summary, Q&A
Step One: Generating Baseline Forecasts
What is a Statistical Baseline Forecast? A quantitative forecast of demand which: Usually assumes continuity between the past and the future. Provides a starting point for your forecasting process. Provides a benchmark for your final forecast.
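As a concrete illustration, one way a baseline might be produced is with simple exponential smoothing. This is only a sketch: the demand figures and the smoothing weight below are hypothetical, and this is just one of many methods an automatic tool might select.

```python
# Minimal sketch of a statistical baseline forecast via simple
# exponential smoothing. The data and alpha are made up for illustration.

def ses_forecast(history, alpha=0.3):
    """Return the one-step-ahead simple exponential smoothing forecast."""
    level = history[0]
    for y in history[1:]:
        # New level is a weighted blend of the latest actual and the old level.
        level = alpha * y + (1 - alpha) * level
    return level

demand = [100, 110, 105, 120, 115, 125]   # hypothetical monthly demand
baseline = ses_forecast(demand)
print(round(baseline, 1))
```

Note how the forecast "assumes continuity": it projects forward only from the smoothed history, with no knowledge of future promotions or other events.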
Evolution of Baseline Forecasts Phase 1 Phase 2 Phase 3 Judgment & Spreadsheets Automatic Time Series Approaches Customized Approaches
Demand Patterns Vary The type of demand pattern will often dictate the forecasting method used. Common types of demand include: Stable and Ongoing Intermittent and/or Low-volume Rapidly Changing Promoted Limited History
Automatic Time Series Approaches Pros: Simple to understand and explain Widely accepted and used Often quite accurate Adaptive Easy to apply
Cons: Automatic Time Series Approaches
Require adequate demand history. Assume continuity between past and future. Do not capture response to non-calendar-based events (e.g., promotions). Do not capture response to explanatory variables. Are not all the same: implementations vary and some perform poorly.
Rejecting Automatic Models When you disagree with the forecasts generated using an automatic time series approach you should reject them. Generally there are three ways to do this: Judgmentally override the forecasted values. Dictate that a different forecasting model be used. Reconfigure the input data.
Baseline Forecasting: Best Practices
Avoid one-size-fits-all approaches: they rarely work well. Understand where automatic modeling works well and where it does not. Periodically analyze and strive to improve your baseline forecasting models and your data practices (e.g., collection procedures, documentation, organization, etc.).
Step Two: Adding Judgment
Pros: Judgmental Forecasting Does not require statistical expertise. Allows forecaster to incorporate domain knowledge. This knowledge can come from many sources including experience with similar products, feedback from sales staff, customer surveys, focus groups, etc. Does not require historical data.
Cons: Judgmental Forecasting
Is subjective. Can be biased by company politics, sales goals, etc. Can be difficult to fine-tune future forecasts. Is not automatic and can be very time-consuming.
Judgmental Forecasting
Judgment often plays an important role in forecasting, particularly with new products, short product-life-cycle products, rapidly changing environments and instances where the forecaster's domain knowledge is not captured in the statistical forecasting model.
Adding Judgment: Best Practices Best Practices Add judgment in the form of an override to a statistically generated baseline forecast. Document all overrides made. Track accuracy vs. baseline to understand where you are adding/destroying value. Research Suggests Large adjustments tend to add value more often than small ones. Downward adjustments tend to add value more often than upward adjustments.
Step Three: Tracking Accuracy
Why Track Forecast Accuracy?
To improve your forecasting process. Forecasting should be a continuous improvement process, and improving your forecasting requires knowing what's working and what's not.
Why Track Forecast Accuracy? To improve your forecasting process To gain insight into expected performance To benchmark To spot problems early
Form of Error Measurement Tracking forecast accuracy requires measuring forecast error. Error measurements generally take one of three forms: Percentage-based measurements Unit-based measurements Relative-based measurements
MAPE, MAD and RAE MAPE: Mean Absolute Percent Error Tells you the average error size as a percent. MAD: Mean Absolute Deviation Tells you the average error size in units. RAE: Relative Absolute Error Tells you the error size relative to the error from a Naïve model.
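The three measures above can be sketched in a few lines of code. The demand and forecast figures below are made up purely for illustration; the naive forecast here is assumed to be last period's actual carried forward.

```python
# Illustrative implementations of MAPE, MAD, and RAE. All figures are
# hypothetical example data, not numbers from the webinar.

def mape(actuals, forecasts):
    """Mean Absolute Percent Error: average error size as a percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

def mad(actuals, forecasts):
    """Mean Absolute Deviation: average error size in units."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

def rae(actuals, forecasts, naive_forecasts):
    """Relative Absolute Error: error size relative to a naive model's error."""
    err = sum(abs(a - f) for a, f in zip(actuals, forecasts))
    naive_err = sum(abs(a - n) for a, n in zip(actuals, naive_forecasts))
    return err / naive_err

actuals   = [100, 120, 110, 130]
forecasts = [ 95, 115, 120, 125]
naive     = [ 90, 100, 120, 110]   # last period's actual carried forward

print(mape(actuals, forecasts))   # about 5.5 (percent)
print(mad(actuals, forecasts))    # 6.25 (units)
print(rae(actuals, forecasts, naive))  # below 1.0: beating the naive model
```

An RAE below 1.0 indicates the forecasting model is adding value relative to the naive benchmark; above 1.0, it is destroying value.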
Error Measurement Considerations
The MAPE is easy to interpret, even when you don't know a product's demand volume; however, the MAPE is scale sensitive and becomes meaningless for low-volume data or data with zero demand periods. The MAD is a good statistic to use when analyzing a single product's forecast and you know the demand volume. The RAE provides an indication of the value added (or destroyed) by your current forecasting model. This concept can be extended to generate an FVA (forecast value added) report.
Measuring Error Across Products
Aggregating error measurements across products can be problematic. When aggregating MAPEs, low-volume products can dominate the results. When aggregating MADs, high-volume products can dominate the results. When aggregating error across products, some corporations establish weighted error measurements to properly reflect the various products' relative importance to the corporation. This is an excellent practice.
Building a Forecast Archive
Tracking accuracy requires creating a forecast archive. In this example, we begin with historic data through December 2015 and generate a forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16
Actual:
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Building a Forecast Archive
Once January's demand is known, we generate a new forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16
Actual:           18,468
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Building a Forecast Archive
Once February's demand is known, we generate a new forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16  Aug-16
Actual:           18,468   9,720
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Origin 2016-Feb:                  13,265  12,913   6,654   9,102   6,574   8,493
Building a Forecast Archive
Once March's demand is known, we generate a new forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16  Aug-16  Sep-16
Actual:           18,468   9,720  15,552
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Origin 2016-Feb:                  13,265  12,913   6,654   9,102   6,574   8,493
Origin 2016-Mar:                           9,623   4,364   6,983   4,801   6,901  14,710
Building a Forecast Archive
Once April's demand is known, we generate a new forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16  Aug-16  Sep-16  Oct-16
Actual:           18,468   9,720  15,552  10,692
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Origin 2016-Feb:                  13,265  12,913   6,654   9,102   6,574   8,493
Origin 2016-Mar:                           9,623   4,364   6,983   4,801   6,901  14,710
Origin 2016-Apr:                                   4,367   6,994   4,802   6,905  14,725  17,624
Building a Forecast Archive
Once May's demand is known, we generate a new forecast.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16  Aug-16  Sep-16  Oct-16  Nov-16
Actual:           18,468   9,720  15,552  10,692   6,804
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Origin 2016-Feb:                  13,265  12,913   6,654   9,102   6,574   8,493
Origin 2016-Mar:                           9,623   4,364   6,983   4,801   6,901  14,710
Origin 2016-Apr:                                   4,367   6,994   4,802   6,905  14,725  17,624
Origin 2016-May:                                           6,873   4,800   6,858  14,554  17,527  15,184
Building a Forecast Archive
Once June 2016 sales are known, we can compare the archived forecasts for June (one from each origin, highlighted in the red box on the slide) to what actually happened; this is the basis for a "waterfall" report.

Date:             Jan-16  Feb-16  Mar-16  Apr-16  May-16  Jun-16  Jul-16  Aug-16  Sep-16  Oct-16  Nov-16
Actual:           18,468   9,720  15,552  10,692   6,804   7,776
Origin 2015-Dec:  25,950  11,808  12,429  11,302   6,033   8,211
Origin 2016-Jan:          12,697  14,114  13,535   6,837   9,726   6,780
Origin 2016-Feb:                  13,265  12,913   6,654   9,102   6,574   8,493
Origin 2016-Mar:                           9,623   4,364   6,983   4,801   6,901  14,710
Origin 2016-Apr:                                   4,367   6,994   4,802   6,905  14,725  17,624
Origin 2016-May:                                           6,873   4,800   6,858  14,554  17,527  15,184
A Waterfall Report
Forecasts by origin, with error statistics computed by lead time:

Date:              2016-Jan  2016-Feb  2016-Mar  2016-Apr  2016-May  2016-Jun
Actual:              18,468     9,720    15,552    10,692     6,804     7,776
Origin 2015-Dec:     25,950    11,808    12,429    11,302     6,033     8,211
Origin 2016-Jan:               12,697    14,114    13,535     6,837     9,726
Origin 2016-Feb:                         13,265    12,913     6,654     9,102
Origin 2016-Mar:                                    9,623     4,364     6,983
Origin 2016-Apr:                                              4,367     6,994
Origin 2016-May:                                                        6,873

Lead time:                1         2         3         4         5         6
No. observations:         6         6         6         6         6         6
Avg. Forecast:       12,129    12,811    13,373    13,778    14,061    13,474
Avg. Error:             627     1,309     1,871     2,276     2,559     1,972
MAD:                  2,859     2,862     3,226     2,785     3,070     2,298
Avg. Perc. Error:     -0.1%      5.3%     12.7%     17.1%     19.6%     15.7%
MAPE:                 23.9%     23.6%     23.5%     20.4%     25.0%     18.5%
CMAPE:                 6.0%      6.0%      6.5%      5.3%      6.3%      5.0%
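The lead-1 column of the waterfall statistics can be reproduced directly from the archive. The sketch below uses the forecasts and actuals shown in the slides; the lead-1 forecast for each month is the one made from the origin one period earlier.

```python
# Reproducing the lead-1 statistics of the waterfall report from the
# archive, using the figures shown in the slides.

actuals = {"2016-Jan": 18468, "2016-Feb": 9720, "2016-Mar": 15552,
           "2016-Apr": 10692, "2016-May": 6804, "2016-Jun": 7776}

# One-step-ahead (lead-1) forecast for each month, i.e. the first value
# produced from the origin one period before it.
lead1 = {"2016-Jan": 25950, "2016-Feb": 12697, "2016-Mar": 13265,
         "2016-Apr": 9623, "2016-May": 4367, "2016-Jun": 6873}

errors = [lead1[m] - actuals[m] for m in actuals]
avg_error = sum(errors) / len(errors)          # signed: reveals bias
mad = sum(abs(e) for e in errors) / len(errors)  # unsigned: average size

print(round(avg_error), round(mad))  # 627 and 2859, matching the report's lead-1 column
```

Repeating the same calculation on the diagonal of two-step-ahead forecasts, three-step-ahead forecasts, and so on fills in the remaining columns of the report.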
FVA Stair Step Report

Process Step              Forecast Accuracy   FVA vs. Naïve   FVA vs. Statistical
Naïve Forecast            60%
Statistical Forecast      65%                 5%
Demand Planner Override   62%                 2%              -3%

Can report on an individual time series, or for an aggregation of many (or all) time series. If you are doing better than a naïve forecast, your process is adding value. If you are doing worse than a naïve forecast, you are simply wasting time and resources. (Slide courtesy of Mike Gilliland, SAS Institute, Inc.)
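The FVA entries in the stair-step report are simple differences in accuracy between consecutive process steps. A sketch of the arithmetic, using the accuracy figures from the slide:

```python
# FVA (forecast value added) arithmetic behind the stair-step report,
# using the accuracy percentages shown on the slide.

accuracy = {"Naive": 60.0, "Statistical": 65.0, "Override": 62.0}

# Each FVA figure is the accuracy of a step minus the accuracy of the
# step it is being compared against.
fva_stat_vs_naive = accuracy["Statistical"] - accuracy["Naive"]
fva_override_vs_naive = accuracy["Override"] - accuracy["Naive"]
fva_override_vs_stat = accuracy["Override"] - accuracy["Statistical"]

print(fva_stat_vs_naive)      # 5.0: the statistical model adds value
print(fva_override_vs_naive)  # 2.0: the full process still beats naive
print(fva_override_vs_stat)   # -3.0: the judgmental override destroys value
```

In this example the overall process beats the naive benchmark, but the override step is making the statistical forecast worse, which is exactly the kind of finding FVA tracking is designed to surface.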
Tracking Accuracy: Best Practices
Establish a forecast archive and routinely track accuracy. Ideally, track every step in your forecasting process to determine what is adding/destroying value. Establish a feedback loop to allow participants to learn and improve. Monitor for changes in forecast accuracy and take action when necessary. Understand the differences among error measurements and choose appropriate metrics for the task at hand.
Step Four: Improving the Process
Improving the Process: Best Practices
Always think in terms of continuous improvement and realize that every step of the process can be improved. Take a long-term approach: numerous incremental improvements add up to substantial improvements over time. Document and institutionalize your process to give it staying power. Follow best practices in all steps of the process whenever possible.
Our Next Webinar Forecasting Weekly and Daily Data: Practical Strategies for Better Results January 26, 2017 at 1:30 p.m. EST Presented by Eric Stellwagen Visit www.forecastpro.com to sign up!
Forecast Pro Software
Examples from today's Webinar used Forecast Pro. To learn more about Forecast Pro: Request a live WebEx demo for your team (submit your request as a question right now). Visit www.forecastpro.com. Call us at +1.617.484.5050.
Forecast Training and Workshops
We offer forecasting seminars, Webinars and product training workshops. On-site and remote-based (via WebEx) classes are available. Learn more at www.forecastpro.com. Subscribe to our blog at theforecastpro.com.
Questions?
Thank you for attending!