DEMANDING SOLUTIONS

Technical Details

Demanding Solutions is built on a simple, flexible architecture with a standard data model shared across all forecasting models.  This makes it easy to incorporate new forecasting models as they become available, and to choose Python, R or other implementations of those models as desired.
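As a rough illustration of that standard interface (the class and method names below are hypothetical, not our actual code), each model plugs in by implementing the same three steps: a preprocessing transform, a fit/predict step, and a transform back into the standard output form.

# Hypothetical sketch of the pluggable-model interface (illustrative names only).
from abc import ABC, abstractmethod
import pandas as pd

class ForecastModel(ABC):
    """Each model adapter implements the same three steps."""

    @abstractmethod
    def preprocess(self, series: pd.DataFrame):
        """Transform the standard data model into model-specific input."""

    @abstractmethod
    def fit_predict(self, prepared_input, horizon: int):
        """Train the model (in Python, R or SageMaker) and forecast over the horizon."""

    @abstractmethod
    def postprocess(self, raw_output) -> pd.DataFrame:
        """Transform model output back into the standard forecast format."""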

We implement standard and robust forecast evaluation across all models, so users can quickly compare results from different techniques.

Forecast Models currently supported

We currently support four popular and powerful forecasting techniques:

  • ETS
  • (S)ARIMA(X)
  • Facebook Prophet
  • Amazon DeepAR

ETS

This is one of the most popular simple smoothing models for time series forecasting and makes a great baseline against which to compare other approaches.  Because it uses only the information in the data series itself, it is less flexible than some other techniques at incorporating domain knowledge.  We use the powerful underlying implementation in the R 'forecast' package, which gives power users more ability to customize and diagnose than current native Python packages.
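As a rough sketch of how the ETS fit can be driven from Python through the R 'forecast' package (via rpy2 here; the data and wiring are illustrative, not our production code):

# Sketch: fitting ETS with R's 'forecast' package from Python via rpy2
# (assumes rpy2 and the R 'forecast' package are installed).
from rpy2 import robjects
from rpy2.robjects.packages import importr

stats = importr('stats')
forecast = importr('forecast')

monthly_demand = [120, 135, 128, 150, 170, 160, 155, 180, 175, 190, 210, 205]  # example data
r_ts = stats.ts(robjects.FloatVector(monthly_demand), frequency=12)

fit = forecast.ets(r_ts)                  # automatic ETS model selection
fc = forecast.forecast(fit, h=6)          # 6-period-ahead forecast
point_forecast = list(fc.rx2('mean'))     # point forecasts back in Python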

(S)ARIMA(X)

This is a very popular time series method for forecasting individual time series, often delivering state-of-the-art results.  It allows for the incorporation of domain knowledge using additional regressors but is not often currently applied to demand forecasting because of the complexity of applying and tuning it for many parallel SKUs.  We implement auto-tuning to allow you to see how well it might work, and then give you the ability to customize parameters for even better results.  We use the best-in-class ARIMA implementation in the R 'forecast' package, rather than less powerful native Python packages.
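A comparable sketch for the auto-tuned (S)ARIMA(X) path, again calling R's 'forecast' package via rpy2, with a promotional-activity flag standing in for an additional regressor (all values illustrative):

# Sketch: auto-tuned ARIMA with an external regressor via forecast::auto.arima (illustrative).
from rpy2 import robjects
from rpy2.robjects.packages import importr

stats = importr('stats')
forecast = importr('forecast')
r_matrix = robjects.r['matrix']

sales = [100, 110, 105, 140, 120, 115, 150, 130, 125, 160, 135, 170]
promo = [0,   0,   0,   1,   0,   0,   1,   0,   0,   1,   0,   1]   # promotion flag per period
promo_future = [0, 1, 0]                                              # planned promotions in the horizon

y = stats.ts(robjects.FloatVector(sales), frequency=12)
xreg_hist = r_matrix(robjects.FloatVector(promo), ncol=1)
xreg_future = r_matrix(robjects.FloatVector(promo_future), ncol=1)

fit = forecast.auto_arima(y, xreg=xreg_hist)           # automatic order selection
fc = forecast.forecast(fit, xreg=xreg_future)          # horizon implied by future regressors
point_forecast = list(fc.rx2('mean'))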

Facebook Prophet

This is a recently released model that Facebook created for its own business forecasting.  It is a generalized additive model designed to easily incorporate 'human-in-the-loop' adjustments, making it an interesting option when there is a lot of domain knowledge that can be exploited.  We implement the Python version of the package.
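The Python call pattern for Prophet is straightforward; a minimal sketch (with made-up weekly data, and omitting holiday and regressor handling) looks roughly like this:

# Sketch: basic Prophet fit/predict in Python
# (the package name may be 'fbprophet' or 'prophet' depending on the installed version).
import pandas as pd
from fbprophet import Prophet

history = pd.DataFrame({
    'ds': pd.date_range('2017-01-01', periods=104, freq='W'),  # two years of weekly history
    'y':  range(104),                                           # example demand values
})

model = Prophet(weekly_seasonality=False, yearly_seasonality=True)
model.fit(history)

future = model.make_future_dataframe(periods=13, freq='W')      # 13-week horizon
forecast = model.predict(future)
print(forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].tail())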

Amazon DeepAR

This is a recently released neural network method, developed by Amazon for its own demand forecasting.  It learns jointly across multiple series, rather than fitting each series individually, making it an interesting option when there are many related SKUs in a category that exhibit similar underlying behavior, such as response to holidays or promotional activity, although it cannot incorporate domain knowledge with additional regressors.  We implement this via AWS SageMaker.
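In outline, the SageMaker training side looks something like the sketch below; the bucket, role and hyperparameter values are placeholders, and the exact calls depend on the SageMaker SDK version:

# Sketch: training DeepAR on SageMaker (illustrative; bucket and role names are placeholders).
import sagemaker
from sagemaker.amazon.amazon_estimator import get_image_uri

session = sagemaker.Session()
role = 'arn:aws:iam::123456789012:role/SageMakerRole'            # placeholder IAM role
image = get_image_uri(session.boto_region_name, 'forecasting-deepar')

estimator = sagemaker.estimator.Estimator(
    image_name=image,
    role=role,
    train_instance_count=1,
    train_instance_type='ml.c5.xlarge',
    output_path='s3://example-bucket/deepar/output',             # placeholder bucket
    sagemaker_session=session,
)
estimator.set_hyperparameters(
    time_freq='W',               # weekly data
    context_length='52',
    prediction_length='13',
    epochs='100',
)
# Training/test data are JSON Lines files of {"start": ..., "target": [...]} uploaded to S3 beforehand.
estimator.fit({'train': 's3://example-bucket/deepar/train/',
               'test':  's3://example-bucket/deepar/test/'})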

Forecast Evaluation Approach

Evaluating forecast accuracy can be a tricky thing.  Time series models are easy to overfit, even when holding out the most recent periods.  We provide support for the robust approach of making simulated historical forecasts across the data.

The evaluation starts by building a model only on the data up to a cut-off point.  We then make a forecast from that cut-off over a horizon and compare results only for a chosen evaluation period at the end of that horizon.  We repeat this for a range of non-overlapping cut-offs to create a complete forecast evaluation.
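A simplified sketch of that procedure, with illustrative cut-off spacing, horizon and error metric:

# Sketch: simulated historical forecasts with non-overlapping cut-offs (illustrative values).
import numpy as np

def simulated_historical_forecasts(series, fit_predict, horizon=16, eval_window=4, n_cutoffs=4):
    """Evaluate `fit_predict(train, horizon)` at several historical cut-offs.

    Only the last `eval_window` points of each horizon are scored, mirroring
    the real-world lead time between making a forecast and using it.
    """
    errors = []
    for i in range(n_cutoffs):
        cutoff = len(series) - (n_cutoffs - i) * horizon        # non-overlapping cut-offs
        train = series[:cutoff]
        actual = series[cutoff:cutoff + horizon][-eval_window:]
        predicted = fit_predict(train, horizon)[-eval_window:]
        errors.append(np.mean(np.abs(np.array(predicted) - np.array(actual))))  # MAE per cut-off
    return np.mean(errors)    # aggregate accuracy across all cut-offs

# Example usage with a naive "repeat the last value" model:
demand = list(range(100)) + [100] * 28
naive = lambda train, h: [train[-1]] * h
print(simulated_historical_forecasts(demand, naive))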
This approach also represents the real-world use of demand forecasting, where forecasts must typically be made 3-4 months ahead to allow time to manufacture and ship product.  We allow the user to see how the technique would have performed across the historical data, and aggregate accuracy across that period as desired.

We evaluated our product's accuracy against actuals and benchmarked it against the company's own archived forecasts, using matching cut-offs and evaluation periods.

Platform engineering

The data pipeline takes standardized Point of Sale or Orders data and handles the preprocessing required for each model, which varies considerably.  It then trains the model and performs parameter searches where required.  Output data from each model is transformed back into a standard form.  The user can view high-level visualizations of the results and diagnostics online, adjust parameters as desired, and re-run models.  When they are finished with that iterative process, they can download the complete forecast data to their own machines for further analysis and processing.
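Put together, a per-model run looks roughly like the sketch below, reusing the hypothetical interface sketched earlier:

# Sketch: running several models over the same standardized input and collecting comparable output.
def run_all(models, standard_input, horizon):
    results = {}
    for name, model in models.items():
        prepared = model.preprocess(standard_input)       # model-specific preparation
        raw = model.fit_predict(prepared, horizon)        # training plus any parameter search
        results[name] = model.postprocess(raw)            # back to the standard output form
    return results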
Our web app uses AWS Elastic Beanstalk to manage the provisioning of EC2 and Amazon RDS servers, web server configuration and automatic scaling.  We built the app itself using the Python Pyramid framework for its combination of ease of getting started and long-term flexibility.  We maintain a PostgreSQL database for persistent storage of any desired data and administrative information such as user access rights.  We store time series data efficiently and scalably by saving each individual time series as a binarized object - a single entry in our database model.
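A rough sketch of what such a binarized-series row can look like (SQLAlchemy here, with illustrative table and column names):

# Sketch: storing a whole time series as one binary column (illustrative schema, not our exact model).
import pickle
import pandas as pd
from sqlalchemy import Column, Integer, String, LargeBinary
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class StoredSeries(Base):
    __tablename__ = 'time_series'
    id = Column(Integer, primary_key=True)
    sku = Column(String, index=True)
    payload = Column(LargeBinary)          # the serialized series: one row per SKU

    @classmethod
    def from_series(cls, sku, series: pd.Series):
        return cls(sku=sku, payload=pickle.dumps(series))

    def to_series(self) -> pd.Series:
        return pickle.loads(self.payload)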
Modelling is controlled from Python code, but some of the models are implemented in R, so we connect to a local R server for those.  DeepAR only runs on AWS SageMaker, so we communicate with it by passing data into and out of AWS S3 storage.
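The S3 hand-off itself is a plain object upload and download, roughly as below (bucket and key names are placeholders):

# Sketch: passing DeepAR data to and from S3 with boto3 (placeholder bucket and keys).
import json
import boto3

s3 = boto3.client('s3')
bucket = 'example-bucket'                                   # placeholder

# Upload training data as JSON Lines: one {"start", "target"} object per series.
lines = [json.dumps({'start': '2017-01-01', 'target': [1.0, 2.0, 3.0]})]
s3.put_object(Bucket=bucket, Key='deepar/train/data.json', Body='\n'.join(lines).encode('utf-8'))

# Later, pull the prediction output written by the SageMaker job back out of S3.
obj = s3.get_object(Bucket=bucket, Key='deepar/output/predictions.json')
predictions = [json.loads(line) for line in obj['Body'].read().decode().splitlines()]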

Adding support for additional models simply requires us to implement the custom preprocessing transform, model training and output transform steps for that new model.

The current connection to AWS does not allow us to automatically initiate running the SageMaker fit/predict step -- that has to be initiated manually.  We'd like to automate that step too.