In the previous series of posts, I built a system for Time Series Analysis in Python using the statsmodels library on AWS Fargate. In this post, I add another library that does more or less the same thing: fbprophet.

I used this post as a reference to build this forecasting strategy.

To begin, add the fbprophet library to requirements.txt.
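As a sketch, the new line in requirements.txt looks something like this; the pinned version is just an example, so use whatever is current for you:

fbprophet==0.6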

In a new file called fb_prophet_model.py in the fargate/src/forecast_models directory, add the following code.

"""
TSA using FB Prohpet
"""

import pandas as pd
from fbprophet import Prophet
from scipy.stats import boxcox
from scipy.special import inv_boxcox

class FbProphetModel:
    def __init__(self, json_data):
        self.json_data = json_data
        self.data_frame = pd.DataFrame(json_data)

    def magic(self):
        self.data_frame['ds'] = self.data_frame['date']
        self.data_frame.set_index('date')
        self.data_frame['y'], lam = boxcox(self.data_frame['quantity'])
        train = Prophet(seasonality_mode='multiplicative')
        train.fit(self.data_frame)
        future = train.make_future_dataframe(periods=365, include_history=True)
        forecast = train.predict(future)
        forecast['date'] = pd.to_datetime(forecast['ds'], format='%Y-%m-%d')
        forecast[['yhat','yhat_upper','yhat_lower']] = forecast[['yhat','yhat_upper','yhat_lower']].apply(lambda x: inv_boxcox(x, lam))
        forecast[['quantity', 'quantity_upper', 'quantity_lower']] = forecast[['yhat', 'yhat_upper', 'yhat_lower']]
        return forecast[['date', 'quantity', 'quantity_upper', 'quantity_lower']]
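Judging by the code above, json_data is expected to be a collection of records with date and quantity values. As a quick sanity check you can exercise the class on its own; the sample records below are made up, and in the real pipeline the input comes from the S3 JSON file:

# Made-up sample input; real data arrives via the S3 input file
json_data = [
    {'date': '2019-01-01', 'quantity': 12},
    {'date': '2019-01-02', 'quantity': 15},
    # ... enough daily records for Prophet to fit on
]

forecast = FbProphetModel(json_data).magic()
print(forecast.tail())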

In the index.py file, add a function that checks the model name passed to it and returns the correct model class.

def model(model_name):
    """
    Figure out which model to use
    Returns the FbProphetModel if 'prophet'
    is the model name given, otherwise defaults to
    ExponentialSmoothingModel
    """
    if model_name == 'prophet':
        return FbProphetModel
    return ExponentialSmoothingModel

Also add the new import at the top of index.py:

from src.forecast_models.fb_prophet_model import FbProphetModel
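The runner can then pick a class based on the MODEL environment variable that gets set below. The original wiring isn't shown here, so treat this as a sketch; json_data is assumed to have already been loaded from the S3 input:

import os

# Sketch: dispatch on the MODEL environment variable,
# falling back to exponential smoothing for anything else
model_class = model(os.environ.get('MODEL', ''))
forecast = model_class(json_data).magic()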

In the Lambda handler, add the MODEL environment variable to the task runner.

ecs_client.run_task(
    cluster=cluster,
    taskDefinition=runner,
    launchType='FARGATE',
    networkConfiguration={
        'awsvpcConfiguration': {
            'subnets': [subnet],
            'assignPublicIp': 'ENABLED'
        }
    },
    overrides={
        'containerOverrides': [
            {
                'environment': [
                    {
                        'name': 'INPUT_JSON_KEY',
                        'value': request_id + '/input.json'
                    },
                    {
                        'name': 'S3_BUCKET',
                        'value': s3_bucket
                    },
                    # Add the MODEL variable, defaulting to prophet
                    {
                        'name': 'MODEL',
                        'value': 'prophet'
                    }
                ],
                'name': runner
            }
        ]
    }
)
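Hard-coding 'prophet' is fine for trying things out. If you want callers to choose per request, one option is to read the model name from the Lambda event instead; this event shape is an assumption on my part, not something from the original handler:

# Hypothetical: let the request pick the model, defaulting to prophet
{
    'name': 'MODEL',
    'value': event.get('model', 'prophet')
}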

And that is essentially it. Build the Docker image, push it to ECR, and start playing around.

All the code in this series is available here.