
NeuralProphet Model

A Neural Network-based Forecasting Model Inspired by Facebook Prophet

Overview

NeuralProphet is a neural network-based forecasting model built on PyTorch, designed as a direct evolution and enhancement of Facebook's Prophet model. It retains Prophet's intuitive component-based interpretability (trend, seasonality, holidays, auto-regressive effects, and exogenous regressors) while leveraging the power and flexibility of deep learning. By using a PyTorch backend, NeuralProphet offers fast training, especially with GPU acceleration, and extends capabilities to include lagged regressors, future regressors, and more robust handling of missing values and outliers.

Architecture & Components

NeuralProphet's architecture combines a modular design similar to Prophet with neural network components:

  • Trend Component: Modeled as a piecewise linear trend with automatically detected changepoints, similar to Prophet. This captures the long-term direction of the time series.
  • Seasonality Component: Modeled using Fourier series, allowing for flexible representation of multiple seasonalities (e.g., daily, weekly, yearly). NeuralProphet can also automatically detect and fit these patterns.
  • Lagged Auto-Regressive (AR) Component: A key addition over the original Prophet. This component explicitly models the dependence of the current value on past values of the time series itself, typically using a shallow feed-forward network (AR-Net) over the last p observations to capture these effects; a minimal illustrative sketch follows the architecture diagram below.

    $ y_t = \text{Trend}_t + \text{Seasonality}_t + \text{Holidays}_t + \text{AR}(y_{t-1}, \dots, y_{t-p}) + \text{Regressors}_t + \epsilon_t $

  • Regressors (Exogenous Variables): NeuralProphet can incorporate both lagged regressors (past values of external features) and future regressors (future values of external features, which must be known for the forecast horizon). These are integrated into the model through linear or non-linear layers.
  • Events & Holidays: Specific events and holidays can be added as binary regressors, allowing the model to account for their impact on the time series.
  • Normalization & Missing Data Imputation: NeuralProphet automatically handles data normalization and can impute missing values during training, making it robust to incomplete datasets.
  • Uncertainty Quantification: Provides prediction intervals, typically via quantile regression over user-specified quantiles, offering a measure of forecast uncertainty.
[Figure: Conceptual diagram of NeuralProphet's component-based architecture, including the AR and regressor components.]
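To make the AR component concrete, the sketch below shows how a shallow feed-forward network over the last p observed values can produce a multi-step autoregressive contribution. It is an illustration of the idea only, not NeuralProphet's internal implementation; the class name `ARSketch` and the layer sizes are chosen here purely for demonstration.

import torch
import torch.nn as nn

# Illustrative only: a shallow feed-forward network mapping the last p values
# of the series to an AR contribution for each of n_forecasts steps ahead.
# The class name and layer sizes are hypothetical, not NeuralProphet internals.
class ARSketch(nn.Module):
    def __init__(self, p: int, n_forecasts: int, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(p, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_forecasts),  # one output per forecast step
        )

    def forward(self, lags: torch.Tensor) -> torch.Tensor:
        # lags has shape (batch, p): the p most recent observations of y
        return self.net(lags)

ar = ARSketch(p=10, n_forecasts=30)
ar_contribution = ar(torch.randn(8, 10))  # shape (8, 30); added to trend, seasonality, etc.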

When to Use NeuralProphet

NeuralProphet is an excellent choice for time series forecasting when:

  • You need an interpretable model: It retains Prophet's component-based interpretability (trend, seasonality, etc.).
  • You require fast training and GPU acceleration: Its PyTorch backend allows for efficient training on GPUs.
  • You want to explicitly model auto-regressive effects: The added AR component can capture short-term dependencies more effectively than the original Prophet.
  • You have external regressors (covariates), both lagged and future-known.
  • Your data contains missing values or outliers: It offers robust handling of these issues.
  • You need probabilistic forecasts (uncertainty intervals).
  • You are comfortable with a Python-based deep learning ecosystem.

Pros and Cons

Pros

  • Interpretable Components: Clear breakdown into trend, seasonality, AR, etc.
  • Fast Training with GPU: Leverages PyTorch for efficient computation.
  • Explicit AR Component: Better captures short-term dynamics than the original Prophet.
  • Robust to Missing Data & Outliers: Built-in handling mechanisms.
  • Probabilistic Forecasts: Provides uncertainty intervals.
  • Flexible Regressor Handling: Supports both lagged and future-known exogenous variables.

Cons

  • Newer Model: Compared to the original Prophet, it has a shorter track record and a smaller community.
  • Complexity for Customization: While user-friendly for basic use, deep customization requires PyTorch knowledge.
  • Hyperparameter Tuning: Still requires careful tuning for optimal performance.
  • Can Be Slower than Pure Statistical Models: For very simple time series, a classical model might be faster to train.

Example Implementation

NeuralProphet is implemented as a Python library, making it straightforward to use. Here's a conceptual example.

Python Example (using `neuralprophet` library)

                        
import pandas as pd
import numpy as np
from neuralprophet import NeuralProphet
import matplotlib.pyplot as plt

# 1. Generate synthetic data for demonstration
N = 730 # 2 years of daily data
df = pd.DataFrame({
    'ds': pd.to_datetime(pd.date_range(start='2017-01-01', periods=N, freq='D')),
    'y': [i + (i%30)*2 + (i%365)*5 + 100 + (np.random.rand()*10 - 5) for i in range(N)]
})

# Add a simple exogenous regressor (e.g., temperature proxy)
df['temp_regressor'] = np.sin(np.arange(N) / 50) * 10 + np.random.normal(0, 1, N)

# 2. Define and fit the NeuralProphet model
# You can enable/disable components as needed
m = NeuralProphet(
    growth='linear', # 'linear' or 'logistic'
    seasonality_mode='additive', # 'additive' or 'multiplicative'
    yearly_seasonality=True,
    weekly_seasonality=True,
    daily_seasonality=False,
    n_lags=10, # Add 10 lagged autoregressive values
    n_forecasts=30, # Predict 30 steps into the future
    epochs=50, # Reduced for quick demo
    batch_size=32,
)

# Add future regressors (if known for forecast horizon)
# For this example, 'temp_regressor' is known for the future
m.add_future_regressor(name='temp_regressor')

# Fit the model
print("Starting NeuralProphet model training...")
metrics = m.fit(df, freq="D") # freq must be specified
print("NeuralProphet model training complete.")

# 3. Create a future dataframe for predictions
# Because a future regressor was added, its values over the forecast horizon
# must be supplied via `regressors_df`
future_regressors = pd.DataFrame({
    'temp_regressor': np.sin(np.arange(N, N + 30) / 50) * 10
})
future = m.make_future_dataframe(
    df, regressors_df=future_regressors, periods=30, n_historic_predictions=True
)

# Make a forecast
forecast = m.predict(future)

# Print the first few and last few rows of the forecast
# With n_lags > 0 and n_forecasts=30, columns yhat1 ... yhat30 hold the
# 1- to 30-step-ahead predictions
print("\nForecast Head:\n", forecast[['ds', 'y', 'yhat1']].head())
print("\nForecast Tail:\n", forecast[['ds', 'y', 'yhat1']].tail())

# 4. Plot the forecast
fig_forecast = m.plot(forecast)
plt.title("NeuralProphet Forecast")
plt.show()

# 5. Plot the individual components of the forecast
fig_components = m.plot_components(forecast)
plt.title("NeuralProphet Forecast Components")
plt.show()

# Optionally, inspect the learned model parameters (e.g., AR weights)
# fig_parameters = m.plot_parameters()
# plt.show()
                        
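The example above yields point forecasts only. To obtain the prediction intervals discussed under Uncertainty Quantification, the model can be configured with explicit quantiles, in which case the forecast dataframe gains additional quantile columns. The following sketch reuses `df` and `future_regressors` from above and assumes the `quantiles` constructor argument and the column naming of recent `neuralprophet` releases (e.g., "yhat1 5.0%"); exact names may vary by version.

# Hedged sketch: quantile-based prediction intervals (reuses df and
# future_regressors defined in the example above).
m_q = NeuralProphet(
    n_lags=10,
    n_forecasts=30,
    quantiles=[0.05, 0.95],  # 90% prediction interval
    epochs=50,               # reduced for a quick demo
)
m_q.add_future_regressor(name='temp_regressor')
metrics_q = m_q.fit(df, freq="D")

future_q = m_q.make_future_dataframe(
    df, regressors_df=future_regressors, periods=30, n_historic_predictions=True
)
forecast_q = m_q.predict(future_q)

# Columns such as "yhat1", "yhat1 5.0%", and "yhat1 95.0%" (naming may differ by version)
print(forecast_q.filter(regex='^yhat1( |$)').tail())

For covariates that are only known up to the present (rather than over the forecast horizon), the library also provides an `add_lagged_regressor` method that can be used in place of `add_future_regressor`.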

Dependencies & Resources

Dependencies: pandas, numpy, neuralprophet (which pulls in `torch` and `pytorch-lightning`), and matplotlib (for plotting).