
Sparse Autoregression

Here we fit NeuralProphet to data with 5-minute resolution (temperature measurements at Yosemite). This is a continuation of the example notebook autoregression_yosemite_temps, focusing on sparsity.

[1]:
if 'google.colab' in str(get_ipython()):
    !pip install git+https://github.com/ourownstory/neural_prophet.git # may take a while
    #!pip install neuralprophet # much faster, but may not have the latest upgrades/bugfixes

import pandas as pd
from neuralprophet import NeuralProphet, set_log_level
set_log_level("ERROR")
[2]:
data_location = "https://raw.githubusercontent.com/ourownstory/neuralprophet-data/main/datasets/"
df = pd.read_csv(data_location + "yosemite_temps.csv")
# df.head(3)

Sparsifying the AR coefficients

The autoregression component of NeuralProphet is defined as an AR-Net (paper, github). Thus, we can set ar_sparsity to a value smaller than one if we want to induce sparsity in the AR coefficients; the value roughly corresponds to the desired fraction of non-zero coefficients.

However, a model with multiple components and regularizations can be harder to fit, and in some cases you may need to take manual control of the training hyperparameters.
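
For instance, the training behavior can be pinned down explicitly via constructor arguments. Below is a minimal sketch; epochs and batch_size are NeuralProphet arguments, but the values here are illustrative, not tuned:

m_manual = NeuralProphet(
    n_lags=6*12,
    n_forecasts=3*12,
    learning_rate=0.01,  # setting this skips the automatic learning-rate search
    epochs=40,           # fixed number of training epochs (illustrative value)
    batch_size=64,       # fixed batch size (illustrative value)
)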

We will start by setting the sparsity to 50%.

[3]:
m = NeuralProphet(
    n_lags=6*12,          # use the last 6 hours (72 five-minute steps) as input
    n_forecasts=3*12,     # predict the next 3 hours (36 steps)
    n_changepoints=0,
    weekly_seasonality=False,
    daily_seasonality=False,
    learning_rate=0.01,
    ar_sparsity=0.5,      # target roughly 50% non-zero AR coefficients
)
metrics = m.fit(df, freq='5min') # validate_each_epoch=True, plot_live_loss=True
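
The metrics DataFrame returned by fit can be used to sanity-check convergence. A quick sketch (the exact column names, such as MAE and Loss, may vary between NeuralProphet versions):

# Inspect the final epochs to check that training converged
print(metrics.tail(3))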

[4]:
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_6_0.png
[5]:
m = m.highlight_nth_step_ahead_of_each_forecast(1)  # focus plots on the 1-step-ahead forecast (5 minutes ahead)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_7_0.png
[6]:
m = m.highlight_nth_step_ahead_of_each_forecast(36)  # focus plots on the 36-step-ahead forecast (3 hours ahead)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_8_0.png
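
To quantify the induced sparsity, we can inspect the fitted AR weights directly. In this sketch, m.model.ar_weights is an internal attribute that may change between versions, and the 1e-2 cutoff for counting a weight as non-zero is an arbitrary choice:

import numpy as np

# m.model.ar_weights has shape (n_forecasts, n_lags);
# count entries that are effectively non-zero (cutoff is arbitrary)
ar_weights = m.model.ar_weights.detach().numpy().flatten()
n_nonzero = int(np.sum(np.abs(ar_weights) > 1e-2))
print(f"{n_nonzero} of {ar_weights.size} AR weights exceed 1e-2 in magnitude")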

Further reducing the non-zero AR coefficients

By setting ar_sparsity lower, we can further reduce the number of non-zero weights. Here we set it to 10%.

[7]:
m = NeuralProphet(
    n_lags=6*12,
    n_forecasts=3*12,
    n_changepoints=0,
    daily_seasonality=False,
    weekly_seasonality=False,
    learning_rate=0.01,
    ar_sparsity=0.1,  # target roughly 10% non-zero AR coefficients
)
metrics = m.fit(df, freq='5min')
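
Before looking at the parameters, the sparse model can produce a forecast as usual. A brief sketch using the standard make_future_dataframe/predict workflow:

# Extend df by n_forecasts steps and predict, including the historic fit
future = m.make_future_dataframe(df, n_historic_predictions=True)
forecast = m.predict(future)
fig_forecast = m.plot(forecast)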

[8]:
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_11_0.png
[9]:
m = m.highlight_nth_step_ahead_of_each_forecast(1)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_12_0.png
[10]:
m = m.highlight_nth_step_ahead_of_each_forecast(36)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_13_0.png

Extreme sparsity

The lower we set ar_sparsity, the fewer non-zero weights the model fits. Here we set it to 1%, which, with 6*12 = 72 lags, should lead to a single non-zero lag.

Note: Extreme values can lead to training instability.
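
If training does diverge at such extreme settings, one option is to train more conservatively. A sketch; the smaller learning rate and longer training here are illustrative assumptions, not tuned values:

m_stable = NeuralProphet(
    n_lags=6*12,
    n_forecasts=3*12,
    learning_rate=0.001,  # smaller step size for stability (assumption)
    epochs=100,           # train longer to compensate (assumption)
    ar_sparsity=0.01,
)
metrics_stable = m_stable.fit(df, freq="5min")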

[11]:
m = NeuralProphet(
    n_lags=6*12,
    n_forecasts=3*12,
    n_changepoints=0,
    daily_seasonality=False,
    weekly_seasonality=False,
    learning_rate=0.01,
    ar_sparsity=0.01,  # target roughly 1% non-zero AR coefficients
)
metrics = m.fit(df, freq='5min')

[12]:
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_16_0.png
[13]:
m = m.highlight_nth_step_ahead_of_each_forecast(1)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_18_0.png
[14]:
m = m.highlight_nth_step_ahead_of_each_forecast(36)
fig_param = m.plot_parameters()
_images/sparse_autoregression_yosemite_temps_19_0.png
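
To wrap up, the three sparsity settings can be compared in a single loop. A sketch; it assumes the metrics returned by fit contain an MAE column, which may vary between versions:

# Refit the model at each sparsity level and report the final training MAE
for sparsity in [0.5, 0.1, 0.01]:
    m_s = NeuralProphet(
        n_lags=6*12,
        n_forecasts=3*12,
        n_changepoints=0,
        daily_seasonality=False,
        weekly_seasonality=False,
        learning_rate=0.01,
        ar_sparsity=sparsity,
    )
    metrics_s = m_s.fit(df, freq="5min")
    print(f"ar_sparsity={sparsity}: final MAE = {metrics_s['MAE'].iloc[-1]:.3f}")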