Today, give a try to Techtonique web app, a tool designed to help you make informed, data-driven decisions using Mathematics, Statistics, Machine Learning, and Data Visualization. Here is a tutorial with audio, video, code, and slides: https://moudiki2.gumroad.com/l/nrhgb. 100 API requests are now (and forever) offered to every user every month, no matter the pricing tier.
Link to notebook at the end.
0 - Install packages
!pip install nnetsauce --upgrade --no-cache-dir
A Random Vector Functional Link (RVFL) artificial neural network with 2 regularization parameters has been successfully used for forecasting in professional settings (available soon, with increased flexibility, in https://www.techtonique.net/ and Microsoft Excel; stay tuned):
- Introduced in Multiple Time Series Forecasting Using Quasi-Randomized Functional Link Neural Networks: https://www.mdpi.com/2227-9091/6/1/22
- https://www.ressources-actuarielles.net/EXT/ISFA/fp-isfa.nsf/0/0b9df464e9543283c1256f130067b2f9/$FILE/GSE_RVFL_LAL.pdf
- https://journee-iard-2023.institutdesactuaires.com/global/gene/link.php?doc_id=19376&fg=1
- https://www.institutdesactuaires.com/docs/mem/ed9acfdd661fe96f70d1078f2b70bade.pdf
- https://www.ressources-actuarielles.net/EXT/ISFA/1226-02.nsf/d512ad5b22d73cc1c1257052003f1aed/957f021200c51ba5c1258c340023c8da/$FILE/m%C3%A9moire_ia_pt_diallo.pdf
All the implementations cited in these works were done in R, but this post considers several Python implementations/adaptations/flavors of the model, with potentially increased model capacity (and more regularization parameters). The models used here as workhorses for multivariate time series forecasting are described in https://www.researchgate.net/publication/339512391_Quasi-randomized_networks_for_regression_and_classification_with_two_shrinkage_parameters (and could probably be tuned further these days).
It's worth mentioning that these models are amenable to backpropagation (see https://thierrymoudiki.github.io/blog/2025/06/24/r/backprop-qrnn-r-version and https://thierrymoudiki.github.io/blog/2025/06/23/python/backprop-qrnn, although it is slower in this setting), but there's no evidence that backpropagation would lead to a better model.
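To make the idea behind these workhorses concrete, here is a minimal NumPy sketch of an RVFL-style regression with two shrinkage parameters: a fixed random hidden layer is appended to the inputs, and a ridge regression penalizes the original features and the hidden features separately. This is illustrative only, not nnetsauce's internals; all names and values below are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# toy data: 100 observations, 3 features
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

n_hidden = 5
W = rng.normal(size=(X.shape[1], n_hidden))      # fixed random hidden weights
H = np.maximum(X @ W, 0)                         # ReLU hidden features
Z = np.hstack([X, H])                            # direct link: [X | H]

lambda1, lambda2 = 0.01, 100.0                   # two shrinkage parameters
D = np.diag([lambda1] * X.shape[1] + [lambda2] * n_hidden)
beta = np.linalg.solve(Z.T @ Z + D, Z.T @ y)     # penalized least squares

y_hat = Z @ beta                                 # in-sample fit
```

Only the output layer `beta` is learned; the hidden weights `W` stay random, which is what makes the fit a closed-form ridge solve rather than an iterative optimization.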
1 - Get data set
import pandas as pd
url = "https://raw.githubusercontent.com/Techtonique/"
url += "datasets/main/time_series/multivariate/"
url += "ice_cream_vs_heater.csv"
df_temp = pd.read_csv(url)
df_temp.index = pd.DatetimeIndex(df_temp.date)
df_icecream = df_temp.drop(columns=['date']).diff().dropna()
display(df_icecream.describe())
df_icecream.plot()
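Note that the series are first-differenced (`.diff().dropna()`) before modeling. Forecasts produced on the differenced scale can be mapped back to the original levels by cumulating from the last observed value. A small self-contained sketch (the toy series and forecasts below are made up for illustration):

```python
import numpy as np
import pandas as pd

# toy level series standing in for one column of the raw data
levels = pd.Series([10.0, 12.0, 11.0, 15.0, 14.0])

diffed = levels.diff().dropna()            # modeling scale, as in df_icecream

# hypothetical forecasts on the differenced scale
fcast_diff = np.array([1.0, -0.5, 2.0])

# invert the differencing: cumulate from the last observed level
fcast_levels = levels.iloc[-1] + np.cumsum(fcast_diff)  # [15.0, 14.5, 16.5]
```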
| | heater | icecream |
|---|---|---|
| count | 197.00 | 197.00 |
| mean | -0.02 | 0.31 |
| std | 5.78 | 7.73 |
| min | -23.00 | -32.00 |
| 25% | -2.00 | -3.00 |
| 50% | 0.00 | 1.00 |
| 75% | 3.00 | 5.00 |
| max | 19.00 | 20.00 |
2 - Use BayesianRVFL2Regressor as a workhorse (fully Bayesian approach)
Remember to tune the model's hyperparameters, for example by using https://thierrymoudiki.github.io/blog/2024/12/09/python/bayesconfoptim.
import nnetsauce as ns
regr = ns.MTS(obj=ns.BayesianRVFL2Regressor(s1=1, s2=100, dropout=0.1, sigma=15),
n_hidden_features=5, lags=20, show_progress=False, verbose=False)
regr.fit(df_icecream) # fit the model
regr.predict(h=50, return_std=True)
regr.plot("heater", type_plot="pi")
regr.plot("icecream", type_plot="pi")
3 - Use Ridge2Regressor as a workhorse (frequentist approach --> conformal prediction)
Remember to tune the model's hyperparameters, for example by using https://thierrymoudiki.github.io/blog/2024/12/09/python/bayesconfoptim.
regr = ns.MTS(obj=ns.Ridge2Regressor(lambda1=0.01, lambda2=1000, n_hidden_features=5),
lags=10, n_hidden_features=10, replications=250, type_pi="scp2-kde")
regr.fit(df_icecream) # fit the model
regr.predict(h=50)
regr.plot("heater", type_plot="pi")
regr.plot("icecream", type_plot="pi")
4 - Using the R port
This is the original R implementation, but in Python. See https://docs.techtonique.net/ahead/index.html and https://docs.techtonique.net/ahead_python/ahead.html#Ridge2Regressor.
!pip install ahead --verbose
import os
import numpy as np
import pandas as pd
from ahead import Ridge2Regressor
from time import time
# Forecasting horizon
h = 50
# multivariate ts forecasting
print("Example 1 -----")
obj_MTS = Ridge2Regressor(h = h, lags=20, date_formatting = "original")
start = time()
obj_MTS.forecast(df_icecream)
print("Elapsed", time()-start)
Example 1 ----- Elapsed 0.03411126136779785
obj_MTS.plot("heater")
obj_MTS.plot("icecream")
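Under the hood, these MTS-style workhorses regress each series on the last `lags` values of all series. A minimal sketch of that lag embedding (illustrative only, not the exact internals of nnetsauce or ahead):

```python
import numpy as np

# toy multivariate series: 10 dates, 2 series
series = np.arange(20, dtype=float).reshape(10, 2)
lags = 3

n, p = series.shape
rows = []
for t in range(lags, n):
    rows.append(series[t - lags:t].ravel())  # stacked lagged values
X = np.array(rows)                            # shape (n - lags, lags * p)
Y = series[lags:]                             # one-step-ahead targets
```

With `lags=3` on 2 series, each row of `X` holds 6 lagged values, and any regressor (here, the regularized RVFL) can then be fit on `(X, Y)` column by column.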