Today, give the Techtonique web app a try: a tool designed to help you make informed, data-driven decisions using Mathematics, Statistics, Machine Learning, and Data Visualization. Here is a tutorial with audio, video, code, and slides: https://moudiki2.gumroad.com/l/nrhgb. 100 API requests are now (and forever) offered to every user every month, regardless of the pricing tier.
The model presented here is a frequentist (conformalized) version of the Bayesian one presented last week in #152. The model is implemented in learningmachine, both in Python and R. Model explanations are given as sensitivity analyses.
0 - install packages
For R
utils::install.packages(c("rmarkdown", "reticulate", "remotes"))
Installing packages into '/cloud/lib/x86_64-pc-linux-gnu-library/4.4'
(as 'lib' is unspecified)
remotes::install_github("thierrymoudiki/bayesianrvfl")
remotes::install_github("Techtonique/learningmachine")
Skipping install of 'bayesianrvfl' from a github remote, the SHA1 (a8e9e78a) has not changed since last install.
  Use `force = TRUE` to force installation
Skipping install of 'learningmachine' from a github remote, the SHA1 (6b930284) has not changed since last install.
  Use `force = TRUE` to force installation
library("learningmachine")
Loading required package: randtoolbox
Loading required package: rngWELL
This is randtoolbox. For an overview, type 'help("randtoolbox")'.
Loading required package: tseries
Registered S3 method overwritten by 'quantmod':
  method            from
  as.zoo.data.frame zoo
Loading required package: memoise
Loading required package: foreach
Loading required package: skimr
Loading required package: snow
Loading required package: doSNOW
Loading required package: iterators
For Python
pip install matplotlib
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: matplotlib in /cloud/python/lib/python3.8/site-packages (3.7.5)
(matplotlib's dependencies already satisfied)
[notice] A new release of pip is available: 23.0.1 -> 24.2
[notice] To update, run: /opt/python/3.8.17/bin/python3.8 -m pip install --upgrade pip
pip install git+https://github.com/Techtonique/learningmachine_python.git
Defaulting to user installation because normal site-packages is not writeable
Collecting git+https://github.com/Techtonique/learningmachine_python.git
  Cloning https://github.com/Techtonique/learningmachine_python.git to /tmp/pip-req-build-37h1oa6g
  Resolved https://github.com/Techtonique/learningmachine_python.git to commit 3ec7ca96df71add6218d55db9ef5d8eb40275877
  Preparing metadata (setup.py): done
(requirements for learningmachine==2.2.2 -- numpy, pandas, rpy2, scikit-learn, scipy -- and their dependencies already satisfied)
1 - Python code version
import packages
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import warnings
import learningmachine as lm
from time import time
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
/cloud/python/lib/python3.8/site-packages/rpy2/rinterface_lib/embedded.py:276: UserWarning: R was initialized outside of rpy2 (R_NilValue != NULL). Trying to use it nevertheless.
  warnings.warn(msg)
plotting function
warnings.filterwarnings('ignore')

split_color = 'green'
split_color2 = 'orange'
local_color = 'gray'

def plot_func(x,
              y,
              y_u=None,
              y_l=None,
              pred=None,
              shade_color=split_color,
              method_name="",
              title=""):
    fig = plt.figure()
    plt.plot(x, y, 'k.', alpha=.3, markersize=10,
             fillstyle='full', label=u'Test set observations')
    if (y_u is not None) and (y_l is not None):
        plt.fill(np.concatenate([x, x[::-1]]),
                 np.concatenate([y_u, y_l[::-1]]),
                 alpha=.3, fc=shade_color, ec='None',
                 label=method_name + ' Prediction interval')
    if pred is not None:
        plt.plot(x, pred, 'k--', lw=2, alpha=0.9,
                 label=u'Predicted value')
    #plt.ylim([-2.5, 7])
    plt.xlabel('$X$')
    plt.ylabel('$Y$')
    plt.legend(loc='upper right')
    plt.title(title)
    plt.show()
fit_obj = lm.Regressor(method="rvfl",
                       pi_method="kdejackknifeplus",
                       nb_hidden=3)

data = fetch_california_housing()
X, y = data.data[:600], data.target[:600]
X = pd.DataFrame(X, columns=data.feature_names)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=123)

start = time()
fit_obj.fit(X_train, y_train, reg_lambda=1)
print(f"Elapsed time: {time() - start}")
preds = fit_obj.predict(X_test)
Regressor(method='rvfl', nb_hidden=3, pi_method='kdejackknifeplus')
Elapsed time: 11.298886775970459
preds: DescribeResult(preds=array([2.49357776, 1.43407668, 1.5975387 , ...,
       2.18234363, 2.12271321, 0.94102412]),
    sims=array([[2.52197294, 2.65764086, 2.40986397, ..., 2.78250935, 2.00710776, 2.45581687],
       ...,
       [0.48279083, 0.94856269, 0.97790733, ..., 1.0927788 , 1.23311912, 0.40108719]]),
    lower=array([1.93689669e+00, 6.14177968e-01, 1.04195634e+00, ...,
       1.34041046e+00, 1.73301081e-01]),
    upper=array([2.99397933, 1.96132754, 1.9741853 , ...,
       2.64429111, 1.49723436]))
coverage rate: 0.8
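The coverage rate printed at the end is the proportion of test responses that fall inside their prediction interval. A small helper makes this explicit (illustrative naming; this is not part of learningmachine's API):

```python
import numpy as np

def coverage_rate(y_true, lower, upper):
    """Proportion of held-out observations lying inside their
    prediction interval [lower, upper]."""
    y_true, lower, upper = map(np.asarray, (y_true, lower, upper))
    return float(np.mean((y_true >= lower) & (y_true <= upper)))
```

With the notations above, `coverage_rate(y_test, preds.lower, preds.upper)` would return the 0.8 reported here.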
plot_func(x=range(len(y_test)),
          y=y_test,
          y_u=preds.upper,
          y_l=preds.lower,
          pred=preds.preds,
          method_name="before update",
          title="")
# update
fit_obj.update(X_test.iloc[0, :], y_test[0])
Regressor(method='rvfl', nb_hidden=3, pi_method='kdejackknifeplus')
coverage rate: 0.8067226890756303
plot_func(x=range(len(y_test[:-1])),
          y=y_test[:-1],
          y_u=preds.upper,
          y_l=preds.lower,
          pred=preds.preds,
          method_name="after update",
          title="")
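The update step follows a generic adaptive pattern: predict with an interval, observe the new response, score it, and fold it into the model before the next prediction. A toy, self-contained sketch of that loop, using a running mean as the "model" (purely illustrative; this is not learningmachine's update rule):

```python
import numpy as np

def online_conformal_mean(stream, alpha=0.2, warmup=20):
    """Toy adaptive conformal loop on a data stream: at each step, predict
    an interval from past residuals, check whether the new observation is
    covered, then update both the model and the residual scores."""
    history, residuals, hits = [], [], []
    for y in stream:
        if len(history) >= warmup:
            pred = np.mean(history)                      # point prediction
            q = np.quantile(residuals, 1 - alpha)        # interval half-width
            hits.append(pred - q <= y <= pred + q)       # was y covered?
            residuals.append(abs(y - pred))              # update the scores
        elif history:
            residuals.append(abs(y - np.mean(history)))  # warm up the scores
        history.append(y)                                # update the "model"
    return float(np.mean(hits))                          # empirical coverage
```

On a stationary stream, the returned empirical coverage hovers around 1 - alpha, which is exactly the behaviour monitored in the `coverage rate` printouts above.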
2 - R code version
X <- as.matrix(mtcars[,-1])
y <- mtcars$mpg
set.seed(123)
(index_train <- base::sample.int(n = nrow(X),
                                 size = floor(0.7*nrow(X)),
                                 replace = FALSE))
 [1] 31 15 19 14  3 10 18 22 11  5 20 29 23 30  9 28  8 27  7 32 26 17
X_train <- X[index_train, ]
y_train <- y[index_train]
X_test <- X[-index_train, ]
y_test <- y[-index_train]
dim(X_train)
[1] 22 10
dim(X_test)
[1] 10 10
obj <- learningmachine::Regressor$new(method = "rvfl",
                                      nb_hidden = 50L,
                                      pi_method = "splitconformal")
obj$get_type()
[1] "regression"
obj$get_name()
[1] "Regressor"
t0 <- proc.time()[3]
obj$fit(X_train, y_train, reg_lambda = 0.01)
cat("Elapsed: ", proc.time()[3] - t0, "s \n")
Elapsed:  0.01 s
print(obj$predict(X_test))
$preds
Mazda RX4 Mazda RX4 Wag Hornet 4 Drive Valiant
21.350888 19.789387 13.106761 9.695310
Merc 450SE Merc 450SL Lincoln Continental Toyota Corona
11.131161 12.568682 2.044672 19.289805
Camaro Z28 Pontiac Firebird
14.847878 12.282272
$lower
Mazda RX4 Mazda RX4 Wag Hornet 4 Drive Valiant
12.3508879 10.7893873 4.1067608 0.6953102
Merc 450SE Merc 450SL Lincoln Continental Toyota Corona
2.1311611 3.5686817 -6.9553279 10.2898053
Camaro Z28 Pontiac Firebird
5.8478777 3.2822719
$upper
Mazda RX4 Mazda RX4 Wag Hornet 4 Drive Valiant
30.35089 28.78939 22.10676 18.69531
Merc 450SE Merc 450SL Lincoln Continental Toyota Corona
20.13116 21.56868 11.04467 28.28981
Camaro Z28 Pontiac Firebird
23.84788 21.28227
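The intervals above are split conformal (pi_method = "splitconformal"): fit on one half of the training set, compute absolute residuals on the held-out half, and pad the point predictions with a quantile of those residuals. A minimal, self-contained sketch in Python, for any estimator with fit/predict (illustrative only; not learningmachine's internal code):

```python
import numpy as np

def split_conformal_interval(model, X_train, y_train, X_test, alpha=0.1, seed=42):
    """Minimal split-conformal prediction intervals: fit on one half,
    calibrate the interval width on the other half (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(y_train)
    idx = rng.permutation(n)
    fit_idx, cal_idx = idx[: n // 2], idx[n // 2:]
    model.fit(X_train[fit_idx], y_train[fit_idx])
    # nonconformity scores: absolute residuals on the calibration half
    scores = np.abs(y_train[cal_idx] - model.predict(X_train[cal_idx]))
    # finite-sample-corrected (1 - alpha) empirical quantile
    level = min(1.0, np.ceil((1 - alpha) * (len(scores) + 1)) / len(scores))
    q = np.quantile(scores, level)
    preds = model.predict(X_test)
    return preds - q, preds, preds + q
```

With this recipe, roughly a (1 - alpha) fraction of new observations fall inside their interval, without any distributional assumption on the residuals.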
obj$summary(X_test, y=y_test, show_progress=FALSE)
$R_squared
[1] -1.505856
$R_squared_adj
[1] 23.55271
$Residuals
Min. 1st Qu. Median Mean 3rd Qu. Max.
-1.548 1.461 5.000 4.349 7.949 8.405
$Coverage_rate
[1] 100
$ttests
estimate lower upper p-value signif
cyl 137.649985 39.777048 235.5229227 1.115728e-02 *
disp -2.406399 -4.650678 -0.1621204 3.825959e-02 *
hp -0.527573 -1.402043 0.3468975 2.054686e-01
drat 707.372951 246.095138 1168.6507638 7.059500e-03 **
wt -500.429007 -565.047979 -435.8100352 2.910469e-08 ***
qsec -89.930939 -124.899691 -54.9621860 2.537870e-04 ***
vs 234.198406 -127.886990 596.2838006 1.774484e-01
am -235.789718 -512.422513 40.8430776 8.592503e-02 .
gear 52.646721 -6.640614 111.9340567 7.547657e-02 .
carb -17.100561 -87.819649 53.6185270 5.976705e-01
$effects
── Data Summary ────────────────────────
Values
Name effects
Number of rows 10
Number of columns 10
_______________________
Column type frequency:
numeric 10
________________________
Group variables None
── Variable type: numeric ──────────────────────────────────────────────────────
skim_variable mean sd p0 p25 p50 p75 p100
1 cyl 138. 137. -8.40 75.8 91.1 98.6 394.
2 disp -2.41 3.14 -8.46 -1.32 -1.08 -0.775 -0.300
3 hp -0.528 1.22 -3.40 -0.695 -0.188 0.0137 0.893
4 drat 707. 645. 55.7 388. 482. 563. 1939.
5 wt -500. 90.3 -698. -538. -500. -458. -377.
6 qsec -89.9 48.9 -145. -128. -102. -64.0 2.67
7 vs 234. 506. -121. -13.2 36.8 53.2 1269.
8 am -236. 387. -653. -450. -397. -168. 519.
9 gear 52.6 82.9 -107. -4.69 66.2 112. 170.
10 carb -17.1 98.9 -117. -64.6 -60.6 -17.5 171.
hist
1 ▂▇▁▁▂
2 ▂▁▁▁▇
3 ▁▁▁▇▂
4 ▅▇▁▁▃
5 ▂▁▆▇▃
6 ▇▆▁▂▃
7 ▇▁▁▁▂
8 ▆▇▂▁▃
9 ▂▅▅▅▇
10 ▇▂▁▁▂
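The $ttests and $effects tables are sensitivity analyses: numerical derivatives of the prediction with respect to each covariate, computed row by row on the test set, then summarized and t-tested against a zero average effect. A finite-difference sketch of the idea (illustrative; not learningmachine's exact implementation):

```python
import numpy as np
from scipy import stats

def sensitivity_effects(predict, X, eps=1e-4):
    """Central finite-difference effects d prediction / d x_j,
    one row of effects per test observation (illustrative sketch)."""
    X = np.asarray(X, dtype=float)
    effects = np.empty_like(X)
    for j in range(X.shape[1]):
        X_hi, X_lo = X.copy(), X.copy()
        X_hi[:, j] += eps
        X_lo[:, j] -= eps
        effects[:, j] = (predict(X_hi) - predict(X_lo)) / (2 * eps)
    return effects

def effects_ttests(effects):
    """One-sample t-test per covariate: is the average effect nonzero?"""
    t, p = stats.ttest_1samp(effects, popmean=0.0, axis=0)
    return effects.mean(axis=0), p
```

The `estimate` column above corresponds to the mean effect per covariate, and the skimr table summarizes the distribution of the row-wise effects.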
t0 <- proc.time()[3]
obj$fit(X_train, y_train)
cat("Elapsed: ", proc.time()[3] - t0, "s \n")
Elapsed:  0.128 s
[1] 1
update RVFL model
previous_coefs <- drop(obj$model$coef)
newx <- X_test[1, ]
newy <- y_test[1]
new_X_test <- X_test[-1, ]
new_y_test <- y_test[-1]
t0 <- proc.time()[3]
obj$update(newx, newy)
cat("Elapsed: ", proc.time()[3] - t0, "s \n")
Elapsed:  0.242 s
summary(previous_coefs)
    Min.  1st Qu.   Median     Mean  3rd Qu.     Max.
-0.68212 -0.26567 -0.05157  0.00700  0.21046  2.19222
summary(drop(obj$model$coef) - previous_coefs)
     Min.   1st Qu.    Median      Mean   3rd Qu.      Max.
-0.030666 -0.002610  0.004189  0.002917  0.011386  0.025243
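The coefficient summaries before and after the update show a small, cheap adjustment rather than a full refit. One classical way to achieve this for a (quasi-)linear model is recursive least squares via the Sherman-Morrison identity; a sketch under that assumption (not necessarily learningmachine's update rule):

```python
import numpy as np

def rls_update(coef, P, x_new, y_new):
    """One recursive-least-squares step: update the coefficients and the
    inverse regularized Gram matrix P = (X'X + lambda*I)^{-1} with a single
    new observation, using the Sherman-Morrison identity (sketch)."""
    x_new = np.asarray(x_new, dtype=float).ravel()
    Px = P @ x_new
    denom = 1.0 + x_new @ Px
    P_new = P - np.outer(Px, Px) / denom      # Sherman-Morrison rank-1 update
    err = y_new - x_new @ coef                # prediction error on the new row
    coef_new = coef + P_new @ x_new * err     # gain * error
    return coef_new, P_new
```

A single `rls_update` step reproduces the batch ridge fit that includes the new row, at O(p^2) cost per update instead of O(p^3) for refitting from scratch.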
obj$summary(new_X_test, y=new_y_test, show_progress=FALSE)
$R_squared
[1] -1.809339
$R_squared_adj
[1] 12.23735
$Residuals
Min. 1st Qu. Median Mean 3rd Qu. Max.
-1.168 2.513 5.541 5.058 8.185 8.703
$Coverage_rate
[1] 100
$ttests
estimate lower upper p-value signif
cyl 111.6701473 17.076928 206.2633669 2.615518e-02 *
disp -1.7983224 -3.876380 0.2797349 8.106884e-02 .
hp -0.4167545 -1.501658 0.6681495 4.015523e-01
drat 569.9102780 148.862037 990.9585186 1.420088e-02 *
wt -504.1496696 -583.757006 -424.5423330 4.741273e-07 ***
qsec -107.9102921 -138.571336 -77.2492482 3.936777e-05 ***
vs 145.0280002 -173.164419 463.2204193 3.239468e-01
am -319.6910568 -566.618653 -72.7634604 1.745263e-02 *
gear 57.7630332 -18.934712 134.4607782 1.206459e-01
carb -42.9572292 -108.690903 22.7764447 1.702409e-01
$effects
── Data Summary ────────────────────────
Values
Name effects
Number of rows 9
Number of columns 10
_______________________
Column type frequency:
numeric 10
________________________
Group variables None
── Variable type: numeric ──────────────────────────────────────────────────────
skim_variable mean sd p0 p25 p50 p75 p100
1 cyl 112. 123. -13.5 64.5 93.6 93.9 426.
2 disp -1.80 2.70 -8.94 -1.41 -0.805 -0.689 -0.361
3 hp -0.417 1.41 -3.54 -0.679 -0.0942 -0.0556 1.19
4 drat 570. 548. 36.8 371. 439. 501. 1972.
5 wt -504. 104. -742. -523. -497. -461. -382.
6 qsec -108. 39.9 -152. -143. -115. -93.0 -35.9
7 vs 145. 414. -116. -23.9 51.1 81.2 1231.
8 am -320. 321. -575. -479. -395. -368. 465.
9 gear 57.8 99.8 -113. 1.22 35.2 130. 196.
10 carb -43.0 85.5 -129. -79.6 -77.9 -22.5 165.
hist
1 ▅▇▁▁▂
2 ▁▁▁▁▇
3 ▂▁▂▇▃
4 ▅▇▁▁▂
5 ▂▁▂▇▃
6 ▇▅▅▂▂
7 ▇▁▁▁▁
8 ▇▁▁▁▁
9 ▃▇▇▇▇
10 ▇▅▁▁▂
res <- obj$predict(X = new_X_test)
new_y_train <- c(y_train, newy)
plot(c(new_y_train, res$preds), type = 'l',
     main = "", ylab = "",
     ylim = c(min(c(res$upper, res$lower, y)),
              max(c(res$upper, res$lower, y))))
lines(c(new_y_train, res$upper), col="gray60")
lines(c(new_y_train, res$lower), col="gray60")
lines(c(new_y_train, res$preds), col = "red")
lines(c(new_y_train, new_y_test), col = "blue")
abline(v = length(y_train), lty=2, col="black")
[1] 1
update RVFL model (Pt.2)
newx <- X_test[2, ]
newy <- y_test[2]
new_X_test <- X_test[-c(1, 2), ]
new_y_test <- y_test[-c(1, 2)]
t0 <- proc.time()[3]
obj$update(newx, newy)
cat("Elapsed: ", proc.time()[3] - t0, "s \n")
Elapsed:  0.077 s
obj$summary(new_X_test, y=new_y_test, show_progress=FALSE)
$R_squared
[1] -3.356623
$R_squared_adj
[1] 11.16545
$Residuals
Min. 1st Qu. Median Mean 3rd Qu. Max.
-1.950 5.030 6.374 6.369 8.774 11.528
$Coverage_rate
[1] 75
$ttests
estimate lower upper p-value signif
cyl 40.8981137 6.878148 74.9180798 2.494779e-02 *
disp -0.7335494 -1.206939 -0.2601595 8.026181e-03 **
hp -0.8233606 -2.198927 0.5522055 1.998737e-01
drat 549.7206897 416.053783 683.3875968 2.570765e-05 ***
wt -469.9351032 -535.877454 -403.9927527 6.344763e-07 ***
qsec -116.6183871 -156.767393 -76.4693814 2.380078e-04 ***
vs -194.4213942 -288.046178 -100.7966103 1.732503e-03 **
am -395.7216847 -562.762331 -228.6810387 8.143911e-04 ***
gear 53.0732573 -59.833653 165.9801679 3.030574e-01
carb -25.9448064 -63.759959 11.8703467 1.487567e-01
$effects
── Data Summary ────────────────────────
Values
Name effects
Number of rows 8
Number of columns 10
_______________________
Column type frequency:
numeric 10
________________________
Group variables None
── Variable type: numeric ──────────────────────────────────────────────────────
skim_variable mean sd p0 p25 p50 p75 p100
1 cyl 40.9 40.7 -40.5 23.9 56.3 69.9 77.8
2 disp -0.734 0.566 -1.64 -1.03 -0.571 -0.372 -0.139
3 hp -0.823 1.65 -3.99 -1.18 -0.974 -0.196 1.25
4 drat 550. 160. 170. 549. 606. 642. 643.
5 wt -470. 78.9 -543. -537. -489. -437. -336.
6 qsec -117. 48.0 -179. -143. -131. -99.1 -29.9
7 vs -194. 112. -377. -283. -162. -120. -46.3
8 am -396. 200. -719. -481. -357. -319. -67.7
9 gear 53.1 135. -143. -23.9 16.5 172. 231.
10 carb -25.9 45.2 -101. -48.8 -23.8 -9.36 45.7
hist
1 ▂▂▂▁▇
2 ▅▁▂▇▅
3 ▂▁▇▂▃
4 ▁▁▁▁▇
5 ▇▅▂▁▅
6 ▂▇▂▂▂
7 ▂▅▂▇▂
8 ▃▁▇▂▂
9 ▂▅▅▁▇
10 ▂▅▇▁▅
res <- obj$predict(X = new_X_test)
new_y_train <- c(y_train, y_test[c(1, 2)])
plot(c(new_y_train, res$preds), type = 'l',
     main = "", ylab = "",
     ylim = c(min(c(res$upper, res$lower, y)),
              max(c(res$upper, res$lower, y))))
lines(c(new_y_train, res$upper), col="gray60")
lines(c(new_y_train, res$lower), col="gray60")
lines(c(new_y_train, res$preds), col = "red")
lines(c(new_y_train, new_y_test), col = "blue")
abline(v = length(y_train), lty=2, col="black")
[1] 0.75
For attribution, please cite this work as:
T. Moudiki (2024-08-19). Conformalized adaptive (online/streaming) learning using learningmachine in Python and R. Retrieved from https://thierrymoudiki.github.io/blog/2024/08/19/r/python/code-conformal-adaptive-RVFL
BibTeX citation:
@misc{tmoudiki20240819,
  author = {T. Moudiki},
  title = {Conformalized adaptive (online/streaming) learning using learningmachine in Python and R},
  url = {https://thierrymoudiki.github.io/blog/2024/08/19/r/python/code-conformal-adaptive-RVFL},
  year = {2024}
}
- 10 uncertainty quantification methods in nnetsauce forecasting Jul 1, 2024
- Forecasting with XGBoost embedded in Quasi-Randomized Neural Networks Jun 24, 2024
- Forecasting Monthly Airline Passenger Numbers with Quasi-Randomized Neural Networks Jun 17, 2024
- Automated hyperparameter tuning using any conformalized surrogate Jun 9, 2024
- Recognizing handwritten digits with Ridge2Classifier Jun 3, 2024
- Forecasting the Economy May 27, 2024
- A detailed introduction to Deep Quasi-Randomized 'neural' networks May 19, 2024
- Probability of receiving a loan; using learningmachine May 12, 2024
- mlsauce's `v0.18.2`: various examples and benchmarks with dimension reduction May 6, 2024
- mlsauce's `v0.17.0`: boosting with Elastic Net, polynomials and heterogeneity in explanatory variables Apr 29, 2024
- mlsauce's `v0.13.0`: taking into account inputs heterogeneity through clustering Apr 21, 2024
- mlsauce's `v0.12.0`: prediction intervals for LSBoostRegressor Apr 15, 2024
- Conformalized predictive simulations for univariate time series on more than 250 data sets Apr 7, 2024
- learningmachine v1.1.2: for Python Apr 1, 2024
- learningmachine v1.0.0: prediction intervals around the probability of the event 'a tumor being malignant' Mar 25, 2024
- Bayesian inference and conformal prediction (prediction intervals) in nnetsauce v0.18.1 Mar 18, 2024
- Multiple examples of Machine Learning forecasting with ahead Mar 11, 2024
- rtopy (v0.1.1): calling R functions in Python Mar 4, 2024
- ahead forecasting (v0.10.0): fast time series model calibration and Python plots Feb 26, 2024
- A plethora of datasets at your fingertips Part3: how many times do couples cheat on each other? Feb 19, 2024
- nnetsauce's introduction as of 2024-02-11 (new version 0.17.0) Feb 11, 2024
- Tuning Machine Learning models with GPopt's new version Part 2 Feb 5, 2024
- Tuning Machine Learning models with GPopt's new version Jan 29, 2024
- Subsampling continuous and discrete response variables Jan 22, 2024
- DeepMTS, a Deep Learning Model for Multivariate Time Series Jan 15, 2024
- A classifier that's very accurate (and deep) Pt.2: there are > 90 classifiers in nnetsauce Jan 8, 2024
- learningmachine: prediction intervals for conformalized Kernel ridge regression and Random Forest Jan 1, 2024
- A plethora of datasets at your fingertips Part2: how many times do couples cheat on each other? Descriptive analytics, interpretability and prediction intervals using conformal prediction Dec 25, 2023
- Diffusion models in Python with esgtoolkit (Part2) Dec 18, 2023
- Diffusion models in Python with esgtoolkit Dec 11, 2023
- Julia packaging at the command line Dec 4, 2023
- Quasi-randomized nnetworks in Julia, Python and R Nov 27, 2023
- A plethora of datasets at your fingertips Nov 20, 2023
- A classifier that's very accurate (and deep) Nov 12, 2023
- mlsauce version 0.8.10: Statistical/Machine Learning with Python and R Nov 5, 2023
- AutoML in nnetsauce (randomized and quasi-randomized nnetworks) Pt.2: multivariate time series forecasting Oct 29, 2023
- AutoML in nnetsauce (randomized and quasi-randomized nnetworks) Oct 22, 2023
- Version v0.14.0 of nnetsauce for R and Python Oct 16, 2023
- A diffusion model: G2++ Oct 9, 2023
- Diffusion models in ESGtoolkit + announcements Oct 2, 2023
- An infinity of time series forecasting models in nnetsauce (Part 2 with uncertainty quantification) Sep 25, 2023
- (News from) forecasting in Python with ahead (progress bars and plots) Sep 18, 2023
- Forecasting in Python with ahead Sep 11, 2023
- Risk-neutralize simulations Sep 4, 2023
- Comparing cross-validation results using crossval_ml and boxplots Aug 27, 2023
- Reminder Apr 30, 2023
- Did you ask ChatGPT about who you are? Apr 16, 2023
- A new version of nnetsauce (randomized and quasi-randomized 'neural' networks) Apr 2, 2023
- Simple interfaces to the forecasting API Nov 23, 2022
- A web application for forecasting in Python, R, Ruby, C#, JavaScript, PHP, Go, Rust, Java, MATLAB, etc. Nov 2, 2022
- Prediction intervals (not only) for Boosted Configuration Networks in Python Oct 5, 2022
- Boosted Configuration (neural) Networks Pt. 2 Sep 3, 2022
- Boosted Configuration (_neural_) Networks for classification Jul 21, 2022
- A Machine Learning workflow using Techtonique Jun 6, 2022
- Super Mario Bros © in the browser using PyScript May 8, 2022
- News from ESGtoolkit, ycinterextra, and nnetsauce Apr 4, 2022
- Explaining a Keras _neural_ network predictions with the-teller Mar 11, 2022
- New version of nnetsauce -- various quasi-randomized networks Feb 12, 2022
- A dashboard illustrating bivariate time series forecasting with `ahead` Jan 14, 2022
- Hundreds of Statistical/Machine Learning models for univariate time series, using ahead, ranger, xgboost, and caret Dec 20, 2021
- Forecasting with `ahead` (Python version) Dec 13, 2021
- Tuning and interpreting LSBoost Nov 15, 2021
- Time series cross-validation using `crossvalidation` (Part 2) Nov 7, 2021
- Fast and scalable forecasting with ahead::ridge2f Oct 31, 2021
- Automatic Forecasting with `ahead::dynrmf` and Ridge regression Oct 22, 2021
- Forecasting with `ahead` Oct 15, 2021
- Classification using linear regression Sep 26, 2021
- `crossvalidation` and random search for calibrating support vector machines Aug 6, 2021
- parallel grid search cross-validation using `crossvalidation` Jul 31, 2021
- `crossvalidation` on R-universe, plus a classification example Jul 23, 2021
- Documentation and source code for GPopt, a package for Bayesian optimization Jul 2, 2021
- Hyperparameters tuning with GPopt Jun 11, 2021
- A forecasting tool (API) with examples in curl, R, Python May 28, 2021
- Bayesian Optimization with GPopt Part 2 (save and resume) Apr 30, 2021
- Bayesian Optimization with GPopt Apr 16, 2021
- Compatibility of nnetsauce and mlsauce with scikit-learn Mar 26, 2021
- Explaining xgboost predictions with the teller Mar 12, 2021
- An infinity of time series models in nnetsauce Mar 6, 2021
- New activation functions in mlsauce's LSBoost Feb 12, 2021
- 2020 recap, Gradient Boosting, Generalized Linear Models, AdaOpt with nnetsauce and mlsauce Dec 29, 2020
- A deeper learning architecture in nnetsauce Dec 18, 2020
- Classify penguins with nnetsauce's MultitaskClassifier Dec 11, 2020
- Bayesian forecasting for uni/multivariate time series Dec 4, 2020
- Generalized nonlinear models in nnetsauce Nov 28, 2020
- Boosting nonlinear penalized least squares Nov 21, 2020
- Statistical/Machine Learning explainability using Kernel Ridge Regression surrogates Nov 6, 2020
- NEWS Oct 30, 2020
- A glimpse into my PhD journey Oct 23, 2020
- Submitting R package to CRAN Oct 16, 2020
- Simulation of dependent variables in ESGtoolkit Oct 9, 2020
- Forecasting lung disease progression Oct 2, 2020
- New nnetsauce Sep 25, 2020
- Technical documentation Sep 18, 2020
- A new version of nnetsauce, and a new Techtonique website Sep 11, 2020
- Back next week, and a few announcements Sep 4, 2020
- Explainable 'AI' using Gradient Boosted randomized networks Pt2 (the Lasso) Jul 31, 2020
- LSBoost: Explainable 'AI' using Gradient Boosted randomized networks (with examples in R and Python) Jul 24, 2020
- nnetsauce version 0.5.0, randomized neural networks on GPU Jul 17, 2020
- Maximizing your tip as a waiter (Part 2) Jul 10, 2020
- New version of mlsauce, with Gradient Boosted randomized networks and stump decision trees Jul 3, 2020
- Announcements Jun 26, 2020
- Parallel AdaOpt classification Jun 19, 2020
- Comments section and other news Jun 12, 2020
- Maximizing your tip as a waiter Jun 5, 2020
- AdaOpt classification on MNIST handwritten digits (without preprocessing) May 29, 2020
- AdaOpt (a probabilistic classifier based on a mix of multivariable optimization and nearest neighbors) for R May 22, 2020
- AdaOpt May 15, 2020
- Custom errors for cross-validation using crossval::crossval_ml May 8, 2020
- Documentation+Pypi for the `teller`, a model-agnostic tool for Machine Learning explainability May 1, 2020
- Encoding your categorical variables based on the response variable and correlations Apr 24, 2020
- Linear model, xgboost and randomForest cross-validation using crossval::crossval_ml Apr 17, 2020
- Grid search cross-validation using crossval Apr 10, 2020
- Documentation for the querier, a query language for Data Frames Apr 3, 2020
- Time series cross-validation using crossval Mar 27, 2020
- On model specification, identification, degrees of freedom and regularization Mar 20, 2020
- Import data into the querier (now on Pypi), a query language for Data Frames Mar 13, 2020
- R notebooks for nnetsauce Mar 6, 2020
- Version 0.4.0 of nnetsauce, with fruits and breast cancer classification Feb 28, 2020
- Create a specific feed in your Jekyll blog Feb 21, 2020
- Git/Github for contributing to package development Feb 14, 2020
- Feedback forms for contributing Feb 7, 2020
- nnetsauce for R Jan 31, 2020
- A new version of nnetsauce (v0.3.1) Jan 24, 2020
- ESGtoolkit, a tool for Monte Carlo simulation (v0.2.0) Jan 17, 2020
- Search bar, new year 2020 Jan 10, 2020
- 2019 Recap, the nnetsauce, the teller and the querier Dec 20, 2019
- Understanding model interactions with the `teller` Dec 13, 2019
- Using the `teller` on a classifier Dec 6, 2019
- Benchmarking the querier's verbs Nov 29, 2019
- Composing the querier's verbs for data wrangling Nov 22, 2019
- Comparing and explaining model predictions with the teller Nov 15, 2019
- Tests for the significance of marginal effects in the teller Nov 8, 2019
- Introducing the teller Nov 1, 2019
- Introducing the querier Oct 25, 2019
- Prediction intervals for nnetsauce models Oct 18, 2019
- Using R in Python for statistical learning/data science Oct 11, 2019
- Model calibration with `crossval` Oct 4, 2019
- Bagging in the nnetsauce Sep 25, 2019
- Adaboost learning with nnetsauce Sep 18, 2019
- Change in blog's presentation Sep 4, 2019
- nnetsauce on Pypi Jun 5, 2019
- More nnetsauce (examples of use) May 9, 2019
- nnetsauce Mar 13, 2019
- crossval Mar 13, 2019
- test Mar 10, 2019