Today, give the Techtonique web app a try: a tool designed to help you make informed, data-driven decisions using Mathematics, Statistics, Machine Learning, and Data Visualization. Here is a tutorial with audio, video, code, and slides: https://moudiki2.gumroad.com/l/nrhgb. 100 API requests are now (and forever) offered to every user every month, regardless of the pricing tier.
During the past few weeks, I’ve been adapting a Python version of the (seemingly abandoned?) official Stanford GLMNet Python package. Don’t try to build a programming interface on top of it yet, as it’s still “moving”.
GLMNet implements the entire lasso or elastic-net regularization path for linear regression, logistic and multinomial regression models, Poisson regression, and the Cox model. My implementation is faithful to the R/Fortran-based one, but:
- uses numpy instead of scipy
- uses scikit-learn style, with a main class GLMNet having methods fit and predict
If (like me) you’re fond of GLMNet and of the scikit-learn style, you may love this package. Below, I illustrate the usage of this “new” package, along with its use within the Techtonique ecosystem (nnetsauce and mlsauce). Right after the install commands, a minimal preview of the API is shown, before the classification and regression examples.
!pip install nnetsauce
!pip install git+https://github.com/Techtonique/mlsauce.git --verbose --upgrade --no-cache-dir
!pip install git+https://github.com/thierrymoudiki/glmnetforpython.git --verbose --upgrade --no-cache-dir
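Before diving into the full examples, here is a minimal preview of the intended API on toy data. The toy data, seed, and comments are my own illustration (not part of the original examples); only the GLMNet()/fit/predict calls mirror what is used below.
import numpy as np
import glmnetforpython as glmnet

# toy sparse regression problem (illustration only)
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = glmnet.GLMNet()          # default family="gaussian", alpha=1.0 (lasso)
model.fit(X, y)
print(model.predict(X, s=0.1))   # predictions at regularization level lambda = 0.1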
1 - GLMNet
1 - 1 GLMNet Classification
import nnetsauce as ns
import mlsauce as ms
import numpy as np
import glmnetforpython as glmnet
from sklearn.datasets import load_breast_cancer, load_iris, load_wine
from sklearn.model_selection import train_test_split
from time import time

datasets = [load_iris, load_breast_cancer, load_wine]

for dataset in datasets:
    print(f"\n\n dataset: {dataset.__name__} -------------------")
    # load the data and split it into training and test sets
    X, y = dataset(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=123)
    # multinomial GLMNet classifier (default alpha=1.0, i.e. lasso penalty)
    clf = glmnet.GLMNet(family="multinomial")
    print(clf.get_params())
    start = time()
    clf.fit(X_train, y_train)
    print(f"elapsed: {time() - start}")
    #clf.print()
    #print(clf.score(X_test, y_test))
    # predicted classes on the test set
    preds = clf.predict(X_test, ptype="class")
    print(preds)
    print("accuracy: ", np.mean(preds == y_test))
dataset: load_iris -------------------
{'alpha': 1.0, 'dfmax': 10000000000.0, 'exclude': None, 'family': 'multinomial', 'lambdau': None, 'lower_lambdau': None, 'maxit': 100000.0, 'ncores': -1, 'nlambda': 100, 'parallel': False, 'penalty_factor': None, 'pmax': 10000000000.0, 'standardize': True, 'thresh': 1e-07, 'type_measure': 1, 'upper_lambdau': None, 'verbose': False, 'weights': None}
elapsed: 0.5259675979614258
[1. 2. 2. 1. 0. 2. 1. 0. 0. 1. 2. 0. 1. 2. 2. 2. 0. 0. 1. 0. 0. 1. 0. 2.
0. 0. 0. 2. 2. 0.]
accuracy: 0.9666666666666667
dataset: load_breast_cancer -------------------
{'alpha': 1.0, 'dfmax': 10000000000.0, 'exclude': None, 'family': 'multinomial', 'lambdau': None, 'lower_lambdau': None, 'maxit': 100000.0, 'ncores': -1, 'nlambda': 100, 'parallel': False, 'penalty_factor': None, 'pmax': 10000000000.0, 'standardize': True, 'thresh': 1e-07, 'type_measure': 1, 'upper_lambdau': None, 'verbose': False, 'weights': None}
elapsed: 1.3695988655090332
[1. 1. 0. 1. 0. 1. 1. 1. 1. 1. 1. 0. 0. 1. 0. 1. 1. 1. 1. 1. 0. 1. 1. 1.
1. 0. 0. 1. 0. 1. 0. 1. 1. 1. 0. 1. 1. 1. 1. 0. 0. 1. 0. 1. 0. 1. 0. 0.
1. 0. 0. 0. 1. 1. 1. 0. 1. 0. 0. 1. 0. 1. 1. 1. 1. 0. 1. 1. 1. 1. 1. 1.
1. 1. 0. 1. 1. 0. 0. 0. 1. 0. 0. 1. 1. 1. 0. 1. 0. 1. 0. 1. 1. 0. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 0.]
accuracy: 0.956140350877193
dataset: load_wine -------------------
{'alpha': 1.0, 'dfmax': 10000000000.0, 'exclude': None, 'family': 'multinomial', 'lambdau': None, 'lower_lambdau': None, 'maxit': 100000.0, 'ncores': -1, 'nlambda': 100, 'parallel': False, 'penalty_factor': None, 'pmax': 10000000000.0, 'standardize': True, 'thresh': 1e-07, 'type_measure': 1, 'upper_lambdau': None, 'verbose': False, 'weights': None}
elapsed: 0.1249077320098877
[2. 1. 2. 1. 1. 2. 0. 2. 2. 1. 2. 2. 2. 0. 0. 2. 1. 1. 0. 1. 1. 2. 2. 2.
1. 2. 2. 1. 0. 0. 0. 0. 2. 1. 2. 1.]
accuracy: 0.9722222222222222
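The classifiers above all use the default alpha=1.0 (a pure lasso penalty), as visible in get_params(). Since alpha and nlambda are exposed as parameters, an elastic-net variant should look like the sketch below (my own illustration, reusing the last train/test split from the loop above, so it is not reflected in the printed output):
# alpha=0.5: elastic-net mix of lasso (alpha=1) and ridge (alpha=0) penalties
clf_enet = glmnet.GLMNet(family="multinomial", alpha=0.5, nlambda=50)
clf_enet.fit(X_train, y_train)
preds_enet = clf_enet.predict(X_test, ptype="class")
print("accuracy: ", np.mean(preds_enet == y_test))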
1 - 2 GLMNet Regression
import numpy as np
import os
import sys
import glmnetforpython as glmnet
from sklearn.datasets import load_diabetes, fetch_california_housing
from sklearn.model_selection import train_test_split
from time import time

datasets = [load_diabetes, fetch_california_housing]

for dataset in datasets:
    print(f"\n\n dataset: {dataset.__name__} -------------------")
    X, y = dataset(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    # Gaussian GLMNet regressor (default family)
    regr = glmnet.GLMNet()
    print(regr.get_params())
    start = time()
    regr.fit(X_train, y_train)
    print(f"elapsed: {time() - start}")
    # display the regularization path (df, %dev, lambda)
    regr.print()
    # predictions at given values of the regularization parameter s (lambda)
    print(regr.predict(X_test, s=0.1))
    print(regr.predict(X_test, s=np.asarray([0.1, 0.5])))
    print(regr.predict(X_test, s=0.5))
    # cross-validation of the regularization parameter
    start = time()
    res_cvglmnet = regr.cvglmnet(X_train, y_train)
    print(f"elapsed: {time() - start}")
    print("\n best lambda: ", res_cvglmnet.lambda_min)
    print("\n best lambda std. dev: ", res_cvglmnet.lambda_1se)
    print("\n best coef: ", res_cvglmnet.best_coef)
    print("\n best GLMNet: ", res_cvglmnet.cvfit)
dataset: load_diabetes -------------------
{'alpha': 1.0, 'dfmax': 10000000000.0, 'exclude': None, 'family': 'gaussian', 'lambdau': None, 'lower_lambdau': None, 'maxit': 100000.0, 'ncores': -1, 'nlambda': 100, 'parallel': False, 'penalty_factor': None, 'pmax': 10000000000.0, 'standardize': True, 'thresh': 1e-07, 'type_measure': 1, 'upper_lambdau': None, 'verbose': False, 'weights': None}
elapsed: 0.003544330596923828
df %dev lambdau
0 0.000000 0.000000 44.034491
1 1.000000 0.056410 40.122588
2 2.000000 0.118800 36.558208
3 2.000000 0.173050 33.310478
4 2.000000 0.218089 30.351267
5 2.000000 0.255485 27.654944
6 2.000000 0.286528 25.198155
7 2.000000 0.312300 22.959620
8 2.000000 0.333697 20.919951
9 3.000000 0.354121 19.061480
10 4.000000 0.373003 17.368111
11 4.000000 0.390322 15.825176
12 4.000000 0.404704 14.419311
13 4.000000 0.416644 13.138339
14 4.000000 0.426556 11.971165
15 4.000000 0.434786 10.907680
16 4.000000 0.441619 9.938671
17 5.000000 0.447381 9.055747
18 5.000000 0.452319 8.251260
19 5.000000 0.456422 7.518240
20 5.000000 0.459828 6.850341
21 5.000000 0.462655 6.241775
22 5.000000 0.465003 5.687273
23 6.000000 0.468916 5.182032
24 6.000000 0.472756 4.721674
25 6.000000 0.475938 4.302214
26 6.000000 0.478579 3.920017
27 6.000000 0.480772 3.571773
28 7.000000 0.482661 3.254467
29 7.000000 0.485063 2.965349
30 7.000000 0.487080 2.701916
31 7.000000 0.488751 2.461885
32 7.000000 0.490137 2.243178
33 7.000000 0.491289 2.043900
34 7.000000 0.492244 1.862326
35 7.000000 0.493038 1.696882
36 7.000000 0.493697 1.546135
37 8.000000 0.494444 1.408781
38 8.000000 0.495256 1.283629
39 8.000000 0.495927 1.169595
40 8.000000 0.496489 1.065691
41 8.000000 0.496952 0.971018
42 8.000000 0.497335 0.884756
43 8.000000 0.497659 0.806156
44 8.000000 0.497924 0.734540
45 8.000000 0.498143 0.669285
46 8.000000 0.498329 0.609828
47 8.000000 0.498481 0.555652
48 8.000000 0.498610 0.506290
49 8.000000 0.498715 0.461312
50 8.000000 0.498805 0.420331
51 8.000000 0.498877 0.382990
52 8.000000 0.498939 0.348966
53 8.000000 0.498989 0.317965
54 8.000000 0.499032 0.289718
55 9.000000 0.499069 0.263980
56 9.000000 0.499392 0.240529
57 9.000000 0.499741 0.219161
58 9.000000 0.500032 0.199691
59 9.000000 0.500272 0.181951
60 9.000000 0.500476 0.165787
61 9.000000 0.500646 0.151059
62 9.000000 0.500787 0.137639
63 8.000000 0.500861 0.125412
64 9.000000 0.500891 0.114271
65 9.000000 0.500921 0.104119
66 9.000000 0.500946 0.094869
67 9.000000 0.500966 0.086441
68 10.000000 0.500985 0.078762
69 10.000000 0.501074 0.071765
70 10.000000 0.501148 0.065390
71 10.000000 0.501208 0.059581
72 10.000000 0.501261 0.054288
73 10.000000 0.501303 0.049465
74 10.000000 0.501340 0.045071
75 10.000000 0.501371 0.041067
76 10.000000 0.501396 0.037418
77 10.000000 0.501418 0.034094
78 10.000000 0.501436 0.031065
79 10.000000 0.501452 0.028306
80 10.000000 0.501466 0.025791
81 10.000000 0.501477 0.023500
82 10.000000 0.501486 0.021412
83 10.000000 0.501495 0.019510
84 10.000000 0.501501 0.017777
85 10.000000 0.501507 0.016198
86 10.000000 0.501512 0.014759
87 10.000000 0.501517 0.013447
[161.26225363 153.40808479 226.88078039 163.480388 158.15906743
138.70495293 252.60833458 107.20179977 107.04120812 111.4621737
123.02831339 182.46487521 161.8259466 202.19109973 222.70276584
172.29337663 108.23998068 144.9482381 176.11555866 191.67293859
163.44023323 231.8947646 140.21508949 75.13660039 129.39763652
188.26182192 100.80880331 101.63988186 157.52887579 185.93073996
85.10969035 238.43828572 208.13649047 209.71355938 198.52425274
95.48735993 93.58588193 98.38410955 225.11428814 101.19808037
193.69596077 81.44887372 102.8093431 146.00065311 110.88937281
215.06701174 79.87947637 77.58243533 101.06682798 217.30259906
70.16241913 116.23582088 177.21944649 195.88268542 138.92178841
198.65554716 219.68568399 169.97366232 192.47857773 189.04428441
138.71921407 121.43624221 233.40434688 202.68154217 190.88486154
42.03060013 62.01800127 159.28979811 126.65978845 86.64871155
136.58228326 76.93411617 141.41235614 199.19748035 120.79645249
173.18692022 146.96993898 139.31000819 99.86313284 83.63232759
61.45995805 159.5304213 120.28229729 225.93625573 286.05353932
165.66169186 197.95421215 70.40035793 139.89076625]
[[161.26225363 160.79263694]
[153.40808479 150.6281287 ]
[226.88078039 225.5710481 ]
[163.480388 161.80700641]
[158.15906743 157.71369432]
[138.70495293 144.58961694]
[252.60833458 250.39569639]
[107.20179977 110.67344587]
[107.04120812 111.21584102]
[111.4621737 107.93161795]
[123.02831339 122.34617434]
[182.46487521 180.55849115]
[161.8259466 161.4535835 ]
[202.19109973 200.49417412]
[222.70276584 229.60354304]
[172.29337663 170.57681745]
[108.23998068 109.09703513]
[144.9482381 143.71605666]
[176.11555866 177.00946867]
[191.67293859 194.23710327]
[163.44023323 161.7697504 ]
[231.8947646 229.71549579]
[140.21508949 140.591871 ]
[ 75.13660039 78.02802694]
[129.39763652 129.5053364 ]
[188.26182192 186.58248135]
[100.80880331 102.6960668 ]
[101.63988186 104.20365368]
[157.52887579 156.12372213]
[185.93073996 187.20901614]
[ 85.10969035 89.82145958]
[238.43828572 237.95082988]
[208.13649047 207.73770948]
[209.71355938 209.32169425]
[198.52425274 197.67298512]
[ 95.48735993 96.07154965]
[ 93.58588193 95.09805607]
[ 98.38410955 97.25266832]
[225.11428814 220.52646948]
[101.19808037 101.27641956]
[193.69596077 194.77086843]
[ 81.44887372 81.25151312]
[102.8093431 102.64887002]
[146.00065311 144.94838244]
[110.88937281 110.25258101]
[215.06701174 213.51721996]
[ 79.87947637 79.10616278]
[ 77.58243533 81.51256193]
[101.06682798 103.20741885]
[217.30259906 216.7643487 ]
[ 70.16241913 72.0598882 ]
[116.23582088 119.05445336]
[177.21944649 178.45613256]
[195.88268542 197.31526195]
[138.92178841 137.70888526]
[198.65554716 200.13140539]
[219.68568399 218.50018565]
[169.97366232 169.49700466]
[192.47857773 188.32727388]
[189.04428441 186.73052546]
[138.71921407 140.07357784]
[121.43624221 121.14922477]
[233.40434688 231.63901622]
[202.68154217 201.3077663 ]
[190.88486154 189.74608267]
[ 42.03060013 46.44945536]
[ 62.01800127 63.00668405]
[159.28979811 158.37093056]
[126.65978845 126.26280796]
[ 86.64871155 87.59938665]
[136.58228326 136.23598795]
[ 76.93411617 80.10973443]
[141.41235614 140.69343212]
[199.19748035 196.9680135 ]
[120.79645249 119.32968814]
[173.18692022 170.83211938]
[146.96993898 146.07744866]
[139.31000819 139.45758571]
[ 99.86313284 99.37633812]
[ 83.63232759 85.05298366]
[ 61.45995805 64.04582025]
[159.5304213 159.08368556]
[120.28229729 120.78108123]
[225.93625573 224.25244938]
[286.05353932 287.72165668]
[165.66169186 167.91861665]
[197.95421215 194.94689188]
[ 70.40035793 71.35611103]
[139.89076625 139.15500257]]
[160.79263694 150.6281287 225.5710481 161.80700641 157.71369432
144.58961694 250.39569639 110.67344587 111.21584102 107.93161795
122.34617434 180.55849115 161.4535835 200.49417412 229.60354304
170.57681745 109.09703513 143.71605666 177.00946867 194.23710327
161.7697504 229.71549579 140.591871 78.02802694 129.5053364
186.58248135 102.6960668 104.20365368 156.12372213 187.20901614
89.82145958 237.95082988 207.73770948 209.32169425 197.67298512
96.07154965 95.09805607 97.25266832 220.52646948 101.27641956
194.77086843 81.25151312 102.64887002 144.94838244 110.25258101
213.51721996 79.10616278 81.51256193 103.20741885 216.7643487
72.0598882 119.05445336 178.45613256 197.31526195 137.70888526
200.13140539 218.50018565 169.49700466 188.32727388 186.73052546
140.07357784 121.14922477 231.63901622 201.3077663 189.74608267
46.44945536 63.00668405 158.37093056 126.26280796 87.59938665
136.23598795 80.10973443 140.69343212 196.9680135 119.32968814
170.83211938 146.07744866 139.45758571 99.37633812 85.05298366
64.04582025 159.08368556 120.78108123 224.25244938 287.72165668
167.91861665 194.94689188 71.35611103 139.15500257]
elapsed: 0.021459341049194336
best lambda: 1.2836287759411216
best lambda std. dev: 7.518240463343744
best coef: [ 152.36008914 0. 0. 478.69081702 163.09825002
0. 0. -127.63723154 0. 383.45857834
14.02212484]
best GLMNet: {'lambdau': array([4.40344909e+01, 4.01225881e+01, 3.65582080e+01, 3.33104775e+01,
3.03512665e+01, 2.76549436e+01, 2.51981547e+01, 2.29596201e+01,
2.09199507e+01, 1.90614799e+01, 1.73681106e+01, 1.58251755e+01,
1.44193105e+01, 1.31383387e+01, 1.19711649e+01, 1.09076796e+01,
9.93867143e+00, 9.05574725e+00, 8.25125963e+00, 7.51824046e+00,
6.85034070e+00, 6.24177531e+00, 5.68727320e+00, 5.18203152e+00,
4.72167412e+00, 4.30221361e+00, 3.92001681e+00, 3.57177332e+00,
3.25446682e+00, 2.96534896e+00, 2.70191553e+00, 2.46188480e+00,
2.24317774e+00, 2.04390001e+00, 1.86232557e+00, 1.69688170e+00,
1.54613540e+00, 1.40878100e+00, 1.28362878e+00, 1.16959473e+00,
1.06569116e+00, 9.71018095e-01, 8.84755524e-01, 8.06156282e-01,
7.34539579e-01, 6.69285108e-01, 6.09827663e-01, 5.55652254e-01,
5.06289640e-01, 4.61312263e-01, 4.20330553e-01, 3.82989545e-01,
3.48965810e-01, 3.17964649e-01, 2.89717546e-01, 2.63979838e-01,
2.40528596e-01, 2.19160699e-01, 1.99691066e-01, 1.81951062e-01,
1.65787032e-01, 1.51058969e-01, 1.37639306e-01, 1.25411810e-01,
1.14270570e-01, 1.04119088e-01, 9.48694348e-02, 8.64414957e-02,
7.87622714e-02, 7.17652483e-02, 6.53898214e-02, 5.95807699e-02,
5.42877785e-02, 4.94650019e-02, 4.50706675e-02, 4.10667136e-02,
3.74184599e-02, 3.40943071e-02, 3.10654628e-02, 2.83056927e-02,
2.57910930e-02, 2.34998834e-02, 2.14122185e-02, 1.95100160e-02,
1.77768000e-02, 1.61975581e-02, 1.47586116e-02, 1.34474973e-02]), 'cvm': array([5849.67888044, 5588.13049574, 5237.68523549, 4913.35994927,
4643.97138541, 4420.18846237, 4234.28760402, 4079.94561166,
3953.4266667 , 3843.20670735, 3742.11421001, 3650.89110401,
3567.12685974, 3496.28344973, 3438.17453542, 3390.16054415,
3350.74498364, 3318.09382761, 3291.04560008, 3268.40981966,
3249.56159518, 3235.42351215, 3224.36750318, 3210.81451391,
3195.03055142, 3182.43023807, 3170.41713324, 3160.98874011,
3153.61834888, 3147.36615655, 3140.58675366, 3134.53307657,
3126.70250974, 3121.53432349, 3118.77235699, 3117.224526 ,
3116.5750121 , 3116.07320448, 3115.6035719 , 3115.63220558,
3116.16432467, 3116.61109199, 3116.30815949, 3116.14506076,
3116.23491199, 3116.51194066, 3117.07327289, 3117.66910598,
3118.28730793, 3118.94059329, 3119.58984965, 3120.29866814,
3122.45940284, 3124.65303008, 3126.52180218, 3127.45737901,
3128.23057335, 3128.3594194 , 3128.33377776, 3127.73013801,
3127.46559483, 3126.80867588, 3125.85726821, 3125.28303452,
3124.84520112, 3124.63312595, 3124.43410757, 3124.32847712,
3124.23076662, 3124.18911129, 3124.06737071, 3124.09365713,
3124.14154577, 3124.18800616, 3124.32158173, 3124.43107133,
3124.48930905, 3124.54190509, 3124.61269842, 3124.64999794,
3124.63934723, 3124.60634785, 3124.59003554, 3124.58808782,
3124.56949869, 3124.55185597, 3124.55864693, 3124.55978034]), 'cvsd': array([259.52792143, 248.75823371, 221.46767784, 196.31833834,
177.94632187, 165.15950248, 156.81416851, 151.85850454,
149.5689093 , 150.20248582, 149.67202475, 146.56506657,
143.00853209, 140.60041047, 138.55782035, 136.8510055 ,
135.41484385, 134.23857211, 133.26532408, 132.2940369 ,
131.49659758, 130.83158262, 130.54053885, 130.02800433,
129.85450444, 130.97322891, 133.21473862, 135.58414859,
138.00196431, 140.50827777, 143.37107165, 145.89654308,
148.34884887, 150.24805302, 152.2145693 , 154.10961339,
155.93288424, 157.76716921, 159.63011092, 161.33532503,
162.88557649, 164.32062807, 165.470281 , 166.54562968,
167.54645567, 168.49635186, 169.37581278, 170.17848841,
170.94950123, 171.65270522, 172.31624765, 172.88013735,
172.80109307, 172.76638875, 172.75274784, 173.02037182,
173.2886591 , 173.4866617 , 173.69450463, 173.79827944,
173.43628185, 172.75611616, 172.21856656, 171.70858949,
171.08574087, 170.55631258, 170.09854581, 169.68475556,
169.32167531, 168.99748594, 168.74915863, 168.50227099,
168.28568613, 168.08332938, 167.91550145, 167.77407306,
167.65195892, 167.54167529, 167.44368359, 167.34059865,
167.23770408, 167.14209302, 167.05532325, 166.97480747,
166.90199522, 166.83948792, 166.77940688, 166.7253628 ]), 'cvup': array([6109.20680187, 5836.88872945, 5459.15291333, 5109.67828761,
4821.91770728, 4585.34796485, 4391.10177253, 4231.8041162 ,
4102.995576 , 3993.40919317, 3891.78623476, 3797.45617058,
3710.13539183, 3636.8838602 , 3576.73235577, 3527.01154965,
3486.15982749, 3452.33239972, 3424.31092416, 3400.70385656,
3381.05819276, 3366.25509477, 3354.90804202, 3340.84251824,
3324.88505586, 3313.40346698, 3303.63187186, 3296.5728887 ,
3291.62031319, 3287.87443432, 3283.95782531, 3280.42961965,
3275.05135861, 3271.7823765 , 3270.9869263 , 3271.3341394 ,
3272.50789634, 3273.84037369, 3275.23368282, 3276.96753061,
3279.04990117, 3280.93172006, 3281.77844048, 3282.69069045,
3283.78136765, 3285.00829252, 3286.44908566, 3287.84759439,
3289.23680916, 3290.59329851, 3291.9060973 , 3293.17880549,
3295.26049591, 3297.41941883, 3299.27455002, 3300.47775083,
3301.51923245, 3301.8460811 , 3302.02828239, 3301.52841745,
3300.90187668, 3299.56479204, 3298.07583477, 3296.99162401,
3295.93094198, 3295.18943853, 3294.53265338, 3294.01323268,
3293.55244193, 3293.18659723, 3292.81652934, 3292.59592812,
3292.4272319 , 3292.27133554, 3292.23708318, 3292.2051444 ,
3292.14126797, 3292.08358038, 3292.05638201, 3291.99059659,
3291.87705131, 3291.74844088, 3291.64535879, 3291.56289529,
3291.47149391, 3291.39134388, 3291.33805381, 3291.28514314]), 'cvlo': array([5590.15095901, 5339.37226204, 5016.21755765, 4717.04161093,
4466.02506353, 4255.0289599 , 4077.47343551, 3928.08710712,
3803.85775741, 3693.00422154, 3592.44218526, 3504.32603744,
3424.11832765, 3355.68303927, 3299.61671507, 3253.30953865,
3215.33013978, 3183.85525549, 3157.78027601, 3136.11578276,
3118.0649976 , 3104.59192953, 3093.82696433, 3080.78650958,
3065.17604697, 3051.45700916, 3037.20239462, 3025.40459152,
3015.61638456, 3006.85787878, 2997.21568202, 2988.63653348,
2978.35366087, 2971.28627047, 2966.55778769, 2963.11491261,
2960.64212786, 2958.30603527, 2955.97346098, 2954.29688055,
2953.27874818, 2952.29046392, 2950.83787849, 2949.59943108,
2948.68845632, 2948.0155888 , 2947.69746011, 2947.49061756,
2947.3378067 , 2947.28788807, 2947.27360201, 2947.41853078,
2949.65830978, 2951.88664133, 2953.76905433, 2954.43700719,
2954.94191425, 2954.87275769, 2954.63927313, 2953.93185857,
2954.02931298, 2954.05255971, 2953.63870164, 2953.57444503,
2953.75946025, 2954.07681336, 2954.33556176, 2954.64372156,
2954.90909131, 2955.19162535, 2955.31821208, 2955.59138614,
2955.85585964, 2956.10467679, 2956.40608028, 2956.65699827,
2956.83735013, 2957.0002298 , 2957.16901484, 2957.30939929,
2957.40164314, 2957.46425483, 2957.5347123 , 2957.61328035,
2957.66750348, 2957.71236805, 2957.77924005, 2957.83441753]), 'nzero': array([ 0, 1, 2, 2, 2, 2, 2, 2, 2, 3, 4, 4, 4, 4, 4, 4, 4,
5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7,
7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 8, 9, 9, 9, 9,
10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10,
10, 10, 10]), 'name': 'Mean-Squared Error', 'glmnet_fit': {'a0': array([152.42776204, 152.42566409, 152.43276797, 152.44073179,
152.44798822, 152.45459811, 152.46062269, 152.46611206,
152.47111377, 152.47445162, 152.47048801, 152.45602272,
152.44285433, 152.43085578, 152.41992316, 152.40996176,
152.4008853 , 152.38972315, 152.37419946, 152.36008914,
152.3472322 , 152.33551743, 152.32484337, 152.31445325,
152.30457294, 152.29565898, 152.2875366 , 152.28013579,
152.27233919, 152.25527491, 152.23909998, 152.22445076,
152.2111053 , 152.19894546, 152.18786587, 152.17777057,
152.1685721 , 152.15730549, 152.14410824, 152.13217008,
152.12118064, 152.1112707 , 152.102268 , 152.09394524,
152.08646062, 152.07967323, 152.07337 , 152.06772299,
152.06248036, 152.05779897, 152.05344349, 152.04956603,
152.04594958, 152.04274165, 152.03974152, 152.03723678,
152.03770233, 152.03883229, 152.03986205, 152.04079072,
152.041649 , 152.04242696, 152.04313295, 152.04338785,
152.04364361, 152.04392876, 152.04419377, 152.0444226 ,
152.0447089 , 152.04530852, 152.04554794, 152.0457575 ,
152.04595716, 152.04613018, 152.04629475, 152.04644218,
152.04657382, 152.0466978 , 152.0468066 , 152.0469082 ,
152.04700243, 152.04708911, 152.04716156, 152.04723421,
152.04729311, 152.04735235, 152.04740486, 152.04745049]), 'beta': array([[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, -2.57050755e-01, -6.60326286e-01,
-9.78489254e-01, -1.26685258e+00, -1.53621534e+00,
-1.97998984e+00, -2.33787474e+00, -2.66018889e+00,
-2.95822820e+00, -3.22526923e+00, -3.47211745e+00,
-3.69568054e+00, -3.89802418e+00, -4.08448852e+00,
-4.25217821e+00, -4.40633136e+00, -4.54760761e+00,
-4.67674008e+00, -4.79098332e+00, -4.89877047e+00,
-4.99301452e+00, -5.08205906e+00, -5.16234069e+00,
-5.23446789e+00],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, -1.49446624e+01,
-3.33086438e+01, -4.99701883e+01, -6.51519098e+01,
-7.89849273e+01, -9.14807107e+01, -1.02558316e+02,
-1.12553874e+02, -1.21673779e+02, -1.29983486e+02,
-1.37554982e+02, -1.44453847e+02, -1.50739836e+02,
-1.56467395e+02, -1.62068215e+02, -1.67199275e+02,
-1.71883535e+02, -1.76140004e+02, -1.80029098e+02,
-1.83575513e+02, -1.86794371e+02, -1.89737633e+02,
-1.92422931e+02, -1.94857087e+02, -1.97085482e+02,
-1.99105321e+02, -2.00956533e+02, -2.02633054e+02,
-2.04171528e+02, -2.05563237e+02, -2.06842288e+02,
-2.07997804e+02, -2.09060396e+02, -2.10339592e+02,
-2.11701852e+02, -2.12943164e+02, -2.14069597e+02,
-2.15101772e+02, -2.16040314e+02, -2.16894118e+02,
-2.17720672e+02, -2.18460918e+02, -2.19212695e+02,
-2.19879074e+02, -2.20478047e+02, -2.21040126e+02,
-2.21464981e+02, -2.21782875e+02, -2.22068384e+02,
-2.22333206e+02, -2.22569663e+02, -2.22788891e+02,
-2.22987204e+02, -2.23166444e+02, -2.23331996e+02,
-2.23480492e+02, -2.23617233e+02, -2.23742706e+02,
-2.23857470e+02, -2.23958401e+02, -2.24054259e+02,
-2.24137436e+02, -2.24216526e+02, -2.24287751e+02,
-2.24351589e+02],
[ 0.00000000e+00, 8.12690155e+01, 1.36371438e+02,
1.83441721e+02, 2.26329677e+02, 2.65424715e+02,
3.01029532e+02, 3.33471311e+02, 3.63031051e+02,
3.85363952e+02, 4.02298321e+02, 4.14560699e+02,
4.25779668e+02, 4.36001929e+02, 4.45316073e+02,
4.53802774e+02, 4.61535540e+02, 4.68218469e+02,
4.73667496e+02, 4.78690817e+02, 4.83267733e+02,
4.87438049e+02, 4.91237886e+02, 4.92787323e+02,
4.93507456e+02, 4.94238917e+02, 4.94904948e+02,
4.95511813e+02, 4.96218138e+02, 4.97513685e+02,
4.98767866e+02, 4.99952852e+02, 5.01034299e+02,
5.02019709e+02, 5.02917578e+02, 5.03735683e+02,
5.04481110e+02, 5.05182886e+02, 5.05982344e+02,
5.06697422e+02, 5.07366395e+02, 5.07959876e+02,
5.08496411e+02, 5.09003963e+02, 5.09450892e+02,
5.09852635e+02, 5.10237791e+02, 5.10572273e+02,
5.10893667e+02, 5.11168610e+02, 5.11436170e+02,
5.11660454e+02, 5.11883126e+02, 5.12064493e+02,
5.12249641e+02, 5.12397142e+02, 5.12110707e+02,
5.11537926e+02, 5.11015923e+02, 5.10547161e+02,
5.10111373e+02, 5.09717193e+02, 5.09360059e+02,
5.09062511e+02, 5.08773585e+02, 5.08352963e+02,
5.08021209e+02, 5.07729466e+02, 5.07439254e+02,
5.07147323e+02, 5.07002339e+02, 5.06877126e+02,
5.06755269e+02, 5.06652272e+02, 5.06552156e+02,
5.06463320e+02, 5.06384795e+02, 5.06309534e+02,
5.06244857e+02, 5.06183541e+02, 5.06126213e+02,
5.06073252e+02, 5.06031044e+02, 5.05986114e+02,
5.05952149e+02, 5.05915669e+02, 5.05883889e+02,
5.05856727e+02],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
1.70132045e+01, 3.85628559e+01, 5.85831519e+01,
7.68171853e+01, 9.34313567e+01, 1.08569571e+02,
1.22362948e+02, 1.34930959e+02, 1.45883461e+02,
1.54892818e+02, 1.63098250e+02, 1.70574758e+02,
1.77387073e+02, 1.83594201e+02, 1.91698488e+02,
1.99877350e+02, 2.07309626e+02, 2.14081739e+02,
2.20252235e+02, 2.25836131e+02, 2.30854142e+02,
2.35350216e+02, 2.39480301e+02, 2.43244261e+02,
2.46673858e+02, 2.49798780e+02, 2.52646093e+02,
2.55240458e+02, 2.58060407e+02, 2.61002208e+02,
2.63672892e+02, 2.66118970e+02, 2.68336072e+02,
2.70353151e+02, 2.72204610e+02, 2.73880339e+02,
2.75403355e+02, 2.76804780e+02, 2.78070207e+02,
2.79234835e+02, 2.80284018e+02, 2.81251353e+02,
2.82120520e+02, 2.82923836e+02, 2.83643348e+02,
2.84310210e+02, 2.84907333e+02, 2.85740812e+02,
2.86495875e+02, 2.87183902e+02, 2.87806755e+02,
2.88379402e+02, 2.88899485e+02, 2.89372184e+02,
2.89761820e+02, 2.90051027e+02, 2.90329831e+02,
2.90583199e+02, 2.90814356e+02, 2.91023079e+02,
2.91367320e+02, 2.91724450e+02, 2.92045544e+02,
2.92343586e+02, 2.92609412e+02, 2.92856115e+02,
2.93079121e+02, 2.93280599e+02, 2.93466879e+02,
2.93633725e+02, 2.93787566e+02, 2.93928760e+02,
2.94057918e+02, 2.94171205e+02, 2.94279355e+02,
2.94372739e+02, 2.94462054e+02, 2.94542347e+02,
2.94614283e+02],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, -1.36641339e+01,
-2.56949353e+01, -3.65339408e+01, -4.64069841e+01,
-5.54028686e+01, -6.35995821e+01, -7.10681220e+01,
-7.78731775e+01, -8.94539161e+01, -1.06025634e+02,
-1.20932997e+02, -1.34763877e+02, -1.47137268e+02,
-1.58351801e+02, -1.68835741e+02, -1.78169083e+02,
-1.86601030e+02, -1.94548188e+02, -2.01574005e+02,
-2.08193300e+02, -2.14009275e+02, -2.19511594e+02,
-2.24317764e+02, -2.28887356e+02, -2.32851213e+02,
-2.36640315e+02, -2.40139689e+02, -2.79565543e+02,
-3.25766382e+02, -3.67869415e+02, -4.05713131e+02,
-4.40850849e+02, -4.72649495e+02, -5.01471039e+02,
-5.17176076e+02, -5.23163263e+02, -5.29752212e+02,
-5.35582429e+02, -5.40762533e+02, -5.45810791e+02,
-5.68995791e+02, -5.89704343e+02, -6.08087375e+02,
-6.25426021e+02, -6.40610244e+02, -6.54925655e+02,
-6.67781833e+02, -6.79311821e+02, -6.90104449e+02,
-6.99633636e+02, -7.08506133e+02, -7.16700489e+02,
-7.24221664e+02, -7.30607557e+02, -7.36940752e+02,
-7.42167947e+02, -7.47370783e+02, -7.51998243e+02,
-7.56081896e+02],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, -9.68976845e-01, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 3.68720080e-01, 3.09620058e+01,
6.78004170e+01, 1.01371378e+02, 1.31560071e+02,
1.59572246e+02, 1.84928184e+02, 2.07914199e+02,
2.20954218e+02, 2.27129841e+02, 2.33995226e+02,
2.40051777e+02, 2.45431838e+02, 2.50673347e+02,
2.69236079e+02, 2.85159056e+02, 2.99290150e+02,
3.12619423e+02, 3.24291879e+02, 3.35296858e+02,
3.45180286e+02, 3.54043950e+02, 3.62340771e+02,
3.69666591e+02, 3.76487111e+02, 3.82786560e+02,
3.88568616e+02, 3.93477958e+02, 3.98345510e+02,
4.02363641e+02, 4.06361744e+02, 4.09917999e+02,
4.13055709e+02],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, -7.42047513e+00, -2.64227558e+01,
-4.37192376e+01, -5.94791524e+01, -7.38389999e+01,
-8.69231581e+01, -9.88449557e+01, -1.09571014e+02,
-1.19034491e+02, -1.27637232e+02, -1.35475817e+02,
-1.42618044e+02, -1.49125776e+02, -1.61151255e+02,
-1.74158633e+02, -1.85931064e+02, -1.96657994e+02,
-2.06431972e+02, -2.15283466e+02, -2.18436551e+02,
-2.21382759e+02, -2.24160360e+02, -2.26693300e+02,
-2.29001265e+02, -2.31104198e+02, -2.33020313e+02,
-2.34766205e+02, -2.29197167e+02, -2.16487393e+02,
-2.05147275e+02, -1.94504046e+02, -1.85092964e+02,
-1.76592652e+02, -1.68514654e+02, -1.61428957e+02,
-1.55063466e+02, -1.48932141e+02, -1.43616020e+02,
-1.38498773e+02, -1.34107644e+02, -1.29850450e+02,
-1.26234871e+02, -1.22698301e+02, -1.19732541e+02,
-1.16801703e+02, -1.14415778e+02, -9.70975143e+01,
-7.79478962e+01, -6.04966620e+01, -4.48311763e+01,
-3.02597389e+01, -1.70813532e+01, -5.14262258e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 9.51704287e-02,
1.00979264e+01, 1.98453510e+01, 2.85092746e+01,
3.66716835e+01, 4.38287086e+01, 5.05691226e+01,
5.66246632e+01, 6.20583085e+01, 6.71405249e+01,
7.16315311e+01, 7.58109830e+01, 7.96692228e+01,
8.32096263e+01, 8.62217260e+01, 8.92032380e+01,
9.16703718e+01, 9.41212923e+01, 9.63022628e+01,
9.82294421e+01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 1.07249423e+01, 2.98708205e+01,
4.70612024e+01, 6.30535776e+01, 7.73214769e+01,
9.02427037e+01, 1.02368722e+02, 1.13126474e+02,
1.22832215e+02, 1.32026948e+02, 1.40117597e+02,
1.47779805e+02, 1.54471866e+02, 1.60842617e+02,
1.66364359e+02, 1.71656130e+02, 1.76198467e+02,
1.80586841e+02, 1.84136116e+02, 1.90324243e+02,
1.95603639e+02, 2.00414755e+02, 2.04723395e+02,
2.08744265e+02, 2.12376626e+02, 2.15664449e+02,
2.16223112e+02, 2.14236250e+02, 2.11891816e+02,
2.09849933e+02, 2.08046446e+02, 2.06334688e+02,
2.09048000e+02, 2.12635649e+02, 2.15838921e+02,
2.18843627e+02, 2.21490940e+02, 2.23973928e+02,
2.26207991e+02, 2.28216571e+02, 2.30089459e+02,
2.31750108e+02, 2.33292294e+02, 2.34713473e+02,
2.36016351e+02, 2.37133821e+02, 2.38230875e+02,
2.39148329e+02, 2.40052084e+02, 2.40858196e+02,
2.41574016e+02],
[ 0.00000000e+00, 0.00000000e+00, 4.39167099e+01,
9.11948248e+01, 1.34273201e+02, 1.73517156e+02,
2.09282242e+02, 2.41870058e+02, 2.71562863e+02,
2.94523024e+02, 3.11724868e+02, 3.23628840e+02,
3.34465299e+02, 3.44339094e+02, 3.53335729e+02,
3.61533128e+02, 3.69002292e+02, 3.75138158e+02,
3.79492227e+02, 3.83458578e+02, 3.87072520e+02,
3.90365409e+02, 3.93365767e+02, 3.95111217e+02,
3.96238866e+02, 3.97315160e+02, 3.98295771e+02,
3.99189268e+02, 4.00198190e+02, 4.08663636e+02,
4.16302632e+02, 4.23144408e+02, 4.29375211e+02,
4.35052422e+02, 4.40225282e+02, 4.44938601e+02,
4.49233201e+02, 4.51860678e+02, 4.54350055e+02,
4.56591666e+02, 4.58667887e+02, 4.60528445e+02,
4.62215542e+02, 4.63789062e+02, 4.65192798e+02,
4.66462188e+02, 4.67654746e+02, 4.68712723e+02,
4.69705653e+02, 4.70583274e+02, 4.71408346e+02,
4.72137042e+02, 4.72821750e+02, 4.73428395e+02,
4.73995322e+02, 4.74609787e+02, 4.88709192e+02,
5.05740213e+02, 5.21260699e+02, 5.35211675e+02,
5.48164290e+02, 5.59886261e+02, 5.70510921e+02,
5.76543528e+02, 5.79304110e+02, 5.82431273e+02,
5.85176234e+02, 5.87610894e+02, 5.89984755e+02,
5.98595185e+02, 6.06028752e+02, 6.12621941e+02,
6.18846132e+02, 6.24291474e+02, 6.29429657e+02,
6.34042562e+02, 6.38177930e+02, 6.42051372e+02,
6.45468850e+02, 6.48652331e+02, 6.51593539e+02,
6.54293636e+02, 6.56582200e+02, 6.58856110e+02,
6.60728503e+02, 6.62595845e+02, 6.64255708e+02,
6.65719010e+02],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 2.22420700e+00,
8.40506170e+00, 1.40221248e+01, 1.91402273e+01,
2.38036515e+01, 2.80527900e+01, 3.35745064e+01,
3.91912042e+01, 4.42732349e+01, 4.89039329e+01,
5.31232518e+01, 5.71159051e+01, 6.22997267e+01,
6.71436153e+01, 7.15185283e+01, 7.55036834e+01,
7.91347844e+01, 8.24433079e+01, 8.54579112e+01,
8.82047055e+01, 9.04601432e+01, 9.24531001e+01,
9.42683183e+01, 9.59231132e+01, 9.74301242e+01,
9.88030833e+01, 1.00054948e+02, 1.01194963e+02,
1.02233707e+02, 1.03180712e+02, 1.04043665e+02,
1.04829883e+02, 1.05547098e+02, 1.06199712e+02,
1.06796261e+02, 1.07337852e+02, 1.07834246e+02,
1.08283591e+02, 1.08676167e+02, 1.08790144e+02,
1.08924256e+02, 1.09046456e+02, 1.09158016e+02,
1.09259390e+02, 1.09351846e+02, 1.09436149e+02,
1.09513219e+02, 1.09652233e+02, 1.09844120e+02,
1.09991972e+02, 1.10125028e+02, 1.10252188e+02,
1.10368743e+02, 1.10441912e+02, 1.10508143e+02,
1.10568876e+02, 1.10623828e+02, 1.10674197e+02,
1.10719990e+02, 1.10761597e+02, 1.10799677e+02,
1.10834205e+02, 1.10865760e+02, 1.10894585e+02,
1.10920886e+02, 1.10944569e+02, 1.10966411e+02,
1.10985993e+02, 1.11004066e+02, 1.11020463e+02,
1.11035326e+02]]), 'dev': array([0. , 0.05641001, 0.11880048, 0.17305008, 0.21808882,
0.2554852 , 0.28652791, 0.31230011, 0.33369665, 0.35412118,
0.37300264, 0.39032216, 0.40470384, 0.41664375, 0.42655648,
0.4347862 , 0.44161866, 0.44738094, 0.45231927, 0.4564217 ,
0.45982761, 0.46265525, 0.46500281, 0.46891586, 0.472756 ,
0.47593769, 0.47857921, 0.48077224, 0.48266112, 0.4850628 ,
0.48708015, 0.48875059, 0.49013731, 0.49128859, 0.4922444 ,
0.49303793, 0.49369674, 0.49444377, 0.49525646, 0.49592659,
0.49648931, 0.49695178, 0.4973351 , 0.49765856, 0.49792371,
0.49814318, 0.49832939, 0.49848145, 0.49861048, 0.49871548,
0.49880484, 0.49887732, 0.49893924, 0.49898927, 0.49903219,
0.49906914, 0.49939168, 0.49974097, 0.50003231, 0.50027232,
0.50047642, 0.50064571, 0.5007865 , 0.50086059, 0.50089074,
0.50092133, 0.50094613, 0.50096647, 0.50098483, 0.50107352,
0.50114759, 0.50120815, 0.50126073, 0.50130323, 0.50134019,
0.50137086, 0.50139632, 0.5014184 , 0.50143649, 0.50145215,
0.50146558, 0.50147706, 0.50148616, 0.5014946 , 0.50150113,
0.50150723, 0.50151232, 0.50151656]), 'nulldev': array([3.08783547e+02, 5.68934729e+03, 1.06973020e+04, 7.46975805e+03,
7.10271730e+01, 1.55454842e+03, 6.49188553e+03, 9.16277396e+01,
9.87704227e+02, 4.15093652e+03, 2.23718544e+04, 5.96715558e+02,
1.00857354e+04, 6.53939354e+02, 3.89722547e+03, 1.33916691e+02,
3.03726890e+02, 2.44310366e+03, 2.26311782e+03, 6.53939354e+02,
1.17879309e+04, 7.81946910e+03, 8.72874672e+03, 5.39163624e+03,
1.11150130e+04, 2.06368156e+03, 1.28658572e+04, 2.03850444e+00,
2.34524814e+03, 4.16957392e+03, 6.33174105e+03, 9.49216882e+03,
8.56961924e+03, 4.59148986e+02, 7.81946910e+03, 4.84029629e+03,
9.32619714e+03, 5.96715558e+02, 1.97605541e+04, 2.08160317e+02,
1.45376646e+04, 3.10498359e+01, 2.42494595e+02, 2.44310366e+03,
1.54449269e+02, 4.13161248e+01, 1.55182425e+04, 1.32698185e+03,
6.49188553e+03, 3.41380338e+03, 2.03268430e+04, 1.48781754e+03,
7.84504134e+03, 2.47193220e+00, 1.30593745e+02, 9.91463057e+03,
2.07682887e+03, 1.83798317e+04, 2.64481471e+03, 7.06083830e+02,
5.48860034e+02, 4.65361451e+02, 3.77336995e+03, 4.68235862e+03,
6.03794878e+02, 4.43186287e+03, 7.29790253e+03, 1.55454842e+03,
6.79433595e+03, 9.88587986e+03, 5.89402852e+00, 1.26400017e+04,
1.11454974e+04, 2.42494595e+02, 1.70491093e+04, 9.91463057e+03,
1.19744351e+04, 6.15091386e+03, 5.53949176e+03, 3.20041811e+03,
1.38232311e+04, 3.89722547e+03, 6.30876938e+03, 2.48290102e+04,
2.54295918e+03, 1.40083737e+03, 4.54650309e+03, 2.34524814e+03,
7.12804700e+03, 2.64481471e+03, 1.80304793e+02, 7.99732462e+03,
6.46862491e+03, 8.91660224e+03, 4.59148986e+02, 6.63048043e+03,
2.96238128e+03, 4.15093652e+03, 1.55182425e+04, 2.34524814e+03,
2.12350119e+02, 1.97382604e+03, 1.71625947e+03, 3.79114049e+03,
3.03726890e+02, 1.89853992e+03, 1.33752859e+03, 5.68934729e+03,
5.26672972e+03, 8.54289120e+03, 7.29790253e+03, 1.94804096e+04,
8.88826971e+01, 1.65308204e+04, 1.62378345e+04, 2.26311782e+03,
3.77336995e+03, 1.22262198e+04, 3.29794785e+03, 7.10271730e+01,
6.33174105e+03, 1.88597052e+03, 1.02875909e+04, 7.81946910e+03,
1.70491093e+04, 1.13576419e+04, 3.79114049e+03, 8.54289120e+03,
1.71625947e+03, 1.11454974e+04, 7.12804700e+03, 4.65361451e+02,
1.18527080e+03, 1.55454842e+03, 2.35844323e+04, 6.79433595e+03,
5.53949176e+03, 1.48781754e+03, 4.56600734e+03, 9.34661734e+02,
8.08137655e+02, 1.17565796e+04, 7.99732462e+03, 1.47669290e+03,
5.86330763e+03, 5.68934729e+03, 2.42494595e+02, 1.20060753e+04,
3.77336995e+03, 4.54650309e+03, 5.48860034e+02, 3.41380338e+03,
5.51716489e+01, 8.02318581e+03, 1.13576419e+04, 1.38232311e+04,
1.30593745e+02, 1.42975201e+04, 1.08738221e+02, 1.52700980e+04,
7.64361358e+03, 6.81817448e+03, 1.06095069e+03, 8.56961924e+03,
6.63048043e+03, 6.46862491e+03, 2.26311782e+03, 5.73387877e+01,
2.64481471e+03, 4.59148986e+02, 4.82021414e+03, 2.45740678e+03,
9.87704227e+02, 6.53939354e+02, 4.16957392e+03, 4.41264757e+03,
1.62746759e+04, 6.01745210e+03, 8.75576372e+03, 5.96715558e+02,
4.54650309e+03, 1.21942906e+04, 4.15093652e+03, 2.34524814e+03,
8.88826971e+01, 7.60228306e+02, 8.16372782e+02, 1.09051575e+04,
6.63048043e+03, 9.25848703e+02, 1.00857354e+04, 1.82980363e-01,
2.03850444e+00, 1.65308204e+04, 9.88587986e+03, 4.15093652e+03,
1.47798090e+04, 6.61640812e+00, 1.60205314e+04, 5.48860034e+02,
1.22262198e+04, 1.09353530e+04, 4.56600734e+03, 1.09051575e+04,
3.53165890e+03, 5.96715558e+02, 1.08738221e+02, 1.82980363e-01,
1.02875909e+04, 7.99732462e+03, 8.20333029e+03, 4.28079205e+03,
1.41167307e+03, 6.46862491e+03, 6.63048043e+03, 6.96019148e+03,
2.74867023e+03, 2.69871366e+02, 1.80304793e+02, 1.11772215e+02,
1.89853992e+03, 1.80011499e+03, 2.26311782e+03, 3.08783547e+02,
2.26719988e+04, 2.26311782e+03, 2.70840215e+04, 1.21942906e+04,
1.65308204e+04, 3.83072499e+02, 1.50239535e+04, 2.48290102e+04,
8.35903567e+03, 3.74702113e+04, 7.64361358e+03, 4.02308100e+03,
3.07223680e+03, 4.56600734e+03, 1.41167307e+03, 9.49216882e+03,
5.10192519e+03, 4.82021414e+03, 1.84205643e+02, 6.98431896e+03,
1.32698185e+03, 1.50239535e+04, 1.50239535e+04, 8.38547477e+03,
9.49216882e+03, 4.54650309e+03, 1.18527080e+03, 1.08738221e+02,
1.18527080e+03, 3.39582414e+02, 6.15091386e+03, 2.42027212e+04,
9.91463057e+03, 1.11150130e+04, 1.04914464e+04, 8.02318581e+03,
2.07682887e+03, 6.98426606e+02, 1.65308204e+04, 3.07223680e+03,
2.86998468e+03, 6.53939354e+02, 1.00857354e+04, 1.01147750e+04,
3.89722547e+03, 5.26672972e+03, 5.99505833e+03, 3.07223680e+03,
1.24483643e+04, 4.59148986e+02, 7.10271730e+01, 2.45740678e+03,
1.06095069e+03, 4.70215182e+03, 2.38015842e+02, 3.65151443e+03,
1.33569422e+04, 2.51451547e+04, 1.19523964e+03, 3.27456283e-01,
2.45740678e+03, 1.09353530e+04, 1.98668440e+03, 1.98668440e+03,
9.10645777e+03, 1.09051575e+04, 5.09505927e+02, 3.18409233e+03,
4.68235862e+03, 6.46571082e+02, 6.53939354e+02, 2.85452576e+03,
2.54295918e+03, 2.94606008e+01, 2.16897335e+03, 6.63048043e+03,
1.56596202e+03, 1.06973020e+04, 2.35926230e+03, 1.33569422e+04,
3.08783547e+02, 3.22461886e+04, 8.20333029e+03, 7.06083830e+02,
1.06973020e+04, 1.88597052e+03, 3.89722547e+03, 2.06368156e+03,
1.96050767e+01, 1.00857354e+04, 4.82021414e+03, 8.74517258e+02,
1.42975201e+04, 7.15246343e+03, 1.05155975e+03, 7.64361358e+03,
4.17293462e+02, 3.89722547e+03, 4.04142944e+03, 1.30593745e+02,
7.60228306e+02, 8.08137655e+02, 5.73387877e+01, 6.61640812e+00,
3.41380338e+03, 5.89402852e+00, 6.46862491e+03, 3.29794785e+03,
8.16372782e+02, 8.72874672e+03, 5.68934729e+03, 3.55594889e+04,
1.71625947e+03, 1.08738221e+02, 2.85452576e+03, 2.09011320e+04,
5.89402852e+00, 3.08827363e+03, 7.64361358e+03, 1.33235682e+04,
4.68235862e+03, 3.39582414e+02, 3.31456258e+03, 7.10271730e+01,
4.56600734e+03, 3.65151443e+03, 8.02318581e+03, 4.17293462e+02,
2.86998468e+03]), 'df': array([ 0, 1, 2, 2, 2, 2, 2, 2, 2, 3, 4, 4, 4, 4, 4, 4, 4,
5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7,
7, 7, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8,
8, 8, 8, 8, 9, 9, 9, 9, 9, 9, 9, 9, 8, 9, 9, 9, 9,
10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10,
10, 10, 10]), 'lambdau': array([4.40344909e+01, 4.01225881e+01, 3.65582080e+01, 3.33104775e+01,
3.03512665e+01, 2.76549436e+01, 2.51981547e+01, 2.29596201e+01,
2.09199507e+01, 1.90614799e+01, 1.73681106e+01, 1.58251755e+01,
1.44193105e+01, 1.31383387e+01, 1.19711649e+01, 1.09076796e+01,
9.93867143e+00, 9.05574725e+00, 8.25125963e+00, 7.51824046e+00,
6.85034070e+00, 6.24177531e+00, 5.68727320e+00, 5.18203152e+00,
4.72167412e+00, 4.30221361e+00, 3.92001681e+00, 3.57177332e+00,
3.25446682e+00, 2.96534896e+00, 2.70191553e+00, 2.46188480e+00,
2.24317774e+00, 2.04390001e+00, 1.86232557e+00, 1.69688170e+00,
1.54613540e+00, 1.40878100e+00, 1.28362878e+00, 1.16959473e+00,
1.06569116e+00, 9.71018095e-01, 8.84755524e-01, 8.06156282e-01,
7.34539579e-01, 6.69285108e-01, 6.09827663e-01, 5.55652254e-01,
5.06289640e-01, 4.61312263e-01, 4.20330553e-01, 3.82989545e-01,
3.48965810e-01, 3.17964649e-01, 2.89717546e-01, 2.63979838e-01,
2.40528596e-01, 2.19160699e-01, 1.99691066e-01, 1.81951062e-01,
1.65787032e-01, 1.51058969e-01, 1.37639306e-01, 1.25411810e-01,
1.14270570e-01, 1.04119088e-01, 9.48694348e-02, 8.64414957e-02,
7.87622714e-02, 7.17652483e-02, 6.53898214e-02, 5.95807699e-02,
5.42877785e-02, 4.94650019e-02, 4.50706675e-02, 4.10667136e-02,
3.74184599e-02, 3.40943071e-02, 3.10654628e-02, 2.83056927e-02,
2.57910930e-02, 2.34998834e-02, 2.14122185e-02, 1.95100160e-02,
1.77768000e-02, 1.61975581e-02, 1.47586116e-02, 1.34474973e-02]), 'npasses': 1211, 'jerr': 0, 'dim': array([10, 88]), 'offset': False, 'class': 'elnet'}, 'lambda_min': array([1.28362878]), 'lambda_1se': array([7.51824046]), 'class': 'cvglmnet'}
dataset: fetch_california_housing -------------------
{'alpha': 1.0, 'dfmax': 10000000000.0, 'exclude': None, 'family': 'gaussian', 'lambdau': None, 'lower_lambdau': None, 'maxit': 100000.0, 'ncores': -1, 'nlambda': 100, 'parallel': False, 'penalty_factor': None, 'pmax': 10000000000.0, 'standardize': True, 'thresh': 1e-07, 'type_measure': 1, 'upper_lambdau': None, 'verbose': False, 'weights': None}
elapsed: 0.0047762393951416016
df %dev lambdau
0 0.000000 0.000000 0.790539
1 1.000000 0.079846 0.720310
2 1.000000 0.146136 0.656320
3 1.000000 0.201171 0.598014
4 1.000000 0.246862 0.544888
5 1.000000 0.284796 0.496482
6 1.000000 0.316289 0.452376
7 1.000000 0.342435 0.412188
8 1.000000 0.364142 0.375570
9 1.000000 0.382163 0.342206
10 1.000000 0.397125 0.311805
11 1.000000 0.409546 0.284105
12 1.000000 0.419859 0.258866
13 1.000000 0.428421 0.235869
14 1.000000 0.435529 0.214915
15 1.000000 0.441430 0.195823
16 2.000000 0.451591 0.178426
17 2.000000 0.460828 0.162575
18 2.000000 0.468496 0.148133
19 2.000000 0.474863 0.134973
20 2.000000 0.480149 0.122982
21 3.000000 0.484680 0.112057
22 3.000000 0.489706 0.102102
23 3.000000 0.493879 0.093032
24 3.000000 0.497344 0.084767
25 3.000000 0.500220 0.077236
26 3.000000 0.502608 0.070375
27 4.000000 0.507848 0.064123
28 4.000000 0.521856 0.058427
29 4.000000 0.533472 0.053236
30 4.000000 0.543117 0.048507
31 4.000000 0.551159 0.044198
32 4.000000 0.557809 0.040271
33 6.000000 0.563606 0.036694
34 6.000000 0.569117 0.033434
35 6.000000 0.573708 0.030464
36 6.000000 0.577542 0.027757
37 6.000000 0.580708 0.025291
38 6.000000 0.583337 0.023045
39 6.000000 0.585536 0.020997
40 6.000000 0.587350 0.019132
41 7.000000 0.589628 0.017432
42 7.000000 0.591806 0.015884
43 7.000000 0.593641 0.014473
44 7.000000 0.595162 0.013187
45 7.000000 0.596442 0.012015
46 7.000000 0.597491 0.010948
47 7.000000 0.598376 0.009975
48 7.000000 0.599099 0.009089
49 7.000000 0.599711 0.008282
50 7.000000 0.600209 0.007546
51 7.000000 0.600633 0.006876
52 7.000000 0.600976 0.006265
53 7.000000 0.601269 0.005708
54 7.000000 0.601506 0.005201
55 7.000000 0.601709 0.004739
56 7.000000 0.601873 0.004318
57 7.000000 0.602014 0.003935
58 7.000000 0.602126 0.003585
59 7.000000 0.602224 0.003267
60 7.000000 0.602306 0.002976
61 7.000000 0.602371 0.002712
62 7.000000 0.602427 0.002471
63 7.000000 0.602471 0.002251
64 7.000000 0.602511 0.002051
65 7.000000 0.602544 0.001869
66 7.000000 0.602569 0.001703
67 7.000000 0.602592 0.001552
68 7.000000 0.602612 0.001414
69 7.000000 0.602626 0.001288
70 7.000000 0.602639 0.001174
71 7.000000 0.602651 0.001070
72 7.000000 0.602659 0.000975
73 7.000000 0.602668 0.000888
74 8.000000 0.602674 0.000809
75 8.000000 0.602680 0.000737
[2.15386169 1.40517538 1.75155998 ... 1.5786708 2.24914669 2.74749123]
[[2.15386169 2.0965379 ]
[1.40517538 1.73841308]
[1.75155998 1.96630653]
...
[1.5786708 1.82758546]
[2.24914669 2.09450709]
[2.74749123 2.33255459]]
[2.0965379 1.73841308 1.96630653 ... 1.82758546 2.09450709 2.33255459]
elapsed: 0.08082914352416992
best lambda: 0.0029763296520373566
best lambda std. dev: 0.015883776165844302
best coef: [-2.89480122e+01 3.87657120e-01 1.00434474e-02 -1.47638444e-02
1.56518514e-01 0.00000000e+00 -2.28921823e-03 -3.44888900e-01
-3.46534665e-01]
best GLMNet: {'lambdau': array([7.90539283e-01, 7.20309952e-01, 6.56319601e-01, 5.98013976e-01,
5.44888063e-01, 4.96481709e-01, 4.52375642e-01, 4.12187837e-01,
3.75570206e-01, 3.42205584e-01, 3.11804983e-01, 2.84105088e-01,
2.58865975e-01, 2.35869035e-01, 2.14915080e-01, 1.95822617e-01,
1.78426275e-01, 1.62575377e-01, 1.48132628e-01, 1.34972934e-01,
1.22982310e-01, 1.12056901e-01, 1.02102075e-01, 9.30316077e-02,
8.47669361e-02, 7.72364751e-02, 7.03749995e-02, 6.41230785e-02,
5.84265609e-02, 5.32361063e-02, 4.85067573e-02, 4.41975507e-02,
4.02711621e-02, 3.66935831e-02, 3.34338263e-02, 3.04636573e-02,
2.77573499e-02, 2.52914635e-02, 2.30446396e-02, 2.09974173e-02,
1.91320646e-02, 1.74324247e-02, 1.58837762e-02, 1.44727053e-02,
1.31869900e-02, 1.20154942e-02, 1.09480708e-02, 9.97547435e-03,
9.08928070e-03, 8.28181406e-03, 7.54608052e-03, 6.87570753e-03,
6.26488862e-03, 5.70833318e-03, 5.20122059e-03, 4.73915849e-03,
4.31814471e-03, 3.93453264e-03, 3.58499960e-03, 3.26651812e-03,
2.97632965e-03, 2.71192073e-03, 2.47100117e-03, 2.25148423e-03,
2.05146858e-03, 1.86922176e-03, 1.70316525e-03, 1.55186075e-03,
1.41399772e-03, 1.28838206e-03, 1.17392574e-03, 1.06963742e-03,
9.74613777e-04, 8.88031775e-04, 8.09141480e-04, 7.37259581e-04]), 'cvm': array([1.32794354, 1.22292703, 1.13481835, 1.06167325, 1.00095081,
0.95054152, 0.90869409, 0.87395456, 0.84511588, 0.82117596,
0.80130284, 0.78480587, 0.77111164, 0.75974414, 0.75030818,
0.74244809, 0.72908801, 0.71682279, 0.70664152, 0.69819024,
0.69117511, 0.68511138, 0.6785909 , 0.67305371, 0.66845763,
0.66464277, 0.66147643, 0.65464086, 0.63599482, 0.6205351 ,
0.6076784 , 0.59699995, 0.58815497, 0.5801874 , 0.57304095,
0.56700578, 0.56200649, 0.55794318, 0.55464407, 0.55190472,
0.54967906, 0.54750774, 0.54502344, 0.54289758, 0.54115612,
0.53975023, 0.53861561, 0.53769851, 0.5369726 , 0.53638576,
0.53592615, 0.53556223, 0.53527887, 0.53506031, 0.5348938 ,
0.53477021, 0.5346767 , 0.53461606, 0.53457253, 0.53454878,
0.53453534, 0.53454145, 0.53454912, 0.53456553, 0.53458344,
0.53460438, 0.53463299, 0.53466219, 0.53468341, 0.53471083,
0.53473878, 0.53475851, 0.53478052, 0.53480475, 0.53482847,
0.53484363]), 'cvsd': array([0.01178608, 0.01293004, 0.01366089, 0.01428224, 0.01477845,
0.01515521, 0.01542664, 0.0156092 , 0.01571898, 0.01577054,
0.01577649, 0.01574747, 0.0156923 , 0.01561818, 0.01553091,
0.01543565, 0.01536175, 0.01521551, 0.01507116, 0.01493074,
0.01479571, 0.01467942, 0.01454884, 0.01441053, 0.01428044,
0.01415866, 0.01404511, 0.01406737, 0.01382548, 0.01361343,
0.01342251, 0.01325314, 0.01310364, 0.01285698, 0.01259207,
0.01240794, 0.0122745 , 0.01218093, 0.01211811, 0.01209786,
0.0120965 , 0.01213827, 0.01201918, 0.0118633 , 0.01172291,
0.01160189, 0.01150054, 0.01141143, 0.01133933, 0.01127929,
0.01122912, 0.01118883, 0.01115662, 0.01113133, 0.01111179,
0.01109708, 0.01108679, 0.01107902, 0.0110729 , 0.01107114,
0.01106948, 0.01107177, 0.01107547, 0.01108005, 0.01108465,
0.0110883 , 0.01109275, 0.01109805, 0.01110229, 0.01110699,
0.01111198, 0.01111601, 0.0111207 , 0.01112377, 0.01112778,
0.01113096]), 'cvup': array([1.33972961, 1.23585706, 1.14847923, 1.07595549, 1.01572926,
0.96569674, 0.92412073, 0.88956376, 0.86083487, 0.8369465 ,
0.81707933, 0.80055334, 0.78680394, 0.77536232, 0.76583909,
0.75788374, 0.74444976, 0.7320383 , 0.72171268, 0.71312098,
0.70597082, 0.6997908 , 0.69313973, 0.68746424, 0.68273806,
0.67880143, 0.67552153, 0.66870824, 0.6498203 , 0.63414852,
0.62110091, 0.61025309, 0.60125861, 0.59304439, 0.58563302,
0.57941372, 0.57428099, 0.57012411, 0.56676219, 0.56400258,
0.56177556, 0.559646 , 0.55704263, 0.55476088, 0.55287903,
0.55135213, 0.55011615, 0.54910994, 0.54831193, 0.54766505,
0.54715527, 0.54675106, 0.54643549, 0.54619164, 0.54600558,
0.5458673 , 0.54576349, 0.54569508, 0.54564544, 0.54561992,
0.54560483, 0.54561322, 0.54562459, 0.54564558, 0.54566809,
0.54569267, 0.54572575, 0.54576024, 0.5457857 , 0.54581782,
0.54585076, 0.54587452, 0.54590122, 0.54592852, 0.54595624,
0.54597458]), 'cvlo': array([1.31615746, 1.20999699, 1.12115746, 1.04739101, 0.98617236,
0.93538631, 0.89326745, 0.85834536, 0.8293969 , 0.80540542,
0.78552635, 0.7690584 , 0.75541934, 0.74412596, 0.73477727,
0.72701244, 0.71372626, 0.70160729, 0.69157036, 0.6832595 ,
0.67637941, 0.67043197, 0.66404206, 0.65864319, 0.65417719,
0.65048411, 0.64743132, 0.64057349, 0.62216933, 0.60692167,
0.59425588, 0.58374681, 0.57505134, 0.56733042, 0.56044887,
0.55459784, 0.54973199, 0.54576224, 0.54252596, 0.53980686,
0.53758257, 0.53536947, 0.53300426, 0.53103428, 0.52943322,
0.52814834, 0.52711506, 0.52628708, 0.52563327, 0.52510648,
0.52469703, 0.5243734 , 0.52412225, 0.52392899, 0.52378201,
0.52367313, 0.52358991, 0.52353705, 0.52349963, 0.52347764,
0.52346586, 0.52346967, 0.52347365, 0.52348548, 0.52349879,
0.52351608, 0.52354024, 0.52356414, 0.52358113, 0.52360384,
0.5236268 , 0.5236425 , 0.52365982, 0.52368098, 0.52370069,
0.52371267]), 'nzero': array([0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3,
3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 8, 8]), 'name': 'Mean-Squared Error', 'glmnet_fit': {'a0': array([ 2.07155206, 1.92869743, 1.79853362, 1.6799332 ,
1.57186891, 1.47340475, 1.38368788, 1.30194121,
1.22745669, 1.15958917, 1.09775081, 1.041406 ,
0.99006671, 0.94328826, 0.90066548, 0.86182919,
0.78419117, 0.70670286, 0.6360984 , 0.57176624,
0.51314918, 0.47506172, 0.58273794, 0.68084849,
0.77024317, 0.85169627, 0.92591331, 0.1340664 ,
-3.21802346, -6.26774253, -9.04592177, -11.588099 ,
-13.8944293 , -16.01874985, -18.08798159, -19.97898154,
-21.71286443, -23.2830481 , -24.71279238, -26.02599746,
-27.21307621, -28.26744172, -28.9480122 , -29.62513797,
-30.24990969, -30.82753782, -31.34777174, -31.82823844,
-32.25971861, -32.65911099, -33.01681384, -33.34884939,
-33.64528032, -33.92136384, -34.16690208, -34.39649895,
-34.59975107, -34.79071341, -34.95880945, -35.11764682,
-35.26352689, -35.39020017, -35.51056873, -35.61482234,
-35.71486741, -35.80749594, -35.88589466, -35.96166964,
-36.03222638, -36.0907743 , -36.14802348, -36.20180937,
-36.24520991, -36.29447496, -36.33196639, -36.3685333 ]), 'beta': array([[ 0.00000000e+00, 3.69089068e-02, 7.05389280e-02,
1.01181351e-01, 1.29101585e-01, 1.54541463e-01,
1.77721332e-01, 1.98841966e-01, 2.18086300e-01,
2.35621021e-01, 2.51598006e-01, 2.66155639e-01,
2.79420013e-01, 2.91506016e-01, 3.02518331e-01,
3.12552343e-01, 3.22748854e-01, 3.32207839e-01,
3.40826515e-01, 3.48679531e-01, 3.55834907e-01,
3.62318017e-01, 3.67885055e-01, 3.72957535e-01,
3.77579389e-01, 3.81790650e-01, 3.85627795e-01,
3.88095546e-01, 3.87056210e-01, 3.86116846e-01,
3.85261953e-01, 3.84464984e-01, 3.83755508e-01,
3.83125549e-01, 3.82627801e-01, 3.82165258e-01,
3.81726613e-01, 3.81342219e-01, 3.80993473e-01,
3.80659146e-01, 3.80369485e-01, 3.82822873e-01,
3.87657120e-01, 3.91827446e-01, 3.95567845e-01,
3.99006714e-01, 4.02098409e-01, 4.04953310e-01,
4.07516028e-01, 4.09888544e-01, 4.12012573e-01,
4.13984518e-01, 4.15744153e-01, 4.17383249e-01,
4.18840038e-01, 4.20202429e-01, 4.21407331e-01,
4.22539521e-01, 4.23534647e-01, 4.24475087e-01,
4.25342286e-01, 4.26092646e-01, 4.26805003e-01,
4.27418926e-01, 4.28007930e-01, 4.28556755e-01,
4.29017421e-01, 4.29462125e-01, 4.29879179e-01,
4.30220014e-01, 4.30552663e-01, 4.30867949e-01,
4.31115041e-01, 4.31407474e-01, 4.31626377e-01,
4.31836134e-01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 1.33369662e-03, 2.76189648e-03,
4.06321901e-03, 5.24893562e-03, 6.32931645e-03,
7.31389310e-03, 8.21261862e-03, 9.03150383e-03,
9.77764153e-03, 1.04574944e-02, 1.10769510e-02,
1.14705147e-02, 1.12926558e-02, 1.11317164e-02,
1.09852236e-02, 1.08491056e-02, 1.07275245e-02,
1.06183086e-02, 1.05377293e-02, 1.04630180e-02,
1.03924495e-02, 1.03303673e-02, 1.02740180e-02,
1.02202715e-02, 1.01734710e-02, 1.00897532e-02,
1.00434474e-02, 9.99046829e-03, 9.94106280e-03,
9.89364905e-03, 9.85243264e-03, 9.81285130e-03,
9.77878640e-03, 9.74577568e-03, 9.71766435e-03,
9.69012117e-03, 9.66695992e-03, 9.64396712e-03,
9.62492522e-03, 9.60572308e-03, 9.59011509e-03,
9.57407572e-03, 9.56133872e-03, 9.54794508e-03,
9.53533462e-03, 9.52584661e-03, 9.51565144e-03,
9.50810723e-03, 9.49964598e-03, 9.49143017e-03,
9.48588215e-03, 9.47946949e-03, 9.47311374e-03,
9.46919501e-03, 9.46440835e-03, 9.45950373e-03,
9.45688749e-03, 9.45132176e-03, 9.44858552e-03,
9.44130490e-03],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, -5.45971201e-03,
-1.47638444e-02, -2.29527847e-02, -3.03213745e-02,
-3.71200603e-02, -4.32155195e-02, -4.88623884e-02,
-5.39137617e-02, -5.86079073e-02, -6.27931194e-02,
-6.66959919e-02, -7.01616765e-02, -7.34069655e-02,
-7.62745909e-02, -7.89731217e-02, -8.13432337e-02,
-8.35867927e-02, -8.55424347e-02, -8.74068838e-02,
-8.91292068e-02, -9.06024308e-02, -9.20153003e-02,
-9.32180705e-02, -9.43866996e-02, -9.54795488e-02,
-9.63807212e-02, -9.72633558e-02, -9.80951779e-02,
-9.87597389e-02, -9.94198884e-02, -1.00049959e-01,
-1.00529109e-01, -1.01125665e-01, -1.01555120e-01,
-1.01981360e-01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
1.00918465e-03, 1.50789768e-02, 2.79207547e-02,
3.96644557e-02, 5.03268779e-02, 6.00383505e-02,
6.89282660e-02, 7.69912154e-02, 1.08926792e-01,
1.56518514e-01, 1.98800026e-01, 2.36935453e-01,
2.72099839e-01, 3.03666611e-01, 3.32875537e-01,
3.59039527e-01, 3.83318526e-01, 4.04999643e-01,
4.25184157e-01, 4.43141508e-01, 4.59923618e-01,
4.74786190e-01, 4.88739712e-01, 5.01028304e-01,
5.12628595e-01, 5.22773570e-01, 5.32413696e-01,
5.41308481e-01, 5.48953012e-01, 5.56257505e-01,
5.62508429e-01, 5.68553441e-01, 5.74194278e-01,
5.78881710e-01, 5.83448614e-01, 5.87740830e-01,
5.91206035e-01, 5.94626325e-01, 5.97878609e-01,
6.00389627e-01, 6.03440785e-01, 6.05675334e-01,
6.07851891e-01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, -4.56049861e-08,
-1.37960902e-07],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
-2.97720011e-04, -6.06104752e-04, -8.87055238e-04,
-1.14297406e-03, -1.37622247e-03, -1.58875608e-03,
-1.78233867e-03, -1.95878727e-03, -2.12742966e-03,
-2.28921823e-03, -2.43583353e-03, -2.56923372e-03,
-2.69085768e-03, -2.80156567e-03, -2.90253796e-03,
-2.99443853e-03, -3.07827340e-03, -3.15456129e-03,
-3.22416821e-03, -3.28749367e-03, -3.34528751e-03,
-3.39785044e-03, -3.44583583e-03, -3.48946222e-03,
-3.52930315e-03, -3.56550844e-03, -3.59858591e-03,
-3.62875304e-03, -3.65613482e-03, -3.68115910e-03,
-3.70386703e-03, -3.72463531e-03, -3.74359159e-03,
-3.76075916e-03, -3.77646690e-03, -3.79081127e-03,
-3.80377682e-03, -3.81564822e-03, -3.82649783e-03,
-3.83627584e-03, -3.84536500e-03, -3.85318347e-03,
-3.85991343e-03],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
-4.26219891e-04, -4.77517531e-03, -8.73778142e-03,
-1.23483605e-02, -1.56381857e-02, -1.86357520e-02,
-2.97667447e-02, -6.62889605e-02, -9.95115211e-02,
-1.29775326e-01, -1.57480595e-01, -1.82604176e-01,
-2.05773443e-01, -2.28277678e-01, -2.48849642e-01,
-2.67724200e-01, -2.84806306e-01, -3.00359530e-01,
-3.14656427e-01, -3.27569944e-01, -3.38568753e-01,
-3.44888900e-01, -3.51353261e-01, -3.57344154e-01,
-3.62896139e-01, -3.67890055e-01, -3.72510103e-01,
-3.76651761e-01, -3.80492916e-01, -3.83925855e-01,
-3.87119816e-01, -3.89964150e-01, -3.92620478e-01,
-3.94975913e-01, -3.97185544e-01, -3.99134759e-01,
-4.00973133e-01, -4.02584619e-01, -4.04114281e-01,
-4.05519980e-01, -4.06733693e-01, -4.07893189e-01,
-4.08891532e-01, -4.09855867e-01, -4.10749904e-01,
-4.11500238e-01, -4.12230965e-01, -4.12912688e-01,
-4.13472586e-01, -4.14025097e-01, -4.14545665e-01,
-4.14960450e-01, -4.15441966e-01, -4.15806129e-01,
-4.16167317e-01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
-9.76512662e-03, -4.87586353e-02, -8.42328207e-02,
-1.16548205e-01, -1.46123093e-01, -1.72949928e-01,
-1.97665029e-01, -2.21590530e-01, -2.43457677e-01,
-2.63512693e-01, -2.81670091e-01, -2.98203052e-01,
-3.13392971e-01, -3.27119885e-01, -3.39115060e-01,
-3.46534665e-01, -3.53989218e-01, -3.60878352e-01,
-3.67252791e-01, -3.72991489e-01, -3.78294486e-01,
-3.83054050e-01, -3.87462486e-01, -3.91408025e-01,
-3.95073220e-01, -3.98342708e-01, -4.01390500e-01,
-4.04098459e-01, -4.06633290e-01, -4.08874683e-01,
-4.10983195e-01, -4.12836695e-01, -4.14590719e-01,
-4.16201933e-01, -4.17598433e-01, -4.18927767e-01,
-4.20076927e-01, -4.21182071e-01, -4.22205705e-01,
-4.23069724e-01, -4.23906908e-01, -4.24686908e-01,
-4.25332016e-01, -4.25964718e-01, -4.26559676e-01,
-4.27037839e-01, -4.27584528e-01, -4.27999904e-01,
-4.28408964e-01]]), 'dev': array([0. , 0.07984633, 0.14613615, 0.20117112, 0.24686213,
0.2847956 , 0.31628864, 0.34243471, 0.36414164, 0.38216311,
0.39712485, 0.40954636, 0.4198589 , 0.42842056, 0.43552861,
0.44142983, 0.45159058, 0.46082763, 0.4684964 , 0.47486315,
0.48014893, 0.48467976, 0.48970616, 0.49387917, 0.49734368,
0.50021998, 0.50260793, 0.50784849, 0.52185647, 0.53347216,
0.54311702, 0.55115888, 0.55780898, 0.56360578, 0.56911693,
0.57370753, 0.57754161, 0.58070799, 0.58333684, 0.58553607,
0.58734952, 0.58962779, 0.59180646, 0.59364078, 0.59516199,
0.59644205, 0.59749115, 0.59837571, 0.59909904, 0.59971064,
0.60020939, 0.6006325 , 0.60097642, 0.60126932, 0.60150647,
0.60170939, 0.60187289, 0.6020136 , 0.60212631, 0.60222397,
0.60230602, 0.6023707 , 0.6024271 , 0.60247148, 0.60251067,
0.60254397, 0.60256948, 0.6025922 , 0.60261166, 0.60262621,
0.60263938, 0.6026508 , 0.60265903, 0.60266799, 0.60267415,
0.60267971]), 'nulldev': array([0.71665039, 1.22201514, 1.94477545, ..., 0.48518478, 1.44347715,
0.36306899]), 'df': array([0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 3,
3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7,
7, 7, 7, 7, 7, 7, 7, 7, 8, 8]), 'lambdau': array([7.90539283e-01, 7.20309952e-01, 6.56319601e-01, 5.98013976e-01,
5.44888063e-01, 4.96481709e-01, 4.52375642e-01, 4.12187837e-01,
3.75570206e-01, 3.42205584e-01, 3.11804983e-01, 2.84105088e-01,
2.58865975e-01, 2.35869035e-01, 2.14915080e-01, 1.95822617e-01,
1.78426275e-01, 1.62575377e-01, 1.48132628e-01, 1.34972934e-01,
1.22982310e-01, 1.12056901e-01, 1.02102075e-01, 9.30316077e-02,
8.47669361e-02, 7.72364751e-02, 7.03749995e-02, 6.41230785e-02,
5.84265609e-02, 5.32361063e-02, 4.85067573e-02, 4.41975507e-02,
4.02711621e-02, 3.66935831e-02, 3.34338263e-02, 3.04636573e-02,
2.77573499e-02, 2.52914635e-02, 2.30446396e-02, 2.09974173e-02,
1.91320646e-02, 1.74324247e-02, 1.58837762e-02, 1.44727053e-02,
1.31869900e-02, 1.20154942e-02, 1.09480708e-02, 9.97547435e-03,
9.08928070e-03, 8.28181406e-03, 7.54608052e-03, 6.87570753e-03,
6.26488862e-03, 5.70833318e-03, 5.20122059e-03, 4.73915849e-03,
4.31814471e-03, 3.93453264e-03, 3.58499960e-03, 3.26651812e-03,
2.97632965e-03, 2.71192073e-03, 2.47100117e-03, 2.25148423e-03,
2.05146858e-03, 1.86922176e-03, 1.70316525e-03, 1.55186075e-03,
1.41399772e-03, 1.28838206e-03, 1.17392574e-03, 1.06963742e-03,
9.74613777e-04, 8.88031775e-04, 8.09141480e-04, 7.37259581e-04]), 'npasses': 800, 'jerr': 0, 'dim': array([ 8, 76]), 'offset': False, 'class': 'elnet'}, 'lambda_min': array([0.00297633]), 'lambda_1se': array([0.01588378]), 'class': 'cvglmnet'}
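The cross-validated fit shown above is a plain Python dictionary, so the selected penalties and the coefficient path can be inspected directly. Below is a minimal sketch, assuming the result printed above is stored in a variable named cvfit (a hypothetical name); the keys used ('lambda_min', 'lambda_1se', 'glmnet_fit', 'lambdau', 'a0', 'beta') are the ones visible in the printed output.

import numpy as np

# cvfit: the cross-validated fit printed above (hypothetical variable name)
lam_grid = cvfit['glmnet_fit']['lambdau']          # regularization path (76 values here)
lam_min = cvfit['lambda_min'][0]                   # lambda minimizing the CV error
idx = int(np.argmin(np.abs(lam_grid - lam_min)))   # closest point on the grid

intercept = cvfit['glmnet_fit']['a0'][idx]         # intercept at lambda_min
coefs = cvfit['glmnet_fit']['beta'][:, idx]        # one coefficient per feature (8 here)
print(f"lambda_min = {lam_min:.5f}, lambda_1se = {cvfit['lambda_1se'][0]:.5f}")
print("nonzero coefficients:", np.flatnonzero(coefs))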
2 - GLMNet + nnetsauce
import glmnetforpython as glmnet
import mlsauce as ms
import nnetsauce as ns
from sklearn.datasets import load_breast_cancer, load_wine, load_iris
from sklearn.model_selection import train_test_split
from time import time
for dataset in [load_breast_cancer, load_wine, load_iris]:

    print(f"\n\n dataset: {dataset.__name__} -----")

    X, y = dataset(return_X_y=True)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                        random_state=123)

    # mlsauce: generic gradient boosting with a multi-task GLMNet base learner
    regr = ms.MultiTaskRegressor(glmnet.GLMNet(lambdau=1000))
    model = ms.GenericBoostingClassifier(regr, tolerance=1e-2)
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using accuracy)
    accuracy = model.score(X_test, y_test)
    print(f"Accuracy: {accuracy}")

    # nnetsauce: CustomClassifier around a multi-task GLMNet
    model = ns.CustomClassifier(ns.MultitaskClassifier(glmnet.GLMNet(lambdau=1000)),
                                n_hidden_features=10)
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using accuracy)
    accuracy = model.score(X_test, y_test)
    print(f"Accuracy: {accuracy}")

    # nnetsauce: CustomClassifier around a simple multi-task GLMNet
    model = ns.CustomClassifier(ns.SimpleMultitaskClassifier(glmnet.GLMNet(lambdau=1000)))
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using accuracy)
    accuracy = model.score(X_test, y_test)
    print(f"Accuracy: {accuracy}")

    # nnetsauce: deep (layered) multi-task GLMNet
    model = ns.DeepClassifier(ns.MultitaskClassifier(glmnet.GLMNet(lambdau=1000)))
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using accuracy)
    accuracy = model.score(X_test, y_test)
    print(f"Accuracy: {accuracy}")

    # nnetsauce: deep (layered) simple multi-task GLMNet
    model = ns.DeepClassifier(ns.SimpleMultitaskClassifier(glmnet.GLMNet(lambdau=1000)))
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using accuracy)
    accuracy = model.score(X_test, y_test)
    print(f"Accuracy: {accuracy}")
dataset: load_breast_cancer -----
100%|██████████| 100/100 [00:18<00:00, 5.46it/s]
Training time: 18.33358597755432 seconds
Accuracy: 0.9649122807017544
100%|██████████| 100/100 [00:18<00:00, 5.31it/s]
Training time: 18.904021501541138 seconds
Accuracy: 0.9649122807017544
100%|██████████| 100/100 [00:12<00:00, 8.24it/s]
Training time: 12.280655860900879 seconds
Accuracy: 0.9649122807017544
100%|██████████| 100/100 [00:23<00:00, 4.32it/s]
Training time: 23.297285318374634 seconds
Accuracy: 0.9649122807017544
100%|██████████| 100/100 [00:24<00:00, 4.08it/s]
Training time: 24.91062593460083 seconds
Accuracy: 0.9649122807017544
dataset: load_wine -----
100%|██████████| 100/100 [00:03<00:00, 28.64it/s]
Training time: 3.5058298110961914 seconds
Accuracy: 1.0
100%|██████████| 100/100 [00:05<00:00, 16.76it/s]
Training time: 6.019681453704834 seconds
Accuracy: 1.0
100%|██████████| 100/100 [00:08<00:00, 11.76it/s]
Training time: 8.692431688308716 seconds
Accuracy: 1.0
100%|██████████| 100/100 [00:20<00:00, 4.85it/s]
Training time: 20.893232583999634 seconds
Accuracy: 1.0
100%|██████████| 100/100 [00:13<00:00, 7.42it/s]
Training time: 13.870125532150269 seconds
Accuracy: 1.0
dataset: load_iris -----
14%|█▍ | 14/100 [00:00<00:05, 16.97it/s]
Training time: 0.8306210041046143 seconds
Accuracy: 0.9333333333333333
100%|██████████| 14/14 [00:00<00:00, 35.76it/s]
Training time: 0.40160202980041504 seconds
Accuracy: 0.9333333333333333
100%|██████████| 14/14 [00:00<00:00, 30.18it/s]
Training time: 0.47559595108032227 seconds
Accuracy: 0.9333333333333333
100%|██████████| 14/14 [00:00<00:00, 30.39it/s]
Training time: 0.4738032817840576 seconds
Accuracy: 0.9333333333333333
100%|██████████| 14/14 [00:00<00:00, 26.63it/s]
Training time: 0.5447156429290771 seconds
Accuracy: 0.9333333333333333
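Since the nnetsauce wrappers are meant to follow the scikit-learn estimator API, they should also plug into scikit-learn's model-selection utilities. The snippet below is a sketch under that assumption (it is not part of the benchmarks above), running cross_val_score on the wine data with the same wrapper used earlier:

import glmnetforpython as glmnet
import nnetsauce as ns
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score

X, y = load_wine(return_X_y=True)
# Same wrapper as above; assumes it exposes get_params/score like any scikit-learn classifier
clf = ns.CustomClassifier(ns.MultitaskClassifier(glmnet.GLMNet(lambdau=1000)),
                          n_hidden_features=10)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")

The same wrappers also handle regression, as in the next example on the diabetes and California housing data sets.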
from sklearn.datasets import load_diabetes, fetch_california_housing

for dataset in [load_diabetes, fetch_california_housing]:

    print(f"\n\n dataset: {dataset.__name__} -----")

    X, y = dataset(return_X_y=True)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                        random_state=123)

    # mlsauce: generic gradient boosting with a GLMNet base learner
    regr = glmnet.GLMNet(lambdau=1000)
    model = ms.GenericBoostingRegressor(regr, backend="cpu", tolerance=1e-2)
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using RMSE)
    preds = model.predict(X_test)
    rmse = ((preds - y_test)**2).mean()**0.5
    print(f"RMSE: {rmse}")

    # nnetsauce: CustomRegressor around GLMNet
    model = ns.CustomRegressor(regr)
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using RMSE)
    preds = model.predict(X_test)
    rmse = ((preds - y_test)**2).mean()**0.5
    print(f"RMSE: {rmse}")

    # nnetsauce: deep (layered) GLMNet regressor
    model = ns.DeepRegressor(regr)
    # Train the model on the training data
    start_time = time()
    model.fit(X_train, y_train)
    end_time = time()
    print(f"Training time: {end_time - start_time} seconds")
    # Evaluate the model's performance (using RMSE)
    preds = model.predict(X_test)
    rmse = ((preds - y_test)**2).mean()**0.5
    print(f"RMSE: {rmse}")
dataset: load_diabetes -----
57%|█████▋ | 57/100 [00:00<00:00, 230.67it/s]
Training time: 0.25351572036743164 seconds
RMSE: 50.47735955241068
Training time: 0.04386782646179199 seconds
RMSE: 51.2098185574396
Training time: 0.09994053840637207 seconds
RMSE: 51.02354464725009
dataset: fetch_california_housing -----
52%|█████▏ | 52/100 [00:00<00:00, 58.32it/s]
Training time: 0.9048025608062744 seconds
RMSE: 0.8216935762732704
Training time: 0.1747438907623291 seconds
RMSE: 0.8218417233321206
Training time: 0.512531042098999 seconds
RMSE: 0.8218417233321208
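Because GLMNet itself exposes fit and predict, it can also be combined with other scikit-learn tooling. The following is a small illustrative sketch (not part of the benchmarks above) that places GLMNet behind a StandardScaler in a Pipeline on the diabetes data; the lambdau value is the same illustrative one used above, not a tuned choice.

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
import glmnetforpython as glmnet

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=123)
# Pipeline only needs fit/predict from the final step
pipe = Pipeline([("scaler", StandardScaler()),
                 ("glmnet", glmnet.GLMNet(lambdau=1000))])
pipe.fit(X_train, y_train)
preds = pipe.predict(X_test)
rmse = ((preds - y_test) ** 2).mean() ** 0.5
print(f"RMSE: {rmse}")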
For attribution, please cite this work as:
T. Moudiki (2024-11-18). GLMNet in Python: Generalized Linear Models. Retrieved from https://thierrymoudiki.github.io/blog/2024/11/18/python/r/GLMNet-post
BibTeX citation:

@misc{tmoudiki20241118,
  author = {T. Moudiki},
  title = {GLMNet in Python: Generalized Linear Models},
  url = {https://thierrymoudiki.github.io/blog/2024/11/18/python/r/GLMNet-post},
  year = {2024}
}