Give the Techtonique web app a try today: a tool designed to help you make informed, data-driven decisions using Mathematics, Statistics, Machine Learning, and Data Visualization.
Disclaimer: I have no affiliation with Google (cf. JAX).
nnetsauce is a general-purpose tool for Statistical/Machine Learning, in which pattern recognition is achieved by using quasi-randomized networks. A new version, 0.5.0, is out on PyPI and for R:
- Install by using pip (stable version):
pip install nnetsauce --upgrade
- Install from Github (development version):
pip install git+https://github.com/Techtonique/nnetsauce.git --upgrade
- Install from GitHub, in the R console:
library(devtools)
devtools::install_github("thierrymoudiki/nnetsauce/R-package")
library(nnetsauce)
This could be the occasion for you to re-read all the previous posts about nnetsauce, or to play with various examples in Python or R. Here are a few other ways to interact with the nnetsauce:
1) Forms
- If you’re not comfortable with version control yet: a feedback form.
2) Submit Pull Requests on GitHub
- As detailed in this post. Raising issues is another constructive way to interact. You can also contribute examples to this demo repo, using the following naming convention:
yourgithubname_ddmmyy_shortdescriptionofdemo.[ipynb|Rmd]
If it’s a Jupyter notebook written in R, then just add _R to the suffix.
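For instance (hypothetical names, just to illustrate the convention), a notebook submitted by GitHub user janedoe on 15 June 2020 could be named janedoe_150620_gpu_ridge2_demo.ipynb, or janedoe_150620_gpu_ridge2_demo_R.ipynb if it is written in R.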
3) Reaching out directly via email
- Use the address: thierry dot moudiki at pm dot me
To those who are contacting me through LinkedIn: no, I’m not declining; please add a short message to your request, so that I know a bit more about who you are and/or how we could envisage working together.
This new version, 0.5.0:
- contains refactored code for the Base class, and for many other utilities.
- makes use of randtoolbox for a faster, more scalable generation of quasi-random numbers.
- contains a (work in progress) implementation of most algorithms on GPUs, using JAX. Most of nnetsauce’s GPU-related changes currently target potentially time-consuming operations such as matrix multiplications and matrix inversions. However, to see a GPU effect, you need to have loads of data at hand, and a relatively high n_hidden_features parameter. How do you try it out? By instantiating a class with the option backend = "gpu" or backend = "tpu".
An example can be found in this notebook, on GitHub.
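In the meantime, here is a minimal, hypothetical sketch (not taken from the notebook) of what this could look like in Python. It assumes that Ridge2Regressor accepts the n_hidden_features and backend keyword arguments described above, and that JAX is installed with GPU support; adjust the class name to the model you actually use:

import nnetsauce as ns
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split

# toy data; a real GPU speedup only shows up with much larger datasets
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=123)

# relatively high n_hidden_features; matrix operations dispatched to the GPU
regr = ns.Ridge2Regressor(n_hidden_features=500, backend="gpu")
regr.fit(X_train, y_train)
print(regr.score(X_test, y_test))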
nnetsauce’s future release is planned to be much faster on CPU, due to the use of Cython, as with mlsauce. There are indeed many parts of nnetsauce which can be cythonized. If you’ve ever considered joining the project, now is the right time. For example, among other things, I’m looking for a volunteer to do some testing in R + Python on Microsoft Windows. Expect a smooth onboarding, even if you don’t have a lot of experience.