A computational graph library for machine learning. Its main purpose is to combine mathematical operations into a workflow of your choice. The graph takes care of evaluating the gradients with respect to all inputs, which makes setting up a minimizer straightforward.
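To make the gradient idea concrete, here is a minimal sketch of a computational graph node with reverse-mode gradient accumulation. This is my own illustration of the general technique, not this library's API; the names `Node`, `add`, `mul`, and `backward` are hypothetical.

```python
class Node:
    """A graph node holding a value, its parents, and local derivatives."""
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value
        self.parents = parents    # upstream nodes this node depends on
        self.grad_fns = grad_fns  # local derivative w.r.t. each parent
        self.grad = 0.0           # accumulated gradient of the output w.r.t. this node

def add(a, b):
    # d(a+b)/da = 1 and d(a+b)/db = 1, so the upstream gradient passes through
    return Node(a.value + b.value, (a, b), (lambda g: g, lambda g: g))

def mul(a, b):
    # d(a*b)/da = b and d(a*b)/db = a
    return Node(a.value * b.value, (a, b),
                (lambda g, b=b: g * b.value, lambda g, a=a: g * a.value))

def backward(out):
    """Accumulate gradients of `out` into every node, in reverse topological order."""
    order, seen = [], set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for p in n.parents:
            visit(p)
        order.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(order):
        for p, f in zip(n.parents, n.grad_fns):
            p.grad += f(n.grad)

# z = (x + y) * x, so dz/dx = 2x + y and dz/dy = x
x, y = Node(3.0), Node(4.0)
z = mul(add(x, y), x)
backward(z)
# z.value == 21.0, x.grad == 10.0, y.grad == 3.0
```

Real graph libraries extend this pattern to matrix-valued nodes, where each `grad_fn` becomes a vector-Jacobian product.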
I have aimed for the library to be simple and transparent, so that it is easy to understand and modify to fit individual needs. Performance was not the main objective, as there are plenty of fast alternatives; the aim was smaller, educational models.
It currently supports most of the useful matrix operations, the Adam stochastic minimizer, and modules for simplified deployment of dense, convolutional, and recurrent (vanilla and LSTM) networks.
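For reference, the Adam update rule (Kingma & Ba) can be sketched in plain NumPy. This is a generic illustration of the algorithm, not this library's API; the function name `adam_step` and its signature are my own.

```python
import numpy as np

def adam_step(p, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for parameter p given gradient g.

    m and v are the running first/second moment estimates;
    t is the 1-based step count used for bias correction.
    """
    m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)          # bias-corrected moments
    v_hat = v / (1 - b2 ** t)
    p = p - lr * m_hat / (np.sqrt(v_hat) + eps)
    return p, m, v

# Example: minimize f(p) = p**2, whose gradient is 2p.
p, m, v = 5.0, 0.0, 0.0
for t in range(1, 1001):
    p, m, v = adam_step(p, 2.0 * p, m, v, t, lr=0.1)
# p is now close to the minimum at 0
```

The per-coordinate scaling by `sqrt(v_hat)` is what makes Adam relatively insensitive to the gradient's scale, which is convenient in a small educational setting.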
All feedback and ideas for improvement welcome.
- Basic usage and a simple linear regression model, to be found in tutorial.ipynb
- Setting up, running, and training a dense neural network model
- Setting up, running, and training a convolutional neural network model
- Setting up, running, and training a recurrent neural network model
See http://www.deeplearningbook.org/, section 6.5.1, for more information on computational graphs, and the rest of the book for more on ML/deep learning.
- Python 3.5 or above
- numpy - linear algebra for Python
- scipy - scientific Python library, used here for utilities
MIT