Computation-graph-based deep learning framework
- Automatic differentiation (autograd) support
- Matrix-based operations
- Forward and backward propagation
- Optimizers - Stochastic Gradient Descent, Momentum, Adam, etc.
- Loss functions - log loss, cross-entropy loss, etc.
- Neural network layers - convolution, pooling, etc.
- Model serialization with Protobuf
- Model serving from serialized data
- Distributed trainers supporting both Parameter Server and Ring All-Reduce training strategies
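To illustrate the first few features (computation graph, autograd, forward and backward propagation), here is a minimal scalar autograd sketch. It is illustrative only; the class and method names (`Node`, `backward`) are assumptions, not this framework's actual API.

```python
# Minimal sketch of a scalar computation-graph autograd engine.
# Hypothetical names; not the framework's documented API.

class Node:
    """A node in the computation graph holding a value and its gradient."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents        # upstream nodes this node depends on
        self.backward_fn = None       # propagates self.grad to parents

    def __add__(self, other):
        out = Node(self.value + other.value, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out.backward_fn = _backward
        return out

    def __mul__(self, other):
        out = Node(self.value * other.value, (self, other))
        def _backward():
            self.grad += other.value * out.grad
            other.grad += self.value * out.grad
        out.backward_fn = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then propagate gradients in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node.backward_fn:
                node.backward_fn()

x = Node(2.0)
y = Node(3.0)
z = x * y + x          # forward pass builds the graph
z.backward()           # backward pass: dz/dx = y + 1 = 4, dz/dy = x = 2
```

Real frameworks apply the same idea to tensors rather than scalars, with each matrix operation registering its own backward rule.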
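The optimizer bullet can be sketched with SGD plus momentum; the class name `Momentum` and the `step` interface are assumptions for illustration, not the framework's API.

```python
import numpy as np

# Sketch of an SGD-with-momentum optimizer (hypothetical interface).
class Momentum:
    def __init__(self, params, lr=0.01, beta=0.9):
        self.params = params   # list of numpy arrays, updated in place
        self.lr = lr
        self.beta = beta
        self.velocity = [np.zeros_like(p) for p in params]

    def step(self, grads):
        # v <- beta * v + grad;  p <- p - lr * v
        for p, v, g in zip(self.params, self.velocity, grads):
            v *= self.beta
            v += g
            p -= self.lr * v

w = np.array([1.0, -2.0])
opt = Momentum([w], lr=0.1)
opt.step([np.array([0.5, -0.5])])  # first step: v = grad, so w -= lr * grad
```

Adam follows the same update-in-place pattern, additionally tracking a second-moment estimate per parameter.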
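For the loss-function bullet, a numerically stable softmax cross-entropy can be written in a few lines; the function name is an assumption, not the framework's documented API.

```python
import numpy as np

# Illustrative numerically stable softmax cross-entropy (hypothetical name).
def softmax_cross_entropy(logits, labels):
    """Mean cross-entropy between softmax(logits) and integer class labels."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # avoid exp overflow
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

logits = np.array([[2.0, 1.0, 0.1]])
loss = softmax_cross_entropy(logits, np.array([0]))
```

Log loss is the binary special case of the same formula, applied to a single sigmoid output instead of a softmax over classes.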
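The Ring All-Reduce strategy mentioned above can be demonstrated with a toy single-process simulation of its two phases, scatter-reduce and all-gather. The function name and interface are illustrative assumptions; the framework's distributed trainer API is not shown in this README.

```python
import numpy as np

# Toy single-process simulation of ring all-reduce (hypothetical interface).
def ring_all_reduce(worker_data):
    """Return each worker's copy of the elementwise sum of all vectors."""
    n = len(worker_data)
    chunks = [np.array_split(np.asarray(d, dtype=float).copy(), n)
              for d in worker_data]
    # Scatter-reduce: in step s, worker i sends chunk (i - s) mod n to its
    # right neighbour, which accumulates it.
    for s in range(n - 1):
        for i in range(n):
            c = (i - s) % n
            chunks[(i + 1) % n][c] += chunks[i][c]
    # All-gather: worker i now owns the full sum of chunk (i + 1) mod n;
    # circulate the completed chunks once around the ring.
    for s in range(n - 1):
        for i in range(n):
            c = (i + 1 - s) % n
            chunks[(i + 1) % n][c] = chunks[i][c].copy()
    return [np.concatenate(c) for c in chunks]

grads = [np.array([1.0, 2.0, 3.0]),
         np.array([4.0, 5.0, 6.0]),
         np.array([7.0, 8.0, 9.0])]
reduced = ring_all_reduce(grads)  # every worker ends with the full sum
```

Unlike the Parameter Server approach, no single node aggregates all gradients: each worker sends and receives only 2(n-1)/n of the data, which keeps per-link bandwidth constant as workers are added.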