This project was forked from abhinavs99/standard-nn-for-digits-classification
Code for neural networks from scratch
Standard-NN-for-Digits-Classification
- This project explores a neural network implemented from scratch using NumPy only.
- The networks are used for classification on the digits dataset.
- A comparative report is prepared from the loss plots and accuracies obtained for four activation functions: sigmoid, tanh, linear, and ReLU.
- The output layer is softmax in all cases, trained with cross-entropy loss.
- Each model is trained for 100 epochs.
- t-SNE plots of the last layer are also prepared for all the activations.
- The code is well commented and labelled wherever necessary.
- Main.ipynb: the notebook that presents and generates the results discussed below.
- NN.py: the file containing the neural network implemented from scratch using NumPy only.
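The activations and loss compared in the report can be sketched in NumPy as below. This is an illustrative sketch; the function names and signatures are assumptions and need not match those in NN.py.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def linear(z):
    # Identity activation, used as the "linear" baseline.
    return z

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y_onehot, eps=1e-12):
    # Mean negative log-likelihood of the true class.
    return -np.mean(np.sum(y_onehot * np.log(probs + eps), axis=1))
```

A convenient property of this pairing is that the gradient of cross-entropy with respect to the softmax pre-activations reduces to `probs - y_onehot`, which keeps the backward pass simple.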
| Activation Function | Accuracy |
| --- | --- |
| Sigmoid | 0.967 |
| Tanh | 0.9757 |
| Linear | 0.914 |
| ReLU | 0.9753 |
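A minimal training loop of the kind behind these numbers can be sketched as follows. The hidden-layer size, learning rate, and synthetic stand-in data are assumptions for illustration, not the repository's actual configuration; the backward pass uses the `probs - targets` gradient of softmax with cross-entropy.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(X, y_onehot, hidden=32, lr=0.5, epochs=100):
    # One-hidden-layer network with tanh activation and a softmax output.
    n, d = X.shape
    k = y_onehot.shape[1]
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, k)); b2 = np.zeros(k)
    for _ in range(epochs):
        # Forward pass.
        h = np.tanh(X @ W1 + b1)
        p = softmax(h @ W2 + b2)
        # Backward pass: softmax + cross-entropy gives (p - y) at the output.
        dz2 = (p - y_onehot) / n
        dW2 = h.T @ dz2; db2 = dz2.sum(axis=0)
        dh = dz2 @ W2.T
        dz1 = dh * (1.0 - h**2)   # tanh derivative; swap for other activations
        dW1 = X.T @ dz1; db1 = dz1.sum(axis=0)
        # Gradient-descent updates.
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    return W1, b1, W2, b2

# Tiny synthetic 3-class problem standing in for the digits dataset.
X = rng.normal(size=(150, 8))
labels = rng.integers(0, 3, 150)
X[np.arange(150), labels] += 3.0          # make the classes separable
Y = np.eye(3)[labels]

W1, b1, W2, b2 = train(X, Y)
pred = softmax(np.tanh(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
acc = (pred == labels).mean()
```

In the actual project the same loop would run on the digits features, with the activation swapped among sigmoid, tanh, linear, and ReLU to produce the table above.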
[Loss plots for the Sigmoid, Tanh, Linear, and ReLU activations]
[t-SNE plots of the last layer for the Sigmoid, Tanh, Linear, and ReLU activations]
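The t-SNE visualizations of the last layer can be produced along these lines, assuming `hidden_feats` holds the last hidden layer's activations for the test samples; random features stand in here for illustration.

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Stand-in for the last hidden layer's activations (n_samples x n_hidden).
hidden_feats = rng.normal(size=(100, 32))

# Project the features to 2-D; perplexity must be smaller than n_samples.
emb = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(hidden_feats)

# emb has shape (100, 2) and can be scattered with matplotlib,
# colored by the true digit label, once per activation function.
```

Repeating this once per activation gives the four panels shown above.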