Implementing a neural network and applying it to hand-written digit recognition. The program successfully recognizes hand-written digits and classifies them accordingly.
The cost function for a neural network with 3 layers is given by:
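The equation itself is not reproduced here; for a 3-layer network with K output classes, m training examples, and regularization parameter λ, the standard regularized cross-entropy cost used in the course exercise (bias terms excluded from regularization) is:

```latex
J(\Theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K}
  \left[ -y_k^{(i)} \log\!\left( (h_\Theta(x^{(i)}))_k \right)
         - \left(1 - y_k^{(i)}\right) \log\!\left( 1 - (h_\Theta(x^{(i)}))_k \right) \right]
  + \frac{\lambda}{2m} \sum_{l=1}^{2} \sum_{i}\sum_{j \ge 1} \left( \Theta_{ij}^{(l)} \right)^2
```

Here \(\Theta^{(1)}\) and \(\Theta^{(2)}\) are the weight matrices of the two layer-to-layer mappings, and the inner sums over the regularization term skip the bias column (j = 0).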
The backpropagation algorithm requires computing the gradient of the sigmoid function. Sigmoid Function:
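As a sketch, the sigmoid and its gradient can be written in a few lines (a Python illustration; the original exercise is in Octave/MATLAB):

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid g(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    """Derivative g'(z) = g(z) * (1 - g(z)), used by backpropagation."""
    g = sigmoid(z)
    return g * (1.0 - g)

print(sigmoid_gradient(0.0))  # maximum value 0.25 at z = 0
```

The gradient peaks at 0.25 when z = 0 and vanishes for large |z|, which is why well-scaled random weight initialization matters for training.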
Gradients with Regularization:
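The gradient expressions are likewise not shown in the text; with \(\Delta^{(l)}\) the error accumulators built up by backpropagation, the regularized gradients take the standard form (the bias column j = 0 is not regularized):

```latex
\frac{\partial J}{\partial \Theta_{ij}^{(l)}} = D_{ij}^{(l)} =
\begin{cases}
  \dfrac{1}{m}\, \Delta_{ij}^{(l)} & \text{for } j = 0 \\[6pt]
  \dfrac{1}{m}\, \Delta_{ij}^{(l)} + \dfrac{\lambda}{m}\, \Theta_{ij}^{(l)} & \text{for } j \ge 1
\end{cases}
```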
An important part of using a neural network is numerically validating your implementation to confirm the gradients are correct. The numerical check computes the derivatives with an alternative, finite-difference approach:
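A minimal sketch of this check in Python: each parameter is perturbed by a small epsilon and the cost difference approximates the partial derivative, which is then compared against the analytic gradient (a toy quadratic cost stands in for the network cost here):

```python
import numpy as np

def numerical_gradient(cost_fn, theta, eps=1e-4):
    """Central-difference approximation: dJ/dtheta_i ~ (J(theta+e) - J(theta-e)) / (2*eps)."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (cost_fn(theta + e) - cost_fn(theta - e)) / (2.0 * eps)
    return grad

# Example: check against the known analytic gradient of J(theta) = sum(theta^2)
theta = np.array([1.0, -2.0, 3.0])
numeric = numerical_gradient(lambda t: np.sum(t ** 2), theta)
analytic = 2.0 * theta
print(np.max(np.abs(numeric - analytic)))  # very small difference expected
```

In practice the two gradients should agree to many decimal places; a large discrepancy points to a bug in the backpropagation code.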
The gradient check is run on a few test values to confirm the backpropagation implementation is correct, then disabled, since it is computationally expensive.
This implementation then uses fmincg to learn the parameter values.
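fmincg is the conjugate-gradient minimizer supplied with the Octave/MATLAB course materials. As a hedged illustration only, a comparable optimization step in Python could use SciPy's conjugate-gradient method with a cost function that returns both the cost and its gradient (the quadratic here is a stand-in for the actual neural-network cost):

```python
import numpy as np
from scipy.optimize import minimize

def cost_and_grad(theta):
    """Toy stand-in for the neural-network cost: J = sum((theta - 3)^2)."""
    return np.sum((theta - 3.0) ** 2), 2.0 * (theta - 3.0)

theta0 = np.zeros(5)  # initial (unrolled) parameter vector
res = minimize(cost_and_grad, theta0, jac=True, method="CG",
               options={"maxiter": 50})
print(res.x)  # converges toward [3, 3, 3, 3, 3]
```

Like fmincg, this uses only cost and gradient evaluations, which is why an efficient and correct backpropagation routine is the main prerequisite for training.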
Based on Stanford's Machine Learning course taught by Professor Andrew Ng.