----------------------------------------------------------------------------------------->
Deep learning is sweeping through the scientific community. While common frameworks such as TensorFlow and PyTorch offer all the functionality needed to create and train neural networks, the arithmetic operations are encapsulated inside their building blocks, hiding the fundamental working mechanism, i.e. the actual calculations happening in a neural network, from us. As an AI practitioner, I expect to work with neural networks for at least the next ten years, so I think it is necessary to understand the basic operations inside the big black box. To this end, I created this repository to record my exercises in developing neural networks from scratch using NumPy, on the MNIST dataset (http://yann.lecun.com/exdb/mnist/).
It aims to cover the following implementations in Jupyter notebooks:
-- Fully Connected Neural Network (MLP)
-- Convolutional Neural Network (CNN)
-- Recurrent Neural Network (RNN with LSTM, GRU)
-- Generative Neural Network (GAN)
-- AutoEncoders (AE)
-- More to be added.
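To give a flavour of what "from scratch with NumPy" means here, below is a minimal sketch of a one-hidden-layer MLP with an explicit forward and backward pass. The layer sizes follow MNIST (784 inputs, 10 classes), but the data is random and the architecture, activation, and learning rate are illustrative choices, not the exact ones used in the notebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy batch standing in for MNIST images and labels.
X = rng.standard_normal((64, 784))
y = rng.integers(0, 10, size=64)
Y = np.eye(10)[y]                         # one-hot labels

# Parameters for a 784 -> 32 -> 10 network, small random init.
W1 = rng.standard_normal((784, 32)) * 0.01
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 10)) * 0.01
b2 = np.zeros(10)

lr = 0.5
losses = []
for _ in range(50):
    # Forward pass: affine -> sigmoid -> affine -> softmax.
    h = sigmoid(X @ W1 + b1)
    p = softmax(h @ W2 + b2)
    loss = -np.mean(np.sum(Y * np.log(p + 1e-12), axis=1))  # cross-entropy
    losses.append(loss)

    # Backward pass: softmax + cross-entropy gives (p - Y) at the output.
    dlogits = (p - Y) / len(X)
    dW2 = h.T @ dlogits
    db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T
    dz1 = dh * h * (1 - h)                # sigmoid derivative
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Every quantity here is a plain NumPy array, so each gradient line can be checked by hand against the chain rule, which is exactly the kind of exercise the notebooks walk through.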
Please contact Baiqiang XIA at [email protected] about any ambiguities or mistakes. Discussions and feedback are always welcome!
Dr. Baiqiang XIA
12/Jan/2019