Feed Forward Neural Networks using NumPy

This library is a modification of my previous one. Click Here to check it out.

Installation

$ [sudo] pip3 install nicenet

Development Installation

$ git clone https://github.com/Subhash3/Neural_Net_Using_NumPy.git

Usage

>>> from nicenet import NeuralNetwork

Creating a Neural Network

inputs = 2
outputs = 1
network = NeuralNetwork(inputs, outputs, cost="mse")

# Add 2 hidden layers with 16 neurons each and activation function 'tanh'
network.addLayer(16, activation_function="tanh") 
network.addLayer(16, activation_function="tanh")

# Finish the network by adding the output layer with a sigmoid activation function.
network.compile(activation_function="sigmoid")

Building a dataset

The package contains a Dataset class to create a dataset.

>>> from nicenet import Dataset

Make sure the inputs and target values are in separate files in CSV format.

input_file = "inputs.csv"
target_file = "targets.csv"

# Create a dataset object with the same number of inputs and outputs as the network.
datasetCreator = Dataset(inputs, outputs)
datasetCreator.makeDataset(input_file, target_file)
data, size = datasetCreator.getRawData()
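
The returned data and size are exactly what the Train method (described below) expects. You can also inspect a sample to verify the format; the comments here describe what the samples presumably contain, following the rules below:

>>> print(size)     # number of samples in the dataset
>>> print(data[0])  # first sample: presumably a tuple of (input, target) column vectors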

If you want to make a dataset manually, follow these rules:

  • The dataset must be a list of data samples.
  • A data sample is a tuple containing the input and target values.
  • Input and target values are column vectors of shape (inputs x 1) and (outputs x 1) respectively.

For example, a typical XOR dataset looks something like this:

>>> import numpy as np
>>> XOR_data = [
    (
        np.array([[0], [0]]),
        np.array([[0]])
    ),
    (
        np.array([[0], [1]]),
        np.array([[1]])
    ),
    (
        np.array([[1], [0]]),
        np.array([[1]])
    ),
    (
        np.array([[1], [1]]),
        np.array([[0]])
    )
]
>>> size = 4
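
A dataset built by hand in this way can be passed straight to the Train method described in the next section. For instance, with the 2-input, 1-output network created above, training on this XOR dataset might look like:

>>> network.Train(XOR_data, size, epochs=5000)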

Training the Network

The library provides a Train method which accepts the dataset, the dataset size, and two optional parameters: epochs and logging.

def Train(dataset, size, epochs=5000, logging=True):
    ...

For example, to train your network for 1000 epochs:

>>> network.Train(data, size, epochs=1000)

Notice that I didn't change the value of logging, as I want the output to be printed for each epoch.
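
If you would rather not print the error on every epoch, the Train signature above suggests you can simply turn that flag off (the Todo list below mentions that a progress bar is shown instead when epoch logging is disabled):

>>> network.Train(data, size, epochs=1000, logging=False)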

Debugging

Plot a nice epoch vs error graph

>>> network.epoch_vs_error()

Know how well the model performed.

>>> network.evaluate()

To take a look at all the layers' info

>>> network.display()

Sometimes, the learning rate might have to be adjusted for better convergence.

>>> network.setLearningRate(0.1)

Exporting Model

You can export a trained model to a JSON file, which can be loaded and used for predictions later.

filename = "model.json"
network.export_model(filename)

Load Model

To load a model from an exported (JSON) model file, use load_model. It is a static method, so do not call it on a NeuralNetwork object; call it on the class itself.

filename = "model.json"
network = NeuralNetwork.load_model(filename)
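
The loaded network behaves like any other NeuralNetwork instance, so the debugging helpers shown above should work on it as well, for example:

>>> network.display()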

Todo

- [x] Generalize the gradient descent algorithm
    - [x] Generalize the loss function => Write a separate class for it!
- [x] Implement Cross Entropy Loss
- [ ] Data scaling
    - [x] Min Max scaler
    - [ ] Data Standardization
- [x] Change the data sample type to a tuple instead of a list.
- [x] Show Progress bar if epoch_logging is False
- [x] Use a function as a parameter to Train method to compare predictions and actual targets.
- [ ] Convert all camelCase variables to snake_case.
    - [ ] NeuralNetwork.py
    - [ ] Layer.py
    - [ ] Dataset.py
    - [ ] ActivationFunction.py
    - [ ] Utils.py
    - [ ] LossFunctions.py

- [ ] API docs
    - [x] Add doc strings to all functions.
    - [x] Make the class/function declarations' docs collapsable.
    - [ ] Merge API md files and embed them in Readme.
    - [ ] Create a section, API, in README to provide documentation for all prototypes.

- [ ] Implement Batch Training
- [ ] Write a separate class for Scalers as the scaling methods increase.
- [ ] Linear and ReLU activation functions
- [ ] Ability to perform regression
- [ ] Separate the output layer from the other layers => create a separate class for the output layer that inherits from Layer.


- [ ] Convolution Nets
- [ ] Recurrent Nets
