BrabeNetz

BrabeNetz is a supervised neural network written in C++, designed to be as fast as possible by operating on bare values instead of objects.

For performance reasons, BrabeNetz performs no bounds or error checking, so be careful what you feed it.

I've currently trained it to solve XOR; once that works reliably, I'll train it to recognize handwritten characters.

Be sure to read the network description.

Benchmarks

It's pretty fast (TODO)

Goals

  • Fast Feed-Forward algorithm
  • Fast Backwards-Propagation algorithm
  • Easy to use (Inputs, outputs)
  • Pointer arrays instead of std::vector
  • Fast binary network state saving via state.nn file (Weights, Biases, Sizes)
  • Multithreaded if worth the spawn-overhead (std::thread or NVIDIA CUDA)
  • Scalability (Neuron size, Layer count) - only limited by hardware

Specs

  • Randomly generated values to begin with
  • Easily save/load with network::save(string)/network::load(string)
  • Sigmoid squashing function (TODO: ReLU?) - see the sketch after this list
  • Biases for each neuron
  • network_topology helper objects for loading/saving state and inspecting network (TODO: Remove totally?)
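
For reference, the squashing function mentioned above is the standard logistic sigmoid. The following is a minimal sketch of it and of the derivative form typically used during Backwards-Propagation; it illustrates the math only and is not necessarily the project's exact implementation:

```cpp
#include <cmath>

// Standard logistic sigmoid: squashes any real value into the range (0, 1)
double squash(double x)
{
	return 1.0 / (1.0 + std::exp(-x));
}

// Derivative of the sigmoid expressed in terms of its output y = squash(x);
// Backwards-Propagation typically uses this form to scale the error signal
double squash_derivative(double y)
{
	return y * (1.0 - y);
}
```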

Usage

  1. Constructors

    • network(initializer_list<int>): Create a new neural network with the given topology vector and fill it with random numbers ({ 2, 3, 4, 1 } = 2 input, 3 hidden, 4 hidden, 1 output neurons - 4 layers in total)
    • network(network_topology&): Create a new neural network with the given network topology and load its values
    • network(string): Create a new neural network from the given path to a state.nn file and load it.
  2. Functions

    • double* feed(double* input_values, int length, int& out_length): Feed the network input_values and return an array of output values (out_length will be set to the length of the returned array, which is the size of the output layer in the topology)
    • double train(double* input_values, int length, double* expected_output): Feed the network input_values, compare the predicted output with expected_output, and Backwards-Propagate (adjust weights/biases) if needed. Returns the total error of the output layer.
    • void save(string path): Save the current network state (topology, weights, biases) to disk (at the given path, or the default: state.nn)
    • void set_learnrate(double value): Set the learn rate of the network (used by the train(..) function). It should always be 1 / (total number of training iterations + 1)

An example of this can be found here and here; a minimal usage sketch is also shown below.
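
The sketch below ties the constructors and functions listed above together for the XOR case. The network.h header name, the training-loop structure, and the heap-allocated return of feed(..) are assumptions for illustration, not details confirmed by the project:

```cpp
#include "network.h" // assumed header name for the BrabeNetz network class

int main()
{
	// Build a 2-3-1 network (2 inputs, 3 hidden neurons, 1 output),
	// initialized with random weights and biases
	network net({ 2, 3, 1 });

	// XOR training data (illustrative)
	double inputs[4][2] = { { 0, 0 }, { 0, 1 }, { 1, 0 }, { 1, 1 } };
	double expected[4][1] = { { 0 }, { 1 }, { 1 }, { 0 } };

	const int iterations = 10000;
	for (int i = 0; i < iterations; i++)
	{
		// Decay the learn rate as suggested: 1 / (total train times + 1)
		net.set_learnrate(1.0 / (i + 1));

		// Train on each XOR sample; train(..) returns the output-layer error
		for (int s = 0; s < 4; s++)
			net.train(inputs[s], 2, expected[s]);
	}

	// Feed a sample and read the prediction (expect a value close to 1 for { 0, 1 })
	int out_length = 0;
	double* output = net.feed(inputs[1], 2, out_length);

	// Persist the trained state (topology, weights, biases) to disk
	net.save("state.nn");

	delete[] output; // assumes feed(..) returns a heap-allocated array
	return 0;
}
```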
