
Genetic Neural Network

In an industrial process, a particular feature can only be measured at the end of the process. This program uses the data from three sensors to predict the outcome of the system. To do so, a neural network has been constructed whose weights (synapses) are updated by a genetic algorithm. The artificial neural network starts with random initial values for the weights, which are then trained against several training data sets. Finally, once training has finished, the accuracy of the predictions produced by the network is evaluated against a separate list of test data sets.
The multilayer network features an input layer, 21 hidden layers and an output layer. Each layer contains 7 neurons (the layer width), every one of which relies on 3 inputs (dendrites). Each input is multiplied by its weight (synapse), and the weighted inputs are summed into a result that is evaluated to decide whether it activates the threshold unit (perceptron). A sigmoid function is used for this purpose, since it is smoother than a step function or other linear approaches. This function has some very useful properties: it is stable (large changes in the input do not produce wildly varying outputs) while still being steep near the threshold.
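As a rough illustration of the forward computation described above, the sketch below shows how a single neuron could combine its 3 weighted inputs and apply the sigmoid activation. It is a minimal Python example; the function and variable names are hypothetical and are not taken from the project's source code.

```python
import math

def sigmoid(x):
    # Smooth threshold: maps any real value into the (0, 1) interval
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # Each of the 3 inputs (dendrites) is multiplied by its weight (synapse),
    # the products are summed, and the sigmoid decides the activation level.
    weighted_sum = sum(i * w for i, w in zip(inputs, weights))
    return sigmoid(weighted_sum)

# Example: one neuron fed by 3 sensor readings with 3 weights in [-1, 1]
print(neuron_output([0.5, 0.2, 0.9], [0.1, -0.4, 0.7]))
```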
The genetic algorithm is applied to the weights (synapses) of every neuron in the artificial neural network. An initial random population of 6 chromosomes is created with values in the [-1, 1] interval; each chromosome represents a candidate solution for a weight. For every training data set, an output is calculated from the given inputs. This output is compared with the expected result (which is known beforehand) to compute the fitness of each proposed solution. The individuals with the best fitness are selected into a mating pool of 4 chromosomes. Then, in the crossover phase, they are combined using arithmetic crossover, so that two different offspring are obtained by linear combination. Afterwards, a mutation is applied to all of the weights.
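The following Python sketch outlines one way the evolution loop described above could look (population of 6, mating pool of 4, arithmetic crossover, mutation of every weight). It is a simplified illustration under assumed parameter values such as the mutation step, not the project's actual implementation; the `evaluate` callable stands in for however a candidate weight is scored against the expected output.

```python
import random

POPULATION_SIZE = 6   # initial random population of chromosomes
MATING_POOL_SIZE = 4  # best individuals kept for crossover
MUTATION_STEP = 0.05  # assumed mutation magnitude (not specified in the text)

def init_population():
    # Each chromosome is a candidate weight value in the [-1, 1] interval
    return [random.uniform(-1.0, 1.0) for _ in range(POPULATION_SIZE)]

def fitness(weight, evaluate, expected):
    # Higher fitness when the produced output is closer to the expected result
    return -abs(evaluate(weight) - expected)

def evolve(population, evaluate, expected):
    # Selection: the chromosomes with the best fitness form the mating pool
    pool = sorted(population,
                  key=lambda w: fitness(w, evaluate, expected),
                  reverse=True)[:MATING_POOL_SIZE]

    # Arithmetic crossover: two offspring as linear combinations of two parents
    a, b = random.sample(pool, 2)
    alpha = random.random()
    offspring = [alpha * a + (1.0 - alpha) * b,
                 (1.0 - alpha) * a + alpha * b]

    # Mutation: a small random perturbation is applied to every weight
    return [w + random.uniform(-MUTATION_STEP, MUTATION_STEP)
            for w in pool + offspring]
```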
In order to elucidate the way in which the genetic algorithms should be integrated into the neural network, thus replacing the typical backpropagation algorithm, the following research papers and sources were used:
 - "Evolutionary Algorithm for Optimal Connection Weights in Artificial Neural Networks", by G. V. R. Sagar, Dr. S. Venkata Chalam &  Manoj Kumar Singh.
 - "Genetic Algorithms in Artificial Neural Networks", by Bukarica Leto
 - "Methods of Combining Neural Networks and Genetic Algorithms", by Talib S. Hussain
 
After a preliminary (yet fully functional) implementation of the artificial neural network was built, the data structures supporting it were redesigned, along with the way each neuron performed its computational work during each iteration of the training and testing stages. The first approach, focused on the basic structure of the neural network and the algorithms used to calibrate the perceptrons' input weights, traversed each neuron in a purely sequential way, starting from the first hidden layer and continuing until the output layer produced an output value. Artificial neural networks are an extremely powerful tool, not only because of their practicality and flexibility, but also because they are particularly well suited to parallelization. Therefore, the underlying structure of the artificial neural network (essentially a graph) was redesigned as a pipeline, taking advantage of its multilayer architecture. As the term "pipeline" suggests, all of the layers execute concurrently, benefiting from multicore and multiprocessor host architectures. This allowed the artificial neural network to complete its whole processing work in substantially less time than the original implementation.
In a later optimization, every neuron inside a layer was made to compute concurrently along with its peers (following a "wait for the last one to finish" policy), increasing the artificial neural network's throughput even further.
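The sketch below illustrates the intra-layer concurrency idea under the "wait for the last one to finish" policy: every neuron in a layer is evaluated in parallel, and the layer only forwards its outputs once all of its neurons have finished. This is a simplified Python illustration using a thread pool; the structure and names are assumptions, not the project's actual code, and it only parallelizes the neurons within each layer rather than overlapping whole layers across successive samples as a full pipeline would.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def neuron_output(inputs, weights):
    # Weighted sum of the inputs followed by the sigmoid activation
    return 1.0 / (1.0 + math.exp(-sum(i * w for i, w in zip(inputs, weights))))

def run_layer(layer_weights, layer_inputs, executor):
    # Every neuron of the layer is submitted to the pool and computed concurrently
    futures = [executor.submit(neuron_output, layer_inputs, weights)
               for weights in layer_weights]
    # "Wait for the last one to finish" before handing the outputs onward
    return [f.result() for f in futures]

def forward(network_weights, inputs):
    # Layers are chained like pipeline stages: each layer's outputs feed the next
    with ThreadPoolExecutor() as executor:
        activations = inputs
        for layer_weights in network_weights:
            activations = run_layer(layer_weights, activations, executor)
        return activations
```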
