neuralnetworks


Neural networks library for Clojure, built on top of the core.matrix array programming API. See the API Documentation for details.

Currently it has the following features:

  • Regularization
  • Swappable optimizer. Currently it only supports Gradient Descent with Backtracking Line Search; more optimizers will be added in the future
  • Multiple stopping conditions. Currently it supports stopping conditions based on error or on the number of iterations. If multiple stopping conditions are provided, they are treated as OR: the optimizer stops training as soon as either condition is fulfilled (see the sketch after this list)
  • Swappable activation/sigmoid function. Currently it has two built-in functions, with the standard logistic function as the default
  • Swappable cost/error function. Currently it has two functions:
    • Cross-entropy - suitable for classification problems, where it penalizes misclassification
    • Mean squared error - suitable for regression problems (curve fitting)
  • The cost function accepts varargs and responds to a :skip-gradients argument if provided. This prevents the neural network from performing back-propagation (used in line search)
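
As a sketch of how multiple stopping conditions combine, the call below passes both an error-based and an iteration-based condition, so training stops as soon as either one is satisfied. max-error is taken from the usage example below; max-iterations is an assumed name for the iteration-based condition and may differ in the actual library.

;; Sketch only: OR semantics - the optimizer stops as soon as either
;; condition is fulfilled. `max-iterations` is an assumed name for the
;; iteration-based stopping condition.
(nn/train! instance [(max-error 0.01)
                     (max-iterations 1000)])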

Usage

The following is an example of how to use the library to train a network for the AND function:

(require '[neuralnetworks.core :as nn])
(require '[clojure.core.matrix :as m])
(use '[neuralnetworks.stopping-conditions])

(let [input    (m/array [[0 0]
                         [0 1]
                         [1 0]
                         [1 1]])
      thetas   (nn/randomize-thetas 2 [3] 1) ; 2 inputs, one hidden layer of 3 units, 1 output
      output   (m/array [[0]
                         [0]
                         [0]
                         [1]])
      options  {}
      instance (nn/new-instance input thetas output :classification options)]

  (prn "Before training: " (nn/predict instance input))
  (nn/train! instance [(max-error 0.01)])
  (prn "After training: " (nn/predict instance input)))

If an empty map is provided as the options, the default settings are used. Currently these are the available options:

  • :regularization-rate (or lambda) - default value is 0.0
  • :sigmoid-fn - default value is standard logistic function
  • :optimizer - default value is gradient descent with the following settings
    • learning rate of 8
    • learning rate update rate of 0.5

Example of options

(use '[neuralnetworks.sigmoid-fn])
(use '[neuralnetworks.optimizer.gradient-descent])

(def options {:sigmoid-fn (standard-logistic)
              :regularization-rate 0.001
              :optimizer (gradient-descent 8 0.5)})

Examples

neuralnetworks-examples


Issues

Improve performance

Currently the performance is terrible

When training a network on iris.csv (a 60x4 matrix) using the standard settings, it takes a very long time to complete:

| :iteration | :training-error | :cv-error | :test-error | :training-time |
|------------+-----------------+-----------+-------------+----------------|
|          0 |        1.957541 |  2.008086 |    2.022220 |              0 |
|        100 |        0.642222 |  0.597090 |    0.508589 |           5843 |
|        200 |        0.368939 |  0.401693 |    0.249231 |          14659 |
|        400 |        0.198466 |  0.276737 |    0.087201 |          28882 |
|        800 |        0.128815 |  0.216054 |    0.031734 |          58551 |
|       1600 |        0.098140 |  0.218428 |    0.015926 |         138371 |
|       3200 |        0.075886 |  0.224617 |    0.008630 |         256058 |

This is already using the vectorz implementation, which is faster than the default persistent-vector implementation.
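
For reference, selecting the vectorz implementation for core.matrix looks like the sketch below, assuming the net.mikera/vectorz-clj dependency is already on the classpath:

(require '[clojure.core.matrix :as m])

;; Switch core.matrix to the vectorz implementation, a dense
;; double-based backend that is typically much faster than the
;; default persistent-vector implementation.
(m/set-current-implementation :vectorz)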

Each layer can have its own sigmoid function

Currently the sigmoid function applies to all layers (hidden layers and output). We should be able to use a different activation function for each layer so that a ReLU network can be implemented.
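
For illustration only, an element-wise ReLU (and its derivative, which back-propagation would also need) can be written over core.matrix arrays as below; how such a function would plug into the library's per-layer activation API is exactly the open question of this issue:

(require '[clojure.core.matrix :as m])

;; Illustrative sketch of a rectified linear unit, applied element-wise.
(defn relu [x]
  (m/emap #(max 0.0 %) x))

;; Its derivative, which back-propagation would also need.
(defn relu-derivative [x]
  (m/emap #(if (pos? %) 1.0 0.0) x))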
