neuralnetworks

Neural networks library for Clojure, built on top of the core.matrix array programming API. API Documentation

Currently it has the following features:

  • Regularization
  • Swappable optimizer. Currently only Gradient Descent with Backtracking Line Search is supported. More optimizers will be added in the future
  • Multiple stopping conditions. Currently it supports stopping conditions based on error or number of iterations. If multiple stopping conditions are provided, they are treated as OR: the optimizer stops training as soon as any one of them is fulfilled
  • Swappable activation/sigmoid function. Two functions are provided; the standard logistic function is the default
  • Swappable cost/error function. Currently it has 2 functions:
    • Cross-entropy - suitable for classification problems; it heavily penalizes confident mis-classification
    • Mean squared error - suitable for regression problems (curve fitting)
  • Cost function accepts varargs and responds to a :skip-gradients argument if provided. This prevents the neural network from performing back-propagation (used during line search)
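To illustrate why the two cost functions suit different problems, the per-sample formulas can be written directly in Clojure. This is an illustrative sketch of the underlying math, not the library's internal implementation:

```clojure
;; Illustrative per-sample cost formulas (not the library's internals).
(defn cross-entropy [h y]
  ;; -(y*log(h) + (1-y)*log(1-h)): grows without bound as a confident
  ;; prediction moves toward the wrong class
  (- (+ (* y (Math/log h))
        (* (- 1 y) (Math/log (- 1 h))))))

(defn mean-squared-error [h y]
  ;; (h - y)^2 / 2: squared distance between prediction and target
  (/ (* (- h y) (- h y)) 2))

;; A confident wrong prediction (h = 0.9 for target 0) is punished far
;; more by cross-entropy than by mean squared error:
(cross-entropy 0.9 0)       ;; => ~2.303
(mean-squared-error 0.9 0)  ;; => 0.405
```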

Usage

The following is an example of how to use the library to train a network for the AND function:

(require '[neuralnetworks.core :as nn])
(require '[clojure.core.matrix :as m])
(use '[neuralnetworks.stopping-conditions])

(let [input    (m/array [[0 0]
                         [0 1]
                         [1 0]
                         [1 1]])
      thetas   (nn/randomize-thetas 2 [3] 1) ; 2 inputs, one hidden layer of 3 units, 1 output
      output   (m/array [[0]
                         [0]
                         [0]
                         [1]])
      options  {}
      instance (nn/new-instance input thetas output :classification options)]

  (prn "Before training: " (nn/predict instance input))
  (nn/train! instance [(max-error 0.01)])
  (prn "After training: " (nn/predict instance input)))
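Because multiple stopping conditions are OR-ed, the training call above could also be bounded by an iteration limit. The `max-iterations` name below is an assumption based on the iteration-based condition mentioned in the feature list; check `neuralnetworks.stopping-conditions` for the exact function name:

```clojure
;; Training stops as soon as EITHER condition is fulfilled:
;; the error drops below 0.01, or 5000 iterations have run.
;; (max-iterations is an assumed name; see neuralnetworks.stopping-conditions)
(nn/train! instance [(max-error 0.01) (max-iterations 5000)])
```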

If an empty map is provided as the options, the default settings are used. Currently these are the available options:

  • :regularization-rate (or lambda) - default value is 0.0
  • :sigmoid-fn - default value is standard logistic function
  • :optimizer - default value is gradient descent with the following settings
    • learning rate of 8
    • learning rate update rate of 0.5

Example of options

(use '[neuralnetworks.sigmoid-fn])
(use '[neuralnetworks.optimizer.gradient-descent])

(def options {:sigmoid-fn (standard-logistic)
              :regularization-rate 0.001
              :optimizer (gradient-descent 8 0.5)})
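An options map built this way is passed as the last argument to `new-instance`, exactly as in the AND example above:

```clojure
;; input, thetas, and output as constructed in the AND example
(def instance (nn/new-instance input thetas output :classification options))
```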

Examples

neuralnetworks-examples