This project provides an idiomatic, functional implementation of a deep-learning neural network trained with backpropagation, along with helper functions that make these networks easy to use.
The focus is on legibility, not execution speed. For production use, a mature library like Encog is recommended; Enclog is a nice Clojure wrapper around it. While I've added a fair few comments throughout the code, they may only be enough to guide someone who already has a good grasp of backpropagation.
My backpropagation algorithm is based on the imperative algorithm described by Russell and Norvig in their excellent textbook, "Artificial Intelligence: A Modern Approach".
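For orientation, the core per-weight update in that style of algorithm looks roughly like the sketch below. These are illustrative names only, not this project's actual API:

(defn sigmoid
  "Standard logistic activation function."
  [x]
  (/ 1.0 (+ 1.0 (Math/exp (- x)))))

(defn output-delta
  "Error term for an output node: (expected - actual) scaled by the
  derivative of the sigmoid, which is actual * (1 - actual)."
  [expected actual]
  (* (- expected actual) actual (- 1.0 actual)))

(defn updated-weight
  "Delta rule: w <- w + learning-rate * delta * input-activation."
  [w learning-rate delta input-activation]
  (+ w (* learning-rate delta input-activation)))

Hidden-layer deltas are computed the same way, except the error term is the weighted sum of the deltas in the layer above rather than (expected - actual).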
For usage, see the example code below, which trains a net to learn the AND, OR, XOR, and NOR logic gates, and inspects the error before and after training.
(def +inputs+ [[1 1] [1 0] [0 1] [0 0]])
;; AND, OR, XOR, NOR gates
(def +expecteds+ [[1, 1, 0, 0] [0, 1, 1, 0] [0, 1, 1, 0] [0, 0, 0, 1]])
(def +untrained-logic-network+
  (initialize-network-weights 2 3 4)) ;; 2 input nodes, 3 hidden nodes, 4 output nodes
;; try, e.g., 2 3 3 4 for two hidden layers
(def +trained-logic-network+
  (nth (train +inputs+ +expecteds+ +untrained-logic-network+)
       100)) ;; take the 100th trained network (train returns an infinite seq)
(set-error +inputs+ +expecteds+ +untrained-logic-network+) ;; get the error before
(set-error +inputs+ +expecteds+ +trained-logic-network+) ;; and after training
NB: Momentum was purposely left out. Adding it would be trivial, but keeping a copy of the previous round's weight changes would obfuscate the code.
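For the curious, the momentum version of the weight update would look roughly like this sketch; alpha and the previous-change bookkeeping are exactly the extra state the note above refers to (again, illustrative names only):

(defn momentum-weight-change
  "Weight change with momentum (not implemented in this project):
  the current gradient step plus a fraction alpha of the previous
  round's weight change."
  [learning-rate delta input-activation alpha previous-change]
  (+ (* learning-rate delta input-activation)
     (* alpha previous-change)))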