DeepLearning.scala


DeepLearning.scala is a DSL for creating complex neural networks.

With DeepLearning.scala, regular programmers can build complex neural networks from simple code. You write code almost as usual; the only difference is that code based on DeepLearning.scala is differentiable, which lets it evolve by continuously adjusting its parameters.

Features

Differentiable basic types

Like Theano and other deep learning toolkits, DeepLearning.scala allows you to build neural networks from mathematical formulas. It supports floats, doubles, GPU-accelerated N-dimensional arrays, and calculates derivatives of the weights in the formulas.
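To give a feel for what "a differentiable formula" means, here is a minimal forward-mode sketch in plain Scala using dual numbers. This is an illustration of the concept only, not the DeepLearning.scala API:

```scala
// A dual number carries a value together with its derivative.
// This is a toy sketch of differentiability, NOT the library's API.
case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
  def *(that: Dual): Dual =
    Dual(value * that.value,
         derivative * that.value + value * that.derivative) // product rule
}

// f(w) = w * w + w, differentiated at w = 3.0
val w = Dual(3.0, 1.0) // seed derivative dw/dw = 1
val y = w * w + w
// y.value = 12.0, y.derivative = 7.0 (since f'(w) = 2w + 1)
```

DeepLearning.scala itself uses a more general reverse-mode (backpropagation) scheme, but the core idea is the same: the formula computes both its result and how that result changes with respect to its weights.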

Differentiable ADTs

Neural networks created by DeepLearning.scala support ADT data structures (e.g. HList and Coproduct), and calculate derivatives through these data structures.
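The same dual-number sketch shows how derivative information can flow through an ordinary product type. Again, this is a hypothetical plain-Scala illustration (using a tuple in place of an HList), not the DeepLearning.scala API:

```scala
// Toy forward-mode dual number, as above; not the library's API.
case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
  def *(that: Dual): Dual =
    Dual(value * that.value,
         derivative * that.value + value * that.derivative)
}

// A "layer" returning a product of two outputs (a stand-in for an HList)
def layer(w: Dual): (Dual, Dual) = (w * w, w + w)

// A consumer destructures the ADT and combines its parts;
// the derivative flows through the structure unharmed.
val (sq, dbl) = layer(Dual(3.0, 1.0))
val out = sq + dbl
// out.value = 15.0, out.derivative = 8.0 (d/dw of w^2 + 2w at w = 3)
```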

Differentiable control flow

Neural networks created by DeepLearning.scala may contain control flow such as if/else and match/case, just as in a regular language. Combined with ADT data structures, you can implement arbitrary algorithms inside neural networks while keeping some of the variables used in those algorithms differentiable and trainable.
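As a concept sketch (again in plain Scala, not the library's API), here is how a derivative can follow ordinary control flow: whichever branch an if/else takes decides which derivative is propagated.

```scala
// Toy dual number; not the DeepLearning.scala API.
case class Dual(value: Double, derivative: Double)

// relu written as a regular if/else: the derivative is 1 on the
// positive branch and 0 on the other.
def relu(x: Dual): Dual = if (x.value > 0) x else Dual(0.0, 0.0)

val pos = relu(Dual(2.0, 1.0))  // positive branch keeps the gradient
val neg = relu(Dual(-2.0, 1.0)) // negative branch blocks it
```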

Composability

Neural networks created by DeepLearning.scala are composable. You can create large networks by combining smaller ones. If two larger networks share a sub-network, the weights of that shared sub-network trained via one network also affect the other.
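The effect of weight sharing can be sketched in a few lines of plain Scala. This toy (a hand-computed gradient step on a shared mutable weight) is not the DeepLearning.scala API; it only illustrates why training one network moves the other:

```scala
// A weight shared by two "networks"; not the library's API.
class Weight(var value: Double)

val shared = new Weight(1.0)
def netA(x: Double): Double = shared.value * x       // y = w * x
def netB(x: Double): Double = shared.value * x + 1.0 // y = w * x + 1

// One gradient-descent step on netA toward target 0 at x = 2:
// loss = (w*x)^2 / 2, so dLoss/dw = (w*x) * x
val x = 2.0
val grad = netA(x) * x       // = 4.0
shared.value -= 0.1 * grad   // w: 1.0 -> 0.6

// netB's output has also moved, because the weight is shared.
```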

Static type system

All of the above features are statically type checked.

Roadmap

v1.0

Version 1.0 is the current version, with all of the above features. The final version will be released in January 2017.

v2.0

  • Support for/while and other higher-order functions on differentiable Seqs.
  • Support for/while and other higher-order functions on GPU-accelerated differentiable N-dimensional arrays.

Version 2.0 will be released in March 2017.

v3.0

  • Support using custom case classes inside neural networks.
  • Support distributed models and distributed training on Spark.

Version 3.0 will be released in late 2017.

Links

Acknowledgements

DeepLearning.scala is heavily inspired by my colleague @MarisaKirisame.

@milessabin's shapeless provides a solid foundation for type-level programming as used in DeepLearning.scala.