
DynaML: ML + JVM + Scala




DynaML is a Scala & JVM Machine Learning toolbox for research, education & industry.



Motivation

  • Interactive. Don't want to create Maven/sbt project skeletons every time you want to try out ideas? Create and execute Scala worksheets in the DynaML shell. DynaML comes packaged with a customized version of the Ammonite REPL, with auto-completion, file operations and scripting capabilities.

  • End to End. Create complex pre-processing pipelines with the data pipes API, train models (deep nets, Gaussian processes, linear models and more), optimize over hyper-parameters, evaluate model predictions and visualise results.

  • Enterprise Friendly. Take advantage of the JVM and Scala ecosystem: use Apache Spark to write scalable data analysis jobs and TensorFlow for deep learning, all in the same toolbox.
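
To give a flavour of the data pipes idea, here is a minimal self-contained sketch. Note that this is a simplified stand-in, not DynaML's actual `DataPipe` API (which lives in the `dynaml-pipes` module and is considerably richer); the `Pipe` class, the `>` operator and the stage names below are illustrative assumptions.

```scala
object PipeSketch {
  // A pipe wraps a typed transformation and supports chaining,
  // mirroring (in spirit) DynaML's composable data pipes.
  case class Pipe[A, B](run: A => B) {
    def >[C](next: Pipe[B, C]): Pipe[A, C] = Pipe(run andThen next.run)
  }

  // Hypothetical pre-processing stages.
  val splitColumns = Pipe[String, Array[Double]](_.split(",").map(_.toDouble))

  val centre = Pipe[Array[Double], Array[Double]] { xs =>
    val mean = xs.sum / xs.length
    xs.map(_ - mean)
  }

  // Stages compose into a single typed pipeline.
  val pipeline: Pipe[String, Array[Double]] = splitColumns > centre

  def main(args: Array[String]): Unit =
    println(pipeline.run("1.0,2.0,3.0").mkString(", "))  // prints "-1.0, 0.0, 1.0"
}
```

Because each stage carries its input and output types, the compiler rejects pipelines whose stages do not line up, which is the main appeal of this style for pre-processing code.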


Getting Started

Platform Compatibility

Currently, only *nix and macOS platforms are supported.

DynaML is compatible with Scala 2.11.

Installation

The easiest way to install DynaML is to clone and compile it from the GitHub repository. Please take a look at the installation instructions in the user guide to make sure that you have the prerequisites and to configure your installation.
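
For reference, a source install typically looks something like the following (a sketch only: the repository URL is assumed from the project's GitHub namespace, and the user guide remains the authoritative source for prerequisites and options):

```shell
# Clone the repository (URL assumed; adjust if you use a fork).
git clone https://github.com/mandar2812/DynaML.git
cd DynaML

# Run the bundled install script, which compiles the project
# with sbt and sets up the DynaML shell.
./install.sh
```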


CIFAR in 100 lines

Below is a sample script that trains a neural network built from stacked Inception cells on the CIFAR-10 image classification task.

import ammonite.ops._
import io.github.mandar2812.dynaml.pipes.DataPipe
import io.github.mandar2812.dynaml.tensorflow.data.AbstractDataSet
import io.github.mandar2812.dynaml.tensorflow.{dtflearn, dtfutils}
import io.github.mandar2812.dynaml.tensorflow.implicits._
import org.platanios.tensorflow.api._
import org.platanios.tensorflow.api.learn.layers.Activation
import org.platanios.tensorflow.data.image.CIFARLoader
import java.nio.file.Paths


// Download (if needed) and load the CIFAR-10 data set.
val tempdir = home/"tmp"

val dataSet = CIFARLoader.load(
  Paths.get(tempdir.toString()),
  CIFARLoader.CIFAR_10)

// Wrap the images and labels in DynaML's data set abstraction.
val tf_dataset = AbstractDataSet(
  dataSet.trainImages, dataSet.trainLabels, dataSet.trainLabels.shape(0),
  dataSet.testImages, dataSet.testLabels, dataSet.testLabels.shape(0))

// Training input pipeline: repeat, shuffle, batch and prefetch.
val trainData =
  tf_dataset.training_data
    .repeat()
    .shuffle(10000)
    .batch(128)
    .prefetch(10)


println("Building the model.")
// Input placeholders for the images and their (sparse) labels.
val input = tf.learn.Input(
  UINT8,
  Shape(
    -1,
    dataSet.trainImages.shape(1),
    dataSet.trainImages.shape(2),
    dataSet.trainImages.shape(3))
)

val trainInput = tf.learn.Input(UINT8, Shape(-1))

// Activation generator used inside the Inception cells.
val relu_act = DataPipe[String, Activation](tf.learn.ReLU(_))

// Two stacked Inception cells followed by a small feed-forward classifier.
val architecture = tf.learn.Cast("Input/Cast", FLOAT32) >>
  dtflearn.inception_unit(
    channels = 3,  Seq.fill(4)(10),
    relu_act)(layer_index = 1) >>
  dtflearn.inception_unit(
    channels = 40, Seq.fill(4)(5),
    relu_act)(layer_index = 2) >>
  tf.learn.Flatten("Layer_3/Flatten") >>
  dtflearn.feedforward(256)(id = 4) >>
  tf.learn.ReLU("Layer_4/ReLU", 0.1f) >>
  dtflearn.feedforward(10)(id = 5)

val trainingInputLayer = tf.learn.Cast("TrainInput/Cast", INT64)

// Cross-entropy loss, averaged over the batch and logged as a summary.
val loss =
  tf.learn.SparseSoftmaxCrossEntropy("Loss/CrossEntropy") >>
    tf.learn.Mean("Loss/Mean") >>
    tf.learn.ScalarSummary("Loss/Summary", "Loss")

val optimizer = tf.train.Adam(0.1)

val summariesDir = Paths.get((tempdir/"cifar_summaries").toString())

// Build and train the model, stopping after at most 500 iterations.
val (model, estimator) = dtflearn.build_tf_model(
  architecture, input, trainInput, trainingInputLayer,
  loss, optimizer, summariesDir, dtflearn.max_iter_stop(500),
  100, 100, 100)(
  trainData, true)

// Fraction of predictions that match the ground-truth labels.
def accuracy(predictions: Tensor, labels: Tensor): Float =
  predictions.argmax(1)
    .cast(UINT8)
    .equal(labels)
    .cast(FLOAT32)
    .mean()
    .scalar
    .asInstanceOf[Float]

// Generate predictions over both the training and test sets.
val (trainingPreds, testPreds): (Option[Tensor], Option[Tensor]) =
  dtfutils.predict_data[
    Tensor, Output, DataType, Shape, Output,
    Tensor, Output, DataType, Shape, Output,
    Tensor, Tensor](
    estimator,
    data = tf_dataset,
    pred_flags = (true, true),
    buff_size = 20000)

val (trainAccuracy, testAccuracy) = (
  accuracy(trainingPreds.get, dataSet.trainLabels),
  accuracy(testPreds.get, dataSet.testLabels))

print("Train accuracy = ")
pprint.pprintln(trainAccuracy)

print("Test accuracy = ")
pprint.pprintln(testAccuracy)

Support & Community