DynaML: ML + JVM + Scala

Join the chat at https://gitter.im/DynaML/Lobby



DynaML is a Scala & JVM Machine Learning toolbox for research, education & industry.



Motivation

  • Interactive. Don't want to create Maven/sbt project skeletons every time you want to try out ideas? Create and execute Scala worksheets in the DynaML shell. DynaML comes packaged with a customized version of the Ammonite REPL, with auto-complete, file operations and scripting capabilities.

  • End to End. Create complex pre-processing pipelines with the data pipes API, train models (deep nets, Gaussian processes, linear models and more), optimize over hyper-parameters, evaluate model predictions and visualise results. A minimal pipes sketch follows this list.

  • Enterprise Friendly. Take advantage of the JVM and Scala ecosystem: use Apache Spark to write scalable data analysis jobs and TensorFlow for deep learning, all in the same toolbox.
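
As a quick illustration of the data pipes API mentioned above, here is a minimal sketch (not taken from this repository) that chains two small pre-processing stages. It assumes the DataPipe factory and composition via the > operator from the dynaml-pipes module; the comma-separated record format is purely hypothetical.

import io.github.mandar2812.dynaml.pipes.DataPipe

// Split a comma separated record into its fields.
val splitFields = DataPipe((line: String) => line.split(",").toSeq)

// Parse every field into a Double.
val parseFields = DataPipe((fields: Seq[String]) => fields.map(_.toDouble))

// Compose the two stages into a single pre-processing pipeline.
val preprocess = splitFields > parseFields

preprocess.run("1.0,2.5,3.2") // Seq(1.0, 2.5, 3.2)

Because pipes compose like ordinary functions, larger workflows (data loading, feature extraction, model fitting, evaluation) can be assembled from small, reusable stages in the same way.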


Getting Started

Platform Compatibility

Currently, only *nix and OSX platforms are supported.

DynaML is compatible with Scala 2.11.

Installation

The easiest way to install DynaML is to clone and compile it from the GitHub repository. Please take a look at the installation instructions in the user guide to make sure that you have the prerequisites and to configure your installation.


CIFAR in 100 lines

Below is a sample script where we train a neural network of stacked Inception cells on the CIFAR-10 image classification task.

import ammonite.ops._
import io.github.mandar2812.dynaml.pipes.DataPipe
import io.github.mandar2812.dynaml.tensorflow.data.AbstractDataSet
import io.github.mandar2812.dynaml.tensorflow.{dtflearn, dtfutils}
import io.github.mandar2812.dynaml.tensorflow.implicits._
import org.platanios.tensorflow.api._
import org.platanios.tensorflow.api.learn.layers.Activation
import org.platanios.tensorflow.data.image.CIFARLoader
import java.nio.file.Paths


val tempdir = home/"tmp"

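// Load the CIFAR-10 data set (train and test images with labels) from the temporary directory.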
val dataSet = CIFARLoader.load(
  Paths.get(tempdir.toString()), 
  CIFARLoader.CIFAR_10)

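// Wrap the train/test tensors and their sizes into a DynaML AbstractDataSet.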
val tf_dataset = AbstractDataSet(
  dataSet.trainImages, dataSet.trainLabels, dataSet.trainLabels.shape(0),
  dataSet.testImages, dataSet.testLabels, dataSet.testLabels.shape(0))

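// Training input pipeline: repeat indefinitely, shuffle with a buffer of 10000, batch by 128 and prefetch.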
val trainData =
  tf_dataset.training_data
    .repeat()
    .shuffle(10000)
    .batch(128)
    .prefetch(10)


println("Building the model.")
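// Symbolic input layers for the image batches and the corresponding labels.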
val input = tf.learn.Input(
  UINT8, 
  Shape(
    -1, 
    dataSet.trainImages.shape(1), 
    dataSet.trainImages.shape(2), 
    dataSet.trainImages.shape(3))
)

val trainInput = tf.learn.Input(UINT8, Shape(-1))

val relu_act = DataPipe[String, Activation](tf.learn.ReLU(_))

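// Network: cast inputs to FLOAT32, two stacked inception cells, then a flatten layer and a feed-forward classification head.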
val architecture = tf.learn.Cast("Input/Cast", FLOAT32) >>
  dtflearn.inception_unit(
    channels = 3,  Seq.fill(4)(10), 
    relu_act)(layer_index = 1) >>
  dtflearn.inception_unit(
    channels = 40, Seq.fill(4)(5), 
    relu_act)(layer_index = 2) >>
  tf.learn.Flatten("Layer_3/Flatten") >>
  dtflearn.feedforward(256)(id = 4) >>
  tf.learn.ReLU("Layer_4/ReLU", 0.1f) >>
  dtflearn.feedforward(10)(id = 5)

val trainingInputLayer = tf.learn.Cast("TrainInput/Cast", INT64)

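// Loss: sparse softmax cross-entropy, averaged over the batch and logged as a scalar summary.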
val loss = 
  tf.learn.SparseSoftmaxCrossEntropy("Loss/CrossEntropy") >> 
    tf.learn.Mean("Loss/Mean") >> 
    tf.learn.ScalarSummary("Loss/Summary", "Loss")

val optimizer = tf.train.Adam(0.1)

val summariesDir = Paths.get((tempdir/"cifar_summaries").toString())

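// Assemble the model and train it; training stops after at most 500 iterations.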
val (model, estimator) = dtflearn.build_tf_model(
  architecture, input, trainInput, trainingInputLayer,
  loss, optimizer, summariesDir, dtflearn.max_iter_stop(500),
  100, 100, 100)(
  trainData, true)

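// Classification accuracy: the fraction of arg-max predictions that match the labels.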
def accuracy(predictions: Tensor, labels: Tensor): Float =
  predictions.argmax(1)
    .cast(UINT8)
    .equal(labels)
    .cast(FLOAT32)
    .mean()
    .scalar
    .asInstanceOf[Float]

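// Generate predictions for the training and the test sets.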
val (trainingPreds, testPreds): (Option[Tensor], Option[Tensor]) =
  dtfutils.predict_data[
    Tensor, Output, DataType, Shape, Output,
    Tensor, Output, DataType, Shape, Output,
    Tensor, Tensor](
    estimator,
    data = tf_dataset,
    pred_flags = (true, true),
    buff_size = 20000)

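// Compute train and test accuracy from the predictions.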
val (trainAccuracy, testAccuracy) = (
  accuracy(trainingPreds.get, dataSet.trainLabels),
  accuracy(testPreds.get, dataSet.testLabels))

print("Train accuracy = ")
pprint.pprintln(trainAccuracy)

print("Test accuracy = ")
pprint.pprintln(testAccuracy)

Support & Community

For questions, bug reports and general discussion, join the chat at https://gitter.im/DynaML/Lobby or open an issue on the GitHub repository.