Deep Java Library (DJL)

Overview

Deep Java Library (DJL) is an open-source, high-level, framework-agnostic Java API for deep learning. DJL is designed to be easy to get started with and simple to use for Java developers. DJL provides a native Java development experience and functions like any other regular Java library.

You don't have to be a machine learning/deep learning expert to get started. You can use your existing Java expertise as an on-ramp to learn and use machine learning and deep learning. You can use your favorite IDE to build, train, and deploy your models. DJL makes it easy to integrate these models with your Java applications.

Because DJL is deep learning framework agnostic, you don't have to commit to a framework when creating your projects; you can switch frameworks at any point. To ensure the best performance, DJL also provides automatic CPU/GPU selection based on your hardware configuration.
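
As an illustration of that device handling, the following minimal sketch queries the engine for available GPUs and falls back to the CPU. It assumes the Device and Engine classes from the ai.djl API; exact method names may vary between DJL versions, and DJL normally makes this choice for you automatically.

    import ai.djl.Device;
    import ai.djl.engine.Engine;

    // Minimal sketch (assumed API): DJL selects the device automatically,
    // but you can query the engine and pick one explicitly if needed.
    Device device = Engine.getInstance().getGpuCount() > 0
            ? Device.gpu()   // use the first GPU when one is detected
            : Device.cpu();  // otherwise fall back to the CPU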

DJL's ergonomic API is designed to guide you toward best practices when accomplishing deep learning tasks. The following pseudocode demonstrates running inference:

    // To use a pre-trained model from the model zoo, you only need to load it.
    // The criteria map selects which variant of the model to load.
    Map<String, String> criteria = new HashMap<>();
    criteria.put("layers", "18");
    criteria.put("flavor", "v1");

    // Load the pre-trained ResNet-18 model from the model zoo
    try (ZooModel<BufferedImage, Classifications> model = MxModelZoo.RESNET.loadModel(criteria)) {
        try (Predictor<BufferedImage, Classifications> predictor = model.newPredictor()) {
            BufferedImage img = readImage(); // read image
            Classifications result = predictor.predict(img);

            // get the classification and probability
            ...
        }
    }
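
The readImage() call above is a placeholder. A minimal sketch of such a helper, using only the standard javax.imageio API (the file path is purely illustrative), might look like this:

    import java.awt.image.BufferedImage;
    import java.io.IOException;
    import java.nio.file.Paths;
    import javax.imageio.ImageIO;

    // Hypothetical helper: load an image from disk into a BufferedImage.
    private static BufferedImage readImage() throws IOException {
        // Illustrative path; replace with the image you want to classify.
        return ImageIO.read(Paths.get("src/test/resources/kitten.jpg").toFile());
    }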

The following pseudocode demonstrates running training:

    // Construct your neural network with built-in blocks
    Block block = new Mlp(28, 28);

    try (Model model = Model.newInstance()) { // Create an empty model
        model.setBlock(block); // set neural network to model

        // Get the training and validation datasets (MNIST)
        Dataset trainingSet = new Mnist.Builder().setUsage(Usage.TRAIN) ... .build();
        Dataset validateSet = new Mnist.Builder().setUsage(Usage.TEST) ... .build();

        // Set up the training configuration, such as the Initializer, Optimizer, Loss ...
        TrainingConfig config = setupTrainingConfig();
        try (Trainer trainer = model.newTrainer(config)) {
            /*
             * Configure the input shape based on the dataset to initialize the trainer.
             * The first axis is the batch axis; we can use 1 for initialization.
             * MNIST images are 28x28 grayscale and preprocessed into a 28 * 28 NDArray.
             */
            Shape inputShape = new Shape(1, 28 * 28);
            trainer.initialize(new Shape[] {inputShape});

            TrainingUtils.fit(trainer, epoch, trainingSet, validateSet);
        }

        // Save the model
        model.save(modelDir, "mlp");
    }
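
The setupTrainingConfig() call above is also a placeholder. A minimal sketch, assuming the DefaultTrainingConfig and Loss classes from the ai.djl.training packages (constructor arguments may differ between DJL versions), might look like this:

    import ai.djl.training.DefaultTrainingConfig;
    import ai.djl.training.TrainingConfig;
    import ai.djl.training.loss.Loss;

    // Hypothetical helper (assumed API): configure training with a
    // softmax cross-entropy loss, a common choice for MNIST classification.
    private static TrainingConfig setupTrainingConfig() {
        return new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss());
    }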

Getting Started

Resources

Release Notes

Building From Source

To build from source, first check out the code. Once you have the code locally, you can build it with Gradle as follows:

./gradlew build

To increase build speed, you can use the following command to skip unit tests:

./gradlew build -x test

Note: SpotBugs is not compatible with JDK 11+, so it is skipped when you build with JDK 11+.

Slack channel

Join our Slack channel to get in touch with the development team for questions and discussions.

License

This project is licensed under the Apache-2.0 License.
