
Daquexian's NNAPI Library. Run neural network using the new NNAPI on Android !


whenSunSet/DNNLibrary

 
 


DNNLibrary

Run neural networks on your Android phone using the new NNAPI!

Android 8.1 introduces the Neural Networks API (NNAPI). It's very exciting to run a model in the "native" way supported by the Android system. :)

DNNLibrary is a wrapper around NNAPI. It lets you easily use the new NNAPI introduced in Android 8.1. You can convert your caffemodel into the daq format with the convert tool and run the model directly.

The demo includes daq model files for LeNet, ResNet-18 and SqueezeNet, and you can conveniently produce your own model from a pretrained caffemodel.

For how to use this library directly in your project, check out Usage (at the bottom).

Screenshot

This screenshot is from the ResNet-18 branch, which lets the user pick an image instead of using the camera.

Screenshot image resnet

This screenshot is from the LeNet branch, which uses the camera.

Screenshot camera mnist

Introduction

Android 8.1 introduces NNAPI. In my experiments it is very efficient on my Pixel. For example, caffe-android-lib takes an average of 43.42 ms to perform a convolution with 20 5x5 filters on a 224x224 image, while NNAPI takes only 15.45 ms -- about one third of the time.

What's more, depthwise convolution, which is especially useful on mobile devices, appears to be well optimized in NNAPI: a 5x5 depthwise convolution on a 224x224x20 input takes caffe-android-lib 82.32 ms but NNAPI only 16.93 ms.
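The speedups implied by these timings work out as follows (the millisecond figures are from the benchmarks above; the arithmetic is just an illustrative check):

```java
public class SpeedupCheck {
    public static void main(String[] args) {
        // Standard convolution, 20 5x5 filters on a 224x224 image (times in ms)
        double convSpeedup = 43.42 / 15.45;      // NNAPI takes about 1/3 of the time
        // 5x5 depthwise convolution on a 224x224x20 input
        double depthwiseSpeedup = 82.32 / 16.93;
        System.out.printf("conv speedup: %.2fx%n", convSpeedup);           // conv speedup: 2.81x
        System.out.printf("depthwise speedup: %.2fx%n", depthwiseSpeedup); // depthwise speedup: 4.86x
    }
}
```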

However, NNAPI is not friendly to normal Android developers; it is not designed to be used by them directly. So I wrapped it in a library.

With DNNLibrary it's extremely easy to deploy your caffe model on an Android 8.1+ phone. Here is my code to deploy ResNet-18 on a phone:

ModelWrapper.readFile(getAssets(), "resnet18");
ModelWrapper.setOutput("prob");
ModelWrapper.compile(ModelWrapper.PREFERENCE_FAST_SINGLE_ANSWER);

float[] result = ModelWrapper.predict(inputData);

Only four lines! And the model file is produced by my convert tool from a pretrained caffemodel.

If you use the "raw" NNAPI, the amount of code increases dramatically: setting up LeNet takes 200+ lines. (You can check out the second commit of this repo for the 200+ line LeNet.)
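The inputData passed to predict is a flat float array. Exactly how it must be laid out depends on the model; as a minimal sketch, here is how one might prepare a 28x28 grayscale MNIST input for the LeNet demo, assuming row-major order and pixels normalized to [0, 1] (both assumptions are mine, not documented behavior):

```java
public class InputPrep {
    // Flatten a 28x28 grayscale image (pixel values 0..255) into a
    // row-major float array normalized to [0, 1].
    static float[] toInputData(int[][] pixels) {
        float[] input = new float[28 * 28];
        for (int y = 0; y < 28; y++) {
            for (int x = 0; x < 28; x++) {
                input[y * 28 + x] = pixels[y][x] / 255.0f;
            }
        }
        return input;
    }

    public static void main(String[] args) {
        int[][] image = new int[28][28];
        image[0][0] = 255;                       // one white pixel, rest black
        float[] inputData = toInputData(image);
        System.out.println(inputData.length);    // 784
        System.out.println(inputData[0]);        // 1.0
        // float[] result = ModelWrapper.predict(inputData);
    }
}
```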

Usage

This library uses a beta version of the NDK, which is not supported by jCenter/JitPack yet, so I can't publish this library until they support NDK r16. Please download dnnlibrary-release.aar from the latest Release of this repo and import it into your project.
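One common way to import a local .aar is as a flat-dir dependency in your app module's build.gradle. This is only a sketch (the libs directory and Gradle syntax of that era are assumptions; adjust to your project layout):

```groovy
// app/build.gradle
repositories {
    flatDir {
        dirs 'libs'  // put dnnlibrary-release.aar in app/libs/
    }
}

dependencies {
    compile(name: 'dnnlibrary-release', ext: 'aar')
}
```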

Preparation

Please make sure the Android system on your phone is 8.1+, or use an API 27 emulator. The latest version of the NDK is required for NNAPI. To compile the demo, please use Android Studio 3.0+.
