Tensorflow.js Onnx Runner

Run and fine-tune pretrained Onnx models in the browser with GPU support via the wonderful Tensorflow.js library.

Usage

Installation

You can use this as a standalone ES5 bundle like this:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.10.3"></script>
<script src="https://unpkg.com/tfjs-onnx"></script>

Then, loading a model is as simple as referencing the path to the model.onnx file.

Here is an example of loading SqueezeNet:

var modelUrl = 'models/squeezenet/model.onnx';

// Initialize the tf.model
// (depending on the library version, loadModel may return a Promise; await it in that case)
var model = onnx.loadModel(modelUrl);

// Now use the tf.model for inference
// `img` is an HTMLImageElement, HTMLCanvasElement or HTMLVideoElement in the page
const pixels = tf.fromPixels(img);
const predictions = model.predict(pixels);
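
To inspect the raw prediction values, the output tensor can be read back into a typed array. This is a minimal sketch, assuming the model produces a single tensor of class scores (as SqueezeNet does):

// tf.Tensor.data() resolves to a TypedArray with the tensor's values
predictions.data().then(function (scores) {
  // Find the index of the highest score; mapping indices to labels
  // depends on the dataset the model was trained on (ImageNet for SqueezeNet).
  var best = 0;
  for (var i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  console.log('Top class index:', best, 'score:', scores[best]);
});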

Run Demos

To run the demos, use the following:

# Build the project
npm run build

# Start a web server
npm run start

Now navigate to http://localhost:8080/demos.

Hint: some of the models are quite big (>30 MB). You have to download the Onnx models yourself and place them in the demos/models directory to save bandwidth.

Development

npm install

To build a standalone bundle, run:

npm run build
