onnx-go gives the ability to import a pre-trained neural network within Go without being linked to a framework or library.

This is a Go interface to the Open Neural Network Exchange (ONNX).

Overview

onnx-go contains primitives to decode an ONNX binary model into a computation backend and to use it like any other library in your Go code. For more information about ONNX, please visit onnx.ai.

The implementation of the ONNX specification is partial for import and non-existent for export.

Vision statement

For Go developers who need to add machine learning capabilities to their code, onnx-go is a package that facilitates the use of neural network models (software 2.0). Unlike other computation libraries, this package does not require special data-science skills.

Warning: the API is experimental and may change.

Disclaimer

This is a new version of the API.
The tweaked version of Gorgonia has been removed; onnx-go is now compatible with the master branch of Gorgonia.
Some operators are not yet available though.

Meanwhile, you can use the old version for a demo by fetching a pre-release version or by checking out the old revision `01b2e2b`.

Install

go get github.com/owulveryck/onnx-go

Example

These examples assume that you have a pre-trained model.onnx file available. You can download pre-trained models from the ONNX model zoo.

Very simple example

This example does nothing but decode the graph into a simple backend. You can then do whatever you want with the generated graph.

	// Create a backend receiver
	backend := simple.NewSimpleGraph()
	// Create a model and set the execution backend
	model := onnx.NewModel(backend)

	// Read the ONNX model
	b, _ := ioutil.ReadFile("model.onnx")
	// Decode it into the model
	err := model.UnmarshalBinary(b)
	if err != nil {
		log.Fatal(err)
	}
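
For reference, the snippet above assumes imports along these lines; the backend/simple import path reflects this repository's layout and is given here as an assumption:

	import (
		"io/ioutil"
		"log"

		"github.com/owulveryck/onnx-go"
		"github.com/owulveryck/onnx-go/backend/simple"
	)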

Simple example to run a pre-trained model

This example uses Gorgonia as a backend.

import "github.com/owulveryck/onnx-go/backend/x/gorgonnx"

At present, Gorgonia does not implement all of the ONNX operators. Therefore, most of the models from the model zoo will not work. Coverage will improve little by little as more operators are added to the backend.

You can find a list of tested examples and the operator coverage here.

func Example_gorgonia() {
	// Create a backend receiver
	backend := gorgonnx.NewGraph()
	// Create a model and set the execution backend
	model := onnx.NewModel(backend)

	// Read the ONNX model
	b, _ := ioutil.ReadFile("model.onnx")
	// Decode it into the model
	err := model.UnmarshalBinary(b)
	if err != nil {
		log.Fatal(err)
	}
	// Set the first input; the number of inputs depends on the model
	model.SetInput(0, input)
	err = backend.Run()
	if err != nil {
		log.Fatal(err)
	}
	output, _ := model.GetOutputTensors()
	// Write the first output to stdout
	fmt.Println(output[0])
}
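
In the function above, input must be a tensor.Tensor (from gorgonia.org/tensor) whose shape and type match what the model expects. A minimal sketch, assuming a model that takes a single 1x1x28x28 float32 image (the shape is illustrative, not taken from a specific model):

	// import "gorgonia.org/tensor"
	input := tensor.New(
		tensor.WithShape(1, 1, 28, 28), // illustrative shape; check your model's expected input
		tensor.Of(tensor.Float32),      // zero-valued dense tensor; fill it with real data
	)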

Internal

ONNX protobuf definition

The protobuf definition of ONNX is compiled into Go with the classic protoc tool. The definition can be found in the internal directory. The definition is not exposed, to avoid external dependencies on this repository. Indeed, the pb code may change to use a more efficient compiler such as gogo protobuf, and this change should be transparent to the user of this package.

Execution backend

In order to execute the neural network, you need a backend able to execute a computation graph (for more information on computation graphs, please read this blog post).

This picture represents the mechanism:

Schema

onnx-go does not provide any executable backend, but, for reference, a simple backend that builds an information graph is provided as an example (see the simple subpackage). Gorgonia is the main target backend of onnx-go.

Backend implementation

A backend is basically a weighted directed graph that can apply an Operation to its nodes. It should fulfill this interface:

type Backend interface {
	OperationCarrier
	graph.DirectedWeightedBuilder
}
type OperationCarrier interface {
	ApplyOperation(Operation, graph.Node) error
}
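
As an illustration, here is a minimal sketch of a custom backend built on top of gonum's simple.WeightedDirectedGraph, which already satisfies graph.DirectedWeightedBuilder. This sketch only records the operations; a real backend would turn them into an executable computation, and its nodes would also have to implement the DataCarrier interface described below. The package name and field names are assumptions for the example, not part of onnx-go:

	package mybackend

	import (
		"github.com/owulveryck/onnx-go"
		"gonum.org/v1/gonum/graph"
		"gonum.org/v1/gonum/graph/simple"
	)

	// Graph embeds gonum's weighted directed graph, which provides the
	// graph.DirectedWeightedBuilder part of the Backend interface.
	type Graph struct {
		*simple.WeightedDirectedGraph
		ops map[int64]onnx.Operation
	}

	func NewGraph() *Graph {
		return &Graph{
			WeightedDirectedGraph: simple.NewWeightedDirectedGraph(0, 0),
			ops:                   make(map[int64]onnx.Operation),
		}
	}

	// ApplyOperation records the operation attached to a node; a real
	// backend would translate it into an executable computation.
	func (g *Graph) ApplyOperation(o onnx.Operation, n graph.Node) error {
		g.ops[n.ID()] = o
		return nil
	}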

An Operation is represented by its name and a map of attributes. For example, the convolution operator, as described in the ONNX spec, is represented like this:

	convOperator := Operation{
		Name: "Conv",
		Attributes: map[string]interface{}{
			"auto_pad":  "NOTSET",
			"dilations": []int64{1, 1},
			"group":     1,
			"pads":      []int64{1, 1},
			"strides":   []int64{1, 1},
		},
	}

Besides operators, a node can carry a value. Values are described as tensor.Tensor. To carry data, a node of the graph should fulfill this interface:

type DataCarrier interface {
	SetTensor(t tensor.Tensor) error
	GetTensor() tensor.Tensor
}
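
For illustration, a minimal node type that satisfies both gonum's graph.Node and DataCarrier could look like the following sketch; the type and field names are assumptions for the example, not part of onnx-go:

	// import "gorgonia.org/tensor"

	type node struct {
		id int64
		t  tensor.Tensor
	}

	// ID makes the type a gonum graph.Node.
	func (n *node) ID() int64 { return n.id }

	// SetTensor stores the value carried by the node.
	func (n *node) SetTensor(t tensor.Tensor) error {
		n.t = t
		return nil
	}

	// GetTensor returns the value carried by the node.
	func (n *node) GetTensor() tensor.Tensor {
		return n.t
	}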

Backend testing

onnx-go provides some utilities to test a backend. Visit the testbackend package for more information.

Contributing

Contributions are welcome. A contribution guide will eventually be written. Meanwhile, you can raise an issue or send a PR. You can also contact me via Twitter or on the Gophers' Slack (I am @owulveryck on both).

This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.

Author

Olivier Wulveryck

License

MIT.
