go-tensorrt

Go binding for the TensorRT C predict API. This is used by the TensorRT agent in MLModelScope to perform model inference in Go.
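
At a high level, the binding follows the usual predict-style pattern: construct a predictor from a serialized model, run inference on preprocessed input, and read the output back. The sketch below is hypothetical; the identifiers (New, Predict, ReadPredictionOutput, Close) and the option handling are illustrative assumptions, so consult the package documentation and the examples in this repository for the actual API.

    package main

    import (
        "context"
        "fmt"

        tensorrt "github.com/rai-project/go-tensorrt"
    )

    func main() {
        ctx := context.Background()

        // Hypothetical constructor: the real options (model file, input/output
        // layer names, batch size, ...) are defined by this package.
        predictor, err := tensorrt.New(ctx /*, options... */)
        if err != nil {
            panic(err)
        }
        defer predictor.Close()

        // Hypothetical inference call on preprocessed float32 input data.
        input := make([]float32, 3*224*224)
        if err := predictor.Predict(ctx, input); err != nil {
            panic(err)
        }

        // Hypothetical call to read back the raw output tensor.
        output, err := predictor.ReadPredictionOutput(ctx)
        if err != nil {
            panic(err)
        }
        fmt.Println("output length:", len(output))
    }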

Installation

Download and install go-tensorrt:

go get -v github.com/rai-project/go-tensorrt

The binding requires TensorRT as well as a few other Go packages.

TensorRT

TensorRT currently works only on Linux and requires an NVIDIA GPU. Please refer to Installing TensorRT to install TensorRT on your system.

Note: TensorRT is expected to be installed in either the system path or /opt/tensorrt. See lib.go for details.

If you get an error about not being able to write to /opt, perform the following:

sudo mkdir -p /opt/tensorrt
sudo chown -R `whoami` /opt/tensorrt
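
As noted above, lib.go is where the binding points cgo at the TensorRT headers and libraries. The snippet below is only an illustrative sketch of that pattern for the /opt/tensorrt layout; the exact flags and library names (for example -lnvinfer) are assumptions, so check lib.go itself for the authoritative values.

    package tensorrt

    // Illustrative sketch of the kind of cgo directives found in lib.go;
    // this is not the actual file.

    /*
    #cgo CFLAGS: -I/opt/tensorrt/include
    #cgo CXXFLAGS: -I/opt/tensorrt/include
    #cgo LDFLAGS: -L/opt/tensorrt/lib -lnvinfer
    */
    import "C"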

If you are using TensorRT Docker images or other library paths, change the CGO_CFLAGS, CGO_CXXFLAGS, and CGO_LDFLAGS environment variables accordingly. Refer to Using cgo with the go command.

For example,

    export CGO_CFLAGS="${CGO_CFLAGS} -I/tmp/tensorrt/include"
    export CGO_CXXFLAGS="${CGO_CXXFLAGS} -I/tmp/tensorrt/include"
    export CGO_LDFLAGS="${CGO_LDFLAGS} -L/tmp/tensorrt/lib"

Go Packages

You can install the dependencies through go get.

cd $GOPATH/src/github.com/rai-project/go-tensorrt
go get -u -v ./...

Or use Dep.

dep ensure -v

This installs the dependencies in the vendor/ directory.

Configure Environment Variables

Configure the linker environment variables, since the TensorRT C library is under a non-system directory. Place the following in either your ~/.bashrc or ~/.zshrc file:

Linux

export LIBRARY_PATH=$LIBRARY_PATH:/opt/tensorrt/lib
export LD_LIBRARY_PATH=/opt/tensorrt/lib:$LD_LIBRARY_PATH

macOS

export LIBRARY_PATH=$LIBRARY_PATH:/opt/tensorrt/lib
export DYLD_LIBRARY_PATH=/opt/tensorrt/lib:$DYLD_LIBRARY_PATH

Check the Build

Run go build to check that the dependencies are installed and the library paths are set up correctly.
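
For a standalone check outside this repository, a blank import of the package is enough to force go build to compile and link the cgo bindings against the TensorRT libraries. The program below does nothing at run time, but it will fail to build if the library paths above are not configured correctly.

    package main

    // Blank-import the binding so that `go build` exercises the cgo
    // compilation and linking steps against TensorRT.
    import (
        _ "github.com/rai-project/go-tensorrt"
    )

    func main() {}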

Note: The cgo interface passes Go pointers to the C API, which the cgo runtime flags as an error. Disable the check by placing

export GODEBUG=cgocheck=0

in your ~/.bashrc or ~/.zshrc file and then run either source ~/.bashrc or source ~/.zshrc.

Examples

The example shows how to use the MLModelScope tracer to profile the inference. Refer to Set up the external services to start the tracer.

If running on a GPU, you can use nvprof to verify the profiling results. Refer to the Profiler User's Guide for instructions on using nvprof.
