Microsoft.ML.OnnxRuntime.InferenceSession with custom SentencepieceTokenizer operation #137

Closed
reseeker opened this issue Aug 26, 2021 · 3 comments

Comments

@reseeker

I'm trying to use a TensorFlow model (universal-sentence-encoder-multilingual_3), successfully converted to the .onnx format with tf2onnx, in C#. But attempting to initialize an InferenceSession raises an OnnxRuntimeException: "[ErrorCode:Fail] Fatal error: SentencepieceTokenizer is not a registered function/op". Is it possible to use models converted with the --extra_opset flag in .NET?
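
A minimal sketch of the failing call, for reference:

// Throws OnnxRuntimeException: SentencepieceTokenizer is not a registered function/op
var session = new Microsoft.ML.OnnxRuntime.InferenceSession("universal-sentence-encoder-multilingual-3.onnx");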

@snnn snnn transferred this issue from microsoft/onnxruntime Aug 26, 2021
@snnn
Member

snnn commented Aug 26, 2021

@wenbingl, is there a way to use onnxruntime-extensions in C#? The ONNX Runtime C# binding has custom op support, but I don't know if it works for this.

@wenbingl
Member

If ONNX Runtime can load a native custom op DLL, the issue can be solved by building onnxruntime-extensions as a DLL and loading that DLL through the C# API.

@reseeker
Author

Yes, figured it out. To open the universal-sentence-encoder-multilingual-3 model from TensorFlow Hub in C#, you need (besides converting it with tf2onnx) to build onnxruntime-extensions, which produces ortcustomops.dll.
Then register this library:

// Register the onnxruntime-extensions library so custom ops like SentencepieceTokenizer resolve.
var sessionOptions = new Microsoft.ML.OnnxRuntime.SessionOptions();
sessionOptions.RegisterCustomOpLibraryV2("ortcustomops.dll", out var libraryHandle);

With these session options, the model opens successfully:

var session = new Microsoft.ML.OnnxRuntime.InferenceSession(
    "universal-sentence-encoder-multilingual-3.onnx",
    sessionOptions
);
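
And, continuing from the snippet above, a minimal usage sketch. The input name "inputs" and the float-embedding output are assumptions on my part; check session.InputMetadata and session.OutputMetadata for the actual names and types.

using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime.Tensors;

// "inputs" is a placeholder -- look up the real input name in session.InputMetadata.
var inputTensor = new DenseTensor<string>(new[] { "Hello, world." }, new[] { 1 });
var feeds = new List<Microsoft.ML.OnnxRuntime.NamedOnnxValue>
{
    Microsoft.ML.OnnxRuntime.NamedOnnxValue.CreateFromTensor("inputs", inputTensor)
};

using (var results = session.Run(feeds))
{
    // Expecting a float tensor of sentence embeddings as the first output.
    var embedding = results.First().AsEnumerable<float>().ToArray();
}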

Thanks to everybody!
