SpanMarker with ONNX models #26

Open
Ulipenitz opened this issue Aug 15, 2023 · 11 comments
@Ulipenitz

Hi @tomaarsen! Is an ONNX exporter planned? Have you tried using SpanMarker with ONNX models for inference?
I'd be really curious if you've experimented with that already! :-)

@tomaarsen
Owner

Hello!

I did a very quick experiment to try to export SpanMarker to ONNX, but I ran into some incomprehensible errors. I don't have enough experience with ONNX at the moment to quickly create such an exporter.

  • Tom Aarsen

@polodealvarado

Hi @tomaarsen, I would like to collaborate on this issue.

@tomaarsen
Owner

That would be awesome! I'm open to PRs on the matter.

@dbuades

dbuades commented Oct 31, 2023

This would indeed be a nice feature to add. We export all our models to ONNX before deploying and this is unfortunately not currently possible with SpanMarker.

Keep up the good work!

@abhayalok

@tomaarsen, can you upload an ONNX format for SpanMarker?

@tomaarsen
Owner

I'm afraid I haven't been able to convert SpanMarker models to ONNX yet.

@polodealvarado

Hello @tomaarsen.
I am independently working on converting span_marker models to the ONNX format, and I have started a new branch for it.
I would like to share the results to see if we can make progress on it.
How would you like to proceed?

@tomaarsen
Owner

Hello!

Awesome! I'd love to get ONNX support for SpanMarker somehow.
You can fork the repository and push your branch there. Then, open a draft pull request from your branch on your fork into the main branch of this repository, and we'll be able to discuss there, look at results, etc. GitHub Actions will then automatically run the tests on your branch to make sure everything is working well.
Does that sound good?

  • Tom Aarsen

@polodealvarado

Great!

I will push the branch this weekend as soon as I can.

@ogencoglu

ONNX support would be amazing! One can also quantize the models for further inference speed optimization once the base models are converted to ONNX. It is essentially 5 lines of code from ONNX to quantized ONNX.

@ganga7445

@tomaarsen @polodealvarado Is the ONNX implementation done?
How can we load the models with ONNX for faster inference? Can you please help here?
