
[TFLite Export] Adds TFLite support for LayoutLMv3 #1372

Open · wants to merge 3 commits into base: main
Conversation

@salmanmaq (Author)

What does this PR do?

Adds support for exporting LayoutLMv3 to TFLite. Some tests for all quantization approaches were failing for some tasks, so I limited the supported quantization schemes for now.

It is primarily based on #813 and the ONNX export code already in the library.

It can be tested as:

pytest tests/exporters/tflite/test_*.py -k "layoutlmv3" -s --exitfirst

You can export a trained model as:

optimum-cli export tflite --model <input_dir> --task <> --sequence_length <> --width <> --height <> <output_dir>
optimum-cli export tflite --model microsoft/layoutlmv3-base --task token-classification --sequence_length 512 --width 224 --height 224 ~/tflite_conversion
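Once exported, the model can be loaded back with TensorFlow's `tf.lite.Interpreter`. The sketch below runs the exported model on dummy inputs; note that the `model.tflite` file name, the input tensor names (`input_ids`, `attention_mask`, `bbox`, `pixel_values`), and the suffix-matching logic are assumptions based on the CLI flags above, not something this PR specifies:

```python
# Sketch only: file name and input tensor names below are assumptions,
# not confirmed by this PR.
import numpy as np

SEQ_LEN, HEIGHT, WIDTH = 512, 224, 224


def make_dummy_inputs(seq_len=SEQ_LEN, height=HEIGHT, width=WIDTH):
    """Zero-filled inputs matching the shapes used in the export command."""
    return {
        "input_ids": np.zeros((1, seq_len), dtype=np.int64),
        "attention_mask": np.ones((1, seq_len), dtype=np.int64),
        "bbox": np.zeros((1, seq_len, 4), dtype=np.int64),
        "pixel_values": np.zeros((1, 3, height, width), dtype=np.float32),
    }


def run_tflite(model_path="~/tflite_conversion/model.tflite"):
    import os
    import tensorflow as tf  # lazy import so the helper above works without TF

    interpreter = tf.lite.Interpreter(model_path=os.path.expanduser(model_path))
    interpreter.allocate_tensors()
    inputs = make_dummy_inputs()
    for detail in interpreter.get_input_details():
        # TFLite tensor names often look like "serving_default_input_ids:0",
        # so match our dict keys as name suffixes.
        base = detail["name"].split(":")[0]
        for key, value in inputs.items():
            if base.endswith(key):
                interpreter.set_tensor(detail["index"], value.astype(detail["dtype"]))
    interpreter.invoke()
    return [interpreter.get_tensor(d["index"]) for d in interpreter.get_output_details()]
```

For the token-classification task, the single output tensor should have a shape along the lines of `(1, sequence_length, num_labels)`.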

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes? (Not needed as far as I can tell)
  • Did you write any new necessary tests? (Not needed as far as I can tell)

@fxmarty (Collaborator) left a comment

Hi @salmanmaq thank you for your contribution, LGTM!

@michaelbenayoun (Member) left a comment

LGTM!

Will wait for the CI to go green before merging, except for the int8 quantization tests, which do not seem related to your changes at all.

@salmanmaq (Author)

Hey @fxmarty @michaelbenayoun, would you like to merge this before it goes stale? Happy to update the PR in case it has gone stale.

@michaelbenayoun (Member)

Hi @salmanmaq ,
Happy to do it once the tests related to LayoutLMv3 pass. I think some of the export tests with quantization do not pass.

@salmanmaq (Author)

salmanmaq commented Nov 3, 2023

> Hi @salmanmaq, happy to do it once the tests related to LayoutLMv3 pass. I think some of the export tests with quantization do not pass.

Thanks @michaelbenayoun. Is it something with the PR, or at your end? You had previously mentioned

> Except for int8 quantization tests that do not seem related to your changes at all.

I will be checking it out, though. If you can give some pointers, I am happy to have a look and attempt a fix.

@ahmet-sabbagh

@salmanmaq, since there is a suggestion that the issue might not be related to your PR, and I noticed that your fork is a bit behind, maybe you could pull the latest commits and try again?

@salmanmaq (Author)

Hi @fxmarty @michaelbenayoun . Bringing this to your attention again for a potential merge. I've merged and pushed the latest changes. I've tested LayoutLMv3 conversion (without quantization) and with the supported quantization types (FP16, INT8 dynamic). I am able to successfully convert and use those models. I think the test errors are not related to my code. So I suppose you can consider merging it. If the test errors are due to my code, I am happy to take some guidance and work on those (but I don't think that's the case).

@salmanmaq force-pushed the LayoutLMv3-TFLite-conversion-support branch from 4901d1d to 43fbefb on April 26, 2024, 11:29.
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

5 participants