This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation

@spacemanidol
Contributor

Adding example for BERT SQUAD with onnx export.

Contributor

@bfineran bfineran left a comment


Great job on getting this integration working and in script form. Really excited about the results, and I had a good time using your previous notebook.

Took a first pass and left some comments. I didn't comment much on style because running make style will fix a lot of the formatting. Overall, it would be great if we could structure the README and script to demonstrate, as simply as possible, how to integrate SparseML into transformers, and point this out in both documents.

@bfineran bfineran requested a review from a team February 2, 2021 00:15
@spacemanidol
Contributor Author

Done with the code changes; working on the README changes. @bfineran, can you give the code another look through for comments/thoughts? README.md will have to wait until all the experiments finish.

Member

@markurtz markurtz left a comment


A few high-level things, plus the inline comments I left:

  • let's move this from examples to a new integrations folder (the timm and ultralytics integrations are being migrated there as well)
  • let's rename prune_config_files to recipes
    -- bonus if we can get these models and recipes pushed to the SparseZoo instead of storing them here
  • let's change the recipe format from YAML to Markdown and include some info in each recipe, such as the example command to run (for when they are pushed to SparseZoo)
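For context, a rough sketch of what such a Markdown recipe could look like under a YAML-front-matter convention: the modifier parameters, file names, and the --recipe flag below are illustrative placeholders, not the exact values used in this PR.

````markdown
---
# Illustrative pruning schedule: gradually prune all prunable layers
# to 85% sparsity between epochs 1 and 20, training for 30 epochs total.
modifiers:
  - !EpochRangeModifier
    start_epoch: 0.0
    end_epoch: 30.0

  - !GMPruningModifier
    params: __ALL__
    init_sparsity: 0.05
    final_sparsity: 0.85
    start_epoch: 1.0
    end_epoch: 20.0
    update_frequency: 0.5
---

# BERT SQuAD Pruning Recipe

Example command (paths and flags are placeholders):

```bash
python run_qa.py \
  --model_name_or_path bert-base-uncased \
  --recipe recipes/bert-base-squad-pruned.md
```
````

The idea is that the human-readable Markdown body carries the description and example command for SparseZoo display, while the front matter stays machine-parseable as the recipe itself.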

@spacemanidol
Contributor Author

Changes implemented. Upgraded to the new transformers library and hit a blocking error; opened an issue on their repo: huggingface/transformers#10618

@spacemanidol
Contributor Author

Fixed the issue with transformers. Ready for merge.

Member

@markurtz markurtz left a comment


As discussed, this is in a good state to land as a research integration. Over the next few weeks we need to nail down exact flows and recipes for users (as well as improve the recipes), then push those to the SparseZoo (and potentially Hugging Face) and remove the recipes and table from this integration.

Additionally, as we make things easier, we'll need to evaluate how closely we can align run_qa.py with the upstream Hugging Face version. Layer dropping, distillation, etc. should all be added in a separate file.

Contributor

@bfineran bfineran left a comment


Excited to see this landed, @spacemanidol!

@markurtz markurtz merged commit 9c6e60b into main Mar 15, 2021
@spacemanidol spacemanidol deleted the BERT-QA branch March 22, 2021 19:08
markurtz added a commit that referenced this pull request Sep 1, 2021
* Correct deepsparse highlight images

* update highlights, tutorials, and infographic sizes


4 participants