
[Testing] Add model loader for int8 BERT #10622

Merged 8 commits into apache:main on Mar 16, 2022
Conversation

@masahi (Member) commented Mar 15, 2022

Following tlc-pack/TLCBench#5, this adds an easy API for loading the int8 BERT model. It currently sits under meta_schedule/testing/tlcbench.py, but relay/testing/tlcbench.py would also be fine.
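The commit list below mentions that the loader encodes batch size and sequence length in the cached file path and returns input info. As a rough sketch of that caching scheme (all names here are hypothetical, not the PR's actual API), the shape parameters can be baked into the cache file name so models converted for different input shapes never collide:

```python
import os
import tempfile


def cached_model_path(cache_dir, batch_size, seq_len):
    # Hypothetical helper: encode batch_size and seq_len in the cached
    # file name, mirroring the "encode batch size and seq_len information
    # in cached file path" commit in this PR.
    return os.path.join(cache_dir, f"int8_bert_b{batch_size}_s{seq_len}.json")


def load_quantized_bert_base(batch_size=8, seq_len=128, cache_dir=None):
    # Hypothetical loader sketch. The real loader would convert or
    # deserialize the quantized Relay module here; this stub only shows
    # the cache-path scheme and the "return input info" idea: the caller
    # gets back the input names, shapes, and dtypes the model expects.
    cache_dir = cache_dir or tempfile.gettempdir()
    path = cached_model_path(cache_dir, batch_size, seq_len)
    inputs = [
        ("input_ids", (batch_size, seq_len), "int64"),
        ("segment_ids", (batch_size, seq_len), "int64"),
        ("input_mask", (batch_size, seq_len), "int64"),
    ]
    return path, inputs
```

Returning the input info alongside the module is convenient for end-to-end benchmarking, since the caller can allocate correctly shaped inputs without inspecting the model.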

@junrushao1994 @sunggg @AndrewZhaoLuo @mbrookhart @jwfromm

@junrushao (Member) left a comment:

Thanks a lot! LGTM

@sunggg (Contributor) left a comment:

Maybe useful when we run the end-to-end model.

Review thread on python/tvm/meta_schedule/testing/tlcbench.py (outdated, resolved)
@masahi masahi merged commit 794e1e3 into apache:main Mar 16, 2022
pfk-beta pushed a commit to pfk-beta/tvm that referenced this pull request Apr 11, 2022
* add model loader for qat bert-base

* add test

* pylint

* ignore mypy

* Update python/tvm/meta_schedule/testing/tlcbench.py

Co-authored-by: Junru Shao <junrushao1994@gmail.com>

* use a dedicated process for converting

* return input info

* encode batch size and seq_len information in cached file path

Co-authored-by: Junru Shao <junrushao1994@gmail.com>
3 participants