
BladeDISC estimated inference improvement time? #37

Closed
danpf opened this issue Aug 29, 2022 · 2 comments
Comments

danpf commented Aug 29, 2022

Hello,
Just curious: in your README you mention that BladeDISC can be used to speed up inference. Do you have any examples of how much of an improvement it gives? I'm trying to decide whether it is worth the trouble of installing.

Thanks!

guolinke (Member) commented:

@yuchaoli can you share some benchmark numbers?

yuchaoli (Contributor) commented Sep 2, 2022

Thanks for your interest!
We have released our results at https://github.com/alibaba/BladeDISC/tree/main/examples/PyTorch/Inference/CUDA/AlphaFold

danpf closed this as completed Sep 4, 2022
PKUfjh pushed a commit to PKUfjh/Uni-Fold that referenced this issue May 17, 2024