Hi, will you open-source the distillation training code as well? #2
Comments
Hi, thank you! Personally, we as the authors want to fully release the code. However, part of it might fall under our company's proprietary ownership. We would like to clear this up first so it's fully, safely free for everyone to use. I'll keep the community posted about the final availability of the distillation code in this issue.
@rendchevi Any good news?
I am very excited.
+1 Can't wait to play with it.
@rendchevi Hello, it's been a long time. Will you release it?
Hello, waiting for the release. Could you confirm the date?
@rendchevi Any update here?
@rendchevi Are you planning on releasing the training code anytime soon? I would like to know the current status, if possible.
Nix is quite impressive. I tried it; it's fast and natural compared with models at the same parameter level.
However, it seems the distillation part is not open-sourced. I'm wondering whether that part could be made available, so that users can compress their own trained models.