The problem of knowledge distillation #14

Open
AlLIcEW opened this issue Nov 12, 2023 · 0 comments
Comments


AlLIcEW commented Nov 12, 2023

In the knowledge distillation part of this article, I ran the code and found that the accuracy before distillation was higher than the accuracy after distillation. The screenshot below is from one of my runs, but in every round the post-distillation accuracy was worse than the pre-distillation accuracy. Why does this happen? I would appreciate your answer.
[Screenshot: training log showing accuracy before distillation exceeding accuracy after each distillation round]
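
For context, here is a minimal sketch of the standard soft-target distillation loss (Hinton et al., 2015), assuming PyTorch; the function and parameter names (`distillation_loss`, `temperature`, `alpha`) are illustrative and not necessarily what this repository's code uses:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Standard soft-target distillation loss (Hinton et al., 2015)."""
    # Soft-target term: KL divergence between the temperature-softened
    # teacher and student distributions. Scaling by T^2 keeps the soft-loss
    # gradients on the same scale as the hard-loss gradients.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Typical use in a training loop: the teacher must be frozen so it
# contributes targets only, not gradients.
# teacher.eval()
# with torch.no_grad():
#     teacher_logits = teacher(inputs)
# loss = distillation_loss(student(inputs), teacher_logits, labels)
```

When debugging an accuracy drop like this, common checkpoints (general advice, not a diagnosis of this repository) include putting the teacher in `eval()` mode, computing its logits under `torch.no_grad()`, and keeping the `T^2` scaling on the KL term, since omitting it makes the soft-target gradients too weak to help the student.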
