This repository has been archived by the owner on May 1, 2023. It is now read-only.

Knowledge Distillation #90

Closed

HKLee2040 opened this issue Nov 30, 2018 · 2 comments

@HKLee2040

I read the "Knowledge Distillation" documentation at https://nervanasystems.github.io/distiller/schedule/index.html#knowledge-distillation.

Could you please give me a simple example of knowledge distillation?

@guyjacob
Contributor

guyjacob commented Dec 2, 2018

If you're looking for examples of how to use knowledge distillation in Distiller, I updated a couple of the sample schedules in the quantization samples directory with sample command lines and results: an FP32 sample and a quantized sample.
If you're looking for an example of where knowledge distillation might be used more generally, we have an introduction to the subject in our docs, with links to papers that combine knowledge distillation with other compression methods.
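For context, the core recipe behind those samples is the soft-target loss from Hinton et al., "Distilling the Knowledge in a Neural Network". Here's a minimal sketch in plain PyTorch (this is not Distiller's actual API; `temperature` and `alpha` are illustrative hyperparameters you'd tune):

```python
# Minimal knowledge-distillation loss sketch in plain PyTorch.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.7):
    # Soft-target term: KL divergence between the temperature-softened
    # teacher and student distributions, scaled by T^2 as in Hinton et al.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    # Blend the two; alpha controls how strongly the student follows the teacher.
    return alpha * soft + (1.0 - alpha) * hard
```

During training you'd run the same batch through both the frozen teacher and the student, then backpropagate this loss through the student only.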

@HKLee2040
Author

@guyjacob The quantization samples are what I need. Thanks!
