If you're looking for examples of how to use knowledge distillation in Distiller, I updated a couple of the sample schedulers in the quantization samples directory with sample command lines and results: an FP32 sample and a quantized sample.
If you're looking for an example of where knowledge distillation might be used in general, we have a nice introduction to the subject in our docs, with links to papers that combine knowledge distillation with other compression methods.
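To give a concrete feel for what the distillation loss itself computes, here is a minimal, generic PyTorch sketch. This is not Distiller's actual API (Distiller hooks distillation into training through its scheduling mechanism, as the docs above describe); the function name, default temperature, and loss weights below are illustrative assumptions:

```python
# Generic knowledge-distillation loss: a weighted sum of the usual
# cross-entropy on hard labels and a KL-divergence term between
# temperature-softened teacher and student logits.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, distill_weight=0.7):
    """Hypothetical helper; names and defaults are illustrative only."""
    # Soften both distributions with the temperature before comparing them.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures
    # (Hinton et al., "Distilling the Knowledge in a Neural Network").
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return distill_weight * soft_loss + (1.0 - distill_weight) * hard_loss

# Inside a training loop, the teacher runs in eval mode without gradients:
#   teacher.eval()
#   with torch.no_grad():
#       teacher_logits = teacher(inputs)
#   loss = distillation_loss(student(inputs), teacher_logits, labels)
#   loss.backward()
```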
I have read the "Knowledge Distillation" page at https://nervanasystems.github.io/distiller/schedule/index.html#knowledge-distillation.
Could you please give me a simple example of knowledge distillation?