This repository contains the source code submitted to the Dacon competition 월간 데이콘 컴퓨터 비전 학습 경진대회 (Monthly Dacon Computer Vision Learning Competition).
In this repo, I applied Knowledge Distillation to model the MNIST data.
Model spec:
- Teacher model: Wide ResNet-101-2
- Student model: ResNet-50
As a result, I ranked 55th out of 396 (top 14%).
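For readers unfamiliar with the technique, below is a minimal sketch of how a teacher/student pair like the one above can be trained with the standard softened-softmax distillation loss (Hinton et al., 2015). The temperature, loss weighting, input size, and training step are illustrative assumptions, not the repo's exact settings.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Hypothetical hyperparameters -- the repo's actual values may differ.
T = 4.0       # softmax temperature for softening logits
ALPHA = 0.7   # weight on the distillation term

def distillation_loss(student_logits, teacher_logits, labels, temp=T, alpha=ALPHA):
    """KD loss: KL divergence between softened distributions plus hard-label CE."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temp, dim=1),
        F.softmax(teacher_logits / temp, dim=1),
        reduction="batchmean",
    ) * (temp * temp)  # rescale gradients, per Hinton et al.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Teacher and student as named in the model spec; num_classes=10 for MNIST digits.
# In practice the teacher would be trained first; here it is freshly initialized.
teacher = models.wide_resnet101_2(num_classes=10).eval()
student = models.resnet50(num_classes=10)

# One illustrative training step on dummy 3-channel inputs
# (grayscale MNIST images would need to be expanded to 3 channels for ResNets).
x = torch.randn(8, 3, 224, 224)
y = torch.randint(0, 10, (8,))
with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, y)
loss.backward()
```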
First, install the required libraries with this command:
pip install -r requirements.txt
To run the code, use this command:
python ./src/main.py
Dependencies:
- Data handling: torchvision, pandas, Pillow
- Modeling: PyTorch
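Given the dependency list above, the requirements.txt referenced earlier presumably looks like the sketch below; the repo's exact version pins (if any) are unknown, so none are shown.

```
torch
torchvision
pandas
Pillow
```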
Apache License © Hee Seung Yun