An implementation of "Triplet Loss for Knowledge Distillation" in PyTorch.
The paper is available at https://arxiv.org/abs/2004.08116
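The core idea is a triplet-style distillation loss that pulls the student's output toward the teacher's output for the same input (positive) while pushing it away from a mismatched teacher output (negative). The sketch below is a minimal illustration of that idea, not the exact loss from the paper or this repo; the function name, the choice of Euclidean distance, and the margin value are all assumptions.

```python
import torch
import torch.nn.functional as F

def triplet_distillation_loss(student_out, teacher_pos, teacher_neg, margin=0.2):
    """Hypothetical sketch of a triplet loss for distillation.

    anchor   = student logits for an input
    positive = teacher logits for the same input
    negative = teacher logits for a different (e.g. other-class) input
    """
    d_pos = F.pairwise_distance(student_out, teacher_pos)  # pull toward teacher
    d_neg = F.pairwise_distance(student_out, teacher_neg)  # push from mismatch
    # Standard triplet hinge: zero loss once the positive is closer
    # than the negative by at least `margin`.
    return torch.clamp(d_pos - d_neg + margin, min=0).mean()

# Toy usage with random logits (batch of 4, 10 classes):
s = torch.randn(4, 10)
loss = triplet_distillation_loss(s, torch.randn(4, 10), torch.randn(4, 10))
```

The loss is non-negative by construction and vanishes when the student already matches the teacher much more closely than the mismatched negative.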
- clone all the files into a directory
- run example.ipynb in Jupyter (if you don't use Jupyter, run example.py instead)
- run teacher.ipynb to get the scores of the teacher model
- OS : Ubuntu 18.04.3 LTS
- Python : v 3.7.3
- PyTorch : v 1.2.0
- torchvision : v 0.4.0