
Distiller


A clean PyTorch implementation for running quick knowledge distillation experiments. Our findings are presented in our paper "The State of Knowledge Distillation for Classification".

Python Dependencies

This codebase requires Python 3.6+.

Required Python packages:

  • torch, torchvision, tqdm, numpy, pandas, seaborn

All packages can be installed using pip3 install --user -r requirements.txt.

This project is also integrated with PyTorch Lightning. Use the lightning branch for PyTorch Lightning-compatible code.

Run

The benchmarks can be run by invoking python3 evaluate_kd.py with the appropriate command-line parameters. For example:

python3 evaluate_kd.py --epochs 200 --teacher resnet18 --student resnet8 --dataset cifar10 --teacher-checkpoint pretrained/resnet18_cifar10_95260_parallel.pth --mode nokd kd

This runs plain student training and knowledge distillation for 200 epochs using a pretrained teacher. Checkpoints for several models are provided in the pretrained folder.

Supported distillation modes

  • nokd (Baseline Accuracy): Plain training with no knowledge distillation.
  • kd (Baseline Knowledge Distillation Accuracy): Hinton loss to distill a student network (see the loss sketch below this list).
  • allkd (Ensemble of Teachers): Distill from a list of teacher models and pick the best-performing one.
  • kdparam (Hyperparameter Tuning of the Hinton Loss): Distill using varying combinations of temperature and alpha and pick the best-performing combination.
  • triplet (Triplet Loss): Knowledge distillation with a triplet loss that uses the student as the negative example.
  • multikd (Ensemble of Students): Train a student under an ensemble of students picked from a list.
  • uda (Unsupervised Data Augmentation Loss): Run knowledge distillation in combination with unsupervised data augmentation.
  • takd (Teacher Assistant Knowledge Distillation): Run distillation using a teacher assistant.
  • ab (Activation Boundary Distillation): Run feature distillation using activation-boundary distillation.
  • oh (Overhaul Distillation): Run feature distillation using feature-overhaul distillation.
  • rkd (Relational Knowledge Distillation): Run distillation using relational knowledge distillation.
  • pkd (Patient Knowledge Distillation): Run feature distillation using patient knowledge distillation.
  • sfd (Simple Feature Distillation): Run a custom feature distillation that simply pools and flattens feature layers.
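For reference, the kd mode's Hinton loss combines a temperature-softened KL term with the usual hard-label cross-entropy. The following is a generic sketch of that loss, not the repository's exact implementation; alpha and temperature follow the usual conventions from the original paper:

    import torch
    import torch.nn.functional as F

    def hinton_kd_loss(student_logits, teacher_logits, labels, alpha=0.9, temperature=4.0):
        """Classic Hinton knowledge distillation loss (a generic sketch)."""
        # Soft targets: KL divergence between temperature-softened student
        # and teacher distributions, scaled by T^2 to keep gradient
        # magnitudes comparable across temperatures.
        soft_loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)
        # Hard targets: standard cross-entropy against the ground-truth labels.
        hard_loss = F.cross_entropy(student_logits, labels)
        return alpha * soft_loss + (1.0 - alpha) * hard_loss

    # Example usage: a batch of 8 samples with 10 classes.
    s = torch.randn(8, 10)
    t = torch.randn(8, 10)
    y = torch.randint(0, 10, (8,))
    loss = hinton_kd_loss(s, t, y)

alpha weights the soft-target term against the hard-label term; these are exactly the two hyperparameters the kdparam mode sweeps over.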

Results

[Figures: benchmark results]

distiller's People

Contributors

fruffy, imirzadeh, karanchahal


distiller's Issues

Using distillation on a different dataset with a trained teacher

Thanks for this amazing work. The current command-line example shows how to create a model using the kd process and a single dataset, i.e., CIFAR-10. However, I am trying to create a student model (with its own dataset) via distillation from a teacher model that was trained on a very similar but different dataset. Any guidance on how to accomplish this would be greatly appreciated.
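In principle this amounts to loading the teacher checkpoint, freezing it, and pointing the training loop at your own DataLoader; the two datasets must share a compatible label space. A rough sketch under those assumptions, reusing the hinton_kd_loss sketch above (the model factories and dataset class here are hypothetical, and the repository's actual wiring in evaluate_kd.py may differ):

    import torch
    from torch.utils.data import DataLoader

    # Hypothetical imports: the actual module paths in this repo may differ.
    from models import resnet18, resnet8   # assumed model constructors
    from my_data import MyCustomDataset    # your own dataset, not part of this repo

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Load the teacher trained on the original, similar dataset and freeze it.
    teacher = resnet18(num_classes=10).to(device)
    teacher.load_state_dict(
        torch.load("pretrained/resnet18_cifar10_95260_parallel.pth", map_location=device)
    )
    teacher.eval()

    student = resnet8(num_classes=10).to(device)
    loader = DataLoader(MyCustomDataset(train=True), batch_size=128, shuffle=True)
    optimizer = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)

    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        with torch.no_grad():               # the teacher only provides soft targets
            teacher_logits = teacher(images)
        loss = hinton_kd_loss(student(images), teacher_logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()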

Is it possible to keep the learning rate constant?

I am trying to run some experiments comparing the effects of different parameters.
Is it possible to avoid the reduction of the learning rate?
I saw something related in optimizer.py, but I would like to be sure... Thank you!
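For context, holding the learning rate constant in PyTorch just means constructing the optimizer without a decay schedule, or attaching a scheduler that never changes the rate. A generic sketch, independent of this repository's optimizer.py (the stand-in model is for illustration only):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)  # stand-in model for illustration

    # Simplest option: build the optimizer and never attach a scheduler;
    # the learning rate then stays at its initial value for all epochs.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

    # If the training loop unconditionally calls scheduler.step(), a LambdaLR
    # with a constant multiplier of 1.0 also keeps the rate fixed.
    scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda epoch: 1.0)
    scheduler.step()
    assert optimizer.param_groups[0]["lr"] == 0.01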

Pytorch lightning?

Very thorough and useful work!

I have not seen pytorch-lightning either in requirements.txt or in the code.

Do you plan to refactor it to use the Lightning Trainer and LightningModule?

Trouble training teachers

Hi!
I am trying to train some of the networks (the largest ones) in Colab due to a shortage of CUDA memory.
Is there any way to use these trained models as pretrained ones?
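Making a Colab-trained network usable as a teacher checkpoint generally comes down to saving its state_dict and loading it with a CPU map_location. A generic sketch (file names and the stand-in model are illustrative, not this repository's API):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)  # stand-in for one of the large teacher networks

    # In Colab, after training finishes: save only the weights, not the module.
    torch.save(model.state_dict(), "teacher_colab.pth")

    # Locally: load onto CPU to avoid CUDA device mismatches, then pass the
    # file path via --teacher-checkpoint.
    state = torch.load("teacher_colab.pth", map_location="cpu")

    # If the network was wrapped in nn.DataParallel during training, its keys
    # are prefixed with "module." and need stripping before loading.
    state = {k.replace("module.", "", 1): v for k, v in state.items()}
    model.load_state_dict(state)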
