Unofficial PyTorch implementation of pruning VGG on the CIFAR-10 data set
Reference: Pruning Filters for Efficient ConvNets, ICLR 2017
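The referenced paper ranks the filters of each convolution layer by the L1 norm of their weights and removes the weakest ones. A minimal sketch of that criterion in plain Python (an illustration of the idea, not this repo's actual code; the function names are made up here):

```python
# Sketch of the paper's filter-ranking criterion: for each conv layer,
# rank filters by the L1 norm of their weights and mark the
# smallest-norm filters for removal.

def l1_norm(filter_weights):
    """Sum of absolute weights of one (flattened) filter."""
    return sum(abs(w) for w in filter_weights)

def select_filters_to_prune(layer_filters, n_prune):
    """Return indices of the n_prune filters with the smallest L1 norm."""
    norms = sorted((l1_norm(f), i) for i, f in enumerate(layer_filters))
    return sorted(i for _, i in norms[:n_prune])

# Toy layer with 4 "filters" (flattened weight lists)
layer = [
    [0.1, -0.1, 0.05],   # L1 norm 0.25 -> weakest
    [1.0, -2.0, 0.5],    # L1 norm 3.5
    [0.3, 0.2, -0.1],    # L1 norm 0.6
    [0.9, 0.8, 0.7],     # L1 norm 2.4
]
print(select_filters_to_prune(layer, 2))  # -> [0, 2]
```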
- torch (version: 1.2.0)
- torchvision (version: 0.4.0)
- Pillow (version: 6.1.0)
- matplotlib (version: 3.1.1)
- numpy (version: 1.16.5)
- --train-flag : Train VGG on the CIFAR-10 data set
- --save-path : Path to save results, ex) trained_models/
- --load-path : Path to load a checkpoint; append 'checkpoint.pth' to save-path, ex) trained_models/checkpoint.pth
- --resume-flag : Resume training from the checkpoint loaded with load-path
- --prune-flag : Prune VGG
- --prune-layers : List of target convolution layers for pruning, ex) conv1 conv2
- --prune-channels : Number of channels to prune from each layer in prune-layers, ex) 4 14
- --independent-prune-flag : Prune multiple layers with the independent strategy
- --retrain-flag : Retrain the pruned network
- --retrain-epoch : Number of epochs for retraining the pruned network
- --retrain-lr : Learning rate for retraining the pruned network
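The flags above suggest an argparse interface along these lines. This is a sketch of how the documented options map to an argument parser; the repo's actual parser and defaults may differ:

```python
import argparse

def build_parser():
    # Mirrors the documented CLI flags; defaults here are assumptions.
    p = argparse.ArgumentParser(description="Prune VGG on CIFAR-10")
    p.add_argument("--train-flag", action="store_true", help="train VGG on CIFAR-10")
    p.add_argument("--save-path", type=str, default="trained_models/")
    p.add_argument("--load-path", type=str, default=None,
                   help="checkpoint to load, ex) trained_models/checkpoint.pth")
    p.add_argument("--resume-flag", action="store_true")
    p.add_argument("--prune-flag", action="store_true")
    p.add_argument("--prune-layers", nargs="+", default=None, help="ex) conv1 conv2")
    p.add_argument("--prune-channels", nargs="+", type=int, default=None, help="ex) 4 14")
    p.add_argument("--independent-prune-flag", action="store_true")
    p.add_argument("--retrain-flag", action="store_true")
    p.add_argument("--retrain-epoch", type=int, default=20)
    p.add_argument("--retrain-lr", type=float, default=0.001)
    return p

# Example: prune conv1/conv2 by 4/14 channels, then retrain with lr 0.01
args = build_parser().parse_args(
    ["--prune-flag", "--prune-layers", "conv1", "conv2",
     "--prune-channels", "4", "14", "--retrain-flag", "--retrain-lr", "0.01"]
)
print(args.prune_layers, args.prune_channels, args.retrain_lr)
```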
You can follow the filter-pruning procedure in the .ipynb file, switching the network to the corresponding network file: drop_out_network.py or VGG_19_network.py.
The quantization.py file applies quantization to the original VGG-16 model. We also created a new network to fit the pruned and retrained network into the original model, but this attempt unfortunately failed.
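For intuition, uniform symmetric quantization maps each float weight to a signed integer with a single per-tensor scale. This pure-Python sketch illustrates the general idea only; it is not the code in quantization.py:

```python
# Sketch of uniform (symmetric) weight quantization: one shared scale
# maps floats into the signed int8 range and back.

def quantize(weights, num_bits=8):
    """Map floats to signed integers with a single per-tensor scale."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / qmax if max_abs else 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.5, -1.27, 0.01, 1.0]
q, s = quantize(w)
w_hat = dequantize(q, s)
# Reconstruction error is bounded by half a quantization step (scale / 2)
assert all(abs(a - b) <= s / 2 + 1e-12 for a, b in zip(w, w_hat))
```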
https://drive.google.com/drive/folders/1vxF0YL90rF_19sQyHhwGfC_WtgVl7NmW?usp=drive_link