This repo contains the PyTorch implementation for the paper "Conditional Automated Channel Pruning for Deep Neural Networks".
The current code base has been tested under the following environment:
- Python 3.7.3
- PyTorch 1.3.1
- CIFAR-10 dataset
Use the following command to install the dependencies:
pip install -r requirements.txt
The current code base supports automated pruning of ResNet-56 on CIFAR-10. Pruning ResNet-56 consists of two steps: 1. search for a pruning strategy and export the pruned weights; 2. fine-tune from the pruned weights.
To conduct the full pruning procedure, follow the instructions below:
- Strategy Search and Export the Pruned Weights
bash ./script/search_export_cacp.sh
Note: the checkpoints of the best compressed models under different target rates are automatically saved in the log folder.
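The search step decides how many channels to keep in each layer and then exports the corresponding pruned weights. As a rough illustration of the channel-selection idea it builds on, here is a minimal, self-contained sketch of magnitude-based channel selection; the function name, arguments, and ranking criterion are illustrative assumptions, not the repo's actual implementation:

```python
def select_channels(weights, keep_ratio):
    """Rank output channels by L1 norm and keep the top fraction.

    weights: list of per-channel filter weights (each a flat list of floats).
    keep_ratio: fraction of channels to preserve (1 - pruning rate).
    Returns indices of the kept channels, sorted ascending.
    NOTE: a simplified sketch, not the repo's search algorithm.
    """
    # L1 norm of each channel's weights serves as its importance score
    scores = [sum(abs(w) for w in ch) for ch in weights]
    n_keep = max(1, round(len(weights) * keep_ratio))
    # Indices of the n_keep highest-scoring channels
    ranked = sorted(range(len(weights)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:n_keep])
```

For example, `select_channels([[0.1, 0.2], [1.0, 1.0], [0.5, 0.5]], 2/3)` keeps the two channels with the largest L1 norms, returning `[1, 2]`.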
- Fine-tune from Pruned Weights
After searching and exporting, we fine-tune from the pruned weights. For example, we can fine-tune with the RL-step learning rate for 400 epochs by running:
bash ./script/finetune.sh
The following table shows the results we obtained (results may vary slightly from the paper due to random-seed differences):