zhaowangji / knowledge-distillation-for-unet

This project is a fork of vaticancameos99/knowledge-distillation-for-unet.
An implementation of knowledge distillation for segmentation: a small (student) UNet is trained from a larger (teacher) UNet, reducing the size of the network while achieving performance close to that of the heavier model.
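The core of this approach is a combined loss: the student is supervised both by the ground-truth masks (cross-entropy) and by the teacher's temperature-softened per-pixel class distributions (KL divergence). Below is a minimal NumPy sketch of such a pixel-wise distillation loss; the function name, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not taken from this repository (which presumably implements the loss in its training framework).

```python
import numpy as np

def softmax(x, axis=1):
    # Numerically stable softmax along the class axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hypothetical pixel-wise distillation loss for segmentation.

    student_logits, teacher_logits: (N, C, H, W) class scores.
    labels: (N, H, W) integer ground-truth masks.
    T: temperature that softens the teacher's distribution.
    alpha: weight between the distillation and supervised terms.
    """
    # Softened class distributions at temperature T
    p_teacher = softmax(teacher_logits / T, axis=1)
    p_student = softmax(student_logits / T, axis=1)
    # KL(teacher || student) per pixel; the T**2 factor keeps gradient
    # magnitudes comparable across temperatures (Hinton et al. convention)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=1)
    distill = (T ** 2) * kl.mean()
    # Standard cross-entropy of the student against the hard labels
    p = softmax(student_logits, axis=1)
    n, c, h, w = student_logits.shape
    bi, hi, wi = np.ogrid[:n, :h, :w]
    ce = -np.log(p[bi, labels, hi, wi] + 1e-12).mean()
    return alpha * distill + (1 - alpha) * ce
```

With `alpha=1.0` the student only imitates the teacher; with `alpha=0.0` it reduces to ordinary supervised training, so `alpha` interpolates between the two regimes.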