
Knowledge Distillation for UNet

An implementation of knowledge distillation for segmentation that trains a small (student) UNet from a larger (teacher) UNet, reducing the size of the network while achieving performance similar to that of the heavier model.
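
For orientation, below is a minimal sketch of a typical distillation objective for segmentation: a per-pixel KL-divergence term against the teacher's temperature-softened logits, combined with the standard cross-entropy loss on the ground-truth mask. The function name and the `temperature`/`alpha` values are illustrative assumptions, not taken from this repository's code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, target,
                      temperature=4.0, alpha=0.5):
    """Combine a soft KL term against the teacher with the hard-label loss.

    student_logits, teacher_logits: (N, C, H, W) per-pixel class logits.
    target: (N, H, W) integer ground-truth mask.
    temperature and alpha are illustrative hyperparameters, not this
    repository's actual settings.
    """
    # Soften both distributions with the temperature; detach the teacher
    # so no gradients flow into the frozen teacher network.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits.detach() / temperature, dim=1)

    # KL divergence, rescaled by T^2 (as in Hinton et al.) so its gradient
    # magnitude stays comparable to the hard-label term.
    kd = F.kl_div(soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2

    # Standard per-pixel cross-entropy against the ground-truth mask.
    ce = F.cross_entropy(student_logits, target)

    return alpha * kd + (1.0 - alpha) * ce
```

At training time the teacher runs in eval mode under `torch.no_grad()`, and only the student's parameters are updated with this combined loss.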

Results:

Dataset: Carvana Image Masking Challenge

[Figure] Models trained without knowledge distillation

[Figure] Models trained with knowledge distillation
