There are 3 repositories under the graph-knowledge-distillation topic.
Code for the AAAI 2023 (Oral) paper "Extracting Low-/High-Frequency Knowledge from Graph Neural Networks and Injecting it into MLPs: An Effective GNN-to-MLP Distillation Framework"
Code for the ICML 2023 paper "Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs"
Code for the TKDE paper "A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation"