ZhuYun97 / awesome-GNN-distillation

Papers on knowledge distillation for Graph Neural Networks

  1. [CVPR2020] Distilling Knowledge from Graph Convolutional Networks [paper][code]
  2. [IJCAI2021] On Self-Distilling Graph Neural Network [paper][code]
  3. [WWW2021] Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework [paper][code]
  4. [Arxiv2104] GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference [paper][code]
  5. [Arxiv2105] Graph-Free Knowledge Distillation for Graph Neural Networks [paper][code]
  6. [Arxiv2106] Distilling Self-Knowledge From Contrastive Links to Classify Graph Nodes Without Passing Messages [paper][code]
  7. [KDD2021] ROD: Reception-aware Online Distillation for Sparse Graphs [paper][code]
  8. [Arxiv2108] Transferring Knowledge Distillation for Multilingual Social Event Detection [paper][code]
  9. [Arxiv2108] Distilling Holistic Knowledge with Graph Neural Networks [paper][code]
  10. [Arxiv2110] Scalable Consistency Training for Graph Neural Networks via Self-Ensemble Self-Distillation [paper][code]
  11. [Arxiv2110] Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation [paper][code]
  12. [Arxiv2111] Cold Brew: Distilling Graph Node Representations with Incomplete or Missing Neighborhoods [paper][code]
  13. [Arxiv2111] On Representation Knowledge Distillation for Graph Neural Networks [paper][code]