CPF

The official code of the WWW 2021 paper: Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework

Paper: https://arxiv.org/pdf/2103.02885.pdf


Getting Started

Requirements

  • Python version >= 3.6
  • PyTorch version >= 1.7.1
  • DGL
  • Optuna (optional)

Usage

Quick start

  1. Run `python train_dgl.py --dataset=XXX --teacher=XXX` to train the teacher model.
  2. Run `python spawn_worker.py --dataset=XXX --teacher=XXX` to train the student model. We provide the hyper-parameter settings reported in our paper, as well as an AutoML version for hyper-parameter search: our code supports Optuna, and you can add `--automl` to search for the best knowledge-distillation hyper-parameters.
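For concreteness, a hypothetical end-to-end run might look like the following. The dataset and teacher names (`cora`, `GCN`) are guesses based on the results table below; check each script's argument parser for the accepted values.

```shell
# Train the teacher (GCN) on Cora, then distill into a student.
# Dataset/teacher spellings are assumptions; see the scripts' --help.
python train_dgl.py --dataset=cora --teacher=GCN
python spawn_worker.py --dataset=cora --teacher=GCN

# Optionally, let Optuna search the distillation hyper-parameters.
python spawn_worker.py --dataset=cora --teacher=GCN --automl
```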

Add your own datasets

You can add your own datasets to the `data` folder; their format should follow DGL's requirements.
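The exact on-disk layout is determined by this repo's data loader, which is not reproduced here. As a hypothetical sketch, a DGL node-classification dataset typically bundles an edge list, node features, labels, and train/val/test masks (all names below are illustrative):

```python
import numpy as np

# Illustrative ingredients of a node-classification dataset in the shape
# DGL expects; the concrete file names and format used by this repo's
# loader are assumptions, not its actual API.
num_nodes, num_feats, num_classes = 100, 16, 4
rng = np.random.default_rng(0)

dataset = {
    # Edge list in COO form; dgl.graph((src, dst)) can consume these.
    "src": rng.integers(0, num_nodes, size=500),
    "dst": rng.integers(0, num_nodes, size=500),
    # Dense float32 node features and integer class labels.
    "feat": rng.standard_normal((num_nodes, num_feats)).astype(np.float32),
    "label": rng.integers(0, num_classes, size=num_nodes),
    # Boolean masks selecting the train/val/test nodes.
    "train_mask": np.arange(num_nodes) < 60,
    "val_mask": (np.arange(num_nodes) >= 60) & (np.arange(num_nodes) < 80),
    "test_mask": np.arange(num_nodes) >= 80,
}
```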

Add your own models

You can add your own teacher or student models by placing them in the `models` folder and following the interface of the existing models.
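As a minimal sketch (not the actual interface used by the training scripts), a feature-based MLP student in PyTorch could look like this; the class and argument names are illustrative:

```python
import torch
import torch.nn as nn


class MLPStudent(nn.Module):
    """Hypothetical student model: a simple MLP over node features.

    The real student models in this repo may differ; this only shows the
    usual pieces a model needs (dims in the constructor, a forward pass
    that returns per-node logits).
    """

    def __init__(self, in_dim, hidden_dim, num_classes, dropout=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, features):
        # Returns per-node logits; the distillation loss (e.g. soft
        # cross-entropy against the teacher's outputs) is applied by the
        # training loop, not inside the model.
        return self.net(features)
```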

Results

Below are results with a GCN teacher model on different datasets and student variants. More results can be found in our paper.

| Dataset     | GCN (Teacher) | CPF-ind (Student) | CPF-tra (Student) | Improvement |
|-------------|---------------|-------------------|-------------------|-------------|
| Cora        | 0.8244        | 0.8576            | 0.8567            | 4.0%        |
| Citeseer    | 0.7110        | 0.7619            | 0.7652            | 7.6%        |
| Pubmed      | 0.7804        | 0.8080            | 0.8104            | 3.8%        |
| A-Computers | 0.8318        | 0.8443            | 0.8443            | 1.5%        |
| A-Photo     | 0.9072        | 0.9317            | 0.9248            | 2.7%        |

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{yang2021extract,
  title={Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework},
  author={Cheng Yang and Jiawei Liu and Chuan Shi},
  booktitle={Proceedings of The Web Conference 2021 (WWW ’21)},
  publisher={ACM},
  year={2021}
}

Contact Us

Please open an issue or contact Liu_Jiawei@bupt.edu.cn with any questions.
