Topology Distillation for Recommender System

This repository provides the source code of "Topology Distillation for Recommender System", accepted to KDD 2021 as a research paper.

1. Overview

We develop a general topology distillation approach for recommender systems. Topology distillation guides the learning of the student model with the topological structure built upon the relational knowledge in the teacher model's representation space.

Concretely, we propose two topology distillation methods:

  1. Full Topology Distillation (FTD). FTD transfers the full topology and is used when the student has enough capacity to learn all of the teacher's knowledge.
  2. Hierarchical Topology Distillation (HTD). HTD transfers the decomposed topology hierarchically and is adopted in the typical KD scenario where the student has a very limited capacity compared to the teacher.

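To make the idea concrete, below is a minimal PyTorch sketch of both methods under assumed names. It is an illustration only, not the formulation or API of this repository; `topology`, `ftd_loss`, `htd_loss`, and the group `assignment` tensor are hypothetical.

```python
import torch
import torch.nn.functional as F

def topology(emb):
    # Topological structure: pairwise cosine similarities between all
    # entity representations (users/items) in the given embedding matrix.
    emb = F.normalize(emb, dim=1)
    return emb @ emb.t()

def ftd_loss(teacher_emb, student_emb):
    # FTD (sketch): the student matches the teacher's full similarity
    # structure over all entities at once.
    with torch.no_grad():
        t_topo = topology(teacher_emb)
    return F.mse_loss(topology(student_emb), t_topo)

def htd_loss(teacher_emb, student_emb, assignment):
    # HTD (sketch): the topology is decomposed hierarchically into
    # (1) group-level relations among per-group summaries and
    # (2) entity-level relations within each group.
    # `assignment` is a hypothetical LongTensor mapping each entity to a
    # group id in [0, num_groups); every group is assumed non-empty.
    num_groups = int(assignment.max().item()) + 1
    t_proto = torch.stack([teacher_emb[assignment == g].mean(0) for g in range(num_groups)])
    s_proto = torch.stack([student_emb[assignment == g].mean(0) for g in range(num_groups)])
    loss = F.mse_loss(topology(s_proto), topology(t_proto).detach())
    for g in range(num_groups):
        idx = (assignment == g).nonzero(as_tuple=True)[0]
        if idx.numel() < 2:
            continue
        loss = loss + F.mse_loss(topology(student_emb[idx]),
                                 topology(teacher_emb[idx]).detach())
    return loss
```

How the groups are formed and how relations are weighted follow the paper and the actual implementation in this repository.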

2. Main Results

  • When the capacity of the student model is highly limited, the student learns best with HTD.


  • As the capacity gap between the teacher and the student decreases, the student benefits more from FTD.


3. Requirements

  • Python version: 3.6.10
  • PyTorch version: 1.5.0
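
For reference, a matching environment can be set up along these lines (an assumed command, not an official instruction from the authors):

```
pip install torch==1.5.0
```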

4. How to Run

Please refer to the 'Guide to using topology distillation.ipynb' notebook.

5. License

GNU General Public License v2.0

