
Awesome-Knowledge-Distillation-in-Recommendation-System

A curated collection of papers on knowledge distillation in recommendation systems.

Continuously updated.
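For orientation, most of the papers below share the classic teacher-student recipe: a large teacher model's soft predictions supervise a compact student alongside the ground-truth interactions. A minimal PyTorch sketch of that generic objective (the function name, temperature, and blending weight are illustrative assumptions, not taken from any paper listed here):

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # student_logits, teacher_logits: (batch, n_items) scores over items.
    # labels: (batch, n_items) multi-hot float tensor of observed interactions.
    # Hard term: fit the observed user-item interactions.
    hard = F.binary_cross_entropy_with_logits(student_logits, labels)
    # Soft term: match the teacher's tempered score distribution over items;
    # the T^2 factor keeps its gradient scale comparable (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft
```

The entries below differ mainly in what the student matches: soft scores, ranked item lists, embeddings, or structured knowledge.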

2016

Improved recurrent neural networks for session-based recommendations. Tan, Yong Kiam, Xinxing Xu, and Yong Liu. RecSys

2017

Improving session recommendation with recurrent neural networks by exploiting dwell time. Dallmann, Alexander, et al. arXiv preprint arXiv:1706.10231

2018

Adversarial distillation for efficient recommendation with external knowledge. Chen, Xu, et al. TOIS

Neural compatibility modeling with attentive knowledge distillation. Song, Xuemeng, et al. SIGIR

Ranking distillation: Learning compact ranking models with high performance for recommender system. Tang, Jiaxi, and Ke Wang. SIGKDD
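Ranking Distillation (the entry above) is a frequently cited baseline in this area, so a rough point-wise sketch of its core idea may help: the student fits the ground-truth positives and is additionally pushed to score the teacher's top-K unobserved items highly. Function and argument names are illustrative, and the simple weighting here stands in for the paper's more elaborate position-aware schemes:

```python
import torch.nn.functional as F

def ranking_distillation_loss(student_scores, pos_idx, teacher_topk_idx,
                              topk_weights, lam=0.5):
    # student_scores: (batch, n_items) student scores over the item catalog.
    # pos_idx: (batch, 1) indices of observed ground-truth items.
    # teacher_topk_idx: (batch, K) the teacher's top-K unobserved items.
    # topk_weights: (batch, K) per-position importance weights.
    # Ranking loss on the observed positives.
    pos_loss = -F.logsigmoid(student_scores.gather(1, pos_idx)).mean()
    # Distillation term: treat the teacher's top-K as weighted extra positives.
    topk_scores = student_scores.gather(1, teacher_topk_idx)
    distill = -(topk_weights * F.logsigmoid(topk_scores)).sum(dim=1).mean()
    return pos_loss + lam * distill
```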

2019

A novel enhanced collaborative autoencoder with knowledge distillation for top-N recommender systems. Pan, Yiteng, Fazhi He, and Haiping Yu. Neurocomputing

Binarized collaborative filtering with distilling graph convolutional networks. Wang, Haoyu, Defu Lian, and Yong Ge. arXiv preprint arXiv:1906.01829

Collaborative Distillation for Top-N Recommendation. Lee, Jae-woong, et al. ICDM

2020

A General Knowledge Distillation Framework for Counterfactual Recommendation via Uniform Data. Liu, Dugang, et al. SIGIR

Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning. Liu, Xi, et al. arXiv preprint arXiv:2001.09595

Distilling structured knowledge into embeddings for explainable and accurate recommendation. Zhang, Yuan, et al. WSDM

LightRec: A memory and search-efficient recommender system. Lian, Defu, et al. WWW

Privileged Features Distillation at Taobao Recommendations. Xu, Chen, et al. SIGKDD
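Privileged Features Distillation (the Taobao entry above) takes a different angle: the teacher consumes privileged features that exist only offline (e.g. post-click dwell time), while the student sees only serving-time features and additionally fits the teacher's predictions. A minimal sketch under that reading; the names and the MSE matching term are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def privileged_distillation_loss(student_logits, teacher_logits, labels,
                                 beta=1.0):
    # student_logits: predictions from serving-time features only.
    # teacher_logits: predictions from serving-time + privileged features.
    # Hard term: the student fits the observed labels.
    hard = F.binary_cross_entropy_with_logits(student_logits, labels)
    # Soft term: the student matches the teacher; detach() stops gradients
    # from flowing into the teacher, mirroring the stop-gradient used when
    # teacher and student are trained jointly.
    soft = F.mse_loss(torch.sigmoid(student_logits),
                      torch.sigmoid(teacher_logits).detach())
    return hard + beta * soft
```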
