lilujunai / Knowledge-Distillation-Experiments


Knowledge-Distillation-Experiments

  • Knowledge distillation experiments:
    • Distilling the Knowledge in a Neural Network (1)
    • Improved Knowledge Distillation via Teacher Assistant (2)
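As a rough illustration of method (1), the Hinton-style distillation loss combines a soft-target cross-entropy against the teacher's temperature-softened outputs (scaled by T²) with the usual hard-label cross-entropy. This is a minimal pure-Python sketch, not the repository's implementation; the function names, `temperature=4.0`, and `alpha=0.5` defaults are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Hinton-style KD loss for a single example:
    alpha * T^2 * CE(teacher soft targets, student soft predictions)
    + (1 - alpha) * CE(hard label, student predictions)."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student_soft = softmax(student_logits, temperature)
    soft_ce = -sum(t * math.log(s)
                   for t, s in zip(p_teacher, p_student_soft))
    p_student = softmax(student_logits)  # T = 1 for the hard-label term
    hard_ce = -math.log(p_student[true_label])
    # T^2 rescaling keeps soft-target gradients comparable across temperatures
    return alpha * temperature ** 2 * soft_ce + (1 - alpha) * hard_ce
```

In practice the same formula is applied per batch and averaged; a framework implementation would use tensor ops instead of Python lists.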

LSTM (1)

  • Dataset: Korean malicious comment detection dataset ('한국어 악성댓글 탐지 데이터셋')


Transformer (1), (2)

  • Dataset: Naver movie review sentiment analysis dataset ('네이버 영화평 감정분석 데이터셋')
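Method (2), Teacher Assistant KD (TAKD), narrows the capacity gap by distilling through one or more intermediate-sized assistant models rather than directly from the large teacher to the small student. A minimal sketch of that sequential chain, assuming a caller-supplied `train_step(student, teacher)` that runs one round of distillation (hypothetical names, not the repository's API):

```python
def takd_chain(models, train_step, rounds=1):
    """Distill along a chain of models ordered from largest to smallest:
    teacher -> assistant(s) -> student. Each model is trained against the
    previously distilled model, then serves as teacher for the next one."""
    teacher = models[0]  # assumed already trained on the task
    for student in models[1:]:
        for _ in range(rounds):
            train_step(student, teacher)  # e.g. optimize a KD loss
        teacher = student  # the freshly distilled model becomes the teacher
    return models[-1]  # final (smallest) student
```

For example, `takd_chain([teacher, assistant, student], step)` first distills the assistant from the teacher, then the student from the assistant.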


Languages

Language: Python 100.0%