Shahabks / FewShotNLP


Meta-pretraining Then Meta-learning (MTM) Model for FewShot NLP Tasks


The source codes of the paper "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".
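Few-shot text classification of this kind is usually trained and evaluated on N-way K-shot episodes: each episode samples a small set of classes, a few labeled support examples per class, and held-out query examples. As a rough illustration only (this sketch is not taken from the repository's code, and the function and parameter names are hypothetical), an episode sampler might look like:

```python
import random

def sample_episode(data_by_label, n_way=3, k_shot=2, q_query=2, seed=None):
    """Sample one N-way K-shot episode: a support set and a query set.

    data_by_label: dict mapping label -> list of examples (e.g. sentences).
    Returns (support, query), each a list of (example, episode_label) pairs,
    where episode_label is the class index within this episode (0..n_way-1).
    """
    rng = random.Random(seed)
    # Pick N classes for this episode.
    classes = rng.sample(sorted(data_by_label), n_way)
    support, query = [], []
    for episode_label, cls in enumerate(classes):
        # Draw K support + Q query examples for this class without replacement.
        examples = rng.sample(data_by_label[cls], k_shot + q_query)
        support += [(x, episode_label) for x in examples[:k_shot]]
        query += [(x, episode_label) for x in examples[k_shot:]]
    return support, query
```

In a meta-learning loop, the model would adapt on the support set of each episode and be evaluated on the corresponding query set.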

If you use the code, please cite the following papers:

@inproceedings{zhang2019fewshot,
  title={Improving Few-shot Text Classification via Pretrained Language Representations},
  author={Ningyu Zhang and Zhanlin Sun and Shumin Deng and Jiaoyan Chen and Huajun Chen},
  year={2019}
}

@inproceedings{zhang2019mtm,
  title={When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification},
  author={Shumin Deng and Ningyu Zhang and Zhanlin Sun and Jiaoyan Chen and Huajun Chen},
  year={2019}
}


