ankit7856 / eacl2021-OffensEval-Dravidian

EACL 2021 paper (SJ_AJ@DravidianLangTech-EACL2021: Task-Adaptive Pre-Training of Multilingual BERT models for Offensive Language Identification)

EACL 2021 OffensEval in Dravidian Languages

Leaderboard (Team SJ_AJ)

| Language  | Rank | Precision | Recall | F1   |
|-----------|------|-----------|--------|------|
| Kannada   | 1    | 0.73      | 0.78   | 0.75 |
| Malayalam | 2    | 0.97      | 0.97   | 0.96 |
| Tamil     | 3    | 0.75      | 0.79   | 0.76 |
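
The numbers above can be reproduced from a submission file with standard weighted-average metrics. A minimal sketch is below, assuming weighted averaging over classes; the label strings and in-memory lists are illustrative placeholders, not files from this repo.

```python
# Illustrative sketch: weighted precision/recall/F1 over predicted labels.
# The label names below are placeholders for the shared-task classes.
from sklearn.metrics import precision_recall_fscore_support

gold = ["Not_offensive", "Offensive_Targeted_Insult_Individual", "Not_offensive"]
pred = ["Not_offensive", "Not_offensive", "Not_offensive"]

precision, recall, f1, _ = precision_recall_fscore_support(
    gold, pred, average="weighted", zero_division=0
)
print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f}")
```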

Data & Models

Pretrained Models

  • Pretrained models, submission files, and training checkpoints can be downloaded from this drive repo.
  • Scripts for task-adaptive pretraining are located in ./pretraining (a minimal sketch of the idea is shown below).
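
The sketch below shows the general shape of task-adaptive pretraining: continuing masked-language-model training of multilingual BERT on unlabeled task text using Hugging Face Transformers. The file path `task_corpus.txt`, the output directory, and the hyperparameters are illustrative assumptions; the released scripts in ./pretraining are the authoritative implementation.

```python
# Minimal sketch of task-adaptive MLM pretraining of mBERT on task text.
# `task_corpus.txt` (one comment per line) and all hyperparameters are
# placeholders, not values taken from this repository.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# One comment per line; sequences are truncated to 128 subword tokens.
dataset = LineByLineTextDataset(
    tokenizer=tokenizer,
    file_path="task_corpus.txt",
    block_size=128,
)

# Dynamic masking: 15% of tokens are masked for the MLM objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

training_args = TrainingArguments(
    output_dir="./mbert-tapt",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    save_steps=1000,
)

Trainer(
    model=model,
    args=training_args,
    data_collator=collator,
    train_dataset=dataset,
).train()
```

After this step, the adapted checkpoint is fine-tuned on the labeled offensive-language data in the usual sequence-classification setup.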

Citation

@misc{jayanthi2021sjajdravidianlangtecheacl2021,
      title={SJ_AJ@DravidianLangTech-EACL2021: Task-Adaptive Pre-Training of Multilingual BERT models for Offensive Language Identification}, 
      author={Sai Muralidhar Jayanthi and Akshat Gupta},
      year={2021},
      eprint={2102.01051},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}

Contact

Feel free to reach out to us over email (Gmail) for a quick chat or a discussion.

Languages

Python 92.1% · Jupyter Notebook 4.2% · Shell 3.8%