LIMIT-BERT

Source code for "LIMIT-BERT: Linguistics Informed Multi-Task BERT", published in Findings of EMNLP 2020.

Paper: https://arxiv.org/abs/1910.14296

Contents

  1. Requirements
  2. Pre-trained Models
  3. Training
  4. Evaluation Instructions

Requirements

  • Python 3.6 or higher.
  • Cython 0.25.2 or any compatible version.
  • PyTorch 1.0.0+.
  • EVALB. Before starting, run make inside the EVALB/ directory to compile an evalb executable; it is called from Python during evaluation (see the setup sketch after this list).
  • pytorch-transformers 1.0.0 or any compatible version.
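
A minimal setup sketch, assuming a pip-based environment; the packages mirror the requirements above, and exact versions may need adjusting to match your CUDA setup:

# install the Python dependencies listed above
pip install cython torch pytorch-transformers

# compile the evalb executable used during evaluation
cd EVALB
make
cd ..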

Pre-trained Models (PyTorch)

The following pre-trained models are available for download from Google Drive:
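
The Drive links themselves are not reproduced here. As a sketch, a shared checkpoint can be fetched from the command line with the gdown tool; <FILE_ID> below is a placeholder for the ID in the actual Drive share link:

# <FILE_ID> is a placeholder; substitute the ID from the Google Drive link
pip install gdown
gdown https://drive.google.com/uc?id=<FILE_ID>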

Training

To train LIMIT-BERT, simply run:

sh run_limitbert.sh
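
Training can take a long time; a small, repo-agnostic sketch for running it in the background and keeping a log:

# run training in the background and capture all output in train.log
nohup sh run_limitbert.sh > train.log 2>&1 &
tail -f train.log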

Evaluation Instructions

To evaluate a trained model after setting the model path in test_bert.sh, run:

sh test_bert.sh
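
A hedged sketch of the path setup, assuming test_bert.sh reads the checkpoint location from a shell variable; MODEL_PATH and models/limitbert.pt below are hypothetical names, so match them to whatever the script actually uses:

# MODEL_PATH and the checkpoint path are hypothetical; edit test_bert.sh accordingly
MODEL_PATH=models/limitbert.pt sh test_bert.sh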
