iamymind / bert-finetuning-catalyst

Code for BERT classifier finetuning for multiclass text classification


Instructions:

  • specify your data, model, and training parameters in config.yml
  • if needed, customize the data-processing code in src/data.py
  • specify your model in src/model.py; by default it is DistilBERT for sequence classification
  • run python src/train.py
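
The steps above revolve around config.yml, so a minimal sketch of such a file may help. The field names below (data, model, training, and their keys) are assumptions for illustration only — check the repository's own config.yml for the actual schema:

```yaml
# Hypothetical config.yml layout — field names are illustrative, not the repo's real schema.
data:
  train_path: data/train.csv      # path to the training set
  text_field: text                # column holding the raw text
  label_field: label              # column holding the class label
model:
  name: distilbert-base-uncased   # Hugging Face model identifier
  num_classes: 4                  # number of target classes
training:
  batch_size: 16
  num_epochs: 3
  learning_rate: 3.0e-5
```

With a file like this in place, src/train.py would read the parameters and run fine-tuning without any code changes for a new dataset.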

Video tutorial

I explain the pipeline in detail in a video tutorial that consists of 4 parts:

Also, see other tutorials/talks on the topic:
