jbrry / allennlp_tutorial

Tutorial on how to use AllenNLP for sequence modeling (including hierarchical LSTMs and CRF decoding)

AllenNLP Tutorial

This tutorial is meant to teach you both how to use AllenNLP and a principled approach to doing deep learning research in NLP. The content is mirrored (and updated) on my personal site, jbarrow.ai; if you want the latest version, read it there, but the code will always live in this repository. The tutorial consists of 10 sections, and I recommend working through them in order:

  1. Setup
  2. Building a Dataset Reader
  3. Building a Baseline Model
  4. Configuring Experiments
  5. Tackling Your Own Experiments
  6. Predictors
  7. Debugging [WIP]
  8. Advanced Modeling: Hierarchical LSTMs, CRF Decoding, and BERT [WIP]
  9. Digging Into the Documentation [WIP]
  10. Hyperparameter Search: AllenTune [WIP]

The tutorial assumes no prior familiarity with AllenNLP, and walks through using it as an experimental platform driven by JSON (Jsonnet) configurations.
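To give a concrete feel for that workflow, below is a minimal sketch of the kind of Jsonnet configuration that drives an AllenNLP experiment. It is not taken from the tutorial: the registered names (`my_reader`, `my_tagger`), the file paths, and the hyperparameters are placeholders, and the exact keys depend on your AllenNLP version and on each component's constructor arguments.

```jsonnet
// Hypothetical experiment configuration: each "type" selects a class that has
// been registered with AllenNLP, and the remaining keys are passed to that
// class's constructor.
{
  "dataset_reader": {
    "type": "my_reader"                // a DatasetReader registered in your code
  },
  "train_data_path": "data/train.txt",
  "validation_data_path": "data/validation.txt",
  "model": {
    "type": "my_tagger",               // a Model registered in your code
    "embedder": {
      "token_embedders": {
        "tokens": { "type": "embedding", "embedding_dim": 100 }
      }
    },
    "encoder": {
      "type": "lstm",
      "input_size": 100,
      "hidden_size": 64,
      "bidirectional": true
    }
  },
  "data_loader": {                     // "iterator" in older AllenNLP releases
    "batch_size": 16,
    "shuffle": true
  },
  "trainer": {
    "optimizer": { "type": "adam" },
    "num_epochs": 5,
    "cuda_device": -1                  // train on the CPU
  }
}
```

With a config like this, `allennlp train path/to/config.jsonnet -s output_dir --include-package your_package` builds and trains the whole pipeline without a hand-written training loop.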

License: MIT License


Languages: Python 74.1%, Jsonnet 23.6%, Shell 2.3%