BertViz

Tool for visualizing attention in BERT and OpenAI GPT-2. Extends Tensor2Tensor visualization tool and pytorch-pretrained-BERT.

Blog posts:

Paper: Visualizing Attention in Transformer-Based Language Representation Models (arXiv:1904.02679)

Attention-head view

The attention-head view visualizes the attention patterns produced by one or more attention heads in a given transformer layer.
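The attention weights behind this view can be pictured as one token-to-token matrix per layer and per head. The following NumPy sketch is purely illustrative (random weights, made-up layer/head indices; it does not use the BertViz API) and shows the tensor shape the attention-head view slices into:

```python
import numpy as np

# Illustrative attention tensor in the shape a BERT-style model produces:
# (num_layers, num_heads, seq_len, seq_len), one matrix per layer and head.
num_layers, num_heads, seq_len = 12, 12, 6
rng = np.random.default_rng(0)
logits = rng.normal(size=(num_layers, num_heads, seq_len, seq_len))

# Softmax over the last axis so each row is a distribution over tokens.
attention = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

# The attention-head view renders one such matrix at a time
# (layer and head chosen here are arbitrary):
layer, head = 4, 7
head_weights = attention[layer, head]  # shape (seq_len, seq_len)
```

Each row of `head_weights` sums to 1: it is the distribution of attention that one token pays to every token in the sequence.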


BERT: [Notebook] [Colab]

OpenAI GPT-2: [Notebook] [Colab]

Model view

The model view provides a birds-eye view of attention across all of the model’s layers and heads.


BERT: [Notebook] [Colab]

OpenAI GPT-2: [Notebook] [Colab]

Neuron view

The neuron view visualizes the individual neurons in the query and key vectors and shows how they are used to compute attention.
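The computation the neuron view exposes is the standard scaled dot-product attention of the transformer: each query/key pair of vectors is multiplied elementwise, the products are summed and scaled, and a softmax turns the scores into attention weights. A minimal NumPy sketch of that computation (the function name and vector sizes are illustrative, not part of BertViz):

```python
import numpy as np

def attention_weights(queries, keys):
    """Scaled dot-product attention weights, as in the transformer."""
    d_k = queries.shape[-1]
    # Dot products of every query with every key, scaled by sqrt(d_k).
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over keys (shifted by the row max for numerical stability).
    scores -= scores.max(axis=-1, keepdims=True)
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
q = rng.normal(size=(5, 64))  # query vectors for 5 tokens
k = rng.normal(size=(5, 64))  # key vectors for the same 5 tokens
w = attention_weights(q, k)   # (5, 5) matrix of attention weights
```

The neuron view drills one level deeper than this function: it shows the individual elementwise products inside `queries @ keys.T` before they are summed into a score.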


BERT: [Notebook] [Colab]

OpenAI GPT-2: [Notebook] [Colab]

Authors

Jesse Vig

Citation

When referencing BertViz, please cite this paper.

@article{vig2019transformervis,
  author    = {Jesse Vig},
  title     = {Visualizing Attention in Transformer-Based Language Representation Models},
  journal   = {arXiv preprint arXiv:1904.02679},
  year      = {2019},
  url       = {https://arxiv.org/abs/1904.02679}
}

License

This project is licensed under the Apache 2.0 License; see the LICENSE file for details.

Acknowledgments

This project incorporates code from the following repos:

Tensor2Tensor
pytorch-pretrained-BERT
