jessevig / bertviz

BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)

Home Page: https://towardsdatascience.com/deconstructing-bert-part-2-visualizing-the-inner-workings-of-attention-60a16d86b5c1


Typo in README

luckynozomi opened this issue

In the README section https://github.com/jessevig/bertviz#head-and-model-views, the parameter for returning attention weights should be `output_attentions=True` (plural), not `output_attention=True`.
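For reference, a minimal sketch of the corrected usage, assuming the Hugging Face transformers API and BertViz's `head_view` as described in the README; the model name and sentence are illustrative:

```python
from transformers import BertModel, BertTokenizer
from bertviz import head_view

# output_attentions=True (plural) is the correct flag; the typo was output_attention=True
model = BertModel.from_pretrained('bert-base-uncased', output_attentions=True)
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

inputs = tokenizer.encode("The cat sat on the mat", return_tensors='pt')
outputs = model(inputs)
attention = outputs[-1]  # with output_attentions=True, the last output is the per-layer attention tuple
tokens = tokenizer.convert_ids_to_tokens(inputs[0])
head_view(attention, tokens)
```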

Fixed, thank you @luckynozomi!