jessevig / bertviz

BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)

Home Page: https://towardsdatascience.com/deconstructing-bert-part-2-visualizing-the-inner-workings-of-attention-60a16d86b5c1

Missing head_view_bart.ipynb

Serbernari opened this issue · comments

Hi! While looking through the issues, I found that a head_view_bart.ipynb example used to exist in this repo, but it can now only be found through the history of a deleted branch: https://github.com/jessevig/bertviz/blob/b088f44dd169957dbe89019b81243ef5cf5e9dcb/notebooks/head_view_bart.ipynb

Could you tell us why this example (along with many others) was removed? It still works perfectly fine.

Thanks @Serbernari. I had removed it to simplify the repo and reduce the maintenance burden. But I can see that it would be helpful to have notebooks for specific models as well, rather than having to wade through the documentation, especially for encoder-decoder models. I will add it back, along with some of the others.

When I call `outputs = model(input_ids=encoder_input_ids, decoder_input_ids=decoder_input_ids)`, it raises an error: `ValueError: Exception encountered when calling layer "tft5_for_conditional_generation" (type TFT5ForConditionalGeneration)`.
I am using my fine-tuned model.
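One likely cause (an assumption, since the full traceback isn't included): BertViz works with PyTorch attention tensors, while `TFT5ForConditionalGeneration` is the TensorFlow class. Loading the same checkpoint with the PyTorch class might look like this sketch, where `"t5-small"` is a placeholder for the fine-tuned checkpoint path:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# "t5-small" is a placeholder; point this at your fine-tuned checkpoint.
# For weights saved from the TF class, add from_tf=True to convert them.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small", output_attentions=True)

encoder_input_ids = tokenizer("translate English to German: Hello world", return_tensors="pt").input_ids
decoder_input_ids = tokenizer("Hallo Welt", return_tensors="pt").input_ids

# PyTorch forward pass; outputs.encoder_attentions, outputs.decoder_attentions,
# and outputs.cross_attentions can then be passed to BertViz's head_view.
outputs = model(input_ids=encoder_input_ids, decoder_input_ids=decoder_input_ids)
```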