jessevig / bertviz

BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)

Home Page: https://towardsdatascience.com/deconstructing-bert-part-2-visualizing-the-inner-workings-of-attention-60a16d86b5c1


`cannot unpack non-iterable NoneType object` When using RobertaModel.from_pretrained()

s1530129650 opened this issue · comments

Source code

from bertviz.transformers_neuron_view import RobertaModel, RobertaTokenizer
from bertviz.neuron_view import show

model_name_or_path = 'Enoch/Unixcoder-Tuned-Code-Search-Py'
model = RobertaModel.from_pretrained(model_name_or_path, output_attentions=True)

Error

[screenshot of traceback ending in: cannot unpack non-iterable NoneType object]

Warning

Model name 'Enoch/Unixcoder-Tuned-Code-Search-Py' was not found in model name list (roberta-base, roberta-large, roberta-large-mnli). We assumed 'Enoch/Unixcoder-Tuned-Code-Search-Py' was a path or url but couldn't find any file associated to this path or url.

How can I specify a custom model?

BTW, transformers supports custom models like this:

from transformers import RobertaTokenizer, RobertaConfig, RobertaModel

model_name_or_path="Enoch/Unixcoder-Tuned-Code-Search-Py"
tokenizer = RobertaTokenizer.from_pretrained(model_name_or_path)
config = RobertaConfig.from_pretrained(model_name_or_path)
model = RobertaModel.from_pretrained(model_name_or_path, output_attentions=True)

Hi @s1530129650, unfortunately the neuron view only works with the custom version of RoBERTa included with the tool, because it needs special access to the query/key vectors: https://github.com/jessevig/bertviz#neuron-view-1
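To illustrate why access to those vectors matters: a standard model only returns the final attention weights, which are a softmax over query/key dot products, while the neuron view also renders the intermediate query/key products that produce them. A minimal sketch of that computation (hypothetical illustration, not bertviz code):

```python
import numpy as np

def attention_weights(q, k):
    """Scaled dot-product attention weights: softmax(q @ k.T / sqrt(d)).

    Standard models expose only the returned weights; the neuron view
    also needs q, k, and the raw `scores` below, which is why it requires
    the bundled custom model classes.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # the q.k products the neuron view visualizes
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)  # rows sum to 1

# Toy query/key vectors for two tokens with hidden size 2
q = np.array([[1.0, 0.0], [0.0, 1.0]])
k = np.array([[1.0, 0.0], [0.0, 1.0]])
w = attention_weights(q, k)  # each token attends most to its matching key
```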