`cannot unpack non-iterable NoneType object` when using RobertaModel.from_pretrained()
s1530129650 opened this issue
Ensheng Shi (石恩升) commented
Source code:

```python
from bertviz.transformers_neuron_view import RobertaModel, RobertaTokenizer
from bertviz.neuron_view import show

model_name_or_path = 'Enoch/Unixcoder-Tuned-Code-Search-Py'
model = RobertaModel.from_pretrained(model_name_or_path, output_attentions=True)
```
Error:

```
Warning: Model name 'Enoch/Unixcoder-Tuned-Code-Search-Py' was not found in model name list (roberta-base, roberta-large, roberta-large-mnli). We assumed 'Enoch/Unixcoder-Tuned-Code-Search-Py' was a path or url but couldn't find any file associated to this path or url.
```
How can I specify a custom model?
BTW, transformers supports custom models like this:

```python
from transformers import RobertaTokenizer, RobertaConfig, RobertaModel

model_name_or_path = "Enoch/Unixcoder-Tuned-Code-Search-Py"
tokenizer = RobertaTokenizer.from_pretrained(model_name_or_path)
config = RobertaConfig.from_pretrained(model_name_or_path)
model = RobertaModel.from_pretrained(model_name_or_path, output_attentions=True)
```
Jesse Vig commented
Hi @s1530129650, unfortunately the neuron view only works with the custom version of Roberta included with the tool, due to needing special access to the query/key vectors: https://github.com/jessevig/bertviz#neuron-view-1