huawei-noah / Pretrained-Language-Model

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

TinyBERT for masked LM

danchern97 opened this issue · comments

Hi! I've been trying to measure MLM perplexity for the TinyBERT model (in particular, tinybert6l), and I keep getting inconsistent results. It looks like the MLM head for TinyBERT is not loaded properly when loading with `AutoModelForMaskedLM` or `BertForMaskedLM`.
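One way to confirm this suspicion is to ask `from_pretrained` for its loading report and check whether any of the MLM-head parameters (which live under the `cls.` prefix in `BertForMaskedLM`) ended up in `missing_keys` — such keys are randomly initialized, which would explain inconsistent perplexities. A minimal sketch, assuming the checkpoint path is a placeholder and `warn_if_head_missing` is a hypothetical helper:

```python
def warn_if_head_missing(loading_info):
    """Return the MLM-head keys (prefix 'cls.') that from_pretrained
    could not find in the checkpoint and therefore left randomly
    initialized."""
    missing = [k for k in loading_info.get("missing_keys", [])
               if k.startswith("cls.")]
    if missing:
        print(f"MLM head not restored from checkpoint: {missing}")
    return missing

# Hypothetical usage (requires transformers and a local TinyBERT checkpoint):
# from transformers import BertForMaskedLM
# model, info = BertForMaskedLM.from_pretrained(
#     "path/to/tinybert6l", output_loading_info=True)
# warn_if_head_missing(info)
```

If the returned list is non-empty, the head weights were not in the checkpoint under the expected names, and any perplexity computed with that head is meaningless.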