princeton-nlp / LM-BFF

[ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723

How should the interdependent parameters "inspired_templates" and "replace_token_map_list" in lm_bff.json be set?

pony-m opened this issue · comments

Shown below are the "inspired_templates" and "replace_token_map_list" that the source code defines for the CoLA task and dataset. I want to run the MNLI task; how should I design the corresponding templates and parameters?

{
    "model_dir": "./models/t5-3b",
    "end_token": "</s>",
    "beam": 100,
    "inspired_templates": ["*cls**sentu_0**<extra_id_0>**label**<extra_id_1>**sep+*", "*cls*.*<extra_id_0>**label**<extra_id_1>**+sentu_0**sep+*"],
    "target_number": 2,
    "batch_size": 32,
    "gen_max_len": 20,
    "truncates": ["head", "tail"],
    "first_mask_token": "<extra_id_0>",
    "forbid_tokens": [3, 19794, 22354],
    "forbid_continuous_token": [5],
    "replace_token_map_list": [{
        "<extra_id_0>": "*cls**sent_0*",
        "<extra_id_1>": "*mask*",
        "<extra_id_2>": "*sep+*",
        "</s>": "*sep+*",
        "▁": ""
    }, {
        "<extra_id_0>": "*cls*",
        "<extra_id_1>": "*mask*",
        "<extra_id_2>": "*+sent_0**sep+*",
        "</s>": "*+sent_0**sep+*",
        "▁": ""
    }]
}
If the authors could compile the parameter settings for all nine GLUE tasks, readers would be able to understand the method much better. Thanks!
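For context on how the two parameters interact, here is an illustrative sketch (assumed behavior, not code from the LM-BFF repo): each entry of "replace_token_map_list" rewrites the decoded T5 generation for the matching entry of "inspired_templates" into an LM-BFF-style template string. The function name `apply_replace_map` and the exact map values are hypothetical; "▁" is the SentencePiece word-boundary marker, detokenized to a space here for readability.

```python
# Illustrative sketch (assumed behavior, not code from the LM-BFF repo):
# how one entry of "replace_token_map_list" could rewrite a decoded T5
# generation into an LM-BFF-style template string.

def apply_replace_map(decoded: str, replace_map: dict) -> str:
    """Substitute each T5 special token / SentencePiece marker in order."""
    for token, replacement in replace_map.items():
        decoded = decoded.replace(token, replacement)
    return decoded

# Hypothetical map modeled on the first CoLA entry in this issue.
replace_map = {
    "<extra_id_0>": "*cls**sent_0*",  # prefix: [CLS] + input sentence
    "<extra_id_1>": "*mask*",         # position of the masked label word
    "<extra_id_2>": "*sep+*",         # suffix if T5 emits a second sentinel
    "</s>": "*sep+*",                 # suffix if T5 ends with its EOS token
    "▁": " ",                         # SentencePiece word-boundary marker
}

print(apply_replace_map("<extra_id_0>▁This▁is<extra_id_1>▁.</s>", replace_map))
# -> *cls**sent_0* This is*mask* .*sep+*
```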

Hi, I believe all the needed hyperparameters for each task are provided in our script.
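For illustration only: a hypothetical starting point for a sentence-pair task such as MNLI, extrapolated from the CoLA entry above under the assumption that the generated span sits between the premise (`sent_0`) and hypothesis (`sent_1`). These values are a sketch, not settings taken from the repo's actual script.

```json
{
    "inspired_templates": ["*cls**sent_0**<extra_id_0>**label**<extra_id_1>**+sent_1**sep+*"],
    "replace_token_map_list": [{
        "<extra_id_0>": "*cls**sent_0*",
        "<extra_id_1>": "*mask*",
        "<extra_id_2>": "*+sent_1**sep+*",
        "</s>": "*+sent_1**sep+*",
        "▁": ""
    }]
}
```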