Unexpected keyword arguments prompt_params, adaptor_layers, deep_adaptor_tuning, deep_adaptor_tuning_ffn_only, parallel_adaptors
goru001 opened this issue · comments
Hi @prajdabre,
Thanks for doing great work with this library. I was trying to use it and ran into an issue where the prompt_params, adaptor_layers, deep_adaptor_tuning, deep_adaptor_tuning_ffn_only, and parallel_adaptors params are being passed to forward here, but the MBartForConditionalGeneration class's forward function doesn't expect them.
Wanted to understand from you whether the fix is as simple as adding these params to the forward function's signature with a default value of None (in which case I'm guessing we would also need to change the forward function's implementation itself to actually use these params).
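For context, a minimal sketch of what I mean (the names below are hypothetical, heavily simplified stand-ins, not the real MBartForConditionalGeneration signature): accept the extra arguments with a default of None/False so existing callers keep working, and only act on them when they're actually provided.

```python
# Hypothetical, simplified forward(): the real MBart forward has many more
# arguments; this only illustrates the "default to None" idea.
def forward(input_ids, attention_mask=None,
            prompt_params=None, adaptor_layers=None,
            deep_adaptor_tuning=False, deep_adaptor_tuning_ffn_only=False,
            parallel_adaptors=False):
    used_prompts = prompt_params is not None
    if used_prompts:
        # ... inject prompt embeddings here (sketch only) ...
        pass
    if adaptor_layers is not None:
        # ... route hidden states through adaptor layers here (sketch only) ...
        pass
    # Callers that never pass the new arguments are unaffected.
    return {"input_ids": input_ids, "used_prompts": used_prompts}
```

So old call sites like `forward(ids, mask)` would keep working unchanged, while the new training paths could pass the extra params explicitly.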
Let me know if you think I might be missing something here. Thanks!
Oh lord. I forgot to push my latest changes. Gimme a moment.
Ok please pull, run setup.py in the transformers folder and try again and lmk.
Thanks @prajdabre for the quick reply. The code still seems to be failing here when we're not using --prompt_tuning, as prompt_params is None.
When we use --prompt_tuning, it fails here, as tgt_len is the same size as src_len in the attention mask.
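One plausible reading of that shape mismatch (my assumption, not confirmed from the code): prompt tuning prepends prompt vectors to the key sequence, so a mask built with tgt_len == src_len no longer covers the extra prompt positions. A toy sketch of padding the mask to account for them, with a hypothetical helper name:

```python
# Hypothetical helper (sketch): pad an attention mask so it also covers
# n_prompts prompt positions prepended to the source/key sequence.
# mask is a list of rows, each row a list of 0/1 over the source tokens.
def expand_mask_for_prompts(mask, n_prompts):
    # Prompt positions should always be attendable, so pad each row with 1s.
    return [[1] * n_prompts + row for row in mask]
```

Again, just a guess at what's going on; the real fix may live elsewhere.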
Hi,
My bad again, I forgot to put in if-else conditions. Could you try using the following commit for now: ed5da80
Just revert to this commit and try again.
Really should not make breaking changes before going on vacation! 🤦
If you want, you can also fix this by wrapping the lines you pointed out in an if condition that triggers only if prompt_params is not None.
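The suggested guard, sketched with hypothetical names (the actual lines and variables in the repo will differ):

```python
# Hypothetical sketch of the guard: only run the prompt-tuning code path
# when prompt_params was actually supplied.
def apply_prompts(hidden_states, prompt_params=None):
    if prompt_params is not None:
        # prompt-tuning path (sketch): prepend the prompt vectors
        return prompt_params + hidden_states
    # default path: hidden states pass through untouched
    return hidden_states
```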
Got it, sure @prajdabre. Thanks a lot for your quick replies. I've switched to tag v2.0 for now, but yeah, you're right, I can fix this on my end. I just didn't do it yet since I wanted to get more comfortable with the codebase first.
I hope to contribute back as well in the future. Thanks for doing the great work, and I hope you enjoy your vacation! :)