XiangLi1999 / PrefixTuning

Prefix-Tuning: Optimizing Continuous Prompts for Generation

PyTorch Lightning Version?

ekoenitz opened this issue · comments

What version of PyTorch Lightning was this built with? I followed the setup instructions to install the requirements, but I keep getting errors from misnamed parameters in the seq2seq module (the gpt-2 module works fine). I can fix the errors as they come up by consulting the current PyTorch Lightning documentation (filepath in the trace should be dirpath, for example), but I'd rather use the code as written instead of manually updating it.

Traceback (most recent call last):
  File "finetune.py", line 876, in <module>
    main(args)
  File "finetune.py", line 782, in main
    checkpoint_callback=get_checkpoint_callback(args.output_dir, model.val_metric, args.save_top_k, lower_is_better), #LISA
  File "/workspace/PrefixTuning/seq2seq/callbacks.py", line 105, in get_checkpoint_callback
    period=0, # maybe save a checkpoint every time val is run, not just end of epoch.
TypeError: __init__() got an unexpected keyword argument 'filepath'
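For anyone hitting the same trace: the filepath argument to Lightning's ModelCheckpoint was split into separate dirpath/filename arguments around the 1.0 release, while older releases (such as 0.9.0, which this repo appears to target) still expect filepath. The helper below is just an illustrative sketch of that API change (checkpoint_kwargs is not part of this repo), not a patch to the codebase:

```python
# Sketch: pick ModelCheckpoint keyword arguments based on the installed
# pytorch-lightning version. `checkpoint_kwargs` is a hypothetical helper
# for illustration only; the version cutoff assumes the filepath ->
# dirpath/filename split that happened around Lightning 1.0.
def checkpoint_kwargs(pl_version: str, output_dir: str) -> dict:
    major, minor = (int(x) for x in pl_version.split(".")[:2])
    if (major, minor) >= (1, 0):
        # Newer API: directory and filename template are separate arguments.
        return {"dirpath": output_dir, "filename": "{epoch}-{val_loss:.2f}"}
    # Older API (what this repo was written against): one path template.
    return {"filepath": output_dir + "/{epoch}-{val_loss:.2f}"}
```

Pinning the old release (as suggested below) avoids needing any such shim, since the repo's calls then match the installed API.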

Try pip install pytorch-lightning==0.9.0

Thanks for the prompt response!