salesforce / ctrl

Conditional Transformer Language Model for Controllable Generation

Home Page: https://arxiv.org/abs/1909.05858


why set "seq_length = min(args.generate_num, 256)"

luweishuang opened this issue · comments


I notice you have three pretrained models, including seqlen256_v1.ckpt and seqlen512_v1.ckpt, and you say "Only difference is the sequence length used during training. The 512 model uses double the number of tokens as the 256 one for computing the attention but half the batch size (to prevent OOM)." So why does generate.py set `seq_length = min(args.generate_num, 256)`?
If I use the seqlen512_v1.ckpt model, should I set `seq_length = min(args.generate_num, 512)` instead?

Hi @luweishuang,
Not a developer, but I ran into exactly the same issue during training.
You will have to change the value to 512 if you are using the seqlen512_v1.ckpt model. In the version I'm currently using, I turned this value into a command-line argument supplied at invocation time, and I would strongly recommend doing the same.
All the best
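For illustration, here is a minimal sketch of what turning the hardcoded 256 into a command-line argument might look like. The flag name `--seq_length` and the helper `resolve_seq_length` are assumptions for this example, not the repo's actual interface; the key line is the final `min(...)`, which caps generation at the sequence length the checkpoint was trained with.

```python
import argparse

def resolve_seq_length(argv=None):
    """Parse CLI flags and return the effective sequence length.

    Hypothetical sketch: --seq_length is not an actual flag in generate.py;
    it replaces the hardcoded 256 so the same script works with both the
    seqlen256_v1.ckpt and seqlen512_v1.ckpt checkpoints.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument('--generate_num', type=int, default=256,
                        help='number of tokens to generate')
    parser.add_argument('--seq_length', type=int, default=256,
                        choices=[256, 512],
                        help='sequence length the checkpoint was trained with '
                             '(256 for seqlen256_v1.ckpt, 512 for seqlen512_v1.ckpt)')
    args = parser.parse_args(argv)
    # Cap the attention window at the model's training sequence length,
    # mirroring the original seq_length = min(args.generate_num, 256).
    return min(args.generate_num, args.seq_length)
```

With this in place, running the 512 checkpoint would just be a matter of passing `--seq_length 512` on the command line instead of editing the source.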