rmihaylov / falcontune

Tune any FALCON in 4-bit


Error when entering a new prompt in interactive mode

631068264 opened this issue · comments

631068264 commented:
falcontune generate \
    --interactive \
    --model falcon-7b \
    --weights tiiuae/falcon-7b \
    --lora_apply_dir falcon-7b-alpaca \
    --max_new_tokens 50 \
    --use_cache \
    --do_sample \
    --prompt "番茄炒蛋的配料"
番茄炒蛋的配料。

### Input:


### Response:
xxxxx
Took 334.041 s
Enter new prompt: Traceback (most recent call last):
  File "/data/home/yaokj5/anaconda3/envs/falcon/bin/falcontune", line 33, in <module>
    sys.exit(load_entry_point('falcontune==0.1.0', 'console_scripts', 'falcontune')())
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/run.py", line 88, in main
    args.func(args)
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/generate.py", line 71, in generate
    generated_ids = model.generate(
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/falcontune-0.1.0-py3.10.egg/falcontune/generate.py", line 27, in autocast_generate
    return self.model.non_autocast_generate(*args, **kwargs)
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/peft/peft_model.py", line 731, in generate
    outputs = self.base_model.generate(**kwargs)
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/data/home/yaokj5/anaconda3/envs/falcon/lib/python3.10/site-packages/transformers/generation/utils.py", line 1312, in generate
    and torch.sum(inputs_tensor[:, -1] == generation_config.pad_token_id) > 0
IndexError: index -1 is out of bounds for dimension 1 with size 0
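The traceback shows the crash happens inside transformers' `generate`, when it checks `inputs_tensor[:, -1]` against `pad_token_id`: the input tensor has size 0 along dimension 1, i.e. the new prompt tokenized to zero tokens (most likely an empty or whitespace-only line at the "Enter new prompt:" input). A minimal sketch of a guard against this, assuming a standard Hugging Face tokenizer/model pair; the loop and variable names below are illustrative only, not falcontune's actual generate.py code:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-7b")
model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b")

while True:
    prompt = input("Enter new prompt: ").strip()
    if not prompt:
        # An empty prompt tokenizes to a (1, 0) tensor, which makes the
        # pad-token check in transformers index out of bounds.
        print("Prompt is empty, please enter some text.")
        continue

    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    if input_ids.shape[1] == 0:
        # Defensive check in case the text still produced no tokens.
        print("Prompt produced no tokens, skipping.")
        continue

    with torch.no_grad():
        output_ids = model.generate(input_ids, max_new_tokens=50)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Skipping empty input before calling `model.generate` avoids the IndexError; re-entering a non-empty prompt should work as in the first generation above.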