yuhuixu1993 / qa-lora

Official PyTorch implementation of QA-LoRA

ValueError: Target modules [] not found in the base model.

akkkb opened this issue · comments

I have already replaced auto_gptq/utils/peft_utils.py with the provided script (peft_utils.py). While attempting to execute qalora.py with the default setup, it gives the following error (target_modules=None, line 315 of qalora.py):

auto_gptq/utils/peft_utils.py", line 409, in get_gptq_peft_model
peft_model = get_peft_model(model.model, peft_config, adapter_name=adapter_name)
File "/python3.10/site-packages/peft/mapping.py", line 133, in get_peft_model
return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](model, peft_config, adapter_name=adapter_name)
File "/python3.10/site-packages/peft/peft_model.py", line 1043, in init
super().init(model, peft_config, adapter_name)
File "/python3.10/site-packages/peft/peft_model.py", line 125, in init
self.base_model = cls(model, {adapter_name: peft_config}, adapter_name)
File "/python3.10/site-packages/peft/tuners/lora/model.py", line 111, in init
super().init(model, config, adapter_name)
File "/python3.10/site-packages/peft/tuners/tuners_utils.py", line 90, in init
self.inject_adapter(self.model, adapter_name)
File "/python3.10/site-packages/peft/tuners/tuners_utils.py", line 250, in inject_adapter
raise ValueError(
ValueError: Target modules [] not found in the base model. Please check the target modules and try again.

During handling of the above exception, another exception occurred:

adapters/qa-lora/qalora.py", line 306, in get_accelerate_model
model = get_gptq_peft_model(model, config, auto_find_all_linears=True, train_mode=True)
File "/python3.10/site-packages/auto_gptq/utils/peft_utils.py , line 413, in get_gptq_peft_model
raise NotImplementedError(
NotImplementedError: LlamaGPTQForCausalLM not support LORA peft type yet.

Any suggestion would be really helpful.

Thanks

Hi, from your description it seems that target_modules is empty during the execution of your code, which is quite unlikely with the original code. On line 328 of qalora.py, auto_find_all_linears=True, so on line 390 of peft_utils.py, peft_config.target_modules is populated automatically. For a Llama model, peft_config.target_modules usually contains ['gate_proj', 'k_proj', 'v_proj', 'up_proj', 'q_proj', 'o_proj', 'down_proj'], so an empty target_modules is not expected.
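For reference, the auto-discovery step amounts to collecting the leaf names of every linear layer in the model. The sketch below is a minimal illustration of that idea, not the exact helper from peft_utils.py (find_all_linear_names is a hypothetical name here). It also hints at one way the list can end up empty: a GPTQ-quantized model no longer contains plain nn.Linear layers.

```python
# Minimal sketch of linear-module discovery (illustrative, not the exact
# helper from peft_utils.py).
import torch.nn as nn

def find_all_linear_names(model):
    """Collect leaf names of every Linear-like layer, excluding the LM head."""
    linear_names = set()
    for name, module in model.named_modules():
        # GPTQ-quantized models replace nn.Linear with a QuantLinear wrapper,
        # so a check that only matches nn.Linear can come back empty -- which
        # would produce exactly the "Target modules [] not found" error above.
        if isinstance(module, nn.Linear) or "QuantLinear" in type(module).__name__:
            linear_names.add(name.split(".")[-1])
    linear_names.discard("lm_head")  # the output head is usually not adapted
    return sorted(linear_names)
```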

Please check the following items:

1. Check if the code in qalora.py has been modified.
2. Check if the code in peft_utils.py has been correctly replaced.
3. Confirm that the current environment is consistent with the environment mentioned in the tutorial.

If you have any further questions or need additional assistance, please don't hesitate to reach out.