Lightning-AI / lit-llama

Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.


TypeError: super(type, obj): obj must be an instance or subtype of type

Vinter8848 opened this issue

commented

```
Loading model ...
Traceback (most recent call last):
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 172, in <module>
    CLI(main)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 85, in CLI
    return _run_component(component, cfg_init)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/jsonargparse/_cli.py", line 147, in _run_component
    return component(**cfg)
  File "/home/Zhengwt/lit-llama/evaluate/lora.py", line 105, in main
    model = LLaMA.from_name(name)
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 124, in from_name
    return cls(LLaMAConfig.from_name(name))
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in __init__
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 279, in __init__
    self += modules
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 320, in __iadd__
    return self.extend(modules)
  File "/home/Zhengwt/anaconda3/envs/lit-llama/lib/python3.9/site-packages/torch/nn/modules/container.py", line 401, in extend
    for i, module in enumerate(modules):
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 59, in <genexpr>
    h=nn.ModuleList(Block(config) for _ in range(config.n_layer)),
  File "/home/Zhengwt/lit-llama/lit_llama/model.py", line 150, in __init__
    self.attn = CausalSelfAttention(config)
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 428, in __init__
    self.c_attn = MergedLinear(
  File "/home/Zhengwt/lit-llama/lit_llama/lora.py", line 134, in __init__
    nn.Linear.__init__(self, in_features, out_features, **kwargs)
  File "/home/Zhengwt/lit-llama/lit_llama/quantization.py", line 45, in __init__
    super().__init__(*args, **kwargs, has_fp16_weights=False, threshold=6.0)
TypeError: super(type, obj): obj must be an instance or subtype of type
```

commented

I encountered the above problem when executing this command:

```
python evaluate/lora.py --quantize llm.int8
```

commented

Can you provide some ideas to help me solve this problem? I would be extremely grateful.

This has been fixed in lit-gpt: https://github.com/Lightning-AI/lit-gpt
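
For anyone hitting the same error: the traceback pattern is consistent with `torch.nn.Linear` being monkey-patched to a quantized subclass (which is roughly what the `--quantize llm.int8` path does) while LoRA's `MergedLinear`, itself a subclass of the original `nn.Linear`, is being constructed. The sketch below is a minimal, hypothetical reproduction of that failure mode; the class names are illustrative, not the actual lit-llama or bitsandbytes classes:

```python
# Minimal sketch of the failure mode; class names are hypothetical.
import torch.nn as nn

class Quantized8bitLinear(nn.Linear):
    def __init__(self, *args, **kwargs):
        # Zero-argument super() binds to Quantized8bitLinear, so `self`
        # must be an instance of this class (or a subclass of it).
        super().__init__(*args, **kwargs)

class MergedLinear(nn.Linear):
    def __init__(self, in_features, out_features):
        # Resolves `nn.Linear` at call time, not at class-definition time.
        nn.Linear.__init__(self, in_features, out_features)

# A quantization context typically swaps the module attribute in place:
nn.Linear = Quantized8bitLinear

# MergedLinear subclasses the *original* nn.Linear, not Quantized8bitLinear,
# so the super() call above raises:
# TypeError: super(type, obj): obj must be an instance or subtype of type
MergedLinear(4, 4)
```

Once the patch is active, `MergedLinear`'s explicit `nn.Linear.__init__(self, ...)` call lands in the quantized class's `__init__`, whose `super()` cannot bind to a `MergedLinear` instance. That would also explain why the error only shows up with `--quantize llm.int8`: without the patch, `nn.Linear.__init__` is the ordinary PyTorch initializer and binds fine.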