[BUG] While fine-tuning a LoRA model, training emits UserWarning: Could not find a config file in ./openbmb/MiniCPM-Llama3-V-2_5 - will assume that the vocabulary was not modified
hyyuan123 opened this issue · comments
Is there an existing issue / discussion for this?
- I have searched the existing issues / discussions
Is there an existing answer for this in FAQ?
- I have searched FAQ
Current Behavior
During training, the following warning appears:
UserWarning: Could not find a config file in /tmp/hyy1/08_miniCPM/openbmb/MiniCPM-Llama3-V-2_5 - will assume that the vocabulary was not modified
It does not stop the loss from decreasing. However, when I evaluate the finished model on the training data, the results are poor — essentially indistinguishable from the base model before fine-tuning, with no noticeable change.
Expected Behavior
No response
Steps To Reproduce
No response
Environment
- OS:
- Python:3.10
- Transformers:4.20
- PyTorch:2.12
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):12.1
Anything else?
No response
First, are you fine-tuning with LoRA? Second, after fine-tuning, is the model being loaded in the correct way?
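The symptom described (loss decreases during training, but inference output matches the base model) is exactly what happens when the trained LoRA adapter is never applied at inference time. A minimal NumPy sketch of the LoRA math — all shapes and values here are illustrative, not taken from MiniCPM — shows why:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen base weight and trained low-rank LoRA factors (r << d).
d, r = 8, 2
W = rng.normal(size=(d, d))   # base weight, frozen during LoRA training
A = rng.normal(size=(r, d))   # trained LoRA down-projection
B = rng.normal(size=(d, r))   # trained LoRA up-projection
alpha = 16                    # LoRA scaling hyperparameter

x = rng.normal(size=(d,))

# Inference WITHOUT loading the adapter: only W is used,
# so the model behaves exactly like the un-tuned base model.
y_base = W @ x

# Inference WITH the adapter merged: W' = W + (alpha / r) * B @ A
W_merged = W + (alpha / r) * (B @ A)
y_tuned = W_merged @ x

# The two outputs differ whenever B @ A is nonzero; forgetting to
# load the adapter silently reproduces the base-model output.
assert not np.allclose(y_base, y_tuned)
```

In practice, if the adapter was trained with the `peft` library, loading typically means wrapping the base model with `PeftModel.from_pretrained(base_model, adapter_path)` (or merging with `merge_and_unload()` before saving a standalone checkpoint). Loading only the base checkpoint, as in the sketch's `y_base` path, would explain "no change after fine-tuning".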