OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone

The official web_demo_2.5.py follows instructions poorly when tested on MMBench and cannot output A/B/C/D

linwenzhao1 opened this issue

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?

  • I have searched the FAQ

Current Behavior

(screenshot attachment showing the demo's output)

Expected Behavior

A concise answer: just A, B, C, or D.

Steps To Reproduce

No response

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

Hi, you could try a more explicit prompt, for example by spelling out the full expected output format so the model has a concrete reference. You can also refer to our evaluation code:
https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/vlm/minicpm_llama3_v_2_5.py#L63
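Below is a minimal sketch of what such a prompt could look like, loosely following the linked evaluation code: list every option explicitly and ask for only the option letter. The loading and `model.chat` call follow the MiniCPM-Llama3-V 2.5 model card; the question text, options, and `example.jpg` path are illustrative placeholders, not part of the official demo.

```python
# Sketch: an MMBench-style multiple-choice prompt for MiniCPM-Llama3-V 2.5.
# The question, options, and image path below are illustrative placeholders.
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained(
    'openbmb/MiniCPM-Llama3-V-2_5', trust_remote_code=True,
    torch_dtype=torch.float16
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(
    'openbmb/MiniCPM-Llama3-V-2_5', trust_remote_code=True
)

image = Image.open('example.jpg').convert('RGB')

# Spell out every option and the expected output format so the model has a
# concrete reference, then ask for only the option letter.
question = (
    "What is shown in the image?\n"
    "A. a cat\n"
    "B. a dog\n"
    "C. a bird\n"
    "D. a fish\n"
    "Answer with the option's letter from the given choices directly."
)
msgs = [{'role': 'user', 'content': question}]

# sampling=False -> deterministic decoding, which helps keep the answer terse.
answer = model.chat(image=image, msgs=msgs, tokenizer=tokenizer, sampling=False)
print(answer)  # expected output: a single letter such as "A"
```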

Is there a Chinese version of the prompt?