haotian-liu / LLaVA

[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.

Home Page: https://llava.hliu.cc


[Question] pretrain_mm_mlp_adapter of llava v1.6 7B not found

YQYI opened this issue · comments

Question

I am trying to train LLaVA v1.6 7B in LoRA mode, but I cannot find the pretrain_mm_mlp_adapter file. Where can I find it?

@YQYI I guess you can just use the mlp_adapter of LLaVA v1.5 7B, as the blog suggests.
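For concreteness, here is a minimal sketch of how the reply's suggestion would look in practice: download the released LLaVA v1.5 7B projector weights and pass them via `--pretrain_mm_mlp_adapter` to the LoRA training entry point. The Hugging Face repo name, local paths, and the surrounding training arguments are assumptions based on the v1.5 release and the repo's `scripts/v1_5/finetune_lora.sh`, not something confirmed for v1.6 in this thread.

```shell
# Sketch: reuse the LLaVA v1.5 7B pretrained projector for v1.6 LoRA training.
# Repo name and paths below are assumptions -- adjust to your setup.

# Fetch the released v1.5 projector weights (contains mm_projector.bin)
huggingface-cli download \
    liuhaotian/llava-v1.5-mlp2x-336px-pretrain-vicuna-7b-v1.5 \
    --local-dir ./checkpoints/llava-v1.5-7b-pretrain

# Point the LoRA finetuning run at those projector weights
deepspeed llava/train/train_mem.py \
    --lora_enable True \
    --model_name_or_path lmsys/vicuna-7b-v1.5 \
    --version v1 \
    --pretrain_mm_mlp_adapter ./checkpoints/llava-v1.5-7b-pretrain/mm_projector.bin \
    --output_dir ./checkpoints/llava-lora-test
    # ...remaining data/deepspeed arguments as in scripts/v1_5/finetune_lora.sh
```

Note this assumes the v1.5 projector is compatible with your v1.6 base model; since the v1.6 projector weights were not released separately at the time of the question, this workaround is a guess, not an officially documented path.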