Lackel / AGLA

Code for paper "AGLA: Mitigating Object Hallucinations in Large Vision-Language Models with Assembly of Global and Local Attention"

About the Experiments

haohaodw opened this issue · comments

Very nice work! I would like to ask what the temperature setting of the experiments is. Are all methods tested at the same temperature?

Hi, thanks for your interest. The temperature is set to 1 for all experiments.
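For context, a temperature of 1 means sampling from the unscaled softmax of the logits, so no method gets an artificially sharpened or flattened distribution. A minimal illustrative sketch (not from the AGLA code) of how temperature rescales next-token probabilities:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Divide logits by the temperature, then apply a numerically stable softmax."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max to avoid overflow in exp
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
# temperature=1.0 leaves the softmax distribution unchanged;
# values below 1 sharpen it, values above 1 flatten it.
print(softmax_with_temperature(logits, 1.0))
```

With temperature 0.5 the top token's probability rises, and with temperature 2.0 it falls toward uniform, which is why comparing methods at a common temperature matters.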

Thank you for your reply. I have another question. There are many versions of llava. Which version are you using? Can you share the Huggingface link?

Hi, the llava code used in our model is included in https://github.com/Lackel/AGLA/tree/main/llava. It is adapted from VCD (https://github.com/DAMO-NLP-SG/VCD), so we do not know the specific version, since the llava source code is frequently updated.

Thank you for your reply. Perhaps I was not clear: I am not referring to the source code of llava, but to the checkpoint. I want to know which llava model is used in the paper. Can you share the corresponding Huggingface link?

Hi, this is the llava version we used: https://huggingface.co/liuhaotian/llava-v1.5-7b.