kjerk / instructblip-pipeline

A multimodal inference pipeline that integrates InstructBLIP with textgen-webui for Vicuna and related models.

Support for Flan T5 variant?

StrangeArtificialIntelligence opened this issue · comments

AFAIK, only the Flan-T5 models are "uncensored" among the large visual question answering models, and they generate more creative answers than the Vicuna variant, LLaVA, and MiniGPT.

https://huggingface.co/Salesforce/instructblip-flan-t5-xl
https://huggingface.co/Salesforce/instructblip-flan-t5-xxl

While AutoGPTQ does not support T5 models, GPTQ-for-LLaMa does.

I tried to modify the code to make it work, but I don't know much about AI and my skills are lacking.
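For reference, the Flan-T5 InstructBLIP checkpoints linked above can be loaded directly with Hugging Face transformers via `InstructBlipForConditionalGeneration`, independently of this repo's textgen-webui pipeline and of any GPTQ quantization. Below is a minimal sketch; the image path and prompt are placeholders, it assumes enough VRAM for the unquantized xl variant, and fp16 may need to be swapped for bf16/fp32 if T5 produces NaNs.

```python
import torch
from PIL import Image
from transformers import InstructBlipProcessor, InstructBlipForConditionalGeneration

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the Flan-T5 xl variant of InstructBLIP from the Hugging Face Hub.
model_id = "Salesforce/instructblip-flan-t5-xl"
processor = InstructBlipProcessor.from_pretrained(model_id)
model = InstructBlipForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: fp16 fits and is numerically stable here
).to(device)

# "example.jpg" is a placeholder path; substitute any local image.
image = Image.open("example.jpg").convert("RGB")
prompt = "Describe this image in detail."

# The processor handles both the image preprocessing and the text tokenization;
# only floating-point tensors (pixel_values) are cast to fp16 by .to().
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device, torch.float16)
outputs = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0].strip())
```

This only shows that the checkpoints work with stock transformers; wiring them into this pipeline (or a quantized GPTQ path) would still need the changes discussed above.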