nomic-ai / gpt4all

gpt4all: run open-source LLMs anywhere

Home Page: https://gpt4all.io

[Feature] check the compatibility of a hugging face model before fully downloading it ?

SuperUserNameMan opened this issue · comments

Feature Request

Hello again,

It would be cool if the Chat app could check the compatibility of a Hugging Face model before downloading it in full.

Maybe this could be done by checking the GGUF header (if the file has one) in the incomplete download file as soon as those bytes are available?
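A minimal sketch of what that early check might look like. This assumes the GGUF magic bytes (`GGUF`) and a little-endian `uint32` version field at the start of the file, per the GGUF format; the function name and return convention here are just illustrative, not part of GPT4All:

```python
import struct

GGUF_MAGIC = b"GGUF"

def check_gguf_header(path):
    """Inspect the first bytes of a (possibly incomplete) download.

    Returns (ok, version):
      (True, version)  - file starts with the GGUF magic
      (False, None)    - file exists but is not GGUF
      (None, None)     - fewer than 8 bytes downloaded so far
    Only the first 8 bytes are needed, so the check can run as soon
    as the download has begun.
    """
    with open(path, "rb") as f:
        head = f.read(8)
    if len(head) < 8:
        return (None, None)  # not enough data yet; try again later
    if head[:4] != GGUF_MAGIC:
        return (False, None)  # wrong format, abort the download early
    version = struct.unpack("<I", head[4:8])[0]
    return (True, version)
```

The download manager could poll this on the partial file and cancel immediately when it returns `(False, None)`, saving the remaining gigabytes of bandwidth.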

commented

Well, just checking that it's GGUF and uses Q4 quantization doesn't solve the VRAM problem. For example, a model like Dolphin 13B won't work while Falcon 13B works fine. Why? Model size and GPU type. There's no easy solution for that, only trial and error.
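To make the trial-and-error point concrete: even a crude pre-check can only give a rough go/no-go signal, because actual memory use depends on context length, KV cache, and the backend. The heuristic below (file size plus an assumed ~20% overhead factor, which is a guess, not a measured value) is the kind of estimate a downloader could make, and exactly the kind that can still be wrong in both directions:

```python
def fits_in_vram(model_file_bytes, vram_bytes, overhead_factor=1.2):
    """Rough heuristic: a GGUF file is mostly quantized weights, so the
    file size times an overhead factor (KV cache, activations, buffers)
    approximates the VRAM needed. The 1.2 factor is an assumption and
    will mispredict for long contexts or unusual backends."""
    return model_file_bytes * overhead_factor <= vram_bytes

# A ~7 GB Q4 model on a 16 GB GPU probably fits; a ~13 GB one on 8 GB does not.
print(fits_in_vram(7_000_000_000, 16_000_000_000))   # True
print(fits_in_vram(13_000_000_000, 8_000_000_000))   # False
```

Even with such a check, the app would still need a fallback path (partial GPU offload or CPU inference) for the cases the estimate gets wrong.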