marella / ctransformers

Python bindings for the Transformer models implemented in C/C++ using GGML library.


Add Support for Google/Gemma-2b-it

Arya920 opened this issue · comments

I am facing the same error. I am using a GGUF version of a fine-tuned Gemma-2b-it model, loaded with the following import:

from langchain_community.llms import CTransformers

Model link: https://huggingface.co/Shritama/GEMMA-2b-GGUF/tree/main
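For context, here is a minimal sketch of the loading call I am using through the LangChain wrapper (the `model_type` and `config` values are assumptions about a typical setup, not anything prescribed by ctransformers; the model path is the one that appears in the error below):

```python
from langchain_community.llms import CTransformers

# Minimal sketch of loading the local GGUF file via the LangChain CTransformers wrapper.
# The path is the file from the error below; model_type and config are assumed values.
# "gemma" may not be a model type ctransformers recognizes yet, which is what this issue asks for.
llm = CTransformers(
    model=r"D:\ISnartech Folder\Project_Folder\Streamlit APP\GgufModels\Q4_K_M.gguf",
    model_type="gemma",
    config={"max_new_tokens": 256, "temperature": 0.7},
)

print(llm.invoke("Hello, how are you?"))
```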
When I run inference, it fails with the following error:

RuntimeError: Failed to create LLM 'gguf' from 'D:\ISnartech Folder\Project_Folder\Streamlit APP\GgufModels\Q4_K_M.gguf'.

Please help.