ollama / ollama-js

Ollama JavaScript library

Home Page: https://ollama.ai


Do local calls have hardware requirements?

jiangming96 opened this issue · comments

Is there a minimum GPU memory (VRAM) requirement when calling a local model through ollama-js? Can the model be used directly, without any pretraining?

Hi @jiangming96, this is a client library for the Ollama server and doesn't contain the LLM runtime itself as of now:
github.com/ollama/ollama

In general, though, the memory required varies by model. You can see some examples in the Ollama repo README linked above. Let me know if you have any more questions.
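For context, here is a minimal sketch of how the client is typically used. It assumes the Ollama server is already running locally on its default port (11434) and that a model has already been pulled (e.g. with `ollama pull`); the model name `llama3.1` below is just an example.

```js
// Minimal sketch: send a chat request to a locally running Ollama server
// via the ollama-js client. The model must already be available locally.
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'llama3.1', // example model name; substitute whatever you have pulled
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})

console.log(response.message.content)
```

The client itself has no GPU requirements; the memory question applies to the Ollama server running the model, not to this library.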


Okay, thank you.