mudler / LocalAI

:robot: The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many other model architectures. It can generate text, audio, video, and images, and also supports voice cloning.

Home Page: https://localai.io

Possible to use it without Docker?

coldcode69 opened this issue · comments

Would really appreciate some instructions or guidance for getting this working directly, without Docker. I noticed it's using a modified llama.cpp mixed with Go, but I don't have enough knowledge to build it myself. I did try building it, but got an error about llama.h not being found.

good point, I'll have a look as soon as possible.

You can try and copy my Makefile from here: https://github.com/go-skynet/llama-cli/blob/ad18b8305449b54f63383408bc47c246cacdf419/Makefile

If you have all the required Go dependencies, you can use make build and make run to run it on your machine directly.

@mudler if this Makefile is tested and merged, then we could close this issue?

(we could make it a separate PR if you wish)

You should now be able to do 'make build' if you use the latest version and have all the required dependencies (i.e. go, cmake).
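For reference, the non-Docker flow is roughly the following (a sketch assuming go and cmake are installed; the exact binary name, flags and defaults may differ between versions, so check the current README):

```sh
# clone and build the server binary (this also pulls and compiles the llama.cpp backend)
git clone https://github.com/go-skynet/LocalAI
cd LocalAI
make build

# put your ggml model files in a directory and point the server at it
mkdir -p models
./local-ai --models-path ./models
```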

@mudler I think we can close this item.

We would need to update the documentation though, so this issue doesn't re-appear for lack of docs.

I don't have experience in GoLang. Currently I have a question-answering chatbot developed using LangChain (Python) and hosted as an AWS Lambda function. You may want to look into the GitHub repo at https://github.com/limcheekin/serverless-flutter-gpt.

I want to swap out the OpenAI API dependency for LocalAI if possible. I'd appreciate your advice on whether that is feasible, and please share the steps on how to do it.

Thanks in advance.

@limcheekin 👋 feel free to join our Discord channel; however, I think it should be as simple as specifying a different base_url in the OpenAI client you already use, pointing it to where the LocalAI API runs.
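For example, with the Python openai client (0.x series) it's just a matter of overriding api_base; the host, port and model name below are placeholders, assuming a LocalAI instance listening on localhost:8080:

```python
import openai

# Point the client at the local server instead of api.openai.com
# (host/port assumed; use wherever your LocalAI instance is running).
openai.api_base = "http://localhost:8080/v1"
openai.api_key = "sk-not-needed"  # LocalAI ignores the key by default, but the client requires a value

response = openai.ChatCompletion.create(
    model="ggml-gpt4all-j",  # example model name; use one present in your models directory
    messages=[{"role": "user", "content": "Hello, who are you?"}],
)
print(response.choices[0].message["content"])
```

With LangChain the same idea applies: the OpenAI/ChatOpenAI wrappers accept an openai_api_base argument, so nothing beyond configuration should need to change.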

I'm closing this issue now, as building locally should be possible with make build. I've opened #66 to track having binary releases.

Update: binary releases are now available too: https://github.com/go-skynet/LocalAI/releases/tag/v1.15.0

How do we use it?