LlamaEdge

The easiest, smallest, and fastest local LLM runtime and API server, for creating cross-platform LLM agents and web services in Rust.

Location: United States of America

Home Page: https://LlamaEdge.com/

Twitter: @realwasmedge

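The organization's flagship component is the API server, which exposes an OpenAI-compatible HTTP API for locally hosted models. The Rust sketch below shows how a client might call it with reqwest and serde_json; the port (8080), the /v1/chat/completions path, and the model name are assumptions for illustration rather than values stated on this page.

```rust
// A minimal client sketch, assuming a LlamaEdge API server is already running
// locally and serving an OpenAI-compatible API on port 8080.
//
// Assumed Cargo.toml dependencies:
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"

use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Endpoint and model name are placeholders; adjust to your local setup.
    let url = "http://localhost:8080/v1/chat/completions";
    let body = json!({
        "model": "llama-3-8b-chat",
        "messages": [
            { "role": "system", "content": "You are a helpful assistant." },
            { "role": "user", "content": "What is LlamaEdge?" }
        ]
    });

    let resp: Value = reqwest::blocking::Client::new()
        .post(url)
        .json(&body)
        .send()?
        .json()?;

    // OpenAI-style responses put the reply under choices[0].message.content.
    match resp["choices"][0]["message"]["content"].as_str() {
        Some(content) => println!("{content}"),
        None => eprintln!("unexpected response: {resp}"),
    }
    Ok(())
}
```

The server itself is typically launched with the WasmEdge runtime and a GGUF model file; see the LlamaEdge repository below for details.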

LlamaEdge's repositories

LlamaEdge

The easiest & fastest way to run customized and fine-tuned LLMs locally or on the edge

Language: Rust · License: Apache-2.0 · Stargazers: 734 · Issues: 17 · Issues: 85

Example-LlamaEdge-RAG

This repo contains code demonstrating how to use LlamaEdge RAG to build a RAG app.

Language: Rust · Stargazers: 6 · Issues: 1 · Issues: 0
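The RAG pattern this example demonstrates is: embed the user question, retrieve the most relevant text chunks from a vector store (such as Qdrant, mirrored further down), and feed them to the model as context. The sketch below covers only the prompt-assembly step; the model name, prompt wording, and sample chunks are illustrative placeholders, not code from the repo.

```rust
// A minimal sketch of the prompt-assembly step in a RAG app, assuming the
// relevant chunks have already been retrieved from a vector store.
//
// Assumed Cargo.toml dependency: serde_json = "1"

use serde_json::{json, Value};

/// Build an OpenAI-style chat request that grounds the answer in retrieved context.
fn build_rag_request(question: &str, retrieved_chunks: &[String]) -> Value {
    // Join the retrieved chunks into a single context block.
    let context = retrieved_chunks.join("\n---\n");
    json!({
        "model": "llama-3-8b-chat",   // placeholder model name
        "messages": [
            {
                "role": "system",
                "content": format!(
                    "Answer the question using only the context below.\n\nContext:\n{context}"
                )
            },
            { "role": "user", "content": question }
        ]
    })
}

fn main() {
    let chunks = vec![
        "LlamaEdge runs GGUF models locally on the WasmEdge runtime.".to_string(),
        "Its API server exposes OpenAI-compatible endpoints.".to_string(),
    ];
    let request = build_rag_request("How does LlamaEdge serve models?", &chunks);
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```

The resulting request body can be posted to the same chat completions endpoint shown in the client sketch above.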

docker

Docker image for all models

Language: Shell · License: Apache-2.0 · Stargazers: 1 · Issues: 2 · Issues: 0

qdrant

Qdrant - High-performance, massive-scale Vector Database for the next generation of AI. Also available in the cloud https://cloud.qdrant.io/

Language: Rust · License: Apache-2.0 · Stargazers: 0 · Issues: 0 · Issues: 0
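For a RAG workflow, Qdrant is driven through three calls: create a collection, upsert points, and search by vector. The Rust sketch below exercises those calls over Qdrant's REST API; it assumes a local instance on the default REST port 6333, and the collection name, toy 4-dimensional vectors, and payload are placeholders.

```rust
// A minimal sketch against Qdrant's REST API, assuming a local instance on the
// default REST port 6333. Vector size, values, and payload are toy examples.
//
// Assumed Cargo.toml dependencies:
//   reqwest = { version = "0.12", features = ["blocking", "json"] }
//   serde_json = "1"

use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let base = "http://localhost:6333";
    let client = reqwest::blocking::Client::new();

    // Create a collection that stores 4-dimensional cosine-distance vectors.
    client
        .put(format!("{base}/collections/demo"))
        .json(&json!({ "vectors": { "size": 4, "distance": "Cosine" } }))
        .send()?
        .error_for_status()?;

    // Upsert a single point with a text payload.
    client
        .put(format!("{base}/collections/demo/points?wait=true"))
        .json(&json!({
            "points": [
                { "id": 1, "vector": [0.1, 0.2, 0.3, 0.4], "payload": { "text": "hello qdrant" } }
            ]
        }))
        .send()?
        .error_for_status()?;

    // Search for the nearest neighbours of a query vector.
    let result: Value = client
        .post(format!("{base}/collections/demo/points/search"))
        .json(&json!({ "vector": [0.1, 0.2, 0.3, 0.4], "limit": 3, "with_payload": true }))
        .send()?
        .json()?;

    println!("{}", serde_json::to_string_pretty(&result)?);
    Ok(())
}
```

In an actual LlamaEdge RAG app the stored vectors would come from an embeddings model served by the API server rather than being hard-coded.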

www

Public website

Language: HTML · Stargazers: 0 · Issues: 0 · Issues: 0