AlanJager's starred repositories
rust-out-of-tree-module
Basic template for an out-of-tree Linux kernel module written in Rust.
GraphRAG-Local-UI
GraphRAG using local LLMs. Features a robust API and multiple apps for indexing, prompt tuning, querying, chat, and visualization; meant to be the ultimate local-LLM GraphRAG/knowledge-graph app.
knowledge_graph
Convert any text to a knowledge graph. This can be used for graph-augmented generation or knowledge-graph-based Q&A.
chatGPTBox
Deep ChatGPT integration for your browser; everything you need is here.
confluence-dumper
A tool to recursively export Confluence spaces and pages via the Confluence API.
llama_parse
Parse files for optimal RAG
bootloader
An experimental pure-Rust x86 bootloader
FlagEmbedding
Retrieval and Retrieval-augmented LLMs
anything-llm
The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
lobe-chat
🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), knowledge bases (file upload / knowledge management / RAG), multi-modal input and output (vision / TTS), and a plugin system. One-click free deployment of your private ChatGPT / Claude application.
rust-vmm-container
Container with all dependencies required for running rust-vmm crates integration tests.
llama-recipes
Scripts for fine-tuning Meta Llama with composable FSDP & PEFT methods, covering single- and multi-node GPU setups. Supports default and custom datasets for applications such as summarization and Q&A, and several inference solutions (e.g., HF TGI, vLLM) for local or cloud deployment. Includes demo apps showcasing Meta Llama for WhatsApp & Messenger.
llama_index
LlamaIndex is a data framework for your LLM applications.
play-music-particle-visualizer
A reverse engineering of the Google Play Music particle visualizer.
open-interpreter
A natural language interface for computers
LocalAI
The free, open-source alternative to OpenAI, Claude, and others. Self-hosted and local-first; a drop-in replacement for OpenAI that runs on consumer-grade hardware, no GPU required. Runs gguf, transformers, diffusers, and many more model architectures. Features: text, audio, video, and image generation, voice cloning, and distributed inference.