There are 29 repositories under the local topic.
🏩 A simple process manager for developers. Start apps from your browser and access them using local domains
Local Deep Research achieves ~95% on SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources - arXiv, PubMed, web, and your private documents. Everything Local & Encrypted.
Tired of pushing to test your .gitlab-ci.yml?
TypeScript-centric app development platform: notebook and AI app builder
System font stack CSS organized by typeface classification for every modern operating system
Claudable is an open-source web builder that leverages local CLI agents, such as Claude Code, Codex, Gemini CLI, Qwen Code, and Cursor Agent, to build and deploy products effortlessly.
Free, local, open-source GUI app for Gemini CLI: better chat UI, multi-agent support, multiple LLMs with API key polling, workspaces, MCP, remote WebUI mode, and more | 🌟 Star if you like it!
An open source approach to locally record and enable searching everything you view on your Mac.
A datetime library for Rust that encourages you to jump into the pit of success.
🌧 An easy-to-use API for devices that use Tuya's cloud services. Documentation: https://codetheweb.github.io/tuyapi.
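As a rough illustration of controlling a device locally with this library, here is a hedged sketch built on tuyapi's documented find/connect/set flow; the device id and key are placeholders you would obtain when linking the device, and the import style assumes a TypeScript setup with esModuleInterop.

```typescript
// Sketch only: the id and key below are placeholders, not real credentials.
import TuyAPI from 'tuyapi';

const device = new TuyAPI({
  id: 'xxxxxxxxxxxxxxxxxxxx',  // placeholder device id
  key: 'xxxxxxxxxxxxxxxx',     // placeholder device key
});

async function toggle() {
  // Log state reports as they arrive.
  device.on('data', (data) => console.log('Device state:', data));

  await device.find();              // locate the device on the local network
  await device.connect();
  await device.set({ set: true });  // switch the default property on
  device.disconnect();
}

toggle().catch(console.error);
```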
Authentication built for Nuxt 3! Easily add authentication via OAuth providers, credentials or Email Magic URLs!
ExHentai local manga tag manager and reader
Supabase CLI. Manage Postgres migrations, run Supabase locally, deploy edge functions, back up Postgres, and generate types from your database schema.
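For a concrete feel of the local workflow, here is a hedged sketch of pointing the supabase-js client at the locally running stack; the todos table, the generated database.types file, and the anon key are assumptions, and the URL is the default local API address printed by supabase start.

```typescript
// Sketch only: assumes `supabase start` is running, that types were generated
// with `supabase gen types typescript --local > database.types.ts`, and that a
// hypothetical `todos` table exists. The anon key is a placeholder.
import { createClient } from '@supabase/supabase-js';
import type { Database } from './database.types';

const supabase = createClient<Database>(
  'http://localhost:54321',                          // default local API URL
  process.env.SUPABASE_ANON_KEY ?? 'local-anon-key', // printed by `supabase start`
);

async function listTodos() {
  const { data, error } = await supabase.from('todos').select('*');
  if (error) throw error;
  console.log(data); // rows are typed from the generated schema
}

listTodos().catch(console.error);
```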
A VM for Drupal development
Fully-featured web interface for Ollama LLMs
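To make the moving parts concrete, here is a hedged sketch of the kind of request such a web interface sends to a local Ollama server; it assumes Ollama's default port 11434 and that a model named llama3 has already been pulled.

```typescript
// Sketch only: assumes a local Ollama server on its default port and a
// pulled model named `llama3`.
async function ask(prompt: string): Promise<string> {
  const res = await fetch('http://localhost:11434/api/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'llama3', prompt, stream: false }),
  });
  const json = await res.json();
  return json.response; // non-streaming responses carry the full completion here
}

ask('Why is the sky blue?').then(console.log).catch(console.error);
```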
Run local LLMs like Llama, DeepSeek distills, Kokoro, and more inside your browser
An LLM-driven recommendation system based on Radarr and Sonarr library or watch-history information
🔹 A Home Assistant integration to handle Tuya devices locally (forked from localtuya)
ChattyUI - your private AI chat for running LLMs in the browser
My personal notes on local and global descriptors
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
Swift database - fast, simple and lightweight (iOS, macOS)
From anywhere you can type, query and stream the output of any script (e.g. an LLM)
The local notification plugin provides a way to show local notifications from .NET MAUI and Xamarin.Forms apps.
Software environment for web development
ARGO is an open-source AI agent platform that brings Local Manus to your desktop. With one-click model downloads, seamless closed-LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking and task planning, and 100% of your data stays local. Supports Windows, macOS, and Docker.
A Blade component to quickly login to your local environment
multi1: create o1-like reasoning chains with multiple AI providers (and locally). Also supports LiteLLM as a backend for 100+ providers at once.
🎯 Testing your project locally in a clean environment.