Repositories under the agent-memory topic:
🧠 Make your agents learn from experience. Based on the Agentic Context Engineering (ACE) framework.
AgentChat is an LLM-based agent communication platform with built-in default Agents and support for user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, function calling, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, and uses FastAPI to build a high-performance backend service.
The Cursor10x MCP is a persistent multi-dimensional memory system for Cursor that enhances AI assistants with conversation context, project history, and code relationships across sessions.
Simple standalone MCP server giving Claude the ability to remember your conversations and learn from them over time.
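For a sense of what such a server involves, here is a minimal sketch of a conversation-memory MCP server built with the MCP Python SDK's FastMCP helper; the remember/recall tools and the in-memory list are hypothetical illustrations, not this repository's implementation.

```python
# Minimal sketch of a conversation-memory MCP server (hypothetical tools,
# not this repository's actual implementation).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("conversation-memory")

# Naive in-memory store; a real server would persist to disk or a database.
_memories: list[str] = []

@mcp.tool()
def remember(note: str) -> str:
    """Store a note from the current conversation."""
    _memories.append(note)
    return f"Stored memory #{len(_memories)}"

@mcp.tool()
def recall(query: str) -> list[str]:
    """Return stored notes that mention the query string."""
    return [m for m in _memories if query.lower() in m.lower()]

if __name__ == "__main__":
    # Runs over stdio by default, which is how Claude Desktop launches MCP servers.
    mcp.run()
```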
DevContext is a cutting-edge Model Context Protocol (MCP) server designed to provide developers with continuous, project-centric context awareness. Unlike traditional context systems, DevContext continuously learns from and adapts to your development patterns, delivering highly relevant context for a deeper understanding of your codebase.
AI agent that controls your computer with OS-level tools; MCP-compatible and works with any model.
Agent Memory Playground: AI Agent Memory Design & Optimization Techniques
A memory and context manager that just works.
LangGraph TypeScript agent notebooks: email, human-in-the-loop, memory.
Curated papers, systems, and benchmarks on memory for LLMs/MLLMs—long-term context, retrieval, and reasoning.
MCP server that executes AppleScript, giving you full control of your Mac.
A lightweight MCP server that integrates with Apple Notes to create a personal memory system for AI. Easily recall and save information from your Mac using simple AppleScript commands. Compatible with all macOS versions with minimal setup requirements.
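To illustrate the mechanism these two servers rely on, the sketch below shows one way a Python tool can drive AppleScript through osascript; the save_note helper and the Notes script are hypothetical examples, not code from either project, and the exact AppleScript may need adjusting for your macOS version and account setup.

```python
# Hypothetical sketch: how an MCP tool might shell out to AppleScript on macOS.
# The osascript invocation pattern is standard; the Notes script is illustrative.
import subprocess

def run_applescript(script: str) -> str:
    """Run an AppleScript snippet via osascript and return its stdout."""
    result = subprocess.run(
        ["osascript", "-e", script],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def save_note(title: str, body: str) -> str:
    # Create a note in Apple Notes; quoting is kept simple for illustration.
    script = (
        'tell application "Notes" to make new note '
        f'with properties {{name:"{title}", body:"{body}"}}'
    )
    return run_applescript(script)

if __name__ == "__main__":
    print(save_note("AI memory", "Remember: the user prefers short answers."))
```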
Excuse me sir? Did you order Special Sauce with that Agentic workflow?
This repository introduces the Letta framework, empowering developers to build LLM-based agents with long-term, persistent memory and advanced reasoning capabilities. It leverages concepts from MemGPT to optimize context usage and enable multi-agent collaboration for real-world applications like research, HR, and task management.
This project is a concept demonstration of a layered memory system for Large Language Models. It includes a CLI chatbot and an AI playing Zork I with 'FROTZ' as examples. The true value lies in the memory_handle.py module, designed for easy integration into any Python project requiring LLM memory management (AI agents, games, etc.).
Learn Agentic AI with notes, resources, and practice code covering Generative AI basics, the OpenAI Agents SDK, prompt engineering, memory, model settings, design patterns, and more.
🤖 CodeForge AI: An autonomous multi-agent coding system powered by LangGraph for agentic software development and automated workflows. Features custom agentic GraphRAG, shared-state memory, automatic model routing for cost optimization, and a range of custom tooling.
The llm_to_mcp_integration_engine is a communication layer designed to enhance the reliability of interactions between LLMs and tools (like MCP servers or functions).
Token-efficient Claude Code workspace with parallel agents and persistent memory. Research → Plan → Implement → Validate workflow.
Knowledge graph agent for personal knowledge management
A production-ready FastAPI service for semantic memory management using Mem0. Store, search, and retrieve AI agent memories with vector embeddings and LLM-powered inference. Perfect for building context-aware conversational AI.
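As a rough sketch of the store/search pattern such a service exposes, the FastAPI app below uses a toy in-memory list and keyword matching in place of Mem0, vector embeddings, and LLM inference; the endpoints and models are hypothetical, not this project's actual API.

```python
# Minimal sketch of a memory-service API in FastAPI. A substring match stands
# in for semantic (embedding) search; endpoints and models are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="agent-memory-sketch")

class MemoryIn(BaseModel):
    user_id: str
    text: str

_store: list[MemoryIn] = []  # a real service would use a vector DB + embeddings

@app.post("/memories")
def add_memory(memory: MemoryIn) -> dict:
    """Store one memory for a user."""
    _store.append(memory)
    return {"id": len(_store), "stored": True}

@app.get("/memories/search")
def search_memories(user_id: str, query: str) -> list[str]:
    """Return this user's memories that mention the query string."""
    return [
        m.text for m in _store
        if m.user_id == user_id and query.lower() in m.text.lower()
    ]
```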
Rails engine that bootstraps AI-ready docs and project memory with SPEC-driven workflows and guides for Rails teams.
My Jarvis is an Alexa skill that stores and recalls user memories to give more contextual answers. It uses the Redis Agent Memory Server as a memory layer for near-instant responses, making your Alexa device feel more human-like in conversation.
AI-powered assistant that indexes Google Drive files to a vector store on upload and answers user queries based on the content.
💬 Engage in continuous chat while dynamically building a JSONL playbook, enhancing insights and strategies with each interaction.
A modular Rust-based self-learning episodic memory system for AI agents, featuring hybrid storage with Turso (SQL) and redb (KV), async execution tracking, reward scoring, reflection, and pattern-based skill evolution. Designed for real-world applicability, maintainability, and scalable agent workflows.
Reason over memory to produce contextual guidance — a Nemori-AI subproject on memory.