There are 7 repositories under the llm-ops topic.
A RAG (Retrieval-Augmented Generation) framework by TrueFoundry for building modular, open-source applications for production
AIConfig is a config-based framework to build generative AI applications.
An end-to-end LLM reference implementation providing a Q&A interface over Airflow and Astronomer
Python SDK for running evaluations on LLM-generated responses
Friendli: a fast serving engine for generative AI
Miscellaneous code and writings for MLOps
A Streamlit-based chatbot that uses Ollama via LangChain, with PostHog-LLM for logging and monitoring