bebekim / llm-ollama-llamaindex-bootstrap-ui

Home Page: https://tyrell.co


Retrieval-Augmented Generation (RAG) Bootstrap Application UI

This is a LlamaIndex project bootstrapped with create-llama, acting as a full stack UI to accompany the Retrieval-Augmented Generation (RAG) Bootstrap Application, which lives in its own repository at https://github.com/tyrell/llm-ollama-llamaindex-bootstrap.

My blog post provides more context on the motivation and thinking behind these projects.

UI Screenshot

The backend code of this application has been modified as follows:

  1. Loading the vector store index previously created by the Retrieval-Augmented Generation (RAG) Bootstrap Application, in response to user queries submitted through the frontend UI.
    • Refer to backend/app/utils/index.py and its code comments to understand the modifications.
  2. Querying the index with streaming enabled.
    • Refer to backend/app/api/routers/chat.py and its code comments to understand the modifications.
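The two modifications above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the llama_index calls follow the pre-0.10 package layout that create-llama used at the time, the ./storage path is an assumption, and a stand-in token generator replaces the real LLM output so the streaming path runs without the library or an index on disk.

```python
from typing import Iterable, Iterator

def load_persisted_index(persist_dir: str = "./storage"):
    # Modification 1 (backend/app/utils/index.py): rebuild the vector
    # store index persisted by the companion backend repository instead
    # of re-indexing documents on every request. The import is deferred
    # so the rest of this sketch runs without llama_index installed.
    from llama_index import StorageContext, load_index_from_storage
    storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
    return load_index_from_storage(storage_context)

def stream_tokens(token_gen: Iterable[str]) -> Iterator[str]:
    # Modification 2 (backend/app/api/routers/chat.py): forward each
    # token to the client as soon as it arrives instead of waiting for
    # the complete answer; FastAPI's StreamingResponse wraps a generator
    # like this. In the real route the source would be
    # index.as_query_engine(streaming=True).query(...).response_gen.
    for token in token_gen:
        yield token

# Stand-in token generator so the streaming path is runnable here.
fake_response_gen = iter(["Retrieval", "-", "Augmented", " ", "Generation"])
answer = "".join(stream_tokens(fake_response_gen))
print(answer)  # Retrieval-Augmented Generation
```

Loading the persisted index avoids re-embedding the source documents on every request, and streaming lets the frontend render the answer token by token rather than after a long blocking call.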

Running the full stack application

First, start up the backend as described in the backend README.

Second, run the development server of the frontend as described in the frontend README.

Open http://localhost:3000 with your browser to see the result.

License

Apache 2.0

~ Tyrell Perera
