yacineali74 / omni-complete

LLM Autocomplete 🌐

Exploring a valuable LLM use case: LLM-Powered, Reusable, Domain-Agnostic Autocompletes


Why care about LLM-powered autocompletes? 🤔

  • Time Efficiency: Every second saved during user interaction enhances user satisfaction and directly boosts your earnings potential.
  • Reusability Across Domains: These autocompletes can be seamlessly integrated into various tools and applications, regardless of the domain, making them incredibly versatile.
  • Adaptive Learning: LLM autocompletes self-improve with each interaction, becoming more accurate and efficient over time.
  • Actionable Insights: They provide valuable data about user preferences and needs, which can inform business strategies and product improvements.
  • Future-Proof: Staying close to the evolving capabilities of LLMs means that any advancements in the technology will only enhance the functionality of your autocompletes.

Setup 🛠️

Frontend (Vue.js) 🖥️

  • yarn
  • yarn dev

Backend (Python Flask) 🐍

  • cd server
  • cp .env.sample .env (to create server/.env)
  • Fill in .env
  • python -m venv venv
  • source venv/bin/activate (Linux/Mac) or venv\Scripts\activate (Windows)
  • pip install -r requirements.txt
  • python main.py
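
Once the server is running, it exposes the completion logic over HTTP. As a rough sketch of what such an endpoint could look like (the route name `/autocomplete`, the `input` field, and the `complete()` stub below are hypothetical illustrations, not the actual `server/main.py` API; the real server wires in the LLM call to Groq/Llama 3):

```python
from flask import Flask, request, jsonify

app = Flask(__name__)


def complete(prefix: str) -> str:
    # Stand-in for the LLM call; the real backend would send the prefix
    # (plus domain context) to the model and return its completion.
    return prefix + " [completion]"


@app.route("/autocomplete", methods=["POST"])
def autocomplete():
    # Accept a JSON body like {"input": "partial text"} and return the
    # model's suggested continuation.
    body = request.get_json(force=True)
    return jsonify({"completion": complete(body.get("input", ""))})


if __name__ == "__main__":
    app.run(port=5000)
```

The frontend would then POST the user's partial input to this endpoint on each keystroke (typically debounced) and render the returned completion inline.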

Prompt Testing (Promptfoo + Llama 3 + Groq) 🧪

  • cp .env.sample .env
  • Fill in .env
  • yarn
  • yarn ptest
  • yarn view

Callouts 📢

  • There are two .env files to set up: .env (repo root) and server/.env.

Resources 📚

Languages

  • Vue 59.5%
  • Python 31.5%
  • CSS 3.0%
  • HTML 2.9%
  • TypeScript 2.2%
  • Shell 1.0%