vcappuccio / achainflow

**Ever faced a complex problem that requires expertise from diverse sources?** 🤯


This Python script, achainflow.py, consults a number of LLMs about a single query. It interacts with the Groq and Ollama APIs to gather advice from several models and then generates a final answer based on their responses.

Functionality

  • The script defines functions that call the Groq and Ollama APIs for chat completions, sending the same question to several different models.

  • It asynchronously consults advisors backed by different models and aggregates their advice into a step-by-step plan (see the sketch after this list).

  • A final model takes the advice from the other advisors and generates the final answer.

  • Users can input a problem statement, which is then processed by a series of advisors to generate a final answer.
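
To make the flow concrete, here is a minimal sketch of the advisor chain using the official groq and ollama Python clients. It is an illustration under stated assumptions, not the exact code in achainflow.py: the function names, prompts, and model names (llama3-70b-8192 on Groq, llama3 on Ollama) are placeholders.

```python
# Sketch of the advisor chain (illustrative; not the exact code in achainflow.py).
import asyncio

from groq import AsyncGroq        # pip install groq
from ollama import AsyncClient    # pip install ollama


async def ask_groq(question: str, model: str = "llama3-70b-8192") -> str:
    """Ask a Groq-hosted advisor model for its advice (model name is a placeholder)."""
    client = AsyncGroq()  # reads GROQ_API_KEY from the environment
    resp = await client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content


async def ask_ollama(question: str, model: str = "llama3") -> str:
    """Ask a locally running Ollama advisor model for its advice."""
    resp = await AsyncClient().chat(
        model=model,
        messages=[{"role": "user", "content": question}],
    )
    return resp["message"]["content"]


async def consult(question: str) -> str:
    # Query the advisors concurrently, then ask one model to merge their advice
    # into a single step-by-step plan.
    advice = await asyncio.gather(ask_groq(question), ask_ollama(question))
    summary_prompt = (
        "Combine the following advice into a single step-by-step plan:\n\n"
        + "\n\n---\n\n".join(advice)
    )
    return await ask_groq(summary_prompt)


if __name__ == "__main__":
    print(asyncio.run(consult("How should I plan a zero-downtime database migration?")))
```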

Usage

  • Make sure Streamlit is installed in your Python environment, then launch the app with `streamlit run achainflow.py`.
  1. Click the "Solve my problems." button to initiate the consultation process.

  2. The script will display the final, aggregated answer generated by the chain of advisors.
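
For orientation, the Streamlit front end can be as small as the sketch below. It assumes a consult() coroutine like the one shown earlier; the button label follows this README, but the actual widgets and labels in achainflow.py may differ.

```python
# Minimal Streamlit front end (illustrative; widget names and labels are assumptions).
import asyncio

import streamlit as st


async def consult(question: str) -> str:
    # Placeholder: in the real app this is the advisor chain (see the earlier sketch).
    return f"Step-by-step plan for: {question}"


st.title("achainflow")
problem = st.text_area("Describe your problem")

if st.button("Solve my problem") and problem:
    with st.spinner("Consulting the advisors..."):
        answer = asyncio.run(consult(problem))
    st.markdown(answer)
```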

Requirements

To ensure the script runs smoothly, create a `requirements.txt` file with the dependencies the code needs:

Requirements for achainflow.py

streamlit
groq
ollama
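
With these three packages listed, the dependencies can be installed in one step before launching the app, for example with `pip install -r requirements.txt`.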

Get started today:

  1. Click the "Solve my problem" button and input your specific concern.
  2. Witness the collaborative power of AI as it generates a tailored solution plan.

About

License: MIT License


Languages

Language: Python 100.0%