Code Assistant with Ollama and CodeLlama

This project provides a user-friendly Gradio interface that lets you interact with a custom model built on the CodeLlama model from Ollama, an open-source platform for running large language models locally. Simply enter your prompt in the textbox, and the custom model will generate code based on your input.
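
The sketch below illustrates how such an interface can be wired up; it is not the repository's exact app.py. It assumes the Ollama server is running on its default port (11434), that the custom model was created under the hypothetical name code-assistant, and that the gradio and requests packages are installed.

import gradio as gr
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
MODEL_NAME = "code-assistant"  # hypothetical custom model name

def generate_code(prompt: str) -> str:
    # Send the prompt to the local Ollama server and return the full (non-streamed) response.
    response = requests.post(
        OLLAMA_URL,
        json={"model": MODEL_NAME, "prompt": prompt, "stream": False},
        timeout=300,
    )
    response.raise_for_status()
    return response.json()["response"]

demo = gr.Interface(
    fn=generate_code,
    inputs=gr.Textbox(lines=4, placeholder="Describe the code you want..."),
    outputs=gr.Textbox(label="Generated code"),
    title="Code Assistant with Ollama and CodeLlama",
)

if __name__ == "__main__":
    demo.launch()  # serves the interface at http://127.0.0.1:7860 by default

Running python app.py with a script along these lines starts the interface at http://127.0.0.1:7860, matching the Usage steps below.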

Installation:

  1. Clone this repository:
git clone https://github.com/shaadclt/CodeAssistant-Ollama-CodeLlama.git
  2. Install dependencies:
cd CodeAssistant-Ollama-CodeLlama
pip install -r requirements.txt

Usage:

  1. Install Ollama on your machine.
  2. Download the CodeLlama model.
  3. Create a Modelfile for the custom model.
  4. Create the custom model with Ollama.
  5. Run the custom model (a sketch of the commands for steps 1-5 appears after this list).
  6. Start the Gradio interface:
python app.py
  7. Open http://127.0.0.1:7860 in your web browser.
  8. Enter your prompt in the textbox and click "Submit" to generate code based on your input.
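
The repository does not spell out the exact commands or names for steps 1 through 5, so the following is a sketch under assumed names: the base model codellama and a custom model called code-assistant defined in a file named Modelfile.

# 1. Install Ollama (Linux installer shown; see https://ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh
# 2. Download the CodeLlama base model
ollama pull codellama
# 3. Write a Modelfile, for example two lines:
#      FROM codellama
#      SYSTEM "You are a code assistant. Respond with clear, working code."
# 4. Build the custom model from the Modelfile
ollama create code-assistant -f Modelfile
# 5. Run the custom model to check that it responds to prompts
ollama run code-assistant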

Contributing:

We welcome contributions to this project! Feel free to fork the repository, make your changes, and submit a pull request.

License:

This project is licensed under the MIT License.
