Samuel-Ricardo / Rust-Generative-Language-Model-Chat

Creates a Chat with an Open Source Generative Language Model, with frontend and backend in Rust



🚀 🟦 Llama Chat 🟦 🚀

Full Stack Rust WebAssembly Chat to talk with Open Source AI Llama Models

|   Overview   |    Technologies   |    Run   |   Author   |   


| πŸ›°οΈ About:

An online chat built with Leptos and Rust using WebAssembly: you can talk with Open Source AI Llama Models in real time, locally, on your own computer. Styling is done with TailwindCSS.
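
A minimal sketch of what a Leptos WebAssembly component can look like, assuming Leptos 0.5+; the component name, signal, and markup below are illustrative only and are not taken from this repository.

  use leptos::*;

  // Hypothetical chat input component (not this repository's actual code):
  // a signal holds the draft message and the view updates as it changes.
  #[component]
  fn ChatInput() -> impl IntoView {
      let (message, set_message) = create_signal(String::new());

      view! {
          <input
              type="text"
              prop:value=message
              on:input=move |ev| set_message.set(event_target_value(&ev))
          />
          <p>"Draft: " {message}</p>
      }
  }

  fn main() {
      // Mount the component to <body> when compiled to WebAssembly.
      leptos::mount_to_body(|| view! { <ChatInput/> })
  }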


| πŸ—οΈ - Technologies and Concepts Studied:


  • Leptos
  • Rust
  • WebAssembly
  • Docker
  • TailwindCSS
  • SASS
  • Hugging Face
  • Llama
  • Actix
  • Serde

Among Others...
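
To illustrate how Actix and Serde from the list above typically work together on the backend, here is a minimal sketch of a JSON chat endpoint, assuming actix-web 4; the route, struct fields, and port are assumptions for illustration and not this project's actual API.

  use actix_web::{post, web, App, HttpServer, Responder};
  use serde::{Deserialize, Serialize};

  // Hypothetical request/response shapes (illustrative, not the repo's real types).
  #[derive(Deserialize)]
  struct ChatRequest {
      prompt: String,
  }

  #[derive(Serialize)]
  struct ChatResponse {
      reply: String,
  }

  // In the real project this handler would forward the prompt to a local
  // Llama model; here it simply echoes the prompt back as JSON.
  #[post("/chat")]
  async fn chat(req: web::Json<ChatRequest>) -> impl Responder {
      web::Json(ChatResponse {
          reply: format!("You said: {}", req.prompt),
      })
  }

  #[actix_web::main]
  async fn main() -> std::io::Result<()> {
      HttpServer::new(|| App::new().service(chat))
          .bind(("0.0.0.0", 8080))?
          .run()
          .await
  }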

πŸ‘¨β€πŸ’» | How to use


Open your Git Terminal and clone this repository

  $ git clone "git@github.com:Samuel-Ricardo/Rust-Generative-Language-Model-Chat.git"

Pull the latest changes

  $ git pull "git@github.com:Samuel-Ricardo/Rust-Generative-Language-Model-Chat.git"

This application uses Docker, so you don't need to install or configure anything other than Docker on your machine.


Navigate to the project folder and run it using docker-compose

  # After setting up the Docker environment, run these commands in the project root folder:

  $ docker-compose up --build   # For the first run, build and start the containers

  $ docker-compose up           # To run the project afterwards

  # App running on:

  APP: http://localhost:3000

  See more: docker-compose.yaml

:octocat: | Author:


- Samuel Ricardo


License: MIT License


Languages

CSS 31.5% · SCSS 30.4% · Rust 28.9% · TypeScript 7.6% · Dockerfile 1.3% · JavaScript 0.4%