Making a Signal Messenger-enabled chatbot, powered by LLMs on commodity hardware
```mermaid
flowchart TD
    Middleware --> C[LLM Handler]
    Middleware --> D[Signal Messenger API]
    Middleware --> E[Emotional State Machine]
    Middleware --> F[NLP]
    Middleware --> G[sampleMessages]
    C --> J[Llama 13b 8-bit quantized model]
```
At the core of this project are dockerized deployments of the Signal Messenger API and a llama.cpp server. The middleware is custom code that glues it all together.
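As a rough illustration, the two containers could be started along these lines. The image names, ports, and paths below are assumptions for the sketch (bbernhard/signal-cli-rest-api is a commonly used community image for the Signal API, and llama.cpp publishes a server image), not this repo's actual configuration:

```shell
# Signal Messenger REST API (image name and volume path are assumptions)
docker run -d --name signal-api \
  -p 8080:8080 \
  -v "$HOME/.local/share/signal-cli:/home/.local/share/signal-cli" \
  bbernhard/signal-cli-rest-api

# llama.cpp server with an 8-bit quantized Llama 13B model
# (model filename is a placeholder)
docker run -d --name llm \
  -p 8081:8080 \
  -v "$PWD/models:/models" \
  ghcr.io/ggml-org/llama.cpp:server \
  -m /models/llama-13b-q8_0.gguf --host 0.0.0.0 --port 8080
```

The middleware would then talk to both containers over HTTP on localhost.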
The "personality" is defined by a prompt injected in the LLM handler file. Currently Vulcan is set to act as a robotic pirate.
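A minimal sketch of what prompt-based personality definition can look like. The function, personality text, and mood field here are illustrative assumptions, not the repo's actual handler code:

```python
# Hypothetical sketch of an LLM handler's prompt assembly: the "personality"
# is a fixed system prompt prepended to every completion request.
PERSONALITY = (
    "You are Vulcan, a robotic pirate. Answer tersely, "
    "with nautical flair and the occasional mechanical whir."
)

def build_prompt(history: list, user_message: str, mood: str = "neutral") -> str:
    """Combine the fixed personality, current mood, and recent chat history
    into a single prompt string for the llama.cpp completion endpoint."""
    lines = [f"System: {PERSONALITY}", f"Current mood: {mood}"]
    lines.extend(history)
    lines.append(f"User: {user_message}")
    lines.append("Vulcan:")
    return "\n".join(lines)
```

Keeping the personality in one string makes it trivial to swap in a different character without touching the rest of the middleware.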
The emotional state is managed by a state machine (defined below).
- If he gets a rude message, there's a chance he'll become "angry" or "sad".
- If he receives a compliment, his mood improves.
- If he gets too many boring messages in a row, he gets "bored" and asks a question.
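The mood rules above can be sketched as a small state machine. The state names, probabilities, and boredom threshold below are illustrative assumptions, not the values used in the repo:

```python
import random

class EmotionalStateMachine:
    """Sketch of the mood rules: rude messages may sour the mood,
    compliments improve it, and a streak of boring messages triggers
    a 'bored' state where the bot asks a question."""

    # Assumed one-step "improvement" mapping for compliments.
    IMPROVE = {"angry": "neutral", "sad": "neutral", "bored": "neutral",
               "neutral": "happy", "happy": "happy"}

    def __init__(self, rng=None, boredom_threshold=3):
        self.mood = "neutral"
        self.boring_streak = 0
        self.boredom_threshold = boredom_threshold
        self.rng = rng or random.Random()

    def on_message(self, kind: str) -> str:
        """kind is 'rude', 'compliment', or 'boring' (in practice this
        classification would come from the NLP component)."""
        if kind == "boring":
            self.boring_streak += 1
            if self.boring_streak >= self.boredom_threshold:
                self.mood = "bored"
                self.boring_streak = 0
                return "ask_question"  # middleware should reply with a question
            return "reply"
        self.boring_streak = 0
        if kind == "rude" and self.rng.random() < 0.5:
            self.mood = self.rng.choice(["angry", "sad"])
        elif kind == "compliment":
            self.mood = self.IMPROVE[self.mood]
        return "reply"
```

The current mood can then be fed into the LLM prompt so the model colors its replies accordingly.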
Absolutely! You'll probably need to make changes to the configuration, though.
- State machine to simulate "moods"
- Personality tuning
  - Allow high-reputation conversational partners to influence the personality
- Proper logging of messages
- Fine-grained control of which Signal chats to respond to
  - Reply to new chats by default, on wakeword
- Memory
  - Short-term/conversational memory
  - Long-term/database memory
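One plausible shape for the short-term memory item above is a bounded per-chat buffer, so the prompt always fits the model's context window. This is a sketch under assumed names, not the planned implementation (long-term/database memory would persist selected facts instead of raw turns):

```python
from collections import defaultdict, deque

class ShortTermMemory:
    """Per-chat conversational memory that keeps only the last N turns."""

    def __init__(self, max_turns: int = 10):
        # Each chat gets its own fixed-size deque; old turns fall off the front.
        self._history = defaultdict(lambda: deque(maxlen=max_turns))

    def remember(self, chat_id: str, speaker: str, text: str) -> None:
        self._history[chat_id].append(f"{speaker}: {text}")

    def recall(self, chat_id: str) -> list:
        """Return the recent turns for a chat, oldest first."""
        return list(self._history[chat_id])
```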