schneiderfelipe / chat-splitter

Split chat messages by maximum chat completion token count

Home Page: https://schneiderfelipe.github.io/posts/chat-splitter-first-release/


chat-splitter


For more information, please refer to the blog announcement.

When using the async_openai Rust crate, you must keep each chat completion request within the maximum number of tokens allowed by OpenAI's chat models.

chat-splitter splits chat messages into 'outdated' and 'recent' groups, so that the recent messages always respect both a maximum message count and a maximum chat completion token count. Token counting is provided by tiktoken_rs.

Usage

Here's a basic example:

use chat_splitter::ChatSplitter;

// Get all your previously stored chat messages...
let mut stored_messages = /* get_stored_messages()? */;

// ...and split into 'outdated' and 'recent',
// where 'recent' always fits the context size.
let (outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);
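
If the defaults do not fit your model or context window, the splitter can be tuned before splitting. The sketch below is illustrative only: the builder methods shown (`max_messages` and `max_tokens`) are assumptions based on the crate's description rather than confirmed API, so check the documentation for the exact names.

// Cap the 'recent' group at 32 messages and 2,048 chat completion tokens.
// NOTE: `max_messages` and `max_tokens` are assumed builder methods;
// consult the chat-splitter documentation for the actual API.
let splitter = ChatSplitter::default()
    .max_messages(32)
    .max_tokens(2_048);

let (outdated_messages, recent_messages) = splitter.split(&stored_messages);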

For a more detailed example, see examples/chat.rs.
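
In practice, the 'recent' half is what you send to OpenAI through async_openai, while the 'outdated' half can be archived or summarized on your side. The sketch below shows that hand-off inside an async function returning a Result; it is not taken from examples/chat.rs, and it assumes a recent async_openai release (the request-builder and message types have changed between versions), so adapt the types to the version you use.

use async_openai::{types::CreateChatCompletionRequestArgs, Client};
use chat_splitter::ChatSplitter;

// `stored_messages` is assumed to be a Vec of async_openai chat request
// messages loaded from your own storage.
let (_outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);

// Only the messages that fit the context size go into the request.
let request = CreateChatCompletionRequestArgs::default()
    .model("gpt-3.5-turbo")
    .messages(recent_messages.to_vec())
    .build()?;

let client = Client::new();
let response = client.chat().create(request).await?;
println!("{:?}", response.choices.first());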

Contributing

Contributions to chat-splitter are welcome! If you find a bug or have a feature request, please submit an issue. If you'd like to contribute code, please feel free to submit a pull request.

License: MIT
