
chat-splitter

For more information, please refer to the blog announcement.

When using the async_openai Rust crate, you must ensure that your chat messages do not exceed the maximum token count of OpenAI's chat models.

chat-splitter splits chat messages into 'outdated' and 'recent' groups, letting you cap both the number of messages and the number of chat completion tokens sent to the model. Token counting is provided by tiktoken_rs.

Usage

Here's a basic example:

use chat_splitter::ChatSplitter;

// Get all your previously stored chat messages...
let mut stored_messages = /* get_stored_messages()? */;

// ...and split them into 'outdated' and 'recent',
// where 'recent' always fits the model's context size.
let (outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);

For a more detailed example, see examples/chat.rs.
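To illustrate the underlying idea, here is a minimal sketch, not the crate's actual implementation: scan from the newest message backwards, keeping messages while they fit both a message budget and a token budget. The `count_tokens` function below is a hypothetical whitespace-based stand-in for the real tiktoken_rs tokenizer.

```rust
// Hypothetical stand-in for tiktoken_rs-based counting; a real
// implementation would use the model's BPE tokenizer.
fn count_tokens(message: &str) -> usize {
    message.split_whitespace().count()
}

/// Split `messages` into `(outdated, recent)`, where `recent` holds
/// the newest messages that fit within both `max_messages` and
/// `max_tokens`.
fn split(
    messages: &[String],
    max_messages: usize,
    max_tokens: usize,
) -> (&[String], &[String]) {
    let mut tokens = 0;
    let mut start = messages.len();
    // Walk from the newest message backwards, stopping as soon as
    // either budget would be exceeded.
    for (i, message) in messages.iter().enumerate().rev() {
        let n = count_tokens(message);
        if messages.len() - i > max_messages || tokens + n > max_tokens {
            break;
        }
        tokens += n;
        start = i;
    }
    messages.split_at(start)
}

fn main() {
    let messages: Vec<String> = vec![
        "hello there".into(),       // 2 "tokens"
        "how are you today".into(), // 4 "tokens"
        "fine thanks".into(),       // 2 "tokens"
    ];
    // With a budget of 6 tokens, only the last two messages fit.
    let (outdated, recent) = split(&messages, 16, 6);
    assert_eq!(outdated.len(), 1);
    assert_eq!(recent.len(), 2);
}
```

Because the scan is greedy from the newest message, `recent` is always a contiguous suffix of the conversation, which is what you want to send as context for the next completion.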

Contributing

Contributions to chat-splitter are welcome! If you find a bug or have a feature request, please submit an issue. If you'd like to contribute code, please feel free to submit a pull request.

License: MIT