Releases

| Version | Date |
|---|---|
| 0.1.1 | Sep 4, 2023 |
| 0.1.0 | Jul 14, 2023 |
# chat-splitter

For more information, please refer to the blog announcement.
When using the `async_openai` Rust crate, it is crucial not to exceed the maximum number of tokens allowed by OpenAI's chat models. `chat-splitter` splits chat messages into 'outdated' and 'recent' groups, letting you cap both the maximum message count and the maximum chat completion token count. Token counting is provided by the `tiktoken_rs` crate.
## Usage

Here's a basic example:

```rust
// Get all your previously stored chat messages...
let mut stored_messages = /* get_stored_messages()? */;

// ...and split into 'outdated' and 'recent',
// where 'recent' always fits the context size.
let (outdated_messages, recent_messages) =
    ChatSplitter::default().split(&stored_messages);
```
For a more detailed example, see `examples/chat.rs`.
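To make the splitting idea concrete, here is a minimal, self-contained sketch of the technique in plain Rust. This is not `chat-splitter`'s actual implementation: the function name, the whitespace-word token estimate, and the parameters are illustrative assumptions. The real crate counts tokens exactly via `tiktoken_rs`.

```rust
// Sketch of the split: walk the chat from newest to oldest, keeping the
// most recent messages that fit within both a message cap and a token
// budget. Everything earlier is 'outdated'.
fn split_messages(
    messages: &[String],
    max_messages: usize,
    max_tokens: usize,
) -> (Vec<String>, Vec<String>) {
    let mut tokens = 0;
    let mut keep = 0; // how many trailing messages fit the budget
    for msg in messages.iter().rev().take(max_messages) {
        // Crude stand-in for real tokenization (tiktoken_rs in the crate).
        let cost = msg.split_whitespace().count();
        if tokens + cost > max_tokens {
            break;
        }
        tokens += cost;
        keep += 1;
    }
    let cut = messages.len() - keep;
    // (outdated, recent): 'recent' always fits the budget.
    (messages[..cut].to_vec(), messages[cut..].to_vec())
}

fn main() {
    let messages: Vec<String> = vec![
        "hello there".into(),
        "how are you".into(),
        "fine thanks and you".into(),
    ];
    let (outdated, recent) = split_messages(&messages, 10, 7);
    println!("outdated: {:?}", outdated);
    println!("recent: {:?}", recent);
}
```

Scanning from the newest message backwards ensures the 'recent' slice is always a contiguous suffix of the chat, which is what you want when resending context to a chat completion API.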
## Contributing

Contributions to `chat-splitter` are welcome! If you find a bug or have a feature request, please submit an issue. If you'd like to contribute code, feel free to submit a pull request.
License: MIT