Add support for integration tokenizers #1
davidmyersdev started this conversation in Roadmap
Replies: 1 comment
- This one looks like a good option for OpenAI's GPT models: https://github.com/niieani/gpt-tokenizer
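As a rough illustration (not wired into this project), token counting with gpt-tokenizer could look like the sketch below. The imports follow the library's documented top-level API; the token limit is just a placeholder.

```ts
// Sketch: counting tokens for an OpenAI-bound prompt with gpt-tokenizer.
// The top-level import uses the package's default encoding; model-specific
// entry points also exist if an exact match to a given model is needed.
import { encode, decode, isWithinTokenLimit } from 'gpt-tokenizer'

const prompt = 'Tokenize me, please.'

const tokens = encode(prompt)           // number[] of token ids
console.log(tokens.length)              // token count for budgeting

console.log(decode(tokens) === prompt)  // round-trips back to the original text

// Cheap guard before sending a request (4096 is an illustrative limit).
if (!isWithinTokenLimit(prompt, 4096)) {
  throw new Error('Prompt exceeds the context window')
}
```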
- Since tokenizers are specific to the integration, it makes sense to add an interface to integrations that exposes tokenizers to the models.
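For example, a minimal sketch of that interface in TypeScript might look like the following. All of the names here (`Tokenizer`, `Integration`, `getTokenizer`) are placeholders rather than this project's actual API, and the OpenAI example just reuses gpt-tokenizer from the comment above.

```ts
// Illustrative only: the shapes and names in this sketch are assumptions,
// not the project's real API.
import { encode, decode } from 'gpt-tokenizer'

// Integration-agnostic view of a tokenizer that models can rely on.
export interface Tokenizer {
  encode(text: string): number[]
  decode(tokens: number[]): string
  countTokens(text: string): number
}

// Each integration exposes the tokenizer(s) for the models it serves.
export interface Integration {
  id: string
  getTokenizer(modelId: string): Tokenizer
}

// Hypothetical OpenAI integration backed by gpt-tokenizer.
export const openAiIntegration: Integration = {
  id: 'openai',
  getTokenizer: () => ({
    encode,
    decode,
    countTokens: (text) => encode(text).length,
  }),
}
```

With something like this, models would only depend on the `Tokenizer` interface, and each integration could swap in whichever tokenizer implementation fits its provider.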