[BUG] Custom models do not work on OpenAiStreamingChatModel because Model 'xxx' is unknown to jtokkit
#804
Labels: bug (Something isn't working)
Comments
langchain4j added a commit that referenced this issue on Mar 28, 2024:
## Context

See #804

## Change

- `OpenAiStreamingChatModel`: if `modelName` is not one of the known OpenAI models, do not return `TokenUsage` in the `Response`. This covers cases where `OpenAiStreamingChatModel` is used to connect to other OpenAI-API-compatible LLM providers such as Ollama and Groq; in such cases it is better to return no `TokenUsage` than a wrong one.
- For all OpenAI models, the default `Tokenizer` now uses the "gpt-3.5-turbo" model name instead of the one the user provides in the `modelName` parameter. This avoids crashing with "Model 'ft:gpt-3.5-turbo:my-org:custom_suffix:id' is unknown to jtokkit" for fine-tuned OpenAI models. It should be safe to use "gpt-3.5-turbo" by default with all current OpenAI models, as they all use the same cl100k_base encoding.

## Checklist

Before submitting this PR, please check the following points:
- [X] I have added unit and integration tests for my change
- [X] All unit and integration tests in the module I have added/changed are green
- [X] All unit and integration tests in the [core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core) and [main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j) modules are green
- [ ] I have added/updated the [documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples repo](https://github.com/langchain4j/langchain4j-examples) (only for "big" features)
- [ ] I have added my new module in the [BOM](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-bom/pom.xml) (only when a new module is added)
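The fallback described in the PR can be sketched in plain Java. This is a simplified illustration, not the actual langchain4j implementation; `KNOWN_OPENAI_MODELS` and `tokenizerModelFor` are hypothetical names, and the set of known models is an assumed subset:

```java
import java.util.Optional;
import java.util.Set;

public class TokenizerFallback {

    // Hypothetical subset of the model names jtokkit knows about.
    static final Set<String> KNOWN_OPENAI_MODELS =
            Set.of("gpt-3.5-turbo", "gpt-4", "gpt-4-turbo");

    // Returns the model name to hand to the tokenizer, or empty when
    // token usage should not be computed at all (custom/unknown models,
    // e.g. models served by Ollama or Groq behind an OpenAI-compatible API).
    static Optional<String> tokenizerModelFor(String modelName) {
        if (KNOWN_OPENAI_MODELS.contains(modelName)
                || modelName.startsWith("ft:gpt-3.5-turbo")) {
            // All current OpenAI models share the cl100k_base encoding,
            // so "gpt-3.5-turbo" is a safe default even for fine-tuned ones.
            return Optional.of("gpt-3.5-turbo");
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(tokenizerModelFor("gpt-4"));   // Optional[gpt-3.5-turbo]
        System.out.println(tokenizerModelFor("llama2"));  // Optional.empty
    }
}
```

The key design choice is that an unknown name yields an empty result rather than being passed through to jtokkit, which is what previously caused the "unknown to jtokkit" crash.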
Fixed in 0.29.1
Describe the bug

Custom models do not work on `OpenAiStreamingChatModel`: the call fails with `Model 'xxx' is unknown to jtokkit`.
This bug is similar to this old bug.
Log and Stack trace
Expected behavior
It should work with any custom model, since the non-streaming version does.